Results by year

Year Number of Results
1898 1
1946 1
1963 1
1974 2
1978 1
1981 1
1983 2
1986 2
1987 2
1988 3
1990 3
1993 2
1994 1
1995 2
1997 3
1998 1
1999 2
2000 1
2001 2
2002 1
2003 4
2004 2
2006 5
2007 3
2008 1
2009 7
2010 8
2011 10
2012 9
2013 8
2014 9
2015 26
2016 19
2017 29
2018 34
2019 34
2020 49
2021 71
2022 52
2023 64
2024 29

Search Results

447 results

The following term was not found in PubMed: kottititum
Post-Contextual-Bandit Inference.
Bibaut A, Chambaz A, Dimakopoulou M, Kallus N, van der Laan M. Adv Neural Inf Process Syst. 2021 Dec;34:28548-28559. PMID: 35785105. Free PMC article.
Contextual bandit algorithms are increasingly replacing non-adaptive A/B tests in e-commerce, healthcare, and policymaking because they can both improve outcomes for study participants and increase the chance of identifying good or even best policies. To support credible i …
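
For readers new to the setting, a minimal sketch of the kind of data-collection loop such inference methods apply to: a LinUCB-style contextual bandit run on synthetic data. Everything below (true_theta, alpha, the Gaussian contexts and noise) is an illustrative assumption, not the authors' setup.

    # Minimal LinUCB-style contextual bandit on synthetic data.
    import numpy as np

    rng = np.random.default_rng(0)
    d, n_arms, T, alpha = 3, 2, 2000, 1.0
    true_theta = rng.normal(size=(n_arms, d))         # hypothetical per-arm reward weights

    A = np.stack([np.eye(d) for _ in range(n_arms)])  # per-arm Gram matrices
    b = np.zeros((n_arms, d))                         # per-arm reward-weighted context sums

    for t in range(T):
        x = rng.normal(size=d)                        # observed context
        ucb = np.empty(n_arms)
        for a in range(n_arms):
            A_inv = np.linalg.inv(A[a])
            theta_hat = A_inv @ b[a]                  # ridge-regression estimate
            ucb[a] = theta_hat @ x + alpha * np.sqrt(x @ A_inv @ x)
        a = int(np.argmax(ucb))                       # optimistic arm choice
        r = true_theta[a] @ x + rng.normal(scale=0.1)
        A[a] += np.outer(x, x)                        # online update of the chosen arm's model
        b[a] += r * x

    print([np.linalg.solve(A[a], b[a]).round(2) for a in range(n_arms)])
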
The Evolution of the One-Armed Bandit.
Kowey PR, Robinson VM. J Am Coll Cardiol. 2019 Nov 12;74(19):2376-2378. doi: 10.1016/j.jacc.2019.09.019. PMID: 31699277. Free article. No abstract available.
Baricitinib and β-Cell Function in Patients with New-Onset Type 1 Diabetes.
Waibel M, Wentworth JM, So M, Couper JJ, Cameron FJ, MacIsaac RJ, Atlas G, Gorelik A, Litwak S, Sanz-Villanueva L, Trivedi P, Ahmed S, Martin FJ, Doyle ME, Harbison JE, Hall C, Krishnamurthy B, Colman PG, Harrison LC, Thomas HE, Kay TWH; BANDIT Study Group. N Engl J Med. 2023 Dec 7;389(23):2140-2150. doi: 10.1056/NEJMoa2306691. PMID: 38055252. Clinical Trial.
CONCLUSIONS: In patients with type 1 diabetes of recent onset, daily treatment with baricitinib over 48 weeks appeared to preserve beta-cell function as estimated by the mixed-meal-stimulated mean C-peptide level. (Funded by JDRF International and others; BANDIT Australian …
A Contextual-Bandit-Based Approach for Informed Decision-Making in Clinical Trials.
Varatharajah Y, Berry B. Life (Basel). 2022 Aug 21;12(8):1277. doi: 10.3390/life12081277. PMID: 36013456. Free PMC article.
We also evaluated a context-free multi-arm-bandit-based approach, using the same dataset, to showcase the benefits of our approach. ...The contextual-bandit and multi-arm bandit approaches provide 72.63% and 64.34% gains, respectively, compared to a random as …
Bandit tusks.
[No authors listed] Nat Biotechnol. 2018 Nov 9;36(11):1032. doi: 10.1038/nbt1118-1032a. PMID: 30412197. No abstract available.
Some performance considerations when using multi-armed bandit algorithms in the presence of missing data.
Chen X, Lee KM, Villar SS, Robertson DS. PLoS One. 2022 Sep 12;17(9):e0274272. doi: 10.1371/journal.pone.0274272. eCollection 2022. PMID: 36094920. Free PMC article.
When comparing the performance of multi-armed bandit algorithms, the potential impact of missing data is often overlooked. ...We focus on two-armed bandit algorithms with binary outcomes in the context of patient allocation for clinical trials with relatively small …
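
As a hedged illustration of the setting the abstract describes (not the paper's algorithms or findings), here is a two-armed Thompson-sampling bandit with binary outcomes in which each outcome goes missing with some probability and missing outcomes silently never update the posterior. The values of p_true and p_miss are invented.

    # Two-armed Thompson sampling with binary outcomes and missing data.
    import random

    random.seed(1)
    p_true = [0.3, 0.5]               # invented true success probabilities
    p_miss = 0.2                      # invented probability an outcome is never observed
    alpha, beta = [1, 1], [1, 1]      # Beta(1,1) posterior parameters per arm

    for t in range(1000):
        draws = [random.betavariate(alpha[a], beta[a]) for a in (0, 1)]
        a = draws.index(max(draws))   # Thompson sampling: play the sampled best arm
        outcome = 1 if random.random() < p_true[a] else 0
        if random.random() >= p_miss: # missing outcomes silently skip the update
            alpha[a] += outcome
            beta[a] += 1 - outcome

    print("posterior means:", [round(alpha[a] / (alpha[a] + beta[a]), 3) for a in (0, 1)])
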
Uncertainty and Exploration.
Gershman SJ. Decision (Wash D C). 2019 Jul;6(3):277-286. doi: 10.1037/dec0000101. Epub 2018 Oct 1. PMID: 33768122. Free PMC article.
Random exploration algorithms are sensitive to total uncertainty across actions, whereas directed exploration algorithms are sensitive to relative uncertainty. This paper reports a multi-armed bandit experiment in which total and relative uncertainty were orthogonally mani …
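
A small sketch of the distinction the abstract draws, under assumed Gaussian posteriors (the mu and sigma values are invented): with Thompson sampling, choice randomness grows with the arms' total uncertainty, while a UCB-style rule reacts only to their relative uncertainty.

    # Random vs. directed exploration under Gaussian posteriors.
    import random

    random.seed(2)

    def thompson_p_better(mu, sigma, n=20000):
        # Fraction of posterior draws in which arm 0 beats arm 1.
        return sum(random.gauss(mu[0], sigma[0]) > random.gauss(mu[1], sigma[1])
                   for _ in range(n)) / n

    mu = [0.5, 0.0]                            # arm 0 looks better on average
    print(thompson_p_better(mu, [0.1, 0.1]))   # low total uncertainty: near-greedy (~1.0)
    print(thompson_p_better(mu, [3.0, 3.0]))   # high total uncertainty: near coin flip (~0.55)

    # Directed (UCB-style): the bonus tracks each arm's own uncertainty, so only
    # the arms' relative uncertainty can change which arm is ranked first.
    c = 1.0
    for sigma in ([0.5, 0.5], [0.5, 3.0]):
        ucb = [mu[a] + c * sigma[a] for a in (0, 1)]
        print("UCB picks arm", ucb.index(max(ucb)))
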
Adversarial bandit approach for RIS-aided OFDM communication.
Ahmed Ouameur M, Anh LDT, Massicotte D, Jeon G, de Figueiredo FAP. EURASIP J Wirel Commun Netw. 2022;2022(1):111. doi: 10.1186/s13638-022-02184-6. Epub 2022 Nov 17. PMID: 36411764. Free PMC article. Review.
We propose two low-training overhead and energy-efficient adversarial bandit-based schemes with outstanding performance gains when compared to DL-based reflection beamforming reference methods. ...
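
The RIS/OFDM beamforming setting is not reproduced here, but as a hedged sketch of the adversarial-bandit family the abstract names, this is the canonical EXP3 update run on a toy non-stochastic reward sequence. K, gamma, and the reward function are illustrative assumptions.

    # Canonical EXP3 on a toy adversarial (non-stochastic) reward sequence.
    import math, random

    random.seed(3)
    K, gamma, T = 4, 0.1, 5000
    w = [1.0] * K                                        # exponential weights

    def reward(arm, t):
        # Toy rewards in [0, 1]; the best arm switches mid-run.
        return 0.9 if arm == (t // 2500) % K else 0.2

    total = 0.0
    for t in range(T):
        s = sum(w)
        p = [(1 - gamma) * wi / s + gamma / K for wi in w]  # mix in uniform exploration
        arm = random.choices(range(K), weights=p)[0]
        r = reward(arm, t)
        total += r
        xhat = r / p[arm]                                # importance-weighted reward estimate
        w[arm] *= math.exp(gamma * xhat / K)
        m = max(w)
        w = [wi / m for wi in w]                         # renormalize to keep weights bounded

    print("average reward:", total / T)
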
Cascaded Algorithm Selection With Extreme-Region UCB Bandit.
Hu YQ, Liu XH, Li SQ, Yu Y. IEEE Trans Pattern Anal Mach Intell. 2022 Oct;44(10):6782-6794. doi: 10.1109/TPAMI.2021.3094844. Epub 2022 Sep 14. PMID: 34232866.
While the lower-level process employs an anytime tuning approach, the upper-level process is naturally formulated as a multi-armed bandit, deciding which algorithm should be allocated one more piece of time for the lower-level tuning. To achieve the goal of finding the bes …
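
The paper's extreme-region UCB variant is not implemented here; as a hedged sketch of the upper-level formulation the abstract describes, this uses standard UCB1 to decide which candidate algorithm's tuner receives the next slice of the time budget. The quality values and per-slice scores are simulated, not from the paper.

    # Standard UCB1 allocating time slices across candidate algorithms.
    import math, random

    random.seed(4)
    quality = [0.55, 0.60, 0.70]      # invented mean validation scores per algorithm
    n = [0] * 3                       # time slices allocated so far
    mean = [0.0] * 3                  # running mean score per algorithm

    for t in range(1, 301):
        if 0 in n:                    # give every algorithm one slice first
            a = n.index(0)
        else:                         # then pick by the UCB1 index
            a = max(range(3), key=lambda i: mean[i] + math.sqrt(2 * math.log(t) / n[i]))
        score = min(1.0, max(0.0, random.gauss(quality[a], 0.1)))
        n[a] += 1
        mean[a] += (score - mean[a]) / n[a]   # incremental mean update

    print("slices per algorithm:", n)         # the best algorithm should dominate
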