
Results by year

Year	Number of results
2014	1
2017	1
2020	2
2021	2
2022	4
2023	5
2024	0


Search Results

14 results

Tirofiban for Stroke without Large or Medium-Sized Vessel Occlusion.
Zi W, Song J, Kong W, Huang J, Guo C, He W, Yu Y, Zhang B, Geng W, Tan X, Tian Y, Liu Z, Cao M, Cheng D, Li B, Huang W, Liu J, Wang P, Yu Z, Liang H, Yang S, Tang M, Liu W, Huang X, Liu S, Tang Y, Wu Y, Yao L, Shi Z, He P, Zhao H, Chen Z, Luo J, Wan Y, Shi Q, Wang M, Yang D, Chen X, Huang F, Mu J, Li H, Li Z, Zheng J, Xie S, Cai T, Peng Y, Xie W, Qiu Z, Liu C, Yue C, Li L, Tian Y, Yang D, Miao J, Yang J, Hu J, Nogueira RG, Wang D, Saver JL, Li F, Yang Q; RESCUE BT2 Investigators. N Engl J Med. 2023 Jun 1;388(22):2025-2036. doi: 10.1056/NEJMoa2214299. PMID: 37256974. Clinical Trial.
Automatic text classification of actionable radiology reports of tinnitus patients using bidirectional encoder representations from transformer (BERT) and in-domain pre-training (IDPT).
Li J, Lin Y, Zhao P, Liu W, Cai L, Sun J, Zhao L, Yang Z, Song H, Lv H, Wang Z. BMC Med Inform Decis Mak. 2022 Jul 30;22(1):200. doi: 10.1186/s12911-022-01946-y. PMID: 35907966. Free PMC article.
In the second experiment, the BERT in-domain pre-training model (AUC-0.948, F1-0.841) performed significantly better than the BERT-based model (AUC-0.868, F1-0.760). Additionally, among the BERT fine-tuning variants, Mengzi achieved the highest AUC of 0.878 (F1-0.764 …
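The snippet above compares classifiers by AUC and F1. As a minimal sketch of how those two metrics are computed for a binary "actionable report" classifier with scikit-learn (the labels and scores below are hypothetical placeholders, not data from the cited study):

# Minimal sketch: AUC and F1 for a binary report classifier.
# y_true and y_score are hypothetical placeholders, not data from the cited study.
from sklearn.metrics import roc_auc_score, f1_score

y_true = [1, 0, 1, 1, 0, 0, 1, 0]                             # ground-truth labels (1 = actionable)
y_score = [0.92, 0.12, 0.78, 0.64, 0.30, 0.05, 0.55, 0.41]    # predicted probabilities
y_pred = [1 if s >= 0.5 else 0 for s in y_score]              # thresholded class predictions

print(f"AUC: {roc_auc_score(y_true, y_score):.3f}")           # uses raw scores
print(f"F1:  {f1_score(y_true, y_pred):.3f}")                 # uses thresholded labels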