Machine Learning Driven Channel Thickness Optimization in Dual-Layer Oxide Thin-Film Transistors for Advanced Electrical Performance

Adv Sci (Weinh). 2023 Dec;10(36):e2303589. doi: 10.1002/advs.202303589. Epub 2023 Nov 20.

Abstract

Machine learning (ML) offers both time savings and performance improvements in practical electronic device design through adaptive learning. Herein, Bayesian optimization (BO) is successfully applied to the design of optimal dual-layer oxide semiconductor thin-film transistors (OS TFTs). This approach effectively manages the complex correlation and interdependency between the two oxide semiconductor layers, enabling efficient design of experiments (DoE) and reducing trial-and-error. By considering field-effect mobility (μ) and threshold voltage (Vth) simultaneously, the dual-layer structure designed by the BO model yields OS TFTs with remarkable electrical performance while greatly reducing the number of experimental trials (only 15 data sets are required). The optimized dual-layer OS TFTs achieve an enhanced field-effect mobility of 36.1 cm² V⁻¹ s⁻¹ and show good stability under bias stress, with a negligible threshold-voltage shift compared to conventional IGZO TFTs. Moreover, the BO algorithm is successfully customized to individual preferences by applying weight factors to both field-effect mobility (μ) and threshold voltage (Vth).
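
To make the described workflow concrete, below is a minimal sketch of weighted-objective Bayesian optimization over the two channel-layer thicknesses, using a Gaussian-process surrogate and an expected-improvement acquisition with a 15-evaluation budget as in the paper. This is not the authors' implementation: the surrogate and acquisition choices, the thickness bounds, the weights w_mu/w_vth, and the placeholder `measure_tft` function are all illustrative assumptions.

```python
# Sketch (assumed, not the authors' code): weighted-objective Bayesian
# optimization of the two channel-layer thicknesses in a dual-layer OS TFT.
# In practice each evaluation would be a fabricated and measured device;
# here `measure_tft` is a synthetic stand-in.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
BOUNDS = np.array([[2.0, 20.0], [2.0, 20.0]])  # layer thicknesses (nm), assumed range

def measure_tft(x):
    """Placeholder for fabricating/measuring a device at thicknesses x.
    Returns (mobility in cm^2 V^-1 s^-1, threshold voltage in V)."""
    t1, t2 = x
    mu = 35.0 * np.exp(-((t1 - 8.0) ** 2 + (t2 - 5.0) ** 2) / 60.0)
    vth = 0.2 * (t1 - 8.0) - 0.1 * (t2 - 5.0)
    return mu, vth

def score(x, w_mu=0.7, w_vth=0.3, vth_target=0.0):
    # Weighted scalarization of the two objectives: reward high mobility,
    # penalize deviation of Vth from a target. The weights encode preference.
    mu, vth = measure_tft(x)
    return w_mu * (mu / 40.0) - w_vth * abs(vth - vth_target)

# Initial design: a handful of randomly chosen devices.
X = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(5, 2))
y = np.array([score(x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(10):  # 5 initial + 10 BO iterations = 15 evaluations total
    gp.fit(X, y)
    # Expected improvement (maximization) over random candidate thicknesses.
    cand = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(2000, 2))
    m, s = gp.predict(cand, return_std=True)
    best = y.max()
    z = (m - best) / np.maximum(s, 1e-9)
    ei = (m - best) * norm.cdf(z) + s * norm.pdf(z)
    x_next = cand[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, score(x_next))

print("best thicknesses (nm):", X[np.argmax(y)], "score:", y.max())
```

Varying w_mu and w_vth in `score` corresponds to the preference customization described in the abstract: shifting weight toward w_vth steers the optimizer toward thickness pairs with a Vth closer to target at some cost in mobility.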

Keywords: Bayesian optimization; design of experiment; dual-layer channel; machine learning; oxide semiconductors; thin film transistors.