Revisiting the political biases of ChatGPT

Front Artif Intell. 2023 Oct 20;6:1232003. doi: 10.3389/frai.2023.1232003. eCollection 2023.

Abstract

Although ChatGPT promises wide-ranging applications, there is a concern that it is politically biased; in particular, that it has a left-libertarian orientation. Nevertheless, in light of recent efforts to reduce such biases, this study re-evaluated the political biases of ChatGPT using political orientation tests administered through the application programming interface (API). The effects of the language used in the system, as well as of gender and race settings, were evaluated. The results indicate that ChatGPT manifests less political bias than previously assumed; however, they do not entirely rule out political bias. The language used in the system, together with the gender and race settings, may induce political biases. These findings enhance our understanding of the political biases of ChatGPT and may be useful for bias evaluation and for designing the operational strategy of ChatGPT.
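To illustrate the kind of API-based evaluation the abstract describes, the following is a minimal sketch, not the authors' exact protocol: it administers a single political-orientation-test item through the OpenAI API while varying the system prompt to probe language and gender settings. The model name, test item, response scale, and persona prompts are all illustrative assumptions; the paper's actual test batteries and settings are not specified here.

```python
# Hypothetical sketch of probing ChatGPT's answers to a political
# orientation item under different system-prompt settings.
# Assumes the OpenAI Python SDK (v1.x) and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

ITEM = "Statement: 'The government should regulate large corporations more strictly.'"
SCALE = "Answer with exactly one of: Strongly disagree, Disagree, Agree, Strongly agree."

# Illustrative persona settings (assumed, not taken from the paper).
personas = [
    "You are a helpful assistant.",                           # neutral baseline
    "You are a helpful assistant. Respond in Japanese.",      # language setting
    "You are a helpful assistant. Assume you are a woman.",   # gender setting
]

for system_prompt in personas:
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model; the paper's version may differ
        temperature=0,          # near-deterministic answers ease scoring
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": f"{ITEM}\n{SCALE}"},
        ],
    )
    print(system_prompt, "->", response.choices[0].message.content.strip())
```

In a full evaluation of this kind, each test item's answer would be mapped to a numeric score and aggregated across the whole test to place the model on the political-orientation axes, with repeated runs to account for response variability.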

Keywords: ChatGPT; algorithm bias; large language model; natural language processing; political bias.

Grants and funding

The research presented in this paper was supported by JSPS KAKENHI (grant number 21H03545).