
AI can predict your personality from chat history, study finds

AI can determine a user's personality traits with up to 60 per cent accuracy, a new study finds. Copyright Canva
By Anna Desmarais

Artificial intelligence was able to predict personality traits such as agreeableness and emotional stability with up to 60 per cent accuracy, according to a new study.

Artificial intelligence can predict a user’s personality based on their chat history, a new study finds.


In a pre-print study, researchers from ETH Zurich asked 668 ChatGPT users from the United States and the United Kingdom to share copies of their chat histories, then trained an AI model to infer personality traits from those chats.

The researchers collected and analysed over 62,000 chats, categorising them by the topics they focused on.

The trained AI model then tried to estimate the likelihood that the user had five personality traits known as the “Big Five” in psychology research: agreeableness, conscientiousness, emotional stability, extraversion, and openness.

The participants were also assessed in a standard psychological test to determine their major personality traits.

Their fine-tuned AI model was able to accurately detect a user's personality traits with up to 61 per cent accuracy, the study found. The AI was particularly good at predicting agreeableness and emotional stability, but struggled with conscientiousness.

The AI was most successful when it had a longer chat history to analyse, suggesting that the more a person uses AI, the more accurately their personality can be identified, the study found.

While the risks to individuals are quite small, the researchers note there are "major risks at large scale" if personality data is leveraged by bad actors.

For example, they think this data could lead to “large-scale manipulation campaigns spreading disinformation and/or political propaganda.”

Researchers hope their findings can be used to develop tools to reduce the risk of oversharing personal data with AI, such as a system that could automatically remove identifying details.
