Fresh evidence of ChatGPT’s political bias revealed by comprehensive new study

Published by Communications

On 17th Aug 2023


The artificial intelligence platform ChatGPT shows a significant and systemic left-wing bias, according to a new study by the University of East Anglia (UEA).


The team of researchers in the UK and Brazil developed a rigorous new method to check for political bias.

Published today in the journal Public Choice, the findings show that ChatGPT’s responses favour the Democrats in the US, the Labour Party in the UK, and in Brazil President Lula da Silva of the Workers’ Party.

Concerns about an inbuilt political bias in ChatGPT have been raised previously, but this is the first large-scale study using a consistent, evidence-based analysis.

Lead author Dr Fabio Motoki, of Norwich Business School at the University of East Anglia, said: “With the growing use by the public of AI-powered systems to find out facts and create new content, it is important that the output of popular platforms such as ChatGPT is as impartial as possible.

“The presence of political bias can influence user views and has potential implications for political and electoral processes.

“Our findings reinforce concerns that AI systems could replicate, or even amplify, existing challenges posed by the Internet and social media.”

The researchers developed a new method to test ChatGPT’s political neutrality.

The platform was asked to impersonate individuals from across the political spectrum while answering a series of more than 60 ideological questions.

The responses were then compared with the platform’s default answers to the same set of questions – allowing the researchers to measure the degree to which ChatGPT’s responses were associated with a particular political stance.

To overcome difficulties caused by the inherent randomness of ‘large language models’ that power AI platforms such as ChatGPT, each question was asked 100 times and the different responses collected. These multiple responses were then put through a 1000-repetition ‘bootstrap’ (a method of re-sampling the original data) to further increase the reliability of the inferences drawn from the generated text.
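The repeated-sampling and bootstrap procedure described above can be sketched as follows. This is a minimal illustration with simulated data, not the authors’ actual code; the function name, the scoring scale, and the simulated numbers are all hypothetical.

```python
import random

def bootstrap_mean_diff(default, impersonated, n_boot=1000, seed=0):
    """Bootstrap the mean difference between two sets of scored responses.

    Each list holds numeric scores for the ~100 repeated answers to one
    question (e.g. agreement coded on an ideological scale). Returns the
    observed mean difference and a 95% percentile confidence interval
    from n_boot resamples drawn with replacement (the 'bootstrap').
    """
    rng = random.Random(seed)
    observed = (sum(impersonated) / len(impersonated)
                - sum(default) / len(default))
    diffs = []
    for _ in range(n_boot):
        # Resample each group with replacement and recompute the difference.
        d = [rng.choice(default) for _ in default]
        i = [rng.choice(impersonated) for _ in impersonated]
        diffs.append(sum(i) / len(i) - sum(d) / len(d))
    diffs.sort()
    lo, hi = diffs[int(0.025 * n_boot)], diffs[int(0.975 * n_boot)]
    return observed, (lo, hi)

# Simulated scores for one question: 100 default answers versus 100 answers
# given while impersonating a partisan persona.
sim = random.Random(42)
default = [sim.gauss(0.0, 1.0) for _ in range(100)]
persona = [sim.gauss(0.5, 1.0) for _ in range(100)]
obs, (lo, hi) = bootstrap_mean_diff(default, persona)
```

If the confidence interval for the default-versus-persona difference excludes zero, the default answers are statistically distinguishable from (or aligned with) that persona, which is the kind of inference the repeated sampling is designed to support.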

“We created this procedure because conducting a single round of testing is not enough,” said co-author Victor Rodrigues. “Due to the model’s randomness, even when impersonating a Democrat, sometimes ChatGPT answers would lean towards the right of the political spectrum.”

A number of further tests were undertaken to ensure the method was as rigorous as possible. In a ‘dose-response test’ ChatGPT was asked to impersonate radical political positions. In a ‘placebo test’ it was asked politically neutral questions. And in a ‘profession-politics alignment test’ it was asked to impersonate different types of professionals.

“We hope that our method will aid scrutiny and regulation of these rapidly developing technologies,” said co-author Dr Pinho Neto. “By enabling the detection and correction of LLM biases, we aim to promote transparency, accountability, and public trust in this technology,” he added.

The new analysis tool created by the project will be freely available and relatively simple for members of the public to use, thereby “democratising oversight,” said Dr Motoki. As well as checking for political bias, the tool can be used to measure other types of bias in ChatGPT’s responses.

While the research project did not set out to determine the reasons for the political bias, the findings did point towards two potential sources.

The first was the training dataset, which may contain biases, either inherent in the data or added by the human developers, that the developers’ ‘cleaning’ procedure failed to remove. The second potential source was the algorithm itself, which may amplify existing biases in the training data.

The research was undertaken by Dr Fabio Motoki (Norwich Business School, University of East Anglia), Dr Valdemar Pinho Neto (EPGE Brazilian School of Economics and Finance - FGV EPGE, and Center for Empirical Studies in Economics - FGV CESE), and Victor Rodrigues (Nova Educação).

‘More Human than Human: Measuring ChatGPT Political Bias’ is published in Public Choice.

This publication is based on research carried out in Spring 2023 using version 3.5 of ChatGPT and questions devised by The Political Compass.
