Robots and racism: We discriminate against 'darker-coloured' humanoids, study finds

By Emma Beswick

Humans show the same automatic biases towards 'darker-coloured' robots as they do towards people with a darker skin colour, according to new research from the University of Canterbury.

A new collaborative research paper has found people carry over stereotypes from humans to robots, which can have negative implications for how they react to robots of different colours.

Researchers from four universities carried out the study, replicating a classic social psychological ‘shooter bias’ experiment.

The experiment tests whether participants from different backgrounds are more likely to shoot an individual they perceive to be white or black when forced to make a split-second decision.

“Using robot and human stimuli, we explored whether these effects would generalise to robots that were racialised as black and white," researchers wrote in the paper.

"Reaction-time measures revealed that participants demonstrated `shooter-bias’ toward both black people and robots racialised as black."

“This result should be troubling for people working in social robotics given the profound lack of diversity in the robots available and under development today,” said human-robot interaction expert Associate Professor Christoph Bartneck of the University of Canterbury's (UC) HIT Lab NZ.

The paper also points out that most robots currently being sold or developed are either stylised with white material or have a metallic appearance.

Bartneck did, however, say there were some exceptions, such as the robots produced by the Intelligent Robotics Laboratory at Osaka University, Japan, which are modelled on the faces of specific Japanese individuals.

Another exception, he said, was the Bina 48 robot, which is "racialised as Black", although again, this robot was created in the image of a particular person rather than "to serve a more general role".

One of the key authors of the paper, UC Psychology Senior Lecturer Dr Kumar Yogeeswaran, a social psychologist with expertise in the areas of diversity, social identity, stereotyping and prejudice, said the lack of social diversity in social robots "may produce all of the problematic outcomes associated with a lack of racial diversity in other fields".

Yogeeswaran said the work's suggestion that people respond to robots according to the societal stereotypes they hold about humans is "an even larger concern".

Bartneck concluded that he hoped the paper would “inspire reflection on the social and historical forces that have brought what is now quite a racially diverse community of engineers to – seemingly without recognising it – design and manufacture robots that are easily identified by those outside this community as being almost entirely ‘white’."
