ANN ARBOR, Mich. — Everybody makes mistakes, even machines! When our robotic friends make an error, however, researchers from the University of Michigan find people tend to be much less forgiving than if a fellow human messed up. Moreover, once a robot loses a human’s trust, it’s very difficult to rebuild the robotic relationship.

Study authors report that when robots make mistakes, their human co-workers inherently see them as untrustworthy. Luckily, researchers examined four strategies that aim to repair trust and mitigate the negative impact of such mechanical mistakes on human-robot relations: apologies, denials, explanations, and promises.

3 strikes and you’re out for robots?

A group of 240 human participants took part in this study, collaborating with a robot co-worker to accomplish a task. Sometimes, the robot would make mistakes during the experiment. In these instances, after committing an error and violating their human partner’s trust, the robot would try one of the four repair strategies. Importantly, the results reveal that after three mistakes, not a single one of those repair strategies ever fully repaired trustworthiness. In other words, three strikes and you’re out, bot!

“By the third violation, strategies used by the robot to fully repair the mistrust never materialized,” says Connor Esterwood, a researcher at the U-M School of Information and the study’s lead author, in a university release.

Study co-author Lionel Robert, a professor of information, and Esterwood note that these findings also introduce theories of forgiving, forgetting, informing, and misinforming into the human-robot trust literature. This leads to two major implications, according to the research team.

First, scientists should prioritize the development of more effective repair strategies to help robots better repair trust after mistakes. Additionally, robots “need” to be sure that they have mastered a particular novel task before even attempting to repair a human’s trust in them.

“If not, they risk losing a human’s trust in them in a way that cannot be recovered,” Esterwood explains.

Many robots don’t get a second chance

Study authors also believe these results may extend beyond human-robot relations. Even on a human-to-human level, it may be nearly impossible to fully repair trust using only apologies, denials, explanations, or promises.

“Our study’s results indicate that after three violations and repairs, trust cannot be fully restored, thus supporting the adage ‘three strikes and you’re out,’” Prof. Robert adds. “In doing so, it presents a possible limit that may exist regarding when trust can be fully restored.”

Esterwood notes that many robots never even get a second chance. Even if a robot could perform better after making a mistake and learning from it, it may not get the opportunity to do so. As a result, the study authors argue, humanity misses out on the benefits robots can offer.

Prof. Robert adds that people may try to work around or bypass the robot, reducing their own performance in the process. This strategy could eventually lead to performance problems, and even firing, for either lack of performance or noncompliance.

The findings appear in the journal Computers in Human Behavior.
