Working Alongside an AI 'Teammate' Encourages Participation from Women
The AI boosts productivity among women on male-dominated teams
An artificial intelligence-powered virtual teammate with a female voice boosts participation and productivity among women on teams dominated by men, according to new Cornell research.
The findings suggest that the gender of an AI’s voice can positively shape the dynamics of gender-imbalanced teams and could help inform the design of bots used for human-AI teamwork, researchers said.
“I wasn’t expecting that having an AI agent on the team would replicate many similar outcomes other researchers have observed with human-only, gender-imbalanced teams,” said Angel Hsing-Chi Hwang, a postdoctoral associate in information science in the Cornell Ann S. Bowers College of Computing and Information Science. “That was a surprise to me.”
Hwang is the lead author of “The Sound of Support: Gendered Voice Agent as Support to Minority Teammates in Gender-Imbalanced Team,” which received an honorable mention for the Best Paper award at the Association for Computing Machinery (ACM) CHI Conference on Human Factors in Computing Systems, held May 11-16.
The findings mirror previous research in psychology and organizational behavior that shows minority teammates are more likely to participate if the team adds members similar to them, Hwang said.
“But hiring a new person to fill out a team in real-time isn’t realistic,” said Hwang, who studies AI’s impact on work practices. “Our thought was: What if we had on-demand AI agents that can participate and hopefully change team dynamics in a good way?”
To better understand how AI can help gender-imbalanced teams, Hwang and Andrea Stevenson Won, associate professor of communication in the College of Agriculture and Life Sciences and the paper’s co-author, carried out an experiment with around 180 men and women who were assigned to groups of three and asked to collaborate virtually on a set of tasks (the study only included participants who identified as either male or female).
Each group had either one woman or one man as its minority member, plus a fourth “teammate”: an AI agent represented by an abstract shape with either a male or female voice, which appeared on screen to read instructions, contribute an idea and handle timekeeping. There was a catch: the bot wasn’t fully automated. In what human-computer interaction researchers call a “Wizard of Oz” experiment, Hwang worked behind the scenes, feeding lines generated by ChatGPT into the bot.
After the experiment, Hwang and Won analyzed the chat logs of team conversations to determine how often participants offered ideas or arguments. They also asked participants to reflect on the level of support they received, their experience with the team, and whether they personally felt marginalized, either by their human teammates or by the bot.
“When we looked at participants’ actual behaviors, that’s where we started to see differences between men and women and how they were reacting when there was either a female agent or a male agent on the team,” she said.
“One interesting thing about this study is that most participants didn’t express a preference for a male- or female-sounding voice,” Won said. “This implies that people’s social inferences about AI can be influential even when people don’t believe they are important.”
When women were in the minority, they participated more when the AI’s voice was female, while men in the minority were more talkative but were less focused on tasks when working with a male-sounding bot, researchers found. Unlike the men, women reported significantly more positive perceptions of the AI teammate when women were the minority members, according to researchers.
“With only a gendered voice, the AI agent can provide a small degree of support to women minority members in a group,” said Hwang, who will join the faculty of the Annenberg School for Communication and Journalism at the University of Southern California this fall.
As to why, Hwang points to existing research into team dynamics.
“We often feel more at ease and thus work better on teams with people who are like us,” she said.
Note: This news release was originally published on the Cornell University website. As it has been republished, it may deviate from our style guide.