AI Startup ‘Accent Translation’ Technology Accused by Critics of Racism

  • Tech startup Sanas has been accused of racism for its ‘accent translation’ technology.
  • AI and tech industry experts say the startup’s mission legitimizes racism and is a form of “digital whitewashing.”
  • But some call center agents told Insider they think the technology would improve their day jobs.

“Accent translation” startup Sanas faced accusations of racism and discrimination last week, with critics charging that its product manipulates non-American accents to sound “whiter.” The company uses speech-recognition technology to change a user’s accent in near real time; its main target market appears to be foreign call center employees.

Sharath Keshava Narayana, co-founder and COO of Sanas, denied that the startup’s technology is discriminatory, telling Insider that the company always intended to expand its translation model to include other accents. The demo on the company’s website, in which the technology translates an Indian accent into a standard American one, shows only its initial model, according to Keshava Narayana.

“It’s not just that an American has trouble understanding someone from India, but also vice versa,” Keshava Narayana told Insider. “As we continue to scale the product and start to see more and more target accents, we believe this will be a localized solution.”

Sanas has been testing translation models in other countries, such as India and the Philippines, and also plans to introduce accent translation in Latin America and Korea, according to the startup.

However, some experts within the tech industry have described the startup’s product as a form of “digital whitewashing.” Nakeema Stefflbauer, an AI and tech angel investor and CEO of the women-led computer programming group FrauenLoop, told Insider that the problem with Sanas’ response is that “accents indicate power and belonging.”

“When this is marketed, there is only one direction that everyone is herded in,” she said. “It’s not so much about understanding as it is about comfort, for groups that don’t want to understand, empathize, or engage with people who are different at all. This technology does nothing to ensure the comfort of the hypothetical call center worker.”

She added that unless Sanas also markets this technology to clients in the global South as a tool for better understanding and communicating with Americans and Western Europeans, “it is a one-way ‘solution’ that reinforces racialized hierarchies, whether intended or not.”

Tech and AI industry insiders and call center workers spoke to Insider about what they saw as the cultural costs, as well as the potential benefits, of Sanas’ product. While the company says the goal of its technology is to make people on the phone around the world sound more “local,” Stefflbauer and others in the AI field worry it is another step toward the homogenization of the startup world, something Silicon Valley has been repeatedly accused of perpetuating.

“What is this trying to tell us in terms of what the future sounds like and how we should all be experiencing voices online and communicating with people?” Stefflbauer said. “Who are the people we should communicate with and who are the people we never hear from?”

The founding team of Sanas.


Tech industry insiders say accent ‘translation’ is a form of ‘digital whitening’

Sanas, which has raised $32 million in funding, says on its website that its goal is to help people sound “more local, global.” In an interview with the BBC, Keshava Narayana said that 90% of the company’s employees and all four of its founders are immigrants, and denied criticism that the company is trying to make the world sound “white and American.”

But Mia Shah-Dand, founder of Women in AI Ethics and Lighthouse3, told Insider that as an immigrant from India with a non-American accent, she found Sanas’ announcement “very triggering,” especially as someone who has been “mocked and discriminated against” for her accent.

She said the technology tries to erase what makes people unique and tells them that who they are “isn’t good enough.”

“It seems like everything in Silicon Valley, as long as it’s legitimized by Stanford or MIT, is fine,” she said. “People will accept racism, they will accept sexism, as long as the people who do it belong to one of these prestigious universities.”

Shah-Dand added that the Sanas product reinforces a power dynamic that “goes back to the days of colonialism.” Instead of addressing the root causes of racism and discrimination, she said, “accent translation” amounts to a form of “whitening,” echoing a power dynamic seen in many historically colonized countries where people felt pressured to lighten their skin to fit European beauty standards.

“It’s Silicon Valley’s version of digital whitening,” Shah-Dand said. “Instead of technology making the world a better place, it’s amplifying, it’s helping, it’s monetizing all that hate and racism instead of trying to fix anything.”

Stefflbauer told Insider that she found Sanas’ technology “really disappointing and unsettling,” especially amid the growing workplace culture of bringing “the whole self to work.”

“Only certain people can bring their whole selves; everyone outside of this mythical norm is not invited to bring anything of themselves,” she said, referring to the 2018 dark surrealist comedy “Sorry to Bother You,” in which a Black telemarketer discovers that doors to professional success open for him only after he adopts a “white”-sounding voice.

“It’s really another example of what we’re up against in terms of trying to make the tech industry, and the products and services that come out of it, reflect the real world,” Stefflbauer said.

She added that she doesn’t see how the technology would meaningfully address racial bias.

“They don’t even try to come close to that in their solution,” she said. “It basically offers support and cover for people who mistreat the accented people they deal with to continue doing so.”

Call center agents told Insider they face racialized hostility

The founders of Sanas said they came up with the idea for the startup after a friend from Stanford University underperformed at his call center job because of his thick Central American accent.

Call center agents Insider spoke with said their jobs can be brutal, and doubly so if they have a racially distinctive accent or name.

“Unfortunately in this world, there are many people who will feel they’re better than you or choose to talk down to you when they hear your accent,” said Dafina Swann, who has worked in call centers for more than five years.

Swann, who is from Trinidad and Tobago, said she received a lot of “hostile” and “negative” comments from people who demanded to speak to someone from the United States. She had also heard of cases where colleagues were called racist names, including the n-word, and told that they were “not human, they were Black.”

To minimize the racial backlash they face, some call center agents told Insider that they already try to mimic customers’ accents and even change their names, sometimes at the direction of their managers or employers.

“After I started introducing myself as Michael O’Connor, my performance ratings from customer surveys went up: all green, green, green,” Osama Badr, a call center agent from Egypt, told Insider.

Sanas co-founder Keshava Narayana said he had a similar experience while working in a call center, where he underwent six weeks of accent training and was told to change his name to “Ethan.”

“There are certain incidents that stick with you for a long time, and this was one of them,” he told Insider.

Some fear that manipulated voices could signal a homogenized future in technology

Shah-Dand said she’s unconvinced by defenses of the technology, arguing that people are exposed to, and able to understand, different accents; call center workers receive unfair abuse because they are treated as “less than,” not because they can’t be understood.

“There are a lot of people who have a very thick accent, like Boutros Boutros-Ghali, for example,” Shah-Dand said, referring to the former United Nations secretary-general. “But because they’re in powerful positions, you make an effort to understand.”

Stefflbauer said that in her work she is always thinking about what digital life will look like in 10 or 20 years, and she worries about what technology like Sanas’ portends.

“I see more and more examples of a digital life where no one is black, no one is brown, no one has an accent, no one has a history outside of the North American mythical ideal,” Stefflbauer said. “And the question is: do we want to export this mindset and bring this misery to everyone? Because that’s definitely what it is.” Other artificial intelligence technologies, including facial recognition technologies, have also faced accusations of racism and homogenization.

“Who would be comfortable taking a selfie on Instagram and automatically changing their face to look like someone of a different race?” she said. “That’s essentially what this is.”

But call center employees who have to deal with racist comments in their daily work say a solution like the one Sanas offers could be a godsend.

“It definitely would have made my job easier. Everybody wants to be understood,” Swann said. “There’s a job that needs to be done, and if there’s something that can be implemented to make that job easier, then that’s great.”
