A startup is using AI to make call center workers sound ‘American’

Sanas says its aim is to cut through some Americans' biases against minorities. Critics argue its service may make things worse.
Non-native English speakers working call center jobs often endure racism and abuse from their stateside customers. Deposit Photos


American businesses have long outsourced their call center work to countries such as India, Pakistan, and the Philippines to cut costs and skirt stateside labor laws. With grueling, nontraditional hours, low pay, and often downright abusive interactions with customers, these jobs can be demanding, exhausting, and demoralizing. Now, veterans of the brutal industry say their new startup can change the way domestic consumers interact with these employees in ways that bolster the workers' wellbeing. Critics of the product worry it could make things even worse.

Multiple outlets reported this week on Sanas, a company founded in 2020 that provides businesses with proprietary AI software that alters users' voices in real time to sound more “Western.” “Using data about the sounds of different accents and how they correspond to each other, Sanas’s AI engine can transform a speaker’s accent into what passes for another one,” explains The Guardian, adding that, “right now, the focus is on making non-Americans sound like white Americans.”

[Related: Viral AI rapper dropped from major record label for racist content.]

Although Sanas’ top brass believes the technology can improve human-to-human relations, the ethical and psychological implications are immediately apparent: the product offers a stopgap for far more deeply rooted societal problems, diminishes workers’ inherent humanity, and puts the onus on non-Westerners to cater to Western interests.

Sanas’ website sells the technology as “a step towards empowering individuals, advancing equality, and deepening empathy.” One of the company’s co-founders, Sharath Keshava Narayana, previously worked in Indian call centers. He claims that many agents at the more than 1,000 call centers already using the voice-alteration product have reported positive experiences with it. Experts outside the industry, however, aren’t so sure.

[Related: How artificial intelligence exploded over the past decade.]

“One of the long-range effects is the erasure of people as individuals,” privacy and surveillance researcher Chris Gilliard told The Guardian. “It seems like an attempt to boil everybody down to some homogenized, mechanical voice that ignores all the beauty that comes from people’s languages and dialects and cultures. It’s a really sad thing.”

Unfortunately, it’s easy to envision this kind of software becoming ubiquitous in outsourced customer service jobs. Workers already routinely adopt Americanized nicknames to sound more familiar to Western consumers, and are pressured or required to take voice “naturalization” lessons to soften their native accents. Companies like Sanas may offer workarounds to these pressures, but they do nothing to address the larger problems at play. In fact, such software could make those problems even harder to tackle moving forward.
