Voice assistants are ubiquitous today. Everything from smart speakers to iPhones can be controlled by simply speaking to the likes of Siri and Alexa.
However, sceptics warn that conversational platforms could perpetuate harmful stereotypes about women. They fear that virtual assistants using a female voice could reinforce the stereotype that women should be subservient.
Those concerns are growing in tandem with the growth of the virtual assistants market. In 2024, 8.4 billion voice assistants will be in use, according to data analysis platform Statista. In other words, there may soon be more voice assistants in the world than there are people. The market for AI platforms will reach $52bn in 2024, according to research firm GlobalData’s forecasts.
How do AI-enabled voice assistants work?
Voice assistants are becoming more popular, but how do they actually work?
“Voice assistants bring together voice recognition and natural language understanding (NLU) – often via machine learning – to understand and respond to voice commands,” Marie Angselius Schönbeck, chief impact officer at conversational AI company Artificial Solutions, tells Verdict.
Most voice assistants work using varying models of conversational artificial intelligence (AI) to understand a user’s verbal commands.
“[NLU] technology identifies user intent and extracts other important information from a request,” Sanjeev Kumar, VP of EMEA at conversational AI platform Boost.ai, tells Verdict. “More advanced conversational AI technology uses automatic semantic understanding, an algorithm that works alongside deep learning models to reduce the chance of misunderstanding user intent.”
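The intent-and-slot extraction Kumar describes can be illustrated with a toy sketch. Production NLU systems use trained deep learning models rather than keyword rules, and the intent names and patterns below are entirely hypothetical, but the pipeline shape – classify the intent, then pull out the important details – is the same:

```python
import re

# Toy intent patterns; a real NLU system would use trained models,
# not keyword rules. All names here are illustrative.
INTENT_PATTERNS = {
    "set_timer": re.compile(r"\b(timer|remind)\b"),
    "play_music": re.compile(r"\b(play|music|song)\b"),
    "weather": re.compile(r"\b(weather|forecast|rain)\b"),
}

# A single "slot" extractor: pull a duration out of the request.
DURATION = re.compile(r"(\d+)\s*(minute|hour|second)s?")

def understand(utterance: str) -> dict:
    """Return the detected intent plus any extracted slots."""
    text = utterance.lower()
    intent = next(
        (name for name, pat in INTENT_PATTERNS.items() if pat.search(text)),
        "unknown",
    )
    slots = {}
    match = DURATION.search(text)
    if match:
        slots["duration"] = f"{match.group(1)} {match.group(2)}s"
    return {"intent": intent, "slots": slots}

print(understand("set a timer for 10 minutes"))
# e.g. {'intent': 'set_timer', 'slots': {'duration': '10 minutes'}}
```

The “automatic semantic understanding” Kumar mentions would sit on top of a step like this, using deep learning to disambiguate requests the keyword rules would misread.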
Voice assistants can vary in how they function, depending on the sophistication of the AI models underlying them, as Josep Bori, research director at GlobalData, tells Verdict.
Bori says the AI required for basic voice assistants is limited: simple machine learning models trained to recognise voices are sufficient to provide a user interface.
“However, virtual assistants expected to take broad general questions and instructions, such as Siri or Alexa, require more powerful AI models to interpret instructions and figure out a way to obtain a useful answer,” Bori says. “For virtual assistants to be able to smartly answer any question we expect these services to rely on a strong cloud AI capability.”
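The two-tier architecture Bori describes can be sketched in a few lines: a lightweight on-device model listens for the wake word, and only then is the full utterance handed to a more powerful cloud-hosted model. This is a minimal illustration – the function names (`detect_wake_word`, `cloud_answer`) and wake words are assumptions, not any vendor’s real API:

```python
# Sketch of the split Bori describes: a small local model gates
# the request, a powerful cloud AI answers open-ended questions.
WAKE_WORDS = ("hey assistant", "ok assistant")

def detect_wake_word(audio_text: str) -> bool:
    """Stand-in for a small on-device keyword-spotting model."""
    return audio_text.lower().startswith(WAKE_WORDS)

def cloud_answer(query: str) -> str:
    """Stub for the cloud-hosted model handling general queries."""
    return f"Cloud AI response to: {query!r}"

def handle(audio_text: str) -> str:
    if not detect_wake_word(audio_text):
        return ""  # stay silent: nothing is sent to the cloud
    # Drop the two-word wake phrase, forward the rest as the query.
    query = audio_text.lower().split(" ", 2)[2]
    return cloud_answer(query)

print(handle("Hey assistant what is the weather today"))
```

The design choice matters for cost and privacy as well as capability: the cheap local check means audio is only uploaded once the user has addressed the device.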
Are voice assistants misogynistic?
While virtual assistants like Siri and Alexa are backed by powerful AI models, it’s not really their technological makeup that has elicited the biggest backlash from some market watchers, but rather the prominence of female names or voices used by popular voice assistants.
In the US, Siri, Alexa, Cortana, and Google Assistant – which collectively accounted for an estimated 92.4% of the nation’s market share for smartphone assistants back in 2018 – have traditionally featured female-sounding voices. Apple’s Siri alone made up 45.5% of the market share for smartphone assistants that year, according to Statista.
“By attributing voice assistants with female characteristics, we subconsciously attribute ‘assistance’ – completing tasks on command and generally catering to our needs – to women, in turn peddling harmful stereotypes that women must serve the general population,” Andreas Nielsen, senior AV learning media producer at online language learning platform Babbel, tells Verdict.
A 2019 United Nations report made the same argument, accusing these assistants of reinforcing sexist stereotypes.
Several market watchers – including researchers from the Brookings Institution and the Italian Institute for International Political Studies – have linked the issue to the inherent lack of diversity in the tech industry.
“These models don’t exist in a vacuum but reflect a specific viewpoint and possibly the stereotypes of the people who build them,” Laura Petrone, principal analyst at GlobalData, tells Verdict. “As such, the diversity and demographics of the people who design AI models, including chatbots and virtual assistants, are critical to ensure that biases aren’t embedded.”
Petrone says better industry standards, developed with input from the public, are needed to address issues such as the gender characteristics of voice assistants.
How will the market evolve in the future?
Tech companies have recognised the issue over the years. Last year, Apple announced that Siri would no longer default to a female voice.
That being said, it is important to recognise that voice assistants can offer a smooth user experience and could make it easier for consumers to live comfortable lives. There is also scope for them to boost security.
“One way we could see voice assistants employed in the future is fraud detection in personal banking,” Kumar says. “In the next few years, a sophisticated conversational AI will be able to detect fraudulent activity from voice recognition and/or keystroke recognition.”