Voice AI Ethics
Aug 10, 2022

How Ethics is Impacting the Future of Voice AI Design

As with any up-and-coming technology, voice AI has experienced waves of questions and uncertainty over its privacy policies and data collection practices over the years, especially since the early practices of voice assistants—we’re looking at you, Big Tech—caused quite a stir and some skepticism over whether voice assistants had their users’ best interests at heart.

In this digital age, privacy concerns are only growing. A study by the Pew Research Center revealed that 79% of Americans are concerned about the way their data is being used by companies. It’s now more important than ever to build trust with users, and companies that do will benefit in both the short and long term.

Voice assistants are relatively new to the game as far as technology goes, with major players only entering within the last decade and conversational capabilities only beginning to truly transform the user experience within the last five years. While the automotive industry and call centers were among the earliest adopters of conversational AI, many industries are still just beginning to learn about the benefits. This adoption lifecycle gives companies a unique opportunity to get ahead of ethical concerns and lay the foundation for a strong relationship with their users.

We’ve gathered tips on what you need to consider about ethics at every step of your voice AI journey—from choosing a platform all the way through to future iterations. Take a deep dive into every stage of development or jump ahead to where you are in the process. With ethical insights from top voice AI influencers, your inner philosopher will be more than satisfied. 

Step 1 – Choosing a voice AI platform with privacy in mind

When considering a voice AI solution, brands have a few options to consider—build, “rent”, or partner. It’s essential to choose the right solution from the very beginning. Otherwise, time to market could be delayed, or unforeseen consequences could leave brands in the dark about their users’ behaviors and privacy controls. 

Brands can build their own solution, which gives them ample customization and control over data but could take years to create, time that other companies spend fostering relationships with their users. They could “rent” a solution, which has the initial benefits of quick development and cost savings but can lead to long-term consequences (more on this in a bit). Or they can partner with a voice AI platform that gives them control over data and privacy while shortening the time to market.

“We are seeing many brands and businesses expanding beyond the boundaries of Alexa and Google to build their own custom assistants—this raises new concerns when it comes to privacy and trust. Whilst consumers aren’t always trusting of the big platforms—we at least have some insight into how they are handling data and the measures in place to protect us. We need to ensure that as we interact with many more custom assistants from brands, the same checks and balances are in place when it comes to data protection, privacy, and safeguarding.”

James Poulter
CEO of Vixen Labs

While a Big Tech solution may seem enticing to brands looking for quick and easy development, there are many aspects of privacy and data that need to be evaluated. When “renting” a voice AI platform, brands don’t have any control over their users’ privacy options or the platform company’s data collection policies.

It’s generally best practice for brands to build trust with their users by giving them the choice to opt in or out of data collection and by being transparent about how their data is used and who sees it. With Big Tech, brands don’t have that control. Big Tech could sell your users’ data or use it to build its own voice assistant that prices yours out of the market.

“Though Big Tech apps and integrations deliver a level of welcome familiarity to consumers, the Apple example demonstrates a kind of Trojan Horse scenario. By permitting dominant tech companies to grow their presence within connected vehicles, manufacturers may ultimately see those companies intercept their own relationships with customers.”

Zubin Irani
CRO of SoundHound

With a custom voice assistant, brands can have complete control over privacy settings, user communication and transparency, and data collection. Data is an incredibly important tool for improving a voice assistant through fine-tuned iterations, and brands that are locked out from viewing their voice assistant’s accuracy and speed, and their users’ preferences, may fall behind the competition.

Whether to build, “rent”, or partner for a voice AI solution is a careful decision with many ethical considerations. On this first step into the voice-first era, brands will need to weigh the pros and cons of each option to choose the avenue that works best for their brand, product, and users.

Learn more about whether to build, “rent”, or partner here

“Another key aspect is that brands do not want to be tied in with Big Tech vendors. They want the freedom to choose the best of all worlds and to be able to work with innovative players, who are customer-focused, act in the brand’s best interest, and with whom they do not have any conflict of interest and provide flexible commercial solutions.”

Roy Lindermann
CMO of ReadSpeaker

Step 2 – Creating a diverse and accessible voice AI solution 

Once a voice AI strategy has been chosen, it’s time to get down to business and craft an exceptional experience. There are ethical concerns to be addressed at each stage of the development and design process, and it’s important to have ethics in mind from the very beginning. While a voice assistant can always be iterated upon in the future, overlooking ethical considerations during early development can cost time and money when work has to be redone.

When crafting a diverse and accessible voice AI solution, it’s important to consider the following pieces: 

  • A diverse set of employees who create it 
  • Testing the experience with a broad set of users
  • A voice assistant that can understand multiple accents and languages
  • Using a connectivity solution that meets the privacy needs of your users

Voice AI ethics go beyond the basics of data and privacy; diversity, bias, and accessibility are also essential elements to consider when building a voice assistant. From the very beginning, engineering teams that span a diverse range of ages, cultures, races, genders, and sexual orientations can help craft a voice assistant free from bias. The same principle applies to user testing: diversity in testing will help teams get the widest range of feedback and the most accurate reflection of the potential user base.

“Ethics shouldn’t be a required “exercise”, but a value deeply ingrained in the company’s culture. At every level of the organization, individuals need to understand how their day-to-day decision-making impacts areas of ethics, including integrity, responsibility, fairness, and privacy. Leadership needs to foster a supportive environment that appropriately allocates time, energy, and resources to making the right ethical decisions. Examples might include more product roadmap time to account for accessibility and design, seeking out a diverse set of users for testing, and building in the right layers of privacy and protection. Ethics isn’t about avoiding a scandal, but about inheriting a “do the right thing” mindset from the very beginning.”

Kim Conti 
VP of Product at RAIN

Creating a voice assistant that can understand other accents and languages will not only increase your audience range but also make your voice assistant more accessible to users of all cultures and reflect the company’s values of inclusivity. If the voice assistant doesn’t understand users, they’ll grow frustrated and ultimately stop using it. Excluding specific portions of the population, even unintentionally through a lack of training data, contributes to a less inclusive voice experience.

Brands can also choose a connectivity solution that best meets the privacy needs of their users. For example, edge voice assistants don’t store information in the cloud, only locally on the device. This can work well for industries dealing with sensitive user information, such as healthcare, hospitality, or banking.
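As a simplified illustration of the edge-first idea, the sketch below routes utterances to an on-device intent matcher first and only falls back to the cloud when the deployment’s policy explicitly allows it. All names, intents, and the keyword-matching stand-in are hypothetical, not SoundHound’s API:

```python
from dataclasses import dataclass


@dataclass
class PrivacyPolicy:
    """Hypothetical per-deployment privacy settings."""
    allow_cloud_fallback: bool = False  # Edge-only by default


def match_intent_on_device(text: str):
    """Tiny stand-in for an embedded NLU model: simple keyword matching."""
    local_intents = {"lights on": "home.lights_on", "lock doors": "home.lock"}
    return local_intents.get(text.lower().strip())


def handle_utterance(text: str, policy: PrivacyPolicy) -> str:
    """Route a transcribed utterance: on-device first, cloud only if permitted."""
    intent = match_intent_on_device(text)
    if intent is not None:
        return intent  # Sensitive text never leaves the device
    if policy.allow_cloud_fallback:
        return "cloud:" + text  # A real system would call a cloud NLU here
    return "unrecognized"  # Fail closed rather than ship data off-device
```

The key design choice is that the fallback fails closed: if the policy forbids cloud processing, an unmatched query is simply dropped rather than sent off-device.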

There are many elements to consider when designing a voice assistant, and ethics is one of the many pieces that add up to creating a voice experience that has users top of mind. 

Want to learn more about ethics in voice AI design? Dive into our blog, “The Role of Ethics in Voice Assistant Design”

Step 3 – Communicating your voice AI privacy standards to users

The voice assistant has been built and launched and is out in the world for users to experience. Now, it’s time to clearly communicate privacy standards and build trust with users around how their data is being used and who is seeing it. 

Data collection not only helps improve the voice assistant but also has many benefits for users, including personalization and proactive suggestions. With a custom voice assistant, brands can give users the choice of whether to opt-in or out of data collection—building brand loyalty and trust.
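The opt-in model described above can be sketched as a small consent gate in front of any analytics pipeline. This is a minimal, hypothetical example (class and field names are invented for illustration), but it captures the two properties that build trust: collection happens only after explicit consent, and revoking consent also discards previously collected data:

```python
import time


class ConsentLog:
    """Hypothetical sketch: collect usage events only for opted-in users."""

    def __init__(self):
        self.opted_in: set = set()
        self.events: list = []

    def set_consent(self, user_id: str, opted_in: bool) -> None:
        """Record an explicit, revocable consent decision."""
        if opted_in:
            self.opted_in.add(user_id)
        else:
            self.opted_in.discard(user_id)
            # Honor revocation by dropping data already collected
            self.events = [e for e in self.events if e["user"] != user_id]

    def record(self, user_id: str, event: str) -> None:
        """Silently drop events from users who have not opted in."""
        if user_id in self.opted_in:
            self.events.append({"user": user_id, "event": event, "ts": time.time()})
```

Defaulting to no collection, rather than collection-until-opt-out, is the stricter reading of “giving users the choice” and the one most likely to build the loyalty the paragraph above describes.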

As much as brands need to deliver a fast and accurate voice assistant, they also need a reputation for putting their users first with honest and transparent privacy policies. If you’re “renting” from a Big Tech provider, your brand’s reputation will be tied to users’ opinions of that provider.

“A major concern for the foreseeable future will be how voice data can be combined with other types of data to produce unintended consequences. We have very little insight into what behaviors and actions are predicted when we combine the sound of our voice, the words we speak, the situation we’re in, and other socioeconomic factors. Consequently, data arising out of people talking to technology should be policed closely.”

Bradley Metrock
CEO of Project Voice

There are many building blocks and considerations that go into creating a voice assistant from start to finish, but ethics should not be pushed off until later or considered only after deployment. As voice assistant adoption grows and more eyes turn toward this new and evolving technology, brands with ethical strategies already in place will be the ones users turn to.

At SoundHound, we have all the tools and expertise needed to create custom voice assistants and a consistent brand voice. Explore SoundHound’s independent voice AI platform at SoundHound.com and register for a free account here. Want to learn more? Talk to us about how we can help bring your voice strategy to life.

Kristen Stephens

Kristen is a content writer with a passion for storytelling and marketing. When she’s not writing, she’s hiking, reading, and spending time with her nieces and nephew.

Interested in Learning More?

Subscribe today to stay informed and get regular updates from SoundHound Inc.
