This is a short clip, but it deals with a very important issue in the bot and AI space. As voice assistants like Alexa get wider adoption, how do we as developers balance the need for user privacy against the value of organic user data?
Watch the clip of my conversation with Microsoft Technical Evangelist Martin Beeby and CEO/Co-Founder of The Bot Platform Syd Lawrence here, or scroll below the video to read the full transcript.
User Privacy in Environments Where Bots Are Listening (Full Transcript)
Sam Machin (Nexmo Developer Advocate & Alexa Champion): The reasoning I’ve seen for that, particularly with things like Alexa and Google Home, is ‘oh, what about privacy?’ Obviously, by capturing the audio, you don’t know what’s going on in the background or how much other information the microphone picks up. The microphone-in-your-home scenario is very controversial, so it’s about kind of sanitizing it first.
Martin Beeby (Technical Evangelist at Microsoft): It’s clearly not a technical issue; it’s purely a privacy issue. Again, for the success of these systems, that’s going to have to be something that we figure out. Privacy around voice and audio, and around video as well, is going to be a really important thing that we have to work through with users.
Because users are quite uncomfortable sharing some of their stuff, and rightly so. And how do we maintain people’s privacy while also doing great, really clever things for them with voice assistants and video assistants?
“How do we maintain people’s privacy while also doing great, really clever things for them with voice assistants and video assistants?”
Sam: Yeah, because the more information we get, the better the assistant can be; but the more information you get, the worse the privacy case is. It’s finding a balance.
Syd Lawrence (CEO/Co-Founder of The Bot Platform): It’s only as good as the data available.
[Editor’s Note: Watch the full one-hour discussion on the state of AI bot technology.]

Tags: AI, bots
This post was written by Glen Kunene