Xbox One owners are not safe from contractors listening in on them

The past few weeks have been filled with news about smart assistants not only accidentally listening to conversations but also having those recordings heard by complete strangers. It turns out, however, that the practice wasn't limited to the likes of Google Assistant or Amazon Alexa. Even Microsoft's Skype has been subjected to the same treatment, so it's no surprise that the Xbox One console is now reported to have been accidentally recording conversations for human ears without the owner's knowledge.

According to the latest exposé, contractors have been able to listen to audio clips of Xbox owners saying things, intentionally or otherwise, to their consoles. The Xbox One has long been able to accept voice commands, even before Cortana integration landed on the gaming platform.

Those voice commands have to be triggered with "Hey Cortana" or simply "Xbox" but, as with any smart speaker or AI assistant, the margin for error is quite wide. The feature activates when it mishears those triggers and starts acting as if the user had called on it. In other words, it starts recording audio to send to Microsoft's servers for processing.
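That false-activation problem boils down to a confidence threshold: the detector scores incoming audio against the trigger phrases and fires whenever a score clears the bar, including on near-misses. Here's a minimal sketch of that idea; every name, score, and threshold below is purely illustrative and not Microsoft's actual implementation:

```python
# Hypothetical sketch of wake-word gating. All names and numbers are
# illustrative assumptions, not Microsoft's actual implementation.

def similarity(heard: str, trigger: str) -> float:
    """Crude stand-in for an acoustic model's match score:
    the fraction of trigger words present in the heard phrase."""
    trigger_words = trigger.lower().split()
    heard_words = set(heard.lower().split())
    hits = sum(1 for word in trigger_words if word in heard_words)
    return hits / len(trigger_words)

def should_record(heard: str,
                  triggers=("hey cortana", "xbox"),
                  threshold=0.5) -> bool:
    """Start streaming audio to the server if any trigger scores
    at or above the threshold. A low threshold keeps the assistant
    responsive but raises the rate of false activations."""
    return any(similarity(heard, t) >= threshold for t in triggers)

# A clean trigger fires...
print(should_record("hey cortana"))       # True
# ...but so does a near-miss in ordinary conversation:
# "hey" alone matches half of "hey cortana".
print(should_record("hey courtney said")) # True
```

The tension is visible in the threshold: raise it and the console ignores real commands; lower it and ordinary chatter gets recorded and shipped off for processing.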

And, just as it did with its own Cortana and Skype, Microsoft sent a random sampling of those audio clips to contractors for humans to listen to. Not to violate privacy, of course, but to improve the performance of the service's machine learning by tagging errors and misinterpreted phrases. Naturally, it can't be helped that those listeners would become privy to otherwise private and personal audio.

The Xbox One is just the latest in a string of revelations surrounding such services and, curiously, Microsoft's third. Far from being a special case, it seems that almost any service with speech recognition and voice queries may involve this kind of hidden human review. And almost none of them say so in their terms of service, at least not explicitly.