A new report from Vice publication Motherboard, citing several contractors working for Microsoft, has revealed that the company allowed these workers to listen to recorded audio from Xbox One consoles, including audio from children. The goal of listening in was to improve the console’s voice command functionality, which is typically triggered by saying “Xbox” or “Hey Cortana.” The problem is that the feature isn’t perfect and can be triggered by mistake, resulting in voice recordings that never should have been made.
Microsoft made a big push around voice commands when it launched the Xbox One alongside the revised Kinect camera in November 2013. The response to Kinect-based experiences was lukewarm at best, and Microsoft eventually discontinued the sensor in 2017. With a headset attached, however, Xbox users can still use voice commands through Cortana, the virtual assistant Microsoft brought to the console in 2016.
Privacy issues have long been a concern for users of Microsoft’s products. In fact, just last week Motherboard also reported that some Microsoft contractors have listened to Skype calls, including apparent phone sex. Apple and Google have faced public backlash for similar invasions of privacy, and both corporate giants recently suspended their use of human transcribers. Skype monitoring dates back to 2011 as part of the NSA’s spying program, an involvement the company has been notably tongue-tied about.
When Microsoft introduced the Kinect, people were naturally suspicious about a camera surveilling their living rooms and a microphone capturing audio. In 2013, Microsoft tried to address these concerns, stressing that users were in complete control of when and how the Kinect could see and hear them.
Fast forward to 2019, and as Motherboard notes, Microsoft did recently update its privacy policy. Additionally, the company offers its users a dedicated page where they can review and delete audio recordings.
When contacted by GameDaily, Microsoft provided the following statement:
“We’ve long been clear that we collect voice data to improve voice-enabled services and that this data is sometimes reviewed by vendors. We’ve recently updated our privacy statement to add greater clarity that people sometimes review this data as part of the product improvement process. We always get customer permission before collecting voice data, we take steps to de-identify voice snippets being reviewed to protect people’s privacy, and we require that handling of this data be held to the highest privacy standards in the law. At the same time, we’re actively working on additional steps we can take to give customers more transparency and more control over how their data is used to improve products.”
A Microsoft spokesperson then followed up with us to offer an additional statement, clarifying that Xbox usage no longer results in voice recordings being reviewed as part of the normal process. Reviews are now reserved for investigating toxic behavior or other actions that violate the terms of service.
“We stopped reviewing any voice content taken through Xbox for product improvement purposes a number of months ago, as we no longer felt it was necessary, and we have no plans to re-start those reviews,” the spokesperson told GameDaily. “We occasionally review a low volume of voice recordings sent from one Xbox user to another when there are reports that a recording violated our terms of service and we need to investigate. This is done to keep the Xbox community safe and is clearly stated in our Xbox terms of service.”
The latter statement may give Xbox users more confidence in Microsoft’s reasoning for any audio reviews, but it likely doesn’t change how many people feel about data handling and privacy concerns in general, especially in this post-Facebook scandal era we all live in. As one Microsoft contractor who provided Motherboard with a cache of files pointed out, “The fact that I can even share some of this with you shows how lax things are in terms of protecting user data.”
Privacy is of paramount concern in a modern, information-fueled society, where corporations can easily profit off of your data. Microsoft claims it’s following the “highest privacy standards,” but legally, what really matters is whether the company has obtained consent. “Consent remains the name of the game on data privacy,” Richard Hoeg, attorney with Hoeg Law, remarked to GameDaily.
“People aren’t [necessarily] expecting their technology to work this way, and further to the extent the applicable privacy policy doesn’t make clear that this is happening, folks would be right to be upset,” Hoeg continued. “The bigger failing there is in respect of the policy itself. Most state laws and even the GDPR allow a company to do most of what it wants with data assuming they get informed consent (similar to the medical concept where the law is very concerned that you properly told folks what was happening/going to happen).”
Sometimes consent comes in the insidious form of terms and conditions, which so many of us hastily agree to without thinking (or sometimes without reading them at all).
“As you know, terms and conditions can get [too] cute on these kinds of things, so if a company tried to [h]ide what they were doing, I could certainly see them getting into trouble,” Hoeg said, adding that Microsoft has framed the change to its policy as “simply clarifying what they thought they had already specified — probably lawyers being cute.”
Ryan J. Black, Chief Technology Partner and Co-Chair of Video Games & Esports at McMillan LLP in Vancouver, BC, believes that Microsoft’s update to its policy should be taken as a good sign. And as painful as it might sound, users really should read that policy and any terms of service they’re going to agree to.
“While I get that updating a privacy policy can trigger concern, I should point out that updating a privacy policy to clear up exactly how information is used when questions or concerns arise is exactly the right behaviour,” Black told GameDaily. “If we live in a hypothetical world where a privacy policy wasn’t amended to contain new language but instead simply already had it, I’m pretty sure that people still would not have read it. As someone who regularly drafts clear and concise terms, conditions, privacy policies and other materials that are still rarely read, my overall advice to parents is to take a little bit of time and poke around the privacy policy and terms and conditions of the services that children use.
“Don’t just check that box, don’t just click ‘I agree’, but spend a minute or two. These materials, while long (they’re really complicated businesses!), are sometimes not that hard to read, and a little bit of info is better than none.”
Both Hoeg and Black acknowledged that many consumers in our electronics-based society have almost come to expect some invasion of privacy. It’s the price some are willing to pay for high-tech gadgetry like Amazon’s Alexa or Google Home. The stickier point of contention, though, is when companies listen in on children.
“The terms and conditions of a lot of online services expressly exclude kids under 13 due to U.S. COPPA law concerns, and they’ll tell you that they don’t want the information of kids,” Black explained. “And I do think there’s a strong argument that parents have to be at least somewhat responsible for letting their kids take part in commercial businesses, even if it’s just playing an online game. Surely parents know that playing these games online is a little different than inviting friends over to play Snakes and Ladders, but who knows where the line is?
“In Canada, for example, there’s a huge issue about minors’ ability to consent, and we don’t have explicit differentiation of how to collect consent from minors like the U.S. COPPA provides. The more likely a minor is to use a service, the more problematic a model of privacy based on ‘consent’ or ‘contractual relations’ becomes, and the more important it is to be as explicit and clear as possible about the types of information collected and how it’s to be used.”
It may sound like common sense, but Canada’s privacy commissioners are increasingly sensitive to the fact that younger children “aren’t really able to provide meaningful consent to the collection and use of their personal information.” Moreover, Black said that “Privacy laws around the world (like GDPR) are changing to be more restrictive to protect individuals regardless of consent, and this is a trend that may continue.”
Black warned that with the rise in computer-assisted services and artificial intelligence, it could be hard for privacy laws to keep up with the fast pace of technology. Additionally, companies like Microsoft need to be keenly aware that privacy laws are going to be slightly different depending on the region of the world, and as a company they must be sure to comply with regional laws or face the consequences.
“Privacy laws are very different worldwide,” Black said. “Canada has a model that is different than the much-ballyhooed European GDPR, for example, and they aren’t always compatible. We’re not European lawyers, but a big difference is that our model is largely consent based, whereas under GDPR consent takes a backseat to legitimate purposes and reasonable disclosure. The U.S. is even weirder, lacking a federal overarching privacy regime, and attacking it differently state-by-state and sector-by-sector. The internet makes this sort of thing very difficult to practically regulate: Someone from Russia may use a service offered by a Canadian company through a server in the United States that is cached by a service in Europe.”
Most people’s gut reaction to Microsoft recording and listening to children chatting on Xbox or to adults having phone sex on Skype would be, “Wow, that’s incredibly creepy.” Creepiness, of course, is not a basis for a legal case, but it’s still critical that companies take consumer perception into account.
“The surreptitious and overreaching nature of the creepiness in this matter can be a factor in consumer decisions (do I want this electronic listening device in my house? In my bedroom? On my TV? Etc.) and sometimes regulatory decisions,” Black stressed. “Organizations must be transparent in how the personal information will be used, what personal information will be collected, [and] who has access to it.”
Microsoft’s privacy policy update is just the latest example of a major corporation trying to carefully dance around incredibly sensitive data handling. It’s a 21st century issue that’s likely to grow only more complicated in the years ahead.
As Black explained, “It just highlights a few things: (1) consent is becoming an increasingly difficult model to sustain the entire regulation of privacy around; (2) as products and services get more and more complicated, the ‘behind the scenes’ work is going to become more and more complicated too; (3) there is a general need for transparency and clarity around how information, argued by some to be the true currency of the digital era, is going to be used.”
It feels like we could all use an assist on this one: Hey Cortana, can you read me that new privacy policy again?