Oculus Privacy Policy – ‘what you (and others) give us’

I always feel like somebody’s watchin’ me

And I have no privacy

I always feel like somebody’s watchin’ me

Is it just a dream?

– Rockwell

'I Agree', I think

Facebook has recently released its new VR (Virtual Reality) headset, the Oculus Quest 2. The Quest 2 is a powerful new budget VR headset that does not require an expensive gaming PC to run. It is a self-contained unit, with no wires tethering it to anything, and has an amazing display resolution of 1832 x 1920 pixels – per eye! All of this adds up to a more immersive and detailed virtual reality experience for the user.

Around this time, Facebook updated the Privacy Policy for its VR hardware division, Oculus. Privacy policies are rarely read in full, let alone understood [1, 9]. The updated privacy policy coincides with the release of this new VR headset, and users may not be fully aware of the implications of clicking “Agree” and giving their consent to these updates.

Along with the usual Facebook Terms and Conditions, the addendum to this privacy policy adds further complexity: Facebook now collects additional data. The full scope of the information being collected about you is immense – and Facebook now has your permission to know you even better.

So, what are they collecting?

Big(ger) Data

Have you ever considered how much data Facebook and its partners have already collected from you? ‘Big Data’ refers to large, and often complex, data sets. In this context, Big Data is the personal data sourced and scraped through your interactions with Facebook. The updated Oculus Privacy Policy states that the additional data Facebook now collects includes:

  • VR interactions you have with others, and their information.
  • Information about the devices connected to your account, such as the headset, your TV, mobile phone, local Wi-Fi networks and mobile communication towers.
  • Information on the sites you visit using the VR browser.
  • Objects you create in VR applications.
  • Information that other users may have about you.
  • And the very interesting, ‘Certain identifiers that are unique to you.’

Certain identifiers that are unique to you can include the physical environment around you, your voice, your movement characteristics, your physical features and even your size. They know you very well.

Connecting the dots

Your physical characteristics are only one piece of the human puzzle that is you. You might not really care that Facebook knows your hand size; however, your apps, digital assistants, websites, social media accounts and the devices you use are all ingesting your data. The more you feed them, the more they adapt to curating your online experience [2].

The data you disclose to Facebook may be shared or gathered through channels you are unaware of. For example, Facebook shares data across all its platforms, such as Instagram and WhatsApp. Facebook also shares some of this data with other interested third parties, including large companies such as Microsoft, Google and Tealium. Tealium, in turn, has its own data agreements with other major companies, including Amazon, Android, Twitter and Apple.

Big data is a big deal

Through complex and often secretive algorithms, your actions online can all be connected. Tealium owns patents for algorithms that specialise in Identity Resolution. Identity Resolution is a technique that matches pieces of information about you from various platforms to create a holistic model of your details and personality [3].
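To make the idea concrete, here is a minimal sketch, in Python, of how an identity-resolution step might link two profile fragments from different platforms. It is purely illustrative: the field names, weights and threshold are assumptions made for this example and do not describe Tealium’s or Facebook’s actual algorithms.

```python
# Hypothetical illustration of identity resolution: merging profile
# fragments from different platforms into one "holistic" identity.
# Field names, weights and the threshold are illustrative assumptions.
from difflib import SequenceMatcher


def name_similarity(a: str, b: str) -> float:
    """Fuzzy similarity between two display names (0.0 to 1.0)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


def match_score(profile_a: dict, profile_b: dict) -> float:
    """Weighted evidence that two profiles belong to the same person."""
    score = 0.0
    # A shared (hashed) email address is strong evidence on its own.
    if profile_a.get("email_hash") and profile_a.get("email_hash") == profile_b.get("email_hash"):
        score += 0.6
    # Overlapping device or advertising identifiers add further evidence.
    if set(profile_a.get("device_ids", [])) & set(profile_b.get("device_ids", [])):
        score += 0.3
    # Similar display names contribute weaker, fuzzier evidence.
    score += 0.1 * name_similarity(profile_a.get("name", ""), profile_b.get("name", ""))
    return score


def same_person(profile_a: dict, profile_b: dict, threshold: float = 0.7) -> bool:
    """Link the two profiles if the combined evidence crosses the threshold."""
    return match_score(profile_a, profile_b) >= threshold


# Example: a social media profile and a VR headset account.
social = {"name": "Jane Citizen", "email_hash": "ab12", "device_ids": ["phone-01"]}
vr = {"name": "J. Citizen", "email_hash": "ab12", "device_ids": ["phone-01", "quest-2"]}
print(same_person(social, vr))  # True: the two fragments are merged into one identity
```

Real identity-resolution systems use far richer signals and statistical models, but the principle is the same: each extra data point, including the physical identifiers collected by a headset, makes the match more certain.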

The aggregation of your personal data could contain sensitive information drawn from your web browsing history, and this raises real privacy concerns [4]. Oculus’s additional data sets, which now include your physical characteristics, give Facebook’s algorithms even more power to build a remarkably accurate picture of you.

Uncovering problems

Algorithms are, at their core, mathematical constructs designed by humans. They can be useful for surfacing social connections or suggesting items you may like to purchase. However, these data sets are large, and the algorithms used to sort and learn from them may also bring human assumptions and problems to the surface.

Algorithmic problems can amount to representational violence; in other words, the bias contained within algorithms can cause harm by reinforcing inequalities of race, culture, gender and economic standing [10]. Bias is realised through machine learning based on the data it is fed [5]. Implicit assumptions, whether human or machine, rise to the surface, leading to profiling inequalities [6]. There are many examples of this, from Airbnb hosts rejecting renters based on their race, sexual orientation or gender identity [7], to simple cases of economic discrimination based on app usage data [8]. A simple sketch of this failure mode follows below.
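The point that bias is ‘realised through machine learning based on the data it is fed’ can be shown with a deliberately simple sketch. The groups, numbers and decision rule below are invented for illustration only; real systems are far more complex, but the underlying failure mode is the same: a model trained on skewed historical decisions reproduces the skew.

```python
# Hypothetical illustration of how bias enters machine learning:
# a model trained on skewed historical decisions simply reproduces them.
# The groups, counts and decision rule are invented for this example.
from collections import defaultdict

# Historical decisions: (group, approved) pairs with a built-in skew.
history = (
    [("group_a", True)] * 80 + [("group_a", False)] * 20
    + [("group_b", True)] * 30 + [("group_b", False)] * 70
)

# "Training": learn the approval rate observed for each group.
approved = defaultdict(int)
total = defaultdict(int)
for group, outcome in history:
    total[group] += 1
    approved[group] += outcome  # True counts as 1, False as 0


def predict(group: str) -> bool:
    """Naive model: approve whenever the historical approval rate exceeds 50%."""
    return approved[group] / total[group] > 0.5


print(predict("group_a"))  # True: the historical advantage is reproduced
print(predict("group_b"))  # False: and so is the historical disadvantage
```

No one has to program the discrimination explicitly; feeding the model biased data is enough.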

This VR headset’s discomfort
A Privacy Paradox

The Oculus Quest 2 VR headset has certain unique requirements the user must meet for it to function, the most significant being that the user must agree to this addendum to Facebook’s Privacy Policy. Facebook and their hardware hold a structural power over you.

There is a privacy paradox here. You may value your privacy; however, if you own this device and want to use it, you must give away some of your rights to that privacy. If this sits a little uneasily with you, what can you do? Unfortunately, in this instance, not a great deal.

Stop looking over my shoulder

It is not all doom and gloom. These additions to Facebook’s privacy policies are in place to ensure the safety of its users [11], so you may just want to accept them. However, setting the Oculus Quest aside for a minute, you can be more proactive in your awareness of what data you share online.

One positive step is to limit the amount of information collected across all your devices. It will take a little bit of time, but if you make your right to privacy a high priority, there are some constructive steps you can take to protect it.

You can limit how Facebook shares your data with their privacy check-up tools.

Google has similar tools.

Check how other applications or online services are collecting your data and how their respective privacy policies stack up at Terms of Service; Didn’t Read.

Or, for a good overview and further reading on the issues surrounding data, algorithms and privacy you can visit the Future of Privacy Forum.

Footnotes

  1. Bechmann, A. (2015). Non-Informed Consent Cultures: Privacy Policies and App Contracts on Facebook. Journal of media business studies, 11(1), 21-38. https://doi.org/10.1080/16522354.2014.11073574
  2. Osoba, O. A., & Welser IV, W. (2017). An Intelligence in Our Image: The Risks of Bias and Errors in Artificial Intelligence. RAND Corporation. https://doi.org/10.7249/RR1744
  3. Srivastava, D. K., & Roychoudhury, B. (2020). Words are important: A textual content based identity resolution scheme across multiple online social networks. Knowledge-Based Systems, 195, 105624. https://doi.org/10.1016/j.knosys.2020.105624
  4. Peled, O., Fire, M., Rokach, L., & Elovici, Y. (2016). Matching entities across online social networks. Neurocomputing (Amsterdam), 210, 91-106. https://doi.org/10.1016/j.neucom.2016.03.089
  5. Turner Lee, N. (2018). Detecting racial bias in algorithms and machine learning. Journal of Information, Communication & Ethics in Society, 16(3), 252-260. https://doi.org/10.1108/JICES-06-2018-0056
  6. Daniels, J. (2015). “My Brain Database Doesn’t See Skin Color”: Color-Blind Racism in the Technology Industry and in Theorizing the Web. American Behavioral Scientist, 59(11), 1377-1393. https://doi.org/10.1177/0002764215578728
  7. Murphy, L. (2016). Airbnb’s work to fight discrimination and build inclusion. Airbnb Blog. https://goo.gl/RUXc6j
  8. Think Again: Big Data. (2013). Foreign Policy. https://foreignpolicy.com/2013/05/10/think-again-big-data/
  9. Obar, J. A., & Oeldorf-Hirsch, A. (2020). The biggest lie on the Internet: ignoring the privacy policies and terms of service policies of social networking services. Information, Communication & Society, 23(1), 128-147. https://doi.org/10.1080/1369118X.2018.1486870
  10. Hoffmann, A. L. (2020). Terms of inclusion: Data, discourse, violence. New Media & Society. https://doi.org/10.1177/1461444820958725
  11. Road to VR. https://www.roadtovr.com/facebook-expanded-vr-policies-oculus-quest-2-privacy-policy-terms-of-service/
