Professor proposes making virtual reality more secure


Yan Shvartzshnaider, an assistant professor in the Department of Electrical Engineering & Computer Science at York University’s Lassonde School of Engineering, is one of many researchers working to address virtual privacy concerns and develop practical solutions for growing cybersecurity needs.

Yan Shvartzshnaider

Imagine travelling to the peak of Mount Everest to observe the mountainous region, or floating in space to study the stars, constellations and planets – all during a class at school. Virtual reality (VR) technologies have made immersive learning experiences like these possible, expanding the opportunities available in traditional education settings.

However, though convenient and often exciting, some VR technologies pose significant privacy risks. “VR technologies are powered by an array of sensors and collect large amounts of data about users and their surrounding environment,” says Shvartzshnaider. “The extensive data collection practices intrinsic to VR technology expose users to a range of novel privacy and security threats.”

Shvartzshnaider and Karoline Brehm, an international exchange master’s student at York from Bauhaus-Universität Weimar, recently co-authored a paper titled “Understanding Privacy in Virtual Reality Classrooms,” which explores privacy concerns around VR platforms in education settings and proposes a framework to address those challenges. “As technology develops and becomes mainstream in established contexts like education, workplaces and health care, we need to examine and mitigate the associated privacy risks,” says Shvartzshnaider.

One reason concerns arise around virtual reality’s use in education is that VR platforms place users in a real and a virtual environment simultaneously, and each environment may be governed by different privacy norms. For example, a student who uses VR technology to attend a virtual lecture from home occupies two vastly different environments at once. Many VR platforms do not recognize these differences and may collect sensitive information about the user’s home environment, a privacy violation the user may not even be aware of.

The paper presents a framework to help examine and address such privacy risks. First, VR providers should gather information from stakeholders – such as faculty and students – about their privacy expectations and the context in which the technology will be used. Next, providers should consider how the VR technology might affect those expectations and what data it gathers, then examine the technology to identify potential privacy violations. In the final proposed step, VR companies would apply the gathered information to inform solutions, ensuring the technology adheres to privacy expectations and avoids breaches.
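To make those steps concrete, here is a minimal sketch in Python of how such an analysis could be operationalized. It is not from the paper: all names, data types and expectation values are hypothetical illustrations. It records stakeholder privacy expectations for a given environment, then checks a VR platform’s observed data-collection practices against them to flag potential violations that could inform mitigations.

```python
# Hypothetical sketch: check a VR platform's data flows against stakeholder expectations.
from dataclasses import dataclass


@dataclass(frozen=True)
class Expectation:
    """A stakeholder's expectation: may `data_type` be collected in `context`?"""
    context: str      # e.g. "virtual lecture" or "home environment"
    data_type: str    # e.g. "head movement" or "room audio"
    permitted: bool


@dataclass(frozen=True)
class DataFlow:
    """An observed data-collection practice of the VR platform."""
    context: str
    data_type: str


def find_violations(flows: list[DataFlow],
                    expectations: list[Expectation]) -> list[DataFlow]:
    """Flag flows that stakeholders did not permit or that no expectation covers."""
    allowed = {(e.context, e.data_type) for e in expectations if e.permitted}
    return [f for f in flows if (f.context, f.data_type) not in allowed]


# Step 1: expectations gathered from faculty and students (illustrative values).
expectations = [
    Expectation("virtual lecture", "head movement", permitted=True),
    Expectation("home environment", "room audio", permitted=False),
]

# Step 2: data the platform actually gathers in each environment.
flows = [
    DataFlow("virtual lecture", "head movement"),
    DataFlow("home environment", "room audio"),
]

# Steps 3-4: identify violations, which would then inform solutions.
for v in find_violations(flows, expectations):
    print(f"Potential privacy violation: {v.data_type} collected in {v.context}")
```

In this toy run, collecting head-movement data during the virtual lecture matches a stated expectation, while capturing room audio from the home environment is flagged, mirroring the home-versus-classroom mismatch described above.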

Shvartzshnaider’s proposed framework – and the recently published paper overall – are part of ongoing contributions to global cybersecurity, especially as emerging technologies like VR and artificial intelligence bring new privacy concerns along with their innovations.