Attorneys Call for Proactive Privacy Legislation as Metaverse Gains Popularity Among Children

WASHINGTON, January 18, 2023 — As immersive virtual reality technologies gain popularity among children and teenagers, there is an increasing need for legislation that specifically addresses the industry’s unprecedented capacity for data collection, said attorneys at a Practising Law Institute webinar on Friday.

Without downplaying the potential benefits of “metaverse” technology, it is important to understand how it differs from the current internet and how those differences will affect children, said Leeza Garber, a cybersecurity and privacy attorney.

“When you’re talking about being able to feel something with the haptic gloves, which are in advanced states of development, or even an entirely haptic suit, you’re talking about the potential for cyberbullying, harassment, assault to happen to minors in a completely different playing field — where right now there’s not so much proactive legislation,” Garber said.

Although the metaverse is often framed as a thing of the future, it actually just entails “an immersive, visual, virtual experience,” said Gail Gottehrer, founder and chairperson of the New York State Bar Association’s cybersecurity subcommittee.

Defined as such, the metaverse has already gained widespread popularity. “The next generation of children will spend approximately 10 years in virtual reality in their lives. So that’s the equivalent of around two hours and 45 minutes per day,” Gottehrer said, citing research from the Institution of Engineering and Technology.

The user base of one such platform, Roblox, “includes 50 percent of all kids under the age of 16 in the United States — so it’s huge for minors,” Garber said.

For a generation that has grown up with social media integrated into everyday life, the “interplay of personal data with gaining the benefit of using this type of platform is just simply accepted,” Garber added. “We have to be more proactive in a space where this new iteration of the internet will have the capacity to take in so much more data.”

‘Staggering’ amount of data collected in the metaverse

The data collected by metaverse technology is “staggering,” Gottehrer said. Virtual reality equipment can track eye and head movements, heart rate, muscle tension, brain activity and gait patterns. After just a few minutes of use, the “motion signature” created by this data can identify a user with 95 percent accuracy, she said.

The same data can also reveal neurodivergence and some disabilities that affect movement, such as Parkinson’s disease.

“If you’re a child and this data is already being collected on you, where might that down the road follow you in your life?” Gottehrer asked.

Only a handful of states have specific regulations governing the collection of biometric data, but Garber predicted that more will pass similar legislation, albeit “at a glacial pace.”

However, many experts worry that it will not be fast enough, particularly when it comes to protecting children’s digital privacy. “While we know technology moves at a pace that’s much faster than courts or litigation, there’s really a concern that [the Children’s Online Privacy Protection Act, or COPPA] is dragging behind,” Gottehrer said.

Compounding these concerns is the confusion over who should be setting these regulations in the first place. In September, as privacy legislation stalled in Congress, Sens. Ed Markey, D-Mass., and Richard Blumenthal, D-Conn., wrote a letter urging the Federal Trade Commission to use its regulatory authority to update COPPA.

The letter “does not send a great message,” Garber said. And without decisive government action, tech companies hold outsized power to set the standards and practices that will shape the industry’s future regulation.

“Self-regulation by metaverse stakeholders — is that viable? Is that advantageous?” Gottehrer asked. “I think it’s safe to say we have not seen tremendous success at self-regulation of the current version of the internet — that might be a dramatic understatement.”

For an example of how companies can fail to proactively protect underage users, Gottehrer pointed to Instagram. According to internal documents reported by the Wall Street Journal in September 2021, Facebook’s own research had shown for some time that Instagram was harmful to the mental health of teenage users, yet the company continued to develop products for an even younger audience.

“All of these issues become amplified in an environment where you’re supposed to be completely immersed,” Garber said.
