Article originally published in Legal Technology News. Reprinted with permission.
As the web provides new ways for us to connect and interact with others, new virtual worlds called Metaverses, built on virtual reality and augmented reality devices, are multiplying. Some iterations integrate virtual and physical spaces, and web3 technologies can be layered on top to enable fully decentralized virtual economies to thrive.
The Metaverse rapidly expands our understanding of physical limits, letting us spend time in virtual spaces that extend real life. This extended reality blurs the line between the physical and digital worlds. Companies anticipate that the Metaverse will become a place where people and businesses meet, work, play and transact in goods and services, some tangible, some not.
The recent enthusiasm for this immersive digital 3D experience has led more companies to consider a Metaverse strategy. Just recently, LEGO Group and Epic Games announced a partnership to shape the future of the Metaverse while making it safe and fun for families. The two companies aim to build an immersive, creativity-inspiring and engaging digital experience that kids of all ages can enjoy together.
However, privacy, safety and health concerns abound in the Metaverse, just as in the real-world playground. The success of any Metaverse project over the medium-term will be defined by its ability to maintain law, order, privacy and safety for all, particularly the target audience, especially where children under 16 are involved.
Successful entrepreneurs and investors in the Metaverse will do well to design legal compliance, privacy and safety into the roadmap of their projects from inception:
Develop strategies to deal with violations of personal limits. Since its inception, the internet has seen bad actors use its features to harass and intimidate other users. With the advent and growing use of virtual reality technologies, online harassment has become more pervasive and complex.
Companies entering the Metaverse should ask themselves: What happens if an individual's avatar interacts with another person's avatar in an unwanted manner? What will constitute "harassment" in a virtual space where people are no longer represented by mere usernames but by controllable avatars? What sanction is appropriate for an infraction? As both companies and consumers dive into this new world, there will unquestionably be growing pains, and companies should resolve these matters as quickly as possible.
Be especially careful with children. Kids enjoy playing in both digital and physical worlds. As the Metaverse evolves, it is vital to stay ahead on kids' privacy and well-being and deliver safe digital engagement. In his State of the Union address, President Biden stressed shielding children from online advertising and the effects of social media. Politicians have long been vocal about protecting children from the perceived harms of technology, and existing and proposed legislation such as COPPA and the CAMRA Act may extend to the Metaverse.
Children should have the right to play safely, and we should safeguard the privacy of children with extra vigilance. Companies can do more to design VR devices and metaverse systems with heightened safety mechanisms and minimal, user-centric data collection practices.
Remain transparent and inform users that they are interacting with AI bots. The Metaverse is populated by both human and AI entities. For transparency reasons, avatars and digital humans should be easy to identify, so users always know with whom they share their data. The aim should remain the protection of consumers’ privacy. However, companies must strike a balance between enabling privacy-focused policies and enabling a “free-for-all” environment where no one can be held accountable for their actions because no one knows one another.
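As one illustration of this transparency principle, a platform might tag every automated agent at the data-model level so that the disclosure cannot be silently dropped from the interface. The sketch below is hypothetical Python; the `Avatar` class and the "[AI]" marker are illustrative assumptions, not any platform's actual API.

```python
from dataclasses import dataclass

@dataclass
class Avatar:
    display_name: str
    is_ai: bool  # disclosed to every user the avatar interacts with

    def label(self) -> str:
        # Append a visible marker so users always know when they are
        # sharing data with an automated agent rather than a person.
        return f"{self.display_name} [AI]" if self.is_ai else self.display_name

print(Avatar("ShopAssistant", is_ai=True).label())  # hypothetical bot avatar
print(Avatar("alice_92", is_ai=False).label())      # ordinary human user
```

Keeping the flag on the data model, rather than leaving labeling to each client, makes the disclosure a property of the avatar itself.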
Understand biometrics and privacy laws. Biometrics is the measurement and statistical analysis of an individual's physical and behavioral characteristics. Business use of biometrics has become widespread, and the types of usage are constantly evolving. In response, many states have enacted laws to protect biometric data. Texas and Washington have broad biometric privacy laws, though neither creates a private right of action, while other states like Arizona and New York have enacted more narrowly tailored biometric privacy measures.
One of the strictest U.S. laws concerning biometric data is the Illinois Biometric Information Privacy Act (BIPA), which requires private entities that use biometric information to maintain a written policy establishing guidelines for permanently destroying that information. Collect only what you need, and allow for the right to be forgotten.
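BIPA's retention rule can be reduced to a simple test: destroy biometric records once the purpose for collection is satisfied or, at the latest, three years after the individual's last interaction with the entity. A minimal Python sketch of such a purge, assuming a hypothetical in-memory record store keyed by user ID:

```python
from datetime import datetime, timedelta

# BIPA caps retention at three years after the individual's last
# interaction; the purpose-satisfied trigger may come even earlier.
RETENTION = timedelta(days=3 * 365)

def expired(last_interaction: datetime, now: datetime) -> bool:
    return now - last_interaction > RETENTION

def purge(records: dict[str, datetime], now: datetime) -> dict[str, datetime]:
    """Permanently drop records whose retention window has lapsed."""
    return {uid: ts for uid, ts in records.items() if not expired(ts, now)}

now = datetime(2022, 6, 1)
records = {
    "user_a": datetime(2018, 1, 1),  # stale: well past three years
    "user_b": datetime(2021, 5, 1),  # still within the window
}
print(sorted(purge(records, now)))  # only user_b survives
```

In a real system the purge would also have to reach backups and third-party processors; this sketch only shows the retention test itself.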
Self-regulate data collection. The European Union's General Data Protection Regulation, or GDPR, sets strict rules for handling the data of individuals in the EU. In the United States, California, Colorado and Virginia are among the states that have already enacted privacy laws, with more to follow. Companies looking to take a position, create a strategy and leap into the Metaverse should treat privacy as an essential tool in their arsenal, because a consumer who trusts a company is more willing to share information.
What makes the Metaverse unique is the interaction among user, device and software, and the experiences created within. While the prospect of an individual's avatar purchasing real estate, buying shoes and playing in a virtual world is exciting, how much information companies put out there and collect will be perpetually scrutinized by regulators and privacy advocates as these virtual worlds become more ubiquitous.
Receive consumer consent. Consumers should weigh the personal risks of immersing themselves in the virtual world. Companies should have a consent mechanism that implements consumer data rights and educates users on privacy implications, and consent should be refreshed regularly rather than treated as perpetual permission. The California Consumer Privacy Act and the Colorado Privacy Act allow consumers to access, correct or delete their data, and any company seeking to operate in the Metaverse should have processes to comply with these requests from the outset.
Finally, real-world data breaches and security exploits remain a concern for all who venture into the Metaverse. As our lives continue to shift online, we have started to see the consequences of this transition. When your company ventures into the Metaverse, the priority should be to empower adults and children with tools that give them a say in, and control over, their personal digital experience.