A live concert performed in the metaverse raises many legal and regulatory questions.
How is metaverse regulation organized? This question was addressed at Osborne Clarke’s recent Metaverse Week event, where a case study of a live gig in the metaverse helped shed light on new and upcoming legislation and regulation, and its implications for metaverse creators and users. Those implications are broad, spanning broadcast and media regulation, intellectual property (IP), artificial intelligence (AI), data and privacy regulation, and content and interaction.
Media and Intellectual Property
If a live concert is planned in the metaverse, what issues arise from a media regulation and intellectual property perspective? The first is rights clearance. Taking the example of a live concert, if you want to use other people’s music in the metaverse, you may need specific agreements on how that content is licensed (music licenses are typically granted on a territorial basis, so consider whether the license needs to cover the metaverse expressly).
Censorship and content standards are another important area. Legal and compliance teams need to be aware of the types of content that are censored or restricted in certain countries, and to consider what artists may or may not include in their performances.
Video and media regulations also need to be addressed. Currently there are no metaverse-specific laws, but, as with any new development, a web of existing laws can already apply. In particular, because the metaverse is an audiovisual (AV) format, consideration should be given to how its content might interact with existing AV regulation. The benchmark for audiovisual regulation in the EU is the Audiovisual Media Services Directive, which was updated in 2018 and whose national implementation is still ongoing. In addition, metaverse platform providers need to establish whether their platform qualifies as a video-sharing platform service.
Data and AI
What data and AI regulatory frameworks in the UK and EU are likely to apply to a metaverse gig? Metaverse data is generally covered by the General Data Protection Regulation (GDPR) in the UK and EU, although little in the GDPR is specific to AI. The GDPR still has major implications for all metaverse participants, as far more participant data will be collected at a metaverse gig than at a physical one.
Metaverse data privacy issues can be seen as an extension of the rules already in place for the internet: providers must ensure that protocols are in place, that data is collected fairly and transparently, that privacy notices are provided, and that consent is obtained for tracking technologies.
Platform providers, sellers of virtual art, non-fungible tokens and other virtual objects, and payment providers who share personal data need to ensure they have contracts in place that properly address data privacy. They should treat this as a compliance exercise and be aware of the potential risks involved.
The EU’s proposed Artificial Intelligence Act applies different levels of regulation based on the perceived risk posed by each type of AI. Some metaverse-relevant uses of AI may fall into the unacceptable or high-risk, heavily regulated brackets if they involve subliminal, manipulative or exploitative techniques that cause harm, or involve automated facial recognition or other biometric data.
Content and interactions
The development of content and interaction within the metaverse will bring many privacy challenges. Online harms have traditionally been the focus of media regulators, but privacy regulators have also turned their attention to this area.
Protecting minors is a global privacy challenge, and many authorities have issued guidance. The UK regulator, the Information Commissioner’s Office, has published the Age Appropriate Design Code, which sets out 15 standards that online services must follow.
Digital content and personal devices are changing the way we consume content and bringing new ways to protect minors. These include clear and visible content labels, technical filters, and video-streaming services that display age ratings before users click play. In the gaming industry, codes of conduct, moderation policies and procedures are put in place to tackle disruptive conduct.
In the metaverse, media consumption occurs in a shared space, which brings new challenges, particularly around the ability to interact with a much wider range of people.
Toxic behavior: evolving responses
The response to toxic users and disruptive behavior in the metaverse is still evolving. Developing reporting mechanisms, rules of conduct and general approaches to problematic conduct will present new and interesting challenges. As always with new developments, it is important to watch how new laws and regulations address them.
In general, there are three types of content online. Some content is banned outright because it violates criminal laws. Some content carries age restrictions and requirements. And some content is appropriate for every audience.
Currently, regulation primarily focuses on service providers; however, the rules are changing to capture other players. For example, the French Senate passed a bill to strengthen parental controls on the internet, which would require operating systems to install parental controls on devices by default. Similar legislation has been discussed in Germany.
Online harm is an area particularly scrutinized by media regulators, data privacy authorities, law enforcement and consumer protection bodies. Data protection authorities are primarily focused on protecting minors, and this focus will intensify as users generate more data. This is leading to national legislation in the EU and around the world aimed at ensuring safety and reducing harm online.
Comment by Osborne Clarke
What does all this mean for companies operating in the metaverse? These questions are global. Similar regulation is being debated outside the UK and Europe: for example, California lawmakers are working on a proposed Age-Appropriate Design Code Act, and other US initiatives are taking shape around federal AI regulation and reforms to safe-harbor protections for user-generated content.
The “meta conclusion” is that it makes sense to approach compliance through a scheme and framework that can address these various issues while remaining flexible enough to accommodate differences in regional and national legislation.