Recommendations for developing governance
Design choices made for metaverse applications, platforms and services control how these systems operate: whether they act lawfully, how they achieve their purpose, and how they distribute harms and benefits to those they impact. Governance processes help ensure these design decisions are made with appropriate authority, information, documentation and oversight.
Given the early stage of the NSW Government's engagement with the metaverse, and of metaverse development generally, it may be premature to immediately implement new governance processes for departments undertaking metaverse work. However, we do recommend that the government begin developing elements that can later form part of an established metaverse governance process. This section presents recommendations for such elements.
Drawing from experience with AI governance
Because of the parallels between AI and metaverse technologies, the NSW Government's May 2022 NSW AI Assurance Framework [186] could serve as a helpful starting point on which to model future metaverse governance.
AI and metaverse technologies are analogous in several respects:
- both are complex, novel technologies with significant scope for benefit and harm
- both have ill-defined boundaries, with no clear definition on exactly which systems should 'count' as AI or the metaverse
- both require specialised, multi-disciplinary expertise to implement, which is not widely available within government
- both often require procurement of products from third parties
- both involve the collection and use of users' personal data.
The AI Assurance Framework is already well-designed to address the above challenges, making its use as a template for metaverse governance compelling.
Establishing a metaverse review committee
Again drawing on the government's experience with AI, the NSW Government's AI Review Committee could serve as a template for a metaverse review committee. This committee would consist of experts from government and industry across disciplines, and oversee the legal, ethical and technical design and impact of the government's metaverse applications.
Initially, such a committee could perform an advisory function. As the government's metaverse projects move beyond the pilot stage, the committee could be required to examine and approve higher-risk projects, in a similar manner to the existing AI Review Committee.
Relevant fields of expertise that members of the committee could be selected to cover include:
- cybersecurity
- psychology and user experience
- ethics
- artificial intelligence
- law, regulation and human rights.
Developing responsible metaverse principles
Defining aspirational principles that capture what ethical or responsible metaverse applications look like is a potential first step towards preventing harm through concrete guidance and regulation. The NSW Government took this same approach with artificial intelligence: beginning with high-level statements about what ideal deployments of AI look like, before moving to more concrete governance measures that manage practical considerations.
Principles developed for an AI setting, such as:
- fairness (ensuring the systems do not discriminate against certain individuals or groups)
- accountability (ensuring that the owner of a system is accountable for its impact)
- transparency (ensuring that information about the system's operation, including how it works and its benefits and risks, is available to the right audiences)
- positive impact (ensuring that the system acts in the best interest of its users and does not cause unnecessary or unintended harm)
- privacy (ensuring that the system manages personal data safely and securely)
apply as well to metaverse applications as they do to AI systems (or indeed to any system).
More specific principles, intended for organisations developing platforms for online communication (which would include metaverse applications), have been developed by the Australian eSafety Commissioner. [187] These principles emphasise:
- service provider responsibility
- user empowerment and autonomy
- transparency and accountability.
Being general in nature, principles typically need to be augmented with more detailed guidelines or rules in order to inform specific design decisions.
Documenting and recording key design trade-offs
One reason that principles are not generally useful for helping with practical design decisions is that they tend not to acknowledge the central fact of practical ethics: that most real decisions involve balancing trade-offs between different, competing harms and benefits.
Explicit requirements to document and balance potential trade-offs could form part of ensuring responsible use of metaverse technologies. The trade-offs could be framed through the competing interests of involved parties, such as a metaverse application developer, a NSW resident user, and the NSW Government regulator.
More specific trade-offs common to metaverse applications could also be documented and used as a way to inform key design decisions.
Example trade-offs in metaverse applications
- Anonymity can enable whistleblowing and political activism but also make it difficult to police threats, abuse and misinformation.
- Improving a user's sense of presence amplifies both positive and negative experiences.
- More aggressive content moderation creates greater protection from inappropriate or abusive content and behaviour, but also risks wrongly stifling legitimate expression.
- Interoperability empowers users to move more freely between service providers but may make commercialisation more difficult.
- Data collection about users enables personalisation but erodes privacy.
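Such documentation could take a structured form so that trade-off decisions are auditable and comparable across projects. The sketch below is purely illustrative; the record fields and the example resolution are assumptions, not an established NSW Government schema:

```python
from dataclasses import dataclass

@dataclass
class TradeOffRecord:
    """Illustrative record for documenting a design trade-off."""
    decision: str                # the design choice being considered
    benefits: list[str]          # who gains, and how
    harms: list[str]             # who bears the cost, and how
    affected_parties: list[str]  # e.g. user, developer, regulator
    resolution: str              # the balance chosen, and why

# Hypothetical entry for the anonymity trade-off listed above.
anonymity = TradeOffRecord(
    decision="Allow anonymous avatars in public virtual spaces",
    benefits=["Protects whistleblowing and political activism"],
    harms=["Harder to police threats, abuse and misinformation"],
    affected_parties=["NSW resident user", "application developer",
                      "NSW Government regulator"],
    resolution="Anonymity permitted, mitigated by abuse reporting "
               "and active moderation tooling",
)
```

A register of such records would give a review committee a concrete artefact to examine when assessing how a project balanced competing interests.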
Defining ethical requirements for NSW Government metaverse applications
The following rules developed by Dr Louis Rosenberg (Chief Scientist for the Responsible Metaverse Alliance) are intended for large providers of metaverse virtual worlds such as Meta. Rosenberg developed these requirements as a starting point for metaverse platform regulation. [188]
The rules are intended to control some of the specific metaverse risks examined in 'The metaverse's novel opportunities and risks', including:
- manipulation of user behaviour through collection of their biometric data and the use of AI to understand and control their emotional state
- fabrication of reality through undisclosed product placement and AI avatars that interact with users, similar to 'bots' in current social media
- privacy risks associated with the collection of behavioural and physiological data.
Transparency rules
Platforms must disclose:
- what user behaviour is tracked, and when
- if AI is being used for emotion or sentiment analysis
- what aspects of the world are being injected on behalf of a paying third party (product placement) and the identity of that third party
- when an avatar is being controlled by an AI agent or by a human being
- when a user is being targeted by a promotional conversation or interaction on behalf of a third party, and the identity of that third party.
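One way a platform could operationalise these disclosures is to attach machine-readable metadata to each avatar or interaction. The sketch below is a hypothetical illustration; the schema and field names are assumptions, not part of Rosenberg's rules:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AvatarDisclosure:
    """Hypothetical per-avatar disclosure record covering the
    transparency rules above."""
    controlled_by: str               # "human" or "ai_agent"
    sentiment_analysis_active: bool  # AI emotion/sentiment analysis in use
    promotional: bool                # interacting on behalf of a third party
    sponsor_identity: Optional[str]  # required when promotional is True

    def is_valid(self) -> bool:
        # A promotional interaction must name the paying third party.
        return (not self.promotional) or (self.sponsor_identity is not None)

# A disclosed AI agent conducting sponsored conversation.
bot = AvatarDisclosure(
    controlled_by="ai_agent",
    sentiment_analysis_active=False,
    promotional=True,
    sponsor_identity="Example Pty Ltd",
)
```

A record like this could be surfaced to users in-world and logged for audit, making compliance with the transparency rules verifiable rather than purely declarative.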
Data collection and use rules
Platforms must:
- limit storage of tracking data
- not store user emotion or sentiment data
- not track vital signs or other physiological data for any non-medical purpose.
Manipulation-free interaction rules
Platforms must not:
- engage in behavioural profiling of users
- use promotional AI agents that react to users' emotions in real time.
In addition to informing future regulation discussions, the NSW Government could draw from these rules to create a set of red lines for metaverse applications with which it is involved. Such red lines would inform the government's own development of metaverse applications as well as engagement with other metaverse providers.
Developing procurement guidelines and partnership strategies
The NSW Government will likely need to undertake procurement to develop metaverse applications, due to the specialised expertise and resources required. Developing clear procurement guidelines and partnership strategies will help the government align the design of an application to its objectives and control risk.
Procurement guidelines can be integrated with existing processes
The NSW Government already has an active ICT procurement process. We recommend integrating the new metaverse procurement requirements with these existing processes. An example is adding a new 'Category S' in the ICT Services Scheme for 'metaverse services' that suppliers can register under. Subcategories could include:
- AR and VR software-as-a-service
- AR and VR hardware and support services
- 3D modelling software and capability
- metaverse application testing services.
Requiring transparency from procured systems
As the government is fully accountable for systems it deploys to end users, even where a system involves third-party elements, transparency of procured services is crucial. Transparency is necessary for the government to understand and exercise oversight of the metaverse applications it procures. Information such as how and when data is collected and stored, how the product works, how it is tested, and its potential negative impacts is as critical for metaverse applications as it is for AI systems or any other large IT system. The provision of such information to the government could be mandated as part of the procurement process.
[186] Digital NSW. NSW AI Assurance Framework. NSW Government - Digital NSW (2022).
[187] eSafety Commissioner. Principles and background. Australian Government - eSafety Commissioner.
[188] Rosenberg, L. Regulation of the Metaverse: A Roadmap (2022).