
Identifying XR Risks in Industrial Training and Remote Support

Extended Reality (XR) brings a wealth of new possibilities to the way people work, learn, and interact. By blending physical and digital worlds, XR enables individuals to visualise complex tasks, practise skills in lifelike scenarios, and collaborate remotely as if they were in the same room. From simulating complex industrial environments without risk to students or physical equipment, to overlaying real-time instructions on an engineer’s field of vision, XR is rapidly transforming training and support processes across sectors. However, alongside these exciting opportunities come risks that, if left unaddressed, could undermine safety, productivity, and user trust. For decision-makers in industry and education, understanding these risks early is critical to avoid costly setbacks during rollout and use.

The MOTIVATE XR Project: A Proactive Approach to Risk Assessment

The Horizon Europe project MOTIVATE XR explores how XR can be used for industrial training and remote support of maintenance and evaluation tasks – and especially how AI can simplify the creation of relevant content. To ensure the responsible adoption of XR technologies, MOTIVATE XR emphasises the fundamental need to assess not only their intended benefits but also their unintended and often unforeseen consequences. As XR reshapes how industrial training and remote support are delivered, it may also introduce new forms of cognitive strain, alter social dynamics in the workplace, and raise concerns about safety, data privacy, surveillance, and equity in access. The MOTIVATE XR project is committed to identifying and addressing potential risks early on, recognising that a forward-looking approach to technology development must include safeguards for human values, safety, and trust. To support this work, Delft University of Technology has carried out comprehensive risk identification for XR technologies in the MOTIVATE XR context.

Identifying and Classifying XR Risks: A Multi-Input Approach

To identify risks associated with XR in industrial training and remote support, insights from structured risk workshops with MOTIVATE XR partners were combined with a large-scale review of scientific literature. Using YAGHMA’s tailored analytical methods, statements pertaining to a core taxonomy of risks were extracted from more than 200,000 academic publications. This taxonomy of risks arising from new technologies keeps the focus on the social and ethical dimensions of the identified risks. At its base there are seven categories: Environmental, Economic, Health and Well-Being, Infrastructure, Policy and Governance, Research and Development, and Social. Using tailored techniques based on vector representations, thematically similar risks were clustered and refined. Subject-matter experts ensured semantic accuracy throughout the process, resolved misclassifications, and clarified overlapping themes. They furthermore checked relevance to MOTIVATE XR, following a strategy of including borderline cases. This multi-input method ensures that the identified risks reflect both the MOTIVATE XR partners’ insights and state-of-the-art understanding, while remaining comprehensible as a set of 50 clearly stated risks: a tailored map of concern areas likely to be relevant to the project’s technologies and their implementation.
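As a rough illustration of the clustering step described above, the sketch below groups a handful of hypothetical risk statements by the similarity of their vector representations. It is not YAGHMA’s actual pipeline: it assumes the open-source sentence-transformers and scikit-learn libraries, an off-the-shelf embedding model, and an arbitrary distance threshold, all chosen purely for illustration.

```python
# Minimal sketch: cluster thematically similar risk statements by embedding similarity.
# NOT the project's actual method; libraries, model and threshold are assumptions.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import AgglomerativeClustering

# Hypothetical statements extracted from the literature.
statements = [
    "Extended headset use causes eye strain and motion sickness.",
    "Prolonged immersion leads to visual fatigue in trainees.",
    "Biometric data captured by XR devices raises surveillance concerns.",
    "Behavioural tracking in headsets threatens worker privacy.",
    "High hardware costs exclude small enterprises from XR training.",
]

# Represent each statement as a dense vector (model choice is an assumption).
model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(statements, normalize_embeddings=True)

# Group statements whose vectors are close in cosine distance.
clustering = AgglomerativeClustering(
    n_clusters=None, distance_threshold=0.6, metric="cosine", linkage="average"
)
labels = clustering.fit_predict(embeddings)

for label, text in sorted(zip(labels, statements)):
    print(label, text)
```

In practice, the embedding model, distance metric, and threshold would be tuned to the corpus, and expert review would still be needed to validate, merge, and label the resulting clusters, as described above.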

Common and Widely Acknowledged XR Risks

Our analysis confirmed that certain categories of risk are frequently discussed in the literature and widely acknowledged across domains where XR is being implemented. These risks are particularly relevant to industrial training and remote support, and decision-makers should be prepared to address them from the outset. 

Physical discomfort and fatigue remain among the most cited concerns. Extended use of XR headsets can lead to motion sickness, eye strain, and overheating, especially in high-fidelity simulations. These physiological symptoms not only impair user experience but can also reduce task performance, limit session duration, and in some cases lead to safety incidents if users become disoriented or fatigued. These effects are amplified in scenarios requiring precision or long periods of focus, such as maintenance procedures or equipment evaluation tasks.

There is growing concern around the psychological and social effects of immersive environments. The immersive nature of XR may blur the boundary between virtual and real-world experiences, with consequences ranging from mild anxiety to impaired decision-making and distorted perception of reality. Prolonged or emotionally intense XR experiences can affect users’ mental well-being and even alter their self-perception. In collaborative settings, the mediated interaction inherent in XR can lead to reduced empathy, miscommunication, and social detachment, especially when non-verbal cues are lost or misrepresented. These dynamics can be particularly challenging in high-stakes or trust-dependent environments.

Several persistent structural and technical barriers risk limiting XR’s equitable impact. XR systems often rely on extensive data collection, including biometric and behavioural data, which raises serious concerns about privacy, surveillance, and cyber harassment, especially in employer-managed environments. Meanwhile, high development and maintenance costs create financial barriers that disproportionately affect small businesses, educational institutions, and under-resourced sectors. In parallel, limitations in current XR fidelity and technical constraints may reduce training realism, create over-reliance on digital cues, and ultimately weaken real-world preparedness. When XR is used as a substitute rather than a supplement to hands-on experience, it can hinder skill acquisition, retention, and transfer.

Context-Specific Risks in Industrial XR

While the commonly discussed risks provide a crucial starting point, they do not represent the full range of challenges identified through our analysis. In the MOTIVATE XR context, where the focus is on industrial training and remote support, several additional risks emerged that are less prominent in mainstream XR discourse but highly relevant in practice. The examples below illustrate some of these context-specific concerns, particularly those tied to integration into workflows, hands-on skill development, legal preparedness, and operational reliability. They represent a subset of the broader risk landscape uncovered during the project.

One pressing concern is the risk of workforce deskilling through over-reliance on XR technologies. As XR is used to guide workers through complex tasks step by step, it can reduce the need to internalise procedures or develop deeper troubleshooting skills. This may lead to lower long-term competence, diminished adaptability, and ultimately an unprepared workforce in a rapidly changing industrial landscape. Particularly in remote support settings, where technicians rely on visual overlays or AI-generated suggestions, real-world experience can gradually erode, widening skills gaps rather than closing them.

Another critical but under-recognised issue is the absence of preparedness for the legal, pedagogical, and certification challenges introduced by XR. Many organisations are still unfamiliar with intellectual property rights for XR-generated content or with best practices for XR-based teaching. Instructors may lack the training to effectively facilitate learning in immersive settings, leading to confusion or subpar outcomes. Moreover, if XR-based training and certifications are rolled out without updating assessment methods and criteria, there is a risk that qualifications will no longer reflect actual competence, undermining trust in certification systems and the workers they certify. 

The technical and infrastructural risks of XR also take on heightened importance in industrial contexts. XR systems that require stable wireless connections, high processing power, or real-time AI support can strain existing IT infrastructure, especially during multi-user sessions or in field environments. A dropped connection or system glitch during a critical support task may delay operations or cause safety issues. In addition, integrating XR into existing workflows can be less efficient than anticipated, particularly for tasks that still require a high degree of tactile feedback or hands-on manipulation. In some cases, XR may inadvertently reduce the quality of interaction rather than enhance it.

Finally, XR introduces a set of security, ethical, and legal ambiguities that cannot be ignored. Data gathered through XR platforms, often rich in biometric, behavioural, and contextual information, raises serious privacy concerns, especially in the absence of clearly communicated usage policies. Surveillance-like functionality embedded in XR for monitoring trainees or employees can blur work-life boundaries and erode trust. Meanwhile, the lack of established liability frameworks means that if an accident occurs due to AI-driven guidance or hardware failure, neither trainees nor employers may know where responsibility lies. In the worst case, this can lead to prolonged legal uncertainty, reputational damage, or even physical harm.

Interconnected Risks and the Need for Holistic Evaluation

Across all findings, one pattern stands out: risks associated with XR do not occur in isolation. A seemingly minor technical flaw, such as a delayed frame or unstable connection, can ripple into motion sickness, lost productivity, or even legal liability if errors occur in critical environments. Users still trip, collide, or over-exert, but immersion hides real-world cues, raising the severity of such incidents. Many risks are deeply interlinked, cascading across domains. Moreover, the assumption that immersive technologies inherently improve training or collaboration does not always hold; in practice, XR-based sessions may reduce efficiency, increase fatigue, or complicate logistics, particularly when deployed at scale or in environments with limited infrastructure. Psychological safety, once considered a secondary concern, is now at the forefront. Similarly, equity must be redefined beyond cost, availability, or infrastructure needs: devices that exclude wheelchair users or do not accommodate prescription eyewear are not inclusive, regardless of price. These observations reinforce a central lesson: XR’s risks must be evaluated holistically and tailored to context. Effective mitigation depends not only on technical upgrades or checklists but also on dialogue between stakeholders who understand the real-world environments in which XR is deployed.

Conclusion: Proactive Risk Identification for Responsible XR Deployment

The experience from MOTIVATE XR shows that identifying risks early is not about caution for its own sake; it is a necessary step toward deploying XR responsibly and effectively. In industrial and educational contexts where safety, performance, and trust are paramount, overlooking potential downsides can lead to avoidable costs, inefficiencies, or harm. While no one-size-fits-all solution exists, organisations can learn a great deal from the growing body of evidence on XR’s potential downsides. Drawing from state-of-the-art discussions helps anticipate challenges, but every implementation must ultimately be informed by the realities of its own environment. For any organisation considering XR adoption, a clear-eyed understanding of risks is not a barrier – it’s the foundation for long-term value.

Author

 YAGHMA B.V.

Rie Brammer Larsen, PhD, is a researcher and project lead at YAGHMA B.V., specialising in the assessment of risks and impacts of emerging technologies, with focus on the ethical consequences of artificial intelligence. She is an authorised lead assessor for IEEE CertifAIEd. Rie Brammer Larsen holds a PhD in control theory and its application to logistics systems from Delft University of Technology. 
