Funding period: 01/2023 – 06/2026

Many industries are transitioning to Industry 4.0 production models by adopting robots in their processes. In parallel, Extended Reality (XR) technologies have matured enough to enter the industrial domain, with early successes in areas such as worker training, remote assistance, access to contextual information, and interaction with digital twins. In the future, robots will increasingly be enhanced with XR applications, which requires that industrial workers understand both technologies and can use and control hybrid solutions confidently. According to data published in 2021, XR is expected to create 1.2–2.4 million new jobs in the EU by 2025, and European XR markets are projected to grow to between €35 billion and €65 billion over the same period.

Dedicated education and training programmes will be essential to this transition, especially for vocational school students and for professionals seeking to upskill. They must learn how to program robots and how to establish safe and productive human-robot collaboration. The new EU-funded project MASTER will improve the XR ecosystem for teaching and training robotics in manufacturing by providing an open XR platform that integrates key functionalities: creating safe robotic environments, programming flexible robotic applications, and integrating advanced interaction mechanisms. It will also provide high-quality training materials for robotics. Third-party contributions, in the form of additional technologies and educational content, will be integrated through two open calls for participation. Close collaboration with industry partners will also serve to validate the open XR platform. The project started in January 2023 with a kick-off meeting at the University of Patras.

The objective of MASTER is to boost the XR ecosystem for teaching and training of robotics in manufacturing by providing an Open XR platform that integrates key functionalities for creating safe robotic environments, programming flexible robotic applications, and integrating advanced interaction mechanisms. MASTER will also deliver rich training content on robotics.

Consortium

  • University of Patras, Laboratory for Manufacturing Systems and Automation (Coordinator)
  • Fundación Tekniker
  • Deutsches Forschungszentrum für Künstliche Intelligenz GmbH
  • Virtualware 2007 S.A.
  • Teaching Factory Competence Center
  • Alecop S. Coop.
  • European Science Communication Institute (ESCI) gGmbH


Co-funded by the European Union