XR4DRAMA partners at the last consortium meeting (Pisa, Italy, October 2022)

First things first: We’re fully aware that XR4DRAMA should have been a wrap by the end of 2022 (at the latest) – and that it’s January 2023 now. The reason for the delay was, of course, the coronavirus pandemic. And all the challenges that came with it. There were things we simply couldn’t do for months and months, and thus we couldn’t finish on time. We’re not ashamed. And luckily, the EC granted us an extension. The good news: We’re absolutely on track to meet the new XR4DRAMA deadline, which is April 30th. Let’s take a look at what will (have to) happen until then.

With the API deadline already behind us (31.12.22), the deadline for XR4DRAMA system development is set for the end of January (let’s get coding, team!). Final testing of the authoring tool and connected mobile apps (and sensor systems) will commence in early February, with a deadline for major bug fixing coming up in the middle of the month. March will see AAWA and DW doing final full test drives in Vicenza (disaster management pilot) and Corfu City (media production pilot). A penultimate internal review of the pilots and a full plenary meeting will take place in Berlin in late March. Finally, all partners will meet again in Thessaloniki in late April – to discuss project results, tie up loose ends, and prepare for the final official review, which will probably happen sometime in May. 

Beta version of the XR4DRAMA authoring tool during a live test in Pisa

By then, we’ll hopefully have a (rather sophisticated) MVP that could come with a press release like this:


XR4DRAMA is about increasing situation awareness, i.e. the “perception of environmental elements and events with respect to time or space, the comprehension of their meaning, and the projection of their future status.”

The project’s pilots focused on disaster management and media production planning. However, the XR4DRAMA applications can also cater to other target groups, e.g. event planners/managers, startups and SMEs, all kinds of researchers, and citizens interested in the concept.

To achieve its aim, XR4DRAMA combines a number of cutting-edge technologies, e.g.:

  • XR/AI (3D models, 3D reconstruction, walkable VR maps of a place)
  • IoT (monitoring of vital signs and environmental factors via wearables and sensors)
  • NLProc/AI (to extract, structure, analyze, and summarize text)
  • multimodal data fusion (to combine all types of information and content)

At the heart of the project, there’s a sophisticated map of a place/area (anywhere in the world – at least in theory) that users can select via the XR4DRAMA authoring tool. This map, based on rough OpenStreetMap (OSM) data, can be refined and populated with all kinds of points of interest (from bus stops to hydrants to vegan cafés to landmarks), all kinds of content (text, sound, images, 360° images, videos, 3D models), and background information (legal info, emergency numbers, weather reports etc.). Some parts of this process are fully automated; some parts are handled by scouts on site, who may enrich, update, and verify content manually via a mobile app (that also offers AR navigation and projection). At the end of the day, users – who are NOT on site – get a very good idea of what the place looks like / sounds like / feels like, and where it’s safe/convenient to work and hang out. The map is available in 2D and 3D and explorable in VR. Disaster managers can also count on citizen reports (that come in via a different app) and IoT data (sent by field agents wearing smart vests and exterior sensors).
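For the curious: the XR4DRAMA codebase isn’t covered in this post, but the “populate a map with points of interest from OSM” step can be illustrated with a small, self-contained Python sketch. It builds an Overpass QL query (the query language of OSM’s Overpass API) for one POI category in a bounding box, and reduces a (canned, hypothetical) Overpass-style JSON response to simple POI records – function names and the sample data are our own, purely illustrative choices, not XR4DRAMA internals.

```python
def build_overpass_query(south, west, north, east, amenity):
    """Build an Overpass QL query for OSM nodes with a given
    'amenity' tag inside a bounding box (south, west, north, east)."""
    return (
        "[out:json];"
        f'node["amenity"="{amenity}"]({south},{west},{north},{east});'
        "out;"
    )

def extract_pois(overpass_json):
    """Reduce a raw Overpass-style JSON response to simple POI dicts
    with a name and coordinates."""
    return [
        {
            "name": el.get("tags", {}).get("name", "unnamed"),
            "lat": el["lat"],
            "lon": el["lon"],
        }
        for el in overpass_json.get("elements", [])
        if el.get("type") == "node"
    ]

# Canned sample response (no network call) in the shape Overpass returns:
sample = {
    "elements": [
        {"type": "node", "id": 1, "lat": 45.55, "lon": 11.54,
         "tags": {"amenity": "cafe", "name": "Caffè Vicenza"}},
    ]
}
print(extract_pois(sample))
```

In a real pipeline, the query string would be POSTed to an Overpass API endpoint and the parsed JSON fed into `extract_pois`; the resulting records are exactly the kind of raw POI layer that scouts and automated components could then enrich and verify.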


So: keep your fingers crossed that everything will go according to plan – and stay tuned for more news and papers and project results.