PRESENCE

A toolset for hyper-realistic and XR-based human-human and human-machine interactions
Project ID:
Funding Organization:
Funding Programme: HORIZON-CL4-2023-HUMAN-01-CNECT
Funding Instrument: Research and Innovation Action
Start Date: 01/01/2024
Duration: 36 months
Total Budget: 7,655,707 EUR
ITI Budget: 469,375 EUR
Scientific Responsible: Dr. Dimitrios Zarpalas

The concept of presence can be understood as a synthesis of interrelated psychophysical ingredients in which multiple perceptual dimensions intervene. A better understanding of how specific aspects, such as plausibility (the illusion that virtual events are really happening), co-presence (the illusion of being with others), or place illusion (the feeling of being there), impact XR experiences is key to improving their quality. Current technologies do not yet deliver the high levels of presence in XR that are essential to getting us closer than ever to the perennial VR dream: to be anywhere, doing anything, together with others, from any place.

PRESENCE will impact multiple dimensions of presence in physical-digital worlds, addressing three main challenges:
i) how to create realistic visual interactions among remote humans, delivering high-end holoportation based on live volumetric capturing, compression and optimization techniques under heterogeneous computation and network conditions;
ii) how to provide realistic touch among remote users and synthetic objects, developing novel haptic systems and enabling spatial multi-device synchronization in multi-user scenarios;
iii) how to produce realistic social interactions among avatars and agents, generating AI virtual humans that represent actual users or AI agents.

PRESENCE will ensure the future uptake of its research results through a threefold evaluation method:
1) each technology will be evaluated independently to understand its impact on the illusion of presence;
2) each component will be evaluated by the integration team, providing scientific and technical feedback to facilitate its use in each project iteration as well as beyond the project scope, towards technology transfer and exploitation;
3) all components will be integrated in two demonstrators (professional and social setups), following a human-centred design approach and ultimately evaluating the user experience.

Our team was tasked with: 1) spatial calibration of multi-view RGB-D systems, 2) real-time human reconstruction algorithms based on multi-view RGB-D data, and 3) Gaussian Splatting-based head avatar reconstruction.
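The page does not detail how the multi-view RGB-D spatial calibration is performed. A common building block for such calibration is estimating the rigid transform (rotation and translation) between two cameras from corresponding 3D points, e.g. marker or checkerboard detections back-projected using the depth maps. The sketch below shows the standard closed-form Kabsch/Procrustes solution in NumPy; all function and variable names are illustrative, not the project's actual code:

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Estimate R, t such that dst ≈ src @ R.T + t (Kabsch algorithm).
    src, dst: (N, 3) arrays of corresponding 3D points."""
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    # 3x3 cross-covariance of the centered point sets
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Synthetic check: recover a known camera-to-camera transform
rng = np.random.default_rng(0)
pts_cam_a = rng.uniform(-1.0, 1.0, size=(50, 3))  # points seen by camera A
angle = np.pi / 6
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.2, 1.0])
pts_cam_b = pts_cam_a @ R_true.T + t_true         # same points seen by camera B
R_est, t_est = estimate_rigid_transform(pts_cam_a, pts_cam_b)
```

In a real multi-camera rig this pairwise estimate would typically be refined jointly over all cameras (e.g. with a bundle-adjustment-style optimization) to distribute error evenly across the setup.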

Consortium

FUNDACIO PRIVADA I2CAT, INTERNET I INNOVACIO DIGITAL A CATALUNYA
ACTRONIKA
UNIVERSITAET HAMBURG
ETHNIKO KENTRO EREVNAS KAI TECHNOLOGIKIS ANAPTYXIS
RAYTRIX GMBH
SENSEGLOVE B.V.
GO TOUCH VR
DIDIMO, S.A.
VECTION ITALY SRL
UNIVERSITAT DE BARCELONA
UNITY TECHNOLOGIES APS
SOUND HOLDING B.V.
INTERUNIVERSITAIR MICRO-ELECTRONICA CENTRUM
JOANNEUM RESEARCH FORSCHUNGSGESELLSCHAFT MBH
SYNCVR MEDICAL NL B.V.
ZAUBAR UG (HAFTUNGSBESCHRAENKT)

Contact

Dr. Dimitrios Zarpalas
(Scientific Responsible)
Building B - Office 0.18

Information Technologies Institute
Centre of Research & Technology - Hellas
1st km Thermis - Panoramatos, 57001, Thermi - Thessaloniki
Tel.: +30 2310 464160 (ext. 145)
Fax: +30 2310 464164
Email: zarpalas@iti.gr