Humboldt-Universität zu Berlin - Mathematisch-Naturwissenschaftliche Fakultät - Visual Computing

BMBF Project Gestfus

3D Gesture Interaction and Fusion of 3D Images


Period: 1 December 2014 - 30 November 2015


This project is funded by the BMBF under grant 03ZZ0404B.

Many topics within the 3D sensation alliance deal with the fusion of 3D image data from different sources. Augmented Reality (AR) applications, for example, face the challenge of integrating virtual content into a real-world image in a realistic and consistent way. Fusing 3D image data from different sources can also be used to create 3D content for Virtual Reality applications.

The fused 3D image data should ensure a non-disturbing and comfortable depth perception. The Visual Computing Group will work on methods for a consistent geometric and photometric fusion of 3D data from different sources. The geometric fusion registers the 3D data (translation, rotation, scale) and is responsible for ensuring a consistent depth ordering and occlusion handling. When dealing with dynamic data, the geometric fusion process must also ensure a temporally consistent registration. Methods for the photometric fusion synchronize the image properties (blur, noise, color temperature, imaging pipeline artifacts) and adapt the scene illumination (light sources, shadows).
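The occlusion-handling part of the geometric fusion can be illustrated with a minimal sketch: once the virtual content is registered to the real scene, a per-pixel depth test decides which surface is visible. This is a simplified illustration, not the project's actual method; the function name and the assumption of floating-point RGB and depth maps with smaller depth meaning closer are ours.

```python
import numpy as np

def fuse_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth):
    """Composite registered virtual content into a real image via a
    per-pixel depth test: at each pixel the closer surface wins,
    which yields a consistent depth ordering and occlusion handling.
    Depth maps use the convention "smaller value = closer to camera".
    """
    # Boolean mask of pixels where the virtual surface occludes the real one.
    virt_in_front = virt_depth < real_depth
    # Broadcast the mask over the color channels and select per pixel.
    return np.where(virt_in_front[..., None], virt_rgb, real_rgb)
```

In a full pipeline the virtual depth map would come from rendering the registered model with the real camera's parameters, so both depth maps share one coordinate frame.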

It is planned to create a dataset containing scenes which combine different 3D image data. For each scene, different instances will be generated with varying parameters of the fusion process. This dataset can then be used in a user study to evaluate the impact of the fusion parameters on a user's overall perception and comfort.

[Figure: image fusion]

Showcase of a fusion process in an AR application. A virtual model of a church (left) is created based on a historic image. The goal is to render the virtual model on top of a present-day image of the site (middle). To ensure a smooth integration, the image properties of the present-day image are adapted to those of the historic image (right).
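One simple way to approximate this kind of photometric adaptation is a global color transfer that matches per-channel statistics of one image to another. The sketch below is an assumption on our part, not the method used in the showcase; it assumes floating-point RGB images in [0, 1], and the helper name is hypothetical.

```python
import numpy as np

def match_color_statistics(source, target):
    """Shift and scale each color channel of `source` so that its mean
    and standard deviation match those of `target` -- a basic global
    color transfer in the spirit of adapting one image's look
    (e.g. color temperature) to another's."""
    src = source.astype(np.float64)
    tgt = target.astype(np.float64)
    out = np.empty_like(src)
    for c in range(src.shape[-1]):
        s_mu, s_sigma = src[..., c].mean(), src[..., c].std()
        t_mu, t_sigma = tgt[..., c].mean(), tgt[..., c].std()
        # Guard against a flat source channel (zero standard deviation).
        scale = t_sigma / s_sigma if s_sigma > 0 else 1.0
        out[..., c] = (src[..., c] - s_mu) * scale + t_mu
    return np.clip(out, 0.0, 1.0)
```

More faithful adaptations would also match blur, noise, and other imaging-pipeline artifacts, as the project description notes; this sketch covers only the global color statistics.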

