CeBIT 2010: Live 3-D-TV

Two synchronized MicroHDTV cameras from IIS record the scene that is then transmitted in 3-D. Credit: Fraunhofer IIS

This is the year in which 3D cinema and 3D TV are expected to make their breakthrough. At CeBIT in Hannover, Fraunhofer researchers are presenting technologies and standards that are hastening that progress.

Strikers and defenders furiously compete for the ball. Suddenly, the forward goes down in the penalty area. Penalty kick. The penalty taker carefully sets the ball just right. Cut to the goal camera. Like a cannonball, the ball flies over and past the heads of the awestruck spectators. Except that these soccer fans are not sitting in the stadium, but rather in front of a 3D television set, far away from the hustle and bustle of FIFA World Cup football in South Africa.

2010 will be the year in which cinema and television make the jump into the third dimension. Blockbusters like James Cameron's Avatar and Ice Age: Dawn of the Dinosaurs have brought in billions in box office ticket sales around the globe. Now the time for 3D on television has come as well. The industry has announced that the first 3D televisions will be ready for production by summer, and a few FIFA World Cup matches are already set to be captured in 3D. Yet before 3D technology becomes standard equipment for the movie screen and the television set, a few questions still require clarification. For instance, how can the recording process and post-production be streamlined, and their costs reduced? Cameron's science fiction extravaganza swallowed 250 million US dollars in the making and required four years of computer work. How can the tools for the post-production of movies be improved? And the sixty-four thousand dollar question: 3D glasses, or no 3D glasses?

To address these issues, experts from the film industry, academia and research have joined forces in the consortium "PRIME: Production and Projection Techniques for Immersive Media." Together they are exploring and developing business models and techniques for cinema, television and gaming. Participating partners include KUK Filmproduktion GmbH, Loewe, Kinoton GmbH, DVS Digital Video Systems AG, Flying Eye, the Film & Television Academy HFF "Konrad Wolf" in Potsdam, the University of Duisburg-Essen and the Fraunhofer Institutes for Integrated Circuits IIS in Erlangen and for Telecommunications, Heinrich-Hertz-Institut HHI, in Berlin. The German Federal Ministry of Economics and Technology is funding the project.

3D films pose tougher challenges than their two-dimensional counterparts, because two images are always needed to create a spatial impression: one for the left eye and one for the right. For this reason, at least two cameras must be used to record the film, and a 3D screen is needed to display both images. Stereoscopy has evolved into a recording technology for high-resolution home theater. The process demands the utmost precision from the camera crew and in post-production, because a separate film has to be produced for each eye, and in editing and post-processing both streams must be handled in absolute synchrony. "The slightest shift or tilt of a camera becomes visible on the screen and can even make viewers feel nauseous," explains Stephan Gick, group manager for digital camera systems at IIS.

For the movie theater, a scene is shot with two synchronized MicroHDTV cameras from IIS. The team around Stephan Gick has advanced the technology so that reliable digital images can be recorded for the right and the left eye. Stereo or side-by-side rigs, specially constructed camera mounts, replicate the distance between the human eyes as realistically as possible. The "genlock" process ensures that the cameras capture their images in synchrony: one camera acts as the "master", the digital leader, and the second camera adopts exactly the same settings for calibration, color fidelity and geometry.
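The master-slave relationship behind genlock can be pictured with a small simulation. The sketch below is purely illustrative Python, not Fraunhofer's implementation: a hypothetical slave camera schedules each exposure on the master's pulse instead of its own free-running clock, and the frame rate and drift tolerance are assumed values.

```python
import time

FRAME_RATE_HZ = 50        # assumed frame rate for this simulation
GENLOCK_PERIOD = 1.0 / FRAME_RATE_HZ
TOLERANCE_S = 0.001       # assumed acceptable drift for this software model


class MasterCamera:
    """Defines the common frame clock by emitting genlock pulses."""

    def __init__(self):
        self.t0 = time.monotonic()

    def next_pulse(self):
        # Time of the next frame boundary on the master's clock.
        elapsed = time.monotonic() - self.t0
        frames_done = int(elapsed / GENLOCK_PERIOD) + 1
        return self.t0 + frames_done * GENLOCK_PERIOD


class SlaveCamera:
    """Starts each exposure on the master's pulse, not on its own clock."""

    def capture_on(self, pulse_time):
        delay = pulse_time - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        return time.monotonic()  # actual start of exposure


master, slave = MasterCamera(), SlaveCamera()
for frame in range(3):
    pulse = master.next_pulse()
    start = slave.capture_on(pulse)
    drift = abs(start - pulse)
    status = "in sync" if drift < TOLERANCE_S else "resync needed"
    print(f"frame {frame}: drift {drift * 1e6:.0f} microseconds, {status}")
```

In real hardware the pulse is an electrical signal and the tolerances are far tighter; the point of the sketch is only the master-slave timing relationship.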

Especially for 3D live transmissions, the camera team has to be able to count on these settings. One helpful tool for recording and transmitting three-dimensional data in real time is STAN, the stereoscopic analyzer that HHI developed jointly with KUK Filmproduktion. This combination of hardware and software records and analyzes stereo images so that they can be processed in real time: during a take, a feedback loop passes the calculated values directly to the cameras, so that errors or incorrect settings can be detected and corrected on the spot. Researchers at HHI are also working assiduously on a special highlight of the PRIME project, the 3D panorama, and are already presenting initial results in a showroom in Berlin.
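What such an analyzer computes can be illustrated with a short sketch. The following Python example using OpenCV is an assumption about the kind of check a tool like STAN performs, not HHI's actual algorithm: it matches features between the left and right frames and reports the median vertical offset, which should be near zero for a well-aligned rig. The file names and the half-pixel tolerance are placeholders.

```python
import cv2
import numpy as np


def vertical_misalignment(left_img, right_img):
    """Estimate the median vertical offset (in pixels) of a stereo pair.

    A large offset indicates a shifted or tilted camera, the kind of
    error a stereoscopic analyzer would flag during a take.
    """
    orb = cv2.ORB_create(nfeatures=1000)
    kp_l, des_l = orb.detectAndCompute(left_img, None)
    kp_r, des_r = orb.detectAndCompute(right_img, None)
    if des_l is None or des_r is None:
        return None

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_l, des_r)
    if not matches:
        return None

    # In a well-aligned rig, a matched feature sits on the same image
    # row in both views, so the vertical differences should be tiny.
    dy = [kp_r[m.trainIdx].pt[1] - kp_l[m.queryIdx].pt[1] for m in matches]
    return float(np.median(dy))


# Usage: check the current left/right frames (placeholder file names)
# and warn the crew if the offset exceeds a hypothetical tolerance.
left = cv2.imread("left_frame.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_frame.png", cv2.IMREAD_GRAYSCALE)
offset = vertical_misalignment(left, right)
if offset is not None and abs(offset) > 0.5:
    print(f"Vertical misalignment of {offset:.2f} px: adjust the rig")
```

A production tool would run this kind of analysis on every frame and feed the result straight back to the camera controls, as the feedback loop described above.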

Citation: CeBIT 2010: Live 3-D-TV (2010, February 26) retrieved 19 March 2024 from https://phys.org/news/2010-02-cebit-d-tv.html
