3D-News Archive March 2008

Subscribe to our mailing list, and receive the latest 3D-News items by E-Mail!
Just enter your E-Mail Address below:

DTS Digital Images Fine Tunes 'Journey to the Center of the Earth 3D' for Upcoming Summer 2008 Release
3D-News Posted: Saturday, March 29, 2008 (6:28 UTC) | Posted By: Webmaster

DTS Digital Images, Inc. has provided custom image processing services for New Line Cinema and Walden Media's Journey to the Center of the Earth 3D, the first full-length, live-action feature shot in digital 3D (slated for release in Summer 2008 and recently previewed at the ShoWest conference in Las Vegas).

"Journey to the Center of the Earth 3D is the world's first digitally-captured stereoscopic live action feature film. Not only is it ground-breaking on a technical level, but it's a beautiful movie and lots of fun," said Jonas Thayler, Vice President of Post Production AFG/Walden Media. "At the end of the post production workflow, much of which we designed ourselves, we found we needed more sophisticated noise reduction than our tools could provide. We knew DTS could bring all of our problem scenes up to the quality of the rest of the movie and indeed they did, seamlessly and within the confines of our budget. There are others who can do this sort of thing, but when you can't afford to do it twice, hire a specialist."

3D is emerging as the future of digital cinema presentations, with its ability to engage movie-goers with richer, enhanced viewing experiences. Creating 3D digital cinema content requires stereoscopic images, and while the technology to create these images has been maturing, capturing them presents unique problems.

To deliver an excellent 3D experience, one of the main challenges is to create "left eye" and "right eye" images that are identical in every way except perspective. To accomplish this, images are captured by cameras separated by a few inches, emulating human eyes. When the left and right eye images are well matched, the 3D is easy to watch and quite believable - but accomplishing this level of consistency is no easy task. The native "left eye" and "right eye" images, as photographed, are quite different, because they are captured by two separate cameras shooting through separate optics arranged so that two large cameras can capture images from just inches apart. The differences between these raw images manifest themselves as imbalances in color, video noise and sharpness.
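Imbalances of this sort are often attacked by transferring simple image statistics from one eye to the other. As a rough illustration (a minimal sketch, not DTS's proprietary algorithm, assuming normalised floating-point RGB arrays), the right-eye image can be adjusted so that each colour channel's mean and standard deviation matches the left eye:

```python
import numpy as np

def match_color_statistics(left, right):
    """Adjust the right-eye image so that each colour channel's mean and
    standard deviation matches the left-eye image.

    A simple statistical transfer: real stereo-balancing pipelines are far
    more sophisticated, but this illustrates the idea of removing a
    per-camera colour imbalance between the two eyes."""
    balanced = np.empty_like(right, dtype=np.float64)
    for c in range(right.shape[2]):            # iterate over R, G, B channels
        l_mean, l_std = left[..., c].mean(), left[..., c].std()
        r_mean, r_std = right[..., c].mean(), right[..., c].std()
        # normalise the right channel, then rescale to the left statistics
        balanced[..., c] = (right[..., c] - r_mean) / max(r_std, 1e-8) * l_std + l_mean
    return np.clip(balanced, 0.0, 1.0)
```

Sharpness and noise mismatches need spatial filtering rather than per-pixel statistics, but the principle - measure one eye, conform the other - is the same.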

DTS Digital Images was retained to apply its proprietary imaging algorithms to a number of the most problematic sections of the movie, removing these imbalances to deliver a superior digital 3D experience. In addition, the filmmakers had numerous shots that were substantially enlarged during post-production; these blow-ups did not match the adjacent shots because the enlargement made them noticeably noisier and less sharp. DTS Digital Images applied its custom noise reduction and detail enhancement technologies to create a seamless look from one shot to the next.

"DTS Digital Images is committed to delivering remarkable quality moving pictures through its breakthrough image processing algorithms and we were honored to be able to give an assist to the Journey to the Center of the Earth 3D team," said Mike Inchalik, Vice President, Strategy and Business Development, DTS Digital Images. "We recognize the importance that 3D holds for the future of digital cinema and we're pleased to be able to contribute our expertise and technology to enable the delivery of an optimal 3D experience for viewers."

Mechdyne Enables Virtual Reality 'Mission to Mars' To Take Flight in New Visualization Center at Washington University in St. Louis
3D-News Posted: Tuesday, March 4, 2008 (20:52 UTC) | Posted By: Webmaster

Mechdyne Corporation announced that it has installed an immersive CAVE™ display system as the focal point of the new Fossett Laboratory for Virtual Planetary Exploration at Washington University in St. Louis (WUSTL). As a teaching and research facility of the Department of Earth & Planetary Sciences, the Laboratory will provide 3D imaging capability for visualization of data collected by national and international space exploration programs. One highlight of the Fossett Laboratory's work in 2008 will be immersive visualizations of imagery gathered as part of the ongoing Mars Exploration Program.

The new CAVE system supports stereoscopic projection on three walls and the floor (each 7.5' h x 10' w) to create a surround screen environment. Mechdyne integrated its Beacon™ projection technology along with a wireless motion tracking system and virtual wand that allows scientists to easily 'fly' through visualizations. Mechdyne also provided its CAVELib software and Conduit for ArcGIS, which is used to 3D-enable data to create a fully immersive, Virtual Reality experience in the CAVE.
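Head-tracked CAVE rendering of this kind relies on off-axis (asymmetric) projection: each wall's view frustum is recomputed every frame from the tracked head position and the wall's fixed corners, so the perspective stays correct as the viewer moves. A minimal sketch of the standard generalized-perspective-projection math (the corner naming and coordinate convention here are illustrative assumptions, not Mechdyne's actual API):

```python
import numpy as np

def cave_wall_frustum(pa, pb, pc, eye, near):
    """Off-axis frustum extents (left, right, bottom, top) at the near
    plane for one CAVE wall.

    pa, pb, pc are the wall's lower-left, lower-right and upper-left
    corners in tracker coordinates; eye is the tracked head position."""
    pa, pb, pc, eye = (np.asarray(p, dtype=float) for p in (pa, pb, pc, eye))
    vr = pb - pa; vr /= np.linalg.norm(vr)            # wall "right" axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # wall "up" axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # wall normal
    va, vb, vc = pa - eye, pb - eye, pc - eye         # eye-to-corner vectors
    d = -np.dot(va, vn)                               # eye-to-wall distance
    # project the corner vectors onto the wall axes, scaled to the near plane
    left = np.dot(vr, va) * near / d
    right = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top = np.dot(vu, vc) * near / d
    return left, right, bottom, top
```

For stereo, the same computation is simply run twice per wall, once for each eye's tracked position.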

"The CAVE system installed in the Fossett Laboratory brings a unique capability to faculty, students and guest researchers working to interpret data from man's exploration of planetary bodies throughout the solar system," said Professor Raymond Arvidson, chair of the Department of Earth and Planetary Sciences at WUSTL. "We can effectively transport groups of people to the surface of other planets and create experiences that will illuminate and enhance understanding of the geology and geophysics both of other planets and our own home."

The Laboratory, which was funded through a gift from noted adventurer and WUSTL alumnus Steve Fossett, is an important addition to the Earth & Planetary Sciences facilities at WUSTL. The Lab includes the CAVE system, housed in a purpose-built room, along with a graphics computing cluster based on HP quad-processor servers and NVIDIA graphics. Washington University is home to the Geosciences Node (http://pds-geosciences.wustl.edu/) of the NASA Planetary Data System (http://pds.jpl.nasa.gov/), with responsibility for management of data related to the study of the surfaces and interiors of terrestrial planetary bodies. For the Mars visualizations, researchers at the Fossett Laboratory will draw on data from such missions as Mars Global Surveyor, 2001 Mars Odyssey, the Mars Exploration Rovers, Mars Reconnaissance Orbiter and Phoenix. WUSTL houses more than 10 TB (terabytes) of data collected in various NASA missions, including hyperspectral imagery gathered by orbital systems and landers. The facility also has access to visual imagery from other PDS nodes, which will be combined with spectral data to provide data-rich imagery.

"Tools such as the CAVE are necessary not only to help us better understand the data we are collecting but also to help draw the best and brightest new students to the field of planetary sciences," said Keith Bennett, Deputy Manager for Operations of the PDS Geosciences Node. "This may be the first planetary geosciences facility in the world with a dedicated 'virtual reality' imaging center, giving us the potential to perform first-of-its-kind work in data integration and analysis. During the next few months, we will be evaluating techniques for working with the data to get the most out of the unique capability of the visualization environment."

VisuMotion unveils 3D Camera to shoot real 3D videos for glasses-free 3D-Displays
3D-News Posted: Tuesday, March 4, 2008 (20:45 UTC) | Posted By: Webmaster

At CeBIT 2008, the world's leading IT fair, VisuMotion unveils its ground-breaking 3D camera, which allows the capture of cross-platform 3D footage for use with glasses-free 3D displays.

VisuMotion will show off the new technology at the Thuringia booth (A40, Hall 9) in the form of a live transmission from the 3D camera to a multi-user 3D display. The new 3D camera is available for rental immediately, and sales of a professional version are announced for mid-2008. With this technology, for the first time in the history of the 3D industry, real video footage can be created independently of the type of 3D display later used to show it.

The use of glasses-free 3D displays is increasing rapidly worldwide, particularly for digital signage, 3D gaming, research and development, medicine and Virtual Reality setups. Market researchers foresee worldwide turnover for 3D products of several hundred million Euros in 2010.

VisuMotion's product portfolio also includes 3D Rendering Plug-ins for Autodesk's 3D Studio Max and Maya, the Compositing and Editing Software "3D StreamLab", the 3D Application Driver "DeepOutside3D" as well as the playback software "3D Movie Center".

The multi-stream (multi-view) rendering required for autostereoscopic 3D displays is performed by the plug-ins for these well-known animation packages: 3D animations (or still images) are automatically rendered from multiple perspective views.
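Multi-view rendering of this kind typically slides a virtual camera along a horizontal baseline while keeping every view's frustum locked onto a shared zero-parallax plane, so that all views frame the same screen rectangle. A rough sketch of the geometry (parameter names are illustrative; VisuMotion's plug-ins handle this internally):

```python
def multiview_frusta(num_views, baseline, screen_width, convergence, near):
    """Camera x-offsets and asymmetric frustum extents for an N-view
    autostereoscopic render.

    Each camera is displaced along the baseline, and its frustum is
    sheared so that every view passes through the same screen rectangle
    at the convergence (zero-parallax) distance."""
    step = baseline / (num_views - 1)
    views = []
    for i in range(num_views):
        offset = i * step - baseline / 2.0    # slide the camera along the baseline
        # frustum extents at the near plane, shifted to keep the shared
        # screen rectangle centred at the convergence distance
        left = (-screen_width / 2.0 - offset) * near / convergence
        right = (screen_width / 2.0 - offset) * near / convergence
        views.append((offset, left, right))
    return views
```

The frustum width is identical for every view; only the offset and shear change, which is what lets the rendered streams interleave cleanly on a lenticular or barrier display.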

The workflow leads seamlessly to the 3D editing and compositing solution "3D StreamLab", which allows the simultaneous postproduction of all the different video streams (i.e. the multiple views).

3D content creators benefit both from the unique high-speed rendering algorithms implemented in this software and from the fact that, once 3D videos are rendered and cut with VisuMotion software, they can be played back on nearly any type of 3D display commercially available now or in the future.

To complete the 3D solution chain, VisuMotion provides the 3D playback software "3D Movie Center", which is available in Library and Network Editions.

The Library Edition runs on a stand-alone PC driving a (single) 3D display. To increase customer benefit, each license of the Library Edition now includes an ActiveX license that lets users integrate 3D playback into their own applications.

The Network Edition enables control of a networked setup of 3D displays.

Until now it was impossible to combine 3D displays from different manufacturers in such a network when driven by a single server. This limitation has been overcome by VisuMotion's software solutions. Moreover, the 3D Movie Center is backward compatible with conventional two-dimensional footage, which means such display networks can mix 3D and 2D displays.

The product "DeepOutside3D" allows glasses-free 3D visualisation of certain DirectX-based applications, such as 3D games. Support for certain OpenGL-based applications is a completely new feature. A well-thought-out licensing scheme ensures that users pay only for support of the applications they actually use.

Alexander Schmidt, CTO Software, describes VisuMotion's business approach as follows: "The unsolved challenges for the use of autostereoscopic displays consisted mostly of the various incompatible hardware and software approaches existing in parallel, the lack of suitable 3D infrastructure to drive 3D displays, and the fact that compelling real-video 3D content could hardly be produced for 3D displays so far. VisuMotion is dissolving these growth barriers of the 3D industry by providing universal 3D infrastructure concepts that support the display products of nearly any maker. Thus, there is also good potential to play a substantial role in forming standards for the 3D industry. By providing a powerful 3D camera, VisuMotion removes another bottleneck to the wide use of 3D displays."

Sensics Supplies NASA with HMD-Based Panoramic, High Resolution Telepresence System
3D-News Posted: Tuesday, March 4, 2008 (16:52 UTC) | Posted By: Webmaster

Sensics, the panoramic head-mounted display company, reported that it successfully completed the delivery of a Phase II SBIR project for NASA, which included a unique panoramic, high-resolution telepresence system based on the piSight HMD line.

Deployed last year at the Lyndon B. Johnson Space Center for the Robonaut, a humanoid robot developed by NASA and DARPA, the Sensics system includes a panoramic, high-resolution camera array and the piSight™ panoramic, high-resolution head-mounted display.

The combined camera plus piSight system will serve as the "eyes" of the Robonaut. NASA's Robonaut system can work side by side with humans, or alone in high-risk situations. Telepresence uses Virtual Reality display technology to visually immerse the operator into the robot's workspace, facilitating operation and interaction with the Robonaut.

From the broad piSight product line, NASA selected the 150-43b model, which is 150 degrees wide and 60 degrees tall, has strong stereoscopic overlap and displays 6 million full-color pixels per eye. The system is integrated with a precision six degree-of-freedom motion tracker. Video acquired through the camera array is compressed, sent over a low-bandwidth communications network and then displayed in full stereo inside the HMD.
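A quick back-of-the-envelope calculation puts those figures in perspective: 6 million pixels per eye spread over a 150 by 60 degree field of view works out to roughly 26 pixels per degree, assuming a uniform pixel distribution (which tiled-microdisplay HMDs only approximate):

```python
# Rough angular-resolution estimate from the piSight 150-43b figures above.
fov_h, fov_v = 150, 60                        # field of view in degrees
pixels_per_eye = 6_000_000                    # full-colour pixels per eye
density = pixels_per_eye / (fov_h * fov_v)    # pixels per square degree
linear_ppd = density ** 0.5                   # approximate linear pixels per degree
print(round(linear_ppd, 1))                   # prints 25.8
```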

"Since its commercial launch last year, the piSight line of panoramic HMDs has been selected by many demanding customers for a variety of simulation, training, design and research applications," said Dr. Larry Brown, president and founder of Sensics. "We are very pleased to add NASA to our growing list of satisfied customers."

Nuke 5 Arrives
3D-News Posted: Tuesday, March 4, 2008 (16:49 UTC) | Posted By: Webmaster

Leading visual effects software developer The Foundry has announced the next major release of its powerful compositing application, Nuke. Nuke 5 features a brand new user interface, the addition of Python for scripting, support for stereoscopic workflow, and with the ability to read, process and write over 1,000 channels per stream, it now features the industry's broadest support for EXR images.

Since taking on the development of Nuke, The Foundry's goal has been to remain true to the original concept of developing a comprehensive compositing solution that is 'designed by artists for artists'. With user requirements as the driver, The Foundry has concentrated on enriching and refining the product, building on its already strong foundations and focussing development efforts on areas that required renovation. Nuke 5 is the first major step towards this goal.

"Nuke is a complete pipeline tool. We created a pipeline for a job within days that would not have been possible with any other software. Within three days of getting our license we generated over 85,000 frames of data that equalled 350 variations for the client. Nuke 5's support for OpenEXR is so effortless that our renders come in and we just work," said Robert Nederhorst, VFX Supervisor at SpeedShape's Los Angeles studio.

Applying over a decade of software development experience and close collaboration with the growing Nuke community, The Foundry has reworked Nuke's UI to improve the user experience and make it more approachable for a broader range of artists. In addition to augmenting the existing floating window layouts with a flexible panes and panels system, Nuke 5 features per-node mask inputs, and expanded LUT support for file I/O colourspace conversion.

Since its inception, Nuke's extensive scripting capabilities have been a key feature for many of its adopters, and The Foundry has taken this to the next level by adding support for the Python scripting language for the version 5 release.

The Foundry's product development has long-benefited from close working relationships with some of the industry's leading post production facilities. The latest challenge facing these cutting edge artists is the dramatic increase in volumes of stereoscopic projects. The Foundry has responded by laying the groundwork for efficient multi-view compositing in Nuke 5.

'With the latest Nuke developments we are reinforcing our commitment to deliver first-class products that assist creativity, workflow and productivity, no matter how demanding the pipeline,' said Dr Bill Collis, CEO, The Foundry. 'Close relationships with our Nuke and plug-in customers are fundamental to our work - we listen to what they say they need for their pipelines today and tomorrow, and this drives our R&D.'

Available on Linux, Windows and Mac platforms, Nuke delivers unparalleled speed, an efficient multi-channel scanline rendering engine, and a powerful feature set unrivalled in the desktop market.

'Hannah Montana': FotoKem Taps Quantel's Groundbreaking Stereoscopic Technology to Post 3D Feature in Record Time
3D-News Posted: Tuesday, March 4, 2008 (16:32 UTC) | Posted By: Webmaster

Disney's Hannah Montana/Miley Cyrus: Best of Both Worlds Concert Tour not only broke box office records when it was released earlier this month, playing to sold-out shows across the United States—it also made history. It was the first live action feature to open in digital 3D, and it was the first film produced using Quantel's Pablo 4K with the Stereoscopic 3D option.

The film, directed by Bruce Hendricks, was also produced in record time. Shot in Salt Lake City in November, the concert film was in theaters a mere 11 weeks later. That allowed Disney to capitalize on the intense interest in the Hannah Montana/Miley Cyrus live concert tour, which wrapped up its U.S. run just days before the movie's debut. Completing an ordinary feature film in less than three months would have been a tall order, but, given the daunting technological hurdles, to do so with a 3D movie was an almost super-human feat.

Color grading and compositing was completed at FotoKem using a pair of Pablo 4Ks, each with the stereoscopic 3D option, in DI Theaters set up specifically for 3D work. An industry leader in both 35mm and 70mm 3D, the venerable facility was the first in Hollywood to acquire Quantel's new stereoscopic technology, which was introduced last September.

Still, considering that the project effectively required coloring and conforming two 80-minute films (due to the left eye/right eye film streams), it was not a task that FotoKem took on lightly. "At the beginning, we said, 'Can it be done?'" recalled John Nicolard, FotoKem's Head of Digital Production. "We carefully evaluated everything on paper and concluded that we could, and then we dove in and did it. It turned out to be an amazing project from beginning to end."

FotoKem's first step was to set up a pipeline to take full advantage of tools and efficiencies inherent in Quantel's Pablo 4K platform and stereoscopic 3D technology in order to keep pace with the film's breakneck production schedule. Anticipating that convergence (effectively adjusting Z space) would present a challenge, FotoKem General Manager of Digital Film Services Bill Schultz, an Academy Award-winner for Scientific and Engineering Achievement, worked with Quantel engineers to implement special developmental software that allowed convergence adjustments to be made in real-time without rendering.

"Bill was able to incorporate Quantel's new software into our existing pipeline—which was robust already," explained Nicolard. "That allowed us to handle the amount of work that needed to be done, address technical requirements to make the film look as good as it does, and get it done in the allotted time. The ability to make real-time 3D convergence adjustments was the single biggest win."

Still, the project required a literal round-the-clock effort. Academy Award winner Michael Tronic edited the film, cutting 19 songs (12 of which were eventually used in the film) at an average rate of one per day. FotoKem then went to work, conforming each new sequence overnight for a screening with Hendricks and Tronic the following morning.

"Our Pablo 4K pipeline allowed us to quickly and accurately conform each song, and that was a key to the success of the project," Nicolard stated. "Bruce and Michael were able to watch new scenes literally hours after they were cut. They discussed it and suggested changes—and we were able to make those changes immediately and move forward."

In addition to grading, FotoKem used the Pablo 4K and Stereoscopic 3D technology to perform a variety of visual effects functions. The majority involved subtleties such as removing a camera flag from a performer's eye. The system allowed such effects work to be done in stereo, before rendering, resulting in more accurate adjustments and less time spent waiting for media to render.

"We added text to identify various people who appear in the film," recalled Nicolard. "And Pablo gave us the ability to dynamically converge the text, independent of the background plate. That gave us better control over the IDs. It was a big advantage."

The productivity of the workflow was also enhanced by Genetic Engineering, Quantel's team-working infrastructure. The system allowed two Pablos to share data so that grading and conforming work could be carried out simultaneously.

Convergence posed the biggest challenge to the smooth operation of the workflow. "Production didn't shoot with locked-off cameras," Nicolard noted. "They had seven 3D rigs that were on cranes and flying all over the place, so points of convergence varied. When you cut scenes together, it can be a little disconcerting, because your eye is moving all over the place. As a result, we needed to put each sequence through an additional balancing pass to make it more comfortable to look at."

The developmental software provided by Quantel engineers provided the solution to such convergence issues. The software offered the ability to play out and process two streams of synchronous, high resolution media simultaneously without rendering. Not only did that make conforming 3D material almost as quick and straightforward as conforming conventional 2-D media, it enabled stereo strength and convergence to be adjusted on the fly. FotoKem artists were thus able to experiment interactively and achieve the perfect 3D effect for each shot.
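The convergence adjustment described here amounts to horizontal image translation: sliding the two eyes' images in opposite directions moves the zero-parallax plane forwards or backwards without touching the source media. A toy sketch of the idea (the function names are illustrative; Quantel's real-time implementation works on full-resolution streams without rendering):

```python
import numpy as np

def shift_horizontal(img, px):
    """Translate an image horizontally by px pixels (positive = right),
    zero-filling the exposed edge (np.roll wraps, so the wrapped columns
    are blanked)."""
    out = np.roll(img, px, axis=1)
    if px > 0:
        out[:, :px] = 0
    elif px < 0:
        out[:, px:] = 0
    return out

def adjust_convergence(left, right, shift_px):
    """Re-converge a stereo pair by horizontal image translation: each
    eye's image moves shift_px/2 pixels in opposite directions, changing
    which depth plane lands at zero parallax on screen."""
    half = shift_px // 2
    return shift_horizontal(left, half), shift_horizontal(right, -half)
```

Because only a horizontal offset changes, the adjustment can be applied and previewed interactively; this is what makes balancing convergence across cuts practical on a tight schedule.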

Given the time pressures and the fact that this was the first true "battle test" of the Quantel stereoscopic technology, a few complications might have been expected, but the work proceeded virtually without a hitch. "We worked unbelievable hours—but everybody was doing that. The sound people, the production people, the editor and the director were all working morning, noon and night," Nicolard said. "It was a wonderful collaborative effort. The Quantel software worked extremely well. If we had not had the 3D convergence capability we would never have been able to complete this on time. Never."

Stereoscopy.com 3D-News (ISSN: 1612-6823) is published monthly by Stereoscopy.com, P.O. Box 102634, 70022 Stuttgart, Germany.
Editor-in-Chief: Alexander Klein.

Worldwide subscriptions to the electronic version of the Stereoscopy.com 3D-News are provided free of charge.

Material in this publication is copyrighted © by Stereoscopy.com. All rights reserved. Reproduction in whole or in part without written permission is prohibited.


Please do not forget to visit the Stereoscopy.com Bookshop, offering the world's largest selection of books in 3D and about 3D.


Last modified on August 31, 2006

Copyright © 2000-2018 by Stereoscopy.com and Alexander Klein. All rights reserved.