Monday, August 4, 2008

Fantastic (3D) Voyage: Journey to the Center of the Earth

http://www.videography.com/articles/article_15889.shtml

 

by Peter Caranicas 

 

The classic Jules Verne sci-fi novel about a hazardous expedition deep into our planet's interior, Journey to the Center of the Earth, was first made into a movie in 1959. The remake released this summer, Journey to the Center of the Earth 3D, recounts a similar story, but, technologically, it stands worlds apart. The newer film boasts not only cutting-edge visual effects but also makes use of the latest stereoscopic technology to enhance the moviegoing experience.

 

Produced by New Line Cinema and Walden Media, this version was directed by Eric Brevig, a seasoned visual effects supervisor (The Day After Tomorrow, Pearl Harbor) in his feature directorial debut.

 

For some, 3D still conjures up pictures of well-dressed people sitting in '50s movie theaters wearing goofy glasses that separate the image projected from a 35mm print into left-eye and right-eye versions, creating the illusion of depth. But the technology has evolved way beyond your father's 3D--so much so that Hollywood is betting on stereoscopic digital projection to save it from the perils of audience drift in the Internet age.

 

Today's 3D technology incorporates striking innovations throughout the film process, from image acquisition, through postproduction, and on to exhibition.

 

Pace Production

For Journey, the camera technology was provided by Pace, the Burbank-based facility and equipment house that has long pioneered 3D acquisition, having collaborated for years on 3D projects with director James Cameron.

 

Pace supplied four of its proprietary Fusion 3D camera systems, which were co-developed with Cameron and have been used on films including Cameron's Ghosts of the Abyss, Aliens of the Deep and the upcoming Avatar and Battle Angel; Robert Rodriguez's Spy Kids 3D and The Adventures of Shark Boy & Lava Girl in 3D; Wild Ocean 3D; U2 3D; Hannah Montana/Miley Cyrus: Best of Both Worlds Concert Tour; the upcoming Jonas Brothers concert tour; and the 2007 NBA All-Star Game. (Final Destination 4 is currently in production with the latest Pace/Cameron Fusion F23 system.) Fusion consists of two customized Sony HDC-F950 HD cameras rigged as a pair, one capturing the left-eye image and the other the right. Footage for Journey was recorded to HDCAM SR on dual tapes.

 

Shooting took place on stages in Montreal for 12 weeks, and for another two weeks in Reykjavik, Iceland, where the outdoor action takes place as the characters venture into the bowels of the earth through a volcano.

 

A Pace technician was on site to make sure the equipment was working properly. “Every Ferrari needs a good mechanic,” proudly quips Pace founder and CEO Vince Pace.

 

Ryan Sheridan, vice president of imaging technology integration, explains that Pace's proprietary Fusion system retains the physical imagers inside the Sony cameras, but, he adds, “we build all of our motion controllers, hardware and the electronics that power the whole system.”

 

The key, he adds, is the extreme precision of the Pace equipment, which allows operators to “shoot 2D but capture 3D.” In other words, they can concentrate on acquiring a good 2D image in the traditional way, and the tools will ensure the capture of a good 3D image.

 

“That way there isn't a need for a 'stereographer' to sit there, dictate, and slow production down, saying things like, 'This should be popping out, this should feel 3D,'” Sheridan adds. “We try to democratize 3D, to get away from the gimmicks and make it a language.”

 

But 3D capture needs to be precise. Even with near-perfection on the set, things may still need to be adjusted afterwards. There are instances where “you get a slightly different sharpness of the image for the left and the right eye,” says John Lowry, founder of image processing and post house Lowry Digital.

 

“None of these things are perfect, and there's always room for image processing to help things along.” For example, there can be differences between the left-eye and right-eye information. “The color of the sky might be different, or there can be vertical displacements,” Lowry says. “You can adapt to them, but later you wonder why you have a headache. These things need to be watched very carefully.”
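The kinds of left/right mismatches Lowry describes (color drift between eyes, vertical displacement) lend themselves to simple per-frame corrections. The sketch below is purely illustrative, not Lowry Digital's actual processing: it gain-matches the right eye's channel means to the left eye's and estimates a vertical offset by comparing row-intensity profiles. The function names and the row-profile method are assumptions for the sake of the example.

```python
import numpy as np

def match_color(left, right):
    """Scale each channel of the right-eye frame so its mean matches the
    left eye's. A crude stand-in for per-eye color balancing."""
    l_mean = left.reshape(-1, left.shape[-1]).mean(axis=0)
    r_mean = right.reshape(-1, right.shape[-1]).mean(axis=0)
    gain = l_mean / np.maximum(r_mean, 1e-6)
    return np.clip(right * gain, 0.0, 1.0)

def estimate_vertical_offset(left, right, max_shift=8):
    """Estimate vertical displacement between eyes by comparing average
    row intensities; returns the shift (in rows) that best aligns
    right to left."""
    l_rows = left.mean(axis=(1, 2))
    r_rows = right.mean(axis=(1, 2))
    best_shift, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        err = np.mean((l_rows - np.roll(r_rows, s)) ** 2)
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift
```

A production pipeline would of course operate on log-encoded DPX data and use far more robust alignment, but the principle — measure the disparity between eyes, then correct one eye toward the other — is the same.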

 

For Journey, Lowry Digital did some custom processing work, including noise reduction to even out the camera information, matching one camera to another and making the images sharper. “We were able to balance things up,” Lowry says.

 

Indeed, all the post work on Journey involved technology as sophisticated as the wizardry used in image acquisition. When planning that work, Walden Media postproduction vice president Jonas Thaler turned to veteran postproduction supervisor Steven Kaminsky (Superman Returns, The Mask), who managed the post, editorial and 3D mastering.

 

3D Post

“My [first] task was to figure out and plan our 3D mastering process, which included putting together an in-house 3D DI team of specialists, procuring hardware, selecting software, planning for reusability, and developing the overall business model for mastering the film,” says Kaminsky.

 

The film's DI project studio was located at Widget Post in Hollywood. “We set up servers, workstations, 3D projection and everything that we needed in one room. It was a beautiful purpose-built project studio where the 3D DI team worked every day, and the filmmakers, VFX team and studio staff convened daily for working sessions.”

 

One system Kaminsky ended up using in the DI process was Assimilate's Scratch, which he had used on Superman Returns. “We looked at their 3D capabilities and found that Scratch had the 3D features we needed.”

 

“Our biggest concern for Scratch was the size of this project,” Thaler adds. “There was a huge amount of footage and content from several sources, and we had roughly 800 visual effects, which [because of the stereoscopic nature of this work] means you multiply by two for the number of files. This was an ambitious project for a cost-effective software tool that we set up ourselves, but it performed amazingly well.”

 

Kaminsky assembled an experienced DI mastering crew, including conform supervisor Gary Jackemuk, who also filled the role of daily go-to for operating the digital pipeline, DI colorist Jeff Olm, I/O supervisor Michael Fellows and visual effects artist Judith Bell, who took on a number of special projects and hundreds of in-house shots done directly by the 3D DI team.

 

Hardware at the 3D DI suite included a dozen Hewlett-Packard 8400 quad-processor workstations with Nvidia GeForce FX5600 graphics cards, 14 Samsung monitors and three Facilis 24D TerraBlock servers. Four Scratch stations were set up--one each for Conform, Color Grading, QC and VFX Reviews.

 

Kaminsky explains, “The real-time feature set of Scratch really pays off. The team was quickly able to assemble the film, provide a live working build for reviewing and finishing VFX, and create the color language from the very beginning. The team spent 10 months on postproduction, which, with Scratch, was able to be done in stereoscopic 3D from the outset.”

 

Footage from the Cameron/Pace Fusion 3D cameras was recorded onto dual HDCAM SR tapes (one tape for each eye). Once footage was converted to DPX files, it was imported into Scratch. Avid was used to edit the offline cut and create the EDLs. The visual effects were delivered in DPX format and were easily dropped into Scratch.

 

Two projectors were used for the 3D viewing. “Viewing in 3D was really crucial to the project. With the enormous amount of files for this type of production, you want to avoid 2D and the ongoing iterations of viewing at another facility, and then doing the work over again,” says Kaminsky. “The real-time 3D pipeline and working process enabled a dynamic, fluid workflow for all of us.”

 

Gary Jackemuk, who worked with Kaminsky on Superman Returns, supervised the conform and finishing group. “We were in uncharted territory, at the bleeding edge in building the 3D pipeline,” says Jackemuk. “We knew Scratch could manage a lot of color data, and this project was going to put Scratch through its paces. We were learning as we moved forward, and the Assimilate team was a big help along the way. I also developed some custom software to conform the left- and right-eye data simultaneously, then pushed it into XML files for Scratch to ingest.”
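Jackemuk's custom conform tool — pairing left- and right-eye media per edit event and emitting XML for Scratch to ingest — can be sketched in outline. The format of Scratch's actual ingest XML is not described in the article, so everything below (element names, the `_L`/`_R` eye-suffix naming scheme, the event dictionary) is a hypothetical illustration of the pairing step only.

```python
import xml.etree.ElementTree as ET

def build_conform_xml(events):
    """Build a conform manifest that pairs left- and right-eye media
    for each edit event. `events` is a list of dicts with keys
    "clip", "rec_in", "rec_out". Naming scheme is assumed."""
    root = ET.Element("conform")
    for ev in events:
        e = ET.SubElement(root, "event",
                          rec_in=str(ev["rec_in"]),
                          rec_out=str(ev["rec_out"]))
        # One event, two media entries: the stereo "multiply by two"
        # that runs through the whole pipeline.
        ET.SubElement(e, "media", eye="left").text = f"{ev['clip']}_L.dpx"
        ET.SubElement(e, "media", eye="right").text = f"{ev['clip']}_R.dpx"
    return ET.tostring(root, encoding="unicode")
```

The point of doing both eyes in one pass, as Jackemuk describes, is that a single EDL event fans out to two media references that must stay in lockstep through conform, QC and rendering.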

 

Jackemuk adds, “We had Mike Fellows as our I/O supervisor, one of the best there is, and he managed all the data using the Scratch data management tool. He would do a batch capture of the HD SR tapes using the Symmetry Bluefish card, one eye at a time off the Avid EDL, and then I could conform for the next phases. We had a huge number of visual effects from a variety of VFX houses, and Mike had to carefully QC every shot twice, for each eye. Mike kept all that data, all the versions throughout the entire project, running smoothly. We built some custom FileMaker Pro database tools so that color notes, shot QC status and VFX DPX delivery inventory were all managed in one system. This helped us keep the process organized and enabled the VFX team to have a live view of the DI status of their shots updated in real time as we worked.”

 

“We did all the rendering in Scratch in two passes: left eye and right eye. The final conformed, graded and QC’d files were exported to DPX sequences and delivered to RealD, Dolby and Technicolor for the final encoding, conversion, packaging and creation of the release masters,” adds Jackemuk.

 

Jeff Olm did all the color grading and post stereo adjustments for Journey. He also worked with the director, DP Chuck Schuman and VFX Supervisor Christopher Townsend to set the convergence for the focal point that aligns the left and right eyes for every shot. (Convergence is the perception of how far off the screen the image appears to the audience, or the illusion of depth.) “I worked the right eye for the primary adjustment. The Scratch guys made a special hot key for me that would immediately copy over the left eye to the right eye in real-time stereo. This was a huge time-saver,” says Olm.

 

“We then imported the files into RealD’s 3D theater and did a trim pass in just a few hours. This was a major time savings for the production.”

 

Olm continues, "For the grading sessions, we used a 30-foot screen with two NEC digital projectors with circular polarizers: one polarizer for the right eye and one for the left. Digital projection is key for stereo, so all the images stay in alignment. This way you can view the imagery in 3D with 3D glasses to see the full stereo effect. We could see the results in real-time 3D and make adjustments, which no other film lab or post house could do at this time."

 

Active Convergence

Another development in 3D technology used on Journey was a tool called “active convergence,” with which the filmmakers were able to change the focal point in 3D. “Much like a traditional filmmaker changes focus, we put the point of convergence where the action is,” explains the director, “so that as people are watching and experiencing the 3D movie, it makes it much easier to watch.”

 

“By selectively changing the lenses’ angles, we can adjust the apparent screen depth during a shot to follow the action,” elaborates the cinematographer. “For scenes in which the camera needed to be placed closer to the actors than the Fusion’s side-by-side setup would allow, we used Pace’s compact 'Beam Splitter' camera rig that allowed for closer spacing between the lenses. To the audience, this means less eye-crossing and more comfortable viewing.”
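The relationship between lens angle and convergence distance that active convergence exploits is, in the simplest symmetric "toed-in" model, plain trigonometry: the two optical axes cross at the convergence plane. The sketch below is that simplified geometric model, not Pace's actual control math; the 65 mm interaxial in the test is just a typical human-eye-like spacing used for illustration.

```python
import math

def toe_in_angle(interaxial_m, subject_dist_m):
    """Per-camera toe-in angle (radians) that places the convergence
    plane at the subject distance, for a symmetric converged rig."""
    return math.atan((interaxial_m / 2.0) / subject_dist_m)

def convergence_distance(interaxial_m, toe_in_rad):
    """Inverse: the distance at which the two optical axes cross."""
    return (interaxial_m / 2.0) / math.tan(toe_in_rad)
```

As the subject approaches, the required toe-in angle grows — which is why close work pushed the production to the beam-splitter rig, whose narrower effective interaxial keeps those angles (and the audience's eye-crossing) small.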

 

cineSync Ensures Color Consistency

Other tools critical to the postproduction workflow of Journey were the cineSync remote review-and-approval system and the cineSpace color management system, which are made by Rising Sun Research of Australia. cineSync and cineSpace were used by Montreal’s Meteor Studios (now Lumiere VFX) to keep quality consistent during visual effects creation for the film.

 

Meteor was the lead visual effects house on the project and worked with Quebec-based Hybride to create environments for specific shots and sequences. While the effects work was concentrated in Quebec, postproduction moved among Montreal, Vancouver and Los Angeles, which required the effects and post teams to perform daily remote reviews of both the mono (master eye) and stereoscopic views.

 

In this kind of complex workflow, cineSpace “ensures that all artists are looking at the same thing, and that what's seen on the screen during grading and visual effects reviews is a faithful representation of the final output,” says Rising Sun's Jeremy Pollard.

 

In the case of Journey, cineSpace was used “to check the color accuracy of the left- and right-eye images, which are themselves 2D images, maintaining consistency through the pipeline and between different facilities,” adds cineSync product manager Rory McGregor.

 

RealD and 3D

Journey is the first full-length live-action narrative film from a major studio to be released in digital 3D theaters--and it also happens to be the widest 3D release in the history of the technology. (The film will also be released simultaneously in 2D theaters.)

 

What's more, Journey also marks the first theatrical release available to the public in RealD XL, a larger-screen technology that allows projection to screens over 60 feet wide with a single projector.

 

According to president and co-founder Joshua Greer, RealD now has about 1,000 locations in the U.S. Coincidentally, Greer was at Walden Media seven years ago when Journey was greenlit. “For me personally, it has been a very long time to get to this point,” he says.

 

Greer believes that the growth of 3D will have a significant impact on the way films are shot. “It fundamentally changes not only the type of gear you use but how you think about setting up and framing a shot. We're discovering that as studios are getting more and more committed to the medium, they're having to think about such things even before they board out the film, almost from the time scripts are being conceived.”

 

All the production details seem to have worked out as planned on Journey. Pace, for one, wouldn't have done anything differently. “The Journey project went very well,” he says. “As Jim [James Cameron] states, 'When you go first, you don't make mistakes.'”

 

In the end, moviegoers will decide the future of 3D with their box office dollars. If they like the experience, they'll keep coming back. Thaler believes they will. “Journey to the Center of the Earth 3D offers a depth of imagery that brings the viewers into the film, as if they’re looking through a picture window and can feel themselves ready to step into the action,” he says.

 
