Thursday, April 24, 2008

IMAX and Warner Bros. Pictures Hit $400 Million Box Office Milestone

http://www.imax.com/corporate/content/investor/intro.jsp

 

April 24, 2008

 

Studio's Movies Presented in IMAX a Hit with Moviegoers

 

Partnership Helped Shape New Distribution Platform for Event Movies

 

LOS ANGELES, April 24 /PRNewswire-FirstCall/ - IMAX Corporation and Warner Bros. Pictures today announced that they have crossed the $400 million mark at the IMAX box office with films released through the partnership between the two companies. Since June 2003, Warner Bros. Pictures has released 16 titles in IMAX's format, five of which were in IMAX(R) 3D, including two live action pictures that were partially converted into 3D with IMAX's proprietary 2D to 3D conversion technology. All titles were digitally re-mastered into the image and sound quality of The IMAX Experience(R) using IMAX's proprietary IMAX DMR(R) (Digital Re-Mastering) technology. Also included in the tally are the studio's two original IMAX 3D co-productions.

 

"Our successful partnership with IMAX has helped to shape an entirely new distribution window and a completely new form of premium cinematic entertainment," said Dan Fellman, President of Domestic Distribution at Warner Bros. Pictures. "The IMAX(R) theatre network has enabled us to generate incremental box office returns, and we are very enthusiastic about our upcoming film slate, which will include both original IMAX 3D programming and our tent-pole releases."

 

"The IMAX release adds a layer of excitement to major titles, and with more audiences looking for something special at the movies, we anticipate continued strong box office returns from these theatres internationally," said Veronika Kwan-Rubinek, President of International Distribution, Warner Bros. Pictures.

 

"Warner Bros. Pictures' dedication to the IMAX film business, their talent for making great films and their enthusiastic endorsement of IMAX DMR and original programming has enabled our partnership to reach this impressive milestone," said IMAX Co-Chairmen and Co-CEOs Richard L. Gelfond and Bradley J. Wechsler. "With our digital projection system on the verge of rapidly expanding the IMAX network worldwide, we are confident that we'll reach many more milestones with Warner Bros. Pictures."

 

The studio's Hollywood IMAX releases that contributed to the $400 million milestone and helped shape the IMAX network as a new distribution platform include The Matrix Revolutions, The Matrix Reloaded, Harry Potter and the Prisoner of Azkaban, The Polar Express (in IMAX 3D), Batman Begins, Charlie and the Chocolate Factory, Harry Potter and the Goblet of Fire, V for Vendetta, Poseidon, Superman Returns (in IMAX 3D), The Ant Bully (in IMAX 3D), Happy Feet, 300, Harry Potter and the Order of the Phoenix (in IMAX 3D), Beowulf (international) and I Am Legend.

 

The original IMAX 3D films co-produced and released with Warner Bros. Pictures include NASCAR 3D: The IMAX Experience (2003), which has grossed nearly $24 million at the IMAX box office, and Deep Sea 3D (2006), which has grossed an impressive $64 million at the IMAX box office on a limited number of screens and is still playing well throughout the IMAX network.

 

"Warner Bros. Pictures has been instrumental in bringing The IMAX Experience into mainstream," added Greg Foster, Chairman and President of IMAX Filmed Entertainment. "They were the first studio to release a feature film in IMAX 3D, the first studio to release a live action film that had been converted into 3D, and they introduced the world to a whole new brand of original IMAX 3D programming that is both educational and entertaining. We look forward to continued success with them and their incredible filmmakers as we collaborate in the release of a broad range of pictures over the next few years."

 

The 2008, 2009 and 2010 lineup for Warner Bros. Pictures and IMAX currently includes Speed Racer (May 9), The Dark Knight (July 18), Harry Potter and the Half-Blood Prince (November 21), Under the Sea 3D (February 2009) and Hubble 3D (February 2010).

 

 

Real D Launches Consumer and Home Products Division! YES - 3D For Your Home Theater!!

http://marketsaw.blogspot.com/2008/01/real-d-launches-consumer-and-home.html

 

Thursday, January 03, 2008

Real D Launches Consumer and Home Products Division! YES - 3D For Your Home Theater!!

Awesome news folks! Real D is launching a Consumer and Home Products Division to take advantage of their technology advances in the home as well as in the theater. So after much speculation in the space, a true leader, Real D, has emerged as a champion of bringing 3D to the Home Theater.

This also means that now the studios have a potential DVD distribution model for their brand spankin' new 3D movies that they are flogging at the cinema. See the movie in 3D, then buy the DVD for your Real D enabled 3D Home Theater. Piracy issues have been drastically reduced but not eliminated - see comments!

Real D announced that Koji Hase has been appointed to the position of President of Worldwide Consumer Electronics. Here is a quote regarding his background:

http://usstock.jrj.com.cn/news/2008-01-04/000003133134.html

REAL D 3D Names Renowned DVD Pioneer Koji Hase President of Worldwide Consumer Electronics (via kwout)

Awesome news! Now I have true plans for http://www.3DHomeVideos.com ! :-)

 

How Things Get Invented

http://community.reald.com/blogs/real_d_blog/archive/2007/07/27/469.aspx

 

Source: Lenny Lipton, Real D CTO

 

The history of motion pictures is an interesting one, and I am learning more about it in the context of my present work inventing stereoscopic motion picture systems, and in connection with the work I am doing with studios and filmmakers.  I take working with filmmakers seriously because the quality of the Real D system is judged by the content projected on our screens.  I was recently appointed as the co-chair (Peter Andersen is the other co-chair) of the sub-committee of the ASC Technology Committee tasked to help figure out workflow, production pipeline and stereoscopic cinematography issues.  These subjects are tentative and still need to be developed, and we’re all learning together.

 

 The stereoscopic cinema, in its present incarnation, as manufactured by Real D, is entirely dependent upon digital and computer technology.  Digital projection allows for a single projector, while other stereoscopic systems use two projectors.  Two projectors work well in IMAX theaters, based on my observations.  I cannot say the same for theme parks, whether they use film or digital technology, because there are occasions when the projected image is out of adjustment. 

 

Replacing multiple machines with a single machine – i.e. a projector – is the way to go, especially in today’s projection booths, because typically there is no projectionist in the booth while the film is being projected.  There is a technician who will assemble the film reels and make sure everything is going to project well, but then it is somebody else – maybe the kid at the candy counter – who actually works the projector and makes adjustments.  (Interestingly, the kid at the candy counter may be well qualified to work the servers and projectors because of his or her PC experience.)

 

The product that I invented, the projection ZScreen®, has been used for years for the projection of CAD and similar images for industrial applications.  Real D turned the ZScreen into a product that had to work even better for theatrical motion picture applications.  It turns out that the film industry has very high standards when it comes to image quality.  This is easy to understand, because the industry lives or dies by image quality.

 

The stereoscopic cinema has had a long gestation.  To date, this is the longest gestation of any technology advance in the history of the cinema.  For example, within about three decades of the invention of the cinema, sound was added.  There were numerous efforts to make sound a part of the cinema and make it a bona fide product.  In the three-year period from about 1927 to 1930, rapid advances were made both in sound technology and in aesthetics.  If you take a look at movies that were made in 1927, and then you see movies that were made in 1930 or 1931, there’s a gigantic difference.  Movies made in the early 1930s look a lot like, and sound like, modern movies.  There was a tremendous advance in the technology and in filmmaker know-how in a short period of time.

 

It is the creative professionals who will perfect the stereoscopic medium.  That’s exactly what they did every time a new technology came along, whether it was sound, color, widescreen, or computer-generated images.  In fact, those are the major additions to the cinema, and they all took decades to become an ongoing part of the cinema. Ads for movies never say, “This is a sound movie,” or “This is a color movie,” or “This movie is in the widescreen (or ‘scope) aspect ratio.”  It’s assumed.  It’s a rare movie that is in black-and-white.  It’s an even rarer movie that is silent.  And nobody is going back to shooting 4:3 Edison aspect ratio movies.  (Curiously, that’s more or less the aspect ratio used by IMAX for their cinema of immersion.) 

 

An attempt was made in the early 1980s to use a single projector with the above-and-below format – essentially two Techniscope frames that could be projected through mirrors or prisms or split lenses, optically superimposed on the screen, and polarized.  The audience used polarizing glasses to view the images in 3-D.  I was the chairman of the SMPTE working group that established the standards for the above-and-below format.  But as soon as the standards were established, the above-and-below format was more or less abandoned.  A few films like Comin’ At Ya! or Jaws 3-D, and one I worked on, Rottweiler: Dogs of Hell, were projected above-and-below, an approach that was technically inadequate.  For one thing, it was hard to set up the projector properly and achieve even illumination.  I know; I set up a few, and it was tough to do a good job because of the design of the lamp housings and the projectors.

 

 Curiously it was the above-and-below format that led me to the first flicker-free stereoscopic field-sequential computer and television systems.  I noticed that the above-and-below format was applicable to video, because that which is juxtaposed spatially can, with the injection of a synchronization pulse between the two frames, become juxtaposed temporally when played back on a CRT monitor; so the first StereoGraphics systems used the above-and-below format. 

 

The above-and-below video format, which is applicable to video or computer graphics, results in a field-sequential image that can be viewed using shuttering or related polarizing selection techniques.  I designed the first flicker-free field-sequential system in 1980.  It used early electro-optics that were clunky, but the flicker-free principle was established.  Using 60 Hz video, for example, with the above-and-below format, one achieved a 120 Hz result, that is to say, 60 fields per second per eye.  The field-sequential system is what is used for the Real D projection system.  The electro-optics are different.  There’s the ZScreen modulator used in the optical path in front of the projection lens, and audience members wear polarizing eyewear.  (The combination of ZScreen and polarizing eyewear actually forms a shutter.  You can classify the system as either shuttering for selection or polarization, but in fact a proper classification is that it uses both polarization and shuttering.)  But the principle is the same as that used for the early stereo systems I developed.  The right eye sees the right image while the left sees nothing and vice versa, ad infinitum, or as long as the machine is turned on.
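
To make the spatial-to-temporal idea concrete, here is a minimal sketch in Python (an illustration only, not StereoGraphics or Real D code) of how an above-and-below frame becomes two sequential fields: split each stored frame into its top and bottom halves and emit the halves alternately, which doubles the field rate, so 60 Hz video yields 120 fields per second, or 60 fields per second per eye.

```python
import numpy as np

def above_below_to_field_sequential(frames):
    """Split each above-and-below frame into its two half-height sub-images
    and emit them alternately: left, right, left, right, ...
    The top half is assumed to hold the left-eye image and the bottom half
    the right-eye image (that convention is an assumption of this sketch)."""
    for frame in frames:
        half = frame.shape[0] // 2
        yield frame[:half]   # left-eye field
        yield frame[half:]   # right-eye field

# One second of 60 Hz above-and-below video becomes a 120 Hz field stream,
# i.e. 60 fields per second per eye.
source_rate_hz = 60
dummy_frames = [np.zeros((480, 640)) for _ in range(source_rate_hz)]
fields = list(above_below_to_field_sequential(dummy_frames))
print(len(fields), "fields per second,", len(fields) // 2, "per eye")  # 120, 60
```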

 

The issue I had to solve in 1980 was this:  How to make an innately 60 Hz device work twice as fast.  And the above-and-below format did just that.  We had to modify the monitors to run fast, but for a CRT monitor it wasn’t that hard.  There are two parts to stereoscopic systems’ issues:  the selection device design and content creation.  Today we are faced with the same design issue I was faced with in 1980.  In addition, content creation has always been a major issue, and that’s why I am working with the film industry to work out compositional and workflow issues.

 

Engineer Jim Stewart (left) and I are working on the first electronic stereoscopic field-sequential system that produced flicker-free images (circa 1980).  We used two black and white NTSC TV cameras as shown, and combined the signals to play on a Conrac monitor, which, without modification, could run at 120 Hz.  The images were half height, but we proved the principle.  Stewart is wearing a pair of welder’s goggles in which we mounted PLZT (lead lanthanum zirconate titanate) electro-optical shutters we got from Motorola.  The shutters had been designed for flash blindness goggles for pilots who dropped atomic bombs.  I kid you not.

 

Published Friday, July 27, 2007 9:59 AM by Moderator

The Three Dimensional Cinema and Digital Technology: A Match Made in Heaven

http://community.reald.com/blogs/real_d_blog/archive/2008/01/28/540.aspx


The revival of the stereoscopic theatrical cinema is intimately linked to the rise of digital technology for the production and projection of motion pictures. The term “digital” when applied to cinema means many things. Most people would assume a strong linkage with computers; and indeed computers play an important part in the digital cinema, from image capture or generation to projection. Whether the computers are servers or in projectors, the digital cinema depends not only on this technology but on modern display technology, including the Texas Instruments DLP light engines. My purpose is to acquaint the reader with some understanding of how the stereoscopic medium and the digital medium work together nicely for the capture or the creation of stereoscopic images.

It would be more pleasing to me, for one, to call this the electronic cinema, rather than the digital cinema, but this isn’t an article about technology definitions and everybody knows what I am talking about. Oddly, it is electronic movies, or television, that has begun to replace chemical-based photography, because the current digital cinema is clearly an outgrowth of television. It is this isomorphism that gives the studios such fits, because the distinction between the 1920-pixel HD television standard and the 2K theatrical standard is of interest to, and possibly only noticeable by, experts.

It is my purpose to illuminate why the combination of stereoscopy and digital technology is such a neat one, providing so many benefits. First let’s take a look at the content creation aspects of the medium, which fall into several categories: Live-action photography, animation by means of computer generated images, animation by means of performance capture, and conversion from planar to stereo.

The differentiation between computer generated animation and performance capture is one that does not have a sharp dividing line. There are movies that are touted as having performance capture, such as Beowulf, and there are movies such as Monster House that also use performance capture but make no mention of it in their promotion or advertising. In a naïve time the use of rotoscoping, the progenitor of motion capture, was a hush-hush affair and reports of its use in Snow White were denied by Disney. But they obviously used it.

For computer generated images and for performance or motion capture, digital technology plays a powerful role – and indeed it is utterly impossible to conceive of this means of content creation without digital technology. In the 3D boomlet of the fifties, except for a few cel animation shorts, all the features were live action. But in these first two years of the renaissance, until recently, all the features were CG animation, which is history looping back on itself, because stereoscopy was invented using drawings, before photography existed.

All of the major animation studios that produce computer generated images have physicists and imaging specialists who are attempting to produce a computer world that can be rendered with remarkable real-world fidelity or with controlled departures from the real world, to produce a beautiful visual effect. The people who create this content – the animators, background artists and other specialists – for the most part deal with content creation on an intuitive level. They aren’t doing calculations, but they are using computers. They need to be able to do what they do as any creative artist does, using intuition to work the medium.

Whether their endeavors are based on animator’s skills or the artist’s ability to create backgrounds, generally speaking they are dealing with three-dimensional databases that exist as algorithms and numbers in a computer. These three-dimensional databases have to be fully rendered and captured by a virtual camera, and for a stereoscopic version what is required are two perspective views; so there must be two virtual cameras. These two cameras must be set up and coordinated according to the geometry of stereoscopic image capture.

The same kind of remarks can be made for performance capture, in which motion vectors of the actors’ bodies and faces are turned into a database. That database is then manipulated into characters that are inserted into a computer generated world, or for that matter the characters could be placed into photography of the real world.

For camera-captured images, digital (or electronic) technology leaves film-based photography of stereoscopic images in the dust. Cameras that depend on modern CMOS and CCD technology aren’t strictly digital, but produce analog signals that are captured digitally and are then recorded digitally either on hard drives or tape drives. One benefit of these video cameras is that they can be lighter and more compact than film cameras. This is important because two cameras make a rig, and two big heavy cameras become a big, heavy, clunky rig. Also, it is very good to be able to get the lenses as close together as possible, especially for close-ups but also for medium shots.

During capture and immediately after capture it is desirable to look at the images. It is possible to look at the images on various kinds of stereoscopic monitors during photography, and without the need to process film and look at dailies (typically the next day), the cinematographer, the director and other creative and technical people can look at the images right away. In fact, they can often look at them on large screens – sometimes on a theater-size screen. It is very important to be able to do this, because it is so hard to visualize how stereoscopic images will look. It turns out to be a real bear to be able to predict the stereoscopic effect. If you have to resort to calculators and rules to try to figure out whether the image is going to look good, stereoscopic photography becomes difficult to do. But if you can actually see what you’ve done real-time (or shortly thereafter), you can improve and correct and tweak what you’re shooting. The same remarks that are made here with regard to the ability to view stereoscopic camera-captured images apply to computer generated images, because the content creators are able to look at stereoscopic images real-time on their desktops or in their sweatboxes. (A sweatbox is a little theater.)

The same considerations apply to conversion technology. There are a number of firms that now specialize in converting planar movies to stereoscopic ones. They are all doing more or less the same thing, depending on artists and computers to help them get a decent result with reasonable throughput. The basic idea is to outline foreground objects, lay the skins of those objects on a wire-frame mesh or a depth map, and treat the background by filling in missing data and modeling it where required. All of which would be impossible without digital technology.
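
As a rough illustration of the basic idea described above (and emphatically not any conversion house's actual pipeline), the Python sketch below synthesizes a second view from a single image plus a depth map by shifting each pixel horizontally in proportion to its assigned depth; the holes that open up behind foreground objects are the missing background data that must then be filled in.

```python
import numpy as np

def synthesize_right_view(image, depth, max_shift=8):
    """Very crude 2D-to-3D sketch: shift each pixel left by an amount
    proportional to its depth value (0.0 = far, 1.0 = near).  Returns the
    synthetic right-eye view and a mask of the disoccluded holes."""
    h, w = image.shape[:2]
    right = np.zeros_like(image)
    filled = np.zeros((h, w), dtype=bool)
    shifts = (depth * max_shift).astype(int)      # per-pixel horizontal shift
    for y in range(h):
        for x in range(w):
            nx = x - shifts[y, x]                 # nearer pixels move farther
            if 0 <= nx < w:
                right[y, nx] = image[y, x]
                filled[y, nx] = True
    holes = ~filled    # background that must be painted in by artists/software
    return right, holes

# Toy example: a bright "foreground" square on a dark, distant background.
img = np.zeros((64, 64))
img[20:40, 20:40] = 1.0
dep = np.zeros((64, 64))
dep[20:40, 20:40] = 1.0
right_view, holes = synthesize_right_view(img, dep)
print("disoccluded pixels to fill:", int(holes.sum()))   # 160 in this toy case
```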

We have looked at the major ways in which content can be created. We will now look at a vital portion of the filmmaking process, which is post-production. Post-production involves an array of procedures that create the film after photography. These include the manipulation of picture elements and sound elements into a finished product that can then be released to the theaters. When a stereoscopic film is cut it’s a good thing to be able to see it in 3D so that the editor and the director can understand how shots interact with each other. There are many prejudices, opinions and myths about stereoscopic cutting – about what works and what doesn’t work in 3-D movies: for example, whether a lot of depth-of-field is required, whether fast cuts are allowable or slow cuts are better to allow the stereoscopic effect to build. Theories matter only to a small extent. The eyes of the beholder rule. So if editors and directors can see what they are doing stereoscopically, that’s a tangible benefit. And the well-known advantages of cutting a film digitally apply here in spades. It is beneficial because of the difficulties in visualizing stereoscopic images and how shots interact.

An important process for camera generated material in particular is called rectification, which is a term that comes to us from aerial photography. If the left and right images have any distortions or magnification errors, they can, to a large extent, be fixed in post-production by tweaking the geometry of the two images so they correspond. This becomes important for zoom lenses, because zoom lenses have great big problems in terms of optical centration, which causes spurious parallax values. These problems can be fixed in post-production, and there are both proprietary and off-the-shelf tools for doing so. For the most part these errors can be eliminated, as can the color and density shifts between the left and right images that can occur in cinematography.
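
A toy version of one small piece of rectification, assuming the only error is a global vertical misalignment between the two camera heads (real tools also handle rotation, magnification and keystone): search for the vertical shift of the right-eye image that best matches the left-eye image, then undo it. This is illustrative Python, not any proprietary or off-the-shelf product.

```python
import numpy as np

def estimate_vertical_offset(left, right, max_offset=10):
    """Brute-force search for the global vertical shift (in pixels) that best
    aligns `right` to `left`, by minimizing the mean squared difference."""
    best_dy, best_err = 0, np.inf
    for dy in range(-max_offset, max_offset + 1):
        err = np.mean((left - np.roll(right, dy, axis=0)) ** 2)
        if err < best_err:
            best_dy, best_err = dy, err
    return best_dy

def rectify_vertical(left, right, max_offset=10):
    """Return a copy of `right` shifted so its rows line up with `left`."""
    return np.roll(right, estimate_vertical_offset(left, right, max_offset), axis=0)

# Demo: fabricate a 3-pixel vertical misalignment and recover it.
rng = np.random.default_rng(0)
left = rng.random((100, 120))
right = np.roll(left, -3, axis=0)              # right-eye image 3 rows too high
print(estimate_vertical_offset(left, right))   # -> 3
```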

The most important and well-developed element in this current phase of the stereoscopic cinema is stereoscopic projection. With a single DLP light engine projector, a stereoscopic image can be created using the time-multiplex, or field-sequential, mode. I am the first person to create flicker-free images for the time-sequential process, and the primary inventor of the major selection techniques used with field-sequential stereoscopic presentations: CrystalEyes active shuttering eyewear and the ZScreen, the electro-optical modulator used by Real D. It fits in front of the projection lens and switches the characteristics of polarized light in synchrony with the projection fields. Other systems are extant, such as shuttering eyewear systems, or the Dolby system, which is an advanced form of anaglyph.

Only the projectors made by the manufacturers licensing DLP technology from Texas Instruments – Christie, Barco and NEC – meet the required specification for field-sequential 3D. In order to make the Real D, NuVision shuttering eyewear, or Dolby systems work, you have to have a rapid sequence of frames projected on the screen. And only the DLP can refresh fast enough. In the case of material captured at the film standard rate of 24 frames per second, these systems work best when projecting at 144 frames per second. There are two 24-fps images for 48 fps, and each image is repeated three times for a total of 144 fps. The images are concatenated, and a train of images (left, right, left, right, left, right, and so on) reaches the eyes. Half the time your right eye is seeing only the right images while your left eye sees nothing, and vice versa. If everything is done right, the result is good because the left and right images are treated identically by the projector in terms of geometry and illumination. The repetition rate of 144 frames per second lets us approach left and right frame projection simultaneity, another important factor.
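
The frame-rate arithmetic is easy to verify. The short Python sketch below (illustrative only, not projection-server code) builds the triple-flash field order described above and confirms the 144 fields per second figure.

```python
def triple_flash_sequence(capture_fps=24, flashes_per_frame=3, seconds=1):
    """Build the projected field order for triple-flash stereo: each captured
    frame pair is shown as L R L R L R before the next pair is shown."""
    sequence = []
    for frame in range(capture_fps * seconds):
        for _ in range(flashes_per_frame):
            sequence.append(("L", frame))
            sequence.append(("R", frame))
    return sequence

seq = triple_flash_sequence()
print("fields per second on screen:", len(seq))            # 24 x 3 x 2 = 144
print("distinct images per second (both eyes):", 24 * 2)   # 48
print("start of the train:", seq[:8])   # L0 R0 L0 R0 L0 R0 L1 R1 ...
```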

Dual-projection systems require a lot of tweaking; and even after they have been tweaked they can drift out of spec. It’s not that it is impossible to make dual-projection systems work. It is simply that they are not a real product that you can count on given current technology in digital cinemas, which requires not only the projection of a beautiful image but a dependable process and an image that does not require constant monitoring.

Digital technology – content creation, post-production, and projection – has enabled the stereoscopic medium to become a part of the filmmaking armamentarium; not only to provide beautiful projection but to provide a dependable product, free from the mistakes of the past, which I don’t want to dwell on because they’re such a bummer. But today’s modern 3D digital projection is free from fatigue and eyestrain, and can now allow content creators to do their best to discover the art of this new medium. We’re going to see several years of experimentation and discovery, and at the end of that time the stereoscopic medium will be on a firm foundation. Creative people will never stop creating, but we will reach a plateau where many of the creative and production technical processes become routinized. Oddly enough, the reintroduction of the stereoscopic cinema comes down to turning that which had been more or less a laboratory experiment into a routine.

And none of this would have been possible without DLP projection, which is the invention of Larry Hornbeck, whom I just had the pleasure of meeting at the SPIE Stereoscopic Displays and Applications Conference in San Jose. Larry asked me for my autograph, so I asked him for his. As you can see, his autograph is on the back of a pair of paper 3D eyewear, which is entirely appropriate.

Published Monday, January 28, 2008 12:23 PM by Moderator

 

Compositional differences: Real D VS. IMAX


http://community.reald.com/blogs/real_d_blog/archive/2008/02/15/547.aspx

The history of western art, of painting and then photography, has been informed by the rectangle.  Virtually all painting and photography exists within the confines of a rectangular frame, and it is the edges of the rectangle that create the compositional boundaries and structure.  You don’t look through a rectangle when you’re seeing the visual world but the rectangle abruptly limits the visual field in a painting, or a photograph, or a projected motion picture image.  The placement of objects in space within the limits of the rectangle determines the composition.  Up until now visual artists have had to deal with the rectangle.

With the introduction of head-mounted displays, the rectangle could take a rest – or at least try to disappear.  The idea of virtual reality or augmented reality as exemplified by HMDs is a relatively new idea.  You can find references to the concept in science fiction in which images are piped directly into the brain.  

Granted, most HMDs have a limited field of view and the rectangle remains apparent.  But the idea, in its full embodiment, would be for the image to subtend most or all of the visual field, and over the years I’ve seen devices that were designed to do just that – to immerse the viewer in a visual experience.  The reason that I take the trouble to describe this is that the IMAX experience is one that attempts to immerse the viewer in the totality of the image – to remove the rectangular boundaries.  The Real D experience, and most other motion picture projection, involves the rectangle.  The audience member will be aware of the rectangle; and the traditional concepts of composition apply.  The kind of balance, juxtaposition and placement of objects within the image field is critical in classical composition but is different for IMAX screens.

In IMAX, in which people are sitting close to a giant screen, the periphery of the screen is more difficult to discern and the rectangle becomes relatively unimportant.  The idea behind IMAX is to immerse you in the experience.  So people who shoot IMAX movies have to think about a different kind of composition. I’m concerned with the stereoscopic cinema so my remarks are predicated on that interest.  In the stereoscopic cinema, the rectangular boundaries are important because of the well-known effect of the conflict of stereoscopic cue of parallax and the extra-stereoscopic cue called interposition.  If off-screen (negative) parallax values are occluded by the screen edges – and this is especially true for the vertical edges of the screen – there will be a conflict of cues, which some people (possibly most people) interpret as a region of confusion.  Some people may say that the image looks like it’s pulled back into the plane of the screen; some will report that the image looks odd.  In any event, it’s something that has to be dealt with in the conventional stereoscopic cinema and doesn’t need to be dealt with in IMAX because the screen is so large that it’s hard to see the edges of the surround. (Another thought to put into the mix that might further confusion rather than understanding is that we are in a time of transition in which people are learning how to look at stereo movies and maybe with the passage of time the screen edge conflict will come to be accepted.)

The big screen changes the way IMAX composes a stereoscopic image.  In fact, when looking at IMAX movies in stereo, everything appears to be playing into theater space.  I define theater space as that which is in front of the plane of the screen, and screen space as that which plays behind the plane of the screen, and the boundary between the two (at the zero parallax condition) is the plane of the screen.  In IMAX, the plane of the screen more or less disappears.  It becomes not necessarily unimportant, but it certainly has a different meaning from that in the conventional stereoscopic cinema, or the rectangle-bounded stereoscopic cinema.  A lot of IMAX films are set up so that the background points will have about 2-1/2 inches of positive parallax – which is the average interpupillary separation for the adult male population – so in most cases this avoids producing divergence and gives a comfortable result when viewing background points.
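
The 2-1/2 inch guideline is easy to check numerically. Here is a small, idealized Python sketch (my own illustration, not an IMAX specification): background points whose positive parallax equals the interocular leave the eyes parallel, and anything wider forces them to diverge, however far the viewer sits from the screen.

```python
import math

def eye_rotation_deg(parallax_in, viewing_distance_in, interocular_in=2.5):
    """Angle (degrees) each eye must rotate outward (+) or inward (-) from
    parallel gaze to fuse a point with the given screen parallax (inches).
    Positive parallax wider than the interocular implies divergence."""
    return math.degrees(math.atan((parallax_in - interocular_in)
                                  / (2.0 * viewing_distance_in)))

# A background point with exactly 2.5 in of positive parallax, viewed from
# 20 ft (240 in): the eyes stay parallel, as if looking at a distant mountain.
print(round(eye_rotation_deg(2.5, 240), 3))   # 0.0
# A point with 5 in of positive parallax forces roughly 0.3 degrees of
# outward rotation (divergence) per eye from the same seat.
print(round(eye_rotation_deg(5.0, 240), 3))   # ~0.298
```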

The net effect of looking at an IMAX screen for many people is that everything is playing out into the theater.  Some people report that they feel immersed in the screen or in the image and that they feel within the image – and that’s certainly what IMAX is trying to accomplish.  Since they don’t have to worry about the conflict of cues at the screen surround, and they’ve got such a big image, their compositional theory is a different one from the conventional stereoscopic cinema.  

The upshot of this for IMAX is that it has a tremendous parallax budget.  Since they don’t have to worry about a conflict of cues at the screen surround, they’re free to have large values of parallax.  In fact, the last IMAX film I saw in a preview (and because it was in preview I don’t think it’s fair to critique it or mention what the film was) had parallax that was measured not in inches, not in feet, but in yards!  

To some people this is fun.  But to me, it produced an obtrusive image that I felt like I couldn’t get away from. The truth is that, for me anyway, the IMAX stereo experience is not a lot of fun.  Most people may not share that experience and the last time I went to see an IMAX 3-D feature with my family my wife and kids had a good time even if I didn’t.  I also find the big parallax values tiring on the eyes, and the increased ghosting with just a little bit of head-tipping not to be a plus.

For the Real D cinema and for other stereoscopic processes projecting on conventional cinema screens, which are typically between 20 and 50 feet in scope, the experience is different. Directors of stereography like Phil McNally and Rob Engle, at DreamWorks and at Sony Imageworks respectively, have been using a technique that allows the conventional cinema to also have a very large parallax budget.

It was almost two years ago that I sat with Phil McNally in the Egyptian Theatre on Hollywood Boulevard and watched films at the 3D Expo.  We saw shorts that had been shot by Raymond and Nigel Spottiswoode in the ’50s.  These were in black and white in the 1.3:1 aspect ratio.  Before we went into the theater, Phil had talked about the idea of the floating window, which he said he’d read about in my book, Foundations of the Stereoscopic Cinema.  Phil and I sat together and watched the Spottiswoode films with the floating window, and Phil said (with a British accent): “This is gonna work”; and he used it while he was the stereo supervisor at Disney for Meet the Robinsons.  He used vertical edges that were added to the left and right edges of the picture frame to build a virtual surround – in other words, a floating window that hovers in space between the screen and the audience so that images with large parallax values at the edges of the screen will not have a conflict of cues.

One paradox of the stereoscopic cinema is that from the plane of the screen to stereo-optical infinity there is a parallax budget of only 2-1/2 inches (after which divergence sets in).  But you can have many inches of off-screen parallax without bothering anybody’s eyes.  While it may not be an order of magnitude – maybe it’s half an order of magnitude – that gives you a big parallax budget, certainly good enough to represent any shot.  The problem that is solved is that there is no longer a parallax-occlusion conflict occurring at the edges of the screen.  Movies like Meet the Robinsons and Beowulf have used the floating window to good effect to increase the parallax budget.
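
To see where those numbers come from, here is a simple similar-triangles model in Python (an idealized sketch under the usual assumptions, ignoring accommodation and seat position): the apparent distance of a fused point follows from its screen parallax, the viewing distance and the interocular, which is why screen space runs out at 2-1/2 inches of positive parallax while theater space tolerates much larger negative values.

```python
def apparent_distance(parallax_in, viewing_distance_in, interocular_in=2.5):
    """Apparent distance of a fused point from the viewer, by similar triangles.
    Sign convention: positive parallax = screen space (behind the screen),
    negative parallax = theater space (in front of the screen)."""
    if parallax_in > interocular_in:
        raise ValueError("parallax beyond the interocular forces divergence")
    if parallax_in == interocular_in:
        return float("inf")                      # eyes parallel: stereo infinity
    return viewing_distance_in * interocular_in / (interocular_in - parallax_in)

V = 360.0                                        # viewer 30 ft from the screen
print(apparent_distance(0.0, V))     # 360.0 -> in the plane of the screen
print(apparent_distance(2.0, V))     # 1800.0 -> deep into screen space
print(apparent_distance(-10.0, V))   # 72.0  -> only 6 ft from the viewer
```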

Since the allowable theater space parallax is much greater than the screen space parallax, this gives the filmmaker a chance to exploit the medium and to increase the depth effect.  In fact, it is now on a par with that which is in IMAX theaters.  However, the image is still contained by the rectangle.  People talk about this stereoscopic method as being immersive, but who knows exactly what that means?  They may very well feel more immersed in the picture because it is a stereoscopic picture with a bigger range of parallax.  One thing is certain:  a trade-off has been made between parallax budget and off-screen effects.  That’s because the amount an object appears to be in front of the screen is judged relative to the plane of the screen, which is usually defined by the screen surround (the physical black border around the screen edges).  With a printed-on surround, the frame of reference is now literally the virtual window floating in space.  But the parallax values of the window can be varied from shot to shot, and even within a shot, and the values for the left edge don’t have to match the right edge, so shots requiring off-screen effects can still be had.  The virtual window can be thought of as an animated window that can be placed at any Z location at will, so off-screen effects can be accomplished.

Published Friday, February 15, 2008 10:04 AM by Moderator

 

A Glossary for Stereoscopic Filmmaking

http://community.reald.com/blogs/real_d_blog/archive/2008/02/08/544.aspx


Introduction

In any discipline nomenclature turns out to be of obvious importance.  It’s crucial for all the people who are doing the thing to agree on the same set of definitions.  Without that, it’s impossible to communicate – or it’s impossible to communicate without ambiguity.  Part of my job at Real D is to work with filmmakers and studios to help them (and me) figure out how to make stereoscopic movies.  This is an art that is being invented.  Digital projection has opened the door for advances in stereoscopic content creation.  Whereas in the past stereoscopic filmmaking was an erratic or a sporadic activity, or at best one that was practiced only by a handful of people typically for large format or theme park attractions, the door has now been opened for the conventional theatrical cinema, which has a greater activity level.  

In my trips to studios and filmmakers, I have learned that people are using different terminology.  Sometimes they are trying to say the same thing with different words, and sometimes they are trying to say different things with the same words.  It’s confusing, and it doesn’t help to develop a nascent art form to have such ambiguity or downright confusion.  So here is my stab at a glossary that is based on the one that appeared in the CrystalEyes handbook that I put together when I was at StereoGraphics.  I am hoping that in the weeks and months that follow the readers and I will have a chance to add to it.  So if you have entries or ideas for concepts that need to be defined, send them to me.  

I’ll provide a relatively benign example of what can go wrong.  In this example of a lack of precision everybody can more or less figure out what’s meant.  It’s the matter of interaxial versus interpupillary or interocular.  When doing stereoscopic cinematography, whether you are talking about virtual cameras in a computer-generated world or real cameras in the visual field, there needs to be some accepted term for the distance between the cameras; or more specifically the distance between the camera lenses; and to be even more specific, the distance between the camera lens axes.  The interaxial is the distance between the optical axes of the camera heads’ lenses.  I call them camera heads since a stereo rig (or camera) has two heads, and if we call the twin-lensed camera a camera we have failed to distinguish between the camera and the camera heads.  It’s the distance between the lens axes that, to a large extent, determines the strength of the stereoscopic effect.

The distance between people’s eyes, depending upon what field of medicine or science you’re in, is called the interpupillary or interocular distance.  The distance between your eyes’ lens axes when your eyes are converged at infinity is a number that doesn’t change once you reach adulthood, and the spread is between 55 and 75 millimeters for the adult population.  Typically, the interpupillary or interocular distance is given as an average of 65 millimeters for men.  This has led some misguided workers to believe that stereoscopic movies should always be shot with the camera lens axes separated by the interpupillary distance.

People who are involved with the capture of stereoscopic images will use the term “interpupillary” or “interocular” when they mean interaxial.  A camera lens, whether it’s a virtual or a real camera, has a lens axis that passes through the optical center of the lens and, if the lens is set up correctly, is orthogonal to the plane of the image sensor.  The distance between the left and right axes is called the interaxial separation.  So if you’re talking about the distance between people’s eyes, please call it interpupillary or interocular and everybody will know what you’re talking about.  If you’re talking about the distance between camera lenses, call it interaxial so we’re all speaking the same language.
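
To connect the camera-side term (interaxial) with the viewing-side term (parallax), here is a first-order Python sketch of the commonly used relationship for parallel camera heads with the zero-parallax distance set by horizontal image translation. The numbers are purely illustrative and the formula is a textbook approximation, not a statement of how any particular production computes it; note how blindly using a 65 mm interaxial can still produce background parallax far wider than the interocular.

```python
def screen_parallax_mm(interaxial_mm, focal_length_mm, magnification,
                       zps_distance_mm, object_distance_mm):
    """First-order screen parallax for parallel camera heads, with the zero
    parallax setting (ZPS) placed at zps_distance_mm by horizontal image
    translation.  Positive = screen space, negative = theater space."""
    return (magnification * focal_length_mm * interaxial_mm
            * (1.0 / zps_distance_mm - 1.0 / object_distance_mm))

# Illustrative setup (all numbers invented): 65 mm interaxial, 25 mm lenses,
# 300x magnification from a ~25 mm wide sensor to a ~7.5 m wide screen,
# and the ZPS set at 4 m.
tc, f, M, zps = 65.0, 25.0, 300.0, 4000.0
for d in (2000.0, 4000.0, 20000.0, 1e12):   # objects at 2 m, 4 m, 20 m, "infinity"
    print(f"object at {d / 1000:g} m -> {screen_parallax_mm(tc, f, M, zps, d):.1f} mm")
# The background parallax here is about 122 mm, nearly twice the 65 mm
# interocular, so even an "eye-spaced" interaxial can force divergence.
```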

The Glossary (a work in progress)

Accommodation.  The focusing of the eyes -- or more properly the ability of the eyes’ lenses to change shape in order to focus.

Accommodation/Vergence Relationship.  The learned relationship established through early experience between the focusing of the eyes and verging of the eyes when looking at a particular object point in the visual world. Usually called the accommodation/convergence relationship (or the convergence accommodation relationship.)

Anaglyph.  Wavelength selection using complementary colored images and color filters to filter or pass the appropriate perspective views to the appropriate eyes.

Autostereoscopic.  A stereoscopic display that does not require the viewer to wear a selection device such as special eyewear.  Sometimes called auto-stereo, which can be confused with a car sound system.

Beamsplitter.  Technically this is a couple of prisms cemented together with a semi-silvered layer to split a light beam into two halves.  For the rig used for stereo-cinematography a thin sheet of semi-silvered glass is used in the optical path, and such a device is more properly called a pellicule (or pellicle).

Binocular.  Two eyes.  The term binocular stereopsis (two-eyed solid seeing) is used in some psychology books for the depth sense more simply described as stereopsis.

Circular polarization.  A form of polarized light in which the tip of the electric vector of the light ray moves through a corkscrew in space.

Conjugate Points.  See Corresponding Points.

Convergence.  The inward rotation of the eyes, in the horizontal direction, producing fusion.  The more general term is vergence which includes inward and outward rotation. The term has also been used, confusingly, to describe the movement of left and right image fields or the rotation (toe-in) of camera heads.

Corresponding Points.  The image points of the left and right fields referring to the same point on the object.  The distance between the corresponding points on the projection screen is defined as parallax.  Also known as conjugate or homologous points.

Crosstalk.  Incomplete isolation of the left and right image channels so that one leaks (leakage) or bleeds into the other.  Looks like a double exposure.  Crosstalk is a physical entity and can be objectively measured, whereas ghosting is a subjective term.

Depth Range.  A term that applies to stereoscopic images created with cameras.  The limits are defined as the range of distances in camera space from the background point producing maximum acceptable positive parallax to the foreground point producing maximum acceptable negative parallax.

Disparity.  The distance between conjugate points on overlaid retinae, sometimes called retinal disparity.  The corresponding term for the display screen is parallax.

Extrastereoscopic Cues.  Those depth cues appreciated by a person using only one eye.  Also called monocular cues.  They include interposition, geometric perspective, motion parallax, aerial perspective, relative size, shading, and textural gradient.

Field-Sequential.  In the context of cinema-stereoscopy, the rapid alternation of left and right perspective views projected on the screen.

Floating windows.  Invented by Raymond and Nigel Spottiswoode, this is the use of printed vertical bands to create a surround to supplant the physical screen surround.  The result is a so-called virtual window that is floating in space to eliminate the screen edge cue conflicts and to extend the parallax budget of the projected image.

Fusion.  The combination, by the mind, of the left and right images -- seen by the left and right eyes -- into a single image.

Ghosting.  The perception of crosstalk is called ghosting.

HIT.   Horizontal image translation.  The horizontal shifting of the two image fields to change the value of the parallax of corresponding points.  The term convergence has been confusingly used to denote this concept.

Homologous Points.  See Corresponding Points.

Interaxial distance.  Also interaxial separation.  The distance between camera lenses' axes. See t.

Interocular distance.  See t.

Interpupillary Distance.  Also interpupillary or interocular separation.  The distance between the eyes' axes.  See t.

Linear polarization.  A form of polarized light in which the tip of the electric vector of the light ray remains confined to a plane.


Monocular Cues.  See Extrastereoscopic Cues.

Multiplexing.  The technique for placing the two images required for a stereoscopic display within an existing bandwidth.

Parallax.  The distance between conjugate points.  It may be measured with a ruler or, given the distance of an observer from the screen, in terms of angular measure.  In the latter case the parallax angle directly provides information about disparity.

Parallax budget.  The range of parallax values, from maximum negative to maximum positive, that is within an acceptable range for comfortable viewing.

Planar.  Flat.  Two-dimensional.  A planar image is one contained in a two-dimensional space, but not necessarily one which appears flat.  It may have all the depth cues except stereopsis.

Plano-Stereoscopic.  A stereoscopic projected image that is made up of two planar images.

Ramsdell rig.  See beamsplitter.   

Retinal Disparity.  See Disparity.

Rig.  Dual camera heads in a properly engineered mounting used to shoot stereo movies.

Screen Space.  The region appearing to be within a screen or behind the surface of the screen.  Images with positive parallax will appear to be in screen space. The boundary between screen and theater space is the plane of the screen and has zero parallax.  See theater space.

Selection Device.  The hardware used to present the appropriate image to the appropriate eye and to block the unwanted image.  For 3D movies the selection device is usually eyewear used in conjunction with a device at the projector, like a polarizing device.

Stereo.  Short for stereoscopic.  If you are trying to learn about multi-channel sound you are in the wrong place.

Stereoplexing.  Stereoscopic multiplexing.  A means to incorporate information for the left and right perspective views into a single information channel without expansion of the bandwidth.

Stereopsis.  The binocular depth sense, literally, "solid seeing."

Stereoscope.  A device for viewing plano-stereoscopic images.  It is usually an optical device with twin viewing systems.

Stereoscopy.  The art and science of creating images with the depth sense stereopsis.


Surround.  The vertical and horizontal edges immediately adjacent to the screen.

t.  In stereoscopy, t is used to denote the distance between the eyes, called the interpupillary or interocular distance.  tc is used to denote the distance between stereoscopic camera heads' lens axes and is called the interaxial.

Theater Space.  The region appearing to be in front of the screen or out into the audience.  Can also be called audience space.  Images with negative parallax will appear to be in theater space.  The boundary between screen and theater space is the plane of the screen and has zero parallax.  See screen space.

Window.  The stereo window corresponds to the screen surround unless floating windows are used.

ZPS.  Zero parallax setting, or the means used to control screen parallax to place an object in the plane of the screen.  ZPS may be controlled by HIT, or toe-in.  We can refer to the plane of zero parallax, or the point of zero parallax (PZP), so achieved.  Prior terminology says that left and right images are converged when in the plane of the screen.  That term should be avoided because it may be confused with the convergence of the eyes, and because the word implies rotation of camera heads.  Such rotation produces geometric distortion and may be expedient in camera rigs but is unforgivable in a CG virtual camera rig.  (A short numerical sketch of HIT and ZPS follows this glossary.)
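
A small numerical companion to the HIT and ZPS entries above (illustrative Python with made-up parallax values): translating the image fields adds the same offset to every corresponding point's parallax, so choosing the offset that zeroes a chosen object's parallax places that object in the plane of the screen.

```python
def apply_hit(parallaxes_mm, target_object):
    """Horizontal image translation: shifting the two image fields relative to
    each other adds the same constant to every corresponding point's parallax.
    Choosing the shift that zeroes the target object places it at the ZPS."""
    shift = -parallaxes_mm[target_object]
    return {name: p + shift for name, p in parallaxes_mm.items()}, shift

# Made-up screen parallaxes (mm) for three objects as photographed, before HIT.
shot = {"foreground": -40.0, "actor": 25.0, "mountains": 80.0}
after, shift = apply_hit(shot, "actor")
print("HIT of", shift, "mm ->", after)
# The actor now sits in the plane of the screen (zero parallax), the
# foreground plays in theater space and the mountains in screen space.
```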

Published Friday, February 08, 2008 11:15 AM by Moderator

 

International Datacasting's 3D Digital Cinema Product With Sensio Technology a Hit at European Satexpo Event

http://www.digitalcinemainfo.com/international-datacasting-corporation_04_17_08.php

 

April 17, 2008

 

Source: International Datacasting Corporation

 

International Datacasting Corporation, a global leader in providing advanced solutions for multimedia content distribution via satellite, announced the European debut of its SuperFlex Pro Cinema 3D Live product line with embedded Sensio® 3-D technology at the recent SatExpo tradeshow in Rome, Italy.

 

IDC provides complete end-to-end secure satellite network solutions for the delivery of movie files and live events for digital cinema and e-cinema applications. The newest addition to IDC's SuperFlex digital cinema product line was the highlight of the event. The Pro Cinema 3D Live Encoder and Decoder features integrated leading-edge Sensio® technology. Pro Cinema units were installed in a theatre at the SatExpo event and flawlessly delivered 3D content to an audience comprised of Italian theatre owners and distributors in addition to regular cinema patrons.

 

SAT EXPO is an advanced space and telecommunications event that attracts over 10,000 industry representatives to see the latest in advanced satellite technologies from around the world. This year, digital cinema and the delivery of digital movies were a major focus of the event. The Pro Cinema products were presented in partnership with Open Sky, an IDC customer that is implementing digital cinema solutions throughout Italy.

 

The SuperFlex Pro Cinema 3D Live Encoder and Decoder supports both 2D and 3D live and pre-recorded events, which allows cinemas and other venues increased revenue opportunities via alternative content programming.

 

"We've been providing leading-edge digital cinema technology solutions for many years. I'm thrilled to see this business starting to take off in Europe and other international markets. These new opportunities come at an important time in our company's growth. Our European subsidiary PROFline will be front and center in supporting this technology rollout as it continues to develop." said Ron Clifton, President and CEO of IDC.

 

The SuperFlex Pro Cinema 3D Live line of products will be a highlight of IDC's booth, #C8437, at this year's NAB Show™, held in Las Vegas, April 14 to 17. Produced annually by the National Association of Broadcasters, the NAB Show™ is the broadcast industry's largest annual gathering and an important showcase for new technologies.

Fithian: Next year pivotal for 3-D

http://www.hollywoodreporter.com/hr/content_display/business/news/e3i841febf193b81360d31e52f8102ca456

 

By Carolyn Giardina

 

April 13, 2008

 

More NAB coverage

 

LAS VEGAS -- National Association of Theatre Owners president John Fithian sent 3-D stakeholders a stern warning that they face a "potential train wreck" in 2009.

 

Next year is widely considered a pivotal one for the format, with at least 10 3-D features slated for release, including James Cameron's anticipated "Avatar"; DreamWorks Animation's first 3-D feature, "Monsters vs. Aliens"; and several titles from Disney.

 

"We are at an extremely critical juncture in the transition to digital cinema, but the (deployment) deals have to be done," Fithian said Sunday at the National Association of Broadcasters Show's Digital Cinema Summit. "We are at an impasse over the financials."

 

The deployment deals generally rely on a virtual print fee model through which studios contribute an agreed fee per screen, per movie to offset exhibitors' installation costs.
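
For readers new to the model, here is a toy illustration of the arithmetic in Python. Every figure below is an invented placeholder, not a number from NATO, the studios, or this article; the point is only that lowering the per-booking fee stretches out the payback period, which is the objection Fithian is voicing.

```python
def bookings_to_recover(install_cost, studio_share, vpf_per_booking):
    """Number of studio bookings needed for virtual print fees to cover the
    agreed share of one screen's installation cost (all inputs hypothetical)."""
    return studio_share * install_cost / vpf_per_booking

install_cost = 75_000.0      # hypothetical cost to convert one screen
studio_share = 0.8           # hypothetical share to be covered by VPFs
for vpf in (1000.0, 800.0, 600.0):
    n = bookings_to_recover(install_cost, studio_share, vpf)
    print(f"VPF ${vpf:,.0f} per booking -> about {n:.0f} bookings to recoup")
# Lowering the per-booking fee stretches out the payback period.
```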

 

 

"Unless digital cinema deals are made in the next one to two months, we will not have time to (deploy the screens) for 2009," Fithian said.

 

"Despite that, some of my friends at the studios are insisting that they should pay lower VPFs (in current negotiations) than they did in the first round of deals," he said, asserting that the model worked in the first round. "3-D cannot be an excuse for lowering VPFs."

 

He warned that if the studios want the planned 3-D rollout, and in time for 2009, "these deals need to be struck right now."

 

Today, nearly 5,000 digital cinema screens -- with slightly more than 1,000 3-D capable -- are installed in North America. Stakeholders have been citing at least 4,000 3-D-ready screens as a target for spring 2009, in order to accommodate the scheduled releases.

Insight forms 3-D consortium

http://www.hollywoodreporter.com/hr/content_display/business/news/e3i721645931ea9e8467a207b275aca3b0a

 

By Carolyn Giardina

 

April 15, 2008

 

More NAB coverage

 

LAS VEGAS -- Driven by the momentum of 3-D in digital cinema, the U.S. Display Consortium and consulting firm Insight Media launched the 3D@Home Consortium with 22 founding members Tuesday at the NAB Show in Las Vegas.

 

Led by Philips, Samsung and Walt Disney Studios Home Entertainment, the consortium was formed as a nonprofit alliance aimed at speeding adoption of 3-D in the home.

 

"In 2008, millions of TVs capable of showing stereoscopic 3-D content will be purchased by consumers," Insight Media president Chris Chinnock said. "The value of DLP, PDP and LCD TVs sold in 2008 that are capable of showing HD-quality, stereoscopic 3-D content is expected to exceed $2 billion, making this market large enough to attract the interest and attention of many players."

 

Still, such key issues as delivery standards need to be addressed for this industry to move forward.

 

The consortium's initial short-term goals include creating and publishing technical roadmaps, developing educational materials for consumer and retail channels and facilitating the development and dissemination of industry standards.

 

The USDC is an industry-led, public/private partnership that provides services to the flat-panel display and flexible microelectronics industries. Founders include Thomson, Imax, TDVision, 3DIcon, Corning, Planar Systems, QPC Laser, SeeReal, 3ality, DDD, In-Three, Quantum Data, Sensio, Fraunhofer Institute IPMS, Sim2, Setred, Universal, Holografika and Volfoni.

Tuesday, April 22, 2008

"Fly Me To The Moon" Exclusive Premiere: Jules Verne Festival - Paris, April 26th

http://marketsaw.blogspot.com/2008/04/fly-me-to-moon-exclusive-premiere-jules.html

 

Tuesday, April 22, 2008

 

Going to Paris? Well then check out the Jules Verne Adventure Film Festival running from Friday, April 25th to 27th. I have been talking with Johannes Palmroos who has been very passionate about getting the word out about this fine show AND the excellent content that very much includes 3D.

 

Here's what James Cameron and George Lucas had to say about the show:

 

"The Jules Verne Adventure Festival is quite unique. It celebrates the spirit of authentic adventure and exploration, which we all need today and in the future."

 

James Cameron

"This Festival is much more than a just a film festival. That’s why it is much more interesting. As Jules Verne did, it helps us to think outside the box."

 

George Lucas

An awesome addition to this year's show is the exclusive premiere of the FULL 3D feature "Fly Me To The Moon"! In fact there are several interesting 3D screenings, namely:

 

Friday 25 April @ 20:30 - The trailer for "Voyage to the Center of the Earth 3D", and hopefully also some segments, will be shown in 3D before the documentary "Tara".

 

Saturday 26 April @ 20:30 - The Jules Verne award ceremony is followed by the 40-minute "Dolphins And Whales 3D" as well as the full feature "Fly Me To The Moon" in an exclusive premiere! (Tony Curtis will be present at this screening.)

 

Sunday 27 April, 10:00 (morning) - "Ocean Wonderland 3D" and "Sharks 3D".

 

Tickets and additional info about other events can be found at www.julesvernefestival.com.

 

All screenings are at The Grand Rex. 1, Boulevard Poissonnière. 75002. Paris.

 

Ahhhhh... Nothing like Paris in the spring - à bientôt!

 


NBA to be shown live in 3D with stacked 4K

http://celluloidjunkie.com/?p=265

 

Maverick billionaire Mark Cuban’s Dallas Mavericks will have their upcoming March 25 game captured, beamed and shown in live digital 3D through a partnership with Pace Fusion 3D. From the press release:

 

The March 25 game against the Los Angeles Clippers from the American Airlines Center will be beamed across town via satellite into Dallas Mavericks owner Mark Cuban's Magnolia Theatre in Dallas' West Village, where an invitation-only audience will watch unforgettable images through special 3D glasses using Sony's SXRD 3D Projection System on an 18×42-foot screen, making it feel as if you're sitting courtside. In addition to VIP guests, the audience will include over 100 lucky Mavericks fans, who can win tickets to the event by entering an online sweepstakes at mavs.com.

 

FSN Southwest will utilize the proprietary PACE/Cameron Fusion Sports System to capture the action on the court and deliver a unique depth-of-field perspective to the Magnolia Theatre audience. Each of the four 3D systems that will be used is designed with two high-definition cameras that capture the left-eye and right-eye imagery separately and create one three-dimensional effect. The result is a 'wow' visual experience that makes the action seem so close and spectacular that most viewers will probably forget they're sitting miles away in a movie theatre.
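
To make the two-camera idea concrete, here is a minimal sketch of one common way a left-eye and a right-eye HD frame can be packed into a single stereoscopic frame. This is only an illustration under assumed inputs (the NumPy arrays and 1080p dimensions are hypothetical), not Pace's or Sony's actual pipeline:

```python
import numpy as np

# Hypothetical left-eye and right-eye 1080p RGB frames, standing in for the
# output of the two rig-mounted HD cameras described above. A real system
# would grab these from capture hardware, not allocate blank arrays.
left_eye = np.zeros((1080, 1920, 3), dtype=np.uint8)
right_eye = np.zeros((1080, 1920, 3), dtype=np.uint8)

def pack_side_by_side(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Squeeze each eye's view to half width and place the halves side by
    side. Side-by-side packing carries a stereo pair inside one ordinary
    video frame; the 3D display splits it back into two views."""
    half_w = left.shape[1] // 2
    # Naive decimation (every second column); a production pipeline would
    # use proper filtered scaling instead.
    left_half = left[:, ::2][:, :half_w]
    right_half = right[:, ::2][:, :half_w]
    return np.concatenate([left_half, right_half], axis=1)

packed = pack_side_by_side(left_eye, right_eye)
print(packed.shape)  # (1080, 1920, 3): one frame carrying both eye views
```

The point of the sketch is simply that a stereo broadcast is two synchronized camera feeds kept separate per eye; how they are multiplexed (side by side, frame-sequential, or two full streams to stacked projectors, as below) is a delivery detail.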

 

To display the game stereoscopically, two SXRD projectors will need to be stacked. While some may scoff that this highlights the SXRD's inability to handle 3D from a single projector, it should be noted that a growing number of theatres are opting for stacked DLP 2K solutions instead of signing a long-term licensing deal with RealD. Call me sexist or call me just-not-interested-in-basketball, but I'm wondering how well Pace will capture the cheerleader action in 3D.

 

CHRISTIE 3D2P - DUAL-PROJECTOR 3D DIGITAL CINEMA SOLUTION

http://www.christiedigital.com/AMEN/Products/christie3d2p.htm

 

Christie 3D solutions make it happen

 

Consumer demand for the 3D experience continues to grow. With the higher audience attendance of 3D digital presentations across the globe, and higher ticket prices, it is no wonder that 3D cinema is one of the hottest topics in the cinema industry. Since this immersive environment is unique to theatres, the fascination of 3D digital technology is bound to reinvent the cinema experience for a wider audience of all ages.

 

As content continues to explode on the screen—with at least eight or nine releases expected in 2008 and over 15 releases scheduled for 2009—exhibitors are looking for the best 3D solution that provides the highest quality image on the screen.

 

More light, more choices

With light standards and requirements increasing across the board, the Christie 3D2P solution excels in projecting stunning high resolution 3D images to create bright, true-to-life color visuals on the screen. Christie, a pioneer in 3D technology, has focused on delivering the exceptional high brightness necessary for 3D content. Along with a complete range of technical support to meet customers’ expectations for continuous, reliable product performance, installation support and project management, Christie also provides all the 3D kit essentials for spectacular 3D presentations.

 

Complete ownership of the 3D solution

Christie delivers a 3D solution that makes total sense in terms of ownership, cost, and flexibility. Of significant note, there are no expensive licensing fees with the Christie 3D2P projector system. Because the projectors are not locked to a single auditorium, seats do not sit unused: the second digital projector can be easily moved whenever and wherever it is needed within the complex, resulting in a positive return on investment. The configuration also allows the exhibitor to easily move the feature from screen to screen during its engagement.

 

With the flexible Christie 3D2P solution delivering higher box office revenues, larger audiences, more satisfied customers and a better bottom line, and with 3D content growing in the pipeline, digital 3D will enable the cinema industry to maintain its competitive edge over ‘stay-at-home’ movie-goers.

D-Cinema Frustrations Aired At NAB

http://celluloidjunkie.com/

 

April 15th, 2008

 

Despite the slew of news stories coming out of the Digital Cinema Summit at the NAB Show this past weekend, there was little new to report regarding technological advances and nascent business deals. In fact, the only real news to come out of the summit was frustration over the fact that there was so little new to talk about.

 

Because the summit is held a month after ShoWest, the expectation going in was that nothing major would be announced by the major players in the space, such as studios, integrators and vendors, though some industry players used the forum to voice their opinions, controversial or not, about the lack of speed with which d-cinema is being adopted. Attendees were dealt a one-two punch during the Sunday sessions by speakers Michael Karagosian of MKPE and John Fithian, president and CEO of the National Association of Theatre Owners.

 

Moderating a morning panel discussion titled “The Exhibition Perspective: Truth and Consequences in the D-Cinema Rollout”, Karagosian highlighted the issues acting as roadblocks to d-cinema adoption, including the dwindling virtual print fees (VPFs) studios are willing to pay. Some studios, such as Warner Bros., have been slow to sign VPF agreements, or haven’t signed them at all. According to The Hollywood Reporter, Karagosian suggested:

 

. . . that if one major studio held back annually on just two blockbuster titles, the exhibitor contributions might grow from 20% to 32% of the total.

 

This could be a real problem since, as Karagosian pointed out, the cost for an exhibitor to convert to digital could be 200% to 300% higher than sticking with current film-based systems over the next 25 years.
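
The coverage does not spell out the assumptions behind Karagosian's 20%-to-32% figure, so purely as an illustration of the mechanism (fewer VPF-paying bookings means a smaller studio contribution toward a fixed equipment cost), here is a toy calculation with made-up numbers:

```python
# Toy illustration of the VPF cost-sharing mechanism described above.
# Every dollar figure and title count here is hypothetical; the reporting
# did not publish the inputs behind the 20% -> 32% estimate.
annual_equipment_cost = 10_000   # assumed annualized d-cinema cost per screen
vpf_per_title = 800              # assumed virtual print fee per digital booking
full_slate = 10                  # assumed digital releases booked per year

def exhibitor_share(titles_paying_vpf: int) -> float:
    """Exhibitor's share of the annual cost after studio VPF contributions."""
    studio_contribution = min(titles_paying_vpf * vpf_per_title,
                              annual_equipment_cost)
    return (annual_equipment_cost - studio_contribution) / annual_equipment_cost

print(f"All titles pay VPFs:        exhibitor covers {exhibitor_share(full_slate):.0%}")      # 20%
print(f"Two blockbusters held back: exhibitor covers {exhibitor_share(full_slate - 2):.0%}")  # 36%
```

With these toy numbers the exhibitor's share jumps to 36% rather than Karagosian's 32%, but the direction of the effect is the same: each blockbuster that skips the VPF program shifts another slice of the conversion cost onto the exhibitor.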

 

After lunch, John Fithian gave a Dutch-uncle keynote address that had everyone buzzing, since he pulled no punches in pointing out the pending “train wreck” the industry could be headed for if the number of d-cinema installations doesn’t increase before 2009, when studios plan on releasing 10 films in digital 3-D, including James Cameron’s “Avatar”. Variety reports that Fithian said of the upcoming releases:

 

“We don’t have the screens for them. We have less than 1,000 3-D screens in the U.S. and fewer than that in the rest of the world.”

 

Fithian then turned his attention to the stalemate between studios and exhibitors over VPFs, which are meant to subsidize the rollout of d-cinema equipment.

 

“Unless the deals are done in the next month or two we won’t have time to do the installations in time. We literally need the deals now to make the slate work. If the studios want this to happen in time for 2009, the deals have to be struck, and they have to be struck right now.”

 

Whether or not VPF agreements start closing, some exhibitors may still take a wait-and-see approach. Mark Collins of Marcus Theatres, who participated in Karagosian’s panel discussion, said of his circuit’s conversion to digital:

 

“We have not seen any cost benefits. We need to start promoting digital cinema as something that is different from what (moviegoers) have.”

 

Though many in the industry have made similar statements, Collins may be onto something, as this type of marketing approach worked in the past during the rollout of digital audio in movie theatres.

 

The summit also played host to the usual suspects from the studios, such as Howard Lukk, Disney’s vice president of production technology, and Wendy Aylsworth, senior vice president of technology at Warner Bros. Both spoke mostly about technical issues, such as the need for more 3-D production and post-production equipment. Aylsworth focused on the problem of subtitles in 3-D content, a topic she previously presented with great depth of knowledge at the SMPTE conference in October 2007.

 
