IBC Big Screen 2017: Cinema in Focus

The IBC Big Screen Experience has become an unmissable programme for motion-picture professionals. Jim Slater reports on a critical examination of the future of cinema.

 

 

Since it was the 50th anniversary of IBC’s founding, this year’s event was particularly special: to celebrate the occasion the organisers even produced a beautiful book, compiled by Dick Hobbs, that traces the growth of the event from its origins in 1967’s ‘summer of love’, when a few hundred people attended The Royal Lancaster Hotel in London, to today, when over 55,000 attended the Amsterdam event.

Event Cinema on Steroids
The Friday programme included a session titled ‘Event Cinema on Steroids’, which looked at shooting and delivering HDR and immersive audio alternative content for the Big Screen. Julian Pinn chaired, with Andy Beale and Jamie Hindhaugh from BT Sport joined by John Bullen of Sony, Johnny Carr, Vue’s alternative content manager, and Stuart Bowling from Dolby, to talk about the technical and logistical planning that went into carrying a major global event, the UEFA Champions League football final held in Cardiff on 3 June. BT Sport achieved the mammoth task of shooting and broadcasting it in both HDR and immersive audio, live to homes and, for the first time globally, to a cinema screening room equipped with HDR projection and immersive audio sound. The panel examined the complexities of the multi-camera shoot and explained the workflow challenges. They evaluated the wider technological and commercial viability of advancing the experiential envelope of event cinema, and considered how the boundaries of theatrical and non-theatrical disciplines are merging. Highlights of the match were shown in Dolby Vision with Atmos sound.

Contrast and HDR

Visualising the Science of High Dynamic Range and Wide Color Gamut was an educational session, with Julian Pinn introducing Pixar’s senior scientist Dominic Glynn and Robert Carroll from Dolby. They concentrated on how HDR images might affect the audience experience. Robert looked at the basics of projector performance, asking ‘how black is black?’ and pointing out that current projectors don’t come anywhere near matching the limits of the human visual system.

Factors affecting the cinema image include the peak luminance of the projector and the dark luminance, but at least as important is the ‘veiling glare’: light scattered around the optics of the projector, the porthole glass, and reflections from the auditorium back onto the screen. You have to consider the complete system, and that will be different in every auditorium. The average picture level will also affect the contrast perceived in the auditorium, so there is a great deal to take into account. Robert compared a Dolby Vision projector (20 stops of on-screen contrast visible) with a standard DCI one and showed that the DCI projector’s contrast ratio is limited by its own veiling glare rather than by the room. Robert showed many complex curves in his technical presentation, but the hard work was leavened and clarified by on-screen comparisons of carefully composed images from Mount Rushmore by candlelight and a clip from The Revenant, with the HDR images notable for showing far more detail in the blacks.
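For readers who like to see the arithmetic, here is a back-of-the-envelope sketch of how stray light eats into contrast. All the luminance figures in it are illustrative assumptions, not numbers quoted in the session.

```python
# A back-of-the-envelope look at how veiling glare limits on-screen contrast.
# All luminance values (cd/m^2) are illustrative assumptions, not figures
# quoted in the session.
import math

def effective_contrast(peak_white, black, veiling_glare):
    """Contrast ratio once stray light is added to both ends of the range."""
    return (peak_white + veiling_glare) / (black + veiling_glare)

peak_white = 48.0       # assumed projector peak luminance
native_black = 0.02     # assumed projector black level
glare_projector = 0.05  # assumed scatter inside the projector and port glass
glare_room = 0.03       # assumed light bounced back from the auditorium

for label, glare in [("projector alone", 0.0),
                     ("plus internal glare", glare_projector),
                     ("plus auditorium glare", glare_projector + glare_room)]:
    ratio = effective_contrast(peak_white, native_black, glare)
    print(f"{label:22s} {ratio:7.0f}:1  ({math.log2(ratio):.1f} stops)")
```

Even a small amount of stray light drags a native contrast of thousands-to-one down to a few hundred-to-one, which is exactly why the whole auditorium, and not just the projector, has to be considered.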

Dominic Glynn gave an overview of Pixar’s recent research into wide colour gamuts, and his presentation was given from his iPhone on the Big Screen. He mentioned that the projection team had been wary of this idea, but everything worked well. He began with a clip from Pixar’s Inside Out, which used a highly saturated colour palette to achieve ‘emotional’ colours and an effect like using UV light on a black felt background, producing colours that hadn’t been seen on a cinema screen before. Explaining the effect that exposure levels have on detail in the dark areas, he showed a clip from Brave at the normal exposure level and then increased it by seven stops (a 128-fold increase in exposure); it was remarkable how much more detail was revealed in the blacks. The Pixar researchers had asked how wide the colour gamut needs to be: does it really need to be much greater than P3? He highlighted the colours that cannot currently be achieved on a DCI projector, and then concentrated specifically on the colour capabilities of RGB laser projectors, showing that wider gamuts can readily be achieved and that these could be tailored to differentiate one cinema’s offering from another’s. Pixar already uses WCG to create different image experiences and to increase the sense of immersion.
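As a rough illustration of the kind of gamut check involved, the sketch below tests whether a chromaticity falls inside the P3 primary triangle. The primary coordinates are the standard published P3 values; the sample colours are arbitrary assumptions chosen for the example.

```python
# Rough sketch: test whether an (x, y) chromaticity falls inside the
# DCI-P3 primary triangle. Primary coordinates are the published P3
# values; the sample points are arbitrary assumptions for illustration.

P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # R, G, B (x, y)

def sign(p, a, b):
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def inside_gamut(p, tri):
    """True if chromaticity p lies within the triangle of primaries tri."""
    d1, d2, d3 = sign(p, tri[0], tri[1]), sign(p, tri[1], tri[2]), sign(p, tri[2], tri[0])
    has_neg = min(d1, d2, d3) < 0
    has_pos = max(d1, d2, d3) > 0
    return not (has_neg and has_pos)

print(inside_gamut((0.30, 0.40), P3))   # a desaturated colour: True
print(inside_gamut((0.15, 0.80), P3))   # a very saturated green: False
```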

Dominic talked about some of the WCG production processes at Pixar, explaining that they initially use virtual cameras to capture as broad a gamut as possible, allowing them to adjust things as part of the creative process, only ‘baking in’ the final colour decisions at a later stage.

Questions from the fascinated audience ranged widely. Dominic said that RGB lasers will enable us to produce colours outside the P3 colour triangle, and that there is currently interest in producing a version of a movie with an extended colour gamut that could only be shown in specially equipped venues. Pixar deals with observer metamerism (where two people sitting together could experience different hues) by involving DoPs fully with each release, allowing them to see what laser images look like in different circumstances and letting them judge how best to achieve their desired ‘look’. Pixar understands the importance of involving cinematographers in the grading process, and it is important not to regard ‘graded for xenon’ as some sort of voodoo that prevents us moving forward. It is not only the light source that must be considered when grading images.

In answer to ‘do I need a completely black theatre?’, the earlier response was reiterated: if your projector is capable of a wider colour gamut and a wider dynamic range, the theatre ambience is less important.

Displays that are different…

At last year’s IBC, Jon Karafin received rave reviews for his enthusiastic explanations of what Lytro light-field cameras are capable of when applied to cinema, providing unparalleled creative freedom and flexibility on set and in post-production. Jon has moved on to become CEO and founder of Light Field Lab, a technology startup of scientists, engineers, content-creation specialists and light-field technology visionaries who have come together to design what they call ‘the world’s most innovative holographic ecosystem’. The company is developing next-generation light field display technologies that can take the information light field camera systems provide and recreate ‘full parallax holographic displays’: high-resolution holograms that need no glasses.

There wasn’t a display to see, of course (a prototype is expected in 2018, with a development kit in 2019), but we were told that their displays will enable photo-real objects to appear as if they float in space, and that the powerful integrated processing and imaging solutions being developed will allow these immersive visuals to be delivered over commercial networks at realistic speeds and bandwidths. Future releases of the technology will allow users to interact with holographic objects. Glasses-free viewing is achieved by emitting light from each photosite (pixel) such that a viewer only sees it when certain conditions, such as ray angle or wavelength, are met; this can be done with lasers, optics, or other emerging beam-steering methodologies. The light field display they are designing should provide an experience like watching a play in a theatre, where everyone sees the same narrative but views it from his or her own place in the audience. Such a display offers a group social experience, rather than one in which each viewer sees a single identical view, isolated from the others by a headset. The light field can be creatively changed so that everyone sees the same thing, or completely different things.
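The ray-angle idea can be illustrated with a small geometric sketch: each screen position stores several views emitted along different angles, and which view an eye receives depends only on where that eye sits. The view count, field of view and uniform angular binning below are assumptions made for illustration, not Light Field Lab’s actual design.

```python
# Minimal geometric sketch of the ray-angle idea behind a glasses-free
# light field display: each screen position emits several views along
# different angles, and the view an eye receives depends on where it sits.
# The numbers and the uniform angular binning are illustrative assumptions.

import math

NUM_VIEWS = 9          # assumed number of discrete emission angles
FOV_DEG = 40.0         # assumed total angular spread of those views

def view_index(screen_x, eye_x, eye_z):
    """Which stored view the eye at (eye_x, eye_z) sees from the
    photosite at horizontal position screen_x (screen lies at z = 0)."""
    angle = math.degrees(math.atan2(eye_x - screen_x, eye_z))
    # Map the angle into one of NUM_VIEWS uniform bins across the FOV.
    t = (angle + FOV_DEG / 2) / FOV_DEG
    return max(0, min(NUM_VIEWS - 1, int(t * NUM_VIEWS)))

# Two viewers sitting a metre apart see different views of the same
# photosite, which is what produces parallax without glasses.
print(view_index(screen_x=0.0, eye_x=-0.5, eye_z=2.0))  # left-hand seat
print(view_index(screen_x=0.0, eye_x=+0.5, eye_z=2.0))  # right-hand seat
```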

Light Field Lab will first focus on perfecting its holographic display, but in the future the company wants to integrate all the senses into a fully immersive format, and is working towards technologies that allow the viewer to interact with and feel holograms.

Questions to Jon came thick and fast, on topics including motion artefacts, frame rates (72 fps to start, but maybe 200 fps later), and whether we are moving away from cinematography to ‘data-ography’. Jon reiterated that there is much work to be done in many technical areas, and David Stump came back via Skype, excited at being able to put something before an audience that they have never seen before, summing things up nicely with the rather enigmatic statement that cinematographers will be able to imagine the future, rather than regretting the past!

 

VR/AR and mixed reality too…

Wanting to discover as much as possible about current thinking on AR and VR, I switched over to the main conference in the Forum for an afternoon Technology Forward Keynote session which looked at the status of, and road ahead for, VR across a wider market than the cinema industry. It was interesting to learn about the variety of consumer devices that are emerging, as well as hearing about successful VR experiences across different platforms that highlight the opportunity ahead for VR, from entertainment to sports. In the chair was John Cassy from London-based immersive content studio Factory 42, whose current projects include an advanced interactive and holographic VR production with Sir David Attenborough for Sky. He was in conversation with Rikard Steiber from Viveport at HTC Vive, an app store for virtual reality experiences.

We were introduced to Ikea Place, an augmented reality app that allows you to take any item from the Ikea catalogue and place it, appropriately scaled, into an image of your own room, so that you can try before you buy and get a good idea of how it would look in situ. This can be done on a tablet or, for a more immersive experience, with a full VR headset. We were told that Lenovo and Disney have made an augmented reality headset and associated lightsaber with which you can fight Darth Vader and participate in a number of VR games: just the thing for Christmas.

The speakers talked of the aim of ‘true presence’: software that convinces your brain you are actually in a virtual world. Soon to appear will be haptics applications, using touch sensation and control to interact with apps; if you are shot by a virtual arrow, you will feel the hit! Much research is going into brain/computer interfaces, and eye-tracking will soon enable you just to look at an object to ‘click’ on it, rather than using a mouse.

Viveport has introduced the first cloud VR system, which streams the VR experience. Like a Netflix for VR, it lets subscribers pay $6.99 a month to access up to 2,000 VR titles, and there are also 700 VR experiences available for arcade operators. As well as games, VR apps are used in education and to provide social experiences (useful for training), and Facebook friends will be able to hang out together in VR. You can discover the world in VR using Google Earth VR; look at g.co/earthvr when you are wearing a VR headset.

VR is still at the learning stage, and there seems to be great co-operation between competing suppliers, so it was good to see that Mindshow on the Steam platform allows VR headset owners to create, share, and experience shows in VR. You can make animated movies in VR with your own body and voice, become a 3D cartoon character and act out a show.

As with all media, content is key, and new storytellers will be needed to create ‘never before seen’ experiences. Hollywood is beginning to embrace VR, and Rikard said that they are looking forward to Ready Player One, the forthcoming Spielberg movie set in 2045 in which everyone lives in a VR world, since this is likely to make VR understandable to everyone and to draw attention to its possibilities.

Rikard said that, according to Superdata Research, who specialise in stats for gaming and interactive media, the VR/AR market is currently worth $1.8 billion and is predicted to grow to $37.7 billion by 2020. Existing commercial brands are likely to adopt VR to augment their branding and add value. Prices for equipment are likely to fall considerably, and the industry is aware that headsets are still a nuisance; putting one on was likened to donning a diving mask. Work is ongoing to design more comfortable and user-friendly VR kit. One idea is a ‘see-through’ button that would allow a headset user to take hold of a glass of wine when offered it by a real waiter outside the virtual world. Already there are numerous lightweight designs for AR glasses.

VR is a transformative platform, a growing opportunity for programme makers and a new way to interact with content for audiences, both in and out of the cinema business. 

Jim Slater

 

Future Technologies

Julian Pinn introduced a session which must qualify for an award for the longest title: “Future Camera and Display Technologies and Applications Leading to AR/VR, Immersive Media, and Holography”. The session lived up to its billing, bringing the audience up to date with an encyclopaedic range of topics.

David Stump ASC appeared on the Big Screen via Skype and provided a masterclass on aspects of modern and future cinema. He works as a director of photography, visual effects director and visual effects supervisor on numerous top movies, and is known for his interest and expertise in new technologies. He took us through technical developments from when Panavision film cameras were state of the art to today’s 8K digital cameras, and tempted us with tales of research work going on at the MIT Media Lab on femto-photography, visualising photons in motion at a trillion frames per second.

David talked about high frame rate photography, referencing Ang Lee’s work, and explained how RealD TrueMotion works as a flexible tool to allow the integration of different frame rates. He said that high-brightness images come at a price, and that we will need to revisit frame rates and shutter angles if such images become the norm. He also explained new technologies in image sensor design, introducing us to ‘black silicon’, which is 500 times more sensitive to light than existing sensors, opening up new areas of imaging and paving the way for the development of 1-bit binary sensors. Curved image sensors are also being developed, which will allow simpler lenses and less complex optics.
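The appeal of a 1-bit sensor is easiest to see with a toy simulation: each tiny photosite reports only whether it caught any photons at all, and a grey level is recovered by pooling many such binary reads. The photon levels and pool size below are illustrative assumptions, not figures from the session.

```python
# Toy simulation of the idea behind a 1-bit (binary) image sensor:
# each tiny photosite only reports whether it detected any photons,
# and a grey level is recovered by pooling many binary samples.
# The photon counts and pool size are illustrative assumptions.

import math
import random

def binary_sensor_estimate(mean_photons, n_samples=4096):
    """Estimate exposure from n_samples one-bit (photon / no photon) reads."""
    hits = sum(1 for _ in range(n_samples)
               if random.random() > math.exp(-mean_photons))  # P(>= 1 photon)
    p_hit = hits / n_samples
    # Invert the Poisson 'at least one photon' probability to get exposure.
    return -math.log(max(1e-9, 1.0 - p_hit))

for true_level in (0.1, 0.5, 2.0):
    print(f"true {true_level:.1f}  estimated {binary_sensor_estimate(true_level):.2f}")
```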

David explained how a move to Rec. 2020 colour will need different workflow patterns, and discussed the work being done on ACES to cope with this and with higher bit depths, recommending that people find out more about the range of AMPAS test materials being developed for the new technology areas. He finished with a whirlwind look at computational camera technology. The possibilities are fascinating: combining depth information with RGB, gathering 2D and 3D data through the same lens, and getting 3D images from a single camera. David left us all a great deal wiser!
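As a small flavour of what a Rec. 2020 workflow involves, the sketch below derives the 3x3 matrix that takes linear P3-D65 RGB to linear Rec. 2020 RGB from the published primary chromaticities. This is generic colorimetry offered for illustration, not the ACES pipeline itself.

```python
# Sketch of one piece of a Rec. 2020 workflow: deriving the 3x3 matrix
# that converts P3-D65 RGB to Rec. 2020 RGB from the published primary
# chromaticities. Assumes linear (not gamma-encoded) RGB values.

import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    """Build an RGB->XYZ matrix from (x, y) primaries and a white point."""
    xyz = []
    for x, y in primaries:
        xyz.append([x / y, 1.0, (1.0 - x - y) / y])
    P = np.array(xyz).T                       # columns are R, G, B in XYZ
    wx, wy = white
    W = np.array([wx / wy, 1.0, (1.0 - wx - wy) / wy])
    S = np.linalg.solve(P, W)                 # scale so RGB = (1,1,1) maps to white
    return P * S

D65 = (0.3127, 0.3290)
P3_D65  = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

M = np.linalg.inv(rgb_to_xyz_matrix(REC2020, D65)) @ rgb_to_xyz_matrix(P3_D65, D65)
print(np.round(M, 4))   # P3-D65 linear RGB -> Rec. 2020 linear RGB
```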