Shooting ‘Avengers: Endgame’ for IMAX required a custom 6K camera

A 6K camera captured more of the superheroes than the screen requires.



If you squint, it almost looks like an alien laser cannon. Arri

By now, you may have seen the video released by IMAX showing a standard theatrical presentation of the Avengers: Endgame teaser side-by-side with how the same images appear in IMAX. But like a lot of things related to IMAX, it’s really difficult to communicate the experience in other media (like a YouTube clip). And while seeing it on a standard screen will still be thrilling, it’s not exactly the same experience.

IMAX Chief Technology Officer Brian Bonnick says the exact process of engineering a movie for the really big screen is proprietary, but it requires filmmakers to start thinking about the presentation before they even begin shooting. For the Russo brothers directing team, the IMAX experience started out tentatively: they used a jointly customized digital version of ARRI’s then-new large-format camera, the Alexa 65, to shoot the airport superhero showdown at the center of Captain America: Civil War in 2016. The cameras clearly made an impression, as the duo quickly decided to use them to shoot all of Avengers: Infinity War and Avengers: Endgame.

The Alexa 65 can actually capture more data than most systems can even process, but that extra information still comes in handy. “In Christopher Nolan’s last film [Dunkirk], he wanted infinite detail in focus on a shot, so he chose a film-based camera with a resolution of 18K,” explains Bonnick. “Even though a projector’s 4K, we utilize that extra data to improve the quality of the presentation. It’s a technique called ‘oversampling.’ The Alexa is a 6K by roughly 3K pixel, so you’re dealing with more than 4K data, but we use that data in a post-production process that is designed to use every single pixel in the enhancement process.” Sampling from too much information is much easier than trying to extrapolate from a lower-resolution capture or stretch an existing shot.
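IMAX’s actual downsampling pipeline is proprietary, but the basic idea of oversampling is easy to sketch: when every output pixel is built from the average of several captured pixels, random sensor noise averages down and the result looks cleaner than a native-resolution capture would. A minimal NumPy illustration, using a made-up noisy gray frame at twice the target resolution:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a high-resolution capture: a flat gray
# frame plus per-pixel sensor noise, at 2x the target resolution on
# each axis so the downsample ratio is a simple integer.
hi_res = 0.5 + rng.normal(0.0, 0.1, size=(540 * 2, 960 * 2))

# "Oversampling": average each 2x2 block of captured pixels into one
# output pixel, so every single pixel contributes to the result.
h, w = hi_res.shape
lo_res = hi_res.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Averaging 4 independent noisy samples halves the noise level.
print(round(float(hi_res.std()), 3))  # ~0.1
print(round(float(lo_res.std()), 3))  # ~0.05
```

Averaging four independent samples cuts the noise standard deviation in half, which is why starting from more pixels than the projector needs improves the final image rather than wasting data.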

The extra-data approach extends to sound capture as well. As Bonnick explains, typically when sound is captured on set – say, for a massive explosion – the low end of the audio is recorded, but it’s pulled out in post-production because most systems aren’t capable of playing back low-end frequencies without distortion. “We manage the whole sound mix differently. It’s not a 5.1 channel system. We employ what’s called PPS, or Proportional Point Source sound.” This system creates what he refers to as “phantom images” – you can direct exactly where you want a sound to come from by triangulating three speakers (all of which have full range) to give the illusion of a precise sound origin. “You can hear a cannon going off and a pin drop,” says Bonnick. “This is why the ‘food chain’ idea is so important.”
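IMAX’s PPS processing is proprietary, but the general phantom-image trick – weighting several full-range speakers so the combined sound appears to come from a point between them – can be sketched with a simple distance-based, constant-power panner. The speaker layout and gain law here are assumptions for illustration, not IMAX’s method:

```python
import math

# Assumed x, y positions of three full-range speakers.
SPEAKERS = [(-1.0, 0.0), (1.0, 0.0), (0.0, 1.5)]

def phantom_gains(target, speakers=SPEAKERS):
    """Per-speaker gains: louder the closer a speaker is to the
    target point, normalized so total acoustic power is constant."""
    weights = [1.0 / (math.dist(target, s) + 1e-9) for s in speakers]
    norm = math.sqrt(sum(w * w for w in weights))
    return [w / norm for w in weights]

# A sound placed right next to the left speaker is dominated by it;
# the other two contribute only faintly.
gains = phantom_gains((-0.9, 0.0))
print([round(g, 2) for g in gains])
```

Because the gains are power-normalized, the perceived loudness stays steady as the phantom source moves around the listening space; only the apparent direction changes.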