Dr_X
Really interesting making-of piece on Postcard from Earth. I'm copying the juiciest parts (which is almost all of it).
This movie also demonstrates the potential of this new canvas for filmmakers. “I’m still processing it all,” the Oscar-nominated director of Black Swan tells The Hollywood Reporter of Sphere, whose interior is coated with a 160,000 square-foot 16K LED display that extends beyond audience members’ peripheral vision and high above and behind their heads. The visuals – shown at a high resolution that creates a sense of depth and of being there – are accompanied by a powerful new beamforming sound system and 4D features such as wind and haptic seats.
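As a rough sense of scale — and assuming "16K" means roughly 16,000 × 16,000 addressable pixels, which the article doesn't spell out — a back-of-the-envelope check of what that resolution implies over 160,000 square feet of LED:

```python
import math

# Back-of-the-envelope scale check for the interior display.
# Assumptions (not stated in the article): "16K" ~= 16,000 x 16,000 addressable
# pixels, spread over the quoted 160,000 square feet of LED surface.
width_px = height_px = 16_000
total_px = width_px * height_px               # ~256 million pixels
area_sqft = 160_000
px_per_sqft = total_px / area_sqft            # ~1,600 px per square foot

# Average pixel pitch if the density were uniform (1 ft = 304.8 mm).
pitch_mm = 304.8 / math.sqrt(px_per_sqft)

print(f"{total_px / 1e6:.0f} MP total, {px_per_sqft:.0f} px/sq ft, ~{pitch_mm:.1f} mm average pitch")
```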
[...]
Like in the early days of cinema, the visual language and filmmaking tools were developing as the movie was being made. “We started off with nine Red cameras welded together to try to get the resolution we needed to make an image for Sphere,” Aronofsky remembers, adding that they then received the first prototype of the custom 18K Big Sky camera that was invented to create content for Sphere. That camera – used to lens most of Postcard – evolved during production as “we were also trying to figure out what was the language of how to shoot a 270-degree film, how to make audiences feel comfortable with their peripheral vision filled with imagery.”
[...]
Bringing Postcard to the Sphere (at capacity with standing room, Sphere accommodates up to 20,000, but Postcard screenings won’t use all of the seats) also meant a tight production schedule and a great deal of invention, including a complex production and postproduction workflow and new technology and processes for everything from reviewing work to moving huge amounts of data. Aronofsky reports that the movie involved a whopping half-petabyte of data.
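To get a feel for why half a petabyte is plausible for an 18K shoot, here is a rough illustration. Every figure in it is an assumption chosen for the example, not a published Big Sky specification:

```python
# Rough illustration of why half a petabyte is a plausible total for this shoot.
# Every figure below is an assumption for illustration, not a published Big Sky
# spec: ~18,000 x 18,000 sensor, 12 bits per pixel, 30 fps, ~5:1 compression.
width = height = 18_000
bits_per_pixel = 12
fps = 30
compression_ratio = 5

raw_bytes_per_frame = width * height * bits_per_pixel / 8   # ~486 MB per frame
bytes_per_second = raw_bytes_per_frame * fps / compression_ratio
tb_per_hour = bytes_per_second * 3600 / 1e12

print(f"~{bytes_per_second / 1e9:.1f} GB/s to storage, ~{tb_per_hour:.0f} TB per hour")
print(f"0.5 PB would hold ~{500 / tb_per_hour:.0f} hours of footage at these settings")
```

At those assumed settings the camera fills roughly 10 TB per hour, so a few dozen hours of material already lands in the half-petabyte range the article cites.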
Oppenheimer editor Jennifer Lame was recruited to cut the movie, which was done on an Avid Media Composer. A newly developed virtual reality program allowed her to review cuts as they might appear in Sphere (they also tested cuts in the quarter-size Big Dome at Sphere Studios in Burbank). Industrial Light & Magic and Digital Domain were among the companies that contributed work to the production.
But Aronofsky, Lame and the team ultimately couldn’t watch the movie in the actual Sphere setting until early September, further challenging postproduction.
Picture Shop colorist Tim Stipan (Aronofsky’s The Whale) graded the movie while the director’s longtime collaborator Craig Henighan served as supervising sound editor, designer and rerecording mixer. “Tim really had to figure out how to time these images. No one had ever timed an 18K image before,” says Aronofsky. “Same thing with the sound. The image being 270 degrees, you want the sounds to be in the right place. But you can’t really mix it on a normal movie screen because you don’t know exactly where that thing is happening. So we had to sort of guess and do our best, and then we got into the Sphere itself and the MSG team there figured out how we could actually use that big screen to actually mix the movie.”
For Stipan, the team installed a Baselight color grading system in a room at Sphere so that he could work in the actual environment. (Baselight maker FilmLight wrote new software to support the Sphere content.) Henighan, meanwhile, started by creating a Dolby Atmos mix and worked from there.
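Aronofsky's point about not being able to place sounds on a flat preview screen is, at bottom, a geometry problem: a source tied to something in a 270-degree image has to come from the matching angle in the room. A minimal sketch of that kind of mapping, purely illustrative and not Sphere's or Dolby's actual tooling:

```python
SCREEN_SPAN_DEG = 270.0  # horizontal coverage of the image, per the article

def screen_x_to_azimuth(x_norm: float) -> float:
    """Map a normalized horizontal image position (0.0 = far left edge of the
    canvas, 1.0 = far right edge) to an azimuth in degrees: 0 is straight
    ahead, negative is the listener's left, positive the right."""
    return (x_norm - 0.5) * SCREEN_SPAN_DEG

# Something at the left edge of the picture should sound 135 degrees to the
# listener's left; dead center should sound straight ahead.
for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"x={x:.2f} -> azimuth {screen_x_to_azimuth(x):+6.1f} deg")
```

On a conventional screen that spans maybe 40 to 50 degrees of the audience's view, an error in this mapping is barely audible; at 270 degrees a misplaced source can end up behind the wrong shoulder, which is why they ended up mixing in the venue itself.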
Shulkind – who has been working with Sphere for nearly four years and was instrumental in developing the Big Sky camera and workflow for filmmakers – remembers working in the Las Vegas venue during September. They had a few hours each morning to check the editing, color timing and sound, and then worked until midnight each day while U2 rehearsed and the crew was putting the finishing touches on the venue.
This even involved testing and preparing final elements, such as the wind effects, which come from the front of the venue. “It takes like 30 seconds for some of the wind to hit you, and so we had to time out how wind comes to the front row and the last row,” he explains. “They put these plastic cups with some tinsel on top of it so we could track when different areas were getting [wind]. … It’s been a tight month.”
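The wind-cue timing Shulkind describes is simple travel-time arithmetic: the gust leaves machines at the front of the venue and each row feels it after a delay that depends on distance. The distances and wind speed below are made-up illustrative numbers, not Sphere measurements; they are chosen so the far rows land near the roughly 30 seconds mentioned in the quote:

```python
# Toy version of the wind-cue timing problem: gusts leave machines at the front
# of the venue, so each row feels them after a distance-dependent delay.
# Distances and wind speed are illustrative assumptions, not Sphere data.
wind_speed_m_per_s = 4.0
rows_distance_m = {"front row": 10.0, "middle rows": 60.0, "last row": 120.0}

for row, distance in rows_distance_m.items():
    delay = distance / wind_speed_m_per_s
    # To land the gust on a picture cue, the effect has to fire this far ahead.
    print(f"{row:>11}: gust arrives ~{delay:4.1f} s after the fans start")
```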
Darren Aronofsky Describes His Journey to Creating the First Movie for the Las Vegas Sphere
'Postcard from Earth' debuted Friday at the new entertainment venue, which opened last week with U2's residency.
www.hollywoodreporter.com