Making Movies in the Not-So-Distant Future
You’ll never believe where I’ve just been! I recently returned from a possible future reality where I visited a movie set. The crew was so awesome and showed me so many phenomenal things. I can’t talk about the movie itself because they don’t want anybody to steal the idea before they get a chance to make it. But I can tell you about their workflow and the technology they were using.
Okay, let’s start with the camera. Man, that thing was amazing! It didn’t really look like a typical camera you’d see today, more like a giant eyeball on a gimbal. They rigged it all sorts of ways and controlled it with a tablet. And when I say controlled it, I mean pretty much every aspect you could think of. You see, it’s a light field camera, so it captures everything in front of it with depth. Through an app you can control things like focus, focal length, exposure, and frame rate. What was even crazier is that none of that was baked into the actual raw files! It was all settings passed along as metadata. You could also control physical movements of the gimbal, like pan and tilt, with the tablet. Camera position and movement are also saved as metadata, so camera tracking later isn’t even necessary for visual effects.
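If I had to guess at what that per-take metadata might look like, here’s a little sketch in Swift. Every type and field name here is my own invention for illustration, not anything I actually saw on their tablets:

    import Foundation

    // Look settings travel as metadata and are applied on playback;
    // the raw light field data is never touched. Hypothetical names.
    struct CameraSettings: Codable {
        var focusDistanceMeters: Double   // refocusable after the fact
        var focalLengthMM: Double
        var exposureEV: Double
        var frameRate: Double
    }

    struct GimbalSample: Codable {
        var time: Double                  // seconds from clip start
        var pan: Double, tilt: Double     // degrees
        var x: Double, y: Double, z: Double  // world position, so VFX needs no tracking
    }

    struct TakeMetadata: Codable {
        var clipID: UUID
        var settings: CameraSettings
        var cameraPath: [GimbalSample]    // sampled continuously during the take
    }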
I noticed that the camera assistants were never reloading fresh media cards, so I began to wonder where the data was being saved. Even crazier, almost everybody was getting a live feed from the camera on their tablets and smartphones, plus they could go back and watch earlier takes. The clips even had clean audio from the audio recorder, and all the information from the script supervisor was saved with them. How was all of this possible? I asked and found out that everything was being fed wirelessly to an on-set server. The server was processing all of the files, combining and labeling them, and keeping it all organized. It was automatically generating dailies with timecode and title overlays and then pushing everything (including the raw camera files) to the cloud. Most unbelievable of all, this was happening instantly!
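Here’s a rough Swift sketch of what I imagine that server doing for each take: match the wireless camera clip with its audio and script notes by clip ID, label it, and hand it off for dailies and upload. Again, all of these types and names are hypothetical, just my way of picturing the pipeline:

    import Foundation

    struct CameraClip { let id: UUID; let url: URL; let scene: String; let take: Int }
    struct AudioTake  { let forClip: UUID; let url: URL }
    struct SuperNote  { let forClip: UUID; let text: String; let circled: Bool }

    struct DailiesClip {
        let video: URL
        let audio: URL?
        let label: String      // e.g. "Scene 12, Take 3", burned in with timecode
        let notes: [String]
    }

    // Combine one camera clip with its matching audio and script notes.
    func assembleDailies(clip: CameraClip,
                         audio: [AudioTake],
                         notes: [SuperNote]) -> DailiesClip {
        DailiesClip(
            video: clip.url,
            audio: audio.first(where: { $0.forClip == clip.id })?.url,
            label: "Scene \(clip.scene), Take \(clip.take)",
            notes: notes.filter { $0.forClip == clip.id }.map(\.text)
        )
    }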
I had to go check out this server. It happened to be in the editing trailer, and it was about the size of today’s Apple TV. I couldn’t believe that the massive amounts of data for an entire feature film shoot could be stored there and in the cloud simultaneously! I noticed the editor was working on a large tablet, nicely mounted into a table, with the full-size picture on a bright 32” monitor in front of him. Magically, there was no cable running from the tablet to the monitor. In fact, I realized there weren’t many cables anywhere.
I was amazed by all of this and decided to strike up a conversation with the editor. He explained to me that a lot of the tasks he or an assistant would have done in the past were now automated. Not only could he see everything coming from the set live, but his video editing application was creating an assembly edit for him! I thought this sounded ridiculous. I mean, isn’t that the editor’s job? He explained how media and metadata from the set were analyzed, organized, and then combined with the shooting script. The more detailed the script, the more accurate the assembly.
I got to see how this works in action. As each successive take rolled, a clip would grow on the timeline. A new take would create an audition over the previous take. When the crew was done with a particular setup, the script supervisor would label the director’s favorite take, and that would become the selected clip in the audition. As they did more setups, the video editing program would live-switch between new angles depending on who had lines or what action was taking place in the scene. It kept doing this over and over until they completed the scene. It even added cutaway shots, but how? Well, for example, the script might call for a cutaway shot of a briefcase, and the program automatically knew where to cut that shot in! Unreal!
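To picture the switching logic, here’s a toy Swift sketch. The rules (script cutaways win, otherwise follow whoever has the line) are my guess at what the system was doing, and every name is hypothetical:

    struct Angle { let name: String; let covers: Set<String> }
    struct ScriptEvent { let speaker: String?; let cutawaySubject: String? }

    func pickAngle(for event: ScriptEvent,
                   setups: [Angle],
                   cutaways: [String: Angle]) -> Angle? {
        // A cutaway called for in the script wins (the briefcase example).
        if let subject = event.cutawaySubject, let shot = cutaways[subject] {
            return shot
        }
        // Otherwise switch to whichever angle covers the character with the line.
        if let speaker = event.speaker,
           let match = setups.first(where: { $0.covers.contains(speaker) }) {
            return match
        }
        // No dialogue or no coverage: hold on the first (widest) setup.
        return setups.first
    }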
This really got me wondering: why do you even need an editor? If he isn’t the one putting the scenes together, how was he earning a paycheck? The editor laughed at my questions and explained that while the artificial intelligence of these systems was good and could get a lot of the monotonous tasks out of his way, there was still a lot of creative work that needed to be done by humans. Humans have taste and style. And while perhaps those things can be somewhat emulated by a computer, it always feels inauthentic and artificial. Our emotional response to material is very subjective. You can’t tell a computer to make the scene funnier or sadder; it just doesn’t understand that. Maybe someday, but even at this time in the future, it still wasn’t fully realized.
For now, here’s what he actually had to do: he began by reviewing the automated assembly, making notes, and adjusting edits for timing. Maybe some shots or entire scenes needed reordering. Or maybe he needed a reaction shot where there wasn’t one. He then looked for ways to cut down the scene. Sometimes what looks good on paper, and even feels real on set, doesn’t jibe in the edit, so it would need to be omitted. It was his job to make these kinds of creative decisions in order to tell the story.
What really blew me away is how the producers and director, or even another editor halfway around the world, could collaborate with this editor in real time through the cloud. In fact, if a producer wanted to watch the latest edit of a scene, they didn’t need to come in and bother the editor. They didn’t even need to open the project in a video editing application, and the editor didn’t have to export and upload a movie file. The producer just had to log in online and stream the latest edit through a video player wherever they were, even on a home television set. Whenever they wanted to make a note, they just paused the video, typed it in or spoke into the remote, and pressed play again. The editor then received a notification and could choose to sync those notes right into his timeline immediately.
That all sounds awesome, but what if the editor is in the middle of making changes? Apparently the system is smart enough to track the edit the producer watched, compare it to what the editor is currently working on, and make the proper adjustments. For example, if the producer made a note on a specific shot and the editor had since moved that shot earlier, the change was tracked and the note would find its proper place. Honestly, a lot of this was over my head, but I can say that what I saw worked elegantly.
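My best guess at the trick: anchor each note to the shot’s identity plus an offset inside it, rather than to a timeline position, so moving the shot never orphans the note. A hypothetical Swift sketch:

    import Foundation

    struct ProducerNote {
        let clipID: UUID          // anchored to the shot itself, not a timecode
        let offsetInClip: Double  // seconds into that shot
        let text: String
    }

    struct TimelineClip { let id: UUID; var start: Double; let duration: Double }

    // Find where a note lands in the editor's *current* timeline,
    // even if the shot has been moved since the producer watched.
    func resolve(_ note: ProducerNote, in timeline: [TimelineClip]) -> Double? {
        guard let clip = timeline.first(where: { $0.id == note.clipID }) else {
            return nil   // the shot was cut entirely; show the note unanchored
        }
        return clip.start + note.offsetInClip
    }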
You might wonder how multiple editors collaborate on the same project. Well, their work is constantly backed up, and revisions are searchable, even by the name of the editor. It is really important that the editors communicate so they aren’t wasting each other’s time working on the same scene. But let’s say they didn’t communicate and did just that. Their two timelines can be merged, and where there are differences, a compound or nested clip is created for each variation of the edit and grouped together as an audition. They can show the director both versions and decide together which edit to go with. I thought that was a very elegant solution to a potentially messy situation.
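Here’s a little Swift sketch of that merge rule as I understood it: scenes both editors cut identically pass straight through, and anywhere the cuts differ, both versions are grouped as an audition. All hypothetical types, and it assumes scene names are unique:

    import Foundation

    struct SceneEdit: Equatable { let scene: String; let clipIDs: [UUID] }

    enum MergedScene {
        case agreed(SceneEdit)                             // both editors cut it the same way
        case audition(mine: SceneEdit, theirs: SceneEdit)  // differing versions, grouped
    }

    func merge(mine: [SceneEdit], theirs: [SceneEdit]) -> [MergedScene] {
        // Pair scenes by name (assumes one entry per scene in each timeline).
        let theirsByScene = Dictionary(uniqueKeysWithValues: theirs.map { ($0.scene, $0) })
        return mine.map { edit -> MergedScene in
            if let other = theirsByScene[edit.scene], other != edit {
                return .audition(mine: edit, theirs: other)
            }
            return .agreed(edit)
        }
    }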
This system was really responsive and fast, and I was curious about what was happening under the hood. I mean, the files generated by the camera must have been massive and therefore processor intensive, yet everybody was working off of mobile devices. It was invisible to me, but apparently what streamed to all the devices were extremely high quality proxy files, generated on the camera at the same moment it was capturing the raw files. The settings and metadata of the original clip were always accessible and changeable, and new proxy files could be regenerated at any time. Let’s say you wanted to reframe a shot: it instantly created a new proxy. Then, when it came time to export deliverables, all the original data was accessed to produce the highest quality output. But honestly, you didn’t have to think about any of that, because it just worked!
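If I sketch that in Swift (again, all hypothetical names), the key idea is that edits only touch metadata and regenerate proxies, while the raw original stays untouched until export:

    import Foundation

    struct LookSettings { var focalLengthMM: Double }  // stand-in for the full settings

    struct Clip {
        let rawURL: URL      // massive light field original, on the server/cloud
        var proxyURL: URL    // lightweight file actually streamed to devices
        var look: LookSettings
    }

    // Stub: a real system would re-render the light field with the new
    // settings and swap in the fresh proxy.
    func renderProxy(from raw: URL, with look: LookSettings) -> URL {
        raw.deletingPathExtension().appendingPathExtension("proxy.mov")
    }

    // Reframing changes only metadata; the raw file never changes.
    func reframe(_ clip: inout Clip, focalLengthMM: Double) {
        clip.look.focalLengthMM = focalLengthMM
        clip.proxyURL = renderProxy(from: clip.rawURL, with: clip.look)
    }

    // Playback reads the proxy; export goes back to the raw original.
    func mediaURL(for clip: Clip, export: Bool) -> URL {
        export ? clip.rawURL : clip.proxyURL
    }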
As I returned from my trip to the future, I was obviously overwhelmed. How on earth did we come so far? As I thought long and hard about that, I realized that the seeds of this technology already exist today. In fact, many pieces of the puzzle are already being used or are in development. Companies like Lytro, Light Iron, Lumaforge, and Intelligent Assistance are doing a lot to push toward this kind of future. But I also had a newfound appreciation for what Apple is doing with Final Cut Pro X. It thrives on metadata organization: Content Auto-Analysis, range-based Keywords, Smart Collections, Compound Clips, Auditions, and Custom Metadata were crucial to making the system I saw in the future work properly. The proxy workflow in Final Cut is already seamless today. Not only that, but the current Magnetic Timeline will make editing a breeze on multitouch devices; a track-based timeline would be cumbersome and slow on the future system. Is this the future that Apple and others are already seeing now? They must at least be seeing something like it.
Before I left, I was reminded that this was just one potential future, and there could be variations depending on what we decide to do today. Those who embraced the changing world now had a huge advantage in the future. The ones who got it were creating at their peak potential. The ones who didn’t were bitter and unemployed. I realized that even though many people had held on to what they already knew and fought hard to prevent this future from happening, the change was inevitable. Somebody out there was taking advantage of new technologies and workflows and pushing them forward. It really is in my best interest to always be learning and keeping up with these changes. I’m not talking about blindly embracing every new idea, but testing things out, breaking them, and improving them is how we realize the full potential of the future. If we have this attitude, I really think the future I saw could become a reality within the next decade. I really hope that’s the case, because it was incredible!