As anyone who has gotten into a knife fight over an electrical outlet at an airport terminal can attest, computers control our lives now. And while the good ones deliver us funny dog videos and pornography, the bad ones are trying to steal our jobs. But while we tend to associate the threat of automation with factory workers and travel agents, those artistic types in Hollywood ought to be sleeping with one eye open too. That’s because …
5
Smart Cameras Are Replacing Camera Operators
Being a camera operator requires a steady hand, a bunch of technical know-how, and enough social skills not to bash the director’s head in over his unreasonable demands to “just shoot it upside-down.” It also requires having a camera, we should point out. And while advances in camera technology are making the camera operator’s job easier by the minute, how easy can a job get before it simply stops existing?
Robotic arms aren’t only an issue if you’re an autoworker in Detroit. For Microsoft’s new Surface Studio commercial, the director used KIRA, a robotic arm that handled all of the camera movement:
Rather than relying on crappy humans who shake the camera with their stupid breathing and pulses, the cold, emotionless robot is able to move the camera smoothly and repeatably, to the director’s exact liking. That means every single reshoot will be the same, to the millimeter. Now, this technology isn’t exactly new: the famous dinner scene from Back To The Future II, in which Michael J. Fox plays three of the characters, was one of the first uses of similar motion-control technology in film. The difference now is that instead of using robot rigs to shoot scenes that physically cannot be shot by a human, we’re using them for things as mundane as TV commercials. Or Gravity.
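For the nerds in the back: the reason a robot arm can nail the same move take after take is that the “move” is just a list of keyframes it replays, with interpolation filling in the frames between them. We have no idea what KIRA’s actual control software looks like, but the core idea fits in a toy sketch:

```python
# Toy motion control: a camera "move" is keyframes plus interpolation,
# so every take evaluates to identical positions. The keyframes are
# invented; this is not KIRA's actual software.
import numpy as np
from scipy.interpolate import CubicSpline

times = np.array([0.0, 2.0, 5.0, 8.0])           # seconds
positions = np.array([[0.0, 0.0, 1.5],           # x, y, z in meters
                      [1.0, 0.5, 1.5],
                      [2.0, 0.5, 2.0],
                      [2.5, 0.0, 1.8]])

path = CubicSpline(times, positions)  # smooth curve through the keyframes

def run_take(fps=24):
    """Sample the spline at every frame time; deterministic on every take."""
    frame_times = np.arange(0.0, times[-1], 1.0 / fps)
    return path(frame_times)  # (n_frames, 3) array of camera positions

take_one, take_two = run_take(), run_take()
assert np.array_equal(take_one, take_two)  # same move, to the millimeter
```

Sample the same spline twice and you get the same answer twice. Camera operators’ breathing and pulses need not apply.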
But even KIRA needs a human master to operate it. The next generation of cameras will be calling the shots with their own cold robot brains. In The Robot Skies was released in late 2016, and is the first movie to be shot entirely with drones. So what’s the big deal? Camera operators have been using drones to line up tricky shots since wearing T-shirts under blazers was fashionable. The big deal is that Robot Skies used an entirely new breed of drone. Old drones still had humans operating them, deciding what shots would look good and how the camera should move. Working with an artificial intelligence lab in Belgium, the Robot Skies filmmakers built drones with “cinematic algorithms” that let the little buggers decide for themselves what angles and lighting would look good, and adjust their flight paths accordingly. With enough research, we could very well be seeing movies in the future from Steven Spielbot, Wes Andercyborg, and QuIntel Tarantino.
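And in case “cinematic algorithms” sounds like marketing mist: nobody outside that Belgian lab has handed us the drones’ code, but the general shape of such a thing is “score every spot you could fly to against some rules of what looks good, then go to the winner.” A crude, entirely made-up sketch:

```python
# Made-up "cinematic algorithm": score candidate drone positions on a
# couple of hand-rolled aesthetic rules and fly to the best one. The
# rules and weights here are ours, not In The Robot Skies'.
import math

def shot_score(cam, subject, sun):
    """cam, subject, sun are (x, y, z) tuples; higher = prettier shot."""
    dist = math.dist(cam, subject)
    framing = -abs(dist - 4.0)  # rule 1: prefer a medium shot, ~4 m out
    # Rule 2: shoot with the light, not into it (sun behind the camera),
    # i.e. the directions to the subject and to the sun should oppose.
    to_subject = [s - c for s, c in zip(subject, cam)]
    to_sun = [s - c for s, c in zip(sun, cam)]
    dot = sum(a * b for a, b in zip(to_subject, to_sun))
    lighting = -dot / (dist * math.dist(cam, sun) + 1e-9)
    return framing + lighting

subject, sun = (0, 0, 1.7), (100, 0, 80)
candidates = [(x, y, 2.5) for x in range(-8, 9, 2) for y in range(-8, 9, 2)]
best = max(candidates, key=lambda cam: shot_score(cam, subject, sun))
print("fly to", best)  # a real drone would replan this constantly
```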
4
Even Scripts Are Being Written By Machines
As much as engineers would like to try, it’s impossible to replace all liberal arts majors with a bunch of machines. Take writers, for example. Surely they must be immune to the rise of the machine worker, right? Right? Well, while a robot may never write the next Moby Dick, it wouldn’t take more than a toaster strapped to a typewriter to come up with garbage like Dumb And Dumber To. The machine writer is coming, so you better get your ass in gear and finish that Goonies 2 spec script before it does.
In 2016, an independent filmmaker named Jack Zhang started a Kickstarter for a horror movie called Impossible Things. He claimed that 85 percent of movies don’t make money because studios are taking a mishmash of things and not considering what the audience wants to see, which is an odd criticism to aim at an art form that pays marketing departments to host test audiences. To reintroduce populism into moviemaking, Zhang decided to feed plot points from the most popular horror movies into a computer and create the most popular story arc possible. The result was “a grieving mother who, after the death of her young daughter, succumbs to a severe case of supernaturally induced insanity.” Oh, and the trailer should feature a scene with a piano and a bathtub. If that sounds like a mishmash of every bad horror movie you’ve ever seen, that’s kind of the point.
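We haven’t seen Zhang’s actual method, but “the most popular story arc possible” plausibly boils down to frequency counting: tally the story elements across a pile of hit horror movies and keep the ones that show up the most. Something like this, with an invented corpus, obviously:

```python
# Toy "most popular story arc": count how often each story element
# appears across hit horror plots and keep the top scorers. The corpus
# below is invented for illustration.
from collections import Counter

hit_horror_elements = [
    ["grieving mother", "dead child", "haunted house", "bathtub scene"],
    ["grieving mother", "possession", "piano", "bathtub scene"],
    ["dead child", "possession", "piano", "insanity"],
    ["grieving mother", "insanity", "haunted house", "piano"],
]

counts = Counter(el for movie in hit_horror_elements for el in movie)
arc = [element for element, _ in counts.most_common(5)]
print("your statistically optimal horror movie:", arc)
# e.g. ['grieving mother', 'piano', 'dead child', ...]
```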
The Impossible Things trailer was at least enough for the Kickstarter to be fully funded, giving this indie horror a budget of a whopping 30,162 Canadian dollars. Still, we’re not trying to shit on horror movies here, but it might be easier to convince people of computer-generated storytelling by looking at a genre that’s a bit more story-driven. Sci-fi might be a step in the right direction. Take Sunspring, a short film experiment made for a 48-hour film festival in London and written by an AI program called Benjamin. The producers fed the scripts of dozens of popular movies into this neural network, and it spat out a screenplay, complete with dialogue, based on the prompts given to it. They then made a nine-minute film based on Benjamin’s screenplay:
The movie is amusing, in an uncanny valley sort of way. Most of the dialogue is what could be called “coherent gibberish” — the sentences are grammatically correct (mostly), but they are otherwise incomprehensible. This goes for the directions as well, like this:
“When you think about it, aren’t we all standing in the stars, man?” *bong rip*
Ironically for a sci-fi movie written by a robot, there’s not a lot of science going on in the plot. The dialogue is mostly about misunderstandings, love triangles, and disappointing sex. The movie ends with a nonsensical Gone Girl-esque monologue about the regrets of lost virginity. Despite being utter nonsense, the movie is still kind of engrossing, even if it’s in a cloning-experiment-gone-wrong sort of way. Maybe the problem here is that Benjamin isn’t in the right business. Maybe its true calling is being an electronic songwriter:
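For the curious: Benjamin is a long short-term memory (LSTM) network, which learns what tends to come next in a sequence of text and then samples it out one character at a time. A real LSTM needs a GPU and a training montage, but the kindergarten version of the same predict-then-sample idea is a character-level Markov chain, which fits in a few lines:

```python
# Kindergarten Benjamin: a character-level Markov chain. A real LSTM
# learns much longer-range structure, but the loop is the same idea:
# learn what follows what, then sample one character at a time.
import random
from collections import defaultdict

def train(text, order=4):
    model = defaultdict(list)
    for i in range(len(text) - order):
        model[text[i:i + order]].append(text[i + order])
    return model

def generate(model, seed, order=4, length=300):
    out = seed
    for _ in range(length):
        choices = model.get(out[-order:])
        if not choices:
            break  # dead end: this context never appeared in training
        out += random.choice(choices)
    return out

# Stand-in corpus; Benjamin trained on piles of actual sci-fi scripts.
corpus = ("He is standing in the stars and sitting on the floor. "
          "He is not sure what is going on. ") * 20
print(generate(train(corpus), seed="He is"))
```

Feed it forty sci-fi screenplays instead of two sentences and you get exactly the kind of coherent gibberish described above.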
3
We’re Teaching Computers To Be Animators
We’ve talked several times before about how getting into VFX or the CGI industry in Hollywood these days is a bit like getting into the anchor-selling business on the sinking deck of the Titanic. The companies spend so much time undercutting one another that they can’t turn a profit on their work, leading to situations like that of Rhythm and Hues, a VFX company that went bankrupt from working on Life Of Pi two weeks before winning an Oscar for their work on Life Of Pi. So naturally, the industry is working tirelessly to reform and make sure that these artists are properly compensated for their work.
Just kidding. They’re trying to replace the artists with computers, because in addition to being less temperamental, they’re also far less needy. But can they truly distill beauty like a visual artist can?
Yup.
Since our progress in the field of Frankenstein-like reanimation has been frustratingly slow, Microsoft and ING teamed up to create a machine that can pretend to be dead people. Rembrandt, more specifically. The computer, appropriately called “the Next Rembrandt,” employs complex algorithms to generate an entirely new painting in the style of Rembrandt. And we don’t just mean that the computer generated a digital replica of a Rembrandt; it recreated the brush strokes and textures using a 3D printer. While it might not be enough to fool experts, it’s certainly good enough for your parents to see it and cancel payments for your art school degree.
But this potential revolution is not without its critics. Keisuke Iwata, a Japanese animator and president of a popular anime channel, sees projects like the Next Rembrandt as the harbinger of doom for meatbag animators. Iwata believes that in the near future, computers will be able to compete with humans in terms of creativity and skill, and computers don’t have preposterous demands like “compensation” or “healthcare.”
Mostly.
Studio Ghibli director Hayao Miyazaki, whom we can reasonably call the god emperor of animation, thinks that this AI animation is some depressing nonsense. During a demonstration of AI animation software, which was being used to generate unusual body movement for a horror game (computers can’t think of a reason not to use a head as a foot), Miyazaki wasted no time in saying that he was disgusted and called the demo “an insult to life itself,” which would be pretty stiff criticism coming from a random YouTube commenter, much less one of the most influential animators of all time. He went on to lament that, in our eagerness to figure out ways to outsource our creativity, “humans are losing faith in ourselves.” He’s not wrong. Using a head as a foot? That’s the wave of the future? A bunch of second-graders came up with that exact same idea in the last five minutes. C’mon humanity, we still got a few good decades left in us.
2
Cinematography Can Be Done In Post-Production
From the beginning of cinema up to the olden days of the mid ’90s, there wasn’t much dispute over what exactly the director of photography did. While directors were busy yelling at actors, directors of photography worked tirelessly on set to make a movie scene look as good as possible, a lot of which involved waiting patiently for the sun to get into the right fucking spot for the perfect lighting. With the rise of digital cinematography, however, it’s become more and more difficult to determine who should accept the Oscar for Best Cinematography: the director of photography or the green screen?
Oscar contenders with a lot of post-production have drawn criticism from all the insufferable artsy cinematographers who insist on doing things the old-fashioned way. For example, in The Hateful Eight, Quentin Tarantino and cinematographer Robert Richardson not only decided to shoot everything on traditional film, but they also did all the post-production work like color correction using chemical developing techniques (nearly all modern movies shot on film are still digitized for post-production work). For his efforts, Richardson was nominated for Best Cinematography in 2015. Contrast that with 2012, when Claudio Miranda won Best Cinematography for Life Of Pi, even though most of the movie was made on a computer. While Miranda undoubtedly deserves recognition for his camerawork, we can also see Richardson’s point that there’s a pretty big difference between capturing a gorgeous sunset with the right lighting, lens, film, and camera settings versus just CG-ing a sunset later on.
But even in Life Of Pi, Miranda still had to do stuff like focus the camera and use the right kind of lens for the shot. We’re quickly making that a thing of the past as well. A company called Lytro has developed a new type of camera which, through science/magic, captures the entire light field of a scene (the direction and brightness of every ray of light entering the lens) instead of the flat 2D images most cameras produce.
If you’re really a huge nerd, here’s a 25-minute video about it.
With a normal camera, you would need to reshoot the same scene with three different focus and aperture settings to capture the three images above. With Lytro’s Cinema camera, you only need to take one picture and then tell a computer which parts of the scene you want in focus and which you don’t, since it captures the scene in 3D instead of the flat 2D of a regular camera. You can also completely remove or replace everything beyond a certain depth, essentially making even green screens obsolete. With technology like Lytro’s, cinematographers will have to relearn what the job entails. And if we know greedy studios, that job will entail learning how to say “Do you want fries with that?” without bursting into tears.
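If you want the one-picture-many-focuses trick without the 25-minute video: a light field camera records many slightly offset views of the same scene, and refocusing afterward means shifting each view in proportion to its offset and averaging them all, so one depth snaps into alignment while everything else blurs. A bare-bones sketch (assuming you already have the sub-aperture views as arrays; real Lytro files are far less cooperative):

```python
# Shift-and-sum refocusing: each sub-aperture image is the scene seen
# from a slightly different spot on the lens. Shifting every view in
# proportion to its (u, v) offset before averaging chooses which depth
# lands in focus. Assumes views[u][v] holds grayscale 2D numpy arrays.
import numpy as np
from scipy.ndimage import shift

def refocus(views, alpha):
    """views: grid of HxW images; alpha: focus knob (0 = as captured)."""
    n_u, n_v = len(views), len(views[0])
    cu, cv = (n_u - 1) / 2, (n_v - 1) / 2
    acc = np.zeros_like(views[0][0], dtype=float)
    for u in range(n_u):
        for v in range(n_v):
            acc += shift(views[u][v], (alpha * (u - cu), alpha * (v - cv)))
    return acc / (n_u * n_v)

# Toy light field: a 5x5 grid of noise images, just to show the call.
views = [[np.random.rand(64, 64) for _ in range(5)] for _ in range(5)]
near_focus, far_focus = refocus(views, 1.5), refocus(views, -1.5)
```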
1
Post-Production Will Be Done By AI
Eventually, film sets will be nothing more than Tom Cruise shadowboxing in the Universal basement, with someone filling in the blanks three months later. Except by that time, even that someone will almost certainly also be a computer.
Post-production, or just “post” if you’re the type who thinks shooting one student short makes you part of show business (or just “showbiz”), encompasses a lot of different things. One aspect is the addition of sound effects, which ranges from T-Rex roars and lightsaber whooshes to mundane stuff like leaves rustling and doors closing. Researchers at MIT decided to see if they could teach a computer to match up sound effects with certain on-screen actions, and what do you know, it worked! Their little silicon-powered editor automatically added sound effects to a series of video clips, and human test subjects were unable to tell the difference between the computer’s work and authentically recorded sounds.
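MIT’s system is a neural network trained on videos of a drumstick whacking things; it predicts what a moment should sound like, then grabs the closest-matching real recording from a library. We’re not rebuilding their network here, but the “grab the closest real sound” half is just a nearest-neighbor lookup, with everything below invented for illustration:

```python
# The retrieval half of automatic Foley: pretend a video model produced
# a feature vector for a clip, then pull the library sound whose features
# sit nearest. Features and library entries are invented for illustration.
import numpy as np

sound_library = {
    "leaves_rustling.wav": np.array([0.1, 0.9, 0.2]),
    "door_closing.wav":    np.array([0.8, 0.1, 0.3]),
    "trex_roar.wav":       np.array([0.9, 0.8, 0.9]),
}

def pick_sound(predicted):
    """Nearest neighbor in feature space stands in for the real network."""
    return min(sound_library,
               key=lambda name: np.linalg.norm(sound_library[name] - predicted))

# Pretend the video model watched a door swing shut and predicted this:
print(pick_sound(np.array([0.75, 0.15, 0.25])))  # -> door_closing.wav
```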
Editing is on its way to being automated as well. It’s an expensive process: turning miles of film (or hundreds of hard drives) showing the same actor mispronouncing the word “spoon” 20 times in a row into a masterpiece. Naturally, filmmakers are keen to find cheaper ways to do it. In 2014, a group of researchers working for Disney published a paper on an automatic editing algorithm they created. By calculating the 3D position of the cameras in a scene, their software could determine what each camera was focusing on and use that information, along with some basic filmmaking rules, to decide when to cut to different shots. Here’s a sample video filmed using some smartphones and GoPros:
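The clever part of the Disney paper is reconstructing where the cameras are in 3D; the cutting logic on top amounts to “show the best camera at each moment, but don’t cut like a maniac.” Here’s that logic as a toy, with scores we made up:

```python
# Toy auto-editor: given per-second "how well does this camera frame the
# action" scores (invented here; the Disney system derives such signals
# from reconstructed 3D camera poses), show the best camera each moment,
# with a switching cost so the edit doesn't strobe between shots.
import numpy as np

scores = np.array([  # rows = cameras, columns = seconds
    [0.9, 0.8, 0.2, 0.1, 0.3, 0.9],
    [0.3, 0.4, 0.8, 0.9, 0.4, 0.2],
    [0.5, 0.5, 0.5, 0.6, 0.8, 0.5],
])
CUT_COST = 0.4  # a new shot must beat the current one by this much

def auto_edit(scores):
    current = int(np.argmax(scores[:, 0]))
    timeline = [current]
    for t in range(1, scores.shape[1]):
        best = int(np.argmax(scores[:, t]))
        if best != current and scores[best, t] - scores[current, t] > CUT_COST:
            current = best  # worth a cut
        timeline.append(current)
    return timeline

print(auto_edit(scores))  # camera to show each second: [0, 0, 1, 1, 1, 0]
```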
But this isn’t just for editing your snowboarding fails or sex tapes. In 2016, the producers of the horror movie Morgan decided to outsource their trailer to Watson, the IBM supercomputer that made Jeopardy champ Ken Jennings look like the guy you skip over when picking a team for bar trivia night. Specifically, they wanted it to be scary, so IBM first had to teach Watson what fear is and what, in particular, humans are afraid of. Then they fed it the movie, which is about an AI that becomes too scary for humans so they try to destroy it, and told Watson to make us shit our pants.
We can’t help but notice that this trailer contains neither a bathtub nor a piano.
It might not be perfect, but for a first attempt, Watson still has a disturbingly good grasp on what gives humans the absolute heebie-jeebies. So thanks to Morgan, we now have an advanced computer intelligence that knows how to manipulate human emotions. But hey, it saved some editor a day’s work, so all in all, a fair trade.
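And if you’re wondering what “teaching a computer fear” cashes out to in practice: roughly, score every scene on fear-adjacent features and keep the scariest minutes. IBM’s real system analyzed the visuals, audio, and composition of 100 actual horror trailers; ours is a napkin sketch with everything invented:

```python
# Napkin Watson: score each scene on fear-adjacent features, keep the
# scariest for the trailer. Scenes, features, and weights are invented.
scenes = [
    {"start": 12,   "darkness": 0.9, "sudden_noise": 0.1, "face_closeup": 0.8},
    {"start": 440,  "darkness": 0.4, "sudden_noise": 0.9, "face_closeup": 0.9},
    {"start": 2013, "darkness": 0.8, "sudden_noise": 0.8, "face_closeup": 0.2},
]
weights = {"darkness": 1.0, "sudden_noise": 1.5, "face_closeup": 0.8}

def fear_score(scene):
    return sum(w * scene[feature] for feature, w in weights.items())

trailer = sorted(scenes, key=fear_score, reverse=True)[:2]
print([s["start"] for s in trailer])  # timestamps to hand a human editor
```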
When he’s not teaching Watson how to produce constant low-level anxiety in humans, Chris plays piano in the bathtub on Twitter.