Today at 10am PST, Instagram is lifting the veil on Hyperlapse, one of the company’s first apps outside of Instagram itself. Using clever algorithmic processing, the app makes it easy to use your phone to create tracking shots and fast time-lapse videos that look as if they were shot by Scorsese or Michael Mann. What was once only possible with a Steadicam or a $15,000 tracking rig is now possible on your iPhone, for free. (Instagram hopes to develop an Android version soon, but that will require changes to the camera and gyroscope APIs on Android phones.) And that’s all thanks to some clever engineering and an elegantly pared-down interaction design. The product team shared their story with WIRED.
By day, Thomas Dimson quietly works on Instagram’s data, trying to understand how people connect and spread content using the service. Like a lot of people working at the company, he’s also a photo and movie geek—and one of his longest-held affections has been for Baraka, an art-house ode to humanity that features epic tracking shots of peoples all across the world. “It was my senior year, and my friend who was an architect said, ‘You have to see it, it will blow you away,’” says Dimson. He wasn’t entirely convinced. The movie, after all, was famous for lacking any narration or plot. But watching the film in his basement, Dimson was awestruck. “Ever since, it’s always been in the back of my mind,” he says.
By 2013, Dimson was at Instagram. That put him back in touch with Alex Karpenko, a friend from Stanford who had sold his startup, Luma, to Instagram that same year. Karpenko and his firm had created the first-ever image-stabilization technology for smartphone videos. That was obviously useful to Instagram, and the company quickly deployed it to improve video capture within the app. But Dimson realized it had far greater creative potential: Karpenko’s technology could be used to shoot videos akin to all those tracking shots in Baraka. “It would have hurt me not to work on this,” says Dimson.
The insight that powered Karpenko’s algorithms began, like so many other startup ideas, as a PhD thesis at Stanford. This was 2010, and the iPhone 4 had just come out: one of the first phones that could capture HD video. That sounded terrific in theory, but cramming such a great video camera into a handheld device meant that the videos themselves were often shaky to the point of being unwatchable. “They were all just crappy,” Karpenko says.
He knew that image stabilization was the answer, but the technologies of that time, which you’d find in Final Cut and myriad other video-editing programs, were simply unworkable for smartphones. Why? Imagine a video clip taken from a moving car. To even out the juddering camera motion, image-stabilization algorithms typically analyze a movie frame by frame, identifying image fragments common to consecutive frames. By recording how those shared points jump around from frame to frame, the algorithms can infer how the camera has been moving. By inverting that motion, software can then render a new, steadier version of the clip. Yet every step in that process demands serious processing muscle. That’s fine for a movie studio, which has massive computers that crank overnight to re-render a scene. It’s ridiculous for a smartphone.
Inspired by a demo in which he saw gyroscopes attached to cameras to de-blur their images, Karpenko had an aha moment: Smartphones didn’t have nearly enough power to replicate video-editing software, but they did have built-in gyroscopes. On a smartphone, instead of using power-hungry algorithms to model the camera’s movement, he could measure it directly. And he could funnel those measurements through a simpler algorithm that mapped one frame to the next, giving the illusion that the camera was being held steady. He mocked up a simple demo, filming a dot on his wall while deliberately shaking his hand. “The images in the test matched up almost exactly, and that’s when I knew this was doable,” Karpenko says.
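The gyroscope trick can be sketched with simple arithmetic. Assuming a stream of gyro readings (angular velocity in radians per second) sampled alongside the frames, integrating them yields the camera’s rotation at each instant with no image analysis at all; warping each frame by the inverse rotation then cancels the shake. The 2D rotation warp below is a deliberate simplification for illustration (the real geometry involves the full 3D rotation, the camera intrinsics, and, in Karpenko’s published work, rolling-shutter correction):

```python
import numpy as np

def integrate_gyro(omega, dt):
    """Integrate angular-velocity samples (rad/s) taken every dt
    seconds into a camera angle (rad) at each sample time."""
    return np.cumsum(omega) * dt

def correction_warp(theta, cx, cy):
    """2x3 affine warp rotating a frame by -theta about the image
    center (cx, cy), cancelling the measured camera rotation."""
    c, s = np.cos(-theta), np.sin(-theta)
    # translate center to origin, rotate, translate back
    return np.array([
        [c, -s, cx - c * cx + s * cy],
        [s,  c, cy - s * cx - c * cy],
    ])

# Simulated shaky hand: zero-mean angular jitter sampled at 100 Hz.
rng = np.random.default_rng(1)
omega = rng.normal(0.0, 0.3, 200)           # gyro readings, rad/s
theta = integrate_gyro(omega, dt=0.01)      # camera angle per sample
warp = correction_warp(theta[-1], cx=320.0, cy=240.0)
```

Measuring motion costs almost nothing, and applying one warp per frame is cheap enough for a phone GPU, which is why this direct-measurement route succeeded where the frame-analysis route failed.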