Lytro changed photography. Now can it get anyone to care?

An exclusive look at the company's new Illum camera

By David Pierce

“Okay, can I take it out of the box now?”

Lytro product director Colvin Pitts wants to show me the camera he’s been working on since 2007. He cautions that it’s just an early model, then gently lifts the stark, black device out of an unmarked box. It looks like a cross between a DSLR and a futuristic weapon. It’s big, with a wide round lens and a large grip, but it weighs less than 2 pounds and is perfectly comfortable in my hands. Its back face is slanted, like someone chopped off part of a larger camera to form this one. Its big, 4-inch touchscreen is glowing. I hold the camera up, point it at the black Sharpie on the table in front of me, and press the shutter. Nothing happens. I press it again. Still nothing.

“Now this is the part where I go back to that caveat at the beginning,” Pitts says. “My camera has frozen in the box.” It’s stuck on the menu screen and won’t budge. The camera, which Todd Roesler, senior director of hardware engineering, quickly swaps for a functioning model, is one of a handful of first-run production models delivered from China. It’s less than three weeks before launch, less than four months before the product codenamed Blitzen is scheduled to ship to customers, and Lytro has just gotten its first taste of how the product will look and work. There’s clearly a lot left to do. “This is the final form,” Pitts says, “but the colors and a lot of things are off. The rings are awful: we screwed up making them, so when you feel it, this is nothing like what the product will feel like.” Even now, it feels solid and polished, with just the right controls in the right places and an instantly familiar usability. The rings do feel horrible, though, loose and rubbery and too quick to turn.

A few tweaks here and there and this black brick will be Lytro’s Illum, a brand-new $1,599 camera designed to show professional photographers, and the world, the power of light-field photography. It’s the company’s second camera, the follow-up to its eponymous point-and-shoot that could refocus a photo after it was shot. The Illum does that better, and takes much better and more versatile pictures in general. But for Lytro, the real plan is only beginning to unfold. The company’s job, its mission, is to fundamentally change the way we think about images. To not just provide better, faster cameras that take beautiful pictures, but to reimagine what a picture is in the first place. That part hasn’t changed since the dawn of photography nearly two centuries ago, and Lytro believes it holds the keys to the next phase.

How to build a hardware company

Since modern photography was invented in 1837, everything and nothing has changed. The tools have evolved — we shoot with different cameras and no longer need darkrooms for developing or controlled explosions for flash — but we're still capturing static, two-dimensional images. Yet inside a nondescript office park on Terra Bella Avenue in Mountain View, California, a new future is coming into focus.

It’s been a long road for Lytro since the company’s founding in 2006. Its first camera was made entirely with off-the-shelf parts, essentially a prototype that turned out well enough that the company stuck it on Target shelves. It was built by founder and executive chairman Ren Ng and three others, who sourced the fewest, best components they could find and hacked together a way to take light-field shots with what amounted to traditional camera hardware. At one point Lytro’s primary manufacturing partner lost the master recipe for the microlenses, the camera’s most important component, and it took six months before production ramped back up.

This time the team is larger and more experienced (and has diversified its partnerships), not to mention finally armed with enough cachet to get suppliers for whatever it needs. The first camera opened the doors required for Lytro to build exactly what it wanted, and what it wanted was to make Illum, an entirely new kind of camera. "With Illum," Ng says, "we’re able to start to customize that supply chain in a very deep way… to rethink the entire imaging pipeline." It began with the lens, a big round barrel with an 8x zoom and a constant f/2.0 aperture throughout the range. Lytro built a lens that can focus on a subject physically touching its glass, and can shoot with remarkably fast shutter speeds (though it still takes a second or two to process each shot, like an old-school Polaroid spitting out a print).

That kind of versatility and brightness is unheard of in the camera industry. The Illum has very little glass, and none of the complicated, expensive aspherical elements that traditional cameras require to focus light onto the sensor. That's because the Illum isn’t really capturing a photograph in any traditional sense when you press the button; it doesn’t commit a finished image to the sensor the way a traditional camera does. Instead, it’s capturing raw data about the light itself, and the processor builds a picture later. "So you can make cheaper lenses if you want, you can make lower-quality lenses acceptable, and you can build new lenses that have higher performance than you’ve ever seen before."

The Illum also has a square blue "Lytro Button" next to its shutter release, which maps in real time the refocusable range of whatever photo you’re framing. A green outline marks everything at the front of the scene, shading to deep orange at the back — it constantly shows exactly how someone might later tap or click through your photo. It’s a huge aid in understanding how what you see through the camera will translate into an interactive photograph. "Not only are you able to think in that extra dimension," Lytro’s marketing chief Azmat Ali tells me, "we’ll help you to be able to frame that extra dimension. And when you can frame that extra dimension, your creativity is set free."
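To picture what that overlay is doing, here is a minimal, hypothetical sketch (not Lytro's firmware; the function and parameter names depth_overlay, near, and far are invented for illustration) of how a per-pixel depth estimate could be tinted from green at the near end of the refocusable range to orange at the far end:

```python
import numpy as np

def depth_overlay(depth, near, far):
    """Tint a per-pixel depth map from green (near) to orange (far).

    depth: H x W array of depth estimates (in the real camera these would
           come from the light-field data itself; here it's just an input).
    near, far: bounds of the refocusable range at the current zoom/focus.
    Returns an H x W x 3 RGB image suitable for blending over the live view.
    """
    # Normalize depth into [0, 1] across the refocusable range.
    t = np.clip((depth - near) / (far - near), 0.0, 1.0)[..., None]
    green = np.array([0.0, 0.8, 0.2])   # near end of the range
    orange = np.array([1.0, 0.5, 0.0])  # far end of the range
    return (1 - t) * green + t * orange
```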

As I wander the offices taking pictures of everything from the catered lunch spread (an impressive taco bar) to the unsuspecting Lytro engineers, it quickly becomes clear that shooting with the Illum is all about depth, about communicating the scale of the scene. It took a little more work and a lot more stage direction to get a great shot, but every shot was more interesting than its static equivalent would have been. Each one was like looking into a dollhouse, a tiny 3D representation of the world with new decorations and rooms everywhere I looked.

Pitts next hands me an iPhone running a basic version of the app Lytro will release alongside Illum. It’s a grid of images, like a thousand other apps. But when I tap an image to open it, the photo wobbles as it springs into place. As it moves, the perspective shifts — I briefly see shadows and reflections change, and peer around the side of a gun barrel. I tap on the front of the barrel, and it snaps into focus. Then I tap the man holding the gun, and his grimacing face sharpens before my eyes. Before I know it, I’ve spent three minutes tapping different parts of the screen, exploring every part of the scene: his hands, the gun, the costumed insanity in the background at San Diego Comic-Con. It’s a different photo each time, a story I’m helping the photographer narrate. The next photo, a close-up shot of a tape measure, shifts its focus backward to reveal the rough-handed carpenter in the background. Each photo feels more immersive, more memorable than others I’ve seen. More real, somehow.

Before Lytro was Lytro, it was Ng’s 203-page computer science thesis, entitled simply "Digital Light Field Photography." It’s based on two decades of research into "computational photography," a catch-all term for using computation to capture more than a camera’s optics alone can. Ng combined it with his original research at Stanford in an area called re-lighting — using computer graphics to model how light behaves and changes within virtual worlds — to explore how computation could fundamentally change the things we see. His research and thesis focused on how light becomes data, and data becomes photos.

Light-field photography has been discussed since the 1990s, beginning largely with three Stanford professors, Marc Levoy, Mark Horowitz, and Pat Hanrahan. (The term "light field" was coined in 1936, and Gabriel Lippmann created something like a light-field camera in 1908, though he didn’t have a name for it.) Rather than only measuring the color and intensity of light as it hits the sensor, a light-field camera passes that light through an array of tiny lenses (hundreds of thousands of them, in Lytro’s case), which lets it also record the direction each ray of light is traveling. Knowing light’s direction makes it possible to work out how far away the source of that light is. So where a traditional camera captures a 2D version of a scene, a light-field shot knows where everything in that scene actually is. A processor turns that data into a 3D model like any you’d see in a video game or special effect, and Lytro displays it as a photograph. It’s a little bit like the small bots in Prometheus, spatially mapping an entire room in order to display it back later. Or think of it as a rudimentary holodeck, projecting a simulated scene that changes as you move through and interact with it.
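For the technically curious, the core trick can be sketched in a few lines of code. What follows is a toy illustration of the "shift-and-add" refocusing idea from the light-field literature, not Lytro's actual pipeline; it assumes the light field has already been decoded into a grid of sub-aperture views, and the names (refocus, light_field, alpha) are mine:

```python
import numpy as np

def refocus(light_field, alpha):
    """Synthesize a photo focused at a new depth from a 4D light field.

    light_field: array of shape (U, V, H, W), a U x V grid of sub-aperture
                 views, each an H x W grayscale image. (A real Lytro file
                 also carries color, calibration, and metadata.)
    alpha:       relative position of the virtual focal plane; 1.0 keeps
                 the original focus, other values push it forward or back.
    """
    U, V, H, W = light_field.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            # Shift each view in proportion to its position in the aperture,
            # then average. Rays that came from the chosen depth line up and
            # turn sharp; everything else stays blurred.
            du = (u - (U - 1) / 2) * (1 - 1 / alpha)
            dv = (v - (V - 1) / 2) * (1 - 1 / alpha)
            out += np.roll(light_field[u, v], (round(du), round(dv)), axis=(0, 1))
    return out / (U * V)
```

In this toy model, the "click to refocus" interaction is just a sweep over alpha: nothing is recaptured, the same recorded rays are simply re-summed at a different virtual focal plane.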

Lytro didn’t invent the science; it just found a way to turn the required technology — which was once made up of 100 DSLRs in a rack at Stanford — into a product you can hold in your hands.



‘We’re not a camera company’

Lytro's ultimate, simplest goal is to turn the physical parts of the camera — the lens, the aperture, the shutter — into software. If it can do that, the camera hardware itself becomes secondary; it would take nothing more than a cheap lens and a sensor to build a light-field camera.

That's why the phrase "anything with a lens and a sensor" is a favorite among Lytro’s brass. Imagine a refocusable MRI, or a security camera that could render a person’s face in crystal clarity with one tap and their license plate with another. Suddenly "Enhance!" becomes a real thing, not just CSI fiction. These uses are what fascinate Ren Ng most, what got him into the field in the first place. "I’ve always been very interested in the question of how computation can fundamentally advance the things that we can see," he says. "This led me to have a fascination with medical imaging, especially things like MRI and scanning, and eventually computer graphics."

Suddenly "Enhance!" becomes a real thing, not just 'CSI' fiction

As we talk, Ng is just home from a trip to Switzerland, back into the pre-launch chaos at Lytro. He talks a mile a minute, particularly when he’s discussing his research — he speaks like a professor, excited but clear, full of historical references and obviously practiced from countless explanations of what Lytro is and does, and just what in the world a light field is, anyway.

CEO Jason Rosenthal shares the fervor, especially when he’s talking about how Lytro becomes more than just a camera company. He leans forward on a conference room table, speaking quickly and punctuating every other sentence with "that’s a big deal."

"If you look at a big-budget Hollywood production today, they’ll spend between 9 and 14 million dollars on just incremental hardware to shoot 3D, because you need multiple rigs. We can do all that in single-lens, single-sensor — that’s a big deal," Rosenthal says. "You look at the credits at the end of a movie and you see Camera Assistant 1, Camera Assistant 2, Camera Assistant 3… they’re doing focus pulls on set. If you can make that an after-the-fact decision, that’s a pretty big deal." Of course to achieve that in practice and not just theory, Lytro would need to make a camera that records video. But that’s on the roadmap: "That’s something that largely gets solved as computational power continues on its Moore’s law rate of increase." Processors double in speed every two years, Moore says; Lytro’s perfectly positioned to take advantage of every increase.

Clockwise from the top left: Jason Rosenthal, CEO; Stan Jirman, Head of Software Engineering; Kurt Akeley, CTO; Azmat Ali, Head of Marketing

Photography has changed, Lytro says — faster, better cameras, new processes and workflows, improved storage mechanisms — but we’ve never fundamentally rethought the actual picture. We don’t print photos anymore; we look at them on screens, tapping and swiping to get a better look. "Initially," Ng says, "the idea [of digital photography] was that you’d be able to delete your photos and only have developed the ones you want. But it turns out to enable visual communication, right?" What Lytro plans to do is not just to change how we take pictures, but to make the pictures themselves mirror the way we see the world in the first place. We interact with the world in three dimensions, touching and moving and constantly changing our perspectives, and light-field photos fit that perfectly.

The first photos I saw — taken on ugly, boxy prototypes lent to a few photographers — are remarkable. One in particular sticks out: a frozen-in-time action shot of a motorcycle coming around a corner. Initially, all that’s in focus is a clump of dirt kicked up by the bike. But a click later, the motorcycle comes into full focus, crisper and clearer than I expected. The Illum is still too slow to be relied on for fast-twitch action photography, but when the shot comes out right it’s breathtaking. Another click, and the background scenery snaps into sharp view. More than any sports photograph I’ve ever seen, it made me feel like I was sitting right next to the track, watching a bike come screaming around the corner, kicking dirt directly at my face.

The new photography class

The immediate response to the first Lytro was that it was fun but gimmicky. Light-field shooting is often thought of as a sort of crutch: you can shoot an imperfect photo and correct it later. Virtually every smartphone manufacturer has cribbed the idea, though most just shoot everything in focus and blur the rest after the fact as you tap around the frame. Lytro claims it’s really about a different kind of photograph, about telling a different, more immersive story. To prove it, the company has been working with a small group of what it calls "creative pioneers" to help them understand what it means to shoot light-field pictures, in hopes that they’ll then teach the world.

A sense of movement and place that’s far more powerful than the panning and zooming Ken Burns made so famous

One such photographer is 22-year-old Kyle Thompson. Thompson’s shots are often complicated, abstract self-portraits, and he says shooting with Lytro forced him to think differently about what he was making. "One thing you have to remember is that you don’t have to focus exactly on one certain point in the photo, but you also have to remember to try and keep all points of the photo interesting." He set up a scene with a gas can and a burning man — fire and abandoned houses are Thompson’s favorite props — but framed it differently with his big, boxy Illum prototype than he would with his 5D Mark II. It changed the way he thought about the image, and how he’ll later present it. He shot from a lower angle than he might otherwise, making sure the gas can, the burning man, and the sky were all in the camera’s refocusable range. "So you can have something that the viewer wouldn’t be able to see right away, and then you move towards it… you can focus on absolutely everything, except this way you can pick what’s in focus so your eyes are drawn toward one specific thing at a time."

Sample photo taken with the original Lytro camera

Thompson never quite says it, but he seems to feel a tension between his artistic vision and the inherent interactivity that comes with Lytro’s "living photos." Photographers freeze a moment, and do so with particular intentions; the focus, the depth of field, the composition all serve a careful purpose. Publishing a light-field shot can feel like putting the camera in a viewer’s hands and letting them choose the settings and shot they want. So for its most particular buyers, Lytro offers "animations," which let you program focus changes, pans, zooms, and perspective shifts, and publish a photograph that people watch rather than simply look at.

Animations lend a certain action to a still photo, a sense of movement and place that’s far more powerful than the panning and zooming Ken Burns made so famous. And these animations are only possible because Lytro makes not only the camera that shoots the photo but the software and services required to show it to others. That’s an opportunity — simple software updates can improve photos over time, or add new features to the viewer — but it’s also a huge burden. Lytro is responsible for every part of its ecosystem and workflow, and it’s had to invent every step of the way.

Lytro’s short-term success hinges on its ability to strike a balance between familiarity and innovation, to successfully meld the present and the future. That’s why the Illum looks like a DSLR, not a kaleidoscope. It’s why Lytro is talking to companies like Flickr and Facebook about how to integrate living pictures, even as it tries to overthrow them. It’s why the camera has two scroll wheels and a series of customizable buttons, so it’ll handle like the Canon and Nikon cameras pros are already comfortable with. And it’s why Lytro is going out of its way to make it easy to shoot, process, and share pictures the way you always have.

A new future

Azmat Ali identifies the same three things everyone at Lytro does, and confidently draws the roadmap for their convergence. "You’ve got photography, which is all digital now. And then you’ve got computer graphics, which brings incredible capability and gives people incredible power to create. And then virtual reality, this whole 3D space. Those three are going to collide, and where they collide is exactly where Lytro is sitting right now."

Predicting a technological and cultural shift is a dangerous game, as is betting a company on one. Lytro has the capital to wait a while — it’s raised more than $90 million in venture capital, and Rosenthal has said that sales of the first camera were higher than predicted (though he declined to offer hard numbers). But Lytro’s greatest danger may be being too far ahead of the curve. The first digital camera was invented at Kodak in 1975 — but by the time digital became dominant Kodak was all but forgotten. Ng hopes that Lytro’s startup status, without any of the traditional innovator’s dilemma inertia, will keep it from suffering a similar fate. "We need to convince photographers and consumers that light field matters," Rosenthal says. "We’re very up front about this: we’re not quite there with this product. This is still a specialized product for a certain target market. But from here that changes very, very quickly."

Why capture one photo, from one angle, with one perspective, when we could capture everything?

As I sit on a couch in the middle of Lytro’s office, alternately taking photos and seeing them displayed in 3D on a large TV, it becomes clear. This is the future. Not the Illum, necessarily, though it’s one of the more exciting cameras I’ve seen in a while. Maybe not even Lytro, though it’s built a huge lead in its nascent industry. But light-field photography — the notion that the future is about turning the complex physical parts of a camera into software and algorithms, that capturing beautiful photos is little more than a data-crunching problem — seems almost obvious. Why capture one photo, from one angle, with one perspective, when we could capture everything? When I can explore a photo, zooming and panning and focusing and shifting, why would I ever want to just look at it?

Lytro’s first camera was a toy, but it made us think differently about what a photograph might someday be. Now it’s making those ideas truly achievable with the Illum, a professional-grade tool. If it works, if Lytro can convince just a few people that this is the future, I can’t even imagine what might come next.
