The AWARE-2 gigapixel camera can capture real-time, billion-pixel images of dynamic events at astonishing resolution. Image: Duke University
In the 1966 cult movie Blow-Up, a hip photographer snaps some random shots of two lovers on a sunny, swinging London day in the park. Back at the studio, he develops the film, makes some enlargements and—voila—a hidden crime scene comes into view. It's a compelling, Oscar-nominated depiction of the camera's power to see things we can't, even though they're happening right under our noses.
A game-changing gigapixel camera developed by David Brady, Ph.D., and his team at the Duke Imaging and Spectroscopy Program at Duke University's Pratt School of Engineering does the same thing, only better—a lot better.
Brady's AWARE-2 camera can record 1-billion-pixel images in one shot over a wide range of spatial scales. If you're into glamor photography with lots of artsy special effects, this isn't the camera for you. But if you're trying to capture as much crisp, clear reality as possible across a wide field of view, the AWARE technology holds promise in applications ranging from defense to drug development.
Why It's Different
The AWARE-2 camera integrates an array of 98 independently focused and exposed 14-megapixel microcameras along the focal surface of a monocentric objective lens. This design achieves near-diffraction-limited performance across a 120 x 42-degree field of view, revealing details and capturing dynamic events with uncanny resolution.
The AWARE-2 as it looks when assembled. The lens peeks out through the hole in the front panel. Image: Duke University
Each microcamera relays a portion of the image formed by the objective lens onto its own sensor, through optics that correct for the lens's aberrations. "The resulting partial images can then be stitched during post-processing to create a single image of a large field," says Brady.
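The stitching step can be sketched in a few lines. This is a minimal illustration, not the team's pipeline: it assumes the tile offsets are already known, whereas real stitching also registers the tiles and blends their overlap regions.

```python
import numpy as np

def stitch(tiles, offsets, canvas_shape):
    """Paste partial images onto one canvas at known (row, col) offsets.

    Minimal sketch: later tiles simply overwrite earlier ones in the
    overlap; a production stitcher would register and blend instead.
    """
    canvas = np.zeros(canvas_shape, dtype=tiles[0].dtype)
    for tile, (r, c) in zip(tiles, offsets):
        h, w = tile.shape[:2]
        canvas[r:r + h, c:c + w] = tile
    return canvas

# Two 4x4 tiles overlapping by one column form a 4x7 mosaic.
a = np.ones((4, 4), dtype=np.uint8)
b = np.full((4, 4), 2, dtype=np.uint8)
mosaic = stitch([a, b], [(0, 0), (0, 3)], (4, 7))
print(mosaic.shape)  # prints "(4, 7)"
```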
Other cameras designed for gigascale imaging can only approximate reality. They are generally computer-enhanced composites of multiple pictures taken sequentially over time. They may let you create a beautiful high-resolution image of a wildlife habitat, for example, but they can't let you study the exact behavior and interactions of the animals and plants at one crucial moment.
The gigaphotographer's job is to get as much image data as possible so that the end user can mine it to his or her specifications. For example, AWARE's remarkable resolution over a wide field of view will one day enable video streamers and broadcast viewers to pan and zoom in and around large complex images to focus on areas of interest to them. If the camera captures a scene from a baseball game played in a crowded stadium, viewers could choose what to zoom in on: the pitcher's mound, the press box, the popcorn vendor or the people in the cheap seats. All could be examined in stunning resolution regardless of their position in physical space relative to the camera.
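The pan-and-zoom interaction described above amounts to cropping a window out of one enormous frame. A hypothetical viewer-side helper (the function name and parameters are ours, for illustration) might look like this:

```python
import numpy as np

def zoom_region(image, center, size):
    """Return a size x size crop around center (row, col), clamped to bounds.

    Hypothetical sketch: a viewer panning a gigapixel frame only ever
    needs the cropped window, never the whole billion-pixel image.
    """
    rows, cols = image.shape[:2]
    r0 = min(max(center[0] - size // 2, 0), rows - size)
    c0 = min(max(center[1] - size // 2, 0), cols - size)
    return image[r0:r0 + size, c0:c0 + size]

frame = np.arange(100).reshape(10, 10)  # tiny stand-in for a huge frame
window = zoom_region(frame, (5, 5), 4)  # "zoom" on the pitcher's mound
print(window.shape)  # prints "(4, 4)"
```

Every viewer can request a different window from the same captured frame, which is what lets each of them choose the pitcher's mound, the press box, or the cheap seats.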
The AWARE-2 has 98 independently focused and exposed 14-megapixel microcameras along the focal surface of a monocentric objective lens. Image: Duke University
"There are various examples of film-based gigapixel-class imagers, although to our knowledge none of them approach the full resolution and field of view of our system," says Brady. "There are also numerous examples of the use of scanning, pan-tilt systems to capture high-resolution wide-field images, but these systems are incapable of instantaneous capture of a wide field."
Brady says he knows of only two other true gigapixel-class digital imagers: BAE Systems' ARGUS-IS surveillance system and the Pan-STARRS telescope in Hawaii. Both cover smaller fields of view and are designed for imaging faraway objects such as stars or military targets, he adds. "Our approach demonstrates gigapixel imaging in a very small optical volume. We also demonstrate the unique capacity to independently control focus and exposure across the field, which allows us to capture events at finite range. We have also shown that our approach scales to unprecedented pixel counts, up to 50 gigapixels."
Funded primarily by the Department of Defense's Defense Advanced Research Projects Agency (DARPA), the device is a natural tool for security and monitoring applications. But that's only a taste of the new worlds yet to be discovered and imaged at unprecedented resolution, Brady says.
A viewer could zoom in on a scene at different distances and see the images in detail. Image: Duke University
"In the near term, our technology will be used for event capture: weddings, sports, rocket launches, tourist sites–and online media," he says. "We expect to demonstrate streaming digital zoom media in concert with conventional broadcast in 2013."
A natural next step for gigapixel photography is in the medical domain, where image quality can be a matter of life and death. Brady sees a role for his camera alongside other medical visualization techniques based on optical, electron, x-ray, and magnetic resonance imaging. "We are working on extremely wide field microscopy based on microcamera arrays, which will enable novel life sciences research and eventually novel diagnostics."
Brady says his approach is inspired by parallel computing. "Computational systems are constructed from common processor platforms to address products ranging from PCs, tablets, and phones to massive supercomputers and servers. Similarly, we expect common microcamera platforms to span consumer, scientific, medical, and defense applications."
Brady's team is constructing next-generation, 5-gigapixel color cameras for completion in late 2012. "Our goal in 2013 is to establish a supply chain capable of manufacturing 100 to 1,000 microcameras per month, which will be used to build cameras for media and event capture."
The entire prototype fits snugly in a case measuring only 0.75 x 0.75 x 0.5 m. A key design goal is to achieve a Moore's Law-style pattern of predictable increases in performance and decreases in size and cost. "Electronics volume has been reduced by 4x in the next-generation design, and we hope to establish a design cycle that will reduce system volume, power, and cost by 2x to 4x in each two-year cycle," says Brady.
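Taken at face value, a 2x-4x reduction per two-year cycle compounds quickly. A quick sketch of the projected case volume, starting from the prototype's stated dimensions (the projection is simple arithmetic on the article's figures, not a roadmap):

```python
# Compound the projected 2x-4x volume reduction per two-year
# design cycle, starting from the AWARE-2 prototype case.
volume = 0.75 * 0.75 * 0.5  # prototype case in cubic meters (~0.281)

for cycle in range(1, 4):
    best = volume / 4 ** cycle   # aggressive 4x-per-cycle shrink
    worst = volume / 2 ** cycle  # conservative 2x-per-cycle shrink
    print(f"after {2 * cycle} years: {best:.4f} to {worst:.4f} m^3")
```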
As the technology gets smaller and cheaper, it's only a matter of time before tomorrow's hipster photogs are snapping gigapixel glamor shots and, in the process, chronicling the hidden mysteries of their world.
Michael MacRae is an independent writer.