Scientists at Duke University have just taken the wraps off AWARE2, a camera that can take pictures at 50-gigapixel resolution. To put that into perspective, when the newest iPhone came out, people were gushing about the quality of its camera. It has an 8 megapixel camera, which is about 49,992 megapixels worse than the AWARE2. “Sure,” you say, “but does this new camera come with Siri?” Well, no, but it’s a 50 GIGAPIXEL CAMERA.
The camera is actually made up of 98 mini-cameras. When the lens is aimed and ready to shoot a picture, each of the individual cameras focuses on a separate piece of the scene. A computer then stitches the pictures together into one ultra-mega picture.
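The real AWARE2 pipeline is far more involved (the 98 micro-cameras have overlapping fields of view that must be aligned and blended), but the basic idea of assembling sub-images into one huge frame can be sketched with a toy example. This is a minimal sketch assuming non-overlapping, equally sized grayscale tiles, not a description of Duke's actual software:

```python
import numpy as np

def stitch_tiles(tiles, grid_shape):
    """Assemble a row-major list of equally sized 2-D tiles
    into one large mosaic image (no overlap, no blending)."""
    rows, cols = grid_shape
    tile_h, tile_w = tiles[0].shape
    mosaic = np.zeros((rows * tile_h, cols * tile_w), dtype=tiles[0].dtype)
    for idx, tile in enumerate(tiles):
        r, c = divmod(idx, cols)  # tile's position in the grid
        mosaic[r * tile_h:(r + 1) * tile_h,
               c * tile_w:(c + 1) * tile_w] = tile
    return mosaic

# Example: a 2x3 grid of 4x4 tiles becomes one 8x12 image
tiles = [np.full((4, 4), i) for i in range(6)]
big = stitch_tiles(tiles, (2, 3))
print(big.shape)  # (8, 12)
```

In the real device the hard part is everything this sketch skips: correcting each micro-camera's distortion and blending the seams where neighboring views overlap.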
The technology was developed, in part, with funding from DARPA, so there’s a good bet this thing is going to be used on some serious military spying missions. The Wall Street Journal reports the “device produces a still or video image with a billion pixels—five times as much detail as can be seen by a person with 20/20 vision.” If we can count the grains of sand on the hill that the enemy is standing on, we win, right?
The researchers believe that within five years, as the electronic components of the cameras become miniaturised and more efficient, the next generation of gigapixel cameras should be available to the general public.
Again from the Journal:
If the Duke device can be shrunk to hand-held size, it could spark an alternative approach to photography. Instead of deciding where to focus a camera, a user would simply shoot a scene, then later zoom in on any part of the picture and view it in extreme detail. That means desirable or useful portions of a photo could be identified after the image was captured.
By that time, every screen we’re looking at should have something like Apple’s Retina display. I feel sorry for anybody who actually has to get their picture taken with this thing.