Stanford electronics researchers are developing a camera built around what they call a "multi-aperture image sensor." Individual pixels are shrunk and grouped into arrays of 256 pixels. The fascinating part is that depth metadata can be stored with the image, letting photo editors select objects by their distance from the camera. Apparently this opens up plenty of other possibilities as well.
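To get a feel for what depth-based selection could look like, here is a minimal sketch. It assumes a per-pixel depth map stored alongside the image; the actual format of Stanford's depth metadata is not described in the article, so the arrays and the `select_by_distance` helper here are purely illustrative.

```python
import numpy as np

# Hypothetical setup: an RGB image plus a per-pixel depth map in meters.
# (The real sensor's metadata format is an assumption here.)
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(4, 4, 3), dtype=np.uint8)
depth_m = rng.uniform(0.5, 10.0, size=(4, 4))

def select_by_distance(depth, near, far):
    """Boolean mask of pixels whose depth lies in [near, far) meters."""
    return (depth >= near) & (depth < far)

# An editor could "select" everything between 1 m and 3 m from the camera:
mask = select_by_distance(depth_m, 1.0, 3.0)
foreground = image.copy()
foreground[~mask] = 0  # black out pixels outside the chosen depth band
```

With a real depth map, the same masking trick would let an editing tool isolate a subject from its background in one step, no manual outlining required.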
Full article at news-service.stanford.edu.