LG’s new premier smartphone, the G5, has a second rear camera for wide-angle shots. But Corephotonics’ new smartphone module shoots with both of its cameras at once, then uses the company’s proprietary software to merge the two pictures into a single, significantly improved image.
Corephotonics’ technology not only produces sharper images; it also speeds up autofocus, reduces motion blur and counteracts the Waterloo of smartphone cameras: awful low-light shots. CNET checked out one of Corephotonics’ prototypes at this year’s Mobile World Congress:
Corephotonics’ method sounds similar to HDR imaging, where a single camera takes multiple low or standard dynamic range photos and merges them to make a photo with, well, a high dynamic range. But as CNET mentions in its video, Corephotonics’ setup is more flexible and powerful because it can pair different cameras to suit a device’s target price or purpose.
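Corephotonics hasn’t published its fusion algorithm, but the basic payoff of merging simultaneous captures is easy to illustrate. Here’s a minimal sketch assuming the simplest possible approach: averaging two pre-aligned frames, which cuts random sensor noise by roughly the square root of the frame count. The `fuse_frames` helper is hypothetical and stands in for far more sophisticated proprietary processing.

```python
import numpy as np

def fuse_frames(frames):
    """Average a stack of aligned frames to reduce random noise.

    Averaging N frames lowers additive noise by roughly sqrt(N) --
    the simplest form of multi-frame (or multi-camera) fusion.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Simulate two noisy captures of the same flat gray scene.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 100.0)          # ideal pixel values
frames = [scene + rng.normal(0, 10, scene.shape) for _ in range(2)]

fused = fuse_frames(frames)
single_err = np.abs(frames[0] - scene).mean()
fused_err = np.abs(fused - scene).mean()
```

In practice the two images come from lenses a few millimeters apart, so real fusion must first register the frames and handle parallax; this sketch skips all of that and only shows why combining captures helps in low light.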
The company currently has three modules: one achieves up to 3x optical zoom using a 13MP and an 8MP camera, one gets great low-light shots by pairing a black-and-white 13MP camera with a color 13MP camera, and the Hawkeye prototype, the one featured in the video, also uses two 13MP cameras but is capable of up to 5x optical zoom. Corephotonics also says on its website that it can pair a standard camera with a depth sensor, similar to the HTC One M8, or with motion sensors like the Kinect. Even cameras are going multi-core these days. If only we could do the same with batteries.