There’s been a lot of buzz in the media this past week about the first-ever pedestrian fatality involving an autonomous vehicle. While both the system and the safety driver clearly failed to see the pedestrian, self-driving car systems are generally quite good at detecting pedestrians and other objects in clear weather. One thing that today’s autonomy systems (and humans) can’t do well is see through fog. But that may soon change, thanks to a new technology being developed by MIT engineers.
Researchers out of the Camera Culture Group at the MIT Media Lab developed a new imaging method that uses short laser bursts to detect the distance and shape of objects even when they’re completely obscured by fog. A camera counts the photons that reach it at regular time intervals, which gives the system enough information to compute the shape and distance of objects.
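The core idea can be sketched in a few lines. The example below is a simplified, hypothetical illustration (not the MIT team’s actual algorithm, which statistically models the fog’s scattered photons): it histograms photon arrival times, takes the strongest peak as the object’s round-trip time, and converts that to distance via d = c·t/2. All names and parameters are invented for illustration.

```python
import numpy as np

C = 3.0e8  # speed of light, m/s


def estimate_depth(arrival_times_ns, bin_width_ns=0.1):
    """Estimate object distance from photon arrival times (in nanoseconds).

    Simplified sketch: histogram the arrival times, treat the strongest
    bin as the object's round-trip time, and convert to one-way distance.
    (The real system additionally separates fog scatter from the object's
    reflection using the statistics of the arrival times.)
    """
    times = np.asarray(arrival_times_ns)
    bins = np.arange(times.min(), times.max() + bin_width_ns, bin_width_ns)
    counts, edges = np.histogram(times, bins=bins)
    peak_t_ns = edges[np.argmax(counts)]  # left edge of the strongest bin
    return C * (peak_t_ns * 1e-9) / 2.0  # one-way distance: d = c * t / 2


# Toy data: photons reflected from an object ~0.5 m away arrive ~3.34 ns
# after the pulse; fog scatter adds earlier, diffusely spread arrivals.
rng = np.random.default_rng(0)
object_return = rng.normal(3.34, 0.05, 500)   # tight peak from the object
fog_scatter = rng.gamma(2.0, 0.5, 300)        # diffuse fog background
d = estimate_depth(np.concatenate([object_return, fog_scatter]))
print(f"estimated distance: {d:.2f} m")
```

Because the object’s reflection concentrates many photons into a narrow time window while fog scatter spreads out in time, the histogram peak survives even with substantial background, which is the intuition behind seeing “through” fog.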
The current version of the technology can only penetrate about 22 inches of fog, but the fog used in the test was far denser than what is typically encountered on the road. With some enhancements, it’s possible that the system could see far enough ahead on a foggy road to make a difference in vehicular safety. And this kind of technology might not only make self-driving cars safer; it could also make driving in fog safer for human drivers by alerting them to obstacles they can’t see with the naked eye.