
Confirming earlier statements by Elon Musk, Tesla Motors announced today that it will move towards using radar as the primary sensor for its Autopilot systems, using advanced signal processing to build a robust computer model of cars’ surroundings. Up to this point, radar only supplemented Autopilot’s camera-based detection system.
The changes will begin with Version 8 of Tesla’s vehicle software, which will include more robust radar modeling and begin gathering data to improve radar performance—but will not immediately change how Autopilot sees and makes decisions.
The shift to radar comes in the wake of the first-ever fatality of an operator using the Autopilot system. The circumstances of the May crash, as well as some similar but nonfatal incidents, suggested that the camera-based Autopilot system may have problems detecting overhanging objects. Final findings of an NTSB investigation of that crash have not been released.
While making radar the primary sensor for Autopilot decision-making could help prevent similar incidents, the company says the approach presents challenges of its own. Radar sees through environmental obstructions like smoke and rain, but it is not as good as a camera at detecting people, and it has serious problems seeing wood or plastic objects.
Most challenging of all, metallic objects appear highly reflective to a radar system. This, the company says, is a particular problem when it comes to concave surfaces. The bottom of a soda can on the road, for example, “can appear to be a large and dangerous object” in radar. To avoid those kinds of false positives, the updated Tesla system will assemble much more information about objects than before, and construct a moving 3D “point cloud” that evaluates objects over time, not just as a single snapshot.
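The point-cloud idea described above can be illustrated with a minimal sketch: rather than trusting any single radar snapshot, detections are accumulated across frames, and only objects that persist over time are treated as real. This is a hypothetical illustration, not Tesla’s actual algorithm; the function name, parameters, and thresholds are all invented for the example.

```python
from collections import defaultdict

def confirm_persistent_objects(frames, min_hits=3, cell_size=0.5):
    """Suppress single-snapshot radar false positives by persistence.

    frames: a list of radar frames; each frame is a list of (x, y, z)
    detections in metres. A detection is confirmed only if points near
    the same location (within one `cell_size` grid cell) appear in at
    least `min_hits` frames -- a one-frame glint, such as a reflective
    soda can, never accumulates enough hits and is filtered out.
    """
    hits = defaultdict(int)
    for frame in frames:
        seen = set()
        for (x, y, z) in frame:
            # Quantize each detection into a coarse grid cell.
            cell = (round(x / cell_size),
                    round(y / cell_size),
                    round(z / cell_size))
            if cell not in seen:  # count each cell once per frame
                hits[cell] += 1
                seen.add(cell)
    # Return the centre of every cell that persisted across frames.
    return [tuple(c * cell_size for c in cell)
            for cell, n in hits.items() if n >= min_hits]

# A car ahead appears in all three frames; a one-off glint does not.
frames = [
    [(10.0, 0.0, 0.0)],
    [(10.1, 0.0, 0.0)],
    [(9.9, 0.0, 0.0), (3.0, 2.0, 0.0)],  # second point: one-frame glint
]
confirmed = confirm_persistent_objects(frames, min_hits=3)
```

In this sketch the detections near (10, 0, 0) survive because they recur in every frame, while the single glint at (3, 2, 0) is discarded, mirroring the article’s point that evaluating objects over time, rather than as a single snapshot, avoids braking for harmless debris.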
SOURCE: Fortune, David Z. Morris