Watch what Tesla Autopilot can see in a rainstorm, by Fred Lambert, Electrek – May 7th, 2019
One of the main concerns with self-driving vehicles is how they will react to different climates and weather conditions. Now we get an idea of the potential as we take a look at what Tesla Autopilot can see in a rainstorm at night.
Over the last year, we started getting a much better understanding of what Tesla’s Autopilot can see thanks to the work of Tesla hackers ‘verygreen’ and ‘DamianXVI’.
With access to Tesla’s Autopilot ECU, they have been creating rare looks at what Tesla Autopilot can see and interpret since last summer.
A few months later, they did it again with a new Tesla drive through Paris seen through the eyes of Autopilot, and again using Tesla’s latest version of Autopilot in the version 9 software.
In a new video, verygreen is now sharing footage showing what Tesla Autopilot is able to detect during heavy rain at night:
The overall performance appears to be quite impressive, though there are still some issues to iron out: for example, visual detection of the truck was sometimes lacking.
He also made a second video:
Tesla recently held its ‘Autonomy Day’, during which the automaker went into detail about its plan to develop full self-driving capability using the vision-based system currently used in Autopilot.
Those are not perfect representations of what Tesla’s system is able to detect through its sensor suite, but it’s the best we have and it’s coming from a third-party. That’s more than we can say about any company developing self-driving capability other than maybe comma.ai and its openpilot.
There’s no other major company developing autonomous driving systems that are already being reviewed by customers to this level.
It’s one of the reasons why I like Tesla’s approach to autonomy. I’d like Tesla (directly) and other companies to release videos like these to show just how good their sensor suites can be.
In many ways, they are better than the current sensors used to drive cars: human eyes. Things like that are going to build confidence in self-driving technology.
But in the meantime, please always remember that Autopilot is meant to be a driver assist system and you should always pay attention and be ready to take control.
In addition to two new active safety features powered by Autopilot, ‘Lane Departure Avoidance’ and ‘Emergency Lane Departure Avoidance’, Tesla’s new 2019.16 software update also includes a bunch of updates to existing features and a few new ones. The automaker started releasing the update to its expanding early access fleet. In the release notes, Tesla says that it updated the live environment rendering on screen:
The driving visualization has been adjusted to automatically zoom in and out to better utilize screen space and inform you when a vehicle is detected in your blind spot. The visualization remains zoomed out when driving on highways.
Tesla is also pushing some new changes to its Sentry Mode feature, which is already proving to be quite useful.
Now owners can enable the security feature automatically based on the location:
Sentry Mode Improvements
It’s now easier to enable and disable Sentry Mode by tapping the Sentry Mode icon at the top of your touchscreen when your car is in Park.
Your car can also default Sentry Mode to always be enabled when your vehicle is parked by going to Controls > Safety & Security > Sentry Mode > ON. If ON is selected, you can exclude Home, Work, and/or Favorite places by selecting the checkboxes. If a location is selected, Sentry Mode will be disabled when your car is parked near those locations.
More control over when the feature activates will be welcomed, since some owners have been complaining about the high rate of events that Sentry Mode records in some situations. The footage takes up a lot of space on the USB drives they are using, which is why a drive with a lot of storage capacity is now recommended.
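The location-based rule in the release note above amounts to a simple geofence check: the feature defaults to on, but is suppressed when the parked car is near an excluded place. A minimal sketch of that logic (the function names, the 100 m radius, and the coordinates are illustrative assumptions, not Tesla’s actual implementation):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def sentry_enabled(car_pos, excluded_places, radius_m=100.0):
    """Sentry defaults to ON; return False when parked within radius_m
    of any excluded place (Home, Work, Favorites). radius_m is a guess."""
    lat, lon = car_pos
    return all(haversine_m(lat, lon, plat, plon) > radius_m
               for plat, plon in excluded_places)
```

Parked in the driveway of an excluded Home location, the check returns False and Sentry stays off; parked anywhere else, it returns True.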
Tesla is also now enabling owners to have their car check for new software updates as soon as they become available:
Software Update Preference
You now have the option to receive new software updates as soon as they are available for your car configuration and region. Tap Controls > Software > Software Update preference > ADVANCED.
Finally, Tesla is also adding the capability to detect and display conditional speed limits:
Conditional Speed Limits
Your vehicle will now display conditional speed limits, such as speed limits based on time of day, weather conditions, etc. If there is a conditional speed limit for your current road, it will be displayed in grey below the regular speed limit sign.
Tesla started pushing the new update, but it can take some time before it reaches the entire fleet based on your configuration and region, as highlighted by the new Software Update Preference feature.
Tesla presented its latest and greatest vehicle autonomy features at its recent Autonomy Investor Day. The brain of the system is the Tesla Full Self-Driving (FSD) computer, which is now included in all Teslas being produced. The Autopilot hardware suite includes 8 cameras, 12 ultrasonic sensors, radar, GPS, an inertial measurement unit, and sensors that measure the angle of the steering wheel and accelerator pedal.
Tesla’s camera and Waymo’s lidar (Images via Tesla / Waymo)
One feature it does not have is lidar, a technology that uses pulsed laser light to measure distance to a target, and that is favored by most other companies working on autonomous driving (Uber, Waymo, Cruise). In his Autonomy Investor Day presentation, the ever-iconoclastic Musk dismissed lidar in terms reminiscent of those he has used to describe hydrogen fuel cells. Lidar is “a fool’s errand,” he said. It’s “expensive” and “unnecessary,” and “anyone relying on lidar is doomed. It’s like having a whole bunch of expensive appendices. Like, one appendix is bad, well now you have a whole bunch of them. It’s ridiculous, you’ll see.”
As TechCrunch reports, proponents of lidar cite its ability to see through adverse conditions such as rain, fog, or dust better than cameras. The downsides are that it’s expensive and uses a lot of power, which is why Tesla’s system depends on cameras instead. Andrej Karpathy, Senior Director of AI at Tesla, explained that visual recognition gives a better picture of the real world — he said lidar systems have trouble distinguishing things like the difference between a plastic bag and a rubber tire. “In that sense, lidar is really a shortcut,” Karpathy said. “It sidesteps the fundamental problem, the important problem of visual recognition, that is necessary for autonomy. It gives a false sense of progress, and is ultimately a crutch.”
Together with cameras, Tesla is relying on the vast neural network of real-world driving information recorded by the thousands of Autopilot-equipped Tesla vehicles on the road. Using various AI techniques, Tesla is teaching its system to recognize and react to the vast variety of situations that might be encountered in the wild.
“Everyone’s training the network all the time,” Musk said. “Whether Autopilot is on or off, the network is being trained. Every mile that’s driven for the car that’s [equipped with Autopilot Hardware version 2 or above] is training the network.”
Anthony Levandowski, former Google/Waymo engineer, now backtracks on lidar and says, “Elon is right.” (YouTube: TechCrunch)
Some autonomy experts agree with Musk. As Gizmodo reports, Cornell researchers argue that cameras can rival lidar if they’re mounted properly. Lidar systems are designed to provide a 3D picture of a vehicle’s surroundings and the road ahead, which is why they’re often mounted atop a vehicle for the best vantage point. However, in a paper that will be presented at the upcoming Conference on Computer Vision and Pattern Recognition, the Cornell team explains that a pair of inexpensive cameras mounted behind a vehicle’s windshield can produce stereoscopic images that can be converted to 3D data almost as precise as that generated by lidar, at a fraction of the cost.
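The Cornell approach rests on standard stereo triangulation: with a calibrated camera pair, depth is recovered from the disparity, the horizontal pixel shift of a feature between the left and right images. A minimal sketch of that relationship (the focal length, baseline, and disparity values below are illustrative assumptions, not numbers from the paper):

```python
# Depth from stereo disparity: z = f * B / d, where f is the focal length
# in pixels, B is the baseline (distance between the two cameras) in meters,
# and d is the disparity in pixels. All numbers here are illustrative.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth (meters) of a point observed with the given pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A feature shifted 25 px between cameras 30 cm apart, with a 1000 px
# focal length, lies 12 m ahead; halving the disparity doubles the depth.
print(depth_from_disparity(1000.0, 0.3, 25.0))  # 12.0
```

The formula also shows why camera depth precision degrades with range: distant objects produce disparities of a pixel or less, so small matching errors become large depth errors, which is the traditional argument for lidar at long range.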
Mashable reports that British startup Wayve is a member of the no-lidar-needed camp. The company says it doesn’t need multiple data sources — just a GPS system, cameras, and a powerful computer are all it requires to teach cars to drive as well as humans.
Editor’s note: ARK Invest’s autonomous driving expert also told CleanTechnica on a recent podcast that they have been coming around to Elon Musk’s point of view as well. Listen here:
Other players do not agree. A Chinese autonomous driving firm called AutoX deployed a camera-based self-driving system in 2017, but has now added lidar sensors to its vehicles for redundancy and extra input.
Musk isn’t totally opposed to lidar — SpaceX uses it in some applications*. For vehicle autonomy, however, Musk believes Tesla’s sensor suite and trove of real-world driving data will be quite sufficient.
*Editor’s note: Musk noted during Autonomy Day that he led development of a lidar system for SpaceX for docking with the International Space Station.