
Is Autopilot that much better than Competitors?

Tethering to the car is kind of a gimmick to me, really, unless it's free. Rarely will anyone *need* tethering in their car, even on road trips. People riding along will likely have their own mobile connected devices, and for the times when tethering is needed, any one of those devices can generally act as a hotspot, or a MiFi-type device can be used. When I want tethering it generally isn't near my car, so I would never pay for such a feature. If it were free, I'd probably use it occasionally just because it was there. But I wouldn't pay for it. (Seems OT for this thread....)
 
Machine vision can also detect humans -- take a look at some of the Nvidia videos. Google has chosen to go a certain direction, using their mapping and lidar, and most of the automotive OEMs have gone a different direction, using machine vision. We'll see which is best, but I'd put my money on the OEMs.

Lidar, as Google has on their experimental cars, is too expensive for commercial use, but it is necessary right now for their experiments because commercial automotive radar, even at 77 GHz, cannot distinguish humans from cars on streets. Unlike lidar, the cost of radar is coming down fast. Current automotive radars are usually frequency-modulated continuous-wave (FMCW), which is easy to process for point targets but does not give an image. However, greatly improved phased-array radars are coming soon, as this report from UCSD indicates:

http://www.jacobsschool.ucsd.edu/news/news_releases/release.sfe?id=1588
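
To make the FMCW point concrete: a point target shows up in the receiver as a single beat tone whose frequency is proportional to range, which is why point targets are easy to process, while an actual image requires angular resolution from an antenna array (which is where phased arrays come in). A minimal sketch of the range math, using illustrative rather than vendor-specific chirp parameters:

```python
C = 3.0e8  # speed of light, m/s

def fmcw_range(beat_hz, bandwidth_hz, chirp_s):
    """Range of a point target from its beat frequency.

    A target at range R delays the echo by 2R/c; against a linear chirp
    of slope S = bandwidth/chirp_time, that delay appears as a beat tone
    f_b = 2*R*S/c, so R = c*f_b / (2*S).
    """
    slope = bandwidth_hz / chirp_s
    return C * beat_hz / (2.0 * slope)

def range_resolution(bandwidth_hz):
    """Targets closer together than c/(2B) merge into one beat tone."""
    return C / (2.0 * bandwidth_hz)

if __name__ == "__main__":
    B, Tc = 300e6, 40e-6             # 300 MHz sweep in 40 us (illustrative)
    print(fmcw_range(2.5e6, B, Tc))  # 2.5 MHz beat -> 50.0 m
    print(range_resolution(B))       # -> 0.5 m
```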
 

I wouldn't go so far as to say the OEMs have the best setup by avoiding lidar, but they have the best value. I'm just comparing lidar vs. machine vision to Betamax vs. VHS, respectively.
 
Fair point. The Google system is probably "better," but I think the OEM systems are probably "good enough," especially when taking cost into consideration. Also, "better" or "best" can have different meanings. From what I can see, the machine vision systems are far more flexible than what Google is doing.

 
My guess is that Tesla is using the data collected from thousands of Autopilot-equipped Model Ss to construct virtual 3D maps of tens of thousands of driving situations, and will be able to design response protocols for collision avoidance, and optimizations of common driving situations, more quickly. Sure, auto-park and lane changing are part of the scheme, but the collection of user data will make it work better and better over time. Since the vast majority of car accidents are caused by driver error, I'd rather trust the car.

I agree with this, based on several of Musk's comments. But I don't know what they are collecting.

- - - Updated - - -


Google cars now detect pedestrian gestures, according to people who have had a demo.
 
Slept on the autonomous car issue. Here is what I think Tesla is doing:

In the Google car and other autonomous cars, the human is the proctor and the software is the driver. The human reports exceptions to what he considers normal driving behavior.

I think Tesla has flipped this relationship. In the newer sensor-equipped cars, the software is proctoring the human driver and reporting exceptions. To do this proctoring, the software has to know where the human is going in the car, which is done by tagging repetitive trips (to and from work). The software "drives along" without actually controlling the car.

The second type of exception the software could report is an abrupt maneuver on which the software and driver disagree: 1) the software calling for an abrupt maneuver, to avoid a hazard, that the human driver doesn't take; 2) the driver making what seems to be an emergency action, e.g. slamming on the brakes, that was not flagged as such by the software.

With this approach, Tesla would be adding the equivalent of a hundred or more autonomous test cars per week. This has major advantages as the software gets good: the poor human in the Google car sits for hours before anything interesting happens that is worth reporting to the development team.

Musk seems to suggest that Tesla has an advantage in autonomous cars due to the very high miles driven. Where are these test cars, if he is not referring to regular production vehicles?
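
If that is roughly what Tesla is doing, the core "exception" test could be as simple as thresholding the gap between what the shadow software would have commanded and what the driver actually did. Here is a toy sketch in Python; every name and threshold is a made-up illustration of the idea above, not anything Tesla has disclosed:

```python
from dataclasses import dataclass

@dataclass
class ControlFrame:
    t: float            # timestamp, s
    human_brake: float  # driver's pedal position, 0..1
    human_steer: float  # driver's steering angle, rad
    ap_brake: float     # what the passive software would have commanded
    ap_steer: float

# Hypothetical thresholds -- tuning these is the actual hard part.
BRAKE_DISAGREE = 0.4    # pedal fraction
STEER_DISAGREE = 0.15   # rad

def exceptions(frames):
    """Yield frames where shadow software and driver disagree sharply:
    the software wanted an abrupt maneuver the driver didn't make, or
    the driver made one the software never flagged."""
    for f in frames:
        if (abs(f.ap_brake - f.human_brake) > BRAKE_DISAGREE
                or abs(f.ap_steer - f.human_steer) > STEER_DISAGREE):
            yield f

# Usage: list(exceptions(logged_frames)) -> the moments worth uploading.
```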
 
The new 7 Series can also be commanded to drive into or out of a garage space with the key fob. It handles steering, acceleration, and braking based on sensor readings.
How to self-park a BMW 7 Series - YouTube

Apparently they didn't want to chance damaging a real garage, so they built a strange white cube. I love the lack of confidence in the body language of the engineer with the key fob. I think he probably high-fived the cameraman after the cut.
 
I think you are correct.

I'm still surprised they haven't released the steering yet. After living with TACC for a few days (now I'm back in my regular pre-Autopilot car), that seems much harder than the steering. All of the issues people raise about why it's so hard to drive autonomously would seem to apply more to throttle and brakes than to steering -- steering just has to stay between the lines.
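
In its simplest form, "stay between the lines" is a feedback loop on the camera's estimate of lateral offset and heading error relative to the lane. A toy proportional law (the gains and limits are invented for illustration; production lane keeping adds curvature feedforward, lookahead, and driver-override logic):

```python
MAX_STEER = 0.3  # rad, actuator limit (illustrative)

def lane_keep_steer(lateral_offset_m, heading_error_rad,
                    k_offset=0.08, k_heading=0.6):
    """Toy lane-centering law: steer back toward the lane center.

    lateral_offset_m  -- distance left (+) / right (-) of lane center,
                         from the camera's lane-line fit
    heading_error_rad -- angle between car heading and lane direction
    """
    cmd = -(k_offset * lateral_offset_m + k_heading * heading_error_rad)
    return max(-MAX_STEER, min(MAX_STEER, cmd))  # clamp to actuator limit
```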

 
Yeah, I laughed at that white-cube video. There's a newer video, a sales promotion, that shows it coming out of a beautiful home, but I can't find it.

Wait, here's the article
BMW Demonstrates 2016 7-Series Remote Control Parking [w/Video]
 
I have just become aware of a great TED talk from several months ago demonstrating substantial progress by Google in navigating the complexities of street driving with pedestrians, bicyclists, traffic intersections, construction cones, hand gestures, etc. They make a clear distinction between what they are doing to achieve fully autonomous driving and what Tesla is doing with driver assistance. However, Musk has said that while street driving is tough, autonomous highway driving is a "solved problem," so I believe that with additional radar and video sensors, Tesla will be able to demonstrate autonomous highway driving soon. In the video, the image processing, sensor fusion, object recognition, and object tracking all look impressive and computationally expensive. I believe Tesla and Delphi have the Nvidia processing power to do all this, but Google clearly has the lead in algorithm development and testing.

https://www.ted.com/talks/chris_urmson_how_a_driverless_car_sees_the_road?language=en
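
For a sense of what the object-tracking stage of such a pipeline involves, below is a minimal constant-velocity Kalman filter for one object in one dimension. Real systems run many such filters (or far more sophisticated ones) per object and per sensor and then fuse the results, which is where the computational expense comes from; all the noise values here are illustrative:

```python
import numpy as np

dt = 0.1                               # 10 Hz sensor frame rate
F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity motion model
H = np.array([[1.0, 0.0]])             # we only measure position
Q = np.diag([0.01, 0.1])               # process noise (illustrative)
R = np.array([[0.5]])                  # measurement noise (illustrative)

def step(x, P, z):
    """One predict/update cycle; z is the detected position."""
    x = F @ x                          # predict state dt ahead
    P = F @ P @ F.T + Q
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.array([0.0, 0.0]), np.eye(2)
for z in [1.0, 2.1, 2.9, 4.2, 4.8, 6.1, 7.2, 7.9]:  # noisy detections
    x, P = step(x, P, np.array([z]))
print(x)  # smoothed [position, velocity] estimate
```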
 
Slept on the autonomous car issue. Here is what I think Tesla is doing: [...] Musk seems to suggest that Tesla has an advantage in autonomous cars due to the very high miles driven. Where are these test cars, if he is not referring to regular production vehicles?

That's a really interesting thought.

If Autopilot is running continuously in the background, making decisions but not executing them, and comparing those decisions to the actual human choices, they could be building a lot of experience with the results of their code without having to operate cars autonomously, with all the permissions and liability that entails.

A rather clever approach, actually. If the car is running a continuous data-recording buffer, it could upload a ~30-second clip of all the sensors, plus the Autopilot vs. actual inputs, every time the driver did something substantially different from the software, and Tesla engineers could review those clips remotely.
Walter
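
Mechanically, Walter's buffer-and-upload idea is simple to sketch. The ~30-second window, the frame rate, and all the names below are just assumptions drawn from the post above, not anything known about Tesla's implementation:

```python
import collections
import time

FRAME_HZ = 25  # assumed sensor logging rate
WINDOW_S = 30  # ~30 s of context, per the post above

class EventRecorder:
    """Rolling window of sensor frames; snapshot it whenever the
    shadow software and the driver disagree, for later review."""

    def __init__(self):
        self.buf = collections.deque(maxlen=FRAME_HZ * WINDOW_S)

    def add_frame(self, frame):
        self.buf.append(frame)  # oldest frames fall off automatically

    def snapshot(self, reason):
        # In a real car this would be queued for upload over the
        # cellular link rather than returned in memory.
        return {"reason": reason, "t": time.time(),
                "frames": list(self.buf)}
```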
 
I was just reading a press release about an Audi RS7 driving itself on a race track. They write:

"The systems can assume control of the car during parking or in stop-and-go traffic on freeways at speeds up to 60 km/h (37.3 mph)."

It is supposed to be available with the next-generation A8 model due in 2017. So if we get what Tesla promised in the next few weeks, they are more than two years ahead of Audi.

https://www.audi-mediacenter.com/en/press-releases/4496