Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Seeing the world in autopilot, part deux

Is v9.0 still in early access, or is it beginning to trickle out to the general populace?
It's not marked as early access, and it doesn't have any early access disclosures either.

As in the car starts to nudge into the lane to let the other cars know to make room?

That, and it also turns on the turn signal. If the other car does not yield, it lets it pass and tries again, slowing down if necessary when it's an exit we are trying to make and we don't want to blow past it at high speed.
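That signal-wait-retry behaviour can be sketched as a tiny state machine. To be clear, this is a made-up illustration of the behaviour described above, not Tesla's actual logic; all names and numbers are invented:

```python
# Toy sketch of the lane-change negotiation described above.
# Entirely hypothetical -- not Tesla's implementation.

def negotiate_lane_change(gap_observations, speed_mph, min_speed_mph=30):
    """Try to merge given a sequence of per-tick gap observations.

    gap_observations: iterable of bools, True when the target lane has room.
    Returns (merged, final_speed_mph).
    """
    signaling = False
    for gap_is_free in gap_observations:
        if not signaling:
            signaling = True          # turn on the blinker first
            continue
        if gap_is_free:
            return True, speed_mph    # nudge in: lane change completed
        # Other car didn't yield: let it pass, slow down, and try again
        # so we don't blow past the exit at high speed.
        speed_mph = max(min_speed_mph, speed_mph - 5)
    return False, speed_mph           # gave up: never found a gap

print(negotiate_lane_change([False, False, True], 70))  # (True, 65)
```

One back-off at 70 mph drops the car to 65 mph before the gap opens and it merges.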

Does this continue to run if parked? If so, for what duration?
Yeah, it runs all the time until the autopilot unit is shut down. Normally it's shut down ~2 minutes after the driver leaves the car, but sometimes it might stay up for up to two hours (e.g. if there's valid wifi and it cannot push accumulated data over it for whatever reason).
Typically triggers have certain requirements about the car driving, though, so parked triggers no longer happen (unless it's some internal trigger like a crash or overheat or such).
 

Thanks, that was my next question.

So I just tried Drive on Nav with automatic lane changes and I am super impressed! The car even negotiates with other cars for lane changes when the lane is not free. The only downside is that exits are often indicated at some ridiculously low speed like 30 mph, and the car obeys that, annoying everybody around.

Yeah... that's gonna cause some road-raging incidents. Hope they eventually establish a speed based on what the cars around you are doing, up to a certain point.

Now to figure out how to stitch video from 8 cameras into a single picture, I guess.

I would say a simple panoramic, but then how much of what each camera sees overlaps as well? A nice translucence effect or something to contrast the overlap would be nice in that case. But that's all @DamianXVI doing that part anyway, based on your first post, yeah?
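For a first pass you don't even need true panoramic stitching; just tiling the eight frames into one mosaic gives a single picture. A minimal sketch with NumPy (camera names and resolutions here are invented; proper blending of overlapping fields of view would need per-camera homographies, e.g. OpenCV's stitching module):

```python
import numpy as np

# Hypothetical camera list -- names and resolution are illustrative only.
CAMERAS = ["main", "narrow", "fisheye", "left_pillar",
           "right_pillar", "left_repeater", "right_repeater", "backup"]

def mosaic(frames, cols=4):
    """Tile same-sized HxWx3 frames into a rows x cols grid image."""
    h, w, c = frames[0].shape
    rows = -(-len(frames) // cols)              # ceiling division
    canvas = np.zeros((rows * h, cols * w, c), dtype=frames[0].dtype)
    for i, frame in enumerate(frames):
        r, col = divmod(i, cols)
        canvas[r*h:(r+1)*h, col*w:(col+1)*w] = frame
    return canvas

# Eight dummy 960x1280 frames -> one 1920x5120 combined picture.
frames = [np.full((960, 1280, 3), i * 30, dtype=np.uint8) for i in range(8)]
combined = mosaic(frames)
print(combined.shape)  # (1920, 5120, 3)
```

Feeding real per-camera frames in place of the dummy arrays would give the 2x4 composite; the translucence idea would then be an alpha blend in the overlapping regions.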
 
This is a huge difference between Mobileye and other companies: they actually have to sell stuff to make money. You can't sell a lie (unless you are Tesla :p), because the person you sell to has already tested your chips and knows their capabilities before they sign a contract.
Wow, where is that coming from? Just as we are cheering for the HUGE LEAP FORWARD of the 9.0 software and 'Navigate on Autopilot'. Stop being a sceptic and enjoy the great progress Tesla is giving us! Karpathy is the man!
 
That is super exciting. It is negotiating a lane change. That’s awesome.

Is the firmware you have 2018.39.2.1 cbbcef4? That just appeared on TeslaFi today.
I am on 18.39.0.1, which reportedly has all the goodies they later disabled in 18.39.1 (and .2?). But it does look like they are debugging the whole thing live as we speak (there's 18.39.2.1 now too). I am going to let other brave souls try these new releases to see what else Tesla decides to disable before I jump on it. I want my full-featured stuff to remain full-featured ;)
 

What features did Tesla disable? Is it solely for debugging?
 
The way Andrej Karpathy tells it, making HW2 Teslas autonomous is mostly a matter of neural networks, and neural networks are mostly a matter of creating data sets. He says neural network architecture is much less of a focus for him than data sets. In the video, Karpathy is specifically talking about deep supervised learning for computer vision neural networks.



If a company wanted to use production fleet data to train a neural network for path planning, it might be able to use sensor data to capture real world situations that can be re-created in simulation. This would help the simulated cars, pedestrians, bikes, etc. reflect the behaviour of the real entities. They could then use reinforcement learning to train the neural network on path planning in simulation.
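As a toy illustration of that last step, here is tabular Q-learning on a tiny simulated lane-change task: a four-lane corridor where the agent must reach the "exit lane". Real path-planning RL would use a far richer simulator and function approximation; every number and name here is invented for illustration:

```python
import random

# Toy simulator: agent starts in lane 0 and must reach lane 3 (the exit).
# Actions: 0 = stay, 1 = move one lane toward the exit.
N_LANES, EXIT, EPISODES, STEPS = 4, 3, 500, 10
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1   # learning rate, discount, exploration

random.seed(0)
Q = [[0.0, 0.0] for _ in range(N_LANES)]  # Q[state][action]

for _ in range(EPISODES):
    lane = 0
    for _ in range(STEPS):
        # epsilon-greedy action selection
        if random.random() < EPS:
            a = random.randrange(2)
        else:
            a = max((0, 1), key=lambda x: Q[lane][x])
        nxt = min(EXIT, lane + a)
        r = 1.0 if nxt == EXIT else -0.1       # reward for reaching the exit
        Q[lane][a] += ALPHA * (r + GAMMA * max(Q[nxt]) - Q[lane][a])
        lane = nxt
        if lane == EXIT:
            break

# The greedy policy should prefer "move toward exit" in every non-exit lane.
policy = [max((0, 1), key=lambda a: Q[s][a]) for s in range(EXIT)]
print(policy)  # expect [1, 1, 1]
```

The point of the paragraph above is that the simulator itself (the transition and reward model) is where fleet-derived behaviour of cars, pedestrians, etc. would be plugged in.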

A company could use driver disengagements from ADAS like Autopilot to flag the real world situations where path planning fails and needs additional training in simulation, or in structured tests on private roads.



What do you think are the main bottlenecks in developing full autonomy?


I realize what Andrej Karpathy has said, and I agree that perhaps the main focus of the Autopilot team right now is building the neural networks.

"it might be able to use sensor data to capture real world situations that can be re-created in simulation"

Yes, this is true. However, they cannot simulate the effects of the NN in the environment of the recorded data. They could find cases from their fleet to create more scenarios in their test fleet, sure, but that's not really harvesting data for NN training.

"A company could use driver disengagements from ADAS like Autopilot to flag the real world situations where path planning fails and needs additional training in simulation, or in structured tests on private roads."

Yes they could


"What do you think are the main bottlenecks in developing full autonomy?"

This question is complicated and I am not sure what you are asking. If you are asking what will limit the rollout and availability of fully autonomous vehicles in the next 3-5 years, I would say:

building and maintaining HD maps, mass producing sensors and other components, and integrating them into high-volume vehicles, all at economical price points.

If you are asking what takes a long time to develop for any certain player to ready their fully autonomous system, the answer would be something different.

Woah, looks like even pedestrians will be visualized in v9:

Marc Benton on Twitter

really cool

I respect objective and constructive criticism of any system, but what you produce is simply FUD. You're a Mobileye fan-boy.

I do not see @Bladerskb as a Mobileye fanboy,

and Mobileye's AP1 system is too limited to (reliably) recognize non-moving or cross-moving vehicles or objects.

Because that was not their intention. They designed the EyeQ3 in 2011-2013 to be a system for L1 ADAS vehicles and hands-on L2 systems with lane keep assist, ACC, and automatic braking, because that was the demand at that time from automakers (and still is, honestly).

that a lot of accidents and the majority of deaths were a result of an inattentive driver

There are 3 AP deaths Wikipedia knows of, and 2 happened with Mobileye's AP1! (List of self-driving car fatalities - Wikipedia) If you can prove there are 1 or 2 more, please provide a reference. Same goes for the hundreds of AP accidents: prove it, or the real numbers are in the tens, not hundreds.

I know, you're now going to blame Tesla for misusing Mobileye's system, blabla...

I do not blame Tesla nor Mobileye... although Tesla did what Mobileye told them not to do, because Mobileye thought it was a safety concern...

blame goes only to the driver of the cars.


Do you need HD maps to drive your car? No?! Me neither, what a surprise. Maybe that's Tesla's angle: they try to use only vision (maybe with a little help from radar and ultrasonics), like humans do. We will see if they succeed with that. I'm a little sceptical about it, but let's just wait and see.

No, that is not Tesla's angle. Tesla is working on HD maps, and every company that plans to deploy or sell commercially available self-driving cars utilizes HD maps.

Humans do not need HD maps to drive, yes, but self-driving cars do not drive and think the same way humans do.

AP1 used a reference design from Mobileye at launch. Two minutes on YouTube and you'll see other EyeQ3 implementations being driven that way.

And that reference design was to disengage immediately after the driver takes their hands off the wheel, which Tesla did not do.


sorry I don't know what this means?

So I just tried drive on nav with automatic lane changes and I am super impressed! The car even negotiates with other cars for lane changes when the lane is not free. The only downside is exits are often indicated at some ridiculously low speeds like 30mph and the car obeys that annoying everybody around.

Now to figure out how to stitch viedo from 8 cameras into a single picture I guess.

Wow this is sweet! Awesome that it is negotiating with other cars for lane changes! Can't wait to try.

So you can definitely tell all 8 cameras are being used?

it's not marked as an early access, does not have any early access disclosures either.

That and it also turns on the turn signal, if the other car does not yield, it lets it pass and tries again, slowing down if necessary if it's an exit we are trying to make and want to not blow past it at big speed.

Wow, also very impressive! I am wondering what geographic location you are testing this in and making these observations?
 
if you are asking what will limit the rollout and availability of fully autonomous vehicles in the next 3-5 years, I would say:

building and maintaining HD maps, mass producing sensors, and other components, and integrating them into high volume vehicles and doing this at economical price points,

If you are asking what takes a long time to develop for any certain player to ready their fully autonomous system, the answer would be something different.

A company like Tesla has to solve a few different problems before full autonomy can be deployed, most notably:

  • perception
  • localization
  • path planning
  • control
It seems to me that solving these problems entails either 1) training neural networks with more and better data, and possibly tweaking neural network architectures or 2) doing (1) so as to improve the input to hand-coded software, and tweaking that software.

For Tesla, mass producing sensors and integrating them into economical, high-volume vehicles is already solved — unless you think Tesla is going to need lidar.

HD maps can be compiled by the production fleet. Collecting mapping data from production cars is an advantage both Tesla and Mobileye have. Not sure whether visual HD maps are yet solved.
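A crude sketch of the fleet-mapping idea: many cars each report a noisy lateral offset for a lane line, and averaging per road segment recovers the geometry far more precisely than any single drive can. The segment model and all the numbers below are invented for illustration:

```python
import random
from collections import defaultdict

random.seed(42)

# Hypothetical "true" lane-line lateral offset (metres) per 10 m segment.
TRUE_OFFSET = {seg: 1.5 + 0.01 * seg for seg in range(100)}
GPS_NOISE_M = 0.5   # per-observation noise, roughly consumer-grade GPS

# Each of 200 simulated drives reports one noisy observation per segment.
reports = defaultdict(list)
for _ in range(200):
    for seg, true in TRUE_OFFSET.items():
        reports[seg].append(true + random.gauss(0, GPS_NOISE_M))

# Aggregate: the per-segment mean. Error shrinks like noise / sqrt(drives),
# so 0.5 m single-drive noise becomes ~0.035 m after 200 drives.
map_estimate = {seg: sum(v) / len(v) for seg, v in reports.items()}
worst = max(abs(map_estimate[s] - TRUE_OFFSET[s]) for s in TRUE_OFFSET)
print(f"worst per-segment error: {worst:.3f} m")
```

Real crowd-sourced mapping (e.g. Mobileye's REM) involves landmark matching and alignment rather than simple averaging, but the statistical leverage of a large fleet is the same idea.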
 
A company like Tesla has to solve a few different problems before full autonomy can be deployed, most notably:

  • perception
  • localization
  • path planning
  • control

You need to define "full autonomy" and "solve" for me. I may sound like I am just being difficult here, but it will honestly affect how I respond.

These 4 categories are a simplification... but let's go with it.

First, neural networks are only used in the perception bullet point.

And they are not even 100% of perception... far from it. A very important part, for sure, but there is a ton of other development in this category that is not NN or ML based. The NNs are only used in certain stages of perception, and even in those stages they don't play the whole role; non-NN/ML algorithms also need to be developed to perform the same functions the NNs do. Not to mention many sensor pipelines will not include NNs at all.

Furthermore, developing the trained NNs that they will use for various tasks is a long, challenging undertaking, for sure. It relies a lot more on hard-working, talented AI and computer vision engineers than on having vast amounts of data. And even more so: data that comes from the fleet is not the most helpful or most core to training the NNs. But I agree there is value in collecting some data from a fleet for a variety of use cases, mostly validation.


2) doing (1) so as to improve the input to hand-coded software, and tweaking that software.

I am telling you that AV development involves a lot more of this step than you make it sound like, or than you think it does.

For Tesla, mass producing sensors and integrating them into economical, high-volume vehicles is already solved — unless you think Tesla is going to need lidar.

You're right, they have, and this is awesome. However, Tesla vehicles are advanced driver assist systems. Albeit one of the coolest advanced driver assist systems out there, they do not replace the driver (as you know).

Maybe when you say developing full autonomy you are referring to the ability to create a perception system that

  • Makes perception errors that would result in an accident so infrequently that it is practical to use it in a system that can replace a human driver in some conditions
  • and uses only 8 cameras, ultrasonic sensors, and forward radar

It sounds like you are saying you believe this system can be built, and that the key to doing so is collecting lots and lots of fleet data? Is that what you are suggesting?
 
I'm sorry, but that's a basic lane change and nothing advanced.
Well, it was not doing that in the past. I did not get to a Cadillac dealership today and I don't think they are open on Sunday. We'll see if I manage to do it on Monday, but I have this distant memory that they don't have any lane changes at all other than the ones you perform yourself?

Also, what's an "advanced lane change"? Because I don't think I do anything different than the car does at this point.
 
Well, It was not doing that in the past. I did not get to a Cadillac dealership today and I don't think they are open on Sunday. We'll see if I manage to do it on Monday, but I have this distant memory they don't have any lane changes at all other than the ones you perform yourself?

Also what's "advanced lane change"? Because I don't think I do anything different than the car at this point.

Advanced involves a Roadster changing lanes by passing under the trailer of a semi.
 
It's possible Mobileye could collect training data from production cars, but I doubt they will.

But also Tesla is hardly collecting training data from their production cars.


You are vastly overvaluing training data. Neither Mobileye's nor Tesla's development of full autonomy is limited by the amount of unlabeled data (or even labeled data, for that matter).

Furthermore, there are hundreds or thousands of challenges that go into developing full autonomy; training the NNs is just one.

And about reinforcement learning: you cannot use fleet data for reinforcement learning. The fact that you even suggest this is a possibility makes it clear you do not understand how it works.




lol


Lol, nice thesis. Find me one computer vision / perception engineer at Tesla or Mobileye or Waymo, or anywhere else, who says their development is currently limited by the amount of fleet data they can collect.

They are limited by data. Most of them don't have enough "production" miles driven to get a true estimate of real-world failure rates. In particular, you need billions of miles to get a statistically significant estimate of miles per fatality. Simulation is not enough. Simulation is doomed to succeed.
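The scale of miles needed can be made concrete with the statistical "rule of three": observing zero fatalities in n miles puts a 95% upper confidence bound of about 3/n on the fatality rate. Taking the US human baseline as roughly one death per 100 million vehicle miles (an approximate figure), the miles required just to match that rate:

```python
import math

HUMAN_FATALITY_RATE = 1 / 100e6   # ~1 death per 100M miles (approximate)

# Rule of three: zero events in n miles -> 95% upper bound ~= 3 / n.
# To claim rate <= human baseline we need 3 / n <= baseline.
miles_needed = 3 / HUMAN_FATALITY_RATE
print(f"{miles_needed:.0f} miles with zero fatalities")   # 300 million miles

# Exact version from the Poisson likelihood: P(0 events) = exp(-rate * n),
# so n = -ln(0.05) / rate for 95% confidence.
n_exact = -math.log(0.05) / HUMAN_FATALITY_RATE
print(f"{n_exact / 1e6:.0f} million miles (exact)")       # ~300 million
```

That's hundreds of millions of miles merely to show parity with zero observed deaths; demonstrating a modest improvement over the human rate, rather than parity, pushes the requirement into the billions.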
 
Well, It was not doing that in the past. I did not get to a Cadillac dealership today and I don't think they are open on Sunday. We'll see if I manage to do it on Monday, but I have this distant memory they don't have any lane changes at all other than the ones you perform yourself?

Also what's "advanced lane change"? Because I don't think I do anything different than the car at this point.

You are correct:
Screen Shot 2018-09-29 at 9.01.54 PM.png

https://www.cadillac.com/content/da...18-cad-ct6-supercruise-personalization_v2.pdf

@verygreen - just for clarity: I have not driven v9.0, and it appears you have:

Does Tesla Autopilot v9.0:

-Steer to avoid any objects, including vehicles?

-Steer to merge the vehicle into the appropriate lane of traffic or to exit the freeway?

- Make lane changes?

I ask because GM outlines these limitations in their Super Cruise owner guide, but Tesla's Autopilot page states these features are possible. Again, I do not know if you have driven Super Cruise, so let's take GM's word.

Autopilot

Screen Shot 2018-09-29 at 9.14.22 PM.png

Screen Shot 2018-09-29 at 9.13.46 PM.png
 
-Steer to avoid any objects, including vehicles?
Normally it would just brake, but on the highway it might change lanes to get around a slower car.

-Steer to merge the vehicle into the appropriate lane of traffic or to exit the freeway?
Yes, if you select this option in settings (after a bunch of warnings and a double confirmation).
Reportedly the option is gone in 18.39.1 and above, but it is an option in 18.39.0.1 and there are videos showing it in action.

- Make lane changes?
How is that different from the above? You can certainly initiate lane changes with the stalk, or, if enabled, the car might hop lanes on its own.

The whole flow in that picture on the Tesla side works for me: it transitions me to a surface street and then beeps and displays the red hands-on-wheel thingie to ask me to take control.

The self park and summoning from garage are a somewhat different matter I guess.

I ask because GM outlines these limitations in their Supercruise owner guide. But, Tesla's autopilot page states these features are possible. Again, I do not know if you have driven Supercruise, so lets take GM's word.
We'll see once I try one out.
 

You didn't have to answer, I was just highlighting your point from above with sarcasm. :)

Unfortunately, I do not have a sarcasm button ;)