Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Maybe not ready to perform perfectly, but certainly more ready than an unconscious driver.

We know that Tesla has been working on networks that watch the driver-facing camera for signs of drowsy driving. If they can get a network that can identify an unconscious driver with a reasonable degree of accuracy, it would probably save a few lives.
Again, at this point Tesla isn't ready to deploy something like that. The car isn't currently capable, maybe soon, but not at this moment.
 
Agreed, but I think the issue is that it isn't approved to take over, ever. Even if it would clearly be the only action that made sense.
Right.

That's not as easy as just enabling it. It has to be proven that it can work safely, and then Tesla would have to accept liability if it failed.

Both fronts would delay this from being implemented.
 
Interestingly, dramamine knocks me out for at least 24 hours on land if I take it when I have a tummy upset, but on a ship, it just works fine and I continue on, just slightly slower than usual. I figure it is because the drug is actually counteracting something. Ginger works just as well for me so I've switched to that. (Again, I should try this in the car - I really miss being able to work in the back seat.)
The Scope patches' only side-effect I ever experienced is dry-mouth. Which is annoying when scuba diving, but sure beats the alternative. When the dive boat reached the dive site, I'd have them put me in the water first, and I'd get going on down, and then I'd wait 10 ft below the boat after the dive for everyone else to climb back on board before I did...and then I would be saying, "Go! go! go!". It was fine with the boat moving and the wind in my face. Not so fine when the boat was anchored lol.

The Scope patches are a major win.
 
  • Informative
Reactions: primedive
It would be safer than no driver at all even in current state
Maybe some fantastic news stories about how a Tesla saved someone's life. But the story that would drown out all the others is how the car killed someone when it took over for an incapacitated driver. Nobody will accept such liability; the wolves would have a feast.
 
Motion sickness is a puzzle, but the "conflicting inputs" idea doesn't make a lot of sense to me. If I show you a picture of a rabbit and tell you it's a giraffe, does it make you feel nauseated?

My wife and I were invited for a day on a sailboat. She took a Dramamine. We spent wonderful hours on the water, seeing dolphins, an aircraft carrier, and more. She wasn't motion sick at all. Why? Because she fell asleep when we left the dock and didn't wake up until we returned.
 
I’ve said this before but I continue to be ignored in general so I’ll continue to say it:

A 0.0.x release from 12.3.4 to 12.3.5 is a maintenance release. It is unlikely to have fixes that you will notice. It will be the same neural network. Maybe, if you’re lucky, you’ll see some fix to something. But for most of us, we’ll see no change yet imagine that it’s a whole newly-trained neural network.

We all need to dial back the excitement and expectation for these maintenance releases. They are very minor changes.

The things we will really notice will come with minor or major version upgrades…12.3 to 12.4 for example.

Hate to be Debbie Downer but it’s important to keep expectations in check as it minimizes the annoying whining I have to wade through in this thread 🤣
But it's slightly more complex than that because Tesla are not using (so far as I can tell) semantic versioning. The major version more or less changes on a whim of Elon, so the team makes do with minor/release versions, which means that they may or may not sneak feature changes into "bug fix" releases. Also, with NNs, the distinction between feature and bug fix is much more fuzzy since both involve retraining on revised datasets. Pretty much all you can be sure of is: major change = Elon wants it, minor change = does have new features, release change = may or may not have new features.
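The heuristic above can be sketched as a tiny version-comparison helper. This is a toy illustration of the convention described in the post, not Tesla's actual release process; the classification strings are just the levels named above:

```python
# Toy sketch of the versioning heuristic discussed above (not Tesla's
# actual scheme): split a dotted build string and name the level that bumped.

def classify_bump(old: str, new: str) -> str:
    """Compare two dotted version strings and return which level changed."""
    o = [int(p) for p in old.split(".")]
    n = [int(p) for p in new.split(".")]
    levels = ["major", "minor", "release"]
    for level, a, b in zip(levels, o, n):
        if a != b:
            return level
    return "none"

print(classify_bump("12.3.4", "12.3.5"))  # release -> maintenance, may or may not change behavior
print(classify_bump("12.3.5", "12.4.0"))  # minor   -> expect noticeable changes
```

Under semantic versioning the third digit would strictly mean "bug fixes only"; the point of the post is that for FSD builds that guarantee doesn't hold.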
 
  • Like
Reactions: DrGriz and JB47394
Motion sickness is a puzzle, but the "conflicting inputs" idea doesn't make a lot of sense to me. If I show you a picture of a rabbit and tell you it's a giraffe, does it make you feel nauseated?

My wife and I were invited for a day on a sailboat. She took a Dramamine. We spent wonderful hours on the water, seeing dolphins, an aircraft carrier, and more. She wasn't motion sick at all. Why? Because she fell asleep when we left the dock and didn't wake up until we returned.

Dramamine always zoned me out. I've been using chewable meclizine for quite a few years now, mostly on cruises. I don't feel any drowsiness with it.

V12 is the first that didn't make my dog quickly throw up, but it could still be slightly smoother.
 
  • Informative
Reactions: primedive
I don't understand how this would work in any sort of "pure" ML context. It seems to me if you want "Chill", you'd have to identify a bunch of "chill" drivers and train on that.

Mmm ... I take it back, I guess you could set up the "reward function" (NB, I have no idea what I'm talking about) to measure "Chillness" but that seems ... hard, and maybe a bad idea? Like, for "Smooth", you could punish hard braking somehow, but are you really sure you want to do that?
That's more or less it. There are many parameters for follow distance, including weather, speed, mode, etc. The training then is to generate the correct follow distance based on these values, using a dataset that provides the needed mapping. I'm not saying this is exactly what Tesla do (the actual process is more abstracted than that), but conceptually this is what happens. So yes, as you note, the training set does have to be set up for this, but this is a problem in training, not in the final size of the NN.
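As a rough illustration of the reward-shaping idea from the post above: a "chillness" score might penalize hard braking and short follow times. Every threshold and weight here is invented for illustration; this is not Tesla's actual training objective:

```python
# Toy "chillness" scoring of a driving clip: penalize hard braking and
# tailgating. All thresholds/weights are made up for illustration.

def chill_score(decels, follow_times, hard_brake=3.0, min_follow=2.0):
    """Higher is 'chiller'. decels in m/s^2, follow_times in seconds."""
    brake_penalty = sum(max(0.0, d - hard_brake) for d in decels)
    tailgate_penalty = sum(max(0.0, min_follow - t) for t in follow_times)
    return -(brake_penalty + 2.0 * tailgate_penalty)

smooth = chill_score(decels=[1.0, 1.5], follow_times=[2.5, 3.0])
harsh = chill_score(decels=[5.0, 4.2], follow_times=[1.0, 2.2])
print(smooth > harsh)  # True: the smoother clip scores higher
```

This also shows the worry raised above: a naive hard-braking penalty would discourage braking even when braking hard is the right move, which is why shaping such an objective is "hard, and maybe a bad idea".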
 
  • Informative
Reactions: primedive
Maybe some fantastic news stories about how a Tesla saved someone's life. But the story that would drown out all the others is how the car killed someone when it took over for an incapacitated driver. Nobody will accept such liability, the wolves would have a feast.
There are UNECE proposals for "emergency steering function" to do this.
It looks like Volkswagen was planning such a system, but I think the system they deployed just stops in the lane.

"A new feature of Emergency Assist 2.0 is that on roads with multiple carriageways the Arteon steers itself in a controlled manner into the nearside lane. It is Park Assist and Lane Assist that enable it to do this, while Side Assist uses its radar sensor to check on the traffic behind in order to minimise the risk of any collision when changing lane."

 
The wobble is the anti-robotaxi; the solution isn't clear:


Is it just me, or does V12.3.5 hesitate to turn there in rhythm with the light-bleed from the old HW2.5 repeater camera?

When I bought FSD and had my car in the shop to install HW3, I also went ahead and paid a couple hundred dollars for the new repeater cameras without the light-bleed. Not only did it bother me while driving at night, but I thought it might also pose an eventual problem for FSD.

Edit: from the YouTube comments, the guy is also driving with a kayak strapped to his roof and one of the lines going over the forward-facing camera. That's not ideal:

 
There are UNECE proposals for "emergency steering function" to do this.
It looks like Volkswagen was planning such a system, but I think the system they deployed just stops in the lane.

"A new feature of Emergency Assist 2.0 is that on roads with multiple carriageways the Arteon steers itself in a controlled manner into the nearside lane. It is Park Assist and Lane Assist that enable it to do this, while Side Assist uses its radar sensor to check on the traffic behind in order to minimise the risk of any collision when changing lane."

Very interesting, even more so given that the article is 7 years old.
 
So yes, as you note, the training set does have to be set up for this, but this is a problem in training, not in the final size of the NN.
I would think that the training data could be automatically sorted for a number of parameters because they have the V11 software. They can analyze clips and understand which clips involve a driver following at a larger distance, accelerating more slowly, keeping lateral jerk to a minimum, etc. So they can be fed into training as always, but with additional descriptive parameters, including the characterization of chill/average/assertive. Heck, they could probably replace the three levels with a slider.
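The auto-labeling idea above could look something like this in miniature. The field names, thresholds, and three-way bucketing are hypothetical, purely to illustrate sorting clips from recorded telemetry:

```python
# Hypothetical auto-labeling of driving clips into chill/average/assertive
# using simple telemetry statistics. Thresholds are invented for illustration.

def label_clip(avg_follow_s: float, max_accel: float) -> str:
    """Crude bucketing by average follow time (s) and peak accel (m/s^2)."""
    if avg_follow_s >= 2.5 and max_accel <= 2.0:
        return "chill"
    if avg_follow_s < 1.5 or max_accel > 4.0:
        return "assertive"
    return "average"

clips = [(3.0, 1.2), (1.0, 4.5), (2.0, 3.0)]
print([label_clip(f, a) for f, a in clips])  # ['chill', 'assertive', 'average']
```

The "slider" version would simply keep the underlying statistics as a continuous score instead of collapsing them into three buckets.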
 
Motion sickness is a puzzle, but the "conflicting inputs" idea doesn't make a lot of sense to me. If I show you a picture of a rabbit and tell you it's a giraffe, does it make you feel nauseated?
That doesn't engage your vestibular system.

My wife and I were invited for a day on a sailboat. She took a Dramamine. We spent wonderful hours on the water, seeing dolphins, an aircraft carrier, and more. She wasn't motion sick at all. Why? Because she fell asleep when we left the dock and didn't wake up until we returned.
LOL. Again, Scope patches for the win.
 
Seems hard to believe, but some say FSD doesn't track a towed trailer. I don't have any experience with this scenario, but in this case I would think it's related to insufficient training data.

Very strange that the car didn't even attempt to brake when the truck turned back into the other lane; the camera obviously has to see the road blocked.