What would be a reasonable miles-per-accident figure? Tesla's highest reported number was 6.57M in 22Q1, but I would think 12.x can push much higher, especially given the currently limited deployment/availability/usage of FSD Beta technology. I was highlighting the flattening because the Autopilot team was busy getting end-to-end ready for almost all of 2023, so no form of Autopilot saw much improvement, but hopefully we'll see meaningful increases to safety across the fleet this year.
It’s difficult to say without further data. One needs to control for as many variables as possible. Tesla’s data compares Tesla drivers, which should be a fair comparison, but it may well be skewed: by definition, AP can only be used on the highway, and highways have lower accident rates per mile driven than city roads. I did a brief web search and didn’t find any ready sources of data for the comparison, but I didn’t look terribly hard.
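To make that skew concrete, here's a toy calculation with made-up accident rates and mileage mixes (none of these numbers are Tesla's): a system with zero per-road safety benefit still posts a better miles-per-accident figure when its miles skew toward the safer road type.

```python
# Hypothetical rates and mileage mixes, purely to illustrate the skew.
highway_rate = 1 / 2.0e6   # accidents per mile on highways (made up)
city_rate    = 1 / 0.5e6   # city roads assumed ~4x worse (made up)

ap_mix     = {"highway": 0.9, "city": 0.1}  # AP miles skew to highway
manual_mix = {"highway": 0.4, "city": 0.6}  # manual miles skew to city

def blended_rate(mix):
    return mix["highway"] * highway_rate + mix["city"] * city_rate

print(1 / blended_rate(ap_mix))      # ~1.5M miles per accident
print(1 / blended_rate(manual_mix))  # ~0.7M miles per accident
# Identical per-road safety, yet "AP" looks ~2x safer from road mix alone.
```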
 
Yeah, realistically you can't get a much better comparison. Both v11 and v12 had some tricky scenarios to deal with, and both in roughly the same conditions.

v12 was a clear winner. I don't think any rational person would argue otherwise.
Some people will argue simply because they have already concluded V12 is just a small step change, when for once Tesla may actually have released a significant upgrade.
 
The rule isn’t difficult. The practical application is. I haven’t won a Darwin Award (yet), and I still have regular instances where I stop at a 4-way stop sign and I, along with the other drivers, am unsure who should go first. Who stopped first, especially when no one comes to a complete stop? What if one driver arrives first but doesn’t actually stop, while another arrives slightly later and does come to a complete stop: who goes first, the one following the law or the one skirting it? Now throw pedestrians into the mix.

If you’re applying ‘always yield to unsafe drivers’ to the rules, then you’ve decided you are using Darwin Award winners for training. So now we need to figure out not just the rules but the various ways people break them.
To win a Darwin Award, you must first die doing something stupid. It’s a posthumous award.
 
We hope handling 4-way stops is just a matter of feeding more data, but it is also possible the current model just isn't able to make decisions based on what happened seconds ago. What else requires considering what happened seconds ago? Merging/allowing others to merge? Doesn't seem like much else.
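As a minimal sketch of what "deciding based on what happened seconds ago" could mean architecturally (purely illustrative; nothing here is Tesla's actual design), one common approach is to stack the last N observations into the model's input, so arrival order at a stop sign is visible to the net at all:

```python
import torch
import torch.nn as nn

class SingleFramePolicy(nn.Module):
    """Sees only the current observation: who stopped first is invisible."""
    def __init__(self, obs_dim=64, act_dim=3):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim, 128), nn.ReLU(),
                                 nn.Linear(128, act_dim))
    def forward(self, obs):            # obs: (batch, obs_dim)
        return self.net(obs)

class HistoryPolicy(nn.Module):
    """Sees the last n_frames observations, so arrival order is encoded."""
    def __init__(self, obs_dim=64, act_dim=3, n_frames=8):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim * n_frames, 256), nn.ReLU(),
                                 nn.Linear(256, act_dim))
    def forward(self, obs_history):    # obs_history: (batch, n_frames, obs_dim)
        return self.net(obs_history.flatten(start_dim=1))

history = torch.randn(1, 8, 64)        # roughly "the last few seconds"
print(HistoryPolicy()(history).shape)  # torch.Size([1, 3]): e.g. go/yield/creep
```

Recurrence or attention over time would do the same job; the point is only that a model with no temporal input can't represent "who stopped first" no matter how much data it's fed.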
 
Always proceed safely. It may be difficult to decide who goes first, but I've never ever seen a backed-up four-way stop. Even with Darwin Award winners in the mix, the traffic still flows.
Right - because people adjust to other people breaking the rules, signal to each other, anticipate what others may do, etc. Also, didn’t you just say “Don't use Darwin award winners for training. It's not that difficult if you know the rule that the person on the left has to yield”? Now you’re backtracking and saying we need to train with Darwin Award winners. (Which is correct, BTW: there’s more to ’proceeding safely’ than just following the law, as I’ve repeatedly said.)

I’m teaching my 15 year old how to drive right now. She knows the rules perfectly but is frequently unsure how to proceed at a 4 way stop.
 
I think this shows the potential of v12, even if it's got some bugs to work out. Tesla has not explicitly programmed this behavior to obey the road markings, and yet enough training data of drivers not blocking roads painted like that exists for it to emulate that behavior:


It's not humanly possible for Tesla employees to anticipate and write planning code for every road feature. But v12 could possibly learn it all from training data.
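As a hedged illustration of that point (every scene key and action string below is invented), hand-coding a planner means one branch per road feature, a list that never ends, while an end-to-end policy replaces the whole ladder with a single learned function:

```python
def hand_coded_planner(scene: dict) -> str:
    # One explicit rule per road feature...
    if scene.get("keep_clear_markings"):
        return "stop_before_painted_box"
    if scene.get("four_way_stop"):
        return "yield_by_arrival_order"
    if scene.get("unprotected_left"):
        return "creep_and_wait_for_gap"
    # ...and thousands more cases no team could ever finish enumerating.
    return "fallback_slow_down"

def learned_planner(scene_embedding, policy_net):
    # End-to-end: one trained net covers every feature present in its data.
    return policy_net(scene_embedding)

print(hand_coded_planner({"keep_clear_markings": True}))  # stop_before_painted_box
```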
 
Since it costs a lot of money to train the network even once (an enormous amount of energy is required), I would expect them to use whatever good training data they have and not hold any back.
It sounds like you're referring to training a network from scratch, but there can be different phases, especially if there are base models that can be fine-tuned with additional training, which might be better suited to an ongoing stream of new data. Holding back data isn't so much about saving some for later; it's more a practical consequence of there always being more data coming in, so Tesla just needs to release what has been trained so far to maintain some cadence of improvements.

As you say, it's expensive to train networks in both money and time, so if end-to-end can also reduce that cost and speed up iteration, perhaps by allowing for continuous incremental training and releases, hopefully that's something we'll get to see once 12.x goes wider with regular updates. I suppose it's even possible that explicit FSD Beta versioning and release notes go away, similar to past Autopilot improvements that didn't get mentioned to the general audience.
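A minimal sketch of that fine-tune-and-release cadence (hypothetical; nothing here reflects Tesla's actual pipeline): instead of retraining from scratch, each release warm-starts from the previous checkpoint and trains only on the newest shard of data.

```python
import torch
import torch.nn as nn

def fine_tune(model, data_shard, lr=1e-4, steps=100):
    """Continue training an existing model on only the newest data."""
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(steps):
        for obs, target_action in data_shard:  # newest driving clips only
            opt.zero_grad()
            loss_fn(model(obs), target_action).backward()
            opt.step()
    return model

model = nn.Linear(64, 3)  # toy stand-in for the big end-to-end net
for release in ["12.1", "12.2", "12.3"]:
    shard = [(torch.randn(32, 64), torch.randn(32, 3))]  # toy "new data"
    model = fine_tune(model, shard)                      # warm start, not scratch
    torch.save(model.state_dict(), f"fsd_{release}.pt")  # ship a checkpoint
```

If each increment is cheap enough, the release cadence stops being gated by giant from-scratch training runs, which is the continuous-improvement scenario described above.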
 
I've watched a few vids now, including the side-by-side, and there are clearly improvements overall.
I think I've become accustomed to the sometimes-pensive nature of FSD; honestly, at times I actually appreciate it being conservative. I don't usually mind gently pressing the accelerator to let it know *I* think it is OK to go. Seems reasonable to me.
It's important to remember that the real world is very dynamic: there are lots of changing factors that cannot be predicted regardless of how good the models are, and human beings certainly can't predict them accurately either. There's a lot of risk assessment, and often humans just decide to proceed assuming things will work out (that other car will see you and respond appropriately). It seems entirely appropriate (to me) that AI would generally be more cautious than the average human.
 
But as a functional driver assist in this context, where you constantly have to anticipate the variety of scenarios it will fail to negotiate perfectly (and it might be used where the driver is unfamiliar with the area, too, which makes failures worse), it's not very close, unfortunately.
Different people will have varying comfort levels with the technology, and FSD Beta so far has really lacked polish, with its potential for sudden jerks even in basic maneuvers like making a turn or stopping at a line. What we've seen from 12.x so far is much smoother, but there are also regressions from 11.x in various behaviors, some requiring safety disengagements. It's unclear what level of quality Tesla will try to reach before a wide release of any 12.x.

Perhaps end-to-end will lull people into complacency, as there won't be the constant reminders that this really is "beta," and similarly there might be more people willing to try it out and keep it active even though it's still fundamentally a driver-assistance feature that can do the wrong thing at the wrong time. At least with 11.x, it seems like some people have learned to use it for "simple" situations like following the lane but to disengage when it needs to make a turn at a busy intersection, although maybe let it try the turn if there's nobody around. (This really does make it seem like FSD Beta could serve as Basic Autopilot.)

Specifically for "unfamiliar" areas, the hope is that there are enough other Tesla vehicles driving manually through an odd intersection or situation for data collection to happen, so that by the time you experience it, it'll do the right thing. Except if the first time is with 12.x active and it's sufficiently different from training, it can be quite awkward or even unsafe, but hopefully the driver is aware enough that there's something strange like the weird angling of the traffic lights.
 
Except if the first time is with 12.x active and it's sufficiently different from training, it can be quite awkward or even unsafe, but hopefully the driver is aware enough that there's something strange like the weird angling of the traffic lights.
I'd like some sort of indicator that says how confident FSD is in a given circumstance. In V11 we get the degraded message, and eventually the takeover screen.

The comical version of this is having the Kerbals visible on the visualization, showing how confident they are. This is the red-wheel-of-death replacement, complete with screaming.

[Image: Kerbal Space Program screenshot]


A more serious version might be setting the background color of the visualization. Or a tinting of the objects in the visualization. Or some kind of audio cue. Perhaps the radio volume goes down as FSD's confidence goes down, and then we hear a distinctive tone that increases in volume. I know that there are human factors problems with each of these, and I'm just thinking out loud.
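To make the thinking-out-loud concrete, here's a tiny hypothetical mapping from a confidence score in [0, 1] to those cues; every threshold and formula below is invented for illustration:

```python
def confidence_cues(confidence: float) -> dict:
    """Map model confidence to UI cues (all values hypothetical)."""
    c = max(0.0, min(1.0, confidence))
    return {
        # Blend from neutral gray toward red as confidence falls.
        "tint_rgb": (int(128 + 127 * (1 - c)), int(128 * c), int(128 * c)),
        "radio_volume": round(c, 2),             # radio fades out when unsure
        "alert_volume": round((1 - c) ** 2, 2),  # tone ramps in near takeover
        "show_takeover_screen": c < 0.2,
    }

for conf in (0.95, 0.5, 0.15):
    print(conf, confidence_cues(conf))
```

The human-factors problems remain, of course: any continuous cue risks either being tuned out or being noticed exactly when the driver should already be looking at the road.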
 
Some people will argue simply because they have already concluded V12 is just a small step change, when for once Tesla may actually have released a significant upgrade.

We know little about v12 other than: it's an end-to-end net, it requires training data from ideal drivers, significant bugs remain, it has more dangerous interventions than v11, and its stop-sign response seems more human-like.

If I'm reading this correctly, Chuck might be properly peeved.

[Screenshot attachment, 2024-01-28]
 
I think this shows the potential of v12, even if it's got some bugs to work out. Tesla has not explicitly programmed this behavior to obey the road markings, and yet enough training data of drivers not blocking roads painted like that exists for it to emulate that behavior:


It's not humanly possible for Tesla employees to anticipate and write planning code for every road feature. But v12 could possibly learn it all from training data.
Unfortunately, FSD training hasn't been TSLA's strong suit.
 
You know I'm always up for a friendly wager.
Ah yes, our wager. In hindsight it was obvious that traditional programming would never be able to negotiate the arc of Chuck’s UPL. There’s just no way to calculate the future positions of all the vehicles with C code!
I will bet a beer that Chuck’s first version of FSD V12 will finally achieve 90% performance. V12 will be the start of the March of Nines!
 
Right - because people adjust to other people breaking the rules, signal to each other, anticipate what others may do, etc. Also, didn’t you just say “Don't use Darwin award winners for training. It's not that difficult if you know the rule that the person on the left has to yield”? Now you’re backtracking and saying we need to train with Darwin Award winners. (Which is correct, BTW: there’s more to ’proceeding safely’ than just following the law, as I’ve repeatedly said.)

I’m teaching my 15 year old how to drive right now. She knows the rules perfectly but is frequently unsure how to proceed at a 4 way stop.
Just yesterday I was at a 4-way and the guy to my right rolled in and came to a stop well before I did. Once another car had cleared the intersection, I looked over to him and he was just looking at me. So I stuck my hand up pointing for him to go and stared at him until he went. Some people.