Well, at one of the crossings near me, AP wants to go straight into the oncoming lane. Is that more or less dangerous than left turns?
Well, if this is the bar it needs to meet in order to be widely released, we are probably looking at several more years and at least one new sensor suite upgrade.
The biggest problem is that the danger scales with how good it is. At 90% it isn't really scary, because people are going to be really attentive, like they are with NoA. They're ready for any kind of mistake it might make. Once it gets to 99% or 99.9%, people stop paying attention, and all it takes is one goof and it crashes.

Then there are regulatory issues: they have to satisfy the concerns that various DMVs and the NHTSA will have. What will the NHTSA or the DMVs have to say if Tesla releases FSD beta to the masses as an L2 system? It's driver assist, but it's asking a lot of the human driver. With NoA they could release something that was pretty bad, because it's on the freeway in fairly controlled situations. I've been absolutely unhappy with NoA, but I've never felt like it was all that dangerous. Instead, I feel like it's trying to embarrass me by doing stupid things. With FSD beta on city streets, if a person isn't quick it absolutely will crash into something.

From a curiosity standpoint I really like the FSD beta program. I love watching the videos, and I have a lot of respect for the time/effort/energy the owners put into testing. But from an owner's perspective I'm starting to feel as if FSD is just marketing: a way of making believe it can do something it can never really do in order to sell cars. I'm hoping Tesla will give more attention to NoA and improve it significantly while the FSD beta is going on.
While I understand the skepticism (kind of) that some people have towards FSD, what I think many are failing to take into consideration is what it CAN do now, rather than what it CAN'T do, and the rate at which it improves. Its rate of progress is directly related to the amount of data it receives from the fleet that has it. The fact that they are expanding the pool suggests that the rate of improvement is expanding as well.

The fact is that no system is EVER going to be able to handle 100% of every possible case it could ever face. There are times when moving around 4,000 pounds of steel is just a bad idea... for a human or a machine. There will be accidents and deaths with any autonomous system. To think otherwise is naive. If that is accepted, then it comes down to how much safer it is than a human driver. I KNOW, for a fact, that I am safer with Autopilot engaged on the highway than not. This has been proven to me on more than one occasion. Is it perfect 100% of the time in every possible scenario? No, and it never will be. That's not the point. The point is that my passengers and I are much safer with Autopilot than without. I have no reason to believe that this won't be the case with FSD in the near future.

Have we yet seen exponential growth in the amount of data fed to the neural net for improvement? No. Have we seen the full impact of the Dojo system on its ability to interpret that data, make revisions to the software, and disseminate those changes to the fleet? No. How can we make assumptions that it will never happen based on what we see right now? We can't... nobody can. To do so is just justification of one's own biases, because we just don't know yet what the system can do in the way of timely improvements. I suggest we all just relax, trust Tesla to do what is best for the system, the company, and the public, and wait and see. If you don't want to wait it out, don't buy the system.

Dan
Not sure what you mean by "crossing," but your case is either anecdotal or a one-off. However, FSD Beta is pulling out, or trying to pull out, into traffic for all users a very high percentage of the time. Probably 5% to 10% of the time on busy 4-lane highways/roads with traffic coming (percent pulled from my A$$, but I have watched almost all the videos and it happens often). This must be lowered to something like 0.01% before it can be even semi-trusted. My WILD conjecture is that V11 will add the new code and maybe a few City features, with most features turned off. There's just no way we get unconfirmed turns for a few months or more. We'll probably see unconfirmed right turns a couple of months before unconfirmed lefts, too.
Of course it's anecdotal. So are all the YouTube videos. Anytime you have a junction and the roads are curving, you get this kind of behavior; it's rather common in the Seattle metro. AP either takes the wrong lane or aborts suddenly, and both can be dangerous.
Not confused, but definitely wondering how large the EAP pool is if 1k users excludes early access folks
I asked because you used the "confused" emoji. I guess the total EAP pool is larger than 1000 since we know that not all EAP people have FSD Beta. Honestly, I was not aware that EAP was that big.
Yeah I'm hoping for something like that soon! Happy to confirm and otherwise supervise, just want to get a chance to try it out haha
How hard would it be to incorporate FSD code into today's generic Autopilot? From the videos I've seen, FSD seems vastly more competent at avoiding objects in the road, getting into the correct lanes, etc. It seems like it would make normal Autopilot/NoA safer. Today's NoA can't even take an on-ramp from the access road onto the freeway, or get into the correct lane when a road splits off in a Y pattern. What would the harm be in using the FSD computer, for instance, to detect objects?
Probably not directly as you asked, since "today's generic autopilot" will likely be completely replaced. But to your overall intent of "Will basic Autopilot (lane keeping and stopping for things in lane) improve because of FSD?", Elon has said yes, although it's unclear when that would happen and with what restrictions, e.g., still intended for divided highways. Practically, for situations like steering around parked vehicles on unmarked residential streets, it would take additional code to "dumb down" Autopilot to replicate the existing incorrect behavior of stopping for the parked vehicle, since the advanced FSD neural network would just predict a path to go around.

Q: Will standard Autopilot run on 8-camera surround video as well, i.e., same NN as FSD, just with limited functionality/disabled features?
A: Yes, although it will be crazy not to turn on FSD
https://twitter.com/elonmusk/status/1353665950290235394

I wonder if there will be a separate "reduced-functionality FSD beta" in the future to make sure basic Autopilot is still safe even with limited features.
I saw some tweets that showed a bunch of toggles in the dev firmware that enable/disable some/all of these behaviors. In theory enabling some of these features should be minimal work.
From Elon's tweet, and simply from an ease-of-implementation perspective, it's much easier to code to specific hardware and then toggle things on/off. For older hardware you can simply lock down what you already have. HW3 has been shipping for a while, so it makes sense to give those owners the full power of the hardware they have, to make it easier for them to justify an FSD purchase. If AP performs pretty well, it's easier to justify a purchase of FSD than if AP makes simple detection mistakes.
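The "one stack, toggled features" idea discussed above can be sketched as a simple feature-flag lookup. To be clear, this is a hypothetical illustration, not Tesla's actual firmware: the tier and flag names here are all invented.

```python
# Hypothetical sketch: a single driving stack where individual behaviors
# are gated per package tier via feature flags. All names are invented
# for illustration and do not reflect Tesla's real firmware.

FLAGS_BY_TIER = {
    "basic_ap": {
        "lane_keep": True,
        "stop_for_objects": True,
        "unconfirmed_turns": False,
        "city_streets": False,
    },
    "fsd_beta": {
        "lane_keep": True,
        "stop_for_objects": True,
        "unconfirmed_turns": False,  # could be flipped on in a later release
        "city_streets": True,
    },
}

def feature_enabled(tier: str, flag: str) -> bool:
    """Return whether a behavior is toggled on for a given tier.

    Unknown tiers or flags default to disabled (fail safe).
    """
    return FLAGS_BY_TIER.get(tier, {}).get(flag, False)

print(feature_enabled("basic_ap", "city_streets"))  # False
print(feature_enabled("fsd_beta", "city_streets"))  # True
```

The appeal of this pattern is that both tiers run identical perception and planning code; shipping a better basic AP is then just a matter of which flags are set, rather than maintaining two separate codebases.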