Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

FSD rewrite will go out on Oct 20 to limited beta

The EU might be one of the last developed parts of the world to get FSD. I honestly believe Tesla will take the path of least resistance for deployment, then use the fleet's global statistics to appease the EU regulatory beast.

That I believe. Musk literally said that. The EU is probably last. It sucks, but we're doin' things a bit different round these parts, know what I'm sayin'? :D
Very sad. :(
The nerfs ironically make things more unsafe. But I think Tesla does it on purpose just to prove a point.
For example: instead of slowing down in the turn, they'd rather beep a warning that it can't handle the turn. It's the easiest thing to fix, but they do it on purpose.
 
Weird- I'm in the same area as you and it generally works fine for me... (2018 Model 3, with HW3 having gotten the free upgrade via owning FSD)...

One drive I do reasonably often, for example, is Raleigh to Chapel Hill, where I start out on 540 and the car merges onto 40 by itself just fine.

Likewise, it seems to handle the 40->147 merge fine... the 147->85 merge... the merge where 40 and 85 come together going west... all handled by the car without issue.
Maybe my cameras just need to be realigned then (the car is clean, so it's not dirt). Just this morning I was merging from 147S to I-40W, easy onramp, driving manually (not on AP), with nothing in the right lane and an 18 wheeler parallel to me in the center lane. As I merged into the right lane, the car thought the truck was in it, the dash lit up red, beeped, and the steering wheel jerked to the right like it was avoiding a crash. I was watching the truck to ensure it wasn't drifting into my lane as I merged so that's not it. I'm going in next week for HW3, and I'll see if there's anything they can do. Maybe my car needs glasses!
 
How do you know that information is not tainted by ulterior motives (e.g., spin to the positive, bury the negative)? If you just trust the company mouthpiece...
Ouch, dude. I've gone to incredible lengths to stay objective in my coverage. My #1 rule is: if it happens in my ride, I post it, end of story. I also paid upwards of $1500 for all my rides out of pocket. Speaking of which, video #54: huge PR disaster for Waymo. (Look at all the news outlets that covered it.) Not to mention all the other times I went out of my way to trip up the system and succeeded. If anything, I'd be incentivized to have the car make mistakes, as that attracts huge views and by extension more ad revenue.
ONE person who had nothing better to do than ride Waymo.
Actually I had a full time job at a silicon fabrication plant while the series was running, those rides happened on my weekends. Fair though, it was a bit of an obsessive hobby. But I got so much positive feedback from people on every side of the discussion.
The main objective of the series was to show everyone exactly what this technology is like right now, major flaws and all. I feel as if I have achieved this.

(Seriously though, what kind of weird multi billion dollar company would hire a 20 year old kid with a cheap camera and a four digit subscriber count)
 
Don't sell yourself short!

Aside: many of these very deep-pocketed corporations don't like paying for anything and would rather offer "exposure". There are lots of instances in the past, and I think Netflix got some flak for it recently.

 
I really hope some of the FSD rewrite trickles down to AP/NoA. But is any of that code active on the highway?
Because production versions can't handle cut-ins, and they can't even stay in the lane when it widens due to an on-ramp merge.
Then the car will swerve toward the middle. No logic implemented there whatsoever. It's a very simple thing to fix:
C++:
const double MagicFactor = 1.5;
const double OnRampMaxLengthMeters = 250.0;

// If lane is significantly wider than previous AND it's shorter than a standard on ramp merger
// Then stick to the closest marker with the same distance as before, that doesn't widen

if (laneWidth > previousStandardWidth * MagicFactor && distanceTraveled < OnRampMaxLengthMeters) {
    StickToNearestLaneMarkerAsBefore();
}
else
{
    CenterLaneAsUsual(); // Normal lane centering
}

It would make the current driving experience so much better. The car KNOWS the lane has widened from a previously narrower one; it shows in the visualization. And it should know when a lane is abnormally wide.
won't pass code review. who on earth taught you that indentation style?

lol
 
Thanks, JJ. My comment was directed at the dearth of objective data (other than your very instructive and helpful videos!) showing warts when conveyed by the company PR department. Again, it's easy to omit negative data. I'm not saying Waymo does that, but I also don't take at face value anything their PR department puts out (and is rapidly disseminated here as fact).
 
In California they're required to report collisions (and there are plenty of them!)

Apparently someone dislikes Waymo even more than you :p
"On July 14, 2021 at 4:47 PM PDT, a Waymo Autonomous Vehicle (“Waymo AV”) was proceeding straight westbound on Clement Street in San Francisco, when a passenger vehicle struck the Waymo AV from behind two times. The passenger vehicle then struck the Waymo AV a third time, on the left side of the Waymo AV, as it passed on the left, then immediately fled the scene. The San Francisco Police Department is investigating the event as a criminal act. At the time of the impact, the Waymo AV’s Level 4 ADS was engaged in autonomous mode, and a test driver was present (in the driver’s seating position). "
 

You have been following Waymo's statements for quite a while. What is their track record of true vs false statements? Are they a company known for posting demonstrably false data on a regular basis?
 
Tesla probably wouldn't even count that as an accident in their "statistics" on "safety".
 
What kind of autonomous system would Tesla be expected to report on? Just cars with FSD Beta? Including both employee and private citizen?

This crash, suspected to have happened on regular AP, doesn't show up in their report. I suppose they don't count Navigate on Autopilot?
https://driving.ca/auto-news/crashe...plauded-full-self-driving-in-videos-on-tiktok

Would they have to report crashes by cars with FSD Beta but not engaged, or all incidents with those cars whether engaged or not? The Cruise incident of July 26, 2021 was listed as in Conventional Mode. Also, how is Tesla supposed to know about and report on non-Tesla-employee incidents from the however many FSD Beta drivers there are in California?
 
NoA is an L2 system, and California only requires reporting for L3-L5. Tesla claims that FSD Beta is an L2 system as well. If the DMV were to determine that they are doing autonomous vehicle testing, then they would be required to report collisions that occurred in customer-owned vehicles. I think they would then also be liable for those collisions. I don't think it would be hard to write software to detect collisions and disengagements...

Uber tried and failed to declare their vehicles L2 to get around the reporting requirements.

I'm not sure why they have to report collisions that occur while the car is in conventional mode (unless of course the collision directly followed a disengagement).
 
Tesla is listed as a permit holder, but just as Level 2, is that right? Cruise is listed in the same category but as Level 3 or higher? It doesn't make that clear.

Permit Holders (Testing with a Driver)

As of July 23, 2021, DMV has issued Autonomous Vehicle Testing Permits (with a driver) to the following entities:

AIMOTIVE INC
...
CRUISE LLC
...
TESLA
...
 
Tesla has a permit to test L3-L5 autonomous vehicles with a safety driver. I believe they claimed only 13 miles of L3 testing in 2019 (the Autonomy Day demo). They did a bunch of testing in 2016 to produce the demo they did back then.
There is no permit required to test L2 vehicles.
Waymo and Cruise have permits to operate L4 vehicles without a safety driver.