Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Keeping all this in mind - I think by the end of the decade Tesla could be close to getting L3/L4 kind of driving in large geofenced areas. But it is equally likely they will be still struggling with say one dangerous move every 1,000 miles ....
Lumping L3 and L4 together doesn't reflect the significant difference in the requirements of each. I could easily see L3 being available long before L4.
 
The straight part has faded, and they didn't repaint it.
The street view linked earlier shows the lane originally had two clear straight white arrows in 2014, even with only one destination lane; around 2017 the right lane's straight arrow was covered with black paint, then it was repainted white in 2022. Given all these changes, I'm not surprised so much map data is wrong, and if end-to-end is relying on map data indicating there are two straight lanes, it could keep taking the right (wrong) lane.

Even visually, with faded or repainted arrows, there's uncertainty about whether the lane should be straight-only, and if a lead vehicle blocked the view of the arrows or the intersection, it seems likely 12.x will consistently make the wrong lane selection until Tesla has something closer to real-time map updates. Although I'm not even sure what the map update would be: assuming the painted arrows were actually both straight, does the map update system need to reason about when to ignore road markings or signage and say only the left lane can go straight? And technically, is there anything preventing a merge within an intersection?

Although even without map updates, as shown by 12.3's behavior, it seemed to handle the intersection fine anyway, so this seems to be more of an optimization to avoid awkward situations?
 
I am trying to better understand the hardware, and whether it is sufficient to truly achieve self driving. The side cameras are angled back, which I presume is to view the blind spot while driving. I do not think these adequately view cross traffic when trying to make a right or left turn. Traffic traveling directly perpendicular to the car when it is stopped at an intersection does not appear to be adequately seen by either the front or side cameras. What do you guys think?
 
There are two forward-facing side cameras high up on the B-pillar (behind the driver's head), and the center camera is wide-angle. There is a full 360-degree view, though it is subject to obstruction by the front of the car (when parking) or by nearby objects (for cross traffic).
 
@sdtslafan
If you own a Tesla you can see the different camera views while in park. Select Service, then scroll down to Camera Preview. You can then select which camera to view. As mentioned above, the cameras provide a 360-degree view. The problem is when you have obstructed views.
 
I don’t know if it’s helpful, but I recorded a drive a while back and stopped at all of my challenging intersections to show what the cameras saw vs what I can see by looking and leaning forward.
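The gap between what the cameras see and what a leaning driver sees comes down to sight-line geometry past an occluding corner. Here is a minimal similar-triangles sketch; all distances are illustrative assumptions, not actual Tesla camera positions or specs:

```python
# Sight-line geometry at an occluded intersection, by similar triangles.
# All distances below are made-up illustration values, not real measurements.

def visible_cross_traffic_distance(dist_to_corner_m: float,
                                   corner_lateral_m: float,
                                   cross_lane_dist_m: float) -> float:
    """How far down the cross street a camera can see past an occluding
    corner (e.g., a parked car or building edge).

    dist_to_corner_m:  forward distance from camera to the occluding corner
    corner_lateral_m:  sideways offset of the corner from the camera
    cross_lane_dist_m: forward distance from camera to the cross-traffic lane
    """
    return corner_lateral_m * cross_lane_dist_m / dist_to_corner_m

# Camera sitting back near the B-pillar: corner 3 m ahead, 2 m to the side,
# cross lane 6 m ahead -> can see 4 m down the cross street.
held_back = visible_cross_traffic_distance(3.0, 2.0, 6.0)

# Creep the car 2 m forward (or lean forward, as in the video): the corner is
# now 1 m ahead and the lane 4 m ahead -> can see 8 m down the cross street.
crept = visible_cross_traffic_distance(1.0, 2.0, 4.0)

print(held_back, crept)  # 4.0 8.0
```

The same formula shows why creeping into an intersection helps so much: cutting the distance to the occluding corner from 3 m to 1 m doubles the visible stretch of cross street in this toy setup.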
 
You mean the Kim guy? He was driving mostly highway in that video, with only about 4-5 minutes of city driving. Almost all of his commentary was about highway situations, if I understand correctly.


Yes. I got the feeling he didn't have a lot of experience driving FSD; however, I saw what he was talking about during the city portion. Early on it sat at a stop sign, and he mentioned that last time he had to hit the accelerator. Also, on a three-lane road it couldn't decide which lane to be in. I fast-forwarded through the highway portion.
 
Well, then they could simply remove the posts. It's odd that these people are deleting their entire accounts, along with YouTube in the case of teslaaigirl.

I guess Tesla is asking them to do it or risk termination.

Deleting specific videos means filtering through them, which takes time. Even if Tesla doesn't require it, the person's lawyer would probably advise doing it. They can rebuild their profile later if they want, after determining which videos are clean. Given these people likely don't depend on their social media accounts as a primary source of income, there's little cost to deleting the accounts, while keeping them around carries a potentially big downside. Deleting also shows commitment to avoiding future violations.
 
Yeah, except even Chuck thought it was for validation in his post today. You say they had drivers manually driving for weeks, but Chuck didn't know that (Chuck said so himself). Nobody knew for sure until Elon confirmed it today. That was my point:


Before Elon confirmed, Chuck posted:

come on Elon, give Chuck v12, hes a great person...
 
As the project manager for a software team that needs to be larger than it is, and that always needs more thorough unit tests (i.e., we are hurting for people), I see another big benefit to e2e control code.

Since your development team is now writing much less control code, you can devote more of the team to unit tests. That may let the FSD team write many more, and more thorough, unit tests to validate behavior and identify regressions.

This will be advantageous in several big ways, off the top of my head:

1. Unit tests are even more important with the “black box” nature of NN control code to ensure that we see many fewer regressions from one version to the next.

2. They will help reduce cycle time from version to version. Once the NN is retrained and the new network passes unit tests, employees need less time validating it in the real world before it can go wide.

This is another reason, along with more compute hardware, that the e2e approach should see significantly accelerated improvements compared to older versions.
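As a sketch of what such behavioral regression tests could look like, here is a toy scenario suite. Everything in it is hypothetical: the scenario format, the names, and the stub planner standing in for an opaque neural network are assumptions for illustration, not Tesla's actual test harness.

```python
# Hypothetical scenario-based regression suite for a "black box" planner.
# The planner below is a toy stub standing in for an opaque neural network;
# every name and threshold here is an assumption for illustration.

from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    lead_vehicle_dist_m: float  # distance to the car ahead
    speed_limit_mps: float

def toy_planner(s: Scenario) -> float:
    """Stub planner: returns a target speed in m/s."""
    if s.lead_vehicle_dist_m < 10.0:
        # Slow down when close to a lead vehicle.
        return min(s.speed_limit_mps, s.lead_vehicle_dist_m)
    return s.speed_limit_mps

# Fixed scenarios paired with behavioral bounds. The suite is rerun against
# every retrained network, so a regression surfaces before any real-world
# validation miles are spent on it.
REGRESSION_SUITE = [
    (Scenario("close_lead", 5.0, 30.0), lambda v: v <= 5.0),
    (Scenario("open_road", 100.0, 30.0), lambda v: abs(v - 30.0) < 1e-9),
]

def run_suite(planner) -> list[str]:
    """Return the names of scenarios whose behavior is out of bounds."""
    return [s.name for s, ok in REGRESSION_SUITE if not ok(planner(s))]

print(run_suite(toy_planner))  # [] -> no regressions
```

The key property is that the tests constrain observable behavior (output bounds per scenario) rather than internal logic, which is all you can assert about a retrained black-box network.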

The FSD team is now very close to achieving “Project Vacation”, the nearly fully-automated data engine that Karpathy spoke of many years ago, where improvements require much less effort by the team and the whole process gets more automated.

I assume the team will then devote more focus to highway and parking lot domains.
 
It is now official: 🤣 12.4 🔥 will be able to move to 24.2.x and STILL be behind.🤪

[Attached screenshot: Screenshot 2024-03-14 at 5.29.11 AM.png]
 