Welcome to Tesla Motors Club

Poll: Is current hardware sufficient for full self driving?


Poll closed; 96 total voters.
Is this FSD chip 4.1, 4.5, 5.0, or dojo? Article says it is expected to be fifth generation.
Tesla plans to build its next generation of Full Self-Driving (FSD) chips on a 3-nanometer process.
Tesla already has several EV-related chips produced by TSMC, such as the "D1" supercomputer chip, which uses TSMC's 7nm process. According to supply-chain sources, Tesla originally had its FSD chip fabricated on Samsung's 14nm process. After successive iterations, and after weighing product quality and production scale, Tesla moved the HW 4.0 self-driving chip to TSMC for production on a 5nm-family process.
The 3-nanometer collaboration is expected to target Tesla's fifth-generation chip. As self-driving becomes the trend, chips will become the core of electric vehicles, and handing production to TSMC, which offers higher yield and stability, is seen as the best choice.
Article about the article:
 
No Tesla currently produced today will ever be capable of self driving. I would bet my net worth on this.
.....but Teslas do self-drive now, and can do it the vast majority of the time. You need to qualify your statement for it to be correct. So let me help you:

No Tesla currently produced today will ever be capable of L4 or higher FULL Self Driving.
 
.....but Teslas do self-drive now, and can do it the vast majority of the time. You need to qualify your statement for it to be correct. So let me help you:

No Tesla currently produced today will ever be capable of L4 or higher FULL Self Driving.
What Teslas do right now is not "self driving" by any reasonable interpretation of the term.
 
What Teslas do right now is not "self driving" by any reasonable interpretation of the term.
"Level 3 vehicles have “environmental detection” capabilities and can make informed decisions for themselves, such as accelerating past a slow-moving vehicle. But―they still require human override. The driver must remain alert and ready to take control if the system is unable to execute the task."

I would call this "self driving."
 
What Teslas do right now is not "self driving" by any reasonable interpretation of the term.
Sure it is: when engaged, the car IS self-driving at L2. "Reasonable interpretation" is subjective; using the word FULL and specifying the SAE level is objective. Also, Tesla could offer L3 Full Self Driving right now if it wanted to do something like what Mercedes is doing. So Tesla may give in (or give up on L4 or higher) and we may get an L3 Full Self Driving system at some point.
 
"Level 3 vehicles have “environmental detection” capabilities and can make informed decisions for themselves, such as accelerating past a slow-moving vehicle. But―they still require human override. The driver must remain alert and ready to take control if the system is unable to execute the task."

I would call this "self driving."

Current Teslas are not Level 3, they are Level 2.

Level 2 is "driver assistance": the human in the driver's seat is responsible at all times.
 
As mentioned, Tesla has not achieved L3 so there is no current Tesla that is “self driving”. Tesla requires hands on wheel and full attention at all times, even with FSD beta. The driver is fully responsible for controlling the car even when the system is active. It may fail and give the “TAKE CONTROL IMMEDIATELY” notice without warning.

L3 does not require hands on the wheel or active attention. The liability is taken on by the automaker while L3 is active. It will warn the driver well in advance when they need to take over.

Personally I wouldn’t consider a car “self driving” until L4 or L5, where no driver attention or intervention is required.
 
Yeah. People struggle with understanding what the OEDR is, and the J3016 taxonomy isn't aimed at consumers.

@mikez10288 I recommend this, old but accurate, article:

The defining difference between Level 2 and Level 3 comes with OEDR, or “Object and Event Detection and Response.” OEDR is the geek term for doing what any driver does: keep an eye on any factors that might affect driving, especially safety, and deal with it (a core aspect of the “Dynamic Driving Task”). OEDR is defined in detail in the standard. Table 1 in SAE J3016 states that for Level 2, “the driver completes the OEDR subtask and supervises the driving automation system,” immediately taking control when conditions warrant. For Level 3, OEDR is handled by the Automated Driving System (ADS); the driver retains responsibility to be “receptive to ADS-issued requests to intervene.” The flow chart below from SAE J3016 is a great way to negotiate the “levels maze.”

When a Level 3 (ADS) system is driving, you don't have to perform the OEDR, but you need to start doing it (and take over the driving task after a completed handover) when the system asks you to. During the handover process (typically 10+ seconds), the system is still driving. The only piece of L3 regulation that I am aware of (UNECE R157) outlines a minimum 10-second handover period.

L3: "Hey human, please start to drive mentally because I am about to leave my operational domain"
Human: Starts to perform the OEDR
L3: Beep. Take over. (optional step, to be repeated until takeover).
Human: OK.

In a Level 2 (ADAS) you ALWAYS need to perform the OEDR (look at the road and be prepared to handle anything), because the system isn't performing the full OEDR and the system can abort at any time and immediately.

L2: Take over immediately. Failure.
Human: Oh sh!t.
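To make the division of responsibility concrete, here is a minimal Python sketch (my own illustration, not anything defined by J3016 itself; the names `oedr_owner` and `must_be_receptive` are made up for this example) of who owns the OEDR at each level:

```python
from enum import Enum

class SAELevel(Enum):
    """SAE J3016 driving-automation levels (simplified)."""
    L0 = 0  # no automation
    L1 = 1  # driver assistance (steering OR speed)
    L2 = 2  # partial automation (steering AND speed; driver supervises)
    L3 = 3  # conditional automation (ADS performs OEDR within its ODD)
    L4 = 4  # high automation (no takeover request needed inside the ODD)
    L5 = 5  # full automation (no ODD limits)

def oedr_owner(level: SAELevel, ads_engaged: bool = True) -> str:
    """Who performs Object and Event Detection and Response while driving."""
    if not ads_engaged or level in (SAELevel.L0, SAELevel.L1, SAELevel.L2):
        # At L2 and below, the human driver always performs the full OEDR.
        return "driver"
    # At L3 and above, the Automated Driving System performs the OEDR.
    return "ADS"

def must_be_receptive(level: SAELevel) -> bool:
    """True if the human must remain receptive to ADS-issued takeover requests."""
    return level == SAELevel.L3
```

Note that `oedr_owner(SAELevel.L3)` returns `"ADS"` while `must_be_receptive(SAELevel.L3)` is still `True`: that pairing is exactly the L3 dialogue above, and the reason L2 ("OEDR owner: driver, always") feels so different in practice.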
 
[Attached image: slide comparing the SAE driving-automation levels]
 
That's Phil Koopman's slide and not the spec but pretty good.

I'm not entirely sure about the "Vehicle Failure" row; I don't think you can categorically say "driver". That will depend on the L3 implementation and the failure. It can't be a software failure of the ADS anyhow; that's likely covered by the ADS manufacturer's liability. A flat tyre probably triggers a handover/MRM too, and so on. But again, it will depend on the OEM.

L3 is known as "eyes off". In an L4 you can sleep, because there is no handover procedure required.
 
That's Phil Koopman's slide and not the spec but pretty good.....
The main point was showing that at L3 the ADS is 100% responsible for the OEDR. Tesla's system does a great overall job with OEDR, but of course it is L2 and you must still be vigilant. Tesla's OEDR could work in an L3 system if it used something like the extremely limited ODD Mercedes uses. But what is the point in offering L3 to <5% of drivers for <5% of their driving time?
 
To me, the Cruise accident where the car dragged a person is proof that current hardware will never achieve FSD. Supervised FSD, yes. Maybe they can squeeze it out on freeways. But without being able to sense what is at the front bumper, it will be very, very difficult to achieve FSD on current hardware. You'd need a front camera or some other mechanism to know if the car is dragging someone. What do you think?
It is strange that you draw this conclusion about the hardware. The hardware is mostly fine IMHO; it is the software that needs to work correctly with the hardware. I have had FSD active on my car for over 2 years now, and the progress is actually very significant without any hardware changes.
 
This same poll could be taken regarding humans driving high-powered vehicles on existing streets.

They currently get into millions of accidents every year. Often accidents are fatal to both drivers and passengers.
Human drivers are often distracted by other humans: baby on board, a spouse giving instructions, a missed exit, looking at a cell phone, being tired/sleepy/bored, fighting with others in the car, worrying about personal events, watching for police, disobeying traffic laws, worrying about nearby drivers, not slowing for conditions, driving while ill/drunk/medicated/on recreational drugs/emotional. Humans cause most of the vehicle accidents around the world. Perhaps they should also be labeled unsuitable to pilot personal transportation....

It would be safer to just ban cars. Everyone walks, rides a bicycle, or takes a train, bus, streetcar, or airplane. While these forms of transportation are much safer per mile traveled, they also have accidents and are inherently flawed.

The current situation is that a human can always simply drive the car themselves, taking into consideration their current condition, experience, and training to decide whether to drive themselves or to engage any driver aids.
 
It is strange that you draw this conclusion about the hardware. The hardware is mostly fine IMHO; it is the software that needs to work correctly with the hardware. I have had FSD active on my car for over 2 years now, and the progress is actually very significant without any hardware changes.
Does your car have the ability to see what is directly in front of its front bumper? If not, do you not perceive that as an issue? Should the car be able to see what is at its front bumper? Do we not care if we are dragging people around? Are you content to wait for superhuman software that will arrive some future decade? Current state-of-the-art AI requires lots of processing power and memory, much more than the current hardware can provide.
 
Does your car have the ability to see what is directly in front of its front bumper? If not, do you not perceive that as an issue? Should the car be able to see what is at its front bumper? Do we not care if we are dragging people around? Are you content to wait for superhuman software that will arrive some future decade? Current state-of-the-art AI requires lots of processing power and memory, much more than the current hardware can provide.
1) My cars still have ultrasonic sensors, so they can "see" in front of their bumpers.
2) I am not sure why that would be an issue, except when someone crawls under the bumper while the normal cameras are not working.
3) Just a few years back, we humans driving normal cars with no sensors weren't that paranoid about the idea that a little gnome was hiding under the bumper. So why would that suddenly become such a big issue?
4) People are nasty, a little dragging may help. More seriously, I don't believe Tesla or GM Cruise have insufficient hardware to detect actual dragging of someone or something.
5) I don't want to wait for super human software, I would like instead that people keep being aware the cars, self-driving or not, will remain forever the sources of elevated danger and we have to pay attention to them both when riding and when being around them.
6) I am not working in self-driving AI or hardware development, so I can't say for certain. But if my 2019 car can drive me to work and back (17 miles one way), sometimes with no interventions at all, I am pretty sure that a 2024 computer is even better and more powerful. Perhaps, if needed, older cars could be retrofitted with newer computers or even newer cameras, but that would be an upgrade rather than a totally new system.