Welcome to Tesla Motors Club

Speculation: HW4 at Tesla AI Day 2 and already being installed

That does not bode well for Tesla's robotaxi efforts.
Or it means they're really trying to finalize a product (versus showing something in its infancy like the bot).

Really I don't see any point in releasing HW4 until they can be sure that it will allow driverless operation. Of course I've always been a huge skeptic of "door to door" supervised automated driving. I think you actually need a certain error rate to keep people attentive (or maybe some limitations that require frequent driver input).
 
I was driving on FSD. I woke up and wasn't aware where I was. I'm disappointed I'm able to fall asleep with my eyes open, facing forward with a hand on the wheel. I think I tuned out for less than 10 seconds.
 
For what it's worth, Dr. Know-It-All speculated Optimus is running on a HW4 chip in his latest video. He noted the board only had one chip, instead of the usual two, and when he asked a Tesla engineer if they were using a new chip for it, they gave a somewhat surprised and guilty look, but no answer.
 
Or it means they're really trying to finalize a product (versus showing something in its infancy like the bot).

That's never stopped Tesla in the past. And why would they be OK with showing the bot in its infancy, but not a robotaxi prototype that is also in its infancy?

Really I don't see any point in releasing HW4 until they can be sure that it will allow driverless operation. Of course I've always been a huge skeptic of "door to door" supervised automated driving. I think you actually need a certain error rate to keep people attentive (or maybe some limitations that require frequent driver input).

First, Tesla needs to be clear about their goal. The sensors and hardware will depend on whether the vehicle is designed to be a driverless robotaxi or a "door to door" supervised driving system, and that makes a big difference. Tesla needs to decide which type of automated driving they are designing for and then tailor the sensors and hardware to it. If they are doing L2 door-to-door, Tesla needs to update the cameras, add cameras that can see fast cross traffic, and upgrade the computer to handle the extra processing with more redundancy. They also need proper, reliable camera-based driver monitoring. If they are doing a driverless robotaxi, they need 360-degree HD cameras, 360-degree HD lidar, 360-degree HD radar, and a computer that can handle the extra processing with enough redundancy.
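To make the contrast concrete, here's a rough sketch of the two sensor/compute targets described above. All of the counts and flags are hypothetical placeholders for illustration, not Tesla's actual specs or plans:

```python
# Illustrative comparison of the two design goals discussed above.
# Every number here is a made-up placeholder, not a real Tesla spec.

from dataclasses import dataclass


@dataclass
class SensorSuite:
    goal: str
    cameras: int
    lidar: int = 0
    radar: int = 0
    driver_monitoring: bool = False   # camera-based attention monitoring
    redundant_compute: bool = False


l2_door_to_door = SensorSuite(
    goal="L2 supervised door-to-door",
    cameras=10,               # hypothetical: extra cameras for fast cross traffic
    driver_monitoring=True,   # essential when a human is the fallback
    redundant_compute=True,
)

robotaxi = SensorSuite(
    goal="driverless robotaxi",
    cameras=12,               # hypothetical 360-degree HD camera coverage
    lidar=4,                  # hypothetical 360-degree HD lidar
    radar=5,                  # hypothetical 360-degree HD radar
    redundant_compute=True,   # no human fallback, so compute must be redundant
)

for suite in (l2_door_to_door, robotaxi):
    print(suite)
```

The point of the sketch is just that the two goals diverge: the L2 build spends on driver monitoring, while the robotaxi build spends on redundant 360-degree sensing.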
 
I was driving on FSD. I woke up and wasn't aware where I was. I'm disappointed I'm able to fall asleep with my eyes open, facing forward with a hand on the wheel. I think I tuned out for less than 10 seconds.
I'm disappointed in that too, but how would HW4 or HW3 help? If you're sleeping with your eyes open, facing forward, with a hand on the wheel for 10 seconds, how would any car company's system notice?

Glad you woke up alive.
 
I'm disappointed in that too, but how would HW4 or HW3 help? If you're sleeping with your eyes open for 10 seconds with a hand on the wheel, how would any car company's system notice?

I believe many camera-based driver monitoring systems can detect drowsiness or signs of sleepiness. They can detect when our head tilts down because we are tired, or when we shake our head to try to stay awake. So if @DanCar showed any signs of tiredness, a good camera-based driver monitoring system could have alerted him to pay attention and maybe prevented him from falling asleep for those few seconds.
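For what it's worth, one common drowsiness signal in the research literature is PERCLOS (the percentage of frames in a rolling window where the eyes are mostly closed). Here's a toy sketch of how that kind of check might work; the window size, thresholds, and the idea that a face tracker hands you an "eye openness" score per frame are all assumptions for illustration, not any particular carmaker's system:

```python
# Toy sketch of a PERCLOS-style drowsiness check.
# Assumes an upstream face tracker supplies eye_openness in [0.0, 1.0]
# per video frame. All thresholds are illustrative, not from a real system.

from collections import deque


class DrowsinessMonitor:
    def __init__(self, window_frames=300, perclos_threshold=0.3):
        # Rolling window of booleans: was the eye "closed" this frame?
        self.frames = deque(maxlen=window_frames)
        self.threshold = perclos_threshold

    def update(self, eye_openness):
        """Feed one frame's eye-openness score; returns True if an alert should fire."""
        self.frames.append(eye_openness < 0.2)  # count frame as "closed"
        return self.perclos() > self.threshold

    def perclos(self):
        """Fraction of recent frames with eyes closed."""
        if not self.frames:
            return 0.0
        return sum(self.frames) / len(self.frames)


monitor = DrowsinessMonitor(window_frames=10, perclos_threshold=0.3)
alerts = [monitor.update(v) for v in [1.0] * 6 + [0.1] * 4]
print(alerts[-1])  # True: 4 of the last 10 frames were "closed" (40% > 30%)
```

Note the limitation this thread is circling: a PERCLOS-style check keys on eye closure, so it would do nothing for someone asleep with their eyes open, which is exactly the hard case here.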
 
I believe many camera-based driver monitoring systems can detect drowsiness or signs of sleepiness. They can detect when our head tilts down because we are tired, or when we shake our head to try to stay awake. So if @DanCar showed any signs of tiredness, a good camera-based driver monitoring system could have alerted him to pay attention and maybe prevented him from falling asleep for those few seconds.
Would they catch the zoned-out thousand-yard stare? I guess only if your eyes aren't moving enough. Maybe we need more biometrics (heart rate, O2 saturation) to tell whether we're alert or sleeping upright.
L2/L3 systems need better monitoring, with less delay between attention checks.
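One way to shrink that delay without nagging attentive drivers constantly is to scale the check interval to how confident the monitoring is that the driver is alert. A toy sketch, with made-up numbers:

```python
# Toy escalation policy: the interval between attention checks shrinks
# as the monitoring system's confidence in driver alertness drops.
# The 30s/2s bounds are hypothetical, not from any real system.

def nag_interval_seconds(alertness: float) -> float:
    """alertness: 0.0 (apparently asleep) to 1.0 (clearly attentive)."""
    base, minimum = 30.0, 2.0          # hypothetical relaxed/urgent bounds
    return max(minimum, base * alertness)


print(nag_interval_seconds(1.0))    # 30.0: attentive driver, relaxed checks
print(nag_interval_seconds(0.05))   # 2.0: near-asleep, immediate escalation
```

The design point is just that a fixed nag timer trades one problem (annoyance) against the other (a ten-second gap is plenty of time to drift off); tying the cadence to an alertness estimate lets the system be lenient and fast to escalate at the same time.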
 