
Waymo’s “commercial” ride-hailing service is... not yet what I hoped

For example, if you are at a Michigan U-turn with cars going 70 mph and you are making a LEFT at the STOP sign (green/red path).
Your completely separate vision system would then, at that moment, accept information from the right corner radar and maybe the right lidar to detect fast-approaching cars.

Sounds great in theory, but I'm not sure how it improves the system? Human drivers do exactly this because they have limited "sensors", and this can mean they are sometimes caught out by unpredictable scenarios. But with a full, multi-spectrum 360 sensor suite, and plenty of processing power, where is the benefit?

Might seem counter-intuitive, but taking this approach becomes more of a cost than a benefit. The cost is there because someone has to pre-define where the car should "focus its attention" in any given scenario. Further, the car has to be able to recognise when it is in a particular scenario in the first place.

It is likely that this knowledge would be stored in the HD map, which limits the benefit to mapped areas. A true L5 car would need a backup policy for when an HD map is not available. The backup policy is likely to be to use the full 360 sensor suite all the time... and if you have to do that, why take on the additional cost of manual scenario planning?
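To make the trade-off concrete, here is a minimal sketch of the policy being described: use the HD map's pre-defined "attention" hints where they exist, and fall back to the full 360-degree suite everywhere else. All sensor and location names here are invented for illustration, not taken from any real stack.

```python
# Hypothetical attention policy: HD-map hints where mapped, full suite otherwise.

FULL_SUITE = frozenset({
    "front_cam", "rear_cam", "left_cam", "right_cam",
    "front_radar", "corner_radars", "lidar",
})

def sensors_to_attend(hd_map_hints: dict, location: str) -> frozenset:
    """Pick which sensors get full processing at this location."""
    hints = hd_map_hints.get(location)  # None when the area is unmapped
    if hints is None:
        # Backup policy: no map coverage, so attend to everything --
        # which is exactly why the manual scenario planning may not pay off.
        return FULL_SUITE
    # Mapped scenario: focus on what the map flags, plus a safe minimum.
    return frozenset(hints) | {"front_cam", "front_radar"}
```

On an unmapped road, `sensors_to_attend({}, "rural_road")` returns the full suite anyway, so the hand-built scenario knowledge only ever saves work inside mapped areas.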

Maybe this is exposing a weakness of their system: they find it hard to define the trust relationships between the various sensors in a way that is flexible enough to manage real-world scenarios.
 
Sounds great in theory, but I'm not sure how it improves the system? Human drivers do exactly this because they have limited "sensors", and this can mean they are sometimes caught out by unpredictable scenarios. But with a full, multi-spectrum 360 sensor suite, and plenty of processing power, where is the benefit?

I wonder if another way to look at it would be: the particular scenario would define the trust relationships in a dynamic manner. Where the maps show a situation in which fast-approaching objects may appear from the sides, perhaps from behind obstacles, placing more trust in radar than in vision would be warranted, for example.
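A toy sketch of that idea, with per-scenario trust weights feeding a simple weighted fusion. The scenario names and numbers are made up for the example; no real system is being described.

```python
# Illustrative dynamic trust: map-annotated scenarios reweight the sensors.

DEFAULT_TRUST = {"vision": 0.5, "radar": 0.3, "lidar": 0.2}

# Map annotation says fast cross-traffic can appear from behind an
# obstacle: lean on radar, which measures closing speed directly.
SCENARIO_TRUST = {
    "occluded_cross_traffic": {"vision": 0.2, "radar": 0.6, "lidar": 0.2},
}

def fused_speed_estimate(scenario: str, estimates: dict) -> float:
    """Weighted average of per-sensor speed estimates (m/s)."""
    trust = SCENARIO_TRUST.get(scenario, DEFAULT_TRUST)
    return sum(trust[s] * v for s, v in estimates.items())
```

The interesting (and hard) part is exactly what the earlier post flagged: someone has to decide, per scenario, what those weights should be.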

What I find most compelling about @Bladerskb's description of MobilEye's direction is the goal of independent vision and lidar/radar autonomy. Loss of vision in some direction, towards the sides and the back, is one great unanswered question in the Tesla sensor suite. The ability to offset such a loss by driving on lidar/radar alone, even for a short while, sounds very useful?
 
What I find most compelling about @Bladerskb's description of MobilEye's direction is the goal of independent vision and lidar/radar autonomy. Loss of vision in some direction, towards the sides and the back, is one great unanswered question in the Tesla sensor suite. The ability to offset such a loss by driving on lidar/radar alone, even for a short while, sounds very useful?

I agree - but this is redundancy, rather than policy.
 
Show me some objective reviews of the vehicles stating their autonomous capability surpasses Autopilot.

You are talking of vehicles; I am talking of MobilEye's chips. How car companies use those chips is certainly another question. I guess for those chip reviews we'd have to venture into specialist embedded-developer forums. The closest such opinion here, I think, is from @Bladerskb, who seems to work in the field. I am just saying MobilEye ships these discussed features they demo in production parts.
 
Show me some objective reviews of the vehicles stating their autonomous capability surpasses Autopilot.

The current Autopilot is a direction other car makers decided not to go. The problem with it is that it gives control back to the driver when it fails, i.e. disengages. Studies showed that it takes 6 seconds on average (from 2 to 20 seconds) for a person who didn't pay attention to understand the traffic situation and take the right actions. It is a poor choice by Tesla to release something to the public that is not ready and compromises safety. They are having owners test the car. Owners risk their own lives. If there is an accident, the owner pays for it. This is very much unethical and unfair to competitors.
 
The current Autopilot is a direction other car makers decided not to go.
This is called sour grapes, I think.

for a person who didn't pay attention
That's not describing people who are using Autopilot/Autosteer. By that logic, all cars should be banned because the driver might get distracted (very easy nowadays with cellphones and whatnot) and get into an accident.
 
This is called sour grapes, I think.


That's not describing people who are using Autopilot/Autosteer. By that logic, all cars should be banned because the driver might get distracted (very easy nowadays with cellphones and whatnot) and get into an accident.

I work in this industry. I sat in an "autopilot" Audi in 2014. Now the current A8 has all the hardware for a Level 3 system in the production car. But it's not enabled. Why? Because it cannot solve certain situations. Just like Tesla can't. I think this is called responsibility.

Why are autopilot users better? You don't hear about the accidents?
 
Why are autopilot users better? You don't hear about the accidents?
Did you not hear about all those accidents in cars without Autopilot?
People don't do what they are told and get in trouble, news at 11. This happens in all sorts of cars including Teslas.

I don't know why Audi does not enable their "autopilot", but I know they enable cruise control, and the fact that it cannot solve certain situations did not stop them. And there are lengthy disclosures in the manual about when not to use it and what it cannot handle.
 
The current Autopilot is a direction other car makers decided not to go. The problem with it is that it gives control back to the driver when it fails, i.e. disengages. Studies showed that it takes 6 seconds on average (from 2 to 20 seconds) for a person who didn't pay attention to understand the traffic situation and take the right actions. It is a poor choice by Tesla to release something to the public that is not ready and compromises safety. They are having owners test the car. Owners risk their own lives. If there is an accident, the owner pays for it. This is very much unethical and unfair to competitors.

That comment is disingenuous. All auto companies sell cars to customers knowing they will get into accidents when the driver does not pay attention. Owners take full responsibility when that happens. Operating an Autopilot-equipped car is no different from operating an ordinary car in this regard. There is nothing unethical or unfair to competitors, who are not able to build the same thing, not unwilling to.

What is unethical, actually criminal, is to put cheat devices in your cars so you can sell a lot of them regardless of how much toxic NOx gas the cars emit, poisoning the owner and every person around the car.
 
The current Autopilot is a direction other car makers decided not to go. The problem with it is that it gives control back to the driver when it fails, i.e. disengages. Studies showed that it takes 6 seconds on average (from 2 to 20 seconds) for a person who didn't pay attention to understand the traffic situation and take the right actions. It is a poor choice by Tesla to release something to the public that is not ready and compromises safety. They are having owners test the car. Owners risk their own lives. If there is an accident, the owner pays for it. This is very much unethical and unfair to competitors.

Teslas have fewer accidents when Autopilot is enabled. Are you pro accidents?
 
This system, which is not enabled yet, is more advanced than Level 2 systems such as Tesla's Autopilot and Cadillac's Super Cruise because it does not require the driver to pay attention to the car's surroundings when activated.

If/when Traffic Jam Pilot is activated in Germany, it will only work on divided highways at speeds up to 60 km/h (37 mph), and will merely follow the car ahead of it, just like Traffic Aware Cruise Control (TACC). It will not automatically change lanes, or do anything more advanced than simply drive forward within a single lane at speeds up to 60 km/h.
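The operating constraints above are narrow enough to state in a couple of lines. This is a toy encoding of the reported restrictions (divided highway only, up to 60 km/h), purely for illustration, not Audi's actual activation logic.

```python
# Toy gate for the reported Traffic Jam Pilot operational design domain.

def traffic_jam_pilot_allowed(on_divided_highway: bool, speed_kmh: float) -> bool:
    """True only on a divided highway at 0-60 km/h."""
    return on_divided_highway and 0.0 <= speed_kmh <= 60.0
```

So at 55 km/h in a highway jam the gate opens; at 80 km/h, or on any undivided road, it does not, which is what makes the comparison with Navigate on Autopilot's much broader domain so slippery.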

So, is Traffic Jam Pilot more advanced than Navigate on Autopilot? Apparently different people have different opinions on this. I find automatic lane changes more impressive than slow TACC, even if the driver’s eyes are off the road.

A Tesla or a Cadillac can already physically do everything a 2019 Audi A8 will be able to do. To me, it’s what the car is physically capable of that is impressive.

Technically, any Waymo or Cruise test vehicle with a safety driver is Level 2, since a driver is there monitoring at all times, ready to take over at any moment. Yet these test vehicles are more impressive to me than the Level 3 Audi, because of the physical manoeuvres they can pull off.
 
It can go into Level 3 mode at up to 82 mph on the highway. But as mentioned, it isn't enabled.
The issue is false radar detections. You can clearly see that in the accidents and phantom braking of Teslas.
My Tesla Model X is able to fly when you make the falcon wing doors flap real fast. But it's not enabled (hinge wearing issues?), so I cannot claim it's a flying car just yet ;)