Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Sydney - New M8 Tunnel weirdness

GrimRe: I take on board your comments and I must say I haven't seen the actual Tesla figures with the small print that probably goes with them.
I still stand by what I said, and bear in mind that I am projecting forward to the supposed Level 5 (which I do not think will ever be achievable in current vehicles).
Let me ask you: do the Tesla statistics include those miles where driver intervention has taken place? If they do, then my comment stands. It's like saying there were no accidents if you don't take into account the accidents prevented by the driver!
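A toy calculation makes that selection-bias point concrete. Every number below is hypothetical, invented purely to illustrate the argument - these are not actual Tesla figures:

```python
# Hypothetical illustration: if driver takeovers prevent crashes but those
# near-misses are never counted, the headline "accidents per mile on
# Autopilot" understates the system's unassisted risk.

def accidents_per_million_miles(accidents, miles):
    return accidents / miles * 1_000_000

ap_miles = 100_000_000     # assumed miles driven with Autopilot engaged
ap_accidents = 50          # assumed crashes while engaged
takeover_saves = 200       # assumed crashes prevented only by driver takeover

headline_rate = accidents_per_million_miles(ap_accidents, ap_miles)
unassisted_rate = accidents_per_million_miles(ap_accidents + takeover_saves, ap_miles)

print(f"headline rate:  {headline_rate:.2f} per million miles")   # 0.50
print(f"counting saves: {unassisted_rate:.2f} per million miles")  # 2.50
```

With these made-up numbers the headline rate looks five times better than the rate you would see if every driver-prevented crash were counted against the system.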
I've just driven back from Newcastle to Hawks Nest using Navigate on Autopilot on the freeway and had to submit 3 or 4 "bug reports" due to the car attempting to take an off-ramp before swerving back onto the freeway, plus several phantom braking events. One of these was quite sudden and severe as we approached the turn-off to Medowie, with a message saying the car was stopping for something - it has done that consistently both north- and southbound at the same point.
(I don't count the driving from South Newcastle to the freeway because it isn't technically usable there although I did put it in several times and had a few scares before I abandoned it.)
I think the way that speed limits are so often incorrect gives a clue to the fundamental problem: the cars are only as good as the data sent to them. They are obviously very clever, but nowhere near intelligent in the sense of being near Level 5.
I look forward to the complete rewrite we are being promised, and whether it actually makes any fundamental progress towards level 5.
 

You are correct in that, in either case, there is no metric for accidents prevented. The times a human takes over control are obviously relatively frequent, but therein lies my point. If you are using AP correctly, your cognitive load shifts from mundane lane keeping, collision prevention, etc. to higher-level tasks such as maintaining AP vigilance and stepping in when necessary.

Is that safer than driving totally manually? It might just be, and the data seems to suggest it. In other words, because AP does make mistakes, you as a driver are more aware of the environmental situations that would cause an AP malfunction and step in when necessary. Contrast this with driver fatigue, where you stop paying attention to lower-level tasks like stopped traffic in your lane (which AP will reliably react to).

To your other points:

  • Tesla don’t subscribe to the SAE levelling system of autonomy. They have their own definitions in place, which is very confusing. Basically, to them, "Level 5" means the car attempting a manoeuvre (successful or not), as this shows the fundamental neural net is in place for the SAE version of Level 5. Once the car can attempt a manoeuvre and the underlying architecture is scalable, it's just a matter of improving reliability through data and neural net training.
  • It will be 6-12 months before we see HW3 FSD attempting most tasks in city-level driving, and 3-5 years before it's reliable enough for SAE Level 5.
  • Vision-based speed sign detection is trivial but not implemented due to an egregious MobilEye patent on the same. Tesla have a clever way around this, so eventually AP will read and comprehend road signs (not just speed limits).
  • The fundamental rewrite will underwhelm at first but was necessary to improve scalability in the future. 4D labelling is the only way to achieve complex planning.
  • If RoboTaxi launches, it will be geofenced to certain routes that include only manoeuvres it can handle.
 
>>They have their own definitions in place..........<<
Wow! Is that "official"? The BMW definition of 5 includes:

"Unlike levels 3 and 4, the “Full Automation” of level 5 is where true autonomous driving becomes a reality: Drivers don’t need to be fit to drive and don’t even need to have a license. The car performs any and all driving tasks – there isn’t even a cockpit. Therefore every person in the car becomes a passenger, opening up new mobility possibilities for people with disabilities, for example." (My underlines)

If Tesla's official definition of five is significantly different from the agreed formula one wonders what their definition of an "accident" is! Seriously, does it include the times the autopilot has to be overridden?

As regards cognitive loading, it's very well established that monitoring automation is more liable to result in misreading or mistakes than actually being in manual control over longish periods.

Don't get me wrong - I would love to see full autonomy available in my lifetime even though I enjoy driving: I just don't see that any of the present attempts are going to get there except in limited areas.
 
>>If Tesla's official definition of five is significantly different from the agreed formula one wonders what their definition of an "accident" is! Seriously, does it include the times the autopilot has to be overridden?<<

Not to get too "conspiracy", but how many autonomous miles do you think Tesla registered with the state of California in 2019? It was 12. Basically nothing. This is because Autopilot is a Level 2 system (by the SAE standards), and so is FSD. It appears that they will slowly but surely add more functionality, eventually becoming a fully loaded Level 2 system, as a way of developing Level 5 without explicitly saying so.

It's all part of the same thing. They aren't developing an autonomous car as a standalone engineering effort; they are iterating over and over on the existing platform and then making fundamental changes when and if the iterations show they are needed.

>>As regards cognitive loading, it's very well established that monitoring automation is more liable to result in misreading or mistakes than actually being in manual control over longish periods.<<

I've read the 2019 Strapel study (if that's what you are referring to) on this topic, and it only proves my point. The perceived cognitive load is reduced when using AP on a highway; however, the actual cognitive load is marginally higher. What the study doesn't present is how much this impacts fatigue. My assertion is that it's better to be cognitively loaded with higher-level tasks than lower-level ones.

>>Don't get me wrong - I would love to see full autonomy available in my lifetime even though I enjoy driving: I just don't see that any of the present attempts are going to get there except in limited areas.<<

The biggest question here is: will HW3 get Tesla to the point where they can operate a RoboTaxi fleet? My intuition says yes, as there appears to be enough headroom to run a sufficiently trained neural net to perform 99% of driving tasks with a high degree of confidence.
 
Additional potential causes for sudden slowdowns:
  • Route passes over or under a roadway with a lower speed limit. Your car fantasizes life on the road not taken and adopts the lower speed to enhance the fantasy. If I understand Google Maps' depiction of the M8 in Sydney correctly, this is the likely cause.
Subsequent items apply to Autopilot 2 and above.
  • Traffic in adjacent lane is slower. Autopilot figures they know something it doesn’t, succumbs to peer pressure.
  • Vehicle on the side appears to be preparing to enter or cross the road.
  • Vehicle on exit ramp (slipway?) slows down. Autopilot fears that driver may suddenly regret the intended departure and veer into the lane.
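The first cause in that list (the car matching its position to the road passing over or under it) can be sketched as a naive nearest-segment matcher. The road names, speed limits, and coordinates below are all hypothetical, chosen only to show how ignoring elevation produces the wrong limit:

```python
import math

# Hypothetical map data: two roads that cross at different grades but
# nearly the same (x, y) position, as at an over/underpass.
roads = [
    {"name": "M8 tunnel",      "speed_limit": 80, "x": 0.0, "y": 0.00},
    {"name": "surface street", "speed_limit": 50, "x": 0.0, "y": 0.02},
]

def match_road(car_x, car_y):
    """Naive 2D matcher: pick the nearest segment, ignoring elevation."""
    return min(roads, key=lambda r: math.hypot(r["x"] - car_x, r["y"] - car_y))

# A small GPS error nudges the fix toward the surface street, so the car
# "adopts" the 50 km/h limit while actually in the 80 km/h tunnel.
matched = match_road(0.0, 0.015)
print(matched["name"], matched["speed_limit"])  # surface street 50
```

A real map matcher would use heading, route context, and elevation, but the sketch shows why a tunnel directly under a slower surface road is exactly where this failure would appear.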
I used Enhanced Autopilot 2.0 for nearly the entire 400-mile (roughly 650 km) journey between my primary residence and vacation house. This includes the NY metro area and several other densely populated areas.

Navigate on Autopilot was enabled; my driving consisted mostly of scanning the way ahead and activating the turn indicator to initiate lane changes.

Tesla driving capabilities are not “Cruise and snooze”. They do take an enormous burden off the driver, allowing more focus on the road further ahead and closer potential risks.
 
>>I've read the 2019 Strapel study (if that's what you are referring to) on this topic, and it only proves my point. The perceived cognitive load is reduced when using AP on a highway; however, the actual cognitive load is marginally higher. What the study doesn't present is how much this impacts fatigue. My assertion is that it's better to be cognitively loaded with higher-level tasks than lower-level ones.<<

The comment you are referring to was more about effectiveness, not the burden or otherwise.

Humans are pretty good at doing things, not so good at monitoring repetitive things; computers are good at repetitive things and OK at monitoring manual actions, except in novel situations.

I think the next ten years will show whether full autonomy in the level 5 sense is legally possible even though I probably won't be around to see the answer!
 
Thinking about our exchanges with GrimRe, I suggest that a down-to-earth autonomy test - rather like the Turing test, actually - is whether the thing would be able to drive from A to B as well as an average driver. Given that an average driver makes misjudgements occasionally, I think that's fair?
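One way to make that "as well as an average driver" test concrete is a simple two-proportion comparison of misjudgement counts over matched A-to-B trips. The counts below are hypothetical, and this is only a sketch of the idea using a normal-approximation z statistic:

```python
import math

def two_proportion_z(errors_a, trips_a, errors_b, trips_b):
    """Normal-approximation z statistic comparing two misjudgement rates."""
    p_a, p_b = errors_a / trips_a, errors_b / trips_b
    pooled = (errors_a + errors_b) / (trips_a + trips_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / trips_a + 1 / trips_b))
    return (p_a - p_b) / se

# Hypothetical counts: misjudgements observed over matched A-to-B trips.
z = two_proportion_z(errors_a=12, trips_a=100,   # autonomous system
                     errors_b=8,  trips_b=100)   # average human driver
print(f"z = {z:.2f}")  # |z| < 1.96 -> no significant difference at the 5% level
```

Under this framing the car "passes" when its misjudgement rate is statistically indistinguishable from (or better than) the average driver's, which matches the point that occasional misjudgements alone shouldn't fail it.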
 
I’ve been keeping count this week of how many times I saved AP from pranging my car vs how many times it saved me. The count is currently 12-0. Basic real-world fact. Yesterday's heavy phantom brake almost resulted in a major rear-ender. I saved my car with the Tesla acceleration. I know AP isn't perfect and it's under development.
 
I drove on A/P from Tea Gardens to Raymond Terrace yesterday, and on the highway made something like 5 or 6 bug reports, including a near rear-ending because of the usual braking at the usual place, and a swerve off at an unintended exit followed by a swerve back on.
No-one would pass a learner driver test if they drove like this, and I cannot believe I'm the only one who has these experiences.
Oh, and on the way back it tried to stop for traffic control - on the freeway without even a turnoff!
 
You aren't the only one.