Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Autonomous Car = 0, Idiot Human = 1.

"As the Google [autonomous vehicle] proceeded through a green light at the El Camino Real intersection, its autonomous technology detected another vehicle traveling westbound on El Camino Real approaching the intersection at 30 mph and began to apply the Google AV’s brakes in anticipation that the other vehicle would run through the red light. The Google AV test driver then disengaged the autonomous technology and took manual control of the Google AV. Immediately thereafter, the other vehicle ran through the red light and collided with the right side of the Google AV at 30 mph. At the time of collision, the Google AV was traveling at 22 mph. The Google AV sustained substantial damage to its front and rear passenger doors. The other vehicle sustained significant damage to its front end."

Sounds like the human quite reasonably took over when the Google car started unexpectedly slowing at a green light. Then found out a few seconds later why the car was slowing. One of the reasons there needs to be better communication of what the car is seeing and thinking.
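As a side note, the "will that car stop in time?" judgment the report describes can be sketched in a few lines. This is purely an illustrative heuristic (the function name and the 3 m/s² comfort threshold are mine, not Google's actual logic): if the other car would need to brake harder than a driver comfortably does to stop at its line, treat it as a likely red-light runner and start slowing.

```python
def could_run_red(v_mps, dist_to_line_m, comfortable_decel=3.0):
    """Crude red-light-runner check (illustrative, not Google's code).

    If stopping at the line would take more deceleration than a driver
    comfortably uses (~3 m/s^2), assume the car won't stop.
    """
    if dist_to_line_m <= 0:
        return True                              # already past the stop line
    needed = v_mps ** 2 / (2 * dist_to_line_m)   # from v^2 = 2*a*d
    return needed > comfortable_decel

# 30 mph ~= 13.4 m/s: 20 m from the line needs ~4.5 m/s^2 -> likely runner
runner = could_run_red(13.4, 20.0)
# Same speed 50 m out only needs ~1.8 m/s^2 -> probably stopping normally
stopping = could_run_red(13.4, 50.0)
```

A display could surface exactly this: "cross traffic unlikely to stop, slowing" would have told the test driver why the car was braking at a green light.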
 

Yeah. This is one of the reasons Tesla is well ahead of the crowd in advanced Level 2 systems - they are continuing to refine a display showing what the car is thinking, while folks like Mercedes don't appear to realize they need that display yet.
 
"Even so, the anecdote is a bit unsettling when tech companies and automakers alike are rushing to get autonomous technology ready for the road"

This is a pretty dumb statement by the reporter. The autonomous car tried to avoid the accident; its driver took over, and as a result the car crashed.

That suggests we should get non-autonomous drivers off the road ASAP.
 
I'm trying to figure out how applying the brakes helps you when you're already in an intersection about to get hit. If the other car hits the passenger compartment instead of the rear of the car, braking could kill the passengers (including the driver). If it hits the rear, that's a much safer impact zone, crumple zone and all, and it tends to just spin you around while you keep going in the same direction, rather than popping your head off and smashing you to bits, which is what can happen when you hit your brakes and get T-boned.

Then again, if hitting the brakes means the car gets hit in the front, or misses being hit altogether, rather than where the passengers are, that might be preferable.
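That trade-off can be put in rough numbers. The crossing car arrives at the AV's lane at a more or less fixed time, so how hard the AV brakes determines where along its body the hit lands. A small kinematics sketch (all distances here are made up for illustration; only the 22 and 30 mph figures come from the report):

```python
MPH = 0.44704  # metres per second in one mph

def strike_point(v_av_mph, d_av_m, v_cross_mph, d_cross_m, decel=0.0):
    """How far behind the AV's front bumper the crossing car hits.

    v_av_mph:    AV speed entering the intersection
    d_av_m:      AV front bumper's distance to the conflict point (a guess)
    v_cross_mph: crossing car's speed
    d_cross_m:   crossing car's distance to the AV's lane (a guess)
    decel:       AV braking in m/s^2 (0 = coast through)

    A negative result means the crossing car passes in front of the AV.
    """
    v_av = v_av_mph * MPH
    v_cross = v_cross_mph * MPH
    t = d_cross_m / v_cross                   # when cross traffic arrives
    t_stop = v_av / decel if decel > 0 else float("inf")
    tt = min(t, t_stop)                       # can't decelerate below 0 m/s
    travel = v_av * tt - 0.5 * decel * tt * tt
    return travel - d_av_m                    # metres behind the front bumper

coast = strike_point(22, 6.0, 30, 13.0, decel=0.0)   # no braking
brake = strike_point(22, 6.0, 30, 13.0, decel=5.0)   # hard braking
```

With these made-up distances, coasting puts the hit roughly 3.5 m behind the bumper (around the rear doors), while hard braking moves it to about 1.2 m (the front door area), which is the point above: braking can shift the impact toward the passengers.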
 

I'm not sure I understand the question. As I understand it, the Google car tried to slow down because it saw the other driver getting ready to run the red light; then the Google test driver intervened and forced the car into the intersection, after which the other driver did run the red light and caused the accident.
 
That's how I understood it too. If that's the case then the official scorer needs to change the score to:
Autonomous Car: 1
Human Driver(s): 0
 
Yeah, I might have read that too quickly. It absolutely could have been the Google driver's error if he turned off automation and applied the throttle.

However, if he turned off automation and nailed the brake while swerving in the other car's direction of travel, he reacted correctly. The report doesn't say whether the AV system was going to stop before the intersection or just slow down.

On cars with ABS, apply the brakes hard until they pulse while steering toward the safest area. If there is no safe area, don't aim at fixed objects like trees or utility poles.
 
No, I scored it correctly. The AV car is probably destroyed; the AV car lost. The idiot who blew the red light won: he successfully hit the target.
Or, from a decision making perspective, the AV car was in the process of making the correct decision before it became a victim of two human errors. All depends upon how the end result is spun. :)
 

The software should have hit the horn quickly and repeatedly. It's actually why they put horns on cars. The handbook that says the horn is a replacement for the middle finger was recalled.
 
I've suggested a couple times that Tesla needs an AutoHonk feature as part of the Autopilot suite. :)

I was on a Caribbean island and the driver was honking all the time for apparently no reason. Then I realized the other cars were honking too, and they were smiling. They were just saying "Hi, I'm Here Again!" back and forth.

In California, you'd get your arse road raged by doing that.

I like the Island Folk better...