
Waymo

I don't think it's possible. Sure you could probably train the system to recognize any particular attack but humans are inventive. AI is very bad at recognizing things that have never happened before.
No, I'm just talking about getting new video from AI when there isn't enough training data available. Basically simulation, but using externally available AI so that the AI-generated images/video are different enough from what Tesla already has.
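Just to sketch the filtering part of that idea (totally illustrative, nothing to do with Tesla's actual pipeline; the embeddings would come from some video encoder that isn't shown here):

```python
# Illustrative only: keep an AI-generated clip for training only if its
# embedding is sufficiently far from everything already in the dataset.
# The embeddings themselves would come from some video encoder (not shown).

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b)

def is_novel(candidate: list[float],
             existing: list[list[float]],
             max_similarity: float = 0.85) -> bool:
    """True if the candidate clip's embedding is not too close to any
    existing clip, i.e. it actually adds coverage the existing data lacks."""
    return all(cosine_similarity(candidate, e) < max_similarity for e in existing)

# A clip nearly identical to an existing one gets rejected:
print(is_novel([1.0, 0.0], [[0.99, 0.14]]))   # False
print(is_novel([0.0, 1.0], [[0.99, 0.14]]))   # True
```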
 
(...) lidar would detect a pedestrian, the camera vision would detect the pattern of a stop sign at the exact same location as the pedestrian. So the perception could learn that this stop sign is not real because it is "inside" the pedestrian, and real stop signs are distinct from people. Additionally, if the stop sign is moving with the same velocity as the pedestrian, that is another clue that it is false, since stop signs don't move like pedestrians. (...)

But that's exactly what I'm talking about. Both of your detection scenarios could lead to a construction worker's hand-held stop sign not being picked up as a real stop sign.

If the stop sign's velocity matches the worker's, it's not a stop sign?
If the stop sign is held right next to the worker, it's not a stop sign?

That's why, IMO, a lot of this never works well without a LOT more effort from the government, such as publishing open data feeds on traffic light timing (previously mentioned), construction zones, lane closures, etc.

If the government did that, it would likely solve the majority of this case (no construction zone nearby, so ignore this "ped-stop-sign"). It would have prevented the pile of Waymos refusing to enter the freeway when directed, and would prevent them from driving into wet concrete, etc.
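To be concrete about how such a feed could be used, here's a rough sketch. Everything in it is invented for illustration: the feed format, the radius, the function names.

```python
import math

# Hypothetical feed entry: (lat, lon, radius_m) of an active work zone,
# as it might appear in an open government construction-zone feed.
ACTIVE_WORK_ZONES = [
    (37.7749, -122.4194, 150.0),
]

def distance_m(lat1, lon1, lat2, lon2):
    """Rough equirectangular distance in metres; fine at city scale."""
    m_per_deg_lat = 111_320.0
    dx = (lon2 - lon1) * m_per_deg_lat * math.cos(math.radians((lat1 + lat2) / 2))
    dy = (lat2 - lat1) * m_per_deg_lat
    return math.hypot(dx, dy)

def near_active_work_zone(lat, lon, zones=ACTIVE_WORK_ZONES):
    """True if the given position falls inside any published work zone."""
    return any(distance_m(lat, lon, zlat, zlon) <= r for zlat, zlon, r in zones)

def honor_handheld_stop_sign(sign_lat, sign_lon):
    """Only treat a hand-held stop sign as real when the official feed says
    there is a work zone nearby; otherwise flag it as a likely false positive."""
    return near_active_work_zone(sign_lat, sign_lon)
```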
 
But that's exactly what I'm talking about. Both of your detection scenarios could lead to a construction worker's hand-held stop sign not being picked up as a real stop sign.

If the stop sign's velocity matches the worker's, it's not a stop sign?
If the stop sign is held right next to the worker, it's not a stop sign?

No, you misunderstand. I was just giving one tool that would help detect a true stop sign. It would not be the only way. Lidar can tell the difference between a construction worker holding a stop sign and a stop sign mounted at an intersection. And camera vision would also be used to detect true stop signs. So in your scenario, the construction worker holding a stop sign would be picked up as a real stop sign, because the camera vision would detect and classify a construction worker holding a stop sign, and the lidar would detect a person holding a sign. So it would know that it is a true stop sign.
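Roughly, the kind of cross-check I have in mind would look like this. The classes, fields and thresholds are just made up to illustrate the logic; it's not anything Waymo actually runs:

```python
from dataclasses import dataclass

@dataclass
class Track:
    """A fused detection with position (m) and velocity (m/s) estimates."""
    label: str    # e.g. "pedestrian", "construction_worker", "stop_sign"
    x: float
    y: float
    vx: float
    vy: float

def co_located(a: Track, b: Track, radius_m: float = 0.5) -> bool:
    """True if two tracks occupy roughly the same spot."""
    return (a.x - b.x) ** 2 + (a.y - b.y) ** 2 < radius_m ** 2

def moving_together(a: Track, b: Track, tol_mps: float = 0.3) -> bool:
    """True if two tracks share roughly the same velocity."""
    return abs(a.vx - b.vx) < tol_mps and abs(a.vy - b.vy) < tol_mps

def stop_sign_is_real(sign: Track, people: list[Track]) -> bool:
    """A sign printed on a pedestrian's shirt sits 'inside' the person and
    moves with them, so reject it. A worker-held sign is kept because the
    camera classifies that person as a construction worker holding a sign."""
    for person in people:
        if co_located(sign, person) and moving_together(sign, person):
            return person.label == "construction_worker"
    return True
```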

That is why I said that camera vision needs to be more detailed. Camera vision needs to be able to distinguish between a construction worker holding a stop sign, a pedestrian with a stop sign printed on their shirt, and a real stop sign at an intersection. Part of that is that camera vision needs to be able to classify the parts of the scene: a construction worker alone, a construction worker holding a sign, a regular pedestrian, etc.
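And to make "classify the parts" concrete, imagine camera vision emitting part-level labels for a detection cluster and a small rule on top of them (the labels and rules below are invented for the example):

```python
from enum import Enum, auto

class SignContext(Enum):
    MOUNTED_STOP_SIGN = auto()    # real sign on a post at an intersection
    WORKER_HELD_SIGN = auto()     # flagger holding a stop/slow paddle
    PRINTED_ON_CLOTHING = auto()  # stop-sign graphic on a shirt, not a sign
    UNKNOWN = auto()

def classify_sign_context(parts: set[str]) -> SignContext:
    """Decide what kind of 'stop sign' this is from co-located part labels,
    e.g. {"stop_sign", "person", "hi_vis_vest"} from the camera detector."""
    if "stop_sign" not in parts:
        return SignContext.UNKNOWN
    if "person" not in parts:
        return SignContext.MOUNTED_STOP_SIGN
    if "hi_vis_vest" in parts or "sign_pole" in parts:
        return SignContext.WORKER_HELD_SIGN
    return SignContext.PRINTED_ON_CLOTHING

# Examples:
print(classify_sign_context({"stop_sign"}))                           # MOUNTED_STOP_SIGN
print(classify_sign_context({"stop_sign", "person", "hi_vis_vest"}))  # WORKER_HELD_SIGN
print(classify_sign_context({"stop_sign", "person"}))                 # PRINTED_ON_CLOTHING
```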
 
Waymo pulls out with no way to go left, so it starts driving the wrong way in traffic...


Smells like a regression somewhere to see two instances of this so close together.

Yeah that is very bad. Deliberately driving the wrong way in a lane, especially with traffic, should never happen. And mistakes like this give ammo to the critics who say that AI is not ready to drive a car yet. I find it frustrating that AI can seem so good at driving sometimes, and so dumb at other times. Driving is one of those tasks that seems obvious to humans but perplexing to AI. I want to see autonomous driving that is safe and reliable.

The mistake is very odd to me because we would assume that Waymo's camera vision and lidar would see that it is driving the wrong way based on how other cars are facing, plus it has HD maps that also tell the car the correct direction of travel. Furthermore, if the left turn was blocked, the obvious thing to do would be to reroute right. The Waymo could have just turned right.

Yeah, I agree there seems to be a big regression in the ML planner. Maybe, in order to solve some of the issues with the Waymo needing remote assistance to reroute, Waymo is training the ML planner to reroute on its own. So the ML planner, when faced with a blocked path, is looking for an alternative route but is somehow ignoring the rules of the road about direction of travel. In this instance, I think the Waymo wanted to do an unprotected left, but the path was blocked by cars in the suicide lane, so the ML planner saw another lane that was free and took it. I am getting the impression that Waymo is relying more on ML in the planner, and the ML seems to be hallucinating. Thus, we are seeing the planner do odd things.
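If the planner really is going more end-to-end ML, a simple rule-based guardrail downstream of it could still veto this kind of move. Something like this sketch, where the lane fields and the 90-degree threshold are purely illustrative:

```python
from dataclasses import dataclass

@dataclass
class MapLane:
    """Simplified HD-map lane record (fields are illustrative)."""
    lane_id: str
    heading_deg: float   # legal direction of travel for this lane

def heading_error_deg(a: float, b: float) -> float:
    """Smallest absolute angle between two headings, in degrees."""
    return abs((a - b + 180.0) % 360.0 - 180.0)

def planner_output_is_legal(lane: MapLane, planned_heading_deg: float) -> bool:
    """Veto any ML-planned path that drives against the mapped direction of
    travel; the route search then has to find another option (e.g. turn right)."""
    return heading_error_deg(lane.heading_deg, planned_heading_deg) < 90.0

# Example: planning to head 270 degrees in a lane mapped at 90 degrees gets vetoed.
print(planner_output_is_legal(MapLane("elm_st_wb", 90.0), 270.0))  # False
```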

Waymo did respond to the tweet so they are aware of this issue. Hopefully, Waymo fixes these regressions quickly. Tesla FSD has regressions but it is supervised. Waymo is not supervised. These regressions could cause a serious accident which would be very bad for Waymo.

Imagine the media's collective heads exploding if a Tesla on FSD did this...

Believe me, the media will have a field day bashing Waymo over this. The fact is that any AV that makes an egregious mistake should be called out. If Tesla FSD was so heavily criticized by the media, it is because it used to make a ton of these mistakes. Fortunately, the latest V12 makes fewer mistakes, so Tesla FSD is less of a target nowadays. Also, Tesla FSD is supervised, so the media never gets to see most mistakes because the driver disengages. Waymo is actually more of a target since they are driverless. There is no safety driver to "hide" the mistake. So when Waymo makes a mistake, it is in plain sight for everybody to see. And Waymo will be heavily criticized for this mistake.
 
ABC7:

Waymo vehicles are disregarding traffic laws by driving in bus-only lanes, and then stopping to make an (illegal) left turn where signs say no left turn.


I do find these traffic violations (driving in a bike lane and driving in a bus-only lane) to be a bit odd, since we might assume that HD maps would automatically prevent these issues by labeling those lanes as off limits. So either the HD maps were wrong (the lanes were not correctly labeled in these two incidents) or Waymo is relying more on the ML planner and deprioritizing the HD maps. I am guessing the latter. It would make sense that Waymo would not want to be strictly tied down to HD map rules, as that would be too rigid and would not work well for edge cases. After all, there are exceptions where you do have to violate the law, for example crossing a double yellow line to go around an accident.

As the video says, AVs need to be less rule based and rely more on AI in order to be flexible and handle edge cases. But I trust Waymo is actively working to train their perception and ML planner to solve these issues. It is one thing to break the law in a very special emergency situation; it is quite another if the AV is routinely flouting the rules. The latter would not be good.
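One way to square "HD-map rules as hard constraints" with "the AV sometimes has to break a rule" is to keep the restriction check but gate the exceptions explicitly. A toy version, with lane types, vehicle classes and conditions all invented for the example:

```python
RESTRICTED_LANE_TYPES = {"bus_only", "bike"}
EXEMPT_VEHICLE_CLASSES = {"bus"}   # local rules vary; some cities also exempt taxis

def may_use_lane(lane_type: str,
                 vehicle_class: str,
                 own_lane_blocked: bool,
                 oncoming_clear: bool) -> bool:
    """Default to obeying the HD-map lane restriction; allow an exception
    only for exempt vehicle classes, or briefly when the normal lane is
    blocked and it is clearly safe (the double-yellow-around-a-crash case)."""
    if lane_type not in RESTRICTED_LANE_TYPES:
        return True
    if vehicle_class in EXEMPT_VEHICLE_CLASSES:
        return True
    return own_lane_blocked and oncoming_clear

# Examples:
print(may_use_lane("bus_only", "robotaxi", False, True))  # False: no reason to be there
print(may_use_lane("bus_only", "robotaxi", True, True))   # True: explicit, gated exception
```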
 
SFMTA says taxis are allowed in the lanes.
Waymos are taxis.
 