
Waymo

That is interesting. The delay may have been the car calling home for help, asking "Is it appropriate to go around?" with remote staff answering "Yes." The visual cue to drive around is the open back door during a delivery and delivery staff going in and out of the back. That doesn't seem like a detail the AI would have high confidence in at this point: at least I hope they don't have thousands of examples of double-parked trucks doing deliveries yet.

Yeah, we don't know if the Waymo figured it out on its own without remote assistance or if it phoned home and got the hint that it was OK to go around.

I am sure Waymo has examples of double-parked trucks doing deliveries. In their 10M+ miles of autonomous driving, this can't be the first time they've encountered a double-parked truck doing a delivery.

Waymo also uses a technique called data augmentation, where they take a small sample and use ML to create variations of it to increase the variety and size of the training set. Data augmentation lets you build a bigger and better training set than you could by just waiting to encounter every case in the real world. So Waymo could easily create thousands of examples of double-parked trucks doing deliveries to train their stack. If they have not done so already, I am sure they will now, based on this example. Anguelov talks about data augmentation here:

 
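To make the data augmentation idea concrete, here's a toy Python sketch. Everything in it (field names, numbers) is made up for illustration; it shows the general technique of turning one logged scene into many perturbed training examples, not Waymo's actual pipeline:

```python
# Rough sketch of the data augmentation idea (not Waymo's actual code).
# Take one logged "double-parked delivery truck" scene and perturb it into
# many synthetic variants so the model sees far more diversity than was
# actually driven. All field names and numbers are made up.

import random
from dataclasses import dataclass, replace

@dataclass
class DeliveryScene:
    truck_offset_m: float     # how far the truck sticks into our lane
    rear_door_open: bool      # visual cue: open back door
    workers_nearby: int       # people walking in and out of the back
    oncoming_gap_s: float     # gap in oncoming traffic, seconds

def augment(seed_scene: DeliveryScene, n: int) -> list[DeliveryScene]:
    """Create n perturbed copies of a single real-world scene."""
    variants = []
    for _ in range(n):
        variants.append(replace(
            seed_scene,
            truck_offset_m=max(0.0, seed_scene.truck_offset_m + random.uniform(-0.5, 0.5)),
            rear_door_open=random.random() < 0.9,   # usually open, sometimes not
            workers_nearby=random.randint(0, 4),
            oncoming_gap_s=max(1.0, seed_scene.oncoming_gap_s * random.uniform(0.5, 2.0)),
        ))
    return variants

# One real logged example becomes thousands of training examples.
logged = DeliveryScene(truck_offset_m=1.2, rear_door_open=True,
                       workers_nearby=2, oncoming_gap_s=8.0)
training_set = augment(logged, 5000)
print(len(training_set), "augmented scenes from 1 logged scene")
```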
You're describing an ADAS function, not an AV function.
Clearly. I was countering the assertion that I quoted. Software exists that allows a driver to break the law. If NHTSA were hot to prevent lawbreaking, all they'd have to do is tell manufacturers to stop enabling lawbreaking with their software.
If you think the driver should get the speeding ticket while the car was in L3+ mode, then I respect that viewpoint, but I think you'll have quite a few traffic court arguments.
I didn't say a word about SAE levels or speeding tickets, but yes, I would assign the blame to the driver for putting a vehicle on the road that does something illegal. If owners want to seek a remedy against the manufacturer, they can pursue a class-action lawsuit.

In the short term, lawmakers will enforce laws that assume that most vehicles on the road are driven by people (no rolling stops). In the long term, lawmakers will set up the laws specific to autonomous vehicles (rolling stops are fine for autonomous vehicles). Ultimately, lawmakers will be able to assume that all vehicles are autonomous and set up laws just for them (no more stop signs).
 
And Waymo also uses a technique called data augmentation, where they can take a sample and use ML to create variations of it to increase the variety and size of the sample. So Waymo could easily create thousands of examples of double-parked trucks doing deliveries to train their stack. If they have not done so already, I am sure they will now based on this example.

Yeah, you can get an order of magnitude out of that technique, which is why I suggested they only needed 10,000 real-world samples.

A parked delivery vehicle is really hard to differentiate from a stopped delivery vehicle or just a flat-bed with an odd load (glass or granite held vertically on racks).

You've got vans with a door on the back and a ramp, a door on the back and a lift, a door on the side and a ramp, flatbeds with a crane, flatbeds with a forklift, and that's before you get into more specialized items like a concrete truck. There aren't a lot of common hints between those, other than that they're stopped for a long time, and that hint requires waiting to confirm they're parked and not just temporarily stopped.

Beyond all that, it's not uncommon for people to ride on the back of trucks: garbage trucks are an obvious and common example in the USA, but any and all trucks would count in parts of Asia. Training based on person location without a geographic element will not work, and behaviour samples based on geography explode the amount of training data required.
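Just to illustrate how weak each individual cue is, here's a toy heuristic in Python (purely hypothetical, not how any real AV classifies this). Note that the most reliable signal, time stopped, is exactly the one that forces you to sit and wait:

```python
# Toy illustration of the point above: no single cue is decisive, so you end
# up combining weak hints, and the strongest one (how long the vehicle has
# been stopped) costs you the wait. Entirely made up for illustration.

from dataclasses import dataclass

@dataclass
class StoppedVehicle:
    seconds_stopped: float
    hazards_on: bool
    rear_door_open: bool
    ramp_or_lift_deployed: bool
    people_at_rear: int

def likely_parked_for_delivery(v: StoppedVehicle) -> bool:
    score = 0
    if v.seconds_stopped > 30:       # strongest hint, but requires waiting
        score += 2
    if v.hazards_on:
        score += 1
    if v.rear_door_open or v.ramp_or_lift_deployed:
        score += 2
    if v.people_at_rear > 0:         # unreliable alone: riders, loaders, etc.
        score += 1
    return score >= 4

truck = StoppedVehicle(seconds_stopped=45, hazards_on=True,
                       rear_door_open=True, ramp_or_lift_deployed=False,
                       people_at_rear=2)
print(likely_parked_for_delivery(truck))  # True -> safe to plan a go-around
```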
 
NHTSA has muddied the waters here... They made Tesla recall FSDb, an ADAS system, to make it stop rolling through stop signs and to adjust to speed limits more proactively. Yet they still let the driver set it to go faster than the speed limit, at least for now. (And even after the initial recall made it come to a full stop at stop signs, they had Tesla make it stay stopped longer, as they didn't like how quickly it continued on.)
I'd bet anything that when FSD rolls out (non-beta) there will be no option to adjust the speed limit +/-. It will strictly follow the speed limits. Enjoy!
 
As it should.
Assuming you want to limit the rollout of AVs that are safer than human driving.

Don't you think there may be a backlash from people fed up with AVs poking along 5 or 10 mph under prevailing traffic speeds?

Traffic laws and such are set by humans who assume vehicles are driven by other, perhaps texting, humans, not by potentially much more capable machines.

Continuing with this traffic-law absolutism will only result in more deaths in the long run.
 
Assuming you want to limit the rollout of AVs that are safer than human driving.

Don't you think there may be a backlash from people fed up with AVs poking along 5 or 10 mph under prevailing traffic speeds?

One of the things that makes AVs safer is that they don't speed. If they broke speed-limit laws, they would be less safe. Sure, there could be some backlash, just like some humans get annoyed with other human drivers who drive "too slow." But public safety is more important than public opinion. We should not limit the rollout of safer AVs just because humans are annoyed by them. That would be dumb.

Traffic laws and such are set by humans who assume vehicles are driven by other, perhaps texting, humans, not by potentially much more capable machines.

Maybe the traffic laws should be changed. There is actually a good argument to be made that AVs don't need to drive as slow. But until those laws are changed, they should be followed.

Continuing with this traffic-law absolutism will only result in more deaths in the long run.

Following traffic laws saves lives; it doesn't cause more deaths. I know that law enforcement does not always enforce traffic laws properly, but there is a good reason we have traffic laws in the first place: the public's good. And there is a reason cops will ticket you when you speed. Speeding is against the law!
 
Waymo looks smooth, decisive, and unsurprised by traffic and pedestrians near roundabouts versus FSD's abrupt start/stop and wacky steering input.

I think it is a testament to Waymo's hard work in all three parts of the stack (perception, prediction, and planning). Good perception is key because the car needs to know where everything is. Good prediction is key because the car needs to anticipate where things will be in the future; knowing where objects will be allows the car to plan a smoother path. And planning is key because the car needs to pick the right path and action. When all three are accurate and reliable, the car can drive more decisively and smoothly. And last year, Waymo switched to their next-gen ML planner and said it really helped the car drive in a more human, confident manner.
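As a rough mental model of that handoff, here's a bare-bones Python sketch (purely illustrative; nothing here reflects Waymo's actual architecture or interfaces): perception produces tracked objects, prediction rolls them forward, and the planner only has to commit once it knows where things will be:

```python
# Bare-bones sketch of the three-stage handoff: perception -> prediction ->
# planning. All types and numbers are invented for illustration.

from dataclasses import dataclass

@dataclass
class TrackedObject:
    object_id: int
    position: tuple[float, float]    # where it is now (perception output)
    velocity: tuple[float, float]

def perceive(sensor_frame) -> list[TrackedObject]:
    """Perception: turn raw sensor data into tracked objects."""
    # Stubbed out; a real system fuses lidar/camera/radar here.
    return sensor_frame

def predict(objects: list[TrackedObject], horizon_s: float) -> dict[int, tuple[float, float]]:
    """Prediction: estimate where each object will be at the horizon."""
    return {
        o.object_id: (o.position[0] + o.velocity[0] * horizon_s,
                      o.position[1] + o.velocity[1] * horizon_s)
        for o in objects
    }

def plan(futures: dict[int, tuple[float, float]]) -> str:
    """Planning: pick an action given predicted future positions."""
    # If anything is predicted to end up within 2 m laterally of our lane, yield.
    if any(abs(y) < 2.0 for _, y in futures.values()):
        return "slow_and_yield"
    return "proceed_smoothly"

frame = [TrackedObject(1, position=(10.0, 5.0), velocity=(-1.0, -0.5))]
action = plan(predict(perceive(frame), horizon_s=3.0))
print(action)  # confident prediction lets the planner commit early and smoothly
```

The point of the structure: if the prediction stage is accurate, the planner doesn't have to hedge with constant braking and re-planning, which is what shows up as smooth, decisive driving.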

I don't think Tesla has done enough work on behavior prediction and planning (I suspect the new foundational world model and end-to-end approach are an attempt to improve both). Poor prediction leaves the car unsure what traffic or pedestrians will do. I've noticed FSD can be very jerky and overly slow around pedestrians, likely because it is not sure what the pedestrians will do. Better planning could also smooth out the steering and braking. It is also possible that Tesla's vision is sometimes unsure about the exact geometry of a roundabout, which could lead to jerky steering.
 
Following traffic laws saves lives; it doesn't cause more deaths.
You might have missed my point. The status quo is approximately 40,000 deaths per year in the US. Anything that delays automated driving continues that statistic. No amount of yelling at people to not speed and not drive drunk will have any effect.

Forcing autonomous vehicles to drive much slower than prevailing traffic will both slow the transition due to public backlash and be dangerous in itself. I think it is clear that all lanes moving at the same speed is safer than everyone having to change lanes to get around slowpokes. I suspect at some point, an AV is going to get shot at.

And again, I am not talking about dangerous speeding but rather the 5 or 10 mph over that humans drive.
 
Waymo seems very confident that the Waymo Driver is improving road safety. So hopefully, we see Waymo scale faster, maybe even adapt the tech to consumer cars soon. After all, it would not make sense to hold the tech back if it is as life-saving as they claim.

@Bladerskb I think you've said that Waymo will scale big soon (2024?). Their confidence in their safety would support that.



 