
Autonomous Car Progress

Are you saying local news is more favorable to Cruise than to Waymo? The Waymo story seems more negative (focusing on a "stall") whereas the Cruise seems more positive (focusing on Cruise wanting to expand quickly).
That's just a sample of SF-local news on robotaxis. I don't have an opinion about one vs. the other
(Zoox will be there too), but I'm a bit strange, having never ride-hailed anywhere.

[Aside: For basic transportation within the city environs I take BART, SF Muni bus/trolley, or walk, and otherwise happily drive a Tesla (using FSDb a good fraction of the time). Besides these transportation modes (almost all done with renewable fuels), there are scooters & bikes for the brave. Uber/Lyft was funded by the subsidized techbro mentality; I believe it generally undermines quite decent public transportation and further clogs up the streets. Note: this does not speak for people who never would rub shoulders with social miscreants, ha!]

One comment in the Mission Local coverage of the controversial Valencia St. bike lane is that there are plenty of
non-robotaxi stalls, too, like double-parked UPS trucks, ride-hail cars, etc. What ends up happening is street-specific, a combination of driver impatience and not knowing how long the stall will last.
 
but I'm a bit strange, having never ride-hailed anywhere.

I've taken only one ride-hail (Lyft) ride, when I was traveling to a work-related conference and needed to go from the airport to the hotel. It was a very good experience. I haven't taken a ride in a robotaxi yet; I hope I get the chance. I imagine it would be similar to any other ride-hail except that there is nobody in the driver's seat and the car is driving itself. From what I've read of first-rider testimonies, it can be an eerie experience for the first few minutes, but then you get used to it.
 
It's 2 weeks old, so not sure if it has already been posted, but Mobileye did a nice little deep dive into their REM maps, if anyone is interested. It shows how Mobileye is doing HD maps at scale.


One thing that is interesting is that Mobileye does not dumb down the REM maps for driver assist systems. They use the same REM map, with all the info, whether it is a basic L2 driver assist or L4 "eyes off".
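To make the "no dumbed-down map" point concrete: as I understand it (this is just my own illustrative sketch, not Mobileye's actual REM schema, and every field name here is made up), the same map payload can serve both tiers simply because the L2 feature reads a thin slice of it while the L4 stack consumes everything:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a shared HD-map tile: one payload, consumed by both
# a basic L2 lane-centering feature and an L4 "eyes off" planner.
# Field names are illustrative only, not Mobileye's actual REM format.

@dataclass
class LaneSegment:
    lane_id: int
    centerline: list                                  # (x, y) points, local frame
    speed_limit_kph: float
    successors: list = field(default_factory=list)    # connected lane_ids

@dataclass
class MapTile:
    tile_id: str
    lanes: dict            # lane_id -> LaneSegment
    traffic_lights: list   # stop lines and which lanes they control
    crosswalks: list
    drivable_boundaries: list

def l2_lane_keep_view(tile: MapTile, current_lane: int) -> list:
    """A basic L2 feature may only need the current lane's centerline."""
    return tile.lanes[current_lane].centerline

def l4_planning_view(tile: MapTile) -> MapTile:
    """An L4 planner consumes the full semantic payload of the same tile."""
    return tile  # nothing is stripped out; same map, richer consumer
```

Presumably one benefit is that the much larger L2 fleet ends up exercising (and stress-testing) the same map data the L4 system relies on.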
 
Incredibly well done HW3 to HW4 comparison.
There are obvious differences, but a new HW3 compared to a new HW4 would have been better. The HW3 vehicle's forward camera looked smudged on the lower half. Perhaps 7 years of UV damage or some such thing.

Separate from that, does anyone know if what we see on recordings is the same data that FSD works with? My recollection is that it is not, and I wonder how much difference there is between the information that each generation delivers to FSD. Does FSD see something even 'clearer' than the HW4 video shows?
 
They use the same REM map, with all the info, whether it is a basic L2 driver assist or L4 "eyes off".
Audio was bad but I think they say they are crowdsourcing HD maps using cameras alone. Great idea!

Even if it is for lane positioning only, I would think a similar approach would be relatively easy for Tesla. While ME needs it for all of their autonomy, Tesla just needs to be much better at identifying which lane to prefer. Today that all comes from built-in maps, which makes construction and redesigned traffic flows problematic.
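For what it's worth, camera-only crowdsourced mapping at its core is just aggregating many noisy per-drive observations of the same lane into one refined geometry. A toy sketch of that idea (my own illustration; nothing here is Tesla's or Mobileye's actual pipeline):

```python
import statistics

# Toy illustration of crowdsourced lane-geometry aggregation: each drive
# reports the lane centerline it observed (camera + GPS), sampled at the
# same longitudinal stations along the road. Aggregating many noisy traces
# converges toward the true geometry without any survey vehicle.

def aggregate_centerline(traces):
    """traces: list of drives, each a list of (x, y) points at fixed stations."""
    refined = []
    for i in range(len(traces[0])):
        xs = [trace[i][0] for trace in traces]
        ys = [trace[i][1] for trace in traces]
        # median is robust to the occasional bad drive (GPS glitch, lane change)
        refined.append((statistics.median(xs), statistics.median(ys)))
    return refined

# three noisy drives over the same four stations
drives = [
    [(0.0, 1.9), (10.0, 2.1), (20.0, 2.0), (30.0, 1.8)],
    [(0.0, 2.1), (10.0, 1.9), (20.0, 2.2), (30.0, 2.1)],
    [(0.0, 2.0), (10.0, 2.0), (20.0, 1.9), (30.0, 2.0)],
]
print(aggregate_centerline(drives))
# -> [(0.0, 2.0), (10.0, 2.0), (20.0, 2.0), (30.0, 2.0)]
```

The same aggregation is what keeps such a map fresh: once a construction zone shifts the lanes, the newest traces start disagreeing with the stored geometry, which is exactly the signal you'd want for the construction and redesigned-traffic-flow problem.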
 
IMO, both vehicles contributed to the accident. The Cruise should not have tried to pass the semi on the left like that. That just puts the Cruise in a difficult spot if the semi does try to make a left turn. But I think the semi was also responsible because it tried to make a really dumb turn and was clearly not paying attention to other vehicles around it. The semi should have seen the Cruise vehicle and known that it was not safe to make that turn at that moment.

On a related note, I think this accident shows why accidents are not as simple as just blaming one party. And AV companies will naturally try to deflect blame by saying that they were not at fault. This can potentially skew safety data by making the AVs look safer than they really are. We cannot just look at at-fault accidents as a metric for AV safety. Not being at fault does not necessarily mean safe, since there are things the AV could or should have done to mitigate the risk or possibly avoid the accident even if it was not at fault.

Lastly, I also think this accident exemplifies why semi trucks should not be allowed on city streets. They are too big and unwieldy IMO. So when they need to make a turn, they cause problems because they take up too much space and often have to use both lanes to make the turn.
Hard to tell without seeing what happened prior. If the semi was already pretty much blocking the whole road like in the video, it would mostly be the Cruise's fault, because it just stopped right in the path of the turn where it would get clipped (even though it was the truck that did the clipping). I don't see a human ever doing something like this unless they were not paying attention at all.

It could be another path prediction error, similar to how it rear-ended the bus previously. It's possible it doesn't model the potential path of the truck correctly, given how it's split between the cab and trailer, so it stopped where it thought it would be out of the path but ended up still in the path (see the sketch at the end of this post).

If instead the truck hadn't started the turn yet, or had just barely started it (hadn't crossed the lanes yet) and the Cruise had no choice but to stop there, that is a different case.

Semis also have huge blind spots, so it's possible the Cruise vehicle was in its blind spot.
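On the path-prediction point above: an articulated truck really is harder to model than a rigid vehicle, because the trailer cuts inside the arc the cab traces (off-tracking). A minimal kinematic sketch of that effect, with made-up numbers (obviously not Cruise's actual prediction model):

```python
import math

# Minimal sketch of trailer off-tracking (hypothetical numbers).
# The cab's hitch point drives a left turn of fixed radius; the trailer axle
# lags behind and sweeps a tighter arc, so a spot that is clear of the cab's
# path can still end up inside the trailer's swept path.

def trailer_path(turn_radius=15.0, trailer_len=10.0, speed=3.0, dt=0.1, steps=100):
    x, y, cab_heading = 0.0, 0.0, 0.0   # hitch point starts at origin, heading +x
    trailer_heading = 0.0
    positions = []
    for _ in range(steps):
        # hitch point moves along a circular arc (turning left)
        cab_heading += (speed / turn_radius) * dt
        x += speed * math.cos(cab_heading) * dt
        y += speed * math.sin(cab_heading) * dt
        # simple single-trailer kinematics: trailer heading relaxes toward cab heading
        trailer_heading += (speed / trailer_len) * math.sin(cab_heading - trailer_heading) * dt
        # trailer axle sits trailer_len behind the hitch along the trailer heading
        positions.append((x - trailer_len * math.cos(trailer_heading),
                          y - trailer_len * math.sin(trailer_heading)))
    return positions

last = trailer_path()[-1]
print("trailer axle ends near:", (round(last[0], 1), round(last[1], 1)))
# the axle tracks well inside the cab's arc -- that inner sweep is the region
# a car stopped next to a turning semi can unexpectedly find itself in
```

If a prediction model treats the truck as one rigid box following the cab's path, it will underestimate how far the trailer cuts into the inside lane, which would match the "stopped where it thought it was clear" failure described above.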
 
Audio was bad but I think they say they are crowdsourcing HD maps using cameras alone. Great idea!

That is correct. Mobileye crowdsources HD maps with cameras only. It is another big advantage since they can use the millions of cars equipped with front cameras already on the road to collect data for their maps. It is why they have been able to map all of US and EU roads so quickly.

Even if it is for lane positioning only, I would think a similar approach would be relatively easy for Tesla. While ME needs it for all of their autonomy, Tesla just needs to be much better at identifying which lane to prefer. Today that all comes from built-in maps, which makes construction and redesigned traffic flows problematic.

I've long said that Tesla should crowdsource maps with vision-only like ME is doing. Tesla has the large vision-only fleet to do it. It's one reason I started that thread on the 3 things Tesla should do to improve FSD.

But I think Tesla could use the maps for all of autonomy too. I mean, if you have maps with so much semantic info, why not take advantage of that to improve all aspects of driving, not just lane selection?
 
That is correct. Mobileye crowdsources HD maps with cameras only. It is another big advantage since they can use the millions of cars equipped with front cameras already on the road to collect data for their maps. It is why they have been able to map all of US and EU roads so quickly.



I've long said that Tesla should crowdsource maps with vision-only like ME is doing. Tesla has the large vision-only fleet to do it. It's one reason I started that thread on the 3 things Tesla should do to improve FSD.

But I think Tesla could use the maps for all of autonomy too. I mean, if you have maps with so much semantic info, why not take advantage of that to improve all aspects of driving, not just lane selection?
They dabbled in map creation prior to FSD, but seem to have largely abandoned it to third parties, switching to more generalized lane logic.

The problem is that lane logic tends to be poorly optimized for a given road, since it's mainly making decisions based on what it can immediately see. Ideally, all the lane changes would be mapped out as soon as the destination is set, with the car deviating only in response to traffic, but that does not appear to be anywhere close to how it works right now (it's more on the fly); a toy sketch of the idea is below.
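In other words, with a lane-level map the whole sequence of required lane changes could be computed the moment the destination is set, and live traffic would only perturb that plan. Here is the sketch (an invented lane graph, not Tesla's planner):

```python
from collections import deque

# Toy lane graph: each node is one lane on one road segment; an edge either
# continues into the next segment or changes into an adjacent lane.
# Searching this graph once the destination is known yields the lane-change
# plan up front, instead of deciding lane-by-lane from what the cameras see.

LANE_GRAPH = {
    "seg1_lane1": ["seg2_lane1", "seg1_lane2"],
    "seg1_lane2": ["seg2_lane2", "seg1_lane1"],
    "seg2_lane1": ["seg3_lane1"],
    "seg2_lane2": ["seg3_exit"],      # only lane 2 reaches the exit ramp
    "seg3_lane1": [],
    "seg3_exit": [],
}

def plan_lanes(start, goal):
    """Breadth-first search for the shortest lane sequence from start to goal."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in LANE_GRAPH[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None

print(plan_lanes("seg1_lane1", "seg3_exit"))
# -> ['seg1_lane1', 'seg1_lane2', 'seg2_lane2', 'seg3_exit']
# i.e. "get into lane 2 before segment 2" falls out of the map at route time,
# not out of whatever the cameras happen to see at the last moment
```

The on-the-fly behavior we see today is what you'd expect if the route is planned only at the road level and the lane choice is left to local perception.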
 
Hard to tell without seeing what happened prior.
100% agree. We don't know where the Cruise was when the semi started its turn. If it was already next to the semi (as their self-serving statement implies), then it's the semi's fault. If the Cruise was 100 yards away and simply kept driving until the last half-second, then it's on them (at least "morally"; legal fault depends on how much latitude SF traffic law gives to trucks making wide turns).
 


Journalist had a super embarrassing "stall" in a Cruise with a driving instructor. You can see the Cruise just lurch towards the median and stop, blocking traffic until Cruise dispatched a person to manually drive the car. And the situation did not look complicated at all, the intersection was clear. It's like the car just "tilted". It's incidents like this that definitely make driverless cars look bad.
 


Journalist had a super embarrassing "stall" in a Cruise with a driving instructor. You can see the Cruise just lurch towards the median and stop, blocking traffic until Cruise dispatched a person to manually drive the car. And the situation did not look complicated at all, the intersection was clear. It's like the car just "tilted". It's incidents like this that definitely make driverless cars look bad.
Sounds like some sort of system failure if the remote driver couldn't get it going. Bad timing for sure.
 


Journalist had a super embarrassing "stall" in a Cruise with a driving instructor. You can see the Cruise just lurch towards the median and stop, blocking traffic until Cruise dispatched a person to manually drive the car. And the situation did not look complicated at all, the intersection was clear. It's like the car just "tilted". It's incidents like this that definitely make driverless cars look bad.
In the previous incident with Waymo, some people here kept implying the situation was staged by the journalist. So far Waymo has never come out to say that happened. In this case, it's even clearer that there was no staging: the Cruise just fell flat on its face handling a relatively simple construction zone. I wonder if the move to the median came after remote prompting.
 
Sounds like some sort of system failure if the remote driver couldn't get it going. Bad timing for sure.

I wonder if the autonomous driving stack was unsure about how to handle the construction and remote assistance tried to help but accidentally made it worse. The car suddenly lurching towards the median does not feel like something the autonomous driving stack would normally do on its own. Perhaps the car was trying to follow instructions from remote assistance and failed.
 
Cruise provided this explanation for the stall with the NBC reporter:

The vehicle encountered an unexpected construction zone that would've required several lane changes. The better course was for the AV to come to a safe stop rather than proceed.

What a terrible explanation! First, it implies that the Cruise AV cannot handle unexpected construction zones if they require lane changes, which makes the tech sound rather wimpy and immature IMO. Lane changes should not be difficult. Second, the explanation does not address the fact that the stop was not safe, since the AV pulled sideways and blocked 2 lanes. So even if the Cruise was correct to try to come to a safe stop, it failed to do so. And there is no explanation for why it failed to come to a safe stop.
 
Incredibly well done HW3 to HW4 comparison.

" AI DRIVR - Tesla Autopilot HW3 and HW4 footage compared (much bigger difference than expected): "
@petit_bateau @2daMoon @cliff harris @MC3OZ not sure if you guys saw this. Didn't think it made sense on the investment forum, but this was a really well done look at the difference between HW4 and HW3. I expect to see 1 more hardware iteration and I give pretty good odds that we end up with lidar back in the system and that would make 2 hardware iterations. I'd expect it to be hitting near-limited RT capability in 6 years with or without lidar; it's just that going without lidar seems a big risk, and we know the only reason it was removed was cost and parts availability. Then they still have to do the same work that Waymo and Cruise have done in California before they can actually offer RT service there. I would guess at least a year, maybe 3, on that.

However, the point here is that HW3 does not appear to be sufficient on many levels.
 
I expect to see 1 more hardware iteration and I give pretty good odds that we end up with lidar back in the system and that would make 2 hardware iterations.

When you say "lidar back in the system", grammatically, that implies that lidar was once in. But Tesla never used lidar. I think you mean radar. Tesla used to have radar and removed it. I agree that Tesla will likely put radar back in at some point.
 