Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

GM’s new Ultra Cruise: Hands-free driving on all paved roads in US/Canada

Also, as a lay person, at a conceptual level, how can cameras alone be sufficient for L5 or even L4? What if the cameras are blocked, even briefly, while the car is driving itself at highway speeds? There isn’t even a cleaning mechanism or washers. What if the cameras are blinded by fog, snow, slush, etc.? There are a million what-if scenarios that could render cameras alone inadequate. If I were a betting man, I would bet on Waymo or Cruise getting there first and Tesla having to revisit their sensor choices. Elon has disappointed for too long on this front.

As a lay person, the first thing you need to do is familiarize yourself with the strengths and weaknesses of all the sensor types. You mentioned a few situations where visible-light cameras might struggle. But do you know how those situations impact other sensors? And can you rely on those sensors if your primary sensors (light cameras) are not working? In most cases, the answer is no.

Radar is the only sensor that reliably sees through weather obstruction, but its resolution is so low that it cannot be relied upon as a backup when light vision is down.

Lidar, like cameras, relies on light bouncing off objects, but it uses lasers to actively bounce light off objects, whereas cameras passively rely on ambient light to do the bouncing. Lidar can detect objects in pitch darkness where cameras will fail. But lidar will also degrade in fog/snow, as the weather weakens the laser beam on its round trip, requiring a higher-power laser. Not reliable as a backup for primary vision. Lower-cost lidar will also have a hard time reading signage and determining object density, as it only sees things as solid or not there.

Now, for both radar and lidar, there are different levels of performance. To overcome environmental challenges, you can go with more expensive radar/lidar to get better performance, but that may not be viable from a business perspective.

Sensor fusion also means increased chances of false positives and false negatives, since the system has to reconcile sensors that disagree.
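As a toy illustration of why (the numbers below are invented for the example, not from any real system): with two independent detectors, an "either sensor fires" rule compounds false positives, while a "both must fire" rule compounds false negatives.

```python
# Toy model of fusing two independent detectors (illustrative numbers only).

def or_fusion_fp(fp_a: float, fp_b: float) -> float:
    # Report an object if EITHER sensor fires: false positives accumulate.
    return 1 - (1 - fp_a) * (1 - fp_b)

def and_fusion_fn(fn_a: float, fn_b: float) -> float:
    # Report an object only if BOTH sensors fire: false negatives accumulate.
    return 1 - (1 - fn_a) * (1 - fn_b)

# Say a camera and a radar each have a 1% false-positive rate and a
# 2% false-negative rate on some detection task:
print(or_fusion_fp(0.01, 0.01))   # ≈ 0.0199 -> FP rate nearly doubles
print(and_fusion_fn(0.02, 0.02))  # ≈ 0.0396 -> FN rate nearly doubles
```

Real fusion stacks use weighted, probabilistic combinations rather than these two extremes, but the underlying trade-off doesn't go away.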

The primary challenge of autonomy is not perception of the environment. I'm not suggesting Tesla has solved perception. Just pointing out it's the easy part. The harder part is getting the car to do the right thing when presented with a picture of the environment. Throwing more sensors at the problem is just addressing perception, not the decision-making.
 
If I remember correctly, he even claimed AP1 would allow automatic lane changes. Using what sensors? Low speed ultrasonic sensors? AP1 didn’t even have side cameras, let alone radars. Claims like that are downright nonsensical IMO.
AP1, which is what my first Model S had, did have both Auto Lane Change and radar, so I don’t know where you’re getting your information.
 
AP1, which is what my first Model S had, did have both Auto Lane Change and radar, so I don’t know where you’re getting your information.
It only had forward-facing radar, which is not relevant to lane changes. And if AP1 actually changed lanes without cameras, radar, or lidar covering the adjacent lanes, using just previous-gen ultrasonic/parking sensors, that’s idiotic and unsafe.
 
I don't understand why people think that the detailed mapping of road environments that these systems require is a one shot deal.
Roads change, junctions change, road markings and signs change - when your system relies upon detailed and accurate data of your surroundings to work effectively then you have to keep mapping and remapping in perpetuity.
 
I’m in no way an expert on this, but my biggest concern is the lack of forward and side/corner radars. I understand they’re low-res, but at least they can detect vehicles ahead or to the side in case the cameras temporarily lose visibility. As for my pessimism about Tesla getting this right: Tesla has lost credibility in autonomy with the broken timelines and the hardware upgrades and changes after stating HW2 was L5-capable. If Waymo had sold such wild claims years and years ago and gotten it continuously wrong on sensors, timelines, etc., I wouldn’t believe them either.
 
It only had forward-facing radar, which is not relevant to lane changes. And if AP1 actually changed lanes without cameras, radar, or lidar covering the adjacent lanes, using just previous-gen ultrasonic/parking sensors, that’s idiotic and unsafe.
Auto Lane Change worked well enough with AP1, but you had to be more careful about approaching cars in the target lane since you didn’t have the rear-facing side cameras. Just as you would for manual lane changes.
 
I don't understand why people think that the detailed mapping of road environments that these systems require is a one shot deal.
Roads change, junctions change, road markings and signs change - when your system relies upon detailed and accurate data of your surroundings to work effectively then you have to keep mapping and remapping in perpetuity.
Simple 2-D maps have so many errors, as we are seeing with them confusing Tesla FSD. How are they going to make sure "every paved road" is accurate and up to date all the time? We are talking about a company that would cut corners to save 10 cents, even if it causes a few deaths.
 
Radar is the only sensor that reliably sees through weather obstruction, but its resolution is so low that it cannot be relied upon as a backup when light vision is down.

There is HD radar now with much higher resolution than conventional radar, high enough that it can be used as a backup to vision in bad weather. In fact, Waymo recently published a blog on how they are using HD radar to help see in poor visibility like fog.

 
Simple 2-D maps have so many errors, as we are seeing with them confusing Tesla FSD. How are they going to make sure "every paved road" is accurate and up to date all the time? We are talking about a company that would cut corners to save 10 cents, even if it causes a few deaths.

That’s a valid point. I wonder how well they’ve kept the current Super Cruise routes up to date.
 
Why is every company other than Tesla using radars and lidars? What’s more likely, that Waymo, Cruise/GM and everyone else have it right or that Elon, who has been promising coast to coast full self driving since 2016/2017 yet has to continuously upgrade the hardware to get there, is right? Overpromise, underdeliver.
Dogma ? Herd mentality ?

Personally - when it comes to FSD, since no one has got it yet - I'm glad there are different approaches being tried, instead of a single approach by everyone.
 
At launch, Ultra Cruise will cover more than 2 million miles of roads in the two countries, which includes city streets, subdivision streets and paved rural roads, in addition to highways.

If true, that'd be over 60% of the paved streets in the US & Canada, but given that this is GM, I strongly doubt it. This is coming from a former owner of numerous GM/Cadillac cars; 2 years ago we were driving a DTS and an Escalade. Now a Tesla 3 & Y... Willing to offer odds on this bet, even ...

Given that they are testing Super Cruise on a very, very limited sample of streets right now, I feel pretty good that would be a safe bet.
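As a rough sanity check of that "over 60%" figure (the road-mileage numbers below are approximate public statistics, used here purely as assumptions):

```python
# Rough sanity check of the "over 60%" claim (approximate, assumed figures).
us_paved_miles = 2_700_000      # rough FHWA-order figure for paved US public roads
canada_paved_miles = 260_000    # ~415,000 km of paved roads, converted to miles
ultra_cruise_miles = 2_000_000  # GM's claimed Ultra Cruise coverage at launch

share = ultra_cruise_miles / (us_paved_miles + canada_paved_miles)
print(f"{share:.0%}")  # 68%
```

So under these assumptions the 2-million-mile claim would indeed be a majority of the combined paved mileage, which is why it sounds so ambitious.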

So, 2 million miles. Is that all using HD maps? If so, let's do some fun calculations.

How long does it take to create an HD map of 1 mile? You have to run the map car slowly, say 25 mph? Then someone has to manually remove all the dynamic objects from the video/images. Maybe some of that is automated, but it still needs manual input. Essentially we are talking about "labeling" every 3-D image of that mile. Let's assume that is 1 person-day per mile.

So, we are talking about 2 million person-days of work to prepare HD maps for those miles. If 1,000 people are doing the labeling, it would take them about 8 years (at roughly 250 working days per year) to label 2 million miles.
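The back-of-envelope estimate above can be written out explicitly (the 1 person-day per mile labeling rate and the 250 working days per year are assumptions, not measured figures):

```python
# Back-of-envelope HD-map labeling estimate (all rates are assumptions).
miles_to_map = 2_000_000
person_days_per_mile = 1        # assumed manual labeling effort per mile
labelers = 1_000
working_days_per_year = 250     # rough working days per person per year

total_person_days = miles_to_map * person_days_per_mile
years = total_person_days / labelers / working_days_per_year
print(years)  # 8.0
```

Even if automation cuts the per-mile effort by 10x, that is still most of a year of work for a thousand people, before any remapping.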
 
Well, currently Super Cruise probably doesn't even touch 0.1% of the paved roads in the US, so it's much easier to keep that up to date.

Realistically, there is no way to claim "all paved roads" while also needing HD maps. Are they using just 2-D maps?
Tbh, I don’t really understand how any set of maps can be kept fully up to date unless it covers a city or smaller area. I mean, changes on roadways can occur on a daily basis. Otherwise, 2 million miles for Ultra Cruise is “only” 10x the 200,000 miles for Super Cruise.
 
Dogma ? Herd mentality ?

Personally - when it comes to FSD, since no one has got it yet - I'm glad there are different approaches being tried, instead of a single approach by everyone.

The main reason I’m more willing to believe Barra or Waymo on FSD than Tesla is that Elon, for all his genius, is kind of crazy and doesn’t always think things through before making incredibly bold claims and commitments, which often don’t pan out. Barra, on the other hand, will get the boot if she gets it that wrong. GM can’t just announce such a major upcoming product without a clear roadmap for how they are going to get there. I mean, the markets will slaughter them. They are answerable to investors, whereas Elon seems to just do his own thing, SEC be damned! 🤣
 
The main reason I’m more willing to believe Barra or Waymo on FSD than Tesla is that Elon, for all his genius, is kind of crazy and doesn’t always think things through before making incredibly bold claims and commitments, which often don’t pan out. Barra, on the other hand, will get the boot if she gets it that wrong.
I’m curious what you think Barra’s track record is…
 
I’m curious what you think Barra’s track record is…
GM promised and delivered Super Cruise functionality, and they are now testing Cruise autonomous ridesharing.

Of course time will tell what GM has on the market in 2022/23 and who gets to FSD first. Whoever it is, the sooner FSD comes, the more lives we can save and more convenience/comfort we can provide to people.
 
Just found this online on maps:

“GM relied on lidar-scanned high-definition maps for Super Cruise, but Ditman said it wasn’t practical to map all 2 million miles of road for Ultra Cruise. ‘We do rely on similar map data,’ he said. ‘However, we have a larger number of sensors that also observe the roads, so when we combine the map accuracy with what our sensors see of the road geometry and the road markings, we’re still able to accurately place ourselves and drive the right nominal path.’”

 