Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Autonomous Car Progress

It's not about competition; it's about which system is safer, regardless of other factors (sales, price, popularity, volumes...).
The simplest is comma.ai's Openpilot with one camera, a DIY system for ~$1,000. I'm not sure if they leverage or override any on-board safety systems.

Tesla is going with 8 cameras and some ultrasonics for ~$10,000

Mobileye had separate vision-only and lidar systems but seems to be moving to a redundant vision + lidar blend (see Tam's post, "MobilEye describes its fusion").

Lots of other car manufacturers and systems are coming online with various combinations of sensors and capabilities.

Also autonomous delivery robots are already in use in certain markets including the USA.

The general public is still by and large unaware of the upcoming onslaught of these systems. We hear the odd special-interest story about a pizza robot somewhere, or the occasional autonomous crash, but it's usually just a gee-whiz Future story. Doesn't really affect us.

That's about to change. Once the general public and mainstream media realize it's actually happening, what will matter is whether the safety and convenience or the fatalities get more attention.

Where the industry goes next depends on that. If it's all safe and in the background then probably people won't really give it much notice. If it's running over pedestrians and offing Influencers/Grandmothers/CarIdiots then it's gonna be a witch hunt/angry mob situation.

Could go either way at this point.
 
  • Like
Reactions: diplomat33
....If it's running over pedestrians and offing Influencers/Grandmothers/CarIdiots then it's gonna be a witch hunt/angry mob situation....

The Starship autonomous food delivery robot already did $2,600 in damage and caused a stir last year:


No one was willing to take responsibility for the robot-vs-car collision: not the police ("not sure a robot is a car that can be ticketed"), not the city that issued the robot's permit, and finally not the company.

Only after the news covered the incident did the company suddenly become happy to cover the bill.

Starship Technologies seems to rely on cameras and not LIDAR.

 
  • Informative
Reactions: cwerdna and Dan D.
....The Starship autonomous food delivery robot already did $2,600 in damage and caused a stir last year.... Starship Technologies seems to rely on cameras and not LIDAR....

Thanks, that's the kind of problem that will do harm.
Hassle from the authorities. Starship is belligerent, refusing to accept blame or release video of the accident. There's no phone number on the device in case of an accident; the company claims "the website" is good enough. Likely (?) no emergency shutoff in case it snares, say, a dog leash; escalators and elevators have them. (Edit: it should stop when it senses a problem. Perhaps a good kick would stop it.)

The spokesman is dismissive when asked about its safety systems (1:50), such as what happens if a child tries to ride it. "It's not really meant to be ridden on," he says. "It does not cause any danger."

This is what will lead to regulations. It's a shame they can't be proactive and anticipate the problems, but there you go.

It barely avoids the only pedestrian around 0:21 and runs into her foot at 1:13. (It's just advertising, I know.)

"Using ten cameras, ultrasonic sensors, radar, neural networks, and other technology the robots can detect obstacles, including animals, pedestrians/cyclists, and other delivery robots."
And presumably cars? Other paragraphs variously say 9 and 12 cameras, so somewhere between 9 and 12 cameras anyway.
 
Last edited:
..."Using ten cameras, ultrasonic sensors, radar, neural networks, and other technology the robots can detect obstacles, including animals, pedestrians/cyclists, and other delivery robots."...

Thanks for listing its hardware. Apparently, that's not enough, because this was the second time that robot had a collision at the very same GPS coordinates!



This company gives the pro-vision technology side a bad name.
 
After the fatal 2016 Autopilot crash into and under a white tractor-trailer, Tesla explained the sensor problem: "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."

But so far, no one has voiced complaints against LIDAR and its fusion in such a scenario.

MobilEye describes its fusion in terms of separate systems of

1) Vision
2) LIDAR + RADAR

When one fails, the other takes over.


I really like Mobileye's approach in that they have a backup system. I would assume that you drive camera-only and the other sensors are used if the cameras are somehow deemed inoperable?
 
  • Like
Reactions: diplomat33
Mobileye’s approach makes no sense. The way Amnon explains it is also nonsensical. I believe they’re only taking this “different” approach as a marketing ploy to say they’re special.

I think it's easy to understand.

MobilEye had a vision of how machines could drive with just cameras, with no need for other sensors, and it was able to convince Tesla to use its system.

MobilEye wanted autonomous vehicles too, but its approach was step by step, very conservative.

Instead of trying to figure everything out with all the sensors at once, it wanted to concentrate on vision first and work out the bugs without the distraction of other sensors (if there's a fault, it's the camera system, not radar, not sonar, because vision is all there is to work with).

It's similar to figuring out which instrument is a clarinet and which is a flute using just one sensor: your ears, with your eyes shut. If the guess is wrong, it's the ears' fault, because the eyes are shut and not contributing guesses.

Once the ears are perfected, shut the ears and open the eyes, and tell the clarinet from the flute.

With fusion, the eyes and the ears help each other at the same time to guess which is which.

Now back to the history: in 2014, it sold Tesla an ADAS that required the driver's attention on the road, but when Tesla first sold Autopilot, Tesla was too enthusiastic and started to talk about how Autopilot could become an autonomous vehicle someday.

MobilEye didn't like that kind of talk, which might confuse an ADAS with an autonomous-vehicle system.

In 2016, after the first Autopilot death, MobilEye didn't want anything to do with Tesla anymore, for fear of being associated with the recklessness of overselling an ADAS as if it could do more, or worse, as if it were FSD.

MobilEye continued to perfect its ADAS while researching an autonomous system with vision alone.

That doesn't mean MobilEye won't use other sensors once it has perfected vision. Now it's adding radar + lidar, for commercial fleets first and later, in 2025, for consumers.

I think MobilEye's redundancy is not the traditional kind of "redundancy," like Tesla's two identical chips on a board, each with its own power circuit so that if one circuit fails the other can take over.

MobilEye's redundancy means that if the camera thought it was seeing white sky, as in the 2016 fatal Autopilot vs. tractor-trailer accident, and failed to initiate braking, the other independent system of radar + lidar would step in and initiate braking.

Both independent systems run at the same time; when one fails to detect the danger, the other steps in and intervenes.

It's not fusion, because each system doesn't need the other to identify danger (fusion is like the ears asking the eyes to confirm it's a flute, not a clarinet).

It's not traditional redundancy, because the two systems are not exact duplicates (two of the same chip, two of the same program...).
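In code terms, the redundancy described above (as I understand it; the scene model and channel behaviors below are my hypothetical illustration, not Mobileye's actual software) is just an OR of two independent verdicts:

```python
# Toy sketch of OR-gated redundancy: two independent perception channels each
# reach their own braking verdict, and the vehicle brakes if EITHER flags a hazard.

def vision_channel(scene):
    # Hypothetical failure mode: cameras miss a white trailer against a bright sky.
    return scene.get("hazard") and scene.get("lighting") != "white_on_bright_sky"

def lidar_radar_channel(scene):
    # Active sensors measure range directly, so lighting does not matter here.
    return bool(scene.get("hazard"))

def should_brake(scene):
    # "Redundancy, not fusion": no cross-channel talk, just an OR of two verdicts.
    return vision_channel(scene) or lidar_radar_channel(scene)

# The 2016-style scenario: vision misses, but lidar/radar still triggers braking.
print(should_brake({"hazard": True, "lighting": "white_on_bright_sky"}))  # True
```

The key property is that removing either channel leaves a system that can still brake on its own, which is what distinguishes this from fusion, where one combined estimate is formed before any decision.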

The mistaken white sky is only one of many camera failure modes.

The camera fails to detect the bus and the car due to the sun being too low on the horizon, but LIDAR has no problem:

[attachment: 1618375009829.png]




The camera fails to detect two pedestrians in low light, but LIDAR has no problem:

[attachment: 1618374973770.png]
 
Last edited:
Sorry, but Mobileye's approach makes little sense if you actually look into the details and how Amnon explains it. I'm not the sort to summarize, but look into the following:

REM crowd-sourced mapping uses only 10 KB of data per km. They don't source images or videos from the fleet. Even worse, Mobileye has yet to deploy a car with its entire camera-only sensor suite. All they have is the current ADAS camera fleet (mostly one to three forward-facing cams), which, again, they don't and can't source images or video from.
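For a rough sense of scale of that 10 KB/km figure, here's a back-of-envelope calculation (the road-network length is my outside assumption, roughly the ~4.1 million miles of US public roads; it's not from the post):

```python
# Back-of-envelope size of a REM-style map at the quoted 10 KB per km.
KB_PER_KM = 10
us_public_road_km = 6_600_000  # assumption: ~4.1M miles of US public roads

total_kb = KB_PER_KM * us_public_road_km
total_gb = total_kb / 1e6  # 1 GB = 1e6 KB (decimal units)

print(f"{total_gb:.0f} GB")  # a US-wide map fits in tens of gigabytes
```

At that density the map clearly can't contain imagery, only sparse semantic landmarks, which is consistent with the claim that no images or video come back from the fleet.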

Amnon claims that the safety of separate camera-only and lidar/radar-only subsystems is the product of their individual safety levels. This is so wrong it's funny. The safety of any autonomous system is determined by its weakest link, not the product of two subsystems.
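As a toy numeric sketch of the two positions (the miss probabilities below are made-up illustrative numbers, not anything Mobileye has published):

```python
# Illustrative numbers only -- not real failure rates.
p_vision_miss = 1e-3       # assumed chance the camera-only subsystem misses a hazard
p_lidar_radar_miss = 1e-3  # assumed chance the lidar/radar subsystem misses it

# The "product" argument: if the two subsystems fail INDEPENDENTLY,
# the combined system misses only when both miss at once.
p_miss_if_independent = p_vision_miss * p_lidar_radar_miss  # roughly 1e-6

# The weakest-link counter-argument: if failures share a common cause
# (e.g. both depend on the same vision-built map), the combined miss rate
# is no better than that of the worse subsystem.
p_miss_if_correlated = max(p_vision_miss, p_lidar_radar_miss)

print(p_miss_if_independent, p_miss_if_correlated)
```

The whole dispute boils down to whether the independence assumption behind the multiplication actually holds; any shared dependency pushes the real number toward the weakest-link figure.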

Mobileye is planning to use REM to localize its lidar/radar subsystem, but REM is built entirely from vision. You can connect the dots: the supposedly independent subsystems share a vision dependency.

It's very clear to me that Mobileye is a sort of Nikola-esque company/approach right now. There are many logical disconnects for critical thinkers. They're not making meaningful company agreements. There's a lot of marketing jargon that has little underlying engineering sense.
 
Ford unveils hands-free highway L2:

"Ford is today unveiling BlueCruise, a hands-free highway driving system, which it describes as “similar to Tesla Autopilot.” The new system is going to be pushed as an optional over-the-air software update to Ford Mustang Mach-E owners."

Press release:

"BlueCruise is an SAE Level 2 driver-assist technology, similar to Tesla Autopilot but with the advantage of offering a true hands-free driving experience while in Hands-Free Mode that does not require a driver’s hands to stay in contact with the steering wheel, unless prompted by vehicle alerts.

And unlike other approaches – such as GM’s Super Cruise, which uses red and green lighting, or Tesla’s Autopilot, which requires a driver keep their hands on the steering wheel – BlueCruise communicates with drivers in different ways. The instrument cluster transitions to communicate that the feature is in Hands-Free mode through text and blue lighting cues, effective even for those with color blindness."


 
What is a prequalified highway? Is this system destined to drive on city streets?

The system only works on divided highways that Ford has approved it for. Right now, that's about 100,000 miles of divided highways in the US and Canada. The system is not designed for city streets; it is only for highway driving.