Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

No more radar means the car cannot see through fog?

In my 2016.5 Model S with AP1, I am certain of one thing: the radar, not the camera, saved me from an accident. I was driving down a divided road with Autopilot when my car started alerting and braking hard for no apparent reason. Just about the time I was ready to take over and stomp on the accelerator, the pickup in front of me slammed into the car in front of him, which had made a sudden, unexpected stop. The radar saw the car in front of the truck stopping before the pickup driver had reacted, and therefore long before I would have known to stop. I'm not a tailgater, so maybe I would have responded in time, but I highly doubt I would have had the time in this situation.
 
Where I live we get the Amish horse and buggies. Sometimes they are riding on the shoulder, and sometimes they are partially on the shoulder and partially on the road. In some areas this may be a rare occurrence, or never seen at all. How do you train the cars to recognize situations that may be normal in one area but may not exist in other areas? Do you train the whole fleet on a situation that only the locals deal with regularly?
 
Radar also sees under cars and is able to "see" the car in front of the car you are following - a piece of information Tesla AP will no longer have. Some other manufacturers' AEB (automatic emergency braking, no AP) systems even use this to brake earlier if your closure rate to the car ahead of the one immediately in front of you is too fast.
This is something that I will miss if they remove radar and vision can't account for it. I do know that it has warned me a couple of times by beeping before the person in front of me even braked. But I tend not to follow too closely anyway and would have stopped in time regardless.

It does seem like a step back to completely remove radar.
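The closure-rate braking idea described above can be sketched in a few lines. This is purely a hypothetical illustration: the function name, decel limit, and reaction margin are invented for the example, not any manufacturer's actual AEB logic.

```python
def should_brake_early(gap_m: float, closure_rate_mps: float,
                       max_decel_mps2: float = 6.0, margin_s: float = 0.5) -> bool:
    """Decide whether to pre-brake based on the closure rate to a vehicle
    two cars ahead (the kind of target radar can track under the lead car).

    Brakes if the distance needed to shed the closure speed, plus a
    reaction-time margin, meets or exceeds the current gap.
    """
    if closure_rate_mps <= 0:
        return False  # not closing on the target
    stopping_dist = closure_rate_mps ** 2 / (2 * max_decel_mps2)
    reaction_dist = closure_rate_mps * margin_s
    return stopping_dist + reaction_dist >= gap_m

# Closing at 20 m/s on a stopped car 40 m ahead (hidden behind the lead vehicle):
print(should_brake_early(gap_m=40.0, closure_rate_mps=20.0))  # → True
```

The point of the sketch: with only the immediately-leading car visible, the system cannot compute this closure rate until that car itself starts braking.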
 
Last time I used AP1 in the fog/rain, it worked fine. I hope that doesn't go away.
AP is actually very useful in bad weather conditions, like slower highway travel in torrential rain. Did the invisible grey car without headlights that just passed cut in front of me? Did the car (that I can't see) in front of the car in front of me (that I can see) stop suddenly? AP detects and reacts, which helps without me having to take my eyes off the road.
(There are plenty of traffic pile-up videos out there showing why stopping by the side of the road isn't necessarily safer, BTW.)

AP1 cars shouldn't be affected by this 'vision' stuff. Our system is entirely different from the current AP....we run on Intel MobilEye, not Tesla's home-grown version.
 
Where I live we get the Amish horse and buggies. Sometimes they are riding on the shoulder, and sometimes they are partially on the shoulder and partially on the road. In some areas this may be a rare occurrence, or never seen at all. How do you train the cars to recognize situations that may be normal in one area but may not exist in other areas? Do you train the whole fleet on a situation that only the locals deal with regularly?
I wonder how FSD would react to situations like this:
[image attachment]


or this:
[image attachment]


The big advantage that people have over a car's neural network is that human brains are trained on and off the road, with context coming not just from visual inputs but from sounds, smells, and most importantly the ability to understand a situation.
 
AP1 cars shouldn't be affected by this 'vision' stuff. Our system is entirely different from the current AP....we run on Intel MobilEye, not Tesla's home-grown version.
There are a number of things Tesla's software handles. Remember that early AP1 used to ping-pong in the lane, for example; that was fixed by Tesla, not by MobileEye firmware. AFAIK MobileEye is not an end-to-end solution - plug your cameras in here, your steering wheel/brake/accelerator actuation here, here are the settings you can change. This is why two different manufacturers using the same MobileEye solution may behave differently and have different capabilities.
 
I wonder how FSD would react to situations like this:
[image attachment]

or this:
[image attachment]

The big advantage that people have over a car's neural network is that human brains are trained on and off the road, with context coming not just from visual inputs but from sounds, smells, and most importantly the ability to understand a situation.

This is why *all* of Tesla's driver aids are just driver aids. I sincerely doubt they'll ever achieve true FSD without lidar or a similar technology.
 
There are a number of things Tesla's software handles. Remember that early AP1 used to ping-pong in the lane, for example; that was fixed by Tesla, not by MobileEye firmware. AFAIK MobileEye is not an end-to-end solution - plug your cameras in here, your steering wheel/brake/accelerator actuation here, here are the settings you can change. This is why two different manufacturers using the same MobileEye solution may behave differently and have different capabilities.

Agreed. What I'm saying is, I *think* Tesla has more ability to configure and regulate its home-grown AP than the MobilEye version.

So, if they were to try to limit AP1 to 'vision only,' it might disable MobilEye altogether, because it's expecting both vision and radar inputs... and I don't think Tesla can change that. Plus, I really don't see them investing any money in development for cars that are 4+ years old and lack many of the inputs that their 'vision' system needs (multiple front cameras, side/rear cameras).

But that's purely speculative.
 
Unlike the other manufacturers, Tesla has a large number of vehicles on the road capturing data that can be used for testing new software running in shadow mode, in parallel with a human driver or a previous version of the AP/FSD software.

It's very possible that Tesla has been testing Tesla Vision for quite a while and, based on the results of those tests in real-world conditions, concluded they do not need radar.

The Tesla test vehicles with lidar are probably being used to do additional validation of Tesla Vision - comparing the object recognition (type of object, location, relative speed and direction, size, ...) of TV vs. lidar and possibly also against what radar is reporting.

They may even be recording the data streams in the test vehicles - and where Tesla Vision made mistakes, they can revise the software and re-run the scenario until they get it right.

Since they have already been manufacturing vehicles with radar - and removing the radar isn't going to be a huge cost savings - it seems likely they are doing this only after enough testing to demonstrate that the radar data isn't needed.

This is a huge change for Tesla - and it seems unlikely they would make this change unless they were confident this was the right direction.

Though like any change Tesla makes, it could be a "two steps forward, one step backward" path, where it will take them a couple of releases to get all of the kinks worked out when deployed to the entire fleet.
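The shadow-mode testing described above, in miniature, amounts to running a candidate policy alongside logged human driving and flagging disagreements for review. All names below are invented for illustration; this is not Tesla's actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float
    human_action: str   # what the driver actually did: "brake", "steer", "none"
    shadow_action: str  # what the candidate software would have done

def shadow_disagreements(frames):
    """Return the frames where the shadow policy and the human driver
    disagreed -- the candidates to review, and to replay against revised
    software until the scenario is handled correctly."""
    return [f for f in frames if f.shadow_action != f.human_action]

log = [
    Frame(0.0, "none", "none"),
    Frame(0.1, "brake", "none"),   # shadow policy missed a braking event
    Frame(0.2, "brake", "brake"),
]
print([f.timestamp for f in shadow_disagreements(log)])  # → [0.1]
```

The appeal of the approach is that the fleet generates these comparison logs continuously, at no risk, since the shadow policy never actuates anything.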
 
AP1 cars shouldn't be affected by this 'vision' stuff. Our system is entirely different from the current AP....we run on Intel MobilEye, not Tesla's home-grown version.
Indeed. OTA updates are not always good; in fact, they can be bad. The fact that Tesla has stopped developing AP1 should be a good thing, since it actually does a pretty good job for its use case of highway driving assistance. This tweet should provide a small amount of comfort that AP1 won't be f'ed with as part of this "vision" thing.

[tweet screenshot attachment]
 
This is why *all* of Tesla's driver aids are just driver aids. I sincerely doubt they'll ever achieve true FSD without lidar or a similar technology.
Lidar is very nice for ranging and verifying what you know, but it isn't any better at handling elephants or hot tubs. In fact, it'd be even less able to tell what they are.

I don't think true L5 is coming any time soon because it's still unclear how to program the AI to deal with so many possibilities. You can't big data your way around elephants and hot tubs.
 
Indeed. OTA updates are not always good; in fact, they can be bad. The fact that Tesla has stopped developing AP1 should be a good thing, since it actually does a pretty good job for its use case of highway driving assistance. This tweet should provide a small amount of comfort that AP1 won't be f'ed with as part of this "vision" thing.

[tweet screenshot attachment]

"AP1 is Dead! Long live AP1!"

I'm happy that AP1 is mature. I can rely on it to operate the same way without change, and not hit a major issue because of a bad reaction to a new update.
 
"AP1 is Dead! Long live AP1!"

I'm happy that AP1 is mature. I can rely on it to operate the same way without change, and not hit a major issue because of a bad reaction to a new update.
I'm also happy with my 2015 AP1's performance, and recently upgrading to MCU2 made it even smoother: e.g., taking highway-speed curves, less desire to veer toward every exit when the right lane marking disappears, and general situational awareness of other cars. I also like radar's ability to see the car in front of the car in front of me, which I don't understand how a vision system could replicate.
 
Glad not to be part of the “does vision AP work in the rain?” drama all over Reddit right now. As others have said, Tesla removing radar removes a modality that can sense objects humans (and cameras) can't see, especially in rain, snow, or at night. This makes it potentially less useful as an ADAS while still falling far short of full autonomy. I think Tesla is heading into the messy territory of needing driver intervention more rarely but less predictably, with probably more scenarios where functionality is limited. This may be a strategic path forward for Tesla, but it sounds like a crappy customer experience to me at best, and potentially dangerous.
 
I'm also happy with my 2015 AP1's performance, and recently upgrading to MCU2 made it even smoother: e.g., taking highway-speed curves, less desire to veer toward every exit when the right lane marking disappears, and general situational awareness of other cars. I also like radar's ability to see the car in front of the car in front of me, which I don't understand how a vision system could replicate.
When you put a camera six inches above the road, you can see under the car in front to the car ahead of it. I imagine it's like the fellow above said: Tesla compared cameras with radar and didn't notice much of a difference, possibly even finding that cameras alone were better. Since none of us ran any tests, I'd go with Tesla's decision.
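The low-camera sightline is just similar-triangle geometry: a ray from the camera through the lead car's underbody edge continues on to whatever is beyond. A small sketch, with illustrative numbers only (not actual Tesla camera specs or mounting heights):

```python
def visible_height_two_ahead(cam_h: float, clearance: float,
                             d_lead: float, d_target: float) -> float:
    """Height up to which a camera mounted at cam_h metres can see a
    vehicle at d_target metres, looking through the gap under a lead
    car whose underbody sits at `clearance` metres, d_lead metres away.

    The limiting sight ray passes from the camera through the lead car's
    underbody edge; extend it by similar triangles to the target distance.
    """
    slope = (clearance - cam_h) / d_lead
    return cam_h + slope * d_target

# Camera 0.15 m (~6 in) up, 0.25 m underbody clearance 20 m ahead,
# target vehicle 40 m ahead:
print(round(visible_height_two_ahead(0.15, 0.25, 20.0, 40.0), 2))  # → 0.35
```

So in this toy example the camera could see the lower ~35 cm of the second car ahead - enough to spot its wheels, though only while the geometry and weather cooperate, which is where radar had the edge.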
 