60 mph; excellent road markings and AP2 tried to throw the car at the median

Meanwhile, the competition seems to be aiming for a 360-degree vision, lidar and radar combination, as all three technologies have certain benefits and only with a 360 view from all of them can all scenarios be covered.

I am sure there will be other setups between these extremes, but IMO the argument Elon and Tesla are making is that vision is enough, when the rest of the major players believe otherwise.

And we're almost one full year into the Tesla HW2 platform, and it still can't tell when it's raining. That really inspires confidence that it can avoid killing everyone in my car.
 
Nobody aims for just lidar AFAIK. Radar is there too.

I think what it boils down to, deep down, is actually this:

Is vision sufficient for self-driving? Is vision-only good enough or better than other methods?

Tesla is basically at the moment aiming for a vision-only system, complemented by a narrow front radar for highway driving and slow ultrasonics for low-speed maneuvering. But basically it is a vision system, given the inherent limitations of the latter two.

Meanwhile, the competition seems to be aiming for a 360-degree vision, lidar and radar combination, as all three technologies have certain benefits and only with a 360 view from all of them can all scenarios be covered.

I am sure there will be other setups between these extremes, but IMO the argument Elon and Tesla are making is that vision is enough, when the rest of the major players believe otherwise.
Interesting. One argument Elon was making in favor of radar was that Lidar was too expensive. So I guess it's no longer the case (considering we are seeing it in some consumer vehicles)?

Also sometimes it seems to me that Elon is really the one making these arguments and the engineers at Tesla have no other choice but to follow (or of course resign like the head guy did in December).

Anyway, for us Model 3 reservation holders it's a sad topic. I never believed that self-driving would happen with this, but I kind of hoped that we would have a reliable Autopilot at some point. I fear that the whole thing needs to be rearchitected, and that Lidar will eventually be needed (i.e. a hardware upgrade). The chances of that happening will be magnified as Lidar becomes more affordable (which, reading these pages, seems to be happening already).

My knowledge of Lidar is very limited, but I know we have actually had some driverless vehicles for years without issues. Google claimed 700,000 miles without an accident (or a few accidents where the car was not at fault, if I recollect).

I don't know all the details. I realize that radar is also used. But it would seem to me that a full 3D cartography of the surroundings would be a huge benefit.

I will likely pay the $5k for EAP, mostly for TACC

Thanks
 
Interesting. One argument Elon was making in favor of radar was that Lidar was too expensive. So I guess it's no longer the case (considering we are seeing it in some consumer vehicles)?

I think the point has been mentioned before that Tesla is obviously trying to do FSD on the cheap. Certainly with AP1 they did more with a much more limited suite than the competition. In AP2 a vision-only (mostly) system allows Tesla to get away with minimal sensor fusion - possibly yielding faster results (as in, ships earlier) with less processing power...

And yes, since they want to put an FSD suite in every car as soon as possible, sensor cost is likely a factor too. It will be interesting to see if and how Tesla's suite and narrative possibly changes over time as the technology gets cheaper. I guess we just don't quite know yet which types of suites will prevail.

Elon has argued quite vocally against lidar in FSD cars. Is it just a case of "sell what you ship now" or truly some foresight that the rest of the industry is not grasping but will realize later?

Volvo, for example, aims for, I believe, 360 vision, 360 lidar and 360 radar, but that system is still some years from shipping... The first Level 3 car (highways at traffic-jam speeds), the new Audi A8, has several radars, one lidar and one driving camera - the lidar and driving camera cover the front there, but that system is limited to highways, where that sounds sufficient with the corners and rear covered by radar...
 
Elon has argued quite vocally against lidar in FSD cars. Is it just a case of "sell what you ship now" or truly some foresight that the rest of the industry is not grasping but will realize later?
My honest opinion is that Musk genuinely believes we can do FSD with almost entirely vision (cameras) alone, and that he wants to drive the development to the point where it's a foregone conclusion that cameras can do everything, since humans are able to do the driving with that alone. Yes, it's probably ambitious, but as I've said before, with enough software and computing power, why not? Musk is trying to force the development to happen.
 
My honest opinion is that Musk genuinely believes we can do FSD with almost entirely vision (cameras) alone, and that he wants to drive the development to the point where it's a foregone conclusion that cameras can do everything, since humans are able to do the driving with that alone. Yes, it's probably ambitious, but as I've said before, with enough software and computing power, why not? Musk is trying to force the development to happen.

Vision only probably needs less computing power than triple or quadruple sensor fusion.

I guess the biggest argument is why not aim for superhuman senses. Sensor fusion certainly takes that further than vision alone, even if vision too can be better than eyesight...

One reason for superhuman senses is that humans are less susceptible to things like sensor blockage (they can always just get out and check/clean, which the car cannot). So very robust suites can help there as well.
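
To make the "sensor fusion" part concrete, here is a minimal sketch of the simplest possible case: combining two noisy range estimates for the same object (say one from vision, one from radar) by inverse-variance weighting, which is the scalar version of a Kalman update. The sensor names and noise figures are made up for illustration, not anything Tesla or anyone else actually ships.

```python
# Toy sketch of sensor fusion: merge two independent range estimates
# by inverse-variance weighting (the scalar Kalman update step).
# Sensor names and noise values are invented for illustration.

def fuse(range_a, var_a, range_b, var_b):
    """Fuse two independent range estimates (meters) with given variances (m^2)."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * range_a + w_b * range_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Hypothetical readings: a camera depth estimate is noisy at range,
# a radar range measurement is much tighter.
camera_est, camera_var = 42.0, 4.0
radar_est, radar_var = 40.0, 0.25

fused, fused_var = fuse(camera_est, camera_var, radar_est, radar_var)
print(f"fused range ~ {fused:.1f} m, variance ~ {fused_var:.2f} m^2")
```

The arithmetic itself is cheap; the cost in a real car comes from doing this per object, per sensor, per frame, and from first deciding which detections from different sensors belong to the same object, which is roughly where the "more sensors, more processing" trade-off lives.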
 
Vision only probably needs less computing power than triple or quadruple sensor fusion.

I guess the biggest argument is why not aim for superhuman senses. Sensor fusion certainly takes that further than vision alone, even if vision too can be better than eyesight...

One reason for superhuman senses is that humans are less susceptible to things like sensor blockage (they can always just get out and check/clean, which the car cannot). So very robust suites can help there as well.
Radar and ultrasonics make it superhuman too. Again, I don't disagree that lidar provides more information than radar+u/s+vision, but I was there when GPS in-car navigation was in its infancy - the number of extra sensors cars needed to make the information useful was extreme. These days the bog-standard, cheap-arse crap GPS in an average mobile phone embarrasses what cars of 20 years ago could do. It's the same thing. Aim for doing everything with just cameras, since we know it will get there eventually. That's Musk's thinking. Whether it's too early for that or not is, in my opinion, what the real debate should be; not whether it will be possible or not.
 
Radar and ultrasonics make it superhuman too. Again, I don't disagree that lidar provides more information than radar+u/s+vision, but I was there when GPS in-car navigation was in its infancy - the number of extra sensors cars needed to make the information useful was extreme. These days the bog-standard, cheap-arse crap GPS in an average mobile phone embarrasses what cars of 20 years ago could do. It's the same thing. Aim for doing everything with just cameras, since we know it will get there eventually. That's Musk's thinking. Whether it's too early for that or not is, in my opinion, what the real debate should be; not whether it will be possible or not.

Radar and ultrasonics only cover very narrow cases, though, in Tesla's case. This is not just about Lidar, but also the lack of 360 radar...

Tesla has one narrow front radar. Much of the competition already has near-360 radar coverage in their current driver aids and is aiming for 360-degree radar...
 
I still stand by the notion that Tesla is aiming for (near) the minimum feasible FSD suite. In the future they may find some other suite more optimal, but they are starting with the bare essentials.

Out of 360 vision, 360 radar, 360 lidar, 360 ultrasonics, only the first is really, really required for FSD. Radar and ultrasonics seem to be there mostly for legacy reasons - so that the current AP1 approximation can do its thing - and because the suite does not have full 360 vision on the bumper level (it has a significant dark spot all around the nose).

Whether or not Tesla continues adding more radars for example, remains to be seen. But IMO everything points to Tesla trying to get there early and to get there in volume - and to do that, get there with the minimum suite. Basically Tesla is trying to under-engineer this and it remains to be seen how well that works for them.

The competition is approaching FSD from a much more robust expectation with triple or quadruple redundant sensors. Tesla of course also still has time to follow that path if they so choose; the competition is not here yet for the most part.
 
I still stand by the notion that Tesla is aiming for (near) the minimum feasible FSD suite. In the future they may find some other suite more optimal, but they are starting with the bare essentials.
Which makes sense from a business standpoint, right? They could've started with "max" but started with "bare minimum". I would do that if I were Musk, as long as I was confident (not stupidly so) that it could do it.

Question is: has Tesla acted stupidly, or was the decision reasonable?
 
Which makes sense from a business standpoint, right? They could've started with "max" but started with "bare minimum". I would do that if I were Musk, as long as I was confident (not stupidly so) that it could do it.

Sure, but what makes business sense and what is the optimal FSD suite should be kept separate, to have an accurate conversation. :)

Question is: has Tesla acted stupidly, or was the decision reasonable?

That is certainly another debate. I think there is still hope for them on this, but the ramp-up has been abysmal.
 
Radar and ultrasonics make it superhuman too. Again, I don't disagree that lidar provides more information than radar+u/s+vision, but I was there when GPS in-car navigation was in its infancy - the number of extra sensors cars needed to make the information useful was extreme. These days the bog-standard, cheap-arse crap GPS in an average mobile phone embarrasses what cars of 20 years ago could do. It's the same thing. Aim for doing everything with just cameras, since we know it will get there eventually. That's Musk's thinking. Whether it's too early for that or not is, in my opinion, what the real debate should be; not whether it will be possible or not.
From what I read, Google uses lidar information to match against preloaded 3D maps, allowing very precise localization of the car (unlike GPS, which is nowhere near as precise).

Maybe Tesla does something similar with its cameras, though. I don't know.
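
For anyone curious what "matching a lidar scan against a preloaded map" looks like in principle, here is a toy 2D sketch: a brute-force search over candidate poses for the one that best aligns a simulated scan with a stored point map. This is not Google's actual pipeline (real systems use proper scan matching or particle filters in 3D); the map shape and all numbers are invented for the example.

```python
# Toy 2D sketch of localizing against a prior lidar map: search over
# candidate poses for the one that best aligns a fresh "scan" with the
# stored map. Real systems use scan matching / particle filters in 3D;
# this only illustrates the idea. All numbers are invented.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

# Prior map: points along two perpendicular "walls" (curbs, building faces).
map_pts = np.vstack([
    np.column_stack([np.linspace(0, 20, 200), np.zeros(200)]),
    np.column_stack([np.zeros(200), np.linspace(0, 20, 200)]),
])
tree = cKDTree(map_pts)

# The car's true pose in the map frame (unknown to the matcher).
true_xy, true_yaw = np.array([3.0, 1.5]), np.deg2rad(5.0)

def rot(yaw):
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s], [s, c]])

def to_map_frame(scan_pts, xy, yaw):
    """Transform scan points from the car frame into the map frame."""
    return scan_pts @ rot(yaw).T + xy

# Simulate a noisy scan: the map as seen from the true pose.
scan = (map_pts[::5] - true_xy) @ rot(true_yaw)
scan += rng.normal(0, 0.05, scan.shape)

def score(xy, yaw):
    """Higher is better: negative mean distance from scan points to the map."""
    dists, _ = tree.query(to_map_frame(scan, xy, yaw))
    return -dists.mean()

# Brute-force pose search over a small grid around a rough GPS prior.
candidates = [
    (np.array([x, y]), np.deg2rad(th))
    for x in np.arange(2.0, 4.01, 0.25)
    for y in np.arange(0.5, 2.51, 0.25)
    for th in np.arange(0.0, 10.1, 2.5)
]
best_xy, best_yaw = max(candidates, key=lambda pose: score(*pose))
print("estimated pose:", best_xy, round(float(np.rad2deg(best_yaw)), 1), "deg")
```

The same idea scales up to 3D point clouds and real prior maps; the expensive parts are building and storing those maps and doing the matching fast enough in real time.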
 
Vision only probably needs less computing power than triple or quadruple sensor fusion.

I guess the biggest argument is why not aim for superhuman senses. Sensor fusion certainly takes that further than vision alone, even if vision too can be better than eyesight...

One reason for superhuman senses is that humans are less susceptible to things like sensor blockage (they can always just get out and check/clean, which the car cannot). So very robust suites can help there as well.

Like you mentioned in your other comment, for GPS to really work you needed more sensors at first. Why was that? Because with more information you need a less clever computer. I think it's the same with self-driving.

A human brain is incredibly more competent than any computer today. So our own eyes work well enough for driving a car. But if you have something as stupid as a computer, it needs every bit of information it can get to make it work.

Essentially, once computers get smart enough, I am sure a couple of cameras could do the job well enough. The real question is when that's going to happen.
 
Like you mentioned in your other comment, for GPS to really work you needed more sensors at first. Why was that? Because with more information you need a less clever computer. I think it's the same with self-driving.

A human brain is incredibly more competent than any computer today. So our own eyes work well enough for driving a car. But if you have something as stupid as a computer, it needs every bit of information it can get to make it work.

Essentially, once computers get smart enough, I am sure a couple of cameras could do the job well enough. The real question is when that's going to happen.
I would think that from an algorithm perspective, image recognition (vision) is far more intensive than relying on lidar, which provides a full 3D picture of the environment. You know exactly where you can go, etc.
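
A rough way to see that asymmetry (a toy sketch with invented numbers, not any particular vendor's pipeline): with a lidar point cloud, "is there something in my lane within 30 m?" is just a geometric filter over measured 3D points, whereas with a camera you would first need a trained network to recover depth or detect objects from pixels before you could even ask the same question.

```python
# Toy sketch: obstacle check straight from a lidar point cloud.
# Lidar measures 3D positions directly, so this is a plain geometric
# filter; a camera-only system would first need a learned model to
# infer depth/objects from pixels. All numbers are invented.
import numpy as np

def obstacles_ahead(points, corridor_width=2.0, max_range=30.0,
                    min_height=0.3, max_height=2.5):
    """Return points inside a corridor ahead of the car.

    points: (N, 3) array of (x, y, z) in meters, car frame
            (x forward, y left, z up).
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    mask = (
        (x > 0) & (x < max_range)              # in front, within range
        & (np.abs(y) < corridor_width / 2)     # inside the lane corridor
        & (z > min_height) & (z < max_height)  # above road, below overhead signs
    )
    return points[mask]

# Hypothetical frame: ground returns plus one object ~12 m ahead.
rng = np.random.default_rng(1)
ground = np.column_stack([
    rng.uniform(0, 30, 500),       # x
    rng.uniform(-5, 5, 500),       # y
    rng.normal(0.0, 0.05, 500),    # z: near road level
])
car_ahead = np.array([[12.0, 0.2, 0.8], [12.1, -0.3, 1.1]])
cloud = np.vstack([ground, car_ahead])

hits = obstacles_ahead(cloud)
if len(hits):
    print(f"{len(hits)} obstacle points in the corridor, nearest at {hits[:, 0].min():.1f} m")
else:
    print("corridor clear")
```

That is the sense in which lidar hands you the 3D structure directly, while a vision system has to earn it with computation.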
 
I received a software update a few days ago. The release notes promised resolution of some unspecified bugs, so I tried a few things.

Summon: I have a two car garage with separate doors, so there's a center divider between them. There's plenty of clearance on both sides. I told the system to back out into the driveway. With the car halfway out of the garage it inexplicably turned the wheels and aimed the front fender at the center divider. I managed to abort the process about two inches from a collision.

AP2: out on the highway I had several instances of hard braking for no apparent reason. Then, driving in the left lane at about 65 mph, it took a sudden lurch toward the metal barrier. Because I had my hands on the wheel I managed to avoid the collision, but overcoming the steering wheel resistance caused a rapid overcorrection to the right and then a correction back to the left.

My frightened passenger, who long ago promised to love and honor - but not to obey - me, begged me to never again turn on the AP system with or without her in the car. I'm a compulsive early adopter of new technology. She tends to embrace what works and improves our lives, and reject gadgetry that just adds complexity. So when she says NFW, I listen.

The novelty of AP was exciting. Now I'm at the point where all it does is cause me extra stress and anxiety while decreasing safety. I'm a safe and confident driver. I would love the convenience of a competent Adaptive Cruise Control, like other cars have. AP2's other features should be recalled, voluntarily or not, until it can be made safe.

Note to those who attack anyone expressing concerns about their Teslas, and are about to reply with criticism of my reading comprehension or general intelligence, or to demand statistical evidence, or tell me to just stop using it if I can't handle it: My concern isn't just for my own car. I think Teslas operating under AP2 are a hazard to other vehicles as well as their own, even with a vigilant driver. The sudden braking for no reason could easily cause a rear-end collision. Attempts to correct sudden lurching into barriers or other lanes can affect other cars. In the face of Tesla's stonewalling and continued failure it's time to blow the whistle.

Anyone who disagrees with what you wrote should be ashamed.
 
Can I ask what mileage people have on their cars that have random braking or swerving episodes? I know that when I first bought my car back in June '17, I had a very rough start with the AP system with my car under 3,000 miles. I think I had a random slowdown, hard braking, or even a swerve at least once or twice a week. Now at 5,600 miles, I haven't had an incident that leaves me both anxious and angry for a good month at this point.
 
Can I ask what mileage people have on their cars that have random braking or swerving episodes? I know that when I first bought my car back in June '17, I had a very rough start with the AP system with my car under 3,000 miles. I think I had a random slowdown, hard braking, or even a swerve at least once or twice a week. Now at 5,600 miles, I haven't had an incident that leaves me both anxious and angry for a good month at this point.
10.5k