Welcome to Tesla Motors Club

Emperor's new clothes

I'm far from a fan of Autopilot, and I think Musk has overblown expectations, but I also have a problem with poor reporting.

The fire-truck accidents happened WITH forward-facing radar; the vision-only approach came much later, yet the article suggests that vision-only is the cause.

It also says "Unlike technologists at almost every other company working on self-driving vehicles..." but fails to mention that Mobileye, one of the biggest players, has a vision-only development stream called SuperVision. Given there are only three or four mega-players in this space (Tesla, Mobileye and Waymo being the names we all know), to say "almost every" is a little disingenuous, as the exception could be argued to be the company insisting on LIDAR and other technology.

Personally, though, my primary reservation is Musk's belief that as long as it's safer than humans it will be approved, but that is just not the reality. It needs to be safer than the safest human, otherwise a good driver is at more risk using the system than not, and that's not a position that will be tolerated.
 
...fails to mention Mobileye

Please note that Mobileye's SuperVision is positioned as ADAS, i.e. L2 on the autonomy scale.

"Mobileye SuperVision
What's Behind the Name?
The classic definition of “supervision” is watching over someone or something to ensure everything is done properly and safely. It also speaks to the quality of possessing extraordinary capabilities for sight, which our surround camera configuration brings to the table. Equally important however, is that this is an ADAS system, so it still requires human oversight"

Mobileye believes L3 and above need the addition of radar and lidar.
 
SuperVision was a bad choice by DF, but Mobileye certainly believes (or believed) that vision-only is OK.

From their website:

From the outset, Mobileye’s philosophy has been that if a human can drive a car based on vision alone – so can a computer.
 
Well written comment @DrJFoster, agree with you completely.
Autopilot, or whatever other manufacturers want to call it, is just another driving aid, like lane keeping or traffic-sign recognition, designed to assist an attentive driver, not to take over responsibility from one.
The aim of full self-driving has to be admired, but the variables that need to be considered and acted upon are so complex, and so numerous as to be virtually impossible to calculate, that the technology and processing required is far beyond what is available today in any car; indeed, it would be a challenge even for a Cray supercomputer running perfect software.

Full Self-Driving is just the name Tesla has given to a conglomeration of different aids brought together as one complete function. It's useful and works OK-ish most of the time on a motorway. I have FSD but only engage it on a motorway, and only because I can: it's impressive when it does something well, but it's about as satisfying as giving the car a full-throttle blast. It's temporary, gives you a high, and of course earns bragging rights in the pub.

Is it worth the money? Absolutely not; it's not worth 10% of the charge applied. But so many seem to want a system where they can just sit in their car, tell it where to go, and sit back reading the newspaper, and some probably believe the hype that some day their car will drive itself. It will never work with the current range of cars or the technology available today; indeed, I don't envisage the day when any automaker has a system close to Level 5 autonomy. But trying to get there shouldn't be discouraged.
 
Absolutely an interesting set of changes to watch! There are definitely measurable improvements in each point release of the FSD beta, and these are coming every 3-4 weeks. Project that into the future and they will hopefully get pretty close. Possibly not read-the-newspaper close, but nearer than they are just now.

I actually hold the wipers up as an example that the approach can work. Every other automaker has a dedicated sensor for wipers; Tesla doesn't. I do wish they had just included the damn sensor, but there you go. Given that they didn't, once they got around to focusing on the problem, they got it sorted pretty well. It's maybe not as good as a dedicated sensor, but it proves, for me, that train the NN, deploy to the fleet, refine, redeploy can work. That they stopped at 90% quality is unfortunately very typical, but it is part of the price you pay for the company also being able to, e.g., move to the Octovalve, or keep producing cars by switching the chips they use, or focus more on addressing the driving elements of FSD. It's the price of agility: only the truly squeaky wheels get attention.
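The train-the-NN, deploy-to-fleet, refine loop described above can be sketched minimally. Everything here (the toy Model class, the function names, the example cases) is illustrative only and in no way Tesla's actual pipeline; it just shows the shape of the iteration: deploy, collect the cases the model gets wrong, retrain on those, redeploy.

```python
# Hypothetical sketch of a fleet-learning loop: deploy, gather failure
# cases from the fleet, retrain on them, repeat. Purely illustrative.

class Model:
    """Toy stand-in for a neural network: it simply 'learns' cases seen."""
    def __init__(self):
        self.known = set()

    def update(self, examples):
        self.known.update(examples)

    def handles(self, case):
        return case in self.known


def train(model, examples):
    """Stand-in for a training step on newly collected hard cases."""
    model.update(examples)
    return model


def fleet_learning_loop(model, fleet_observations, rounds=3):
    """Each round: find the cases the deployed model fails on, retrain."""
    for _ in range(rounds):
        failures = [c for c in fleet_observations if not model.handles(c)]
        if not failures:
            break  # nothing left to refine
        model = train(model, failures)
    return model


model = fleet_learning_loop(Model(), ["rain", "mist", "spray", "low sun"])
print(all(model.handles(c) for c in ["rain", "mist", "spray", "low sun"]))  # True
```

The point of the sketch is the feedback structure, not the learning itself: the fleet acts as the data-collection stage, and only the failure cases feed the next training round.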

And although I'm happy to follow and watch FSD develop, I'm glad I didn't pay for it. I do get tempted by EAP though, which I think is all that will be useful to us in the UK within the lifetime of my car.
 
I say that if Tesla can't get the basics right (auto wipers and headlights, and all the other stuff that every other automaker nailed years ago), then what faith should one have that they will get FSD right?

If it's the same team developing both then I would agree with you ... but I'm pretty sure it isn't. The people developing the engine in a Formula 1 car are not the same people developing the aero package. Of course somebody overseeing the whole project has to sign off both as being ready for use ... hmm ... not Elon but it must be someone who reports to him.
 
I actually hold the wipers up as an example that the approach can work. Every other automaker has a dedicated sensor for wipers; Tesla doesn't. I do wish they had just included the damn sensor, but there you go. Given that they didn't, once they got around to focusing on the problem, they got it sorted pretty well. It's maybe not as good as a dedicated sensor, but it proves, for me, that train the NN, deploy to the fleet, refine, redeploy can work.
Funny, but it proves to me that they can't. A stubborn insistence that AI solves everything when the existing solution was better; it's taken them four years and countless iterations of the wipers (remember Deeprain?) and it's still only "almost as good" as a 30p rain sensor.

There are some fundamental physics-based issues. The camera's focal length means it can't focus on the windscreen itself; in fact, it would be a poor design if a single raindrop over the lens obliterated a large portion of the field of view. As a consequence, they need to determine whether the windscreen needs wiping by looking at the surrounding area and inferring it, and that's never going to be as accurate. If it were simply either "raining heavily" or "not raining" it would be, but we have a thousand intermediate states: misting from temperature differentials, mizzle, splashes from other traffic, plus rain on what appears to be a sunny day (it happens).
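One concrete way to see why those intermediate states are awkward: any inferred "wetness" score will be noisy, and a single on/off threshold makes the wipers chatter whenever the score hovers near it. A common generic mitigation is hysteresis (separate on and off thresholds). This is a minimal sketch of that idea under my own assumed names and thresholds, not Tesla's implementation.

```python
# Generic hysteresis sketch: map a stream of noisy 0..1 "wetness" scores
# to stable on/off wiper states. Thresholds are illustrative.

def wiper_states(scores, on_threshold=0.6, off_threshold=0.4):
    """Turn on only above on_threshold; turn off only below off_threshold.

    The gap between the two thresholds stops a score jittering around a
    single cut-off from toggling the wipers every frame.
    """
    on = False
    states = []
    for s in scores:
        if not on and s > on_threshold:
            on = True
        elif on and s < off_threshold:
            on = False
        states.append(on)
    return states


# Scores jittering around 0.5 leave the state untouched; only a clear
# rise past 0.6 switches on, and only a clear drop below 0.4 switches off.
print(wiper_states([0.45, 0.55, 0.45, 0.55, 0.7, 0.5, 0.45, 0.3]))
```

With a single 0.5 threshold, the first four frames above would toggle the wipers on every frame; with hysteresis they stay off until the score clearly rises, then stay on until it clearly falls.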

The same goes for FSD. The cameras can't cope now in the UK: my rear camera is obscured after a mile of driving, my side cameras are frequently obscured because of the angle of the light, and in the dark you can forget any of it working. As an L2 driver assist on good days, I have no doubt it could be the best; as a means of becoming anything more, I just cannot see it with the current sensor suite. Vision-only is maybe OK so long as you have the ability to ensure you can still see; Tesla can't.
 
Funny, but it proves to me that they can't. A stubborn insistence that AI solves everything when the existing solution was better; it's taken them four years and countless iterations of the wipers (remember Deeprain?) and it's still only "almost as good" as a 30p rain sensor.
I don't think it does solve everything, but it seems to be within a reasonable chance of being able to.

They don't have the bandwidth (or possibly the hardware just now) to do it, but as a general approach, moving from hardware-defined to software-defined sensors will ultimately help. Maybe the cameras will be able to detect ice and change driving patterns appropriately; the FSD beta stack already spots large puddles (not sure if it goes around them, but they get labelled). The key is that software enables these possibilities, although the lack of a 30p sensor (ignoring the integration cost of that 30p sensor) does hurt in the short term.

It's much the same, slightly uncomfortable, transition that is going on in organisations moving to the cloud. You sacrifice a bit of control, but in return it's always up to date, always secure, and you put waaaay less effort into maintenance, so you can actually -do- more.

I do think the wipers are adequate now, although the auto headlights are pretty meh.
Vision-only is maybe OK so long as you have the ability to ensure you can still see; Tesla can't.

Also, lasers ;). (Tesla obtains patent on its wild idea to use lasers as windshield wipers)
 
Personally, though, my primary reservation is Musk's belief that as long as it's safer than humans it will be approved, but that is just not the reality. It needs to be safer than the safest human, otherwise a good driver is at more risk using the system than not, and that's not a position that will be tolerated.
Doesn't have to be safer than the safest humans.
Most traffic injuries and deaths are caused by people doing selfish and dumb things so you can eliminate a very large proportion of them just by being competent, and not selfish or dumb.

The safest humans (or the ones who think they are) can continue to drive themselves.
 
It's much the same, slightly uncomfortable, transition that is going on in organisations moving to the cloud. You sacrifice a bit of control, but in return it's always up to date, always secure, and you put waaaay less effort into maintenance, so you can actually -do- more.
And totally at the mercy of your ISP, the cables themselves, and any bad actor who can disrupt things. My business and client info was always air-gapped.
 
Doesn't have to be safer than the safest humans.
Most traffic injuries and deaths are caused by people doing selfish and dumb things so you can eliminate a very large proportion of them just by being competent, and not selfish or dumb.

The safest humans (or the ones who think they are) can continue to drive themselves.
I can assure you it will never be approved if it were only better than average but not better than the majority (for the sake of argument, those up to 2 SD above the mean, approximately 98% of the population). Musk himself conceded a few years ago that his original "twice as good as average" target was not going to cut it.
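As a sanity check on the "2 SD above the mean is approximately 98% of the population" figure quoted above: under a standard normal distribution, the share below mean + 2 SD is about 97.7%, so the rounding to "approx 98%" holds. A one-liner with Python's standard library:

```python
# Fraction of a normally distributed population below mean + 2 SD.
from statistics import NormalDist

share_below = NormalDist().cdf(2.0)   # standard normal CDF at z = 2
print(round(share_below * 100, 1))    # 97.7
```

This assumes driver ability is normally distributed, which is itself only a working assumption in the argument, but the arithmetic behind the 98% figure checks out.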

The reason is simple: an idiot crashing a car harms the idiot. If an idiot crashes into you, they can potentially go to prison and you can claim compensation from them. A self-driving car failure, however, can harm somebody at random, and then who is culpable? Who do you sue? The world does not accept that nobody is at fault once you put responsibility for safety into corporate hands; no disclaimer is going to cut it. Just look at the investigations after aircraft and train crashes.
 
Personally, though, my primary reservation is Musk's belief that as long as it's safer than humans it will be approved, but that is just not the reality. It needs to be safer than the safest human, otherwise a good driver is at more risk using the system than not, and that's not a position that will be tolerated.
With respect, I sharply disagree with your assertion here. If automated driver-assist technology (whether from Tesla or any rival) is shown to clearly reduce the chance of accidents, I predict there will quickly be a huge insurance-premium advantage on offer to drivers using the system, and a significant financial penalty for drivers not using it. Your argument is really an emotional one, driven (pun intended) by the widespread fallacy that we are all exceptional, top 5-10% drivers who will not need the 10x safety advantage of an automated system. My opinion (these are all just our respective opinions, after all, despite the tendency on this forum for people to transform opinions into bold statements of fact) is that adoption will quickly result from cold-blooded financial calculations, with insurance companies providing both the carrot and the stick.
Of course, I might be wrong, and this all depends in any case on the automated technology improving to the point where there is a data-driven argument that it is indeed much safer than most humans. I have no idea how quickly this is going to happen. Watching each new iteration of FSD beta on YouTube, I find my confidence in how close we are to a widely deployable system swinging back and forth. In my opinion, the system has improved hugely over the past 12 months, and overall I find its performance remarkable but, so far at least, still not reliable. I have also not seen much sign of the "exponential improvement" Elon Musk predicted; instead it seems to improve only incrementally with each update. I hope I am wrong, but I suspect we will not be getting access to FSD "city streets" in the UK any time soon, probably not during 2022.
 
You're all talking about windscreen wipers while we watch F9 boosters landing all the time. FSD beta videos clearly show camera-analysis performance way ahead of human vision (Dirty Tesla has good evidence, among others). Your rear camera does not get crapped up as quickly as you claim: I wipe mine every week or so, and it has a crappy, wet, Scottish country commute on very muddy roads. Most of the weather-related camera errors appear to be limited to the production AP stack. The lack of lane changing in the rain is weird, as mine has started to do the (riskier) lane change into the fast lane in foul conditions while still refusing to go back into the slow lane. Standard AP already copes with low winter sun better than I do. I could go on and on.

9.5 is looking pretty sweet, and most beta testers agree that it is not perception but decision making that is the limiting factor at the moment, and they have just started to implement the Go-beating "DeepMind" algorithm shown at AI Day. It currently drives well through insanely narrow, double-parked roads that would be a nightmare for most folk, but is hamstrung by having no protocol for 1. reversing out of trouble and 2. using medians to cross busy multi-lane roads (although it is starting to think about that now). I would guess the limiting factors in getting "city streets" to the UK will be Tesla's bandwidth and how much Grant Shapps wants it in his car.
 
It also says "Unlike technologists at almost every other company working on self-driving vehicles..." but fails to mention that Mobileye, one of the biggest players, has a vision-only development stream called SuperVision.
Mobileye does have vision-only as their mission statement, but their only implementation as a system is in the Zeekr, which is remarkably similar to Tesla HW3 (with radar).
 
"Teslas using Autopilot drove into parked fire trucks, police cars and other emergency vehicles"??? And here's me thinking the driver is in charge at all times, so I suspect accidents like this are termed driving without due care and attention, or in these particular incidents, driving with no attention at all until final impact. You cannot blame a manufacturer for the actions of a customer who buys the product. Gun manufacturers come to mind.
 