Welcome to Tesla Motors Club

Frustrated with FSD timeline

I think you missed his point. His point is not that there are no other cars using EyeQ3. In fact he said there are over 2 million (while Tesla only makes up ~150k of that). The point is none of those cars come even close to matching the capabilities of AP1 or AP2 (see your own review link for the BMW for example). This seems to suggest that the success of AP1 is not an inherent part of EyeQ3 (because if it was an inherent part of EyeQ3, then all the lane keeping systems using EyeQ3 should be as good as Tesla's).

Supercruise is pretty much the only system that seems to have a chance (we'll see the reviews when it comes out). However, even that suggests it is not Mobileye that deserves the credit, but rather GM's extensive development (if you are to trust GM's PR).

This is not true. If you actually read the reviews, a lot of these systems actually perform well on well-marked roads, which is what they are intended for, but they struggle in curves and when the lane fades. That is a problem Tesla had from the start but has improved upon with its high-precision map.

Secondly, unlike Tesla, these companies don't have hundreds of developers dedicated to one feature. That is how Tesla was able to develop a map system using GPS logging combined with vehicle speed, which allowed the car to drive better when the lane fades and in corners. Elon of course hyped this up as "fleet-learned road curvature".

The reason the new Drive Pilot 4.5 will match or even exceed Autopilot is that, for the first time, it is also using map data (HERE maps).
Mercedes will give Tesla's Autopilot its first real competition this year
However, GM's Super Cruise will actually surpass Autopilot, because it also uses its own lidar map.

Another example is Tesla using the vehicle object detection and tracking that Mobileye provides to do car following. Again, all these functions are available; it's up to you to use them, and Tesla had pretty much unlimited resources to devote to it.
 
I believe the point was that although many automakers use EyeQ3 chips, Tesla's implementation was widely regarded as the best.

There were several people in many threads who claimed there were no other L2 cars using Mobileye other than Tesla, period.

For more look at my response to @stop.
I wouldn't say it's a myth per se, but a misconception on how the miles of data are stored. It's likely similar to how MobilEye does it and you're correct it's not a billion TB of video data. Tesla is also interested in disengagements or when a human does something that the machine was not expecting and recording the outcome of the event.

Amnon Shashua described the Mobileye system called Road Experience Management as recording small data packets of under 10 KB per kilometer.
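To put that figure in perspective, a quick back-of-the-envelope calculation shows why REM-style packets are so much cheaper than raw video. The daily mileage and fleet size below are illustrative assumptions, not numbers from this thread:

```python
# Back-of-the-envelope REM data budget, assuming the ~10 KB/km
# figure quoted above; all other numbers are illustrative.
KB_PER_KM = 10
KM_PER_DAY = 65           # roughly 40 miles of daily driving
FLEET_SIZE = 150_000      # illustrative fleet size

per_car_per_day_kb = KB_PER_KM * KM_PER_DAY
fleet_per_day_gb = per_car_per_day_kb * FLEET_SIZE / 1e6  # 1 GB = 1e6 KB

print(per_car_per_day_kb)  # 650 -> well under 1 MB per car per day
print(fleet_per_day_gb)    # 97.5 -> whole-fleet total per day, in GB
```

Under these assumptions the entire fleet produces less map data per day than the raw video from a single car, which is the whole point of the small-packet approach.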

It is a myth, and a very big one, tied into the public rebuke from Sterling, Tesla's Director of Autopilot: the idea that Tesla collects 50-100 GB of raw video data from each car every day and somehow uploads it without any hacker having detected such an upload, even though an upload of that size is impractical even on Wi-Fi, and then feeds that raw data into one big machine and pulls a crank.

Secondly, Tesla already told us what map they created and used for AP1: it's called the high-precision map, built using GPS logging. Tesla isn't doing HD mapping using cameras as Mobileye has done. They didn't have that capability in AP1 and they haven't started doing it in AP2. I definitely believe they will, and they will announce it when they do, since Elon can't help but brag about anything he does.

Nvidia also has a local end-to-end lane mapping demo, and I'm sure Tesla will work off their SDK.
 
@stopcrazypp @AnxietyRanger @JeffK

I didn't get to finish my inputs about radar, point clouds, and the hardware Tesla is using. Since it's been a couple of posts, I will relent.
But I will say this: since the 8.0 software, Elon's so-called "point cloud" has already been implemented.

Upgrading Autopilot: Seeing the World in Radar


"The first part of solving that problem is having a more detailed point cloud. Software 8.0 unlocks access to six times as many radar objects with the same hardware with a lot more information per object. The second part consists of assembling those radar snapshots, which take place every tenth of a second, into a 3D "picture" of the world. "

Yet cars are still running into barriers and rear-ending other cars.
Hurray for the great LIDAR replacement.
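For what it's worth, the assembly step described in that quote, merging radar snapshots taken every tenth of a second into one frame of reference, can be sketched in a few lines. The constant-speed ego-motion model and all names here are illustrative, not Tesla's actual implementation:

```python
# Minimal sketch of assembling periodic radar snapshots into a single
# point cloud, assuming each snapshot is a list of (x, y) returns in the
# car's frame and the car drives straight ahead at a known speed.
DT = 0.1      # seconds between radar snapshots, per the quote
SPEED = 20.0  # m/s ego speed (illustrative)

def accumulate(snapshots):
    """Shift each snapshot by the distance travelled since the first
    one, so all returns share a common frame, then merge them."""
    cloud = []
    for i, snap in enumerate(snapshots):
        offset = SPEED * DT * i          # how far the car has moved
        for x, y in snap:
            cloud.append((x + offset, y))
    return cloud

# Two snapshots of the same stationary object first seen 50 m ahead:
snaps = [[(50.0, 0.0)], [(48.0, 0.0)]]
print(accumulate(snaps))  # [(50.0, 0.0), (50.0, 0.0)] - both returns
                          # land on one spot, reinforcing the detection
```

A real system would of course use full 3D returns and measured ego motion rather than an assumed constant speed, but the idea of compensating for motion and superimposing snapshots is the same.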
 
...the consumer price has more to do with Model 3 ramp up and future value to the customer when it's full potential is realized vs actual cost of the equipment.
I believe most Model 3 buyers will buy AP2 when AP2 is really ready. What you're saying reminds me of another story: Plastc folds after making millions on orders never shipped. "Another crowdfunded Bay Area startup has closed its doors without shipping customers the products they paid for, a failure that raises fresh concerns about risky online pre-order campaigns."
 
@stopcrazypp @AnxietyRanger @JeffK

I didn't get to finish my inputs about radar, point clouds, and the hardware Tesla is using. Since it's been a couple of posts, I will relent.
But I will say this: since the 8.0 software, Elon's so-called "point cloud" has already been implemented.

...

Yet cars are still running into barriers and rear-ending other cars.
Hurray for the great LIDAR replacement.

The car will do whatever the human tells it to do... every AEB system is susceptible to that.

Lidar has trouble in rain, fog, snow, or simply seeing around cars. Radar not so much.

A human being doesn't have lidar, they have the equivalent of optical cameras. With optical cameras and motion you can make 3D maps of your environment, you can do object recognition, etc.

There's no compelling reason lidar must be on the car. Sometimes I hear people say what if the cameras were obscured by dirt or salt etc. but lidar would suffer the same issue.

If Tesla determines they really need lidar, and if the price comes down dramatically as expected, they might add it in a future model.
 
The car will do whatever the human tells it to do... every AEB system is susceptible to that.

I'm talking about Autopilot accidents, and secondly, no, other cars' FCB actually does stop the car.

Lidar has trouble in rain, fog, snow, or simply seeing around cars. Radar not so much.

...

So basically you have no original thought: if Tesla/Elon bash it, then it's pointless, but if Tesla/Elon praise and need it, then it's warranted.
So why are you even in this discussion at all?
 
I'm talking about Autopilot accidents, and secondly, no, other cars' FCB actually does stop the car.

For the record the Subaru AEB system can stop the car if the speed differential between it and the other car is < 30mph. That's a stereo visual system that can easily be impaired because of the sun/dirt or simply not detecting the object.

As to Tesla I don't know what became of the entire Point Cloud thing. Tesla enabled the data collection with version 8.0, but as far as I know it still doesn't do active braking as a result of this data (at least not in HW1). It MIGHT do FCW because of it.

This is critical to this discussion, since it hasn't been proven that the point cloud approach will work. I'm a bit skeptical that HW1/HW2 owners will ever have an AEB system that can activate on radar only. If it could, wouldn't we have it by now?
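The speed-differential limit mentioned above (a full stop only below roughly a 30 mph closing speed, for the Subaru system) amounts to a simple gate. This toy function is illustrative, not any vendor's actual logic:

```python
# Toy illustration of the speed-differential gate described above:
# AEB can bring the car to a complete stop only when the closing speed
# is under a threshold (~30 mph for the Subaru system cited).
FULL_STOP_THRESHOLD_MPH = 30

def aeb_outcome(ego_mph, lead_mph):
    closing = ego_mph - lead_mph
    if closing <= 0:
        return "no action"            # not closing on the lead car
    if closing < FULL_STOP_THRESHOLD_MPH:
        return "full stop possible"
    return "mitigation only"          # braking reduces, but may not avoid, impact

print(aeb_outcome(45, 20))  # closing 25 mph -> full stop possible
print(aeb_outcome(70, 0))   # closing 70 mph -> mitigation only
```

The point is that even a "working" AEB system has a hard envelope, so "it can stop the car" always comes with a closing-speed caveat.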

As for me, I'm a huge proponent of sensor fusion.

Radar, including on the sides/rear
Lidar
Vision from multiple cameras going to at least three NNs
Car-to-car communication, and road-to-car

I don't think any of these technologies by themselves gets you to where you need to be. But, if you combine them together you get a much better view of the world.

People are not going to be okay with a system that isn't orders of magnitude better than a human driver.

Luckily, I think the two opposing sides, radar and lidar, are each solving the drawbacks of their systems: the point cloud for radar, and this for lidar.

Driverless cars have a new way to navigate in rain or snow
 
For the record the Subaru AEB system can stop the car if the speed differential between it and the other car is < 30mph. That's a stereo visual system that can easily be impaired because of the sun/dirt or simply not detecting the object.

...

I was actually saying that other cars' FCB actually stops the car.

Volvo, for example, also has it, and it brakes for you. Frankly, Tesla is the only one that doesn't.
 
Audi & Nissan will also release an L3 car in 2018.

Actually, Audi's Level 3 (traffic-jam speeds on highways) Audi A8 is coming this year, so 2017. Of course, it is possible deliveries only happen in 2018, and U.S. deliveries definitely only in 2018. I would expect to see some cars delivered in Germany in 2017. Audi has been working on their self-driving for years, and this is the first car that will have it. It is not an Autopilot 1 kind of thing at all; it will actually self-drive, with Audi taking responsibility for its driving and letting you concentrate on other things, within a limited scenario.

So, IMO it seems possible Audi has Level 3 before Tesla does. If Tesla wants to beat Audi, they'd have to mature EAP (or an early FSD) to the point that the customer can read a book and be given, say, 10-15 seconds to intervene when the system is no longer capable. No matter Tesla's bravado, they seem to be a long way off from telling us to take our hands off the wheel and pick up Fifty Shades Darker. But maybe that early FSD version comes with a different story in those "3-6 months"? With Tesla, one has to be ever the optimist, no? ;)

Of course, Tesla's unique benefit is the constant software updates. There is no other car to buy today, or probably any time in 2017 (perhaps even 2018), with this level of future potential in self-driving. I'm hesitant to say much beyond potential, because we know historically Tesla's claims have not really come fully true (Performance models, AP1 features, etc.). We shall see if e.g. Audi follows suit in some fashion on the updates, probably not yet. But that Level 3 Audi A8 will have a far more comprehensive sensor package than AP2 does. This of course is the trend: Tesla is doing more with less hardware, as it has since AP1 days, and has constantly changing software through over-the-air updates.
 
@stopcrazypp @AnxietyRanger @JeffK

I didn't get to finish my inputs about radar, point clouds, and the hardware Tesla is using. Since it's been a couple of posts, I will relent.
But I will say this: since the 8.0 software, Elon's so-called "point cloud" has already been implemented.

...

Yet cars are still running into barriers and rear-ending other cars.
Hurray for the great LIDAR replacement.

But I do think it is important to get the distinctions right, don't you?

We now have the data that Tesla uses the same radar in AP1 and AP2 - and that it can see in 3D. Thus Tesla can get a 3D point-cloud from it. We can settle that sub-discussion, right?

Your next points about the quality of the point cloud are a new conversation, and one that I for one welcome. But that does not diminish the value of getting the earlier conversation right, which was useful too.
 
The idea that Tesla will collect 50-100 GB of raw video data from each car every day and somehow upload it with no hacker having detected any such upload

I certainly don't know the exact form of the data being transferred back to the mothership, but I don't know where you would get 50-100 gigs. If you can download a 2-hour HD movie from Netflix at around a gig, I don't see why you couldn't stream 2 hours' worth of driving data per day from every car. So 8 gigs or so of compressed video and telemetry data? Streamed over 24 hours, that is nothing. Not to mention that the data they choose to acquire could be limited to situations where the software in the car would have done something different than the driver, or encounters something it doesn't understand or an object it can't identify.
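A rough sanity check on that arithmetic, assuming 8 GB per day spread evenly over 24 hours (and 1 GB = 10^9 bytes):

```python
# Sustained bitrate needed to move ~8 GB of compressed video and
# telemetry over a full day; the 8 GB figure is from the post above.
GB_PER_DAY = 8
SECONDS_PER_DAY = 24 * 3600

bits = GB_PER_DAY * 8e9          # bytes -> bits, using 1 GB = 1e9 bytes
mbps = bits / SECONDS_PER_DAY / 1e6
print(round(mbps, 2))            # 0.74 -> under 1 Mbit/s sustained
```

Under 1 Mbit/s is well within a home Wi-Fi connection, which supports the point that the "impossible upload" framing only applies to truly raw, uncompressed sensor data.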

Imagine you are driving along the freeway for an hour and the HW2 is doing its thing, plotting its course, identifying objects, road signs, cars, trucks, and so on. In that hour it might log a few hundred items or situations it doesn't understand. Those might amount to several minutes of image data and telemetry data. No need to send the entire hour-long trip.

There are a billion ways Tesla could leverage the fleet data without transmitting every piece of raw uncompressed sensor data. Though I still disagree with your 100-150 GB data assertion, as it is nonsensical.
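The event-triggered idea in the paragraphs above can be sketched as a simple filter: keep a record only when the driver's action diverges from what the system would have done. The steering-divergence trigger, threshold, and record format are all illustrative assumptions, not anything Tesla has published:

```python
# Keep only the frames where the driver diverged from the system's
# plan, instead of uploading the whole drive (illustrative sketch).
STEER_DIVERGENCE_DEG = 5.0

def select_events(frames):
    """frames: list of dicts holding the system's planned steering
    angle and the driver's actual steering angle, in degrees."""
    events = []
    for f in frames:
        if abs(f["planned_deg"] - f["actual_deg"]) > STEER_DIVERGENCE_DEG:
            events.append(f)
    return events

trip = [
    {"t": 0.0, "planned_deg": 0.0, "actual_deg": 0.5},   # agreement
    {"t": 0.1, "planned_deg": 0.0, "actual_deg": 12.0},  # driver diverged
]
print(select_events(trip))  # only the t=0.1 frame is kept for upload
```

A real trigger set would include many more signals (braking, unrecognized objects, disengagements), but the bandwidth logic is the same: upload the interesting minutes, not the whole trip.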
 
Thank you for the data point @croman. Excellent info.

No problem. I started noticing it when I wasn't getting updates, and I created a network in my garage just for the car. It's fun watching it go. There is no rhyme or reason, as far as I can tell, to when it uploads. I do have 6,500 miles on it since December 27. It does next to nothing on LTE as far as I can tell.
 
My AP2 car uploads 1 to 1.5 GB of data on most nights. It's on its own network. Starts around 9:30 pm.

So about an hour a day of driving on average, and about an hour a day's worth of data. Makes sense to me, going back to my earlier response to the notion that it would require 100-150 gigs of data:

I certainly don't know the exact form of the data being transferred back to the mothership, but I don't know where you would get 50-100 gigs.

...

The whole point of shadow mode and machine learning is to feed the system data. You do not have to feed it every second of every mile that people drive and you do not have to feed it completely raw uncompressed camera data from all 8 cameras.
 
There are basically two kinds of data in this, or so the argument goes: the kind you can use to train a neural network, and the kind that might otherwise be useful to train or validate your auto-driving system. The assumption so far has been that the car only sends the latter, and that the former is done by Tesla themselves.

Any comments on that based on this data amount? Or this summary? I am no expert, so I will not even begin to guess.
 
I need to see the receipts and confirmation from others

Uh ok... I haven't checked the data in a month, since I haven't gotten an update, but I'll look tonight and take a screen capture if I have time and care. Frankly, it is what it is. I have no idea what the data is or whether Tesla actually does anything with it, but it's a fact that my car sends this data to them. You can choose not to believe it without proof.
 