Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Poll: 81% of Prospective Model 3 Owners Say They Won’t Pay Upfront For Full Self-Driving

It seems most prospective Model 3 owners aren’t willing to shell out cash upfront for a $3,000 “full self-driving capability” option that is likely years away from becoming available to engage.

In a poll posted by jsraw, 81.3% (347) of respondents said they will not pay for the feature at purchase; adding the option after delivery will cost an additional $1,000. The remaining 18.7% said they will pay for FSD upfront.

According to Tesla’s website, FSD “doubles the number of active cameras from four to eight, enabling full self-driving in almost all circumstances, at what we believe will be a probability of safety at least twice as good as the average human driver. The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat. For Superchargers that have automatic charge connection enabled, you will not even need to plug in your vehicle.”

Elon Musk has said that Level 5 autonomous driving, meaning the car is fully autonomous in any and all conditions, is possible with second-generation Autopilot hardware and the FSD option. During his TED talk in April, Musk said the company plans to conduct a coast-to-coast demo drive from California to New York by the end of 2017 without the driver touching the wheel.

Obviously, there are regulatory hurdles ahead, and Musk has said it will likely be two years before owners can engage FSD capability.

See a few comments on the poll below, or go to the thread here.

[Screenshot: comment by Swift]

[Screenshot: comment by EinSV]

[Screenshot: comment by jason1466]

[Screenshot: comment by Waiting4M3]

[Screenshot: comment by Enginerd]

 
You need the actual visual/radar inputs for the case when there is NO purple box, though.
Steering angles are also not as cut and dried. What if the car is on a curve, or the driver decided to overtake somebody or change lanes, ...

Basically, everything the car already knows, it already knows; it's the unknowns, the things the car or other automated systems could not infer themselves, where human help is needed. And that is where the bottleneck is. I am sure they did not just stop FCW reporting because it was greatly useful.
The radar SNR is in the driver logs.
If the car is on a curve, the lane lines/road are also curved. If you overtake, that's a lane change.
If they couldn't infer it themselves then that's listed as a failure. A network can learn from a failure as much as it can from a positive outcome (if not more).
 
Exactly ... object detection is the easy part. No need for humans to verify the majority of the time.
How so?
There's a picture. You detect the objects in some way.
How are you going to know for sure nothing was missed and no false detections took place without a human taking a look every single time?
Granted, in many cases you might not care that the detection was not 100% correct, but that's a dangerous path. It might not be important until it is...
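To make the verification problem concrete: checking detections automatically presupposes reference labels to check against, which is exactly what the human reviewer was providing. A toy sketch, with all boxes, thresholds, and data invented for illustration:

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def audit(detections, labels, thr=0.5):
    """Misses and false alarms relative to human-provided reference labels."""
    matched = {i for d in detections
                 for i, lab in enumerate(labels) if iou(d, lab) >= thr}
    misses = len(labels) - len(matched)
    false_alarms = sum(all(iou(d, lab) < thr for lab in labels)
                       for d in detections)
    return misses, false_alarms
```

Note that without the `labels` argument there is nothing to audit against, which is the point: somebody (or something) still has to produce the ground truth.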
 
I think it's mostly hype.


Anomalies like what? How would they be reported? Same for the lights and other "nuanced data". How would the car even know if it misdetects something?
Construction zones, missing lanes: you need lots of cars on the road everywhere for this, otherwise it'll only work reliably in California, and everywhere else you'd still need to pay for a subscription to the feed (and it's unlikely they'd let you drop just the California data for a discount, I think).

Recognizing spikes is going to be challenging, and I am not sure what the value is in gated entrances. Also, some gated entrances are open all the time until they are closed at the worst possible moment ;)


Pedestrians/potholes/debris are typically short-lived, so they only matter if you have a lot of Teslas in the vicinity that can take advantage of the report right at that moment. (If you do remember such events, on the other hand, that's going to be a separate can of worms in the form of "why is my car always slow at this spot?")
Also, I suspect debris is going to be a source of frustration, since a plastic bag flailing around is going to be hard to tell from something more dangerous.
Remember, the car is not a human, so there's no easy way to determine why the driver took whatever action, unless you send it to the mothership right away and have people eagerly waiting to analyze it so there's no delay (= lots of manpower).


I suspect lots of people don't properly disengage AP before doing some maneuver (I certainly don't, a lot of the time) in a way that disables AP anyway, so sifting through this data in search of gems would be hard. If you don't even require EAP/TACC to be activated, it's going to be an even bigger crapshoot. "Oh no, there's a guy in a Mustang overtaking me, can't allow that, full speed ahead!"
"When you approach a yellow light, accelerate", ...

So there's the eclipse tomorrow, and at about 2:20pm all the drivers around here will likely slow down dramatically and partake in dangerous maneuvers. If all those were Teslas with this learning thing, I wonder what the fleet would learn from it? ;)

So, in short: you could learn a lot from the data, but you need to employ a lot of humans looking at all that data all the time to actually infer new knowledge. NNs can help you filter some of the data, but the more aggressive the filtering, the more useful data is likely lost along with the noise.
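That filtering trade-off can be shown with a toy numbers game (scores and labels invented): raising the on-board upload threshold cuts bandwidth, but silently drops genuinely interesting events.

```python
# (confidence score from a hypothetical on-board filter, was the event actually interesting?)
events = [(0.95, True), (0.80, True), (0.70, False), (0.55, True),
          (0.40, False), (0.30, True), (0.20, False)]

def upload_stats(threshold):
    """How many events get uploaded, and how many interesting ones are lost."""
    uploaded = sum(score >= threshold for score, _ in events)
    lost_useful = sum(interesting and score < threshold
                      for score, interesting in events)
    return uploaded, lost_useful
```

With this data, a 0.5 threshold uploads 4 events and loses 1 interesting one, while a 0.9 threshold uploads only 1 event but loses 3 interesting ones.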

So are you suggesting that because the data may be hard to deconstruct, Tesla simply isn't collecting it? I'm having a hard time seeing the point you're trying to make. Any data Tesla can collect from its fleet is going to be valuable. The cars will have the hardware and data connection regardless of whether you are paying for the feature. What better way to improve FSD than to have 500,000 cars on the road all sending back data?
 
How are you going to know for sure nothing was missed and no false detections took place without a human taking a look every single time?
Airbag deployment, or an aggressive human maneuver, is usually a dead giveaway... if nothing is detected and this happens, then you don't need a human to tell you that something must have been missed. It doesn't matter what it was.

Your human taking a look is sitting in the driver's seat. You don't always need another human to take a look if you had a human that was actually there.
 
The radar SNR is in the driver logs.
If the car is on a curve the lane lines/road are also curved. If you overtake that's a lane change.
If they couldn't infer it themselves then that's listed as a failure. A network can learn from a failure as much as it can from a positive outcome (if not more).
I think we are just talking different languages or something.
Perhaps it's my lack of understanding, but I just don't see how a system would learn something when the driver input differs from the suggested system output and the actual driver input did not result in a crash.
 
I think we are just talking different languages or something.
Perhaps it's my lack of understanding, but I just don't see how a system would learn something when the driver input differs from the suggested system output and the actual driver input did not result in a crash.

It depends on whether the system output would have resulted in a crash (based on the physics of the moving vehicle).

I'd suggest a machine learning course for the basics. Coursera has some really good ones; some are taught by people who worked on Google's self-driving car, too.

I'm suggesting them for people at my work as well.

Here's a TED talk by Sterling Anderson, which has been posted in other threads as well; it shows some simulations based on physics.
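The physics check mentioned above can be sketched as a constant-deceleration stopping-distance comparison. A minimal sketch, with the deceleration and reaction-time constants invented for illustration:

```python
def would_have_crashed(speed_mps, obstacle_dist_m,
                       max_decel=8.0, reaction_s=0.25):
    """Could the proposed trajectory have stopped before the obstacle?
    Distance covered during the reaction time, plus braking distance v^2/(2a)."""
    stopping_dist = speed_mps * reaction_s + speed_mps ** 2 / (2 * max_decel)
    return stopping_dist > obstacle_dist_m
```

At 30 m/s (~67 mph) with an obstacle 40 m ahead this returns True; at 10 m/s it returns False. A real simulator would integrate steering, tire grip, and other traffic, but the principle of replaying recorded inputs against a physics model is the same.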
 
So are you suggesting that because the data may be hard to deconstruct, Tesla simply isn't collecting it? I'm having a hard time seeing the point you're trying to make. Any data Tesla can collect from its fleet is going to be valuable. The cars will have the hardware and data connection regardless of whether you are paying for the feature. What better way to improve FSD than to have 500,000 cars on the road all sending back data?
Let's start from the basics.
Tesla does not collect any of the "fleet data" in the sense people seem to imply.
The data Tesla currently collects consists of: camera calibration (about every 15-30 minutes, I think?), and, every time you put the car in park after driving for some amount of time, an autopilot-trip-log report listing the road classes you drove on during the trip, how much of the time Autosteer and TACC were available, and how much they were actually used.
Every once in a while they upload random 10-second video snapshots of your surroundings (I hope they still do; they haven't asked my car for one in months). They used to upload FCW snapshots, but stopped. They upload crash logs if they detect the car has been in a crash. If an Autopilot component crashes (in the software sense), they upload debug data for that too.
That's about it, I think. They don't send anything if you disengage AP in any way, there's no ongoing feed of anything as you drive, and there are no other smarts that I can see.

With that out of the way, can they collect more data? Of course they can. The problem is not the ability to collect, it's the ability to get anything useful out of it, and as we can see, they have started collecting less, not more, which seems to confirm they cannot really do a lot with the data they are able to collect with the resources they have (i.e., I speculate the analysis is a lot less automated and a lot more human-dependent than people realize).

So I do not know WHY they don't collect it, but I know they don't, and I am speculating why.
 
That's fair, but the fact that they don't, to our knowledge, currently collect it doesn't preclude them from doing so in the future, or from finding ways to monetize it (even internally, i.e. not having to pay for third-party data) to help cover the cost of putting all that hardware into cars that may never utilize it via owner-purchased functionality. I think at this point we're all speculating; I was simply suggesting that it could be a logical reason why they're including the hardware in everything.
 
airbag deployment is usually a dead giveaway
I guess we'll need a lot of accidents then.

I was driving around a curve at 50 mph on a 4 lane road the other evening and there was an unmarked police vehicle in the right lane fully stopped. Other than the expletive I yelled, there was no alert from the car. I widened my turn and slid into the left lane to just clear him.

Not sure how fleet learning is going to handle that since the car thought everything was wonderful. When I got my bearings straight I reported it but I'm sure there aren't enough humans at Tesla to handle reviewing it, let alone looking through whatever else is coming back from all the video supposedly being collected.
 
Not sure how fleet learning is going to handle that since the car thought everything was wonderful.
If the car had shadow-mode software running, we'd actually have no clue what may or may not have been detected by that software, and it would not have alerted you, as the shadow software doesn't interface with the driver at all.

That said, it's entirely possible a fully trained future FSD car in five or ten years might have actually gotten into an accident in that scenario. They aren't going to be perfect, by any means. *Stuff* happens.
 
Let's start from the basics.
Tesla does not collect any of the "fleet data" in the sense people seem to imply. [...]

At some point, Tesla actually said that as part of "silky smooth," AP2 can hold the car in its lane better than a human driver. That suggests Tesla actually collects metrics.

Also, the point of data mining and AI learning is that a large data set can be processed without a lot of human input, i.e. little to no human interaction is needed to process billions of logs. Think about SpaceX landing its Stage 1 boosters autonomously and tell me again how a person could do that. That's similar to how EAP and FSD will get here.
 
At some point, Tesla actually said that as part of "silky smooth," AP2 can hold the car in its lane better than a human driver. That suggests Tesla actually collects metrics.
How do you know Tesla representatives are lying? Their lips are moving ;)
I wish it was a joke.

There is zero way for Tesla to know how well a particular human driver holds within a lane; currently there's also no way for them to know how well Autopilot in customer cars does this. Just today, while getting the log below, I had several attempts by Autopilot to jump out of the lane, either into oncoming traffic or off the road.

Here I present the whole trip log that is posted to Tesla after every trip (I just did 30 minutes of driving with a bunch of Autosteer enabled to get it and to ensure this is all there is to it):
Code:
{
    "snapshot-version": "0.3",
    "wall-time": "XXX",
    "monotonic-time": "XXX",
    "sha1": "XXX",
    "requester": "autopilot-trip-log",
    "request-clock-type": "monotonic",
    "request-trigger-time": "XXX",
    "boot-sec-info": "XXX",
    "vehicle-type": "XXX",
    "entries": {
        "autosteer": {
            "time_s": {
                "autosteer_unavailable": {
                    "ROAD_CLASS_UNKNOWN": "0",
                    "ROAD_CLASS_1": "20",
                    "ROAD_CLASS_2": "0",
                    "ROAD_CLASS_3": "159",
                    "ROAD_CLASS_4": "61",
                    "ROAD_CLASS_5": "208",
                    "ROAD_CLASS_6": "16"
                },
                "autosteer_available": {
                    "ROAD_CLASS_UNKNOWN": "0",
                    "ROAD_CLASS_1": "35",
                    "ROAD_CLASS_2": "0",
                    "ROAD_CLASS_3": "158",
                    "ROAD_CLASS_4": "36",
                    "ROAD_CLASS_5": "63",
                    "ROAD_CLASS_6": "12"
                },
                "autosteer_on": {
                    "ROAD_CLASS_UNKNOWN": "0",
                    "ROAD_CLASS_1": "362",
                    "ROAD_CLASS_2": "0",
                    "ROAD_CLASS_3": "731",
                    "ROAD_CLASS_4": "58",
                    "ROAD_CLASS_5": "68",
                    "ROAD_CLASS_6": "0"
                }
            },
            "odometry_m": {
                "autosteer_unavailable": {
                    "ROAD_CLASS_UNKNOWN": "0.000000",
                    "ROAD_CLASS_1": "0.000000",
                    "ROAD_CLASS_2": "0.000000",
                    "ROAD_CLASS_3": "2100.000000",
                    "ROAD_CLASS_4": "1000.000000",
                    "ROAD_CLASS_5": "1500.015625",
                    "ROAD_CLASS_6": "0.000000"
                },
                "autosteer_available": {
                    "ROAD_CLASS_UNKNOWN": "0.000000",
                    "ROAD_CLASS_1": "0.000000",
                    "ROAD_CLASS_2": "0.000000",
                    "ROAD_CLASS_3": "3100.000000",
                    "ROAD_CLASS_4": "800.015625",
                    "ROAD_CLASS_5": "700.015625",
                    "ROAD_CLASS_6": "1199.984375"
                },
                "autosteer_on": {
                    "ROAD_CLASS_UNKNOWN": "0.000000",
                    "ROAD_CLASS_1": "13700.015625",
                    "ROAD_CLASS_2": "0.000000",
                    "ROAD_CLASS_3": "18599.984375",
                    "ROAD_CLASS_4": "0.000000",
                    "ROAD_CLASS_5": "899.984375",
                    "ROAD_CLASS_6": "0.000000"
                }
            }
        },
        "acc": {
            "time_s": {
                "acc_unavailable": {
                    "ROAD_CLASS_UNKNOWN": "0",
                    "ROAD_CLASS_1": "0",
                    "ROAD_CLASS_2": "0",
                    "ROAD_CLASS_3": "39",
                    "ROAD_CLASS_4": "11",
                    "ROAD_CLASS_5": "107",
                    "ROAD_CLASS_6": "0"
                },
                "acc_available": {
                    "ROAD_CLASS_UNKNOWN": "0",
                    "ROAD_CLASS_1": "53",
                    "ROAD_CLASS_2": "0",
                    "ROAD_CLASS_3": "131",
                    "ROAD_CLASS_4": "66",
                    "ROAD_CLASS_5": "161",
                    "ROAD_CLASS_6": "29"
                },
                "acc_on": {
                    "ROAD_CLASS_UNKNOWN": "0",
                    "ROAD_CLASS_1": "364",
                    "ROAD_CLASS_2": "0",
                    "ROAD_CLASS_3": "878",
                    "ROAD_CLASS_4": "78",
                    "ROAD_CLASS_5": "71",
                    "ROAD_CLASS_6": "0"
                }
            },
            "odometry_m": {
                "acc_unavailable": {
                    "ROAD_CLASS_UNKNOWN": "0.000000",
                    "ROAD_CLASS_1": "0.000000",
                    "ROAD_CLASS_2": "0.000000",
                    "ROAD_CLASS_3": "400.000000",
                    "ROAD_CLASS_4": "0.000000",
                    "ROAD_CLASS_5": "1500.015625",
                    "ROAD_CLASS_6": "0.000000"
                },
                "acc_available": {
                    "ROAD_CLASS_UNKNOWN": "0.000000",
                    "ROAD_CLASS_1": "0.000000",
                    "ROAD_CLASS_2": "0.000000",
                    "ROAD_CLASS_3": "3099.984375",
                    "ROAD_CLASS_4": "1800.015625",
                    "ROAD_CLASS_5": "700.015625",
                    "ROAD_CLASS_6": "1199.984375"
                },
                "acc_on": {
                    "ROAD_CLASS_UNKNOWN": "0.000000",
                    "ROAD_CLASS_1": "13700.015625",
                    "ROAD_CLASS_2": "0.000000",
                    "ROAD_CLASS_3": "20300.000000",
                    "ROAD_CLASS_4": "0.000000",
                    "ROAD_CLASS_5": "899.984375",
                    "ROAD_CLASS_6": "0.000000"
                }
            }
        },
        "alc": {
            "aborted": "0",
            "completed": "0",
            "initiated": "0",
            "requested": "0"
        },
        "config": {
            "version": "1.0",
            "lc_enable": "true",
            "autopilot_car": "true"
        }
    }
}
I wonder what that alc stuff with all zeroes is about; I specifically did some Autosteer aborts and whatnot, yet it's all zeros.
Note there's no location information, no disengagement reports, and no other performance metrics.

Granted, the CID also runs something called a "telemetry client" that might post more data, but that is like third-hand info gleaned from the CAN bus, i.e. what's displayed on the IC and related info, nothing too deep.
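For what it's worth, all that can be computed from a trip log like the one above is a handful of aggregate ratios. A sketch (field names taken from the log shown; the sample dict is abbreviated and the values are just the ROAD_CLASS_1/3 rows from it):

```python
import json

def autosteer_share(trip_log):
    """Fraction of driving time, across all road classes, with Autosteer on."""
    buckets = trip_log["entries"]["autosteer"]["time_s"]
    on = sum(int(v) for v in buckets["autosteer_on"].values())
    total = sum(int(v) for b in buckets.values() for v in b.values())
    return on / total

sample = json.loads("""{
  "entries": {"autosteer": {"time_s": {
    "autosteer_unavailable": {"ROAD_CLASS_1": "20", "ROAD_CLASS_3": "159"},
    "autosteer_available":   {"ROAD_CLASS_1": "35", "ROAD_CLASS_3": "158"},
    "autosteer_on":          {"ROAD_CLASS_1": "362", "ROAD_CLASS_3": "731"}
  }}}}""")
```

With the abbreviated sample, `autosteer_share(sample)` comes out to roughly 0.75. There is no way to recover *where* or *how well* the system drove from such aggregates, which is the point being made.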

Also, the point of data mining and AI learning is that a large data set can be processed without a lot of human input. I.e Little to no human interactions are happening to process billions of logs. Think about musk landing stage 1 for space x autonomously and tell me again how a person can do that. It's the similar way that eap and fsd will get here.
That's the theory they want you to believe. The reality is: computers cannot think. Yes, they can find certain trends; no, this is not going to help them analyze why a disengagement happened in a particular instance.
I do not know what sort of machine learning SpaceX needed for its landings, but landing a rocket in general (as long as you know where) is a mostly solved problem anyway. You do not need complex neural networks and whatnot; there are no obstacles in the sky, so just keep the rocket upright and slowly adjust the throttle to land. See lunar landers of various vintages for examples of that from way in the past.

Certainly the capabilities of computers will change over time and more things will become possible; we'll revisit it then, I guess.
Right now even Tesla seems to have mostly abandoned the idea of data collection as mostly useless. I think I even heard a quote from Musk on the previous earnings call that reflected that, but I cannot find it at the moment.
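The "keep it upright, adjust the throttle" claim, reduced to one dimension, really is a small classical control loop. A toy sketch, with every constant invented for illustration (nothing here is SpaceX's actual guidance):

```python
def land(alt=1000.0, vel=-60.0, dt=0.05, g=9.81, max_thrust_acc=30.0):
    """Toy 1-D powered descent: track a square-root velocity profile
    that slows the vehicle as altitude shrinks, then touch down."""
    while alt > 0.1:
        target_vel = -0.7 * (max_thrust_acc * alt) ** 0.5  # desired descent rate
        thrust = g + 4.0 * (target_vel - vel)              # gravity feed-forward + P control
        thrust = min(max(thrust, 0.0), max_thrust_acc)     # engine can't push downward
        vel += (thrust - g) * dt
        alt += vel * dt
    return abs(vel)  # touchdown speed, m/s
```

This lands the toy vehicle at roughly walking speed with no neural network in sight; real guidance adds attitude control and fuel-optimal trajectory solving, but the core loop is classical control, as the post argues.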
 
How do you know Tesla representatives are lying? Their lips are moving ;)
I wish it was a joke.

[...]
Where's the other info from the driver logs?
 
What do you mean?
This is everything that the Autopilot computer sends to Tesla after you put the car in park after a bit of driving. Nothing else.
While you are driving, it sends calibration data for the main and narrow cameras every 5-6 minutes. Interested in a sample?
Not the autopilot data... the driver logs.
An example of the type of data is found at the end of this Florida Highway Patrol report about last year's accident:
Florida Highway Patrol releases full investigation into fatal Tesla crash: read it here
 
Not the autopilot data... the driver logs.
An example of the type of data is found at the end of this Florida Highway Patrol report about last year's accident:
Florida Highway Patrol releases full investigation into fatal Tesla crash: read it here
Oh, that's all over the place. Also, in that case it was AP1, which is a bit different.

Anyway, the gateway stores a bunch of that data in a binary format that I have not looked into how to read.
This could be proactively requested by the mothership and is not volunteered by the car otherwise, I think. I do not know how far back it goes. The last time this log was requested from my car was 3 days ago; prior to that, more than a week went by without a request.
The log size was ~33 MB.

In addition, Autopilot in AP2 stores CAN bus traffic going some time back.
Some of it also gets into the regular logs of the CID and IC.
E.g., the IC logs the hands-on-wheel and Autopilot state like this:
Code:
2017-08-20TXXX-07:00 ic QtCarCluster[21135]: [DataValueManager] INFO DAS_autopilotHandsOnState now 'NotDetected' (was 'Detected')
2017-08-20TXXX-07:00 ic QtCarCluster[21135]: [DataValueManager] INFO DAS_autopilotHandsOnState now 'Detected' (was 'NotDetected')
2017-08-20TXXX-07:00 ic QtCarCluster[21135]: [DataValueManager] INFO DAS_autopilotHandsOnState now 'NotDetected' (was 'Detected')
2017-08-20TXXX-07:00 ic QtCarCluster[21135]: [DataValueManager] INFO DAS_autopilotHandsOnState now 'Detected' (was 'NotDetected')
2017-08-20TXXX-07:00 ic QtCarCluster[21135]: [DataValueManager] INFO DAS_autopilotHandsOnState now 'NotDetected' (was 'Detected')
2017-08-20TXXX-07:00 ic QtCarCluster[21135]: [DataValueManager] INFO DAS_autopilotHandsOnState now 'Detected' (was 'NotDetected')
2017-08-20TXXX-07:00 ic QtCarCluster[21135]: [DataValueManager] INFO DAS_autopilotState now 'Unavailable' (was 'Active')
But this is second-hand info based on the CAN bus messages.
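Scraping those IC lines is straightforward; a throwaway sketch whose regex matches the log format shown above:

```python
import re

# Matches e.g. "DAS_autopilotHandsOnState now 'Detected' (was 'NotDetected')"
STATE = re.compile(r"DAS_autopilotHandsOnState now '(\w+)' \(was '(\w+)'\)")

def hands_on_states(log_lines):
    """Sequence of hands-on-wheel states as reported in the IC log."""
    return [m.group(1) for line in log_lines
            if (m := STATE.search(line)) is not None]
```

Feeding it the lines above yields the alternating Detected/NotDetected sequence, which is about as much driver-behavior signal as that log carries.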
 
im definitely gonna buy full self driving, and avoiding that 1k penalty is one of the reasons why

I have a great bridge in Brooklyn I'll sell you. It's a bargain, and because I like you, I'll throw in a bag of magic beans. But if you don't sign on the dotted line today, the price goes up 33% tomorrow.

The chance to get something less expensively by buying it right now certainly doesn't mean that it's a smart thing to do.
 
This is not a theory...I've tested it enough times now (when other cars aren't oncoming) to know this will happen.

[Image attachment 240673]

Actually, by definition, that is EXACTLY a theory. Didn't you pay attention in science class? You've experimented, collected data, and tried to falsify the EAP's behavior on backroads. You did exactly that. That, my friend, is a theory, not the misused "evolution is just a theory" BS.
 