Tesla FSD/Autopilot under attack

Yes, it would be interesting to know if GM gets ANY information from the OnStar system.
A quick web search reveals the OnStar privacy policy, which mentions data collected from the vehicle:
  • Information about your vehicle, like the vehicle’s identification number (“VIN”), its make, model and year;
  • Information about your vehicle’s condition, like diagnostic data, odometer readings, oil life remaining, tire pressure, fuel economy;
  • Information about when your vehicle is refueled or recharged;
  • Information that might indicate that your vehicle has been broken into or stolen, like glass breakage or ignition switch activity if your vehicle is equipped to detect those things;
  • Information about apps that are pre-installed on the OnStar equipment, including the version of those apps;
  • Information about the OnStar equipment, including the version of the OnStar software installed on that equipment;
  • Information about when your vehicle’s ignition is on or off; and
  • Information about collisions involving your vehicle, like the direction from which impact happened and which air bags deployed.
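Modeled as a record, the policy's categories might look something like the sketch below; every field name here is my invention for illustration, not GM's actual schema:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical shape of a single telematics log record covering the
# policy's categories; field names are invented, not GM's actual format.
@dataclass
class TelematicsRecord:
    vin: str                                  # vehicle identification number
    make: str
    model: str
    year: int
    odometer_km: float                        # condition/diagnostic data
    oil_life_pct: float
    tire_pressure_kpa: list[float]
    ignition_on: bool                         # ignition on/off event
    onstar_software_version: str              # OnStar equipment software
    collision_impact_direction: Optional[str] = None   # e.g. "front-left"
    airbags_deployed: Optional[list[str]] = None
```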
 
I agree that L1-L5 is flawed.
I agree that there really is no L3, due to either extreme limitations or extreme danger.
I completely disagree on L4, as L4 is really where autonomous vehicles are at. L4 doesn't require a driver to take over immediately, and it's not even a timed takeover like L3. It has to pull safely off the road, giving even a sleeping passenger time to take over.
L5 is just some fairy tale, and relies on the fairy tale of general AI.

So what we really have is two cases: either the human is responsible, or the car is responsible while L4 is active within its geofenced area/conditions.

So basically L1, and L4.
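To put the taxonomy in concrete terms, here's a toy sketch of who must act when the system hits its limit at each level; the behaviors are simplified illustrations, not SAE's exact wording:

```python
# Illustrative only: who must act when the automation reaches its limit,
# by SAE level. Behaviors are simplified for the sketch.
def on_system_limit(level: int) -> str:
    if level <= 2:
        return "driver is supervising and must correct immediately"
    if level == 3:
        return "driver must take over within a short warning window"
    if level == 4:
        # No human fallback: the car performs a minimal-risk maneuver,
        # e.g. pulls safely off the road, even with a sleeping occupant.
        return "car pulls over safely on its own"
    return "L5: no human fallback anywhere, in any conditions"

for lvl in (2, 3, 4, 5):
    print(f"L{lvl}: {on_system_limit(lvl)}")
```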

L2 tries to do something really dumb by taking the driving task away from the driver yet still assigning the individual 100% of the blame if an accident happens.
What, did you just say that what Tesla does with autopilot and FSD beta is really dumb?

If you mean that level 2 is very difficult to do safely because humans tend to trust the system too much, I totally agree.
 
A quick web search reveals the OnStar privacy policy, which mentions data collected from the vehicle:
  • ...
  • Information about collisions involving your vehicle, like the direction from which impact happened and which air bags deployed.
Thank you. There was a discussion in the legal field regarding the privacy of collecting data when owners are not subscribed to OnStar. Assuming that GM collects data independent of the owner's will, it would be interesting to see what exactly is being collected other than the airbag deployment, location, and speed information. Has anybody seen a data log file from OnStar/GM?
 
Assuming that GM collects data independent of the owner's will, it would be interesting to see what exactly is being collected other than the airbag deployment, location, and speed information. Has anybody seen a data log file from OnStar/GM?
Has anyone seen a data log file from Tesla? I mean, what is your point here? It's clear that Tesla is not the only company that has real-time telematics related to crashes.
You can turn off data sharing in a Tesla too, FYI.
 
Great points! These are the points intensely promoted by people who either have no clue, or have an agenda, or both.
Let me explain:

For #1, video monitoring instead of torque monitoring is a valid criticism. It has obviously been addressed by Tesla in some recent updates. Note that many other systems (such as Co-Pilot360) still don't use video monitoring. Singling out Tesla for this is (a) now misleading and (b) totally biased against Tesla.
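For the curious, here is a minimal sketch of why torque-only monitoring draws the criticism and what adding a camera changes; the thresholds and logic are invented for illustration, not any manufacturer's actual design:

```python
# Toy attention checks; thresholds are invented for illustration.
def torque_only(wheel_torque_nm: float) -> bool:
    # Hands resting on the wheel produce detectable torque.
    return wheel_torque_nm > 0.2

def torque_plus_camera(wheel_torque_nm: float,
                       eyes_on_road_fraction: float) -> bool:
    # Require both hands on the wheel and gaze mostly on the road.
    return torque_only(wheel_torque_nm) and eyes_on_road_fraction > 0.7

# A driver resting one hand on the wheel while watching a phone:
print(torque_only(0.5))                  # True  -> torque check is fooled
print(torque_plus_camera(0.5, 0.1))      # False -> camera check catches it
```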

For #2, Tesla software is evolving at a speed that has not been seen in the industry. Everyone else is basically stuck with their current software forever unless their car is a Tesla. That said, Tesla/Musk/the user manual admit that the software isn't perfect and the driver must watch it and be ready for corrective action. This is a MORE conservative approach than, say, Ford's or GM's, which declared that you can now ride with their under-cooked Cruise/Pilot software hands-free. So, Tesla IMHO is the safety leader here in the implementation of partly autonomous software. Again, zero reason to single out Tesla here.

For #3, radar and lidar consume a lot of computing resources, and it can be really hard to code a system that makes decisions on conflicting inputs. I am glad that Tesla has solved the video-conversion problem and gotten rid of radar. Using lidars to solve the problem of autonomous motion would require a separate Tesla-like computer added to each lidar to convert its overwhelming stream of data into a reliable, compact structure. Very expensive, and you still have no clue about the color of the traffic signals.
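To make the conflicting-inputs point concrete, here's a toy sketch of the dilemma; the rule and sensor flags are invented, not anyone's actual fusion policy:

```python
# Toy illustration of the conflicting-inputs problem: radar and camera
# disagree about an obstacle. The policy below is invented, not anyone's
# actual fusion logic.
def brake_decision(radar_sees_obstacle: bool,
                   camera_sees_obstacle: bool) -> str:
    if radar_sees_obstacle and camera_sees_obstacle:
        return "brake"
    if radar_sees_obstacle != camera_sees_obstacle:
        # Braking on radar ghosts causes phantom braking; ignoring radar
        # risks missing a real obstacle. Neither default is safe.
        return "unresolved: needs a fusion policy"
    return "proceed"

print(brake_decision(radar_sees_obstacle=True, camera_sees_obstacle=False))
```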

I believe Munro summed up the situation around Tesla's FSD pretty well. Don't you agree?

Just a big pile of misinformation
 
See thread title... make internal bet with self on not only what the content of the discussion will be, but also which forum members will actively participate in said discussion.

Check back on thread after 24 hours; confirm both thread content and participants matched expectations.

Note that I am not saying anything is wrong, bad, etc... just that it was fairly predictable what this thread would look like, and who would participate in it, based on the title.
Is this sub-forum the Tesla Misinformators Club? The level of anti-Tesla bias here is approaching Jalopnik and Edmunds. We had much better discussions on substance with admittedly anti-Tesla-biased users such as FordMME in the other sub-forums than in this one. If you had informed me about your observations 24 hours ago, you'd have saved me time reading through the BS of the forum members who are so predictable to you. This is my first endeavor in the Autopilot sub-forum, and I didn't expect it to be so radically different from other places on TMC. This is an eye-opening experience, thank you.
 
Is this sub-forum the Tesla Misinformators Club? The level of anti-Tesla bias here is approaching Jalopnik and Edmunds. We had much better discussions on substance with admittedly anti-Tesla-biased users such as FordMME in the other sub-forums than in this one. If you had informed me about your observations 24 hours ago, you'd have saved me time reading through the BS of the forum members who are so predictable to you. This is my first endeavor in the Autopilot sub-forum, and I didn't expect it to be so radically different from other places on TMC. This is an eye-opening experience, thank you.

If you wanted to see what the discussion in this particular sub-forum looked like, simply open up any of the threads on the first page of this forum. It's all pretty similar discussions. I have no opinion on "misinformation" or whatever; it's just that all the threads look pretty much the same, with pretty much the same people, saying pretty much the same things.
 
Link away. Links to solid, verifiable data are a good way to prove your "yes" isn't just misinformation.

What misinformation and "political agenda" do you believe these journalists have? That Tesla autopilot is actually very safe, capable, and proven, and should not be measured, judged, or examined in any way except to exalt it?

This is an eye-opening experience, thank you.
Go check out Reddit. The discussions around AP there are pretty mixed. It's far from a "defend Tesla at all costs" group, particularly lately given Tesla's huge misses on autonomy over the last few years.
 
What, did you just say that what Tesla does with autopilot and FSD beta is really dumb?

If you mean that level 2 is very difficult to do safely because humans tend to trust the system too much, I totally agree.
What I'm saying is really dumb is to have a manufacturer like Tesla use L2 as a way to avoid any responsibility for their hardware/software failing.

Basically think about it in terms of balance.

Under manual driving the driver is to blame unless some vehicle malfunction causes the accident.

Under driver assistance like Autopilot the same thing as above holds true, but it's completely out of balance. The driver can't possibly be expected to maintain the same level of performance. So the vehicle needs to carry more responsibility to balance it out.

I'm not saying L2 can't be done.
I'm not saying the benefit of L2 can't outweigh the inherent risks of it.

I'd like L2 to be done in a way that forces manufacturers to build quality hardware/software.

In fact I wouldn't even allow an L2 vehicle on the road unless it passed at least 100 different tests.
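In that spirit, here's a trivial sketch of what gating a release on a scenario suite could look like; the scenario names and results are placeholders I made up:

```python
# Toy release gate: the L2 stack may not ship unless every scenario in
# the suite passes. Scenario names and results are invented placeholders.
scenarios = {
    "stationary_vehicle_in_lane": True,
    "cut_in_at_close_range": True,
    "faded_lane_markings": False,   # hypothetical failure
    # ... imagine ~100 of these
}

def release_allowed(results: dict[str, bool]) -> bool:
    return all(results.values())

print("release allowed:", release_allowed(scenarios))          # False
print("failing:", [name for name, ok in scenarios.items() if not ok])
```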
 
What, specifically, is the misinformation in this guy's post?

I'll take #1, as it's the easiest: a single link shows it's wrong.
#2 is subjective and depends a lot on what's important to people.
#3 is factually wrong.

According to this, the BlueCruise offered with Co-Pilot360 does have driver monitoring. It's a little hard to have hands-free without driver monitoring.

 
In fact I wouldn't even allow an L2 vehicle on the road unless it passed at least 100 different tests.
1) My guess is that any L2 system passes way more than 100 different tests.
2) It is good for humanity that you are not in a position of authority to control the deployment of L2 capabilities, or any other automotive system deployment for that matter.
 
According to this, the BlueCruise offered with Co-Pilot360 does have driver monitoring. It's a little hard to have hands-free without driver monitoring.

Certainly, you have no clue. Ford has had Co-Pilot360 for a long time; it allowed drivers to go hands-free for a short time even on country roads, and it had no camera monitoring of the driver.
This video may help you learn more about the real functionality of Ford's autopilot.
 
I guess I'll take #3. LIDAR data is very simple and far easier to process than camera data (Samsung makes a robot vacuum that uses LIDAR!). It does not require neural networks at all.
You can't rely on LIDAR without a camera, so 1) you have two types of data streams; 2) you need to interpret both streams (tag the objects); 3) you now need to merge the two streams for decision-making.
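To ground the "does not require neural networks" claim from the quoted post, here is a minimal, NN-free geometric check on a lidar point cloud; the points and corridor thresholds are invented for illustration:

```python
import numpy as np

# Minimal, NN-free use of lidar returns: flag anything in the corridor
# directly ahead of the car. Points and thresholds are invented.
points = np.array([            # columns: x forward, y left, z up (meters)
    [35.0,  0.2, 1.1],         # a return ~35 m ahead, roughly in our lane
    [60.0, -4.0, 0.5],         # a return off to the side
])

in_corridor = (np.abs(points[:, 1]) < 1.5) & (points[:, 0] < 50.0)
if in_corridor.any():
    distance = points[in_corridor, 0].min()
    print(f"obstacle ahead at ~{distance:.0f} m -> plan to brake or steer")
```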
 
You can't rely on LIDAR without a camera, so 1) you have two types of data streams; 2) you need to interpret both streams (tag the objects); 3) you now need to merge the two streams for decision-making.
You definitely need cameras. I'm not sure how much labeling is done on LIDAR data. Obviously you can feed both LIDAR data and camera data into the same perception neural net. I'm not an expert, but certain types of decision-making are very simple (e.g., don't run into the side of the semi-truck trailer that the vision neural net didn't see!). I'll also point out that Tesla has a neural net that creates a "vidar" data stream (exactly the same type of depth data that LIDAR produces, just with lower accuracy), so they'll have to figure out how to merge those two streams for decision-making too.
People often say LIDAR causes all sorts of problems, but if you look at the actual collisions that AVs with LIDAR have, they always seem to have nothing to do with LIDAR.
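On merging a lidar range with a vision-derived ("vidar"-style) range for the same object, one textbook option is inverse-variance weighting; all numbers here are invented:

```python
# Naive per-object depth fusion: inverse-variance weighting of a lidar
# range and a vision-derived range. All numbers are invented.
def fuse_depth(lidar_m: float, lidar_sigma: float,
               vision_m: float, vision_sigma: float) -> float:
    w_lidar = 1.0 / lidar_sigma ** 2
    w_vision = 1.0 / vision_sigma ** 2
    return (w_lidar * lidar_m + w_vision * vision_m) / (w_lidar + w_vision)

# Lidar is typically far more accurate, so it dominates the fused estimate:
print(fuse_depth(lidar_m=42.0, lidar_sigma=0.05,
                 vision_m=45.0, vision_sigma=2.0))   # ~42.002
```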