Sorry, I'm having trouble following all that. Was that a "yes" or a "no" to the question of "Are we sure this does something?" The service center has told me these do not get sent anywhere. There is an approximately two-week window for the service center to download them before they overflow the end of the buffer. Are you saying a bug report will be sent to someone? Who?
What is a "deet"?
Bug reports are an automated WiFi upload and bypass the service center. Set up a packet sniffer and MITM (man-in-the-middle, using an intermediary certificate) the secure connection on the WiFi, and it's possible to see the data that gets sent. I spotted a squelch flag in the response too, so I'd guess that some people abuse the voice-command bug-report function. These reports go into a system that the service center may or may not be able to see. They GO, but as to who looks at them? No idea.
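To make the squelch-flag observation concrete, here's a minimal sketch of what checking an intercepted response might look like. The field names and values are illustrative guesses, NOT Tesla's actual schema; the only thing taken from the capture described above is that some kind of squelch flag exists in the response.

```python
import json

# Hypothetical reconstruction of an intercepted bug-report response.
# Every field name here is a made-up placeholder for illustration.
captured_response = json.dumps({
    "report_id": "abc123",  # made-up identifier
    "accepted": True,       # assumed acknowledgement field
    "squelch": False,       # guessed server-side mute flag for abusive reporters
})

def is_squelched(raw: str) -> bool:
    """Return True if the (hypothetical) squelch flag is set in a response."""
    return bool(json.loads(raw).get("squelch", False))

print(is_squelched(captured_response))  # -> False
```

The point of the default `False` in `.get()` is that an absent flag is treated the same as an unsquelched reporter, which matches the "we have no way of knowing if any given person is squelched" problem below.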
So the answer to "Are we sure this does something?" is "No", since we have no way of knowing whether any given person is squelched, or whether their WiFi is set up correctly. However, there is a sufficiently reasonable possibility that it does, so if you make a good-faith effort to provide high-quality data, you might be helping, and that can't hurt.
'Deet' is slang for 'detail' when it's not referring to the insect repellent diethyltoluamide.
Not many actual facts there. When a car crosses the road some distance in front of me doing 25 mph, it has a braking distance longer than the vehicle or the lane, so it can't possibly stop and become a hazard for me. That's why I don't hit my brakes. What is Tesla's excuse for braking irrationally?
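The braking-distance claim checks out with back-of-the-envelope physics. A quick sketch, assuming an idealized flat-road model and a guessed dry-pavement friction coefficient of 0.7 (real values vary with tires, surface, and load):

```python
def braking_distance_m(speed_mph: float, mu: float = 0.7, g: float = 9.81) -> float:
    """Idealized braking distance d = v^2 / (2 * mu * g).

    mu = 0.7 is an assumed friction coefficient, not a measured value.
    """
    v = speed_mph * 0.44704  # mph -> m/s
    return v ** 2 / (2 * mu * g)

LANE_WIDTH_M = 3.7  # typical US lane width (~12 ft)

d = braking_distance_m(25.0)
print(f"Braking distance at 25 mph: {d:.1f} m (~{d * 3.2808:.0f} ft)")
print(d > LANE_WIDTH_M)  # True: the crossing car cannot stop inside the lane
```

At 25 mph that works out to roughly 9 m (about 30 ft), well over a 12 ft lane width, which is why a crossing car at that speed physically cannot stop and park itself in your path.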
Unless it hits something and comes to an unexpected standstill. Please also remember that radar provides nothing other than the relative speed of everything in its cone of detection, and the optical data is currently frame-based. The car literally has zero concept of "time". It does not know what happened a frame ago, nor can it anticipate what will happen in the future.
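The frame-based limitation is easiest to see in a toy example. This is not Tesla's actual pipeline, just an illustration of the difference between a system with no memory and one that keeps even a single previous frame; all positions and timings are made-up numbers:

```python
# A purely frame-based system sees only an instantaneous snapshot;
# estimating velocity from vision requires comparing across frames.

def stateless_estimate(position_m: float) -> dict:
    # One frame in isolation gives position only; velocity is unknowable.
    return {"position_m": position_m, "velocity_mps": None}

def temporal_estimate(prev_pos_m: float, curr_pos_m: float, dt_s: float) -> dict:
    # With one frame of memory, relative velocity falls out of the difference.
    return {"position_m": curr_pos_m,
            "velocity_mps": (curr_pos_m - prev_pos_m) / dt_s}

print(stateless_estimate(30.0))            # no idea whether the object is closing
print(temporal_estimate(30.0, 29.5, 0.1))  # -5.0 m/s: object closing fast
```

That missing second function is essentially what "temporal analysis" of the video data would add, and it's why radar (which gets relative speed for free via Doppler) and single-frame vision complement each other so poorly today.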
But in general it's user error in activating TACC at all in that environment.
From the manual:
"Traffic-Aware Cruise Control is primarily intended for driving on dry, straight roads, such as highways and freeways. It should not be used on city streets."
Most vehicles that use radar-based TACC will exhibit this behavior if TACC is used in an inappropriate situation. With the advent of Teslas, we're giving more opportunities for people who don't understand the technology to become statistics, like the guy who turned on cruise control in his motor home and went to make a sandwich because he thought "cruise control" meant the vehicle would completely drive itself. It appears that a lot of Tesla owners think the car has superpowers it doesn't. Tesla is working toward that, but until Level 5 autonomy is complete, RTFM (Read The Flipping Manual) and pay attention to the limitations. You're getting partial autonomy, not a self-driving car.
When a car is leaving the lane for a turn lane and is slowing down with a few inches still over the line, it is no more dangerous than the cars about to run into my rear bumper because they have no idea why I'm still riding my brake pedal. Even a couple of seconds after the turning vehicle leaves the lane, my Tesla won't pick up the pace; not only does it continue going slow, it continues to slow down further, nearly to a stop. Yeah, these are dangerous issues.
Absolutely! Dangerous issues that are easily solved by not using the TACC in the places it's not meant for. This is why the manual makes it clear that the driver is in charge and shouldn't do the thing. If you're using the handle of a screwdriver to hammer in a screw and it slips and you cut your hand, don't blame the screwdriver. Misapplication of a tool is user error.
The numbers you cite above are not the issue. While people have a known accident rate, autonomous vehicles simply have not been on the roads enough to have good statistics, given the moving target of which software version is being measured. Then there is the issue that the Tesla software is designed to depend on a human as a backup, which is a very flawed approach: humans are very poor in situations where they are a passive observer expected to take over. We won't have anything to meaningfully compare to a human driver until the system is fully autonomous and a human operator is not required.
I'll summarize by saying there are some logical fallacies in that paragraph, and as a result it is both objectively incorrect and subjectively questionable. I don't want to go into the details, as that may come across as patronizing or lecturing, and I like people to have the chance to introspect and consider what might be improved before it's pointed out directly.
===========
In general, I'm trying to figure out why people think that...
- Teslas, which use the exact same well-known technology as other cars, are expected to act differently than those other cars in the same situations
- The car should be blamed when the driver is misusing a tool it provides
- Incomplete software isn't incomplete despite being told many, many times in the manual and the UI that it's beta/incomplete/subject to limitations
So far one person has shown me two videos of unexpected braking, saying the car "slammed on the brakes". Actual events:
Video 1: The car went over a seam in a bridge that caused a forward drop and the gravitational impression of braking. Based on FxF (frame-by-frame) analysis, no actual braking occurred. You can see cars approaching the bridge braking before hitting the seam, so those drivers anticipated the buck, while the Tesla had clear road ahead and the buck was substantially worse since it didn't brake. That's an opportunity for improvement once video data is temporally analyzed.
Video 2: The leading car goes into the left turn lane and the Tesla continues to slow after that car is fully in the turn lane. FxF shows the Tesla matching the lead vehicle's delta-v until it is clear, then decelerating at approximately 2 ft/s² afterward. Matching the lead vehicle is fully expected. The continued braking afterward was because the traffic in front of the left-turner was also stopped at a signal, and there was no logical reason to accelerate toward the stopped car the way a bad driver might. That's very light braking, not even full regen. But it still comes back to: a) don't use TACC there; and b) it operated safely, just not the way you'd expect a human driver to.
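For anyone curious, the FxF arithmetic behind that ~2 ft/s² figure can be sketched in a few lines. The speed samples below are invented to reproduce a deceleration of that magnitude (they are NOT the actual measurements from the video), and the 30 fps frame rate is an assumption:

```python
FPS = 30  # assumed video frame rate

def mean_decel_fps2(speed_samples, fps):
    """Average deceleration in ft/s^2 across a window of per-frame speeds."""
    dt = (len(speed_samples) - 1) / fps  # elapsed time across the window
    return (speed_samples[0] - speed_samples[-1]) / dt

speeds = [44.0, 43.933, 43.867, 43.8]  # ft/s, one invented reading per frame
print(f"{mean_decel_fps2(speeds, FPS):.2f} ft/s^2")  # ~2.00: very light braking
```

For scale, comfortable human braking is usually several ft/s² and hard braking well over 10 ft/s², so 2 ft/s² is nowhere near "slamming on the brakes".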
I still welcome people to provide video of unexpected braking events. ^.^ Helping folks understand the technology, its limitations, and how it responds is always a good thing.