So we know the camera sees the trailer.
We don’t know when during this time the truck pulled into the road or at what point it would have been visible to Autopilot or an attentive driver.
Those cross-highway roads are terrible, and require full attention for this exact reason.
> ... at 5mph from 30m out, yes, but there is no hard evidence yet as to what it makes of a side-on semi from 150m at >=70mph, only the proximate supposition that it may be very little indeed.

True. We need Verygreen to take one for the team and hit the accelerator next time a truck pulls out in front of him!
Yep, that explains it. Thank you!
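Rough numbers make the worry concrete. A quick back-of-the-envelope calculation, using only the distances and speeds quoted above (nothing measured or official), shows how little time the highway scenario leaves:

```python
# Back-of-the-envelope: seconds until the car reaches an obstruction at the
# distances/speeds quoted in the thread. Constant speed, no braking assumed.
MPH_TO_MS = 0.44704  # exact mph -> m/s conversion factor

def seconds_to_cover(distance_m: float, speed_mph: float) -> float:
    """Seconds to cover distance_m at a constant speed_mph."""
    return distance_m / (speed_mph * MPH_TO_MS)

# Side-on semi first visible 150 m out while closing at 70 mph:
print(round(seconds_to_cover(150, 70), 1))  # 4.8 -- under five seconds to see it, classify it, and brake
```

At 5 mph and 30 m, by contrast, there are tens of seconds to spare, which is why a demo at parking-lot speeds says little about the highway case.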
> nope, it's not really filtered out (the 2nd line at the bottom object description comes from radar). green on Twitter

If I understand correctly, Tesla refuses to build maps that would expect the overpass in that exact place and height, and rather gambles that the cars passing there will guess the situation right every time. And then once in a while one mistakes a semi for an overpass that was never there.
> It's equally significant that the label beneath the bounding box reads "No Rad Sig",
> If I understand correctly, Tesla refuses to build maps that would expect the overpass in that exact place and height, and rather gambles that the cars passing there will guess the situation right every time. And then once in a while one mistakes a semi for an overpass that was never there.

the old style maps don't do it, replacing it with flags like "don't brake based on radar returns here" instead. Could they do better? I guess so. But apparently it's not a priority for whatever reason.
> Computers can be so dumb and clueless.

Unlike humans.
> Unlike humans.

Computers have this nice property. They do exactly what you tell them to (though there's a downside: they don't do what you want them to do)!
Sorry this is so long and done from my phone.
After sleeping on this entire subject, here is where I land on it, all the technical jargon aside.
It took about a year after getting my car, and a few dead-stopped-fire-engine accidents on highways, for this amateur Tesla owner to figure out that this car will not stop, attempt to stop, or maybe even maneuver around stopped objects on highways at speeds higher than 35 mph, or some threshold I am still trying to pin down. I had gone a year not knowing this. I thought my car did everything. All the sheepish expressions when asking a Tesla employee a question are coming into view now in my memory.
Two years prior to buying, and for years after, up until yesterday before I knew Jeremy was killed, I had written essays and recommended that people either buy a Tesla or seriously consider one as their next car. To date I have zero referrals. There are 5 Teslas in my neighborhood and I know of probably 3-4 others around my circle who own Teslas. Did I influence any of that? Either way, I now almost feel it’s my duty as another human being to inform them completely that these cars are not going to save your life. In fact they might give you a false sense that they will.
From this point forward I am certainly shutting my mouth as far as helping this cause, for fear I will have blood on my hands. What makes me think this way? If I had been able to sit down and, in just a few minutes, let Jeremy know some facts he would have found enlightening, I’m sure, being a software engineer, he would have taken that info and at least investigated for himself. Would it have changed the outcome we have here? Nobody knows.
I know for sure there are aspects of at least the Model S that cause undue loss of life. How about the blue Model S fire in Ft. Lauderdale? People reported trying to open the car door. The handles were not out, or did not pop out, upon impact. They sat and watched that person burn to death. Had they even broken the window (as one Tesla employee suggested), with all the heat and the rush, do you think they would have found the special release handle located on the door, up by the stationary wing window, that we all know about?
The one we all grab daily if you're an owner of a Model S. Think about it. It's not easily understood.
On my first long trip in my car, I showed the other “emergency release” to my daughter, since she rides in the back seat. I wanted her to know how to get out in the event of a crash where the electronics shut down (the back doors depend 100% on electricity), so as to perhaps save her life. She could barely do it after struggling to find it under her seat. So that’s it: special releases for people to exit a badly damaged or burning vehicle, and in the Model S it’s just the rear passengers who have to know that. Now, if any Model S owner just read this and you never knew it, well, you need to get your manual out on your computer and read it front to back, then do it again in another week.
So my point is this: with this car, this entanglement of tech and not-so-tech, before a layman, laywoman, person, or child uses it, rides in it, or drives it, you would have to hold classes on every aspect of the car, then do it again, before you could safely say I informed, I educated, I have done my part to make sure everyone understands what to do in an event, i.e. stopped fire engines, 90-degree-facing semis, accidents, fire, on and on. In fact the Model S is the only car I have ever owned where you would have to do that. Dare I say there are other Teslas where it’s a requirement; I just have not read the manual and done the in-depth study on those vehicles like I have here. Still learning.
Now, Jeremy was competitive. He was also a jealous personality. He was human. How do I know this? The camera quote in my previous posts. He, like so many humans, fed his ego on the fact that you’re behind on your tech. He was a software engineer after all. I cannot help but think he kept up with us through Facebook, and with those posts of my red Tesla Model S, well, he was going to do one better with a red Model 3. You see, we are the marketing team for Tesla. The whole universe is looking at our cars. Last night I must have had at least 6-10 people I noticed at lights looking at my car. Those are just the ones I noticed; probably well over 1,000 in the whole trip. These are special cars, and they take special understanding. Even then, I am not convinced they are that special any longer. I’m sorry for myself, for others, and for the planet, because the dream of an electric car has been mine from a very early age. Unfortunately Tesla has taken it too far, too fast, and made it so human beings are not able to understand fully what they have gotten into here. Elon Musk wants to win; the cost for that is steep. The cause I get; the speed at which he is trying to do it, that’s business.
I in no way blame myself for anything, though I do feel it important to educate. Perhaps I have found my next calling. Thanks for reading, and safe travels in whatever you’re driving. Have a great weekend.
Will FSD be able to incorporate such learnings and experience? I hope so, but some of these seem to be pretty tricky problems.
> In the AP presentation, Karpathy spoke about exactly that sort of thing: collecting data on cars that entered the lane (or didn't) without using turn signals. So they collected a bunch of fleet data where the cars did move over, and ones where the NN thought they would but didn't. That data was used to improve the NN.

I wish they would roll this out to AP. Currently it doesn't seem to react until the car in front is almost entirely in the lane.
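A minimal sketch of what that fleet auto-labeling could look like. Everything here (names, thresholds, the 10 Hz frame layout) is my own invention for illustration, not Tesla's pipeline; the idea is just to record a neighboring car's lateral offset over time and label each moment by whether the car actually entered the ego lane within a short horizon afterwards.

```python
# Hypothetical auto-labeling of cut-in events from logged tracks.
# Offsets are the tracked car's lateral distance (m) from ego-lane center,
# sampled at a fixed rate (e.g. 10 Hz). All constants are illustrative.

LANE_HALF_WIDTH_M = 1.8   # assumed half width of the ego lane
HORIZON_FRAMES = 40       # look-ahead window, e.g. 4 s at 10 Hz

def label_cut_in(lateral_offsets_m, t):
    """True if the tracked car's center crosses into the ego lane
    within HORIZON_FRAMES after frame t."""
    window = lateral_offsets_m[t + 1 : t + 1 + HORIZON_FRAMES]
    return any(abs(x) < LANE_HALF_WIDTH_M for x in window)

# A car drifting from the adjacent lane (offset ~3.5 m) toward ours:
track = [3.5 - 0.1 * i for i in range(45)]
print(label_cut_in(track, 0))  # True: it crosses into the lane within the horizon
```

Clips where such an after-the-fact label disagrees with what the network predicted at frame t (a predicted cut-in that never happened, or a missed one) are exactly the ones worth pulling back from the fleet for retraining.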
> I wish they would roll this out to AP. Currently it doesn't seem to react until the car in front is almost entirely in the lane.

they did. All the reports of people saying "AP now reacts to blinkers and lets cars merge in! I saw it!" are from that code. It's just not very robust, it appears.
> btw, since I already extracted it at the request of other people, this is the relevant Tesla adas maps snapshot for the place in question that others might find interesting.
> dated "2019-04-26", so already post-accident. I am sure NHTSA has full access to this and other maps data in use from the date of the accident, but who knows when they will actually report everything they learned.
> Edit: the attachment does not seem to load, so here's the externally hosted version: maps of the crash site

What is radar braking and where is it enabled?
> they did. All the reports of people saying "AP now reacts to blinkers and lets cars merge in! I saw it!" is from that code. It's just not very robust it appears.

I just wish it would react better to cars halfway in the lane. It just seems to have no reaction until the other car is almost all the way in the lane. The problem could be that they're training it using Tesla drivers.
> What is radar braking and where is it enabled?

Radar braking is whenever the car would brake for stationary objects in the car's path based solely on radar. Areas that have bridges/overpasses/... that cause "phantom braking" are marked with the flag to not brake on radar alone.
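As described, this amounts to a per-location flag lookup layered on top of the radar track rather than a full map of overpass heights. A minimal sketch of that design (the tile scheme, coordinates, and flag name are all invented for illustration; the real ADAS map format is not public):

```python
# Sketch of map-flagged radar braking as described above. Tiles known to
# contain overpasses/bridges that trigger phantom braking carry a
# "don't brake on radar returns alone" flag. All values are hypothetical.

NO_RADAR_BRAKE_TILES = {(2770, -8240)}  # flagged tile ids (invented)

def tile_id(lat: float, lon: float) -> tuple:
    """Stand-in geo-tiling scheme: ~0.01-degree tiles."""
    return (int(lat * 100), int(lon * 100))

def allow_radar_only_braking(lat: float, lon: float) -> bool:
    """Radar-only braking for stationary returns is suppressed in flagged tiles."""
    return tile_id(lat, lon) not in NO_RADAR_BRAKE_TILES

print(allow_radar_only_braking(27.701, -82.405))  # False: inside a flagged overpass tile
print(allow_radar_only_braking(28.100, -81.500))  # True: unflagged, radar braking allowed
```

The obvious downside, as noted above, is that such a flag suppresses braking on a real stationary obstacle in that spot just as effectively as on the overpass it was meant to ignore.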
> Computers have this nice property. They do exactly what you tell them to (though there's a downside, they don't do what you want them to do)!

That's why I don't have faith in AI.
> That's why I don't have faith in AI.

Alternatively:
Computers are good at following the rules that the programmer sets.
For example, a large Triangle on the side of the road up ahead. What could it be?
Computer: clueless.
Human: I see the Triangle is being carried by a human whose head and torso are hidden from view behind the Triangular object.
And he is about to cross the road, and he does not see me.
I had better be ready to stop if he steps on the road.