Hopefully, this suggests the whitelist is being populated with things to ignore (like overpasses).

My issue hasn't been cars in other lanes; it's been overpasses and overhead signage. Seemed way reduced today.
So, has anyone gotten the updated fixes for AP 2.0 that EM tweeted about yet?
I received an update notification last Friday but did not get my MX back until today, as it was undergoing a full XPEL wrap with Adonis Detail.
Just updated now and it is 2.50.201
Release notes appear identical to 2.50.185
He's one of the 1,000 who got the AP2 beta features, so I guess they are on a different branch/build.

FULL WRAP? Dang... give us some pics, man!
Weird, why aren't you on 2.52.xxx series?
I came up to a stopped car (red light) that was outside of Tesla Vision. I tried to engage TACC and my car did not move. I was ready to brake right away if any acceleration started to happen. I then noticed that my dashboard "saw" the stopped, non-moving car, because there was an illustration of it.
Does anyone know the differences between 2.50.201 and the other firmware versions that were released? The only difference that has been mentioned is that I got the AP2 Beta firmware as part of the initial 1000. I plan on checking with Model S owners as well to see if they have the same build that I did.
I'm guessing the model has been trained enough to identify what a stopped vehicle looks like. Your scenario is a corner case which with enough usage will be trained as well. So I will recommend you try it multiple times till the model learns it.

STILL RECOMMEND CAUTION WITH TACC UNDER 2.50.201 FIRMWARE
Had an opportunity to do city driving with the new 2.50.201 firmware and was pleased to see improved detection of immobile objects while TACC is engaged.
As an approximation, TACC picked up 9/10 immobile objects in front of me and slowed down/stopped accordingly. I am unsure if it was alterations to hardware/software/both which allowed for improved accuracy.
The way I seem to be able to determine whether TACC is "aware" of a stopped object in front of me is whether the "vehicle" is rendered as a picture on the dash.
The one situation where TACC did not succeed in immobile object detection was:
Road Conditions: Nighttime, dark. The only visible light came from my headlights, the traffic light, and the brake lights of the other car.
Other Car: White car stopped at red light in a left turn lane.
My Car: Driving with TACC at 50MPH. I merge to left turn lane.
I felt TACC wasn't going to stop, so I applied hard pressure to the brakes and stopped behind the white car. The Tesla did not register a graphic of a vehicle in front of me on the dash. The light turned green, and then a vehicle magically appeared in front as the white car started accelerating.
I do not think FCW or AEB would have kicked in had I not braked myself. Prior to 2.50.201, it would have been 0/10 detection of immobile objects, so this is a big improvement. I would like to see it at 99.999/100.
Since TACC does slow down from a long distance when it does detect an immobile object, I am considering driving with it on all the time. But you need to be cautious and alert since the detection software still needs to keep improving.
Your scenario is a corner case which with enough usage will be trained as well. So I will recommend you try it multiple times till the model learns it
I'd be amazed if the on-board learning is sophisticated enough to learn that sort of thing. In fact, it would probably be pretty dangerous if each car were independently trying to figure out what does or doesn't count as another car.
My guess is that (at best) the on-board logic is able to remember (i) locations where the driver has made the same correction multiple times in order to stay in lane, or (ii) locations where something (like an overpass) always appears in the sensors but does not block travel.
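To make the guess above concrete, here's a toy sketch of what per-location memory like (ii) might look like: the car remembers GPS spots where a stationary return (say, an overpass) was repeatedly benign and suppresses it there. This is purely illustrative; every name and threshold here is invented, and nothing is claimed about Tesla's actual implementation.

```python
import math

EARTH_RADIUS_M = 6371000.0

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points (haversine formula)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

class RadarWhitelist:
    """Remember locations where a stationary return never blocked travel."""
    def __init__(self, radius_m=50.0, min_sightings=3):
        self.radius_m = radius_m
        self.min_sightings = min_sightings
        self.sightings = []  # each entry: [lat, lon, benign_count]

    def record_benign_return(self, lat, lon):
        # Increment the count for a nearby known spot, or start a new one.
        for entry in self.sightings:
            if distance_m(lat, lon, entry[0], entry[1]) < self.radius_m:
                entry[2] += 1
                return
        self.sightings.append([lat, lon, 1])

    def should_ignore(self, lat, lon):
        # Ignore the return only after it has proven benign enough times.
        return any(
            e[2] >= self.min_sightings
            and distance_m(lat, lon, e[0], e[1]) < self.radius_m
            for e in self.sightings
        )

wl = RadarWhitelist()
for _ in range(3):
    wl.record_benign_return(37.4000, -122.1000)   # same overpass, three passes
print(wl.should_ignore(37.4001, -122.1000))        # → True (whitelisted spot)
print(wl.should_ignore(37.5000, -122.1000))        # → False (unknown spot)
```

Note this only suppresses a known-benign location; it never has to decide what counts as another car, which is exactly why it would be safer than per-car learning of object classes.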
If there is something that your car isn't handling correctly, it's best to report it to Tesla and then avoid using AP/TACC in that situation. It's not a good idea to keep intentionally running the car through scenarios it can't handle. That's just dangerous and rude to other drivers.
Hopefully, it won't be necessary to put the vehicle into valet mode in order to keep AutoPilot under control....
I just treat AP2 like a new teenage driver. You let them fly to see their limits, but never far enough to hurt themselves or reach a point of no return where you can't take the wheel back.
I only need to test enough to determine if a feature is reliable or not. I share that with others who will hopefully spread the word to prevent any mishaps which can harm other drivers and the Tesla brand.
Isn't the fleet learning model just about that? Each driver behaves differently in different situations; AP records its own behavior and the driver's behavior and uploads it to the AP servers. Similarly, all the objects the camera is scanning are being sent to the server and also processed by the on-board computer.
The on board computer has a version of the trained model and runs the scenario through it and performs an action. The server keeps processing new data and updates the model and pushes it to the on board computer on all cars.
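The loop described above can be sketched as a toy simulation: cars run a local copy of the model, upload observations, and a central server periodically retrains and pushes a new model version to the whole fleet. This is a hypothetical illustration of the general fleet-learning idea, not Tesla's actual pipeline; all names and the retrain trigger are invented.

```python
class Server:
    """Central server: collects fleet observations, versions the model."""
    def __init__(self):
        self.examples = []
        self.model_version = 1

    def upload(self, example):
        self.examples.append(example)
        # Stand-in for retraining: bump the model version after every
        # third new example (an arbitrary, illustrative trigger).
        if len(self.examples) % 3 == 0:
            self.model_version += 1

    def latest_model(self):
        return self.model_version

class Car:
    """A fleet car: runs a local model copy, uploads what it sees."""
    def __init__(self, server):
        self.server = server
        self.model_version = server.latest_model()

    def observe(self, scene, driver_action):
        # The on-board model handles the scenario; the outcome is uploaded.
        self.server.upload((scene, driver_action))

    def sync(self):
        # The server pushes the updated model to every car in the fleet.
        self.model_version = self.server.latest_model()

server = Server()
fleet = [Car(server) for _ in range(2)]
fleet[0].observe("stopped car at red light", "driver braked")
fleet[1].observe("overpass radar return", "no action needed")
fleet[0].observe("stopped car in turn lane", "driver braked")
for car in fleet:
    car.sync()
print([car.model_version for car in fleet])   # → [2, 2]
```

The key point the sketch captures is that learning happens centrally and is distributed fleet-wide, so one driver's corner case can (in principle) improve every car's behavior.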
They probably enhanced the code to do more things in this release, but the fact that MXWing was able to see it work in 9/10 scenarios is also partly the result of the neural net processing.
I think you're way overestimating the sophistication and depth of the machine learning. I doubt Tesla is using it in such a broad-based way; it would be impossible to debug. Also, I suspect there is a real limit to the amount of data they are actually uploading to the mothership. I bet most of the "machine learning" taking place is actually just simple map notation happening in-car.
No way to know that this has anything to do with neural net processing. And, in this context, 9 of 10 is a pretty horrible and dangerous result.