Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Navigate on Autopilot is Useless (2018.42.3)

Interesting - thanks for the thoughts @wk057. I live in a rural area and haven't been able to test NoAP given our roads. That said, we also own both AP2.5 and AP1 Teslas (Model 3 and Model S). My experience around here is that the AP2.5 car is far superior to our AP1 car with regards to general Autopilot performance. I wish it read traffic speed signs like our AP1 Model S, but beyond that it's proven to be a much better performer.

It really just depends on the roads. With clear lane markings and minimal hills-in-curves, AP2 does pretty well on rural roads. Around here, our back roads have a lot of elevation change, and a lot of that in curves. AP2 either gives up or makes really dumb decisions (my best example is a place where AP2 still comes out of the curve locked on to the lane of oncoming traffic, if I let it). AP1 usually makes the effort, and at reasonable speeds for the curve will usually make it happen.

Definitely a YMMV situation.
 
@wk057

To your point about the visualization of the car 'swimming' or knocking into each other, please realize that this is just Tesla trying to dumb down the image processing algorithm's data for our perusal.

The actual data being generated and reacted upon is exceptionally accurate to the last inch. Since a lot of sensor fusion is happening (ultrasound/camera), it is impossible to show all that data accurately on the screen for the driver.

Considering all the variables here, it's a surprise to me that Tesla even bothers with the visual representation, because buyers normally don't look past what is directly in front of them and misconstrue it as the software not working.
 
AP1 hasn't had a single improvement in about two years

Just a few months ago AP1 used to false-positive brake on an overpass -- every single time. Now it doesn't do it, ever. I don't know if it was a whitelist improvement or an actual design improvement, but it is an improvement.

I also think it takes sharper turns with more precision and a better likelihood of staying within the lane lines on sharp curves.

I also think it has gotten better at dealing with going over hills -- hopefully it will catch up to AP2, which doesn't have those problems.

  • NavOnAP has no concept of "Keep Right, Pass Left". It never suggests lane changes back to the right in any of the available modes.
  • Further, it randomly suggests lane changes to the left for no reason whatsoever. No traffic, no interchanges, nothing.

That is really bad. All driver assistance tech should build in -- stay out of the left lanes except to pass.

some of my post history for the past ~5 years before.

Each post should stand on its own terms. You can count on new readers coming to this forum, and we can't reasonably expect them to do due diligence on each poster.

Your post seemed unnecessarily negative. You could have described the exact same experience with NavOnAP, even pointing out its faults and areas for further improvement, without all that negativity. It would have been more effective at conveying the information, both good and bad, and it would have been more constructive. But it's not really a big deal; I do the same thing all the time and try to be better.
 
It really just depends on the roads. With clear lane markings and minimal hills-in-curves, AP2 does pretty well on rural roads. Around here, our back roads have a lot of elevation change, and a lot of that in curves. AP2 either gives up or makes really dumb decisions (my best example is a place where AP2 still comes out of the curve locked on to the lane of oncoming traffic, if I let it). AP1 usually makes the effort, and at reasonable speeds for the curve will usually make it happen.

Definitely a YMMV situation.

It's so weird because I would say our experience is the opposite on those examples too. With limited paint striping, our AP2.5 car shines in comparison to our AP1. It will engage in tougher situations (less paint) and stay engaged longer. And on roads with rolling hills and curves, the AP2.5 car is superior also. Far less ping-ponging going over rises. Seems a lot more confident. AP2.5 for sure centers up way better on roads that only have a center stripe. I can't even use our AP1 car on such roads as it hugs the right side of the road too much and ends up hitting breaks (potholes) in the asphalt on the side of the road. Learned this the hard way with AP1.
 
The car regularly suggested lane changes directly into objects it clearly detected. It would even show the proposed path on the visualization as going directly through the other vehicle. In one instance I wondered if it really was going to let me change lanes into a semi truck, or if it would wait until it was clear. Nope, it started to move right towards it after confirmation. No red lane, nothing, while directly alongside a semi. *shakes head*
Did you ever try initiating or confirming one?
It has always done this, but if you do confirm the lane change it will wait for the vehicle to move (sometimes it will brake to get behind it, sometimes it will get in front); the lane will only change color once you confirm the lane change. In around 500 miles it has never moved me into a lane with a vehicle. I had the opposite happen: it refused to switch lanes once because it saw a large semi in my left lane, but the truck was two lanes over; a second or two later it realized the truck was not really there, and then it started the lane change.

It would also set a seemingly random max speed at times, with no speed limit changes or interchanges.
This is pulled from either the map database speed signs (which, as you mentioned, may be stale), the map database road type, or the driver's history. If you always set it to 53 on a 55 road, it seems to remember that for the road type, and the next time you engage it in a 55 zone it goes back to the limit you last had configured. (I suspect it is also remembering it based on specific geographic road areas, but I haven't measured it carefully enough to confirm that.)
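The remembered-offset behavior suspected above would be simple to implement with a small per-zone (or per-segment) store. A sketch of that guess (purely hypothetical, inferred only from the observed behavior; all names are mine, not Tesla's):

```python
# Hypothetical model: remember the driver's last set speed per speed-limit
# zone (or road-segment id) and restore it on the next engage.
last_set_speed = {}  # key: posted limit (or segment id) -> driver's last choice

def on_driver_adjusts(posted_limit, chosen_speed):
    """Record the driver's chosen set speed for this zone type."""
    last_set_speed[posted_limit] = chosen_speed

def initial_set_speed(posted_limit):
    """On engage, restore the previous choice; fall back to the posted limit."""
    return last_set_speed.get(posted_limit, posted_limit)

on_driver_adjusts(55, 53)      # driver dials back to 53 in a 55 zone
speed = initial_set_speed(55)  # next engage in a 55 zone restores 53
```

This would reproduce exactly the "it goes back to the limit you last had configured" behavior without the car ever reading a sign.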
AP2 still doesn't read speed limit signs, so the noted speed limit doesn't always match the real highway speed limit in areas where it was recently upped or lowered (happens a lot around here with places bumping to 70).
I hope this and/or stop signs and traffic lights are next, unless they've run out of processing power and need their new ASIC/TPU (AP3).

Vehicle detection to the sides and behind your vehicle is complete garbage.

This seems to be based on what's rendered and not on what's actually acted upon, so it's hard to know for sure. Maybe you can get into the firmware and print the real readings for all the vehicles around it instead of the rendered ones?
 
Thanks for the report, Jason. Makes me a bit worried that they'd ship something like this to customers. Hopefully soon remedied? Could it be an issue with some specific combination of your particular hardware and some specific iteration of code?
 
@wk057

To your point about the visualization of the car 'swimming' or knocking into each other, please realize that this is just Tesla trying to dumb down the image processing algorithm's data for our perusal.

The actual data being generated and reacted upon is exceptionally accurate to the last inch. Since a lot of sensor fusion is happening (ultrasound/camera), it is impossible to show all that data accurately on the screen for the driver.

Considering all the variables here, it's a surprise to me that Tesla even bothers with the visual representation, because buyers normally don't look past what is directly in front of them and misconstrue it as the software not working.

Overall, I disagree. While I believe visualizing the data may prove moderately difficult, it's not *that* difficult. Keep in mind, AP1 does this just fine, and AP2 has had the same issue even before bringing more cameras online. So, I don't buy this particular logic.


Just a few months ago AP1 used to false-positive brake on an overpass -- every single time. Now it doesn't do it, ever. I don't know if it was a whitelist improvement or an actual design improvement, but it is an improvement.

I also think it takes sharper turns with more precision and a better likelihood of staying within the lane lines on sharp curves.

I also think it has gotten better at dealing with going over hills -- hopefully it will catch up to AP2, which doesn't have those problems.

I have copies of firmwares from initial AP1 release to present. I can assure you, nothing has changed with the core of AP1 in nearly 2 years.

The overpass false positive improvement is in fact a whitelist situation, as Tesla is still processing "ADAS tiles" for both systems.

The other "improvements" you note are, frankly, your imagination... most likely other circumstances impacting the performance of the system (shadows/sun position, windshield cleanliness, etc.). I actually found that AP works significantly better when nearby trees have leaves vs. when they don't, likely due to the huge variations in shadowing.
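For what it's worth, a whitelist fix of this kind is conceptually trivial, which fits the "ADAS tiles" explanation: suppress braking for stationary radar returns near known overhead structures. A minimal sketch (the tile format and every name here are my own guesses, not Tesla's actual schema):

```python
import math

# Hypothetical whitelist of stationary overhead structures (overpasses,
# sign gantries) keyed by rough GPS position. Tesla's real "ADAS tiles"
# format is not public; this is purely illustrative.
OVERPASS_WHITELIST = [
    (35.9132, -79.0558),  # example coordinate of a false-positive overpass
    (35.9871, -78.9022),
]

def near_whitelisted_structure(lat, lon, radius_m=50.0):
    """True if (lat, lon) is within radius_m of a whitelisted structure."""
    for wlat, wlon in OVERPASS_WHITELIST:
        # Equirectangular approximation is plenty accurate at ~50 m scales.
        dx = math.radians(lon - wlon) * math.cos(math.radians(wlat)) * 6371000.0
        dy = math.radians(lat - wlat) * 6371000.0
        if math.hypot(dx, dy) <= radius_m:
            return True
    return False

def should_brake(radar_target_stationary, lat, lon):
    """Suppress braking for stationary returns near whitelisted overpasses."""
    if radar_target_stationary and near_whitelisted_structure(lat, lon):
        return False  # almost certainly overhead structure, not a stopped car
    return radar_target_stationary
```

Note this kind of fix improves one specific failure mode everywhere a tile exists, without any change to the core perception stack, which is consistent with "nothing has changed with the core of AP1".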


[Not keeping right] is really bad. All driver assistance tech should build in -- stay out of the left lanes except to pass.

Agreed, which is why this was a bit confusing to me to find this behavior.

Each post should stand on its own terms. You can count on new readers coming to this forum, and we can't reasonably expect them to do due diligence on each poster.

Your post seemed unnecessarily negative. You could have described the exact same experience with NavOnAP, even pointing out its faults and areas for further improvement, without all that negativity. It would have been more effective at conveying the information, both good and bad, and it would have been more constructive. But it's not really a big deal; I do the same thing all the time and try to be better.

I'll give this the benefit of appearing to be fair criticism. Unfortunately there's just no way to spin this in a positive manner. The post is about the failings of Tesla, their AP teams, and NavOnAP in general. Not much neutral or positive spin to those things.
 
Vehicles behind your vehicle are actually detected only part of the time, apparently due to some issue with the rear cam setup in the hardware (@verygreen I believe has documented this).
huh, even on the Model 3? I thought their different setup made it immune, but I guess not?

anyway I mostly agree with your experience to the point that I am not even bothering using this feature lately.
 
It really just depends on the roads. With clear lane markings and minimal hills-in-curves, AP2 does pretty well on rural roads. Around here, our back roads have a lot of elevation change, and a lot of that in curves. AP2 either gives up or makes really dumb decisions (my best example is a place where AP2 still comes out of the curve locked on to the lane of oncoming traffic, if I let it). AP1 usually makes the effort, and at reasonable speeds for the curve will usually make it happen.

Definitely a YMMV situation.
did you try Tail of the Dragon on AP1? I did yesterday and was super underwhelmed; the errors came so fast I needed to abandon the idea entirely (AP footage of that forthcoming shortly), but it might be interesting to compare to AP1
 
I just gave unmodified, unhacked, fully stock 2018.42.3 Navigate on Autopilot a good try with an open mind on a ~300 mile drive this weekend.

I've seen some videos, tweets, posts, etc. praising the feature. I hadn't really had a chance to try it on a longer drive myself until this weekend, though.

TLDR version: This is the most useless thing I've ever seen. I've seen some whoppers, but this takes the cake.

Let's do a rundown of what I think was improved:
  • Autosteer in highway interchanges and off-ramps was improved. It would stay in the ramp without too much trouble, while prior it would freak out and demand the driver intervene for sharper curves. (We'll ignore that it was taking the turns at ~15 MPH lower than the suggested speeds, but baby steps I suppose).
  • Visual indication of what travel lane was needed for upcoming interchanges was reasonable and a good addition to normal navigation.
  • I do like the path visualization when lane changes are initiated.
  • It does usually try to take exits without intervention (more on this later) which is a step in the right direction for on-ramp to off-ramp autopilot.

So, some improvements I suppose.

Now for the bad.
  • I now fully understand why Tesla makes it require confirmation. If it had been allowed to make the suggested lane changes on its own without confirmation, I'd likely have died 10-20x if I didn't take control every time.
  • AP1 and AP2 previously did *okay* when following a lane that ended and gradually merged into a single lane. While using NavOnAP this weekend, the car just wanted to make its own lane every time instead of merging... usually trying to run into a barrier or median, requiring intervention every time.
  • The car regularly suggested lane changes directly into objects it clearly detected. It would even show the proposed path on the visualization as going directly through the other vehicle. In one instance I wondered if it really was going to let me change lanes into a semi truck, or if it would wait until it was clear. Nope, it started to move right towards it after confirmation. No red lane, nothing, while directly alongside a semi. *shakes head*
  • NavOnAP has no concept of "Keep Right, Pass Left". It never suggests lane changes back to the right in any of the available modes.
  • Further, it randomly suggests lane changes to the left for no reason whatsoever. No traffic, no interchanges, nothing.
  • I found the car randomly decelerating at least 10x during the trip with no obvious cause. More common when driving in the right lane vs left. It would also set a seemingly random max speed at times, with no speed limit changes or interchanges.
  • AP2 still doesn't read speed limit signs, so the noted speed limit doesn't always match the real highway speed limit in areas where it was recently upped or lowered (happens a lot around here with places bumping to 70).
  • At least once the car detected a construction zone with a popup about it (kudos on that) and then immediately proceeded to try and suggest a lane change into construction cones..... which negates this from making the "improvements" list above.
  • Overtake suggestions are useless. On two lanes, driving in the right lane, I would approach a vehicle ahead that was traveling more slowly. No other traffic. The car would decelerate... 5.... 10.... 15 MPH.... as it sees the vehicle. Then, after matching its speed at my set following distance, a few seconds later it'd pop up "Confirm lane change" to overtake. Seriously, wtf. And not just once in a while. Every single time I waited for the suggested change, it behaved this way. In every mode setting, including "Mad Max".
    • The car detects the other vehicle way in advance, even when just using the in-car visualization for reference, and could easily make the suggested lane change early enough so that no deceleration at all would be needed, even with the delay of requiring confirmation.
  • On multiple occasions the car would start doing a lane change (either a confirmed one, a manually initiated one, or an automatic one for an exit), get part way through, and quickly veer back into the starting lane for no reason. About half of those times it would pop up with "Lane change cancelled". In one instance I actually missed an exit because it was two-thirds of the way into the exit ramp lane, stayed there a moment, then just jumped back to the left for no reason.... ugh.
  • Even features that were usable before, like manually initiated auto lane changes, are no longer reliable.

Overall, using "Navigate on Autopilot" did not improve the experience of using Autopilot at all, with the limited exception of autosteer's new ability to mostly keep in lane on a tight interchange... with that being negated by the fact that it tries to kill you any time a lane ends. Also, it seems that the ability to take tight interchanges is mostly thanks to nav fusion, as the vision model does not appear to be properly detecting lanes in some of these situations, yet the system presses onward.

The suggested lane changes were completely useless on every mode. It would either suggest changes that weren't necessary, weren't safe, or weren't useful. It was even suggesting lane changes for an interchange upwards of 8 miles away at one point, then refusing to suggest overtake lane changes until after that interchange.

Some more notes:
  • Vehicle detection to the sides and behind your vehicle is complete garbage.
    • This is super obvious when sitting still with other still vehicles all around. You'll see them "swimming" around the visualization, colliding with each other, with you, etc.
    • Also obvious when overtaking large vehicles. Almost every single semi truck, bus, or RV I passed ended up with a twin ghost visual on the screen.
    • Finally, vehicles to the side are regularly shown overlapping my own vehicle visual, despite them being firmly in their own lane.
    • Vehicles behind your vehicle are actually detected only part of the time, apparently due to some issue with the rear cam setup in the hardware (@verygreen I believe has documented this).
  • It seems very obvious that Tesla has no real data fusion whatsoever between the cameras. This results in both huge gaps in the usable data as well as duplicate data (like the ghost trucks). This is computer vision 101 stuff; I don't understand why Tesla hasn't overcome it, especially in something shipped to thousands of customers.
  • Radar/vision fusion on AP2 appears to be significantly worse than on AP1, with AP1 easily accurate to within a few cm... AP2 easily worse than +/- 1 m... very obvious when looking at the lead vehicle visualization.
  • Some of the failings of NavOnAP don't even make sense. If it clearly "sees" a vehicle, it seems like a basic sanity check in the higher level code would prevent it from suggesting a lane change into it.... but this isn't what happens.
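On the ghost-truck point: deduplicating detections of the same object seen by overlapping cameras is a standard step, and it is exactly what would collapse the "twin ghost" semis into one track. A toy sketch of the usual greedy IoU approach, assuming per-camera detections projected into a common frame (my own illustration, not Tesla's code):

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def dedupe(detections, threshold=0.5):
    """Greedily keep the highest-scoring detection per heavily-overlapping group.

    detections: list of (box, score). Two cameras' views of one truck,
    projected into the same frame, merge into a single object instead of
    rendering as a ghost twin.
    """
    keep = []
    for box, score in sorted(detections, key=lambda d: -d[1]):
        if all(iou(box, kept_box) < threshold for kept_box, _ in keep):
            keep.append((box, score))
    return keep

# Two cameras report the same truck at nearly the same place, plus one other car:
dets = [((0, 0, 10, 4), 0.9), ((0.5, 0, 10.5, 4), 0.8), ((30, 0, 35, 4), 0.7)]
merged = dedupe(dets)  # the near-duplicate truck boxes collapse to one
```

This is essentially non-maximum suppression applied across camera outputs rather than within one image; the hard part in a real car is the projection into a shared frame, not the suppression itself.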

Could probably go on for quite a while, but suffice it to say I won't be using the feature any further... not at least until it's actually useful.

It doesn't improve the experience of using autopilot for me one bit. In fact, it makes it even more frustrating. This is ignoring the super frequent nags that plague the more recent firmwares, too.

I'll be sticking to my AP1 vehicles for longer trips from now on I think. In fact, I'm probably going to try and make time to make some videos/posts about AP1/AP2 modifications that are actually useful.

For example, my modded AP1 vehicle would handle the situation I noted above (overtaking a vehicle) smoothly with zero deceleration. AP1 (and AP2) can detect a vehicle ahead of you over 100 m away... no excuse for the behavior of NavOnAP.
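To put rough numbers on that claim: with a lead vehicle first detected at 100 m and a 15 mph closing speed, there are nearly 15 seconds before you reach it, which leaves ample time to suggest and confirm a lane change before any deceleration is needed. A back-of-the-envelope check:

```python
def seconds_to_close(gap_m, closing_mph):
    """Time until the following car reaches the lead car, constant speeds assumed."""
    closing_ms = closing_mph * 0.44704  # mph -> m/s
    return gap_m / closing_ms

t = seconds_to_close(100, 15)  # margin at 100 m detection, 15 mph closing
```

Even if confirmation takes a few seconds, roughly 15 s of margin makes the observed "decelerate first, suggest later" behavior a planning choice rather than a perception limitation.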

I'm just super disappointed in Tesla. Their spat with Mobileye has cost Tesla customers a huge amount of progress on the autopilot front. AP1 owners are completely screwed because they will get zero improvements. (Despite promises of ongoing improvements, AP1 hasn't had a single improvement in about two years). Meanwhile, AP1 is running on Mobileye hardware that was released nearly 5 years ago and still handles many situations better than AP2. And it's not like Mobileye has stopped. They're positioned to blow Tesla out of the water with their current hardware (EyeQ4), and off the face of the Earth with their upcoming hardware (EyeQ5). Had Tesla not screwed us all over in that regard, it's likely AP1 would still be improving and that AP2 would be running the next gen of Mobileye hardware with features well beyond what Tesla is capable of doing today. Again, just disappointing that they've decided to forsake early adopters yet again, and also give current adopters less value for their $ in the meantime.

I'm sure people will come out in force to defend Tesla, say how great NavOnAP is, etc... and by all means, do what you must. I personally own both types of vehicles (AP1 and AP2) and drive both regularly... pretty simple to tell the deficiencies of AP2. You're not going to convince me that somehow my extensive first-hand experience is somehow flawed and that things are way better than I claim. lol.

I find it hard to believe that as a one-man free-time dev crew I've been able to do better than Tesla's entire multi-million-dollar-funded AP dev team has been able to do in more than two years. I have maybe 40 hours of total work into my AP1 mods, and they've been more usable than NavOnAP for nearly two years. I just don't get it. I'm pretty good at what I do, but I can't believe I'm better-than-full-teams-with-millions-in-funding good.

To that end, when/if I get the time, I'm going to set a goal for myself of ~25 hours of work to make a hardware/software modification to an AP2 vehicle that actually does what NavOnAP is supposed to do. Start to finish, from scratch. It's been suggested that I get some basics prepped (hardware I need in-hand, for example), then keep a GoPro on a chesty running until the modifications are done and working to document the entire thing.

If doable, then I think it would be even more obvious that we have to reevaluate Tesla's progress on the driver assistance front.

Anyway, enough of that for now.

Disclaimer: I have no positions with any of the companies mentioned nor do I intend to initiate one at any point in the future.

First, thanks for your informative post and accounting of the issues you found. I'm not nearly as knowledgeable about the inner workings of AP V42 nor Tesla SW in general, and I find your posts very useful, so please continue informing us here.
I just completed 3 trips with AP V42.2 on my '17 X (V2.0 HW) and '17 3 (V2.5 HW). So my recent experience is ~200 miles of auto-navigate on I-5 in CA with the X, and about 150 miles of travel on various freeways in SoCal with the Model 3.

Relative to "bad" points 1-11 in your list:
1) I assumed detection of stationary objects in the field of travel is no better with this version and saw no more or fewer situations in which AP might kill me (meaning 1 situation in a construction zone in the Model 3).
2) I did not experience this.
3) I had the opposite experience and tested just for this scenario because I'm concerned with how much trust my spouse puts in AP.
Examples:
A) Merging onto freeway traffic with the entrance below freeway grade and a semi truck on a collision course with my merge. The Model X detected the truck, slowed, and just waited for the truck to pass before flawlessly merging into the freeway lane. This was impressive to me because of the poor visibility due to bad geometry for seeing the traffic.
B) I had exactly the scenario you described, with a large truck merging from lane 2 into lane 3 just as I confirmed my merge from lane 4 into lane 3 to AP. AP detected the truck as my Model 3 was just over the lane line and then quickly, but not abruptly, came back into lane 4, avoiding the truck. Then it asked to re-confirm the lane merge after the truck passed. It was good driving.

4) Totally agree. It also doesn't detect disabled or emergency vehicles on the shoulder, which by federal law require slowing or changing lanes (preferably the latter).
5) Have not experienced this.
6) I have seen this a couple of times. It's disturbing. I think both times were in construction zones that had no actual construction. I wonder if it's a DOT database issue? Anyway, I concur.
7) Yes! AP, please use posted speeds.
8) Around here, I haven't come across the cones scenario; I'll take your word for it. In general, I avoid AP use in construction zones and advised my spouse not to use it there either.
9) Mad Max mode fixed that issue for me. It was my greatest frustration during initial test before I RTFM.
10) Agreed. I've seen this too. In fact, in general I get frustrated with just not comprehending what the hell AP is "thinking". I do note that sometimes it seems to want to get over to lane 1 quickly, I suspect because I have HOV set as a preference. But the lane-change freakout has happened twice in thick traffic and I'm sure it PO'd drivers around me. (Just bad driving, AP.)
11) I haven't noticed this. Manual lane changes always worked for me.

It's not useless for me, based on a few days of testing. I found it pleasant, with Mad Max, on my long drive down I-5 dealing with truck traffic with only 2 lanes to work with. Over the winding Grapevine section, AP seemed no better or worse than before, meaning I had 2 interventions due to scary AP driving over a 40-mile section. But I was prepared for that. I also don't know why auto-nav engages the turn signal to exit the freeway but not to enter. Is that not a thing? I always do.
I'm going to wait a version or 2 before I OK auto-nav for general family use, but I'll keep using it, if for no other reason than to provide Tesla with stats. TLDR: two steps forward, one back. Inverse for you, I guess.
 
Honestly, that's kind of my hope if I'm able to get around to doing my project on an AP2 car. If me doing some amazing stuff with an AP2 car with modded hardware/software ends up being a bit of a spur to Tesla's AP team's rear...... well, then success! My hope is that afterward someone will walk into that meeting and be like, "OK, watch this, then tell me wtf you guys have been doing the past two years? (Presses play on YouTube video)"

First of all, amazing thread, big fan of your work.

Second, I truly hope the post I quoted happens for the benefit of all Tesla customers. I want you to embarrass the f*** out of their AP team. Please please please make a video with your mods and get them to say “s*** why is this one guy beating us?”

Embarrassing progress in 2 years. I would love to tweet your video at Elon, and say quit ****ing around on twitter and DO THIS.
 
@wk057

To your point about the visualization of the car 'swimming' or knocking into each other, please realize that this is just Tesla trying to dumb down the image processing algorithm's data for our perusal.

The actual data being generated and reacted upon is exceptionally accurate to the last inch. Since a lot of sensor fusion is happening (ultrasound/camera), it is impossible to show all that data accurately on the screen for the driver.

Considering all the variables here, it's a surprise to me that Tesla even bothers with the visual representation, because buyers normally don't look past what is directly in front of them and misconstrue it as the software not working.
Well, if they can't get the fused sensor data stable/accurate for the display, what does that say about what the car is using as inputs to its driving algorithms? The display is a whole lot easier to control than the car. When I was doing that kind of stuff, the display showed the system's view of what was going on, for debugging if nothing else.

Since it's not physically possible for vehicles to bounce around the way the display shows, that says there's something seriously wrong with how they are modeling the environment around the Tesla. If Tesla is doing some kind of physical model, then it's pretty bizarre that they display the surrounding traffic so badly; if not, then IMHO they're going down a dead end.

Also, anyone who isn't aware of the positions of cars around them when driving is going to have a pretty bad experience sooner or later.
 
Some more notes:
  • Vehicle detection to the sides and behind your vehicle is complete garbage.
    • This is super obvious when sitting still with other still vehicles all around. You'll see them "swimming" around the visualization, colliding with each other, with you, etc.
    • Also obvious when overtaking large vehicles. Almost every single semi truck, bus, or RV I passed ended up with a twin ghost visual on the screen.
    • Finally, vehicles to the side are regularly shown overlapping my own vehicle visual, despite them being firmly in their own lane.
    • Vehicles behind your vehicle are actually detected only part of the time, apparently due to some issue with the rear cam setup in the hardware (@verygreen I believe has documented this).
  • It seems very obvious that Tesla has no real data fusion whatsoever between the cameras. This results in both huge gaps in the usable data as well as duplicate data (like the ghost trucks). This is computer vision 101 stuff; I don't understand why Tesla hasn't overcome it, especially in something shipped to thousands of customers.
  • Radar/vision fusion on AP2 appears to be significantly worse than on AP1, with AP1 easily accurate to within a few cm... AP2 easily worse than +/- 1 m... very obvious when looking at the lead vehicle visualization.
  • Some of the failings of NavOnAP don't even make sense. If it clearly "sees" a vehicle, it seems like a basic sanity check in the higher level code would prevent it from suggesting a lane change into it.... but this isn't what happens.

Tesla's motion planning and control algorithms are straight garbage, among other things, in addition to their terribly inaccurate and inefficient network, my friend. But be careful: you can easily go from a saint to a foe around these quarters now.

I'm sensing a theme that a bunch of the proponents who aren't having as many issues are driving California routes.

The country is more than California, despite what many Californians may believe. This is probably also why rain sensing on AP2+ absolutely sucks.

Jimmy has this entire forum hypnotized!
I have even felt strong resistance from @lunitiks and @S4WRXTTCS of all people.

Look at the stuff coming out of the Tesla camp quoting him...
Deep Dive Into Tesla's Autopilot & Self-Driving Architecture vs Lidar-Based Systems | CleanTechnica

It's not safe making posts like yours, though, with the current climate; even I am taking precautions. So...

 
I'm sensing a theme that a bunch of the proponents who aren't having as many issues are driving California routes.

The country is more than California, despite what many Californians may believe. This is probably also why rain sensing on AP2+ absolutely sucks.
There would be a huge benefit in all aspects of the car for them to have engineers outside California. List of gripes is far too long.
 
I'm just super disappointed in Tesla. Their spat with Mobileye has cost Tesla customers a huge amount of progress on the autopilot front. AP1 owners are completely screwed because they will get zero improvements. (Despite promises of ongoing improvements, AP1 hasn't had a single improvement in about two years). Meanwhile, AP1 is running on Mobileye hardware that was released nearly 5 years ago and still handles many situations better than AP2. And it's not like Mobileye has stopped. They're positioned to blow Tesla out of the water with their current hardware (EyeQ4), and off the face of the Earth with their upcoming hardware (EyeQ5). Had Tesla not screwed us all over in that regard, it's likely AP1 would still be improving and that AP2 would be running the next gen of Mobileye hardware with features well beyond what Tesla is capable of doing today. Again, just disappointing that they've decided to forsake early adopters yet again, and also give current adopters less value for their $ in the meantime.
Completely respect your work. From a high level, it appears your input, and therefore your output, is limited in scope.

It appears you are only evaluating this at a point in time, on technical aspects, and on a limited type and length of roads.
Have you considered what limitations (restrictions) and costs Mobileye put on Tesla?
Are you considering that Tesla can now accelerate since they have this in-house?
Have you reviewed all of Jimmy_d's posts on this? Neural Networks
How about listening to this podcast/interview: Interview: Neural Networks & Autopilot V9 With Jimmy_d (10.29.18) - TechCast Daily

What about this:
Good discussion he's having on why the cars on the display are not displayed accurately. HTH

jimmy_d, Sunday at 6:21 PM
Neural Networks


Was having a conversation about this with someone the other day. I think the display behavior you point out must be confusing to a lot of people, because I've heard it from pretty much everybody. It's obvious when you think about it, but what appears on the display is not what the car is using to drive; it's a separate output that is just there to give the driver some utility. If the car's driving decisions were based on the same interpretation of that display data that a human uses, it would be braking and swerving all over the place. But the car doesn't do that. The display info is just for driver consumption, and it obviously still has some problems.

The display stuff is probably getting interpreted heuristically from the camera network outputs. The camera networks provide a smooth probabilistic output that is not very easy to display with a human written algorithm. But you can't put the NN output directly on the display either. So the display code probably looks at periodic snapshots of the camera network outputs in order to provide a low latency output and it has to have a hard threshold for deciding whether some vehicle 'exists' at some point in space because, seriously, a cloud of vehicle expectation values distributed over 3-space is not something you can put on a car screen.

But it's not a problem for the car driving itself because the driving decision networks get the full feed and get to watch all those probabilities vary smoothly over time. Those networks have a much more nuanced interpretation of what's coming out of the camera networks than the display code does.

Ideally that display should be a reliable window into the perceptions of the car. It's not there yet.
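To make jimmy_d's point concrete, here is a rough sketch of the difference between a display layer that applies a hard existence threshold to probabilistic detections and a planner that consumes the raw probabilities. All names, structures, and numbers here are hypothetical illustrations, not Tesla's actual code.

```python
# Hypothetical illustration: a display layer thresholds probabilistic
# detections into draw/don't-draw, while a planning layer can weight
# every detection by its probability. Not Tesla's actual code.

DISPLAY_THRESHOLD = 0.6  # hard cutoff: below this, no car is drawn

def display_vehicles(detections):
    """Snapshot of detections -> list of vehicles to draw.
    Detections hovering near the threshold flicker in and out."""
    return [d for d in detections if d["p_vehicle"] >= DISPLAY_THRESHOLD]

def planner_risk(detections):
    """A planner can instead weight each detection by its probability,
    so a 55%-confident car still influences the driving decision."""
    return sum(d["p_vehicle"] * d["closing_speed"] for d in detections)

detections = [
    {"p_vehicle": 0.55, "closing_speed": 3.0},  # dropped from the display...
    {"p_vehicle": 0.95, "closing_speed": 1.0},
]
print(len(display_vehicles(detections)))  # 1: only the confident car is drawn
print(planner_risk(detections))           # ...but both contribute to risk here
```

The hard threshold is why a marginal detection can pop in and out of the on-screen rendering while the car's behavior stays smooth.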
 
I'm sensing a theme that a bunch of the proponents who haven't had as many issues are driving California routes.

The country is more than California, despite what many Californians may believe. This is probably also why rain sensing on AP2+ absolutely sucks.
I have no doubt that EAP and FSD are monumental technical problems, and that being the leader in this field is the next closest thing to a "bet the farm" gamble that Model 3 is (was) for Tesla. Even California has a wealth of driving corner cases (including parts with 100 cm of annual rainfall). I have it on good authority that on the cramped open-plan engineering floor at Deer Creek, Musk occupies a desk, not an office, near the center, surrounded immediately by the AP folks.
Presumably they are the developers and alpha testers (Musk included) and only drive CA roads and mostly freeways. I've encountered testers at SC spots along CA North-South route including one doing exit to exit AP testing 2 years ago. Now, employees, including SpaceX are also AP beta testers. @wk057, have you tried to sign up as one? Seems like you'd be excellent. Maybe many others outside of our little CA weather-bubble should as well.
 
What about this:
Good discussion he's having on why the cars on the display are not displayed accurately. HTH

jimmy_d, Sunday at 6:21 PM
Neural Networks

"The camera networks provide a smooth probabilistic output that is not very easy to display with a human written algorithm."

To be blunt, a bunch of the info in this quote is complete BS, but I unfortunately don't have the time right this second to dive into the details and unequivocally disprove it. Suffice it to say, while mapping the raw data to a visualization is challenging, it's not super difficult. I've already done it with AP1 for my mods, and I've started building my own top-down visualizer for use with AP2. It's basically pre-pre-alpha, and it's already more usable than Tesla's. :rolleyes:
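The core of a top-down visualizer like the one described above is just a coordinate transform from the sensor frame (meters relative to the ego car) to screen pixels. A minimal sketch under assumed conventions (x forward, y left, ego car near the bottom-center of the display); this is purely illustrative, not wk057's actual code.

```python
# Minimal sketch of mapping sensor-frame coordinates (meters: x forward,
# y left of the ego car) onto a top-down display (pixels).
# Scale, screen size, and axis conventions are assumptions for illustration.

PX_PER_M = 4                              # display scale: pixels per meter
SCREEN_W, SCREEN_H = 200, 400             # assumed display dimensions
EGO_PX = (SCREEN_W // 2, SCREEN_H - 40)   # ego car near bottom-center

def to_screen(x_fwd_m, y_left_m):
    """Meters ahead/left of the ego car -> (px, py) pixel position."""
    px = EGO_PX[0] - int(round(y_left_m * PX_PER_M))  # left in world = left on screen
    py = EGO_PX[1] - int(round(x_fwd_m * PX_PER_M))   # forward = up the screen
    return px, py

print(to_screen(0, 0))   # ego car's own position: (100, 360)
print(to_screen(50, 2))  # a car 50 m ahead and 2 m to the left
```

Each detected object's position gets run through a transform like this every frame; the hard part is not the projection but deciding which probabilistic detections to draw at all.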