My AP2 radar has gone through one winter without an issue. Worked during a 16" snowfall (and my car drove like a dream with my Michelin X-Ices). Really no complaints from Chicagoland (and we get all kinds of weather except hurricanes).
There have been a lot of posts here about people's radar failing.
It's amazing how every time I come to this forum it reminds me of what Uber engineer Levandowski said.
"We've got to start calling Elon on his bs... lets start giving physics lessons about stupid sh*t Elon says"
I'm not mad at the Tesla fanbase anymore, because I know where they get it from.
Nine months ago (Dec 1), I predicted that no EAP feature would show up until Aug/Sep. I was ridiculed, laughed to scorn, and cursed at.
Now everything I said is a reality. When I said Tesla was only using one active camera initially, I was again ridiculed until that was proven, and then they added another camera.
When I said Tesla hadn't started any data collection, that the whole "2 billion" fleet-learning claim was a sham, that Mobileye didn't allow recording of video data, and moreover that Tesla had no capability of recording the Mobileye camera's data, I was ridiculed and laughed at. But now it has been proven, as Tesla began their image collection.
and many more...
Now I'm yet again telling you how useless Tesla's radar is, and that will become evident in due time, although it's already apparent to a novice.
But I know people would rather believe whatever garbage Elon pushes than objective analysis.
Radially moving pedestrians in the line of sight can be recognized in up to 95.3% of samples. The recognition rate drops to 39.5% for laterally moving pedestrians, and to 29.4% for laterally moving pedestrians occluded in a gap between two parked cars.
The radar sensor’s main limitations in this use case turned out to be insufficient Doppler shift measurement resolution and antenna side lobe effects. Possibly, a finer spatial resolution would also improve the sensor’s performance in recognizing occluded pedestrians. Our findings indicate that pedestrian recognition using this low-level approach is limited. Only radar data with higher Doppler frequency measurement resolution (independent of the used technology, e.g. 24 GHz, 77 GHz) would make major recognition improvements possible.
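The abstract above pins the limitation on Doppler resolution. A quick back-of-the-envelope sketch (generic textbook formulas, not the paper's actual sensor parameters) shows why lateral movers are hard: radar only measures the radial component of a target's velocity, which collapses toward zero as the pedestrian's path approaches 90° to the beam, so the Doppler shift drops into the region a coarse-resolution sensor cannot resolve.

```python
# Back-of-the-envelope Doppler numbers for an automotive radar.
# Illustrates why lateral pedestrians are hard: radar measures only the
# RADIAL velocity component, and the Doppler shift of a slow or lateral
# mover is tiny. All figures below are generic textbook values, not
# Tesla/Bosch specifications.
import math

C = 3.0e8  # speed of light, m/s

def doppler_shift_hz(radial_velocity_ms, carrier_hz):
    """Two-way Doppler shift: f_d = 2 * v_radial / wavelength."""
    wavelength = C / carrier_hz
    return 2.0 * radial_velocity_ms / wavelength

def radial_component(speed_ms, angle_deg):
    """Radial part of a pedestrian's velocity at the given aspect angle
    (0 deg = walking straight at the radar, 90 deg = purely lateral)."""
    return speed_ms * math.cos(math.radians(angle_deg))

carrier = 77e9  # 77 GHz automotive band
walk = 1.5     # typical walking speed, m/s

for angle in (0, 60, 85):
    v_r = radial_component(walk, angle)
    print(f"aspect {angle:2d} deg: v_radial = {v_r:5.2f} m/s, "
          f"Doppler = {doppler_shift_hz(v_r, carrier):7.1f} Hz")
```

At 0° the shift is 770 Hz, but near-lateral motion leaves only a few tens of hertz, which is exactly where insufficient Doppler measurement resolution bites.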
For what it's worth, the scenario where I most commonly see AP2 TACC NOT detect stationary objects is when I exit the freeway at speed and then pull up to a set of traffic lights, i.e. approaching a stationary vehicle from 100 km/h (60 mph).
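For a sense of scale, here is the stopping arithmetic for that scenario, using the standard d = v²/(2a) formula. The deceleration figures are assumptions (~8 m/s² for hard braking on dry asphalt, ~3 m/s² for a comfortable stop), not measured AP2 behavior:

```python
# Rough stopping-distance arithmetic for closing on a stopped car at
# 100 km/h. Deceleration values are generic assumptions, not measured
# AP2 behavior: ~8 m/s^2 hard braking, ~3 m/s^2 comfortable.
def stopping_distance_m(speed_kmh, decel_ms2, reaction_s=0.0):
    v = speed_kmh / 3.6                      # convert km/h to m/s
    return v * reaction_s + v * v / (2.0 * decel_ms2)

v = 100  # km/h
print(f"hard braking:        {stopping_distance_m(v, 8.0):6.1f} m")
print(f"comfortable braking: {stopping_distance_m(v, 3.0):6.1f} m")
print(f"hard + 1 s reaction: {stopping_distance_m(v, 8.0, 1.0):6.1f} m")
```

Even hard braking needs roughly 48 m from 100 km/h, and each second of detection delay adds another ~28 m, which is why late recognition of a stationary vehicle is so unforgiving.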
Don't give me that BS. Yes, a picture is a collection of numbers; so is lidar data, and radar data. Attach a computer to a camera and you can tell the difference between all sorts of things. To say otherwise is just a lie, and you contradict yourself in previous posts.

Not true. A camera can't see an object; it can't tell the difference between a pedestrian, an animal, a rock, whatever. A picture is a collection of RGB pixels (aka numbers). So no, a camera is nowhere near the best sensor for confirming what an object is.
Radar doesn't need the same corrections to measure distance through fog, snow, rain, etc. Lidar has to throw away data from laser reflections on nearby objects, whereas radar can go through water.

Everything is based on corrective algorithms. EVERYTHING.
Ice or dust layers that are too thick will block radar, lidar, and cameras for that matter. All autonomous vehicles are affected. If those same things block human vision, then humans are affected too. That has nothing to do with Elon or Tesla's engineering.

AP1 radar fails all the time when it gets dirty or when snow/ice builds up on it.
Stop praising Elon's and Tesla's terrible engineering job.
No one said radar is the primary sensor; all current autonomous vehicles use cameras. Waymo's implementation uses eight cameras and 360-degree radar, in addition to the lidar it likes to show in presentations.

Again incorrect: lidar works in heavy rain and snow.
You keep mentioning radar. How is it that you don't understand that radar is useless?
Radar is NOT a primary sensor. How can you not understand that?
Seriously, that baffles me.
You realize that what you just posted says very clearly that radar can detect an object that could be a deer, thus proving @stopcrazypp correct, even though you claim it's incorrect. Read the picture you posted.

There's a reason radars are configured to only pick up moving objects: they can't differentiate a static object from the environment.
I think Tesla moved the radar between AP1 and AP2 to make it harder to gunk up, but yes, I think actual ice storms or very wet snow might create more of an issue than run-of-the-mill snow. Powder, especially, wouldn't stick. That being said, I found the AP2 radar to be surprisingly robust. I've only had the car from January until now, so it wasn't a full winter, but it was long enough to get a good feel (though AP2 was pretty much garbage at the time; it might protest more now that it's better).
Needs a heating element for sure.
Not sure how that would solve anything. My exposed AP1 Bosch radar does a great job of turning its heater on when the ambient temperature drops below 4 or 5 °C (not sure which; I did some testing on this last year). And the radar gets hot!
The NN model hasn't changed since June (i.e. since that 17.26.xx release), yet there have been driving-behavior changes: the silky-smooth ride, the showing/not showing of the second car, and so on.
There are other signs that basically tell us the NN is just for image recognition, while the actual driving logic lives outside the NN, in regular CPU code.
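If that reading is right, the architecture would look roughly like the following toy sketch: a vision NN emits object detections, and plain procedural code turns them into speed commands. All names and structure here are illustrative guesses, not anything from Tesla's actual firmware.

```python
# A toy sketch of the architecture the post describes: a neural net does
# image recognition only, and ordinary procedural code turns its output
# into driving commands. Names and structure are illustrative guesses,
# not Tesla's actual software.
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str         # e.g. "car", "lane_line"
    distance_m: float

def fake_vision_nn(frame):
    """Stand-in for the NN: classifies objects in a camera frame."""
    return [Detection("car", 35.0)]  # pretend we saw a car 35 m ahead

def follow_logic(detections, ego_speed_ms, set_speed_ms, min_gap_m=25.0):
    """Plain CPU-side control code: pick a target speed from detections."""
    lead = min((d for d in detections if d.kind == "car"),
               key=lambda d: d.distance_m, default=None)
    if lead is not None and lead.distance_m < min_gap_m:
        return max(0.0, ego_speed_ms - 2.0)   # too close: back off
    return set_speed_ms                        # otherwise cruise

dets = fake_vision_nn(frame=None)
print(follow_logic(dets, ego_speed_ms=27.0, set_speed_ms=30.0))
```

On this split, retraining the NN changes *what* the car recognizes, while tweaks to `follow_logic`-style code change *how* it drives, which would explain behavior changes without a new NN model.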
@verygreen - Have you looked into the latest 17.34.xx release files as well?
I've been thinking about this. We have received several updates, at least monthly, but the NN model is 2-3 months old now...
Did Andrej Karpathy decide when he came on board in June it needed to be done properly and start again from scratch? Or is it finished for EAP (I doubt it)? Or something else?
Anyone else care to speculate? ;-)
NN in 17.34 is the same as before.
A new binary appeared named "clip_logger", I guess I really need to install it to try it out now.
Also, somewhat unexpectedly, the changelog is completely empty for everybody, which I don't think I've seen before; usually there were conditional changes listed.
You are looking at a very simplistic marketing picture. Radar can be used to detect stationary objects. It's just that many systems filter them out, because there are typically many stationary objects in a scene and the application is only interested in moving ones (for example, a police radar is only concerned with speeding vehicles, and most ACC systems are only concerned with following a moving car).

The FOV of the radar still misses over 95% of all driving tasks.
This is MAJORLY incorrect.
There's a reason radars are configured to only pick up moving objects: they can't differentiate a static object from the environment.
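Both sides of this argument can be made concrete in a few lines. The radar's return list typically does contain stationary objects; a common ACC-style configuration then discards anything whose ground-relative radial speed is near zero, because signs, guardrails, and bridges would otherwise flood the tracker. The filter below is a hypothetical minimal sketch with invented field names and thresholds, not any production radar's logic:

```python
# Minimal sketch of the moving-target filtering being argued about:
# the radar DOES return stationary objects, but an ACC-style config
# discards returns whose ground-relative radial speed is near zero.
# Field names and thresholds are invented for illustration.
def moving_targets(returns, ego_speed_ms, min_speed_ms=1.0):
    """Keep only returns that move relative to the ground.

    Each return is (range_m, radial_speed_ms), with radial speed measured
    relative to the (moving) radar; adding ego speed approximates the
    target's ground-relative radial speed for head-on geometry.
    """
    kept = []
    for rng, radial in returns:
        ground_speed = radial + ego_speed_ms
        if abs(ground_speed) >= min_speed_ms:
            kept.append((rng, radial))
    return kept

ego = 27.8  # m/s, ~100 km/h
scene = [
    (60.0, -27.8),  # overhead sign: closes at exactly ego speed -> stationary
    (40.0, -22.8),  # lead car driving ~5 m/s slower than us
    (15.0, -27.8),  # STOPPED car ahead: filtered out along with the sign!
]
print(moving_targets(scene, ego))  # only the moving lead car survives
```

The point the sketch makes: the dropped stopped car is a configuration choice made to keep false positives down, not a physical inability of the radar to return stationary echoes, though distinguishing a stopped car from a bridge without better angular resolution is the genuinely hard part.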