Discussion in 'Model S' started by Gilzo, Jul 14, 2016.
Elon Musk on Twitter
Guess this is the big thing in 8.0.
Bit of background
Lidar - Wikipedia, the free encyclopedia
Interesting possibilities for AP.
Let's hope it's a successful R&D which ends up giving drivers a much clearer representation of just how far/what the system can "see".
I think the problem with AP is anthropomorphism; we tend to infer that it has a field of view and visual acuity that is similar to our own.
I am sure that drivers would benefit from a clearer visualisation/representation of the system's true visual field/range/resolution.
In the meantime, just remember the Autopilot song:
"All together now, 1, 2, 3,
Keep your mind on your driving,
Keep your hands on the wheel,
And keep your shifty eyes on the road ahead..."
Says no changes to hardware. Interesting take.
I thought that was interesting myself. I wonder whether there will be no changes to future hardware either. I certainly hope that when I get my M3 it will be on even newer hardware. (I'm sure it will be.)
No... or partly.
He said that he can improve the current radar to an extent, while new hardware will get even better results.
It seems he really doesn't like lidar, and he's committed to proving his point.
Model 3 will get watered down S and X hardware (if they want to make $$$). Also, there will be a premium for such features (e.g. fee to access superchargers)
So,... what is a coarse point cloud?
It means the radar returns a series of points in space (a cloud of points).
It's low resolution (coarse), so you might not see how many fingers someone's holding up, but you can see there's a person with arms and legs (for example).
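A toy sketch of what "coarse" means here (purely illustrative, nothing to do with Tesla's actual processing): a point cloud is just a set of 3D points, and a coarse one has low spatial resolution, so fine structure disappears while the gross shape survives.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fine detail: 1000 points sampled inside a person-sized box (0.5 x 0.3 x 1.8 m)
fine = rng.uniform([0.0, 0.0, 0.0], [0.5, 0.3, 1.8], size=(1000, 3))

# Coarse sensor: snap every point to a 0.5 m grid and drop duplicates,
# mimicking low angular/spatial resolution
coarse = np.unique(np.round(fine / 0.5) * 0.5, axis=0)

print(len(fine), "raw points ->", len(coarse), "coarse cells")
# The overall extent is still there ("something person-sized"), but detail
# at the scale of fingers is gone.
```

The coarse cloud collapses a thousand points into a handful of half-meter cells, which is the "can't count fingers, can see arms and legs" trade-off described above.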
Follow up tweet from Elon:
Good thing about radar is that, unlike lidar (which is visible wavelength), it can see through rain, snow, fog and dust.
Elon Musk on Twitter
SpaceX uses LIDAR, so he does not hate it. He said it's overkill and way too expensive for cars.
I never said hate... I said he doesn't like it. You're right, I didn't specify "in automotive."
And I didn't say it aloud, but I'm partly on his side, although I would like more redundancy.
But I'm a software guy, and since this could be a safety issue, I'd like as much redundancy as I can get. Then again, he needs to make the car at an accessible price and, of course, aesthetically appealing, and while lidars can be small, they are a little harder to hide while maintaining their usefulness.
That's a very good point people should keep in mind. People think, if they can see something, the car should too, but even most animals have a very different view of the world than we do. Cats for example have a different mix of receptors in their eyes, so they see things differently than we do. They do see color, but much more washed out than we do, and not only do they have eyes tuned for better night vision, but their vision is also optimized for seeing moving things. We see stationary objects very well, but stationary objects for cats are kind of blurry, but moving objects are sharp and clear. They sometimes move their head to see things better.
The sensors on the car see the world differently than we do. Much of the time the computer can come to the same conclusion a human would, and do it faster, but other times the system just doesn't see some types of objects, or misinterprets them. Pulling into the garage, the car sometimes thinks my tool chest is another car, which is harmless in that circumstance, but the car didn't come to the same conclusion about the object that I would.
It is interesting that it looks like Tesla has figured out how to get more information about the world using the existing sensor suite. It may be a while before it is in our cars though. I doubt it will be in version 8.0 of the software, but who knows, they may have been working on it for some time.
I'm just wondering if this is part of the Master Plan v2.0.
Full autonomy in X years is nothing new, and it will be a major blow to expectations (and the stock) if the plan only includes that.
Is it just me, or did this week go a lot slower since Elon announced the Master Plan reveal for the end of the week?
Because Tesla's radar is not phased array radar, I'm really curious how this would work.
By using the motion of the vehicle, they time sample and smooth the point cloud data to approximate what a spinning LIDAR system does. Wonder what the processing overhead on that technique is? Has to be high.
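A hypothetical sketch of that technique (names and numbers are mine, not Tesla's): each radar cycle yields a sparse set of points in the car's frame at that instant; using known ego-motion (speed from odometry), you shift older frames into the current frame and merge them, so detections of a real stationary object stack up in one place.

```python
import numpy as np

def accumulate(frames, speed_mps, dt):
    """Merge radar frames into the current car frame.

    frames: list of (N_i, 2) arrays of (x_forward, y_lateral) points in
            meters, oldest first, one per radar cycle of length dt seconds.
    speed_mps: assumed-constant forward speed of the car.
    """
    merged = []
    n = len(frames)
    for i, pts in enumerate(frames):
        age = (n - 1 - i) * dt       # seconds since this frame was captured
        shift = speed_mps * age      # car has moved forward this far since then
        moved = pts.copy()
        moved[:, 0] -= shift         # so the object is now closer by `shift`
        merged.append(moved)
    return np.vstack(merged)

# Example: a stationary object 50 m ahead, seen over 3 cycles at 20 m/s
frames = [np.array([[50.0 + 20.0 * 0.1 * k, 2.0]]) for k in (2, 1, 0)]
cloud = accumulate(frames, speed_mps=20.0, dt=0.1)
print(cloud)  # all three detections collapse onto (50, 2): consistent = real
```

The processing cost comes from doing this transform-and-associate step for every return on every cycle, plus whatever smoothing/outlier rejection sits on top, which is presumably why the overhead would be high.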
To my understanding, Tesla's radar has no angular resolution. So I don't understand how it could distinguish between a truck stopped before a T-junction and one right at the junction.
Temporal sampling tells you that there is something new somewhere near the junction, but not exactly where it is.
On second thought, if this new radar return is straight ahead, the distance to it shrinks faster than if it is at some angle relative to the car. But is this difference clear enough to trigger AEB?
I'll offer my wild guess about this. I also assume that Tesla's radar hardware gives no angular information at all, just distance at a point in time. So as you say, the problem is distinguishing between a truck stopped ahead of you and a truck stopped off to one side.
I think you can distinguish between the two if you make some assumptions. If you are going v = 100 kph (just to use round numbers) and the truck is straight ahead of you, the echo will be closing at a steady 100 kph.

However, say the truck is stopped x = 100 meters ahead of you and a = 10 meters off to the side. Its line-of-sight distance is r = Sqrt(x^2 + a^2) = Sqrt(100^2 + 10^2) = 100.499 meters, and its closing rate is v * d/dx(r) = v * x / Sqrt(x^2 + a^2) = 99.5 kph. Assuming the truck really is stopped at a fixed distance a to the side, the closing rate would diminish with time as the angle between your car's direction of travel and the truck increased, until you finally pass the truck, where the closing rate would be zero.

The change in the closing rate would follow a particular smooth curve if the truck really were stopped at a fixed distance to the side. Tracking the change in closing velocity (calculated from distance) and fitting that curve would indicate that the radar echo was not an obstruction ahead, while a steady closing speed would indicate it was an obstruction. Of course, if the truck were slowly moving in just the right way, it would throw off the conclusion.
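Putting numbers on that geometry (my own illustrative check, not anyone's actual algorithm): with range r = sqrt(x^2 + a^2), the closing rate is dr/dt = v * x / r, which stays pegged at v for an in-path object (a = 0) but decays toward zero for an offset object as the car closes in.

```python
import math

def closing_rate(v_kph, x_m, a_m):
    """Rate of change of line-of-sight range, dr/dt, in kph.

    v_kph: car speed; x_m: forward distance to a stopped object;
    a_m: its lateral offset from the car's path.
    """
    r = math.hypot(x_m, a_m)          # r = sqrt(x^2 + a^2)
    return v_kph * x_m / r            # dr/dt = v * x / r

print(closing_rate(100, 100, 0))      # in path: exactly 100.0 kph, steady
print(closing_rate(100, 100, 10))     # 10 m offset, 100 m out: ~99.5 kph
print(closing_rate(100, 10, 10))      # same offset, 10 m out: ~70.7 kph
```

At 100 m out the two cases differ by only half a percent, which is the catch: the curves only separate clearly as the range drops, so the classification has to be made fairly late.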
EDIT - I see that you came to the same conclusion and posted while I was writing my post.