Meanwhile, Faraday (and I think Lucid) are making their EVs with more sensors than Tesla.
Faraday Future FF 91, Lucid Air First Rides - Motor Trend
Faraday Future FF 91: Its complement of sensors includes lidar, 10 cameras, 13 radars, and 12 ultrasonic sensors.
Lucid Air: I don't know yet.
Tesla Model S: 8 cameras, 1 radar, and 12 ultrasonic sensors.
Looks like the FF 91 has 13 radars, the sensor type Tesla pulls a huge amount of data from, whereas Tesla has only 1. I'd also like to know the camera placement on the FF 91: can it see around the cars to its left and right? If they put cameras in those mirrors that hang out to the left and right, then it likely can. I think I'd be a lot safer in a fully programmed self-driving FF 91 than in an AP HW2 Tesla, simply because it has better sensor input and more knowledge of what is going on. Anyone sitting bitch (middle seat) knows they can't see around the vehicles in front as easily as the people on the left and right (driver and front passenger), and yet that's where Tesla mounted its AP HW2 cameras. If I interpret that right, an FF 91 will see red brake lights sooner than a Tesla with AP HW2. I'd also like to know whether the FF 91's cameras are color aware; we know Tesla's don't see in color, so they probably can't distinguish a headlamp from a taillamp.
I'm really disappointed that supposedly "self-driving" hardware is starting off with fewer tools than the humans already driving the car have. We see color. We see depth. We see around things by sitting at the side of the car, able to look down the side of a column of traffic (especially if we weave out gently for a fraction of a second and weave back, letting others see past us too so they don't hit us).