
Tesla, TSLA & the Investment World: the Perpetual Investors' Roundtable

Unless all you care about is highway nag removal?

Missed this part. So I think the "opt-in, constantly expanding FSD area", as part of the Tesla Network, will allow two separate things:
  • "Eyes off the road FSD", with driver present. This is stronger than "no hands on wheel". This is a big consumer convenience feature but it doesn't generate FSD income directly.
  • "No driver present FSD". This is what matters most to "taxi revenue" and "truck hauling revenue" income streams.
The ultimate goal would be "no driver present FSD" in some areas, hence the strict opt-in approach.
 
My position is simple: to realistically enable point-to-point service (the use case where self-driving actually means anything more than removing a nag), it has to work essentially everywhere within broad regions where self-driving is allowed. Exclusions need to be comparably rare exceptions, not the general case.

Unless all you care about is highway nag removal?
Most people do not have to drive by the lip of an active volcano day to day. Most people do not drive along dirt tracks day to day. Most people don’t care about these edge cases frankly. Sometimes when I go on holiday, Uber and Deliveroo are available in that location. Sometimes they’re not. Oh well. Doesn’t stop me giving a big chunk of my paycheque every month to those companies.
 
I agree, but note that while standardization of networking was important due to interoperability requirements between networking products, standardization of autonomy levels is almost 100% pointless, as the products do not and will not have to interoperate in any serious fashion.
This seems incorrect when we are talking about caravans and emergency vehicles. It's also incorrect in the agricultural, narrow road, cars parked on both sides scenarios mentioned in previous posts. Vehicles could talk to each other to determine who pulls over, who passes, how to give an emergency vehicle the clearest path through traffic, etc. I can see a lot of applications for AI vehicle communication. Admittedly, this won't come until a certain saturation happens, but it shouldn't be ignored.
 
It might be 15 times safer than a human in said cities, but 100 times more dangerous than a human in Podunk, Idaho. Is it okay to kill off the population of Podunk?
The question is: What would make it less safe? Pedestrians walking even though the light is red? Angle-parked cars suddenly backing out? People running into the street like deer?

None of this seems very convincing, because these same scenarios happen everywhere. I'm still of the opinion that infrared is needed to spot/track humans and animals so that the car can take action if they suddenly dart out into the street.
 
This seems incorrect when we are talking about caravans and emergency vehicles. It's also incorrect in the agricultural, narrow road, cars parked on both sides scenarios mentioned in previous posts. Vehicles could talk to each other to determine who pulls over, who passes, how to give an emergency vehicle the clearest path through traffic, etc. I can see a lot of applications for AI vehicle communication. Admittedly, this won't come until a certain saturation happens, but it shouldn't be ignored.

Agreed, as I noted:

FSD vehicles won't network with each other to announce their autonomy levels. Any sort of interoperability (such as charging authorization, or future convoying and FSD tunnel access features) is or will be far more specific.

Note how FSD levels won't be exchanged between vehicles, only telemetry and specific capabilities such as convoying protocol supported, braking and acceleration ability, etc.

Just consider three FSD vehicles communicating. Are they going to say:
  • Vehicle A broadcasting: "Hey I'm a Level 5 Tesla!"
  • Vehicle B broadcast reply: "Cool bro, I'm a level 4 Porsche!"
  • Vehicle B to C, private message: "Arrogant showoff ..."
? :D

The numeric FSD levels of 1-5 defined in the 35-page specification are completely arbitrary, way too coarse, and thus mostly meaningless from a technological and networking point of view.
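
To make that concrete, here is a purely hypothetical sketch of the kind of capability/telemetry message vehicles might actually exchange; every field name is invented for illustration and nothing here comes from any real V2V standard. The point is simply that nothing in it is an SAE level.

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical capability/telemetry broadcast between FSD vehicles.
# Every field name is invented for illustration; the idea is that peers
# exchange concrete, actionable capabilities and physical limits,
# not an abstract "Level 4" or "Level 5" label.
@dataclass
class VehicleCapabilityMsg:
    vehicle_id: str
    convoy_protocol: str           # e.g. "none" or "convoy-v1"
    max_brake_decel_mps2: float    # physical braking limit
    max_accel_mps2: float          # physical acceleration limit
    length_m: float                # useful for gap/convoy planning
    speed_mps: float               # current telemetry
    heading_deg: float

msg = VehicleCapabilityMsg(
    vehicle_id="veh-A",
    convoy_protocol="convoy-v1",
    max_brake_decel_mps2=9.0,
    max_accel_mps2=4.5,
    length_m=4.7,
    speed_mps=29.0,
    heading_deg=182.0,
)
print(json.dumps(asdict(msg), indent=2))  # what a peer vehicle would receive
```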
 
The question is: What would make it less safe? Pedestrians walking even though the light is red? Angle-parked cars suddenly backing out? People running into the street like deer?

None of this seems very convincing, because these same scenarios happen everywhere. I'm still of the opinion that infrared is needed to spot/track humans and animals so that the car can take action if they suddenly dart out into the street.
Do humans need infrared to spot these events? If a human can detect it, then a camera-based system can too with enough data input. I don't think we're giving the significance of the NN enough credit.

Dan
 
Do humans need infrared to spot these events? If a human can detect it, then a camera-based system can too with enough data input. I don't think we're giving the significance of the NN enough credit.

Dan

Even one-eyed humans without proper 3D vision are allowed to drive a car. I just can't see a single reason why 8 cameras, 12 ultrasonic sensors and a long-range radar won't do it.
 
A thought on what should REALLY impress investors on April 22nd, even more than a Level 5 demonstration.

A business model that showed how the Tesla Network would be profitable.

This is a hurdle that both Lyft and Uber have NOT YET CONQUERED, AND MAY NEVER, and yet they are both valued in the tens of billions of dollars.

We are all focused on the technology. The killer feature of the Tesla Network may be profitability!
 
Even one-eyed humans without proper 3D vision are allowed to drive a car. I just can't see a single reason why 8 cameras, 12 ultrasonic sensors and a long-range radar won't do it.


The Tesla cameras do not move to follow motion. Some use wide-angle lenses, like a horse's eye when grazing.


Mapping the distortion from multiple wide-angle lenses into Cartesian coordinates takes faster processors.
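
For anyone curious what that remapping looks like in practice, here is a rough sketch using OpenCV's fisheye model. The intrinsics, distortion coefficients and file names below are made-up placeholders; real values come from per-camera calibration.

```python
import cv2
import numpy as np

# Rough sketch: remap one wide-angle (fisheye) frame to an undistorted,
# rectilinear view. K and D are placeholder values; real ones come from
# calibrating each camera.
K = np.array([[400.0,   0.0, 640.0],
              [  0.0, 400.0, 360.0],
              [  0.0,   0.0,   1.0]])
D = np.array([0.05, -0.01, 0.001, 0.0])     # 4 fisheye distortion coefficients

frame = cv2.imread("wide_angle_frame.jpg")  # hypothetical input frame
h, w = frame.shape[:2]

# The remap tables can be precomputed once, but applying them to several
# high-resolution camera streams every frame is part of the processing load.
map1, map2 = cv2.fisheye.initUndistortRectifyMap(
    K, D, np.eye(3), K, (w, h), cv2.CV_16SC2)
undistorted = cv2.remap(frame, map1, map2, interpolation=cv2.INTER_LINEAR)
cv2.imwrite("undistorted_frame.jpg", undistorted)
```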

20 successful navigations of a road without correction is statistically significant.
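
To put a rough number on that (my own back-of-the-envelope, not from the post): with n clean runs and zero interventions, the 95% upper bound on the per-run failure rate is about 3/n, so roughly 14% for n = 20.

```python
# Back-of-the-envelope: if a road is driven n times with zero interventions,
# what per-run failure rate is still consistent with that at 95% confidence?
# Exact bound: solve (1 - p)^n = 0.05 for p; the "rule of three" gives ~3/n.
n = 20
p_upper = 1.0 - 0.05 ** (1.0 / n)
print(f"{n} clean runs -> 95% upper bound on failure rate: {p_upper:.1%}")
# prints: 20 clean runs -> 95% upper bound on failure rate: 13.9%
```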

If Tesla has 1/4 the accidents per mile, is it 1/2 per minute? (Autopilot miles are mostly higher-speed highway miles, so a per-mile advantage shrinks when expressed per minute of driving.)
 
Do humans need infrared to spot these events? If a human can detect it, then a camera-based system can too with enough data input. I don't think we're giving the significance of the NN enough credit.

Dan
Given the number of deer strikes and vehicle-pedestrian accidents, I'd say yes, they need them. The problem is that vision alone doesn't track these conditions until it's too late to avoid them, doubly true at night. Having an infrared system that's aware of the surroundings has the potential to eliminate close to all of these accidents.
 
Given the number of deer strikes and vehicle-pedestrian accidents, I'd say yes, they need them. The problem is that vision alone doesn't track these conditions until it's too late to avoid them, doubly true at night. Having an infrared system that's aware of the surroundings has the potential to eliminate close to all of these accidents.

Aren't Tesla's cameras grey-grey-red-blue? Seem to remember seeing that somewhere. Grey = no filter, so all light (including IR) comes in (barring the limits of the sensor and its optics). Of course, thermal IR (as opposed to NIR) is quite low intensity, and for non-cooled sensors, easy to drown out in its own noise. NIR does allow you to see the world in rather interesting ways, mind you. One could also make use of polarization data, to help distinguish direct light from reflected light, and the properties of the reflecting surface. There's all sorts of data that one could gather. The question is what do you actually need...

Thermal IR would certainly be nice, of course.
 
Aren't Tesla's cameras grey-grey-red-blue? Seem to remember seeing that somewhere. Grey = no filter, so all light (including IR) comes in (barring the limits of the sensor and its optics). Of course, thermal IR (as opposed to NIR) is quite low intensity, and for non-cooled sensors, easy to drown out in its own noise. NIR does allow you to see the world in rather interesting ways, mind you. One could also make use of polarization data, to help distinguish direct light from reflected light, and the properties of the reflecting surface. There's all sorts of data that one could gather. The question is what do you actually need...

Thermal IR would certainly be nice, of course.
As you say, there's no point fitting IR-sensitive cameras if there's no source of IR light to illuminate the world around you. Cameras sensitive to the thermal part of the IR spectrum are coming down in price and size, but I'd be surprised to see them used for a while yet.

It would make more sense to use low-light cameras. There are plenty of cheap, high-resolution sensors available that can make use of very little light to see things the human eye can't.