What microphones for FSDb? The internal ones? Come on!
Come on what? It’s a sensor. I am not even remotely implying that they are using it (though I suppose it could be useful in some edge case; even that seems doubtful).

All I claim is that it is a sensor (and I think I am on pretty solid ground on that one).

Seems like there are plenty of other sensors that would be more important to start with.

Not sure why people are under the impression that their cars have no sensors useful for perception other than cameras. It’s weird.
 
I'm pretty sure Tesla vehicles use sensors other than cameras because:

The car occasionally displays voice commands and sometimes acts on some of them, proving it has some sort of sonic sensor?

In navigation, the route sometimes suddenly changes to a different destination hundreds or thousands of miles away and displays things that are not locally visible. It must be getting the imagery behind the route display (roads, streets, bushes, fields, buildings, etc.) from someplace other than a camera on this car?

When the car alerts me with a message warning me to apply torque to the steering wheel, and then threatens my safety, sanity, and/or authorization to use FSDb if I don't, it will sometimes respond to one of the steering wheel roller buttons being pressed or rotated, and sometimes will respond to torque on the steering wheel and be satisfied for several seconds. So there must be touch sensors?

The car occasionally gives me a warning message that the windshield washer fluid is low and becomes satisfied when I dump in a gallon or so. The Tesla service manual simply describes the relevant device as "the fluid level sensor" and doesn't give any further clues about it, but it does not appear to be a camera?

The car certainly has lots of temperature sensors to maintain the cabin, battery, refrigerant, coolant, etc., temperatures?

The difficulty the car has been having maintaining connectivity since the last software update does hint that the camera it uses for WiFi (unless it has some other sort of sensor, like an antenna) may be in trouble. And carefully cleaning all the known cameras has not helped the connectivity, so I'm pretty sure that uses something besides a camera.

Most humans have 5 senses and many have a "sixth sense" that doesn't seem to be well understood. Personally, I have always assumed the 6th is just the result of the NN in the human brain processing data from all 5 physical senses and subconsciously integrating them and producing answers to unasked questions. We manage to kill and injure a whole lot of people with our driving, even when using all 5 or perhaps 6 of our senses.

I do get that if you are manually programming a driving system, then data with little value, which has the potential to confuse the program rather than help it make good decisions, might be discarded. But if you are building a driving system based on NN processing and decisions, then in my opinion the problems of too much data are easily filtered out during training, and it would make sense to add as many kinds of sensors as can be thought of. They may slow down the training, but they should provide a safer, more human-like driving experience.
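
To put the "let training sort it out" idea in concrete terms, here is a toy sketch of multi-sensor fusion in a learned driving policy. The module names, feature sizes, and PyTorch framing are all made up for illustration; this is not Tesla's architecture:

```python
# Toy multi-sensor fusion policy; sizes and modalities are invented.
import torch
import torch.nn as nn

class MultiSensorPolicy(nn.Module):
    def __init__(self):
        super().__init__()
        # One small encoder per sensor modality.
        self.camera_enc = nn.Linear(512, 128)  # pretend: pooled CNN features
        self.audio_enc = nn.Linear(64, 16)     # pretend: mic spectrogram features
        self.imu_enc = nn.Linear(6, 8)         # accel + gyro
        # Fusion head: training decides how much weight each modality gets.
        self.head = nn.Sequential(
            nn.Linear(128 + 16 + 8, 64),
            nn.ReLU(),
            nn.Linear(64, 2),                  # steering, acceleration
        )

    def forward(self, camera, audio, imu):
        fused = torch.cat(
            [self.camera_enc(camera), self.audio_enc(audio), self.imu_enc(imu)],
            dim=-1,
        )
        return self.head(fused)

policy = MultiSensorPolicy()
controls = policy(torch.randn(1, 512), torch.randn(1, 64), torch.randn(1, 6))
print(controls.shape)  # torch.Size([1, 2])
```

If a modality carries no useful signal, training can simply drive its weights toward zero; the cost is the extra data plumbing and training time, which is exactly the trade-off described above.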
 
Obviously, Tesla needs a router to know where to go. But I know in most AVs, the router is separate from the perception-planner stack. In other words, the perception stack sees the world. The router is separate and informs the planner where the car needs to go. So I imagine it would be similar with V12. The end-to-end takes in camera data and does the perception-planner tasks and the router is a separate module which tells the end-to-end where the car needs to go.
That's similar to what I envision. However, we regularly see "wrong lane" issues - it's in my top 3 things to fix. I assume that's part of what the planner does. If that doesn't improve significantly, then FSD v12 will be perceived as wrong and/or dangerous regardless of how well it may have driven. I do hope that portion improves.
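
For what it's worth, here is a bare-bones sketch of that split, with hypothetical names (none of this is Tesla's actual API): the router is ordinary, non-learned map search, and the learned policy is conditioned on its output. If the policy under-weights that conditioning signal, you get exactly the "wrong lane" behavior described above.

```python
# Hypothetical router / end-to-end split; not Tesla's real design.
from dataclasses import dataclass
from enum import Enum, auto

class Maneuver(Enum):
    KEEP_LANE = auto()
    TURN_LEFT = auto()
    TURN_RIGHT = auto()

@dataclass
class Controls:
    steering: float      # radians, negative = left
    acceleration: float  # m/s^2

def router_next_maneuver(route, position):
    """Separate, non-learned module: nav map + GPS in, next maneuver out."""
    return Maneuver.TURN_LEFT  # placeholder for a graph search over the map

def end_to_end_policy(camera_frames, maneuver):
    """Learned module: "photons in, control out", but conditioned on the
    router's maneuver so it knows where the drive is supposed to go."""
    steering = -0.3 if maneuver == Maneuver.TURN_LEFT else 0.0  # placeholder net
    return Controls(steering=steering, acceleration=0.5)

maneuver = router_next_maneuver(route=None, position=None)
print(end_to_end_policy(camera_frames=[], maneuver=maneuver))
```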
 
So which of these do you suggest Tesla is using for FSD?😂
Unclear which are still used for 12.x, but previously Ashok Elluswamy has given some example inputs to predictions:

[Attached image: 11.x object prediction inputs.png]

I would think navigation is a must for 12.x for it to decide when to make lane changes, turns at intersections, which side of the street to park on, etc.
 
Mine comes with an additional human sensor. It is constantly overriding the other sensors and ruining the FSD excitement. I know this is a dumb a$$ sensor since it can't even perform moderate algebra problems. Clearly it is not capable of making driving decisions and should be removed ASAP. Elon keeps promising to remove this obtuse and unneeded sensor "by the end of the year" but so far has just about removed all the other sensors but this one.
 
The best driver is no driver.
 
Of course, the cars have GPS for localization and nav maps for routing. But Elon said V12 is "photons in, control out". So the end-to-end is trained on camera data only.
I'm pretty sure that is false. Wayve (the only major end-to-end solution that comes to mind) takes in GPS, map data, and vehicle state as inputs too.


Think about it: it has to, because the AI needs to know which direction you plan to go, for example when you come to an intersection. With camera input alone, there is no way for the AI to decide what the proper outputs are.
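
A trivial way to see the point: with identical camera input, an intersection admits more than one correct output, so something has to break the tie. This stub is invented for illustration (it is no vendor's real schema), but the shape of the argument is exactly this:

```python
# Same camera input, two different correct answers: the route command
# is what disambiguates. Hypothetical stand-in for a learned policy.

def policy(camera_features, route_command):
    if route_command == "TURN_LEFT":
        return {"steering": -0.3, "accel": 0.2}
    if route_command == "TURN_RIGHT":
        return {"steering": 0.3, "accel": 0.2}
    return {"steering": 0.0, "accel": 0.5}

same_frames = [0.1, 0.7, 0.2]  # pretend camera features at an intersection
print(policy(same_frames, "TURN_LEFT"))   # steer left
print(policy(same_frames, "TURN_RIGHT"))  # steer right
# Camera input alone cannot choose between these two outputs.
```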
 
Another Whole Mars Catalog drive. This was an interesting one.
  1. It found itself stopped because of a car two cars up. It decided to go around at the same time as the car in front of it.
  2. It went around a vehicle that had stopped as it tried to "cut out" of the Tesla's lane.
  3. It handled traveling with an ambulance with lights flashing without a problem, though perhaps it just doesn't recognize emergency vehicles yet.
  4. It handled a variety of intersection geometries well.
  5. Several poor stop sign interactions that seem to show that the car doesn't want to start into an intersection when cross traffic is present/approaching. It may not recognize that they have a stop sign.
  6. At the end, it slid into a reasonably narrow spot between two cars.
If it wasn't for the stop sign problem, it would feel like a product. In San Francisco, anyway.

 

There's another instance from the following AI Driver video. V12 tried to go around two cars waiting at a stop sign. V12 very quickly came to the conclusion that both cars were double parked and needed to be passed. Upon passing the cars, a long horn sound was heard from a very upset SF driver. So much for being human-like and comforting.

UPDATE: I found the video...

 
Not sure if this is impressive or terrifying or both? :eek: Is 12.x showing signs of creativity in figuring out a left turn to Hayes after crossing Market from 9th to Larkin? ;)

[Attached image: 12.1.2 hayes left.jpg]

Typically the 2 left lanes are used to cross Market and then curve to Hayes, while the right 3 lanes curve to Larkin. People coming from the right on Market can cross Larkin to get to Hayes, and 12.1.2 made use of this link to stay on route. This intersection is quite unusual, but the maneuver might actually have been legal, with no explicit signage preventing this clever(?) left turn. I suppose one could argue the "DO NOT CHANGE LANES" sign on the left side of the screenshot might only apply to the 2 lanes going to Hayes (although Google Street View shows there used to be a matching sign on the right side back in 2016…).
 
Because everyone knows it's literally impossible to hear important sounds like sirens from inside the cabin.
That’s not the issue… The issue is that people don’t love having an active mic in the cabin for privacy reasons, but hey, good luck with that. External directional mics are needed for a useful signal, in a city context at least.
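
On the "directional" part: a textbook technique (not anything Tesla has announced) is to estimate a sound's bearing from the arrival-time difference between two spaced mics. A rough numpy sketch with made-up geometry:

```python
# Toy direction-of-arrival estimate from two mics via cross-correlation.
# Mic spacing, sample rate, and the synthetic signal are all invented.
import numpy as np

FS = 16_000             # sample rate, Hz
MIC_SPACING = 0.5       # metres between the two external mics
SPEED_OF_SOUND = 343.0  # m/s

def bearing(left: np.ndarray, right: np.ndarray) -> float:
    """Bearing in radians from straight ahead; negative = left-mic side."""
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)  # negative: left mic heard it first
    delay = lag / FS
    sin_theta = np.clip(delay * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return float(np.arcsin(sin_theta))

# Synthetic test: broadband noise reaching the left mic 10 samples early.
sig = np.random.default_rng(0).standard_normal(4_000)
left, right = sig[10:], sig[:-10]
print(np.degrees(bearing(left, right)))  # roughly -25 degrees: left-mic side
```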
 
Detecting and interpreting emergency sirens is not required in the United States. Deaf people are permitted to drive cars.

I always find these types of arguments a bit silly. Just because we allow something does not mean it is the best approach. I would argue that non-deaf people probably handle emergency vehicles more reliably than deaf people because they can hear the sirens. And we want AVs to be as reliable as possible. If adding microphones helps AVs detect emergency vehicles sooner, that would be a good thing.
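
To make the "microphones help" claim concrete: a siren is a spectrally easy target, because almost all of its energy sits in a narrow, sweeping band. A toy detector follows; the band edges and threshold are invented for illustration, and real systems are far more careful:

```python
# Toy siren check: does most of the signal's energy sit in the siren band?
import numpy as np

FS = 16_000  # sample rate, Hz

def looks_like_siren(audio: np.ndarray) -> bool:
    spectrum = np.abs(np.fft.rfft(audio * np.hanning(len(audio))))
    freqs = np.fft.rfftfreq(len(audio), d=1 / FS)
    band = (freqs > 500) & (freqs < 1500)  # rough siren sweep range
    return spectrum[band].sum() / (spectrum.sum() + 1e-9) > 0.5

t = np.arange(FS) / FS
sweep_hz = 700 + 300 * np.sin(np.pi * t)              # 700-1000 Hz wail
siren = np.sin(2 * np.pi * np.cumsum(sweep_hz) / FS)
noise = np.random.default_rng(0).standard_normal(FS)
print(looks_like_siren(siren))  # True: energy concentrated in the band
print(looks_like_siren(noise))  # False: energy spread across the spectrum
```

A real detector would also have to reject music, reflections, and Doppler shifts, but the underlying signal really is that distinctive, which is why mics are an attractive addition for emergency-vehicle detection.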