Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Rumor: HW4 can support up to 13 cameras

I'm enjoying this thread. I'm curious as to the cost difference to switch from the current wiring to plastic optical fiber, both on cable and all connectors. Also curious about any latency difference, especially in getting raw pixels into HWx.

Also, are there multiple form factors for HW3? I just assumed that the liquid cooling for the Ryzen in the Plaid/3/Y included cooling for the HW3. It will be interesting to see whether the first HW4 boards can even be retrofitted into HW3 liquid-cooled setups.
 
Resolution may help, but more critical, IMO, are cameras in the front bumper facing left and right, to provide more visibility and quicker decision-making and commitment to the turn.
I think they went with the B pillar because damage to bumpers is so common. Putting cameras there would make repairs, and thus insurance, more expensive.

How about A pillar ?
 
I'm enjoying this thread. I'm curious as to the cost difference to switch from the current wiring to plastic optical fiber, both on cable and all connectors. Also curious about any latency difference, especially in getting raw pixels into HWx.

Also, are there multiple form factors for HW3? I just assumed that the liquid cooling for the Ryzen in the Plaid/3/Y included cooling for the HW3. It will be interesting to see whether the first HW4 boards can even be retrofitted into HW3 liquid-cooled setups.
Fiber optics are useful for carrying signals over longer distances, but I doubt it is worth pursuing over short distances like those inside a car. Not only are the cables more expensive (you can easily google that they cost multiple times more), the controllers are more expensive too. Coaxial cables work fine in this application.

For either one, the distances are so short that the latency from the cable itself is essentially nonexistent; the controller latencies and any video processing will contribute a lot more than the cable.
For optical fiber, it's 4.9 microseconds per km:
Calculating Optical Fiber Latency
For an electrical signal through a coaxial cable, it's 8 inches per nanosecond, which translates to the same 4.9 microseconds per km (with the caveat that a coax cable can't get a signal through a one-km run without repeaters of some sort, which would add latency).
76.18 -- Speed of signal through coaxial cable
The signal in both travels at about 2/3 the speed of light in a vacuum.
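A quick sanity check of those figures in Python, using the 2/3-of-c velocity factor cited above (the 3 m harness length is just an illustrative guess):

```python
# Propagation delay for a signal traveling at ~2/3 the speed of light,
# which is typical for both coax and optical fiber.

C_VACUUM = 299_792_458   # speed of light in a vacuum, m/s
VELOCITY_FACTOR = 2 / 3  # rough figure for both media, per above

def propagation_delay_us(length_m: float) -> float:
    """Microseconds for a signal to traverse length_m of cable."""
    return length_m / (C_VACUUM * VELOCITY_FACTOR) * 1e6

print(f"{propagation_delay_us(1000):.1f} us per km")      # ~5 us, matching the ~4.9 us figure
print(f"{propagation_delay_us(3):.4f} us for a 3 m run")  # vanishingly small in a car
```

At car-harness lengths the delay is a few hundredths of a microsecond, which is why controllers and processing dominate, not the cable.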
 
I'm enjoying this thread. I'm curious as to the cost difference to switch from the current wiring to plastic optical fiber, both on cable and all connectors. Also curious about any latency difference, especially in getting raw pixels into HWx.
Theoretically, it depends on whether optical media is the camera module's native interface or not.

If you have a camera module that's designed specifically to transmit over fiber, it shouldn't add any meaningful latency beyond what copper would. If anything, I'd expect optical cables to have ever-so-slightly lower latency, because UHD over copper requires splitting the data across multiple twisted pairs and then recombining it on the receiving end, whereas optical fiber should be able to easily carry the data on a single fiber.

If the camera module is designed to transmit over multiple copper pairs, the extra conversion step will add some latency.

I have no idea what sorts of modules exist out there, though.
 
Also, are there multiple form factors for HW3? I just assumed that the liquid cooling for the Ryzen in the Plaid/3/Y included cooling for the HW3. It will be interesting to see whether the first HW4 boards can even be retrofitted into HW3 liquid-cooled setups.
Probably. In general, a new hardware generation is usually a process shrink, which gets you more compute performance in the same power envelope.
 
Fiber optics are useful for carrying signals over longer distances, but I doubt it is worth pursuing in short distances like a car. Not only are the cables more expensive (you can easily google that they cost multiple times more), the controllers are more expensive too. Coaxial cables work fine in this application.
Are you sure? AFAIK, all UHD/4K video transmission over coax is compressed (even SDI). Compression typically introduces both unacceptable latency and quality loss (artifacts) that could interfere with proper analysis of the images, not to mention making the car's driving measurably less safe by delaying its response to what's happening in the outside world.

For automotive purposes, you'll need at least 10 bits per color channel, with three color channels. I'm also assuming that they will move to 60 fps, both because most cameras run at that rate these days and because higher frame rates can at least potentially reduce pipeline latency, which is a highly desirable improvement over what we have now.

At 3840 x 2160 resolution, then, those specifications would require a whopping 15 gigabits per second (10-bit * 3840h * 2160v * 60fps * 3 color channels) uncompressed. I'd be really shocked if it were possible to do that without compression over coaxial cables. I think the highest ever successfully achieved in the real world is only about 10 Gbps, although some of the DOCSIS encodings can, at least in theory, send 15 Gbps over coax under ideal conditions. Either way, that's seriously pushing the upper limits of coax, given current technology.

But even if you manage to find a way to push 15 gigabits per second over coax, you'd still be replacing one grossly inadequate technology (a single twisted pair) with another technology that is just barely good enough for the immediate need, and definitely not good enough for any future improvements. So if they later decided that they need 8k or 120 fps or 14 bits per color channel, they would have to replace all of the cabling again. Cutting corners in something that's hard to replace is not the best strategy. Using twisted pairs was a mistake to begin with. Compounding that mistake by replacing it with a second technology that once again leaves you no room for future expansion would be utterly embarrassing.

By contrast, the record for a single fiber is about 80 terabits per second per fiber. All three 4K UHD cameras in the center could easily be multiplexed on a single fiber with room to spare for 5,330 more cameras. And that, right there, is how you future-proof your cabling.
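The 15 Gbps figure above is straightforward to check; here's a minimal Python sketch (the 4K/60/10-bit specs are this post's assumptions, not anything confirmed for HW4):

```python
# Uncompressed video bandwidth = width x height x fps x bits/channel x channels.

def video_gbps(width: int, height: int, fps: int,
               bits_per_channel: int, channels: int = 3) -> float:
    """Raw bandwidth in Gbps for an uncompressed video stream."""
    return width * height * fps * bits_per_channel * channels / 1e9

# The hypothetical 10-bit 4K60 RGB stream discussed above:
print(f"{video_gbps(3840, 2160, 60, 10):.1f} Gbps")  # ~14.9, i.e. the ~15 Gbps figure
```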
 
Are you sure? AFAIK, all UHD/4K video transmission over coax is compressed (even SDI). Compression typically introduces both unacceptable latency and quality loss (artifacts) that could potentially interfere with proper analysis of the images, not to mention making its driving measurably less safe by delaying its response to what's happening in the outside world.

For automotive purposes, you'll need at least 10 bits per color channel, with three color channels. I'm also assuming that they will move to 60 fps, both because most cameras run at that rate these days and because higher frame rates can at least potentially reduce pipeline latency, which is a highly desirable improvement over what we have now.

At 3840 x 2160 resolution, then, those specifications would require a whopping 15 gigabits per second (10-bit * 3840h * 2160v * 60fps * 3 color channels) uncompressed. I'd be really shocked if it were possible to do that without compression over coaxial cables. I think the highest ever successfully achieved in the real world is only about 10 Gbps, although at least in theory, some of the DOCSIS encodings can at least theoretically send 15 Gbps over coax under ideal conditions. Either way, that's seriously pushing the upper limits of coax, given current technology.

But even if you manage to find a way to push 15 gigabits per second over coax, you'd still be replacing one grossly inadequate technology (a single twisted pair) with another technology that is just barely good enough for the immediate need, and definitely not good enough for any future improvements. So if they later decided that they need 8k or 120 fps or 14 bits per color channel, they would have to replace all of the cabling again. Cutting corners in something that's hard to replace is not the best strategy. Using twisted pairs was a mistake to begin with. Compounding that mistake by replacing it with a second technology that once again leaves you no room for future expansion would be utterly embarrassing.

By contrast, the record for a single fiber is about 80 terabits per second per fiber. All three 4K UHD cameras in the center could easily be multiplexed on a single fiber with room to spare for 5,330 more cameras. And that, right there, is how you future-proof your cabling.
The latest HDMI is 12 Gb/s per channel over twisted pair. I suspect that fiber optics would be more expensive than just running as many wires as you could conceivably need.
 
Considering Chuck Cook's YouTube videos of the Tesla Model 3 repeatedly failing at unprotected left turns over 18 months, the increase in cameras doesn't surprise me.
Although I have been convinced for a while that the present camera set will never consistently handle Chuck's specific ULT (unprotected left turn), it does do ULTs. Chuck's ULT is an extreme case. The car has to make a left turn onto a high-speed-limit multilane highway. The median is not particularly wide. The intersection is partially obscured by vegetation. There are no nearby upstream or downstream traffic controls.

It's become an interesting academic exercise, but I am not sure that constantly testing new versions of FSD Beta on this specific ULT is telling me anything other than that additional cameras are needed to safely make this turn.
 
Are you sure? AFAIK, all UHD/4K video transmission over coax is compressed (even SDI). Compression typically introduces both unacceptable latency and quality loss (artifacts) that could potentially interfere with proper analysis of the images, not to mention making its driving measurably less safe by delaying its response to what's happening in the outside world.

For automotive purposes, you'll need at least 10 bits per color channel, with three color channels. I'm also assuming that they will move to 60 fps, both because most cameras run at that rate these days and because higher frame rates can at least potentially reduce pipeline latency, which is a highly desirable improvement over what we have now.

At 3840 x 2160 resolution, then, those specifications would require a whopping 15 gigabits per second (10-bit * 3840h * 2160v * 60fps * 3 color channels) uncompressed. I'd be really shocked if it were possible to do that without compression over coaxial cables. I think the highest ever successfully achieved in the real world is only about 10 Gbps, although at least in theory, some of the DOCSIS encodings can at least theoretically send 15 Gbps over coax under ideal conditions. Either way, that's seriously pushing the upper limits of coax, given current technology.

But even if you manage to find a way to push 15 gigabits per second over coax, you'd still be replacing one grossly inadequate technology (a single twisted pair) with another technology that is just barely good enough for the immediate need, and definitely not good enough for any future improvements. So if they later decided that they need 8k or 120 fps or 14 bits per color channel, they would have to replace all of the cabling again. Cutting corners in something that's hard to replace is not the best strategy. Using twisted pairs was a mistake to begin with. Compounding that mistake by replacing it with a second technology that once again leaves you no room for future expansion would be utterly embarrassing.

By contrast, the record for a single fiber is about 80 terabits per second per fiber. All three 4K UHD cameras in the center could easily be multiplexed on a single fiber with room to spare for 5,330 more cameras. And that, right there, is how you future-proof your cabling.

I think you are significantly overstating the amount of bandwidth required for autonomous driving purposes (or at least plausible given current computing limitations).

10 bit color channels? The current cameras still use RCCB filters to purposely limit color depth. I find this leap to be highly unlikely.

I also think jumping from 720p (1.2mp) to 4K sensors (8.3mp) is highly unlikely. I’ve read some stuff that indicates ~5mp sensors might be likely (link below), and I think that’s a reasonable guess.

A more reasonable bandwidth estimate:

24 bits (8-bit x 3 channels) x 2896h x 1896v x 60fps = ~8 Gbps, so even a full uncompressed stream would comfortably transmit over copper 10G Ethernet.

Fiber is expensive and fragile. I can see an automotive application doing everything they possibly can to avoid using it.
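For what it's worth, the ~8 Gbps estimate above checks out (the 2896x1896 sensor size is this post's guess, not a confirmed spec):

```python
# 8 bits x 3 channels at a hypothetical ~5.5 MP resolution, 60 fps, uncompressed.
bits_per_sec = 8 * 3 * 2896 * 1896 * 60
print(f"{bits_per_sec / 1e9:.1f} Gbps")  # ~7.9, within 10GBASE-T's 10 Gbps
```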


 
Are you sure? AFAIK, all UHD/4K video transmission over coax is compressed (even SDI). Compression typically introduces both unacceptable latency and quality loss (artifacts) that could potentially interfere with proper analysis of the images, not to mention making its driving measurably less safe by delaying its response to what's happening in the outside world.

For automotive purposes, you'll need at least 10 bits per color channel, with three color channels. I'm also assuming that they will move to 60 fps, both because most cameras run at that rate these days and because higher frame rates can at least potentially reduce pipeline latency, which is a highly desirable improvement over what we have now.

At 3840 x 2160 resolution, then, those specifications would require a whopping 15 gigabits per second (10-bit * 3840h * 2160v * 60fps * 3 color channels) uncompressed. I'd be really shocked if it were possible to do that without compression over coaxial cables. I think the highest ever successfully achieved in the real world is only about 10 Gbps, although at least in theory, some of the DOCSIS encodings can at least theoretically send 15 Gbps over coax under ideal conditions. Either way, that's seriously pushing the upper limits of coax, given current technology.
I should note this is not how raw video comes out of the image sensor. A 4K image sensor only has 3840h * 2160v photosites, not 3 channels' worth. The 3 channels of color are generated after demosaicing (which can be done by the image signal processor in the computer; in HW3 there is an ISP on the HW3 chip that does it). That means 4K60p at 10-bit only requires about 5 Gbps over the coax cable.
Tesla uses an RCCC or RCCB filter (not Bayer), but these give an idea of how it is done:
Bayer filter - Wikipedia
Color filter array - Wikipedia
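The bandwidth difference the color filter array makes is easy to see numerically; a small sketch (resolution and frame rate are just the figures debated above):

```python
# A sensor behind a Bayer/RCCB-style color filter array reads out ONE
# value per photosite; the 3 color channels only exist after demosaicing.
# Raw readout bandwidth is therefore 1/3 of the full-RGB estimate.

def raw_sensor_gbps(width: int, height: int, fps: int, bits: int) -> float:
    """Gbps for raw (pre-demosaic) sensor readout, one sample per photosite."""
    return width * height * fps * bits / 1e9

print(f"{raw_sensor_gbps(3840, 2160, 60, 10):.1f} Gbps")  # ~5.0, vs ~14.9 for demosaiced RGB
```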
But even if you manage to find a way to push 15 gigabits per second over coax, you'd still be replacing one grossly inadequate technology (a single twisted pair) with another technology that is just barely good enough for the immediate need, and definitely not good enough for any future improvements. So if they later decided that they need 8k or 120 fps or 14 bits per color channel, they would have to replace all of the cabling again. Cutting corners in something that's hard to replace is not the best strategy. Using twisted pairs was a mistake to begin with. Compounding that mistake by replacing it with a second technology that once again leaves you no room for future expansion would be utterly embarrassing.

By contrast, the record for a single fiber is about 80 terabits per second per fiber. All three 4K UHD cameras in the center could easily be multiplexed on a single fiber with room to spare for 5,330 more cameras. And that, right there, is how you future-proof your cabling.
12G-SDI, which has been around since 2015, gives you 4K60p uncompressed over a 12 Gbps connection.
24G-SDI aims to give 8K30p uncompressed over a 24 Gbps connection.
Serial digital interface - Wikipedia

But as I mentioned above, they don't need as much bandwidth as these standards suggest, given the cable isn't transmitting 3 channels of color.

In reality the NNs don't need that much resolution from the cameras either (Tesla's current cameras aren't even FHD). Maybe the center camera can have a 4K sensor for digital zoom purposes, but it'll likely be used in a crop mode, and in that mode most image sensors today have readout modes that can output a crop with no additional processing, which means you don't need to send a 4K signal through the wire (just the cropped one).

Note, Tesla does not use SDI, it uses MIPI CSI-2 (common standard that image sensors use) which is serialized over FPD-Link III (commonly used in automotive industry for backup cameras) which can be transmitted over a single coax cable:
FPD-Link - Wikipedia
Camera Serial Interface - Wikipedia
This tech allows up to 4 Gbps over a coax cable, or in other words comfortably 1080p60 (or 2MP 60p) at 10-bit.
Here's a reference design from TI that uses exactly this tech:
https://www.ti.com/lit/ug/tidueb1/tidueb1.pdf?ts=1647741889560

I imagine for higher resolutions, if they stick with CSI and FPD-Link, they'll just add more cable pairs, and it'll still be drastically cheaper than using fiber. MIPI already anticipates support for 8K and higher resolutions, and has an 18-wire design (6 trios) that delivers 34 Gbps, which allows for 50MP at 10-bit and is more than good enough for foreseeable applications:
https://www.mipi.org/sites/default/files/MIPI_CSI-2_Specification_Brief.pdf

On the cable end, if they stick with 4 Gbps FPD-Link III, you can squeeze 10-bit 4K48p (or maybe slightly lower frame rate or resolution) onto a single cable. If you double up the cables/controllers you get 8 Gbps, which will give you 10-bit 4K60p. Three cables gives 12 Gbps, which gets you past 10-bit 6K60p (6144x3160). And so on.
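The cable-count scenarios in that last paragraph can be sketched with a small helper (4 Gbps per link is the FPD-Link III rate cited above; resolutions are illustrative):

```python
# Max frame rate that N serializer links support for raw 10-bit
# (pre-demosaic) sensor readout, assuming bandwidth aggregates cleanly.

FPD_LINK_III_GBPS = 4.0  # per coax cable, per the TI reference design above

def max_fps(width: int, height: int, bits: int, n_links: int,
            link_gbps: float = FPD_LINK_III_GBPS) -> float:
    """Highest frame rate the aggregate link bandwidth can carry."""
    return n_links * link_gbps * 1e9 / (width * height * bits)

print(f"1 link,  10-bit 4K: {max_fps(3840, 2160, 10, 1):.0f} fps")  # ~48
print(f"2 links, 10-bit 4K: {max_fps(3840, 2160, 10, 2):.0f} fps")  # ~96 (so 60 fits)
print(f"3 links, 10-bit 6K: {max_fps(6144, 3160, 10, 3):.0f} fps")  # ~62
```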
 
The latest HDMI is 12Gb/s per channel over twisted pair. I suspect that fiber optics would be more expensive than just running as many wires as you could conceivably need.
The problem with HDMI is that it is a compressed standard. That said, you can use HDMI cables to transmit the uncompressed CSI-2 signals that the image sensors use. Judging from the Pi HQ camera, which uses a 12MP Sony IMX477 sensor, it can get 10-bit 4K30p (actually slightly higher, 4032x3040) over 2 CSI lanes when connected to an Nvidia Jetson Nano or NX:
Raspberry Pi HQ Camera in Jetson Nano
There are boards that can deliver the CSI-2 signal over an HDMI cable with no processing (given the CSI cable only uses 15 pins while an HDMI cable has 19 pins). In fact, you can squeeze 2 extra lanes in there for even more bandwidth, although currently the drivers don't support it.
Arducam Complete High Quality Camera Bundle, 12.3MP 1/2.3 Inch IMX477 HQ Camera Module with 6mm CS-Mount Lens, Metal Enclosure, Tripod and HDMI Extension Adapter for Jetson Nano, Xavier NX - Arducam

As discussed in my other post, however, that is not how the automotive world does it; they serialize the signals so they can be delivered over a single coax cable (without needing so many signal pairs).
 
The problem with HDMI is it is a compressed standard.
LOL.


High-Definition Multimedia Interface (HDMI) is a proprietary audio/video interface for transmitting uncompressed video data and compressed or uncompressed digital audio data from an HDMI-compliant source device, such as a display controller, to a compatible computer monitor, video projector, digital television, or digital audio device.[3] HDMI is a digital replacement for analog video standards.


The latest HDMI is 12Gb/s per channel over twisted pair. I suspect that fiber optics would be more expensive than just running as many wires as you could conceivably need.
Actually 48 Gb/s with HDMI 2.1, and 10.2 Gb/s up through HDMI 1.4.

2.1 is used by the Xbox Series X/S and the new PlayStation for 4K120.
 
LOL.


High-Definition Multimedia Interface (HDMI) is a proprietary audio/video interface for transmitting uncompressed video data and compressed or uncompressed digital audio data from an HDMI-compliant source device, such as a display controller, to a compatible computer monitor, video projector, digital television, or digital audio device.[3] HDMI is a digital replacement for analog video standards.



Actually 48 Gb/s with HDMI 2.1, and 10.2 Gb/s up through HDMI 1.4.

2.1 is used by XBOX S and new PlayStation for 4k120.
HDMI used to be uncompressed. It has now introduced compression (called DSC):
Connections and Compression in HDMI Transmission
Chroma subsampling (which is a type of compression, even though not everyone calls it that) is also very commonly used with HDMI (I dealt with this when setting up my home theater receiver):
Chroma Subsampling: 4:4:4 vs 4:2:2 vs 4:2:0

This is why you see a lot of HDMI gear that claims high resolutions and frame rates, but if you look at the details, the signal is actually compressed using chroma subsampling.
The PS5 you mention is one such example; its 4K120 output is YUV422 (aka 4:2:2 chroma subsampling):
"However, playing a game at a 120Hz refresh rate, the console automatically downgrades the output to 4:2:2 chroma subsampling"
Everything You Need To Know About PS5 HDMI Cable
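A rough sketch of why subsampling saves bandwidth, using the standard J:a:b notation (8-bit components assumed):

```python
# Average bits per pixel under J:a:b chroma subsampling: in each
# 2-row-by-J-pixel block, every pixel keeps its luma sample, row 1
# carries `a` chroma sample pairs and row 2 carries `b`.

def bits_per_pixel(j: int, a: int, b: int, depth: int = 8) -> float:
    luma_samples = 2 * j          # one luma sample per pixel
    chroma_samples = 2 * (a + b)  # Cb and Cr for each chroma position
    return (luma_samples + chroma_samples) * depth / (2 * j)

print(bits_per_pixel(4, 4, 4))  # 24.0 -- no subsampling
print(bits_per_pixel(4, 2, 2))  # 16.0 -- 4:2:2, two-thirds the data
print(bits_per_pixel(4, 2, 0))  # 12.0 -- 4:2:0, half the data
```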

Of course, as mentioned above, automakers are transmitting raw image sensor data; they aren't trying to transmit an RGB or YUV signal, so they wouldn't be using these types of standards anyway.
 
For automotive purposes, you'll need at least 10 bits per color channel, with three color channels. I'm also assuming that they will move to 60 fps, both because most cameras run at that rate these days and because higher frame rates can at least potentially reduce pipeline latency, which is a highly desirable improvement over what we have now.

At 3840 x 2160 resolution, then, those specifications would require a whopping 15 gigabits per second (10-bit * 3840h * 2160v * 60fps * 3 color channels) uncompressed.
On what are you basing these claims? Why do you need 10 bits? And why 60 fps and 4K? Pipeline latency has very little to do with frame rate, and much more to do with resolution and bit depth (for which you provide no explanation of why they are needed).