Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla Software updates - Australia

People, stop making excuses for Tesla's engineering shortcomings! They're in the process of cocking it all up by going vision only. A predictable predicament, so to speak.

Like @TrevRex I drove a Prius (both a Gen II and a Gen III) for 12 years before buying my Tesla. It had radar-based adaptive cruise control, and the Gen III even had auto park - it didn't work as well as the Tesla's, but there's also a decade of technological advancement in between.

Automotive radars operate at 76 GHz, which is a good frequency to use because it gives reasonable angular resolution and sees in the dark and through fog, but it is highly attenuated by liquid water, i.e. rain. I experienced a "cruise control not available" message in my Prius several times in heavy rain (sometimes even in dense drizzle). And that's where the problem lies for Tesla. They can't have FSD not work because it's raining.
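For a rough sense of scale, rain attenuation is usually modelled with the standard power law gamma = k * R^alpha (dB/km). The k and alpha values below are illustrative guesses only, roughly in the range the ITU-R P.838 model gives near 77 GHz, not authoritative figures:

```python
# Back-of-envelope one-way rain attenuation at ~77 GHz using the
# power-law model gamma = k * R**alpha (dB/km).
# k and alpha are ILLUSTRATIVE values only, in the rough range of
# ITU-R P.838 near 77 GHz -- check the recommendation for real work.
def rain_attenuation_db(rain_rate_mm_h, path_km, k=1.0, alpha=0.75):
    """Total one-way attenuation in dB over path_km of uniform rain."""
    gamma = k * rain_rate_mm_h ** alpha  # specific attenuation, dB/km
    return gamma * path_km

# A radar echo traverses the rain twice (out to the target and back).
def two_way_loss_db(rain_rate_mm_h, target_range_km, k=1.0, alpha=0.75):
    return 2 * rain_attenuation_db(rain_rate_mm_h, target_range_km, k, alpha)

light = two_way_loss_db(5, 0.2)    # drizzle, target 200 m away
heavy = two_way_loss_db(50, 0.2)   # bucketing down, same target
print(f"drizzle: {light:.1f} dB, downpour: {heavy:.1f} dB extra loss")
```

Even a few dB of extra two-way loss eats into the radar's link budget, which is consistent with the "cruise control not available" behaviour in heavy rain.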

After having damaged the Gen II driving through a mob of roos (I lived in outback NSW at the time) I bought a FLIR system and installed it in the Gen III, which worked a treat identifying wildlife from sufficiently far away at night. It was useless in the daytime with sunshine, though; too many hot/warm surfaces.

The radar worked a treat 99.99% of the time, i.e. when it wasn't bucketing down.

The driver's eyeballs worked for the remainder of the time.

And therein lies the magic ingredient, what in astronomy we call "multi-messenger" astronomy: observing a source at multiple (as many as possible) wavelengths gives you the full picture of what's going on with the physics.

The same applies to self-driving vehicles. You want as much information as possible, then select which wavelengths contain the useful bits, and use them.

Unfortunately, Tesla seem to think that because humans can drive on vision only, so can their cars. Sadly, car vision using the current cameras is resolution limited to a degree that probably best compares to a legally blind person driving.
 
So I received a message on my car screen this morning that my AP has reverted to its previous state. I noticed the EAP additional configuration no longer showing, so I am back on the standard AP. The car also shows that the new update is available for download once I get to Wi-Fi (2022.44.30); I will install it in the evening when I get home.
Same. I had hoped that once the free 30-day trial completed, as it did today for me, it would open the door for Tesla to provide a subscription model for EAP or FSD in Aus as they have available in the US. But no go.
 
The same applies to self-driving vehicles. You want as much information as possible, then select which wavelengths contain the useful bits, and use them.

The problem is what to believe when you've got conflicting data.

As Tesla has described it, Radar is very accurate but without context...
So it sends back three blobs - one at 0km/h and 500m away, one at 0km/h and 600m away and one at 100km/h and 400m away.

Vision can do almost the same +/- 5km/h but tell you the first blob is a bridge pylon that the road curves away from, the third a car travelling in front of you that's just started braking and the second a broken down truck that needs to be avoided.

But what happens when the radar blob is 1° outside the vision blob - how to label it, and what to do.
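That labelling problem is essentially data association with a gate. A toy sketch of the idea, with made-up detections (angle in degrees, range in metres) and made-up gate thresholds:

```python
# Toy gated nearest-neighbour association between radar and vision
# detections. All numbers and thresholds are invented for illustration;
# real trackers use far more elaborate association and filtering.
from math import hypot

def associate(radar_dets, vision_dets, max_angle_deg=1.5, max_range_m=10.0):
    """Pair each radar detection (angle_deg, range_m) with the nearest
    vision detection inside the gate. Radar blobs with no vision match
    are returned separately -- those are the ambiguous cases."""
    matches, unmatched = [], []
    for r in radar_dets:
        best, best_d = None, float("inf")
        for v in vision_dets:
            da, dr = abs(r[0] - v[0]), abs(r[1] - v[1])
            if da <= max_angle_deg and dr <= max_range_m:
                # normalised combined distance inside the gate
                d = hypot(da / max_angle_deg, dr / max_range_m)
                if d < best_d:
                    best, best_d = v, d
        if best is None:
            unmatched.append(r)
        else:
            matches.append((r, best))
    return matches, unmatched

radar = [(0.0, 500.0), (0.2, 600.0), (-0.1, 400.0), (3.0, 450.0)]
vision = [(0.1, 498.0), (0.3, 602.0), (-0.2, 401.0)]
m, u = associate(radar, vision)
print(len(m), "matched,", len(u), "unmatched")  # prints: 3 matched, 1 unmatched
```

The fourth radar blob sits outside the angular gate of every vision detection, so the system must decide on its own what it is and whether to brake for it.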

Tesla isn't dropping an $80 part just because it saves money, or leaving the part already installed in millions of cars unused for the fun of it...
They are doing it because it doesn't add any value to driving and makes for more spurious noise.
 
Conflicting information can only be resolved reliably by having three independent sources. If two agree, it's extremely likely the odd one out is wrong.

That's been the accepted best automation engineering practice in aviation for decades (actually, half a century!), and Tesla (and all other manufacturers) would have done well to replicate the standards of that safety-critical environment.
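The 2-out-of-3 idea can be sketched in a few lines: take the median of three redundant channels and flag any channel that disagrees with it beyond a tolerance. This is a toy illustration of the voting principle, not any particular avionics implementation:

```python
# Minimal triplex "voter": the median of three readings is the value
# two-of-three agree on; a channel far from the median is flagged as
# faulty. Tolerance and readings are illustrative only.
def vote(a, b, c, tol=1.0):
    readings = sorted([a, b, c])
    value = readings[1]  # median: the two agreeing channels win
    faults = [x for x in (a, b, c) if abs(x - value) > tol]
    return value, faults

value, faults = vote(100.2, 100.4, 57.0)  # one channel has failed
print(value, faults)  # prints: 100.2 [57.0]
```

With only two sources (say, radar and vision) there is no majority: when they disagree, you can tell *that* something is wrong but not *which* channel to trust.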

Boeing have impressively demonstrated how to cock that up with the 737 MAX disaster, by coupling a safety-critical system to a single sensor without redundancy.

Airbus have equally impressively (but with the opposite sign) demonstrated that always having three redundant systems has not led to a single crash due to failing automation.

Tesla are now demonstrating that not learning lessons from other, related industries is a recipe for disappointment.
 
After updating the Tesla app (iOS), noticed a new icon in the top right which turns out to be the Loot Box. This just leads to a screen basically saying the referral program is kaput.

Does anyone know why this Loot Box is now rather prominent - are we likely to get something again here in Aust?
 
As Tesla has described it, Radar is very accurate but without context...
So it sends back three blobs - one at 0km/h and 500m away, one at 0km/h and 600m away and one at 100km/h and 400m away.
Furthermore, the kind of radar usually found in cars can only give you a horizontal angle to the object; it doesn't tell you whether the object is above the road (e.g. an overpass) or on it. And there are obviously heaps of 0 km/h returns from the road surface and all the stationary objects around it, so those basically have to be thrown out anyway.
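That throwing-out step can be sketched directly: a typical automotive radar reports range rate (Doppler) relative to the car, and a stationary object closes at exactly the ego speed, so those returns are filtered. All numbers and the tolerance below are illustrative:

```python
# Sketch of discarding stationary radar returns. range_rate is the
# Doppler velocity relative to the car, negative when the gap is
# closing; a stationary object closes at exactly -ego_speed.
def moving_targets(returns, ego_speed_mps, tol_mps=1.0):
    """returns: list of (range_m, range_rate_mps); keep likely movers."""
    return [(rng, rr) for rng, rr in returns
            if abs(rr + ego_speed_mps) > tol_mps]

ego = 27.8  # ~100 km/h
dets = [
    (500.0, -27.8),  # bridge pylon: closes at exactly ego speed
    (600.0, -27.5),  # broken-down truck: also reads as stationary
    (400.0, -5.0),   # car ahead doing ~82 km/h: a genuine mover
]
print(moving_targets(dets, ego))  # prints: [(400.0, -5.0)]
```

Note the uncomfortable consequence: the broken-down truck, the one obstacle that actually needs avoiding, is indistinguishable from the pylon and gets filtered with it, which is exactly why raw radar needs context from another sensor.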
 
After updating the Tesla app (iOS), noticed a new icon in the top right which turns out to be the Loot Box. This just leads to a screen basically saying the referral program is kaput.

Does anyone know why this Loot Box is now rather prominent - are we likely to get something again here in Aust?
In the past Tesla have used the referral program when they need more sales, and I agree it has returned in a prominent location.
 
The problem is what to believe when you've got conflicting data.

As Tesla has described it, Radar is very accurate but without context...
So it sends back three blobs - one at 0km/h and 500m away, one at 0km/h and 600m away and one at 100km/h and 400m away.

Vision can do almost the same +/- 5km/h but tell you the first blob is a bridge pylon that the road curves away from, the third a car travelling in front of you that's just started braking and the second a broken down truck that needs to be avoided.

But what happens when the radar blob is 1° outside the vision blob - how to label it, and what to do.

Tesla isn't dropping an $80 part just because it saves money, or leaving the part already installed in millions of cars unused for the fun of it...
They are doing it because it doesn't add any value to driving and makes for more spurious noise.
I think the problem is more to do with their software and hardware implementation, and probably their relative inexperience as an auto manufacturer. I had a Merc EQA EV for 12 months with radar cruise and autosteer etc. and no phantom braking issues at all, unlike what I had with the Model 3 with radar. The question is why couldn't Tesla get it to work when other OEMs can? Maybe vertical integration isn't the be-all and end-all.
 
The question is why couldn't Tesla get it to work when other OEMs can? Maybe vertical integration isn't the be-all and end-all.

Silicon Valley arrogance? “We’re gonna break the mould. Not only are we gonna break it, we’re gonna atomise it and scatter it to all corners of the universe. Then we’ll build a totally new mould that will show those dinosaurs that we can do it ten times better than they could have ever imagined.”

But sometimes there’s no point in trying to build a better mousetrap. Case in point - automatic wipers. Perfectly good existing solution - reliable and low-cost sensors. But no, Tesla was determined to do it solely with cameras for absolutely no benefit.
 
I think the problem is more to do with their software and hardware implementation, and probably their relative inexperience as an auto manufacturer. I had a Merc EQA EV for 12 months with radar cruise and autosteer etc. and no phantom braking issues at all, unlike what I had with the Model 3 with radar. The question is why couldn't Tesla get it to work when other OEMs can? Maybe vertical integration isn't the be-all and end-all.
It’s interesting that Merc use the radar to pick up a child running out from behind a car and brake in good time. Is that a failing of a vision-only system?
 
Silicon Valley arrogance? “We’re gonna break the mould. Not only are we gonna break it, we’re gonna atomise it and scatter it to all corners of the universe. Then we’ll build a totally new mould that will show those dinosaurs that we can do it ten times better than they could have ever imagined.”

But sometimes there’s no point in trying to build a better mousetrap. Case in point - automatic wipers. Perfectly good existing solution - reliable and low-cost sensors. But no, Tesla was determined to do it solely with cameras for absolutely no benefit.
I think Tesla broke the mould and scattered it across the earth. There is no doubt they are fully responsible for making EVs mainstream. They are unlikely to be responsible for creating a better wiper.
 
I think Tesla broke the mould and scattered it across the earth. There is no doubt they are fully responsible for making EVs mainstream. They are unlikely to be responsible for creating a better wiper.

Exactly. Companies still need wisdom (and a bit of humility that maybe they don’t know it all) in deciding where innovation has the potential to make a meaningful difference to the world and where it won’t.
 
"In the latest banana test, this tomato grower has really fallen down the ranks. Their tomatoes really don't taste like bananas at all, and when running fruit against fruit, the banana overall really is just much longer. They also peel much easier. We only tested them peeled, sliced, and eaten raw though, we didn't test the fruits suitability to make pasta sauce or other more complex dishes."

... if ADAS were fruit, and CR ran a report on it.