So I've been sticking it out on 40.2.1, which was the first with 'deep rain' (make sure to pause and use your special voice for that!), as the 50s don't offer much in the UK and this is working really well.
Today I did my first long drive in a while - 6 hours in total - in mixed weather, across sunrise and sunset. And the big thought I had was 'these wipers are as good as my old car now', i.e. effectiveness parity has been reached. It was 90%+ effective, and a press of the button brought it to life if needed. Very similar to my 2011 VAG, which would need the aggressiveness dial tweaked sometimes.
It was the same for the auto full beam. It turned off reliably for oncoming vehicles, only occasionally for the wrong reason, and came back on predictably after 2 seconds (hysteresis to stop it flickering, I think). Again, using the camera, not a dedicated sensor.
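That 2-second hold-off is the classic way to stop lights flickering on and off at the boundary of a detection. Here's a minimal sketch of how such logic might look - this is my guess at the general technique, not Tesla's actual implementation, and the class name and delay parameter are made up for illustration:

```python
class HighBeamController:
    """Drop high beams the instant an oncoming vehicle is detected,
    but wait a hold-off period before re-enabling, to avoid flicker."""

    def __init__(self, reenable_delay_s: float = 2.0):
        self.reenable_delay_s = reenable_delay_s
        self.high_beams_on = True
        self._clear_since = None  # time the road last became clear

    def update(self, oncoming_detected: bool, now: float) -> bool:
        if oncoming_detected:
            # React immediately: dip the beams and reset the clear timer.
            self.high_beams_on = False
            self._clear_since = None
        else:
            if self._clear_since is None:
                self._clear_since = now
            # Only re-enable once the road has been clear long enough.
            if (not self.high_beams_on
                    and now - self._clear_since >= self.reenable_delay_s):
                self.high_beams_on = True
        return self.high_beams_on
```

Fed once per camera frame, this gives exactly the behaviour described above: instant off, predictable on after a couple of seconds of clear road.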
It doesn't sound like much, but a camera plus software is now doing what previously needed dedicated, specialised hardware sensors. And it can be upgraded if needed, or extra actions built on the back of its output (e.g. auto fog lights?). It, along with the 50.x branch, is showing that computer vision is a solvable problem, and that is huge.
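The 'extra actions' point is the interesting bit: once one vision system publishes scene estimates, any number of features can hang off it. A toy sketch of that idea - the field names and thresholds here are entirely hypothetical, just to show the shape of it:

```python
from dataclasses import dataclass

@dataclass
class SceneEstimate:
    """Hypothetical per-frame outputs a vision stack might expose."""
    rain_intensity: float    # 0.0 (dry) .. 1.0 (downpour)
    visibility_m: float      # estimated visibility distance in metres
    oncoming_vehicle: bool

def derive_actions(scene: SceneEstimate) -> dict:
    """Map one shared vision output onto several vehicle actions.
    Thresholds are illustrative, not real calibration values."""
    return {
        "wiper_speed": min(3, int(scene.rain_intensity * 4)),   # 0..3
        "fog_lights": scene.visibility_m < 100,                 # thick murk
        "high_beams": (not scene.oncoming_vehicle
                       and scene.visibility_m > 200),
    }
```

Adding auto fog lights to a car with hardware rain and light sensors means new wiring; here it's one more line in the mapping, shippable over the air.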
It's the same change for cars that the cloud was for computing and the electric grid was for power. Why have your own personal generator or servers when you can just plug into something? This is the start of the same, but for sensors - a single vision system that lots of stuff can be built on top of to give flexibility, power and reactivity.
I was getting skeptical about their self-driving claims for a while, but thinking about it (and I had plenty of thinking time today) I realised that there is good evidence of real progress against some of the gnarly problems Tesla face in making it real; you just have to look past the 'woo wipers!' and 'pretty pictures' comments. I'm still not making any comments on timelines though!