Welcome to Tesla Motors Club

MASTER THREAD: 2019.40.50.x - Driving Visualization improvements, new voice commands, Camping Mode

I didn’t read through all 27 pages, but to me it seems a bit buggy; they should have spent more time in dev/test.

Every time I try to sort my contact list by first name the screen shuts down and does a quick reboot, voice commands are spotty, and the additional messaging feature doesn’t appear to be available on the call screen.
I found 50.1 to be fickle/buggy as well. I suspect some of that is the Tesla servers being overloaded by voice commands and text messages, and the rest are bugs that will be fixed by a release in less than a week's time. The range drop, I'm not so sure about. All of those visualizations and speech-to-text functions consume a lot of CPU power (~100 W) plus the cooling overhead that comes with it. There's only so much juice in the pack to go around.
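As a rough sanity check on the range claim above (all numbers here are illustrative assumptions, not Tesla specs), a constant ~100 W of extra compute and cooling draw costs surprisingly little range:

```python
# Back-of-envelope sketch: range cost of a constant extra electrical load.
# Illustrative assumptions, not measured values.
EXTRA_DRAW_W = 100           # assumed extra CPU + cooling load
CONSUMPTION_WH_PER_MI = 250  # assumed typical Model 3 highway consumption

def range_loss_mi(trip_hours: float) -> float:
    """Miles of range consumed by the extra load over a trip of given length."""
    extra_wh = EXTRA_DRAW_W * trip_hours  # energy drawn by the extra load
    return extra_wh / CONSUMPTION_WH_PER_MI

# Over a 1-hour drive, 100 Wh extra works out to about 0.4 mi of range.
print(round(range_loss_mi(1.0), 2))
```

Under these assumptions the extra draw costs well under half a mile per hour of driving, so a noticeable range drop probably can't be explained by the CPU load alone.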
 
It is ludicrous that owners have to do this, because Tesla hasn't bothered to document the feature properly.
It's just like there is no "list of commands" for Alexa, Google Assistant, Siri, etc. You're supposed to just ask for what you want as though you were asking someone in the passenger seat to do it for you (minus the pleasantries, of course). Just assume it can do it and try it. Over time as the system gets smarter it will be able to do more and more things (and more complex things).
 
I have 2019.40.50.1 but no messages tab? Bluetooth notifications are enabled. I can send using a voice command, but that’s it. Am I missing something?
 
Can we get an option to switch off this live video game (the visualizations)!? Pointless... (I am not blind.)

I need it to show me things I do not see...

The purpose is to show you what the car sees .. not as an alternative to looking out the windscreen. After all, the same can be said for lane lines. And while it might not be much use to you, it is a stepping-stone for Tesla .. they get feedback from us about failures of the car to pick up stuff BEFORE they have to rely on it for FSD.
 
Voice commands, message responses, voice-to-text, etc. are really slow, buggy, or non-responsive. Sometimes the command echoes on the screen immediately, sometimes not at all, and sometimes minutes later.

Anyone else seeing this? I'm hoping it's not buggy. I suspect that the massive update rollout and folks trying out the new voice commands are swamping Tesla's servers.

I was also seeing the inconsistent nature of voice commands. I thought the voice commands were processed locally, but you might be onto something: if they are remote, that might explain the problems I have been seeing. Even old commands like "navigate to 123 first street" at times don't work at all, where before they were rock solid. Yet I was able to reply to text messages with arbitrary speech or use voice on the keyboard. If it is a connectivity problem rather than a neural-net problem, it can be solved with bigger pipes.
 
On by default... I mean, delivered that way (the Tesla delivery team sets the options already).
Not important; what matters is that it can be turned off.

What is the value of seeing that your car recognizes a trash bin... what's next, the lawn mower?! In the end it will prove that it can digitize a video into objects... great for FSD, but I do not need to see that (I have it in front of me).

I want to see and be WARNED about the kid behind the car that I cannot see ;) That's value...

I personally would like the ability to selectively turn off some visualizations, but not others. For example, I don't really want to see the visualization of oncoming traffic; I find it a bit distracting due to how it moves.

But, before you immediately turn them off you might want to consider the value of them.

What visualizations are showing you is the car recognizing not just an object, but an estimated location of the object in reference to your car.

This is hugely important not for FSD itself, but for the human psychology of how much to trust FSD: it shows you what the car can and can't see. If you spend time with the visualizations on, you'll probably see a lot of mistakes. Over time you'll likely see improvements with future updates.

Visualizations really aren't for driving (except for Tesla's mistake of not putting blind-spot monitoring indicators on the mirrors). You can't really rely on them for anything driving-related.

I also think they're in the wrong spot for warnings. That should be in a HUD, but there isn't one in a Model 3.

The long term goal for FSD is the car doing the driving, and when/if it does that then we'll use the visualizations to see what the car is seeing along the route. It won't be for everyone as most people will be just streaming Netflix.
 
This really seems symptomatic of server overload. I don't think Tesla has ever pushed a huge update to the entire fleet at the same time. Add to that the increased processing time and throughput from everyone saying "open glovebox", "set wiper speed to 3" etc.

Very similar things used to happen every time Apple had the yearly iOS update go live. They've gotten better the past few years. Tesla will learn their servers aren't made of Beskar Steel. ;)
It doesn't work that way. In a standard cloud architecture you don't run individual servers that way. You deploy services as apps to a cluster of servers, and you specify how much resources that service needs and how much resources they can use. If you need more resources, the cluster will usually provision more servers automatically.

Tesla likely uses a multi-cloud solution due to their scale and availability requirements.
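For readers unfamiliar with how that automatic provisioning works, the sketch below implements the standard Kubernetes Horizontal Pod Autoscaler scaling rule as a stand-in. Nothing is known about Tesla's actual backend; this just illustrates the "cluster adds capacity when a metric exceeds its target" idea:

```python
import math

def desired_replicas(current_replicas: int,
                     current_cpu_util: float,
                     target_cpu_util: float) -> int:
    """Kubernetes HPA scaling rule:
    desired = ceil(current * currentMetric / targetMetric)."""
    return math.ceil(current_replicas * current_cpu_util / target_cpu_util)

# A fleet-wide update doubles load: 10 replicas running at 140% CPU
# against a 70% target get scaled out to 20 replicas.
print(desired_replicas(10, 1.40, 0.70))  # → 20
```

The catch during an event like a fleet-wide update is that scale-out isn't instantaneous, so a sudden spike can still produce timeouts before new capacity comes online.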
 
The point is they are easy to see for a driver: normal lights at a 4-way stop. I'm surprised at how limited the camera angle is.

If I weren't getting on a plane I'd post examples, and they'd surprise most people.


You realize there's 8 cameras, but the dashcam stuff only shows you 4 of em right? (and even those in less detail than the car uses to make decisions)
 
Pedestrian crossing signal lights show up as traffic lights in the visualization - blinking yellow when they are blinking. They are so common that I would assume that these would be one of the first few objects to be tagged for a ML system. Oh well!
 
You realize there's 8 cameras, but the dashcam stuff only shows you 4 of em right? (and even those in less detail than the car uses to make decisions)

Of course I do. The context of my comment has been lost.

What I witness in my area is that when approaching a normal stop light, the car loses the color of the light well before even reaching a normal stopping point. I can back up, go forward, etc., and identify the moment the cameras lose the ability to read the color of the light. So maybe they can fix that, maybe not so easily. But as of now, the preview was not 'confidence building' for me in my situations.

Nothing to do with dashcam or sentry.
 