
Voice command "Trunk close" being interpreted as "Trunk clothes".

From Dave Barry:

[Attached image: "Another beer please, Hal"]
 
That works for me, though so does “close trunk” without spelling out “close”. Maybe OP won’t like the extra syllables but it would be pretty weird if a spelled-out “close” got changed to “clothes”…
I think the initial display of "trunk close" is what the car hears. Then it gets sent to the Tesla cloud, where a neural net converts it to what it thinks is a corrected version based on the context.
 
I'm thinking of giving more context. "Close the fahcking trunk". I'll report what I find.
I have to be careful. If I word it too sternly, the car could drive a mean slalom, and turn my insides into a milkshake. You never know what AI is thinking.
 
I think the initial display of "trunk close" is what the car hears. Then it gets sent to the Tesla cloud, where a neural net converts it to what it thinks is a corrected version based on the context.

They're probably using a generic transformer language model trained on a large corpus of English text, for which "trunk clothes" looks more probable than "trunk close", because nearly all of that training text is descriptive prose rather than imperative commands specific to an automotive context. There are upsides to that (cross-learning of generic 'human understanding') and downsides like what we're seeing here: the model lacks the right context, since nobody chit-chats with their car about clothing.
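
To make that concrete, here's a rough sketch of how you could ask an off-the-shelf generic language model which transcription it finds more probable. This uses GPT-2 via the Hugging Face transformers library purely as an illustration; no idea what Tesla actually runs.

import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def lm_loss(text):
    # Lower loss means the model finds the word sequence more probable.
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs, labels=inputs["input_ids"])
    return out.loss.item()

for candidate in ["trunk close", "trunk clothes", "close the trunk"]:
    print(candidate, round(lm_loss(candidate), 2))

Whichever candidate scores lower is what a context-free corrector would lean toward; my bet is the descriptive-prose prior favors the clothing reading, which is exactly the mix-up in this thread.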

Knowing this, the better approach is to imagine how many documents "out in the world" would contain a given word ordering and meaning, and to phrase the command to match. 'Close the trunk' would be more probable than 'clothe the trunk', but 'trunk clothes' is more probable than 'trunk close', since you wouldn't find the latter in any normal sentence. Maybe try 'shut the trunk up'. :)

The phrases don't have to occur literally in the training data; in ambiguous cases the model goes by the probability of related words appearing in a given order. Both 'trunk close' and 'trunk clothes' are rare, but a language model will have inferred from many documents that clothing and trunks show up within a few words of each other more often than chance. That's how statistical natural language processing works these days: there isn't true deep understanding, just lots of statistics.
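
If anyone wants to poke at that 'related words within a short distance' idea, here's a toy version of the counting involved. The four-sentence corpus, the window size, and the scoring function are all made up, so the numbers mean nothing at this scale; the point is just the mechanism a corpus-scale model would exploit.

from collections import Counter

corpus = [
    "she packed her clothes into the trunk of the car",
    "the trunk was full of old clothes and books",
    "he keeps a trunk of winter clothes in the attic",
    "please close the door quietly",
]

window = 5  # treat two words as "related" if they appear within 5 positions
pair_counts = Counter()
word_counts = Counter()

for sentence in corpus:
    words = sentence.split()
    word_counts.update(words)
    for i, w in enumerate(words):
        for j in range(i + 1, min(i + 1 + window, len(words))):
            pair_counts[frozenset((w, words[j]))] += 1

def association(a, b):
    # Crude association score: co-occurrence count normalized by how common each word is.
    return pair_counts[frozenset((a, b))] / ((word_counts[a] * word_counts[b]) or 1)

print("trunk & clothes:", association("trunk", "clothes"))
print("trunk & close:  ", association("trunk", "close"))

With a real web-scale corpus, that kind of association is what nudges an ambiguous "close"/"clothes" toward whichever word hangs out with "trunk" more often.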