Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Panasonic Debuts Next-Gen Head-Up Display

Maybe. Or maybe there will be a P120D/T based on 2170 cells before the 3 deliveries begin. Tesla may not be able to convert the entire S/X production to 2170 right away, but they could certainly do a limited performance series.
Everything is possible, but not everything is probable.

I don't believe they will strip the M3 line of batteries. For goodness sake...they have over 300K cars worth of batteries to get done.
 
On the topic of eye-tracking... if the HUD has augmented reality then it must certainly have eye/face tracking. I can't think of any way around having to know where the driver's head is and how it's oriented to properly overlay information on real world objects with the correct perspective.
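A minimal sketch of why that's true, with made-up numbers: to draw an overlay on the glass so it appears "on" a real object, you have to intersect the eye-to-object sight line with the windshield plane, and that intersection moves when the head moves. All coordinates and values below are illustrative assumptions, not anything from Panasonic.

```python
# Hypothetical sketch: why an AR HUD needs the driver's eye position.
# We project a world point onto a fixed "combiner" plane (a patch of the
# windshield) along the sight line from the eye. Coordinates are metres in
# the car frame: x = right, y = up, z = forward. All numbers are invented.

def hud_overlay_point(eye, world, plane_z):
    """Intersect the eye->world sight line with the vertical plane z = plane_z."""
    ex, ey, ez = eye
    wx, wy, wz = world
    t = (plane_z - ez) / (wz - ez)          # fraction of the way along the ray
    return (ex + t * (wx - ex), ey + t * (wy - ey))

# The same hazard 20 m ahead lands on a different spot of the glass
# depending on where the driver's eyes are:
eye_a = (0.00, 1.20, 0.0)    # nominal eye point
eye_b = (0.05, 1.15, 0.0)    # head shifted 5 cm right and 5 cm down
hazard = (1.0, 0.5, 20.0)    # object 20 m ahead, 1 m to the right
print(hud_overlay_point(eye_a, hazard, 0.8))
print(hud_overlay_point(eye_b, hazard, 0.8))
```

Even a few centimetres of head movement shifts where the overlay must be drawn, which is the whole argument for tracking the head rather than assuming a fixed eye point.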

Of course there will be some method to adjust the HUD for best use/vision for each driver. But you don't need eye-tracking for that. And since I believe a HUD would only be in a moderately sized space in front of the driver (not full windshield, not along the dash, etc), I don't see any need for eye-tracking as it wouldn't be very useful on that specific section of windshield. Quite frankly, I don't want some monstrously large or complicated HUD/AR/eye-tracking system. I think it'd be foolish for Tesla to go that route when trying to launch the vehicle that could potentially make or break them.
 
I don't see eye-tracking being part of it, if only because there are too many instances where it would fail or not work properly. Most obviously would be if the driver were wearing any sort of heavily tinted or mirrored sunglasses.
 
That's where the face tracking comes in. NVIDIA had a demo at CES.

If it was a moderately sized HUD then it's not going to be real AR.
 
I imagine that one could get pretty good alignment of the AR image to reality with manual adjustment. After all, we each remain relatively still while driving and looking forward. We shrink through the day, but we could easily tweak the image position. Then each of our settings could be saved, like seat position, so it is set for us on getting into the car. Eye/face tracking would be a step up from this.
 

Manual adjustment would be pretty easy - the car picks a couple high contrast elements in front of it and puts outlines on them, then gives you a way to move the outline until it corresponds.
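The nudge-until-it-lines-up idea could work roughly like this sketch: accumulate the driver's adjustments into one offset and save it with their profile, the way seat position is saved. Every name and number here is invented for illustration; nothing below is an actual Tesla or Panasonic interface.

```python
# Hypothetical sketch of the manual-alignment idea: the car outlines a known
# high-contrast object, the driver nudges the outline until it matches, and
# the resulting offset is stored per driver profile (like a seat memory).
# All identifiers are made up for illustration.

PROFILES = {}  # driver name -> (dx, dy) offset in HUD pixels

def calibrate(driver, nudges):
    """Fold a sequence of up/down/left/right nudges into one saved offset."""
    step = {"left": (-1, 0), "right": (1, 0), "up": (0, 1), "down": (0, -1)}
    dx = dy = 0
    for n in nudges:
        sx, sy = step[n]
        dx += sx
        dy += sy
    PROFILES[driver] = (dx, dy)
    return PROFILES[driver]

def overlay_position(driver, raw_xy):
    """Apply the driver's saved offset to a raw overlay position."""
    dx, dy = PROFILES.get(driver, (0, 0))
    return (raw_xy[0] + dx, raw_xy[1] + dy)

# Driver nudges the outline three steps right and one up, then it sticks:
calibrate("driver_1", ["right", "right", "right", "up"])
print(overlay_position("driver_1", (100, 50)))
```

A fixed offset like this only corrects for a driver's habitual seating position; it can't follow the head moving during a drive, which is the step up that eye/face tracking would provide.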

It doesn't feel like a Tesla solution, though. Tesla tends to be more automatic and intuitive than that.
 
Tracking can literally be done with a single sensor that may cost under a few dollars. Processing would be done in software using hardware already available in the car.... creating a UI for manual adjustment just makes everything more complex. There's zero reason not to use a sensor if you're doing augmented reality.

In addition, how would you feel if EAP didn't require grabbing the wheel, but instead simply checked that you were watching the road every so often?
 
I did not see the Panasonic videos posted anywhere, just a few pictures. Looking back at Panasonic's "Advanced Cockpit" with non-circular steering wheel, I could see this fitting the M3 nicely. The unfinished area between the cup holders and display could hold the 3D magic touch.

The AR-HUD video was posted 1-11-17
Advanced Cockpit is from 3-29-2016


 
I think what is most significant is that this is from Tesla's number one automotive partner. Hmmmm....

Dan
 
Obviously speculation on author's part, but still...
Did Tesla partner Panasonic just hint at new Model 3 features?

I haven't read or watched any of the Panasonic materials on this technology, but did they address the issue of polarized sunglasses? Also, if this technology does make its way into Tesla vehicles, would it be too costly to be included as standard equipment in a base Model 3?
 
Very cool, so long as no one is called upon to, you know, actually drive the car.
Robin
You know, all of this is actually more intuitive than traditional driving. Less physical motion needed. Less distraction from the view in front of the car. I am sure the amount of displayed information could be customized to one's preference. I think just because it is such a new way of controlling the driving environment it seems radical. I think with a month or so of getting used to it I would find this type of layout to be very enjoyable and linked with Tesla Vision and autopilot...much safer as well.

Dan
 
Dan:
Maybe so. But the clip reminded me of a demo flight I took in a Cirrus (a "technologically advanced aircraft" with a lot of screens to look at). The company pilot (sitting right seat) set up all the lights and buttons and handed the airplane to the autopilot. He sat back, arms crossed, with a "Sure is something, isn't it?" look.
I could see (because I was looking out an actual window), that he'd set course for a fairly busy GA field, and at our altitude we'd be flying uncomfortably close to any a/c in the pattern. I rested my hand near the autopilot disengage button as we approached, the demo pilot all "look at this," and "look at that", his eyes totally inside the cockpit.
I saw a departing airplane turn in our direction, and stabbed the disconnect just as a conflict alert blossomed on a screen.
"See that? It picked up that traffic right away," he bragged.
I turned away from the airport, my eyes 100% outside. "It sure did," I agreed.
Tech is like wine. Not enough is a bad thing, but so is too much. And tech for tech's sake is, well, like booze for booze's sake, isn't it?
Robin
 