Welcome to Tesla Motors Club

Short-Term TSLA Price Movements - 2016

Status
Not open for further replies.
My understanding is that yes, all of the parts are used: ultrasonics/steering are used for SCA, the camera is used for LDW, radar is used for AEB, and the iBooster brakes are used for hill hold. (And obviously the Mobileye hardware is the central processing.)

I think that covers all of the "AP" hardware or am I forgetting something?

As best I can determine, the Mobileye hardware only runs the visual processing algorithms. I'm not 100% sure, but I believe that the Tesla Autopilot software runs on processors outside the EyeQ3 unit.
 
You guys are awesome! Thanks! :) :) :D :D
SEC approval process can take anywhere from 2-4 months. It really depends on how much they dig into the filings. Note that a lot of things can be sped up in life but SEC approval is not one of them. There are many rounds of comments back and forth, potentially in-person hearings, etc. Given the scrutiny placed upon Tesla generally I wouldn't be surprised to see it near the 3-4 month mark.

Once approval is in hand the vote can follow shortly thereafter. Tesla will issue a press release announcing a (probably) future record date (a week or so out is my guess) and a vote shortly following the record date, like 1-2 weeks after. Wild guess - everything is concluded by mid December.

I found this in the 8-K filing related to the record date. It is all very lawyerly and I can barely read it. The key (only) takeaway I could find is that the record date will be set in the future relative to the announcement date. Even that one bit I'm not entirely sure I got right.

Section 5.4. Record Date. For the purpose of determining the stockholders entitled to notice of or to vote at any meeting of stockholders, or to receive payment of any dividend or other distribution or allotment of any rights or to exercise any rights in respect of any change, conversion or exchange of stock or for the purpose of any other lawful action, the Board may fix a record date, which record date shall not precede the date on which the resolution fixing the record date is adopted and which record date shall not be more than sixty (60) nor less than ten (10) days before the date of any meeting of stockholders, nor more than sixty (60) days prior to the time for such other action as hereinbefore described; provided, however, that if no record date is fixed by the...
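To make the window in Section 5.4 concrete, here is a small sketch (a hypothetical helper, not anything from Tesla's filings) that checks the two constraints quoted above: the record date cannot precede the board resolution fixing it, and must fall between 10 and 60 days before the stockholder meeting:

```python
from datetime import date

# Hypothetical sketch of the Section 5.4 window quoted above: the record
# date may not precede the board resolution fixing it, and must fall
# no more than 60 and no less than 10 days before the stockholder meeting.
def record_date_valid(resolution: date, record: date, meeting: date) -> bool:
    days_before_meeting = (meeting - record).days
    return record >= resolution and 10 <= days_before_meeting <= 60

# e.g. resolution adopted Oct 1, record date Nov 1, meeting Nov 17 (16 days later)
print(record_date_valid(date(2016, 10, 1), date(2016, 11, 1), date(2016, 11, 17)))  # True
```

So a record date announced a week or so out, with a vote 1-2 weeks after it, sits comfortably inside the 10-to-60-day window.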
 
OK, I concede that most of the hardware is required for one or more of the included features. Is all of the hardware required for one or more included features though?
In the Tony Seba video he said that one thing driving this is the falling cost of hardware. He used the cost of lidar as one example: it used to cost over $1k, was about $100 as of 6-12 months ago, and the next generation will be under $50. Plus the later hardware is more capable, more compact, and draws less power.
 
Thanks! Tony Seba video:
"The Self-Driving Vehicle Disruption - End Of Parking & Car Ownership by 2030"

The solid-state LIDAR is lower resolution and has a limited field of view (not 360 degrees like the Google ones), but the units are $50 each. One could use four of them to give 360-degree coverage. Radar is similar to lidar but has better penetration through weather (e.g. rain).

I am not sure what suite of sensors, ultrasound, LIDAR, RADAR, cameras (colour and IR), etc. will be best, but I am sure Tesla is on top of it.
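The four-unit arithmetic above can be sketched quickly (the ~90-degree field of view per solid-state unit is an assumption implied by the post, not a spec):

```python
# Sketch of the post's arithmetic: tiling limited-FOV solid-state lidars
# (assumed ~90 degrees each) to cover a full 360 degrees around the car.
FOV_PER_UNIT_DEG = 90   # assumed field of view of one solid-state unit
PRICE_PER_UNIT = 50     # $50 each, per the post

units_needed = 360 // FOV_PER_UNIT_DEG
total_cost = units_needed * PRICE_PER_UNIT
print(units_needed, total_cost)  # 4 units, $200 for full coverage
```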
 
90 degrees is an aesthetic advantage!
The solid-state LIDAR is lower resolution and has a limited field of view (not 360 degrees like the Google ones), but the units are $50 each. One could use four of them to give 360-degree coverage. Radar is similar to lidar but has better penetration through weather (e.g. rain).

Lower-cost lidar is key to self-driving future - SAE International
11-Feb-2015 04:24 EST

The criticality of lidar notwithstanding, there is universal agreement that the technology is currently too expensive for wide deployment. The most recognized lidar machine—the rooftop 64-beam Velodyne HDL-64E used on Google’s autonomous car and for mapping streets by Nokia’s Here division and Microsoft Bing—is sold for about $80,000 each.

“Three-hundred-sixty degrees on top of the car is obviously the best,” said Dr. Wolfgang Juchmann, Velodyne’s Director of Sales and Marketing. “It’s like a castle built on a mountain,” he said. “The king can see who’s coming from any direction.”

The high cost, and the aesthetic desire to hide the lidar in the body of the car, led Velodyne to develop smaller units with fewer than 64 beams. The company now offers a lidar with 32 beams at about $40,000, and its “puck” with 16 beams for $8000. The reduction in cost represents a price drop of a factor of 10 in just seven years since the introduction of the 64-beam lidar in 2007.

“The puck costs $8000 today, but car companies want it for $100 to $200,” Juchmann said during a recent visit to the company’s Morgan Hill, CA, headquarters and plant, where about 60 employees work. He believes that, in the coming years, the industry will see the same downward cost curves in lidar that occurred with radar in the past few years.

“If you look at 10 or so years ago, radar sensors were $10,000. I’m sure the first people said, ‘radar? Putting six of those on every car is completely unbelievable,’” said Juchmann. “But everybody saw the benefits. The market got there, and the price came down.” He said there is nothing intrinsically cost-prohibitive, in terms of components, in today’s lidar—except the inordinate amount of manual labor that goes into building the scanners.

Today, Velodyne, which has sold a thousand or so of its lidars, is considered the dominant player.

The difference between Velodyne’s $80,000 64-beam lidar, and a $150 single-beam unit, underscores an important point: not all lidar is created equal. There are three primary levels of automotive applications—one for mapping (like what Google and Nokia do); another for limited driver-assistance functions; and the more forward-looking fully self-driving vehicles.

The Holy Grail is a small set of lidars scanning fully around the vehicle, from the foot of a pedestrian 1 m away to vehicles 100 m (328 ft) down the road—at the same cost as today’s radar. Cameras or radar will likely provide redundancy.

The feasibility of that proposition is beyond question for Louay Eldada, Chief Executive of Quanergy, a Silicon Valley startup. He is aiming to achieve that ambitious goal by using a solid-state strategy—an evolution of packaging and manufacturing for mass production that Velodyne also hopes to achieve. Valeo, Bosch, and other Tier-1 auto suppliers are also working on lidar.

Eldada’s big promises for $100 lidar by 2018 helped Quanergy earn $30 million in Series A venture funding (announced in December). The company obtained Mercedes, Renault-Nissan, and Hyundai as its first automotive customers. “I’m not aware of anyone who disagrees that if lidar is at the right price and performance level, it is the ideal sensor,” said Eldada.

Unlike radar that can’t make the distinction between a bridge and a stalled vehicle under the bridge, lidar can provide the precise location of the stalled vehicle under the bridge, according to Eldada. “It can determine that the car is halfway in the lane and halfway on the shoulder, and that it’s a Volkswagen,” he said. That’s the level of rock-solid reliable 3D data needed to fully entrust a vehicle to drive without a driver.
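As a rough check on the article's numbers (my arithmetic, not the article's): a 10x drop from the $80,000 64-beam unit in 2007 to the $8000 puck roughly seven years later implies a steady annual price decline of about 28%:

```python
# Implied constant annual price decline from a start/end price pair.
def implied_annual_decline(start_price: float, end_price: float, years: int) -> float:
    return 1 - (end_price / start_price) ** (1 / years)

# $80,000 64-beam unit (2007) -> $8000 puck roughly 7 years later
rate = implied_annual_decline(80_000, 8_000, 7)
print(f"{rate:.1%}")  # about 28% per year
```

If that curve held, Juchmann's $100-$200 target would be a few more years out, which is consistent with the Quanergy projections mentioned below.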
 
As best I can determine, the Mobileye hardware only runs the visual processing algorithms. I'm not 100% sure, but I believe that the Tesla Autopilot software runs on processors outside the EyeQ3 unit.

Yes, the Tesla Autopilot software runs on a separate (generic) chip that is placed right next to the EyeQ3 unit.

If you saw Amnon Shashua's CES presentation, he says there are three pieces to autonomy: perception, planning, and maps.
Perception is provided by Mobileye (and other things); planning (and execution) is done by Tesla. Amnon himself said whether you need maps is debatable. AFAIK Tesla doesn't have/use maps (I mean Google-style centimeter-accuracy maps, not navigation maps).

Edit: corrected CEO to CES
 
Yes, the Tesla Autopilot software runs on a separate (generic) chip that is placed right next to the EyeQ3 unit.

If you saw Amnon Shashua's CES presentation, he says there are three pieces to autonomy: perception, planning, and maps.
Perception is provided by Mobileye (and other things); planning (and execution) is done by Tesla. Amnon himself said whether you need maps is debatable. AFAIK Tesla doesn't have/use maps (I mean Google-style centimeter-accuracy maps, not navigation maps).
So is this the same for other users of Mobileye hardware? Does Mercedes program their own application? Do they use Mobileye hardware?
 
So is this the same for other users of Mobileye hardware? Does Mercedes program their own application? Do they use Mobileye hardware?

I don't know how Mercedes actually implemented their solution.

Once again going back to that CES presentation, Mobileye's intention is to provide a fully autonomous solution as a package to large automakers. There is no way Musk will WAIT and use an off-the-shelf solution from a third party. There is too much at stake here. Musk's intention from the very beginning has been to use Mobileye as a component supplier. From the same presentation, I vividly remember Amnon mentioning that they intend to have a two-pronged solution. So when Mobileye declared that they won't be working with Tesla on a fully autonomous solution, it was no news at all. I was quite surprised by how surprised the market/media was.
 
As I understand it, Mobileye provides a complete solution. However, Tesla only uses the camera and Mobileye chip to detect and label items in view. The Tesla software then uses this to plan and execute the resulting actions.

We don't know exactly what Tesla implements. Summon and other features were/are described on Mobileye's website. It is likely that the current AP leverages heavily off of Mobileye's work. Tesla only got serious about AP recently, and software development typically can't be pushed ahead much more rapidly simply by adding resources.
 
So we already know about the delivery miss -- I think we are pricing in pretty poor financials due to it. It'll all be about in-transit cars and how Tesla plans to optimize their delivery process. We'll probably also get a better look at the run-rate for demand now.

In addition, I do expect a couple of things from the letter today:

- Delivery optimization
- Sales update (guidance on mix and word about how the 60 batteries are doing now)
- Gigafactory recap
- SolarCity deal recap
- Model 3 progress update (probably a quick note about pencils down)
 