Tesla, TSLA & the Investment World: the Perpetual Investors' Roundtable

Q1'2020 will be OK, I believe: China will ramp up, plus the $1,875 step down is half of the $3,500 step down that Q1'2019 suffered from. The U.S. economy will probably be in better shape as well.

TIL $1875 = 1/2 * $3500 :)

Sorry, I couldn’t resist :)

But in all seriousness, I believe the psychological difference is greater than the $1,875 difference: last Q4 people were rushing to avoid losing out on the $7,500 credit (even though it was only dropping to $3,750), whereas this Q4 one is only losing $1,875.
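For context, the US federal credit for Tesla buyers stepped down on the published IRS phase-out schedule; here is a quick summary of that schedule (the dates and amounts below are the known phase-out, not figures taken from the post):

```python
# US federal EV tax credit phase-out for Tesla buyers, after Tesla crossed
# 200,000 US deliveries in mid-2018 (published IRS schedule).
credit_by_period = {
    "deliveries through 2018-12-31": 7_500,
    "2019-01-01 to 2019-06-30": 3_750,
    "2019-07-01 to 2019-12-31": 1_875,
    "from 2020-01-01": 0,
}
for period, credit in credit_by_period.items():
    print(f"{period}: ${credit:,}")
```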
 
Interesting. European sales so far this quarter seem to be really slow. This somewhat contradicts the higher number of loading days @JustMe showed above. However, on the map it looks like there are now many ships on the way which will arrive at much shorter intervals than I had seen before, so maybe deliveries will peak at the end of the quarter even more than they used to.

Yes, indeed - this could be a bit of a bear trap. Tesla short-sellers are pointing to the low October delivery numbers as evidence of weakening demand in the face of new competition and falling subsidies. No amount of reasoning from the facts of the situation seems to sway their resolve. Those betting against Tesla may lose their shirts again after the next quarterly report.
 
Um, yeah, that's a steep jump ;)
Actually, it’s just a continuation of what happened in the last few weeks of Q3. Tesla set up a delivery center at the Amsterdam port/docks, and it can deliver several hundred cars per day. Now that the ships with fresh Model 3 stock have arrived, I expect to see this daily volume until the end of the quarter (last quarter had only about 3 weeks of deliveries in Amsterdam; now we will have 8 or 9 weeks).
 
How do you figure? The 2020 Dem. Nat'l convention is Jul 13-16. That's Q3.

This time, by Super Tuesday (Mar 3) we'll have a good idea. In fact, just IA & NH can have a big effect on the market.

[Chart: dem_nomination_080119.png]
 
Actually, it’s just a continuation of what happened in the last few weeks of Q3. Tesla set up a delivery center at the Amsterdam port/docks, and it can deliver several hundred cars per day. Now that the ships with fresh Model 3 stock have arrived, I expect to see this daily volume until the end of the quarter (last quarter had only about 3 weeks of deliveries in Amsterdam; now we will have 8 or 9 weeks).
Wowsers! That's ~12K cars to the Netherlands alone. :D
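A rough back-of-the-envelope check of that ~12K figure, assuming about 250 cars per day, roughly 8 weeks of deliveries, and 6 delivery days per week (all three inputs are assumptions loosely based on the "several hundred cars per day" and "8 or 9 weeks" in the quoted post):

```python
# Back-of-the-envelope check of the "~12K cars to the Netherlands" figure.
# Assumed inputs (not from the post): ~250 cars/day at the Amsterdam center,
# ~8 weeks of deliveries, ~6 delivery days per week.
cars_per_day = 250
weeks = 8
delivery_days_per_week = 6

total = cars_per_day * weeks * delivery_days_per_week
print(f"Estimated Netherlands deliveries this quarter: ~{total:,}")  # ~12,000
```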
 
Not only is your argument a logical fallacy, but there actually is one FSD competitor who is following Tesla's lead - Intel.

How did you come to the conclusion that Intel is following Tesla's camera-only approach? I might have misunderstood your comment.


Mobileye plans to deploy fully autonomous cars in 4 years

Level 5 vehicles — vehicles that can operate on any road and in any condition without a human driver — aren’t in the cards right now. The reason? Even the best systems on the market today sometimes struggle in severe weather like snowstorms and downpours, Shashua said, and Mobileye’s is no different.

“That’s why deployments are done in good weather, like in Phoenix,” he added.
“You need a two sensor-modality … [sensors] with resolutions that can work in snow, for example,” he explained. “One of the issues with current cameras is that in snow, you don’t see the edges of the road or landmarks.”


Shashua predicts that many of today’s autonomous driving challenges will be overcome within the next five to 10 years, with the advent of cheap radars and high-fidelity lidar.
Mobileye-equipped cars are becoming more adept at completing challenging road maneuvers. They’re now capable of handling unprotected left turns — a notorious trip-up for driverless cars — and lane changes in heavy congestion, as well as side passes, narrow lanes, and speed bumps.

“They’re able to do all of this in a very aggressive setting — in Jerusalem,” Shashua said.

That’s with cameras alone, mind you.
It’s not that Mobileye is opposed to integrating additional sensors — quite the contrary; EyeQ5 supports both radar and lidar. Instead, Shashua said that while the company’s focus is on vision, it’s committed to building redundant systems with radar and lidar in the first half of this year.


Navigating the Winding Road Toward Driverless Mobility | Intel Newsroom

Design an SDS with a backbone of a camera-centric configuration. Building a robust system that can drive solely based on cameras allows us to pinpoint the critical safety segments for which we truly need redundancy from radars and lidars.




Now let's look at 3D object detection with cameras only.

https://arxiv.org/pdf/1812.07179.pdf
The state-of-the-art stereo-camera solution in good weather (cameras at least a foot apart) claims:

On the popular KITTI benchmark, our approach achieves impressive improvements over the existing state-of-the-art in image-based performance — raising the detection accuracy of objects within the 30m range from the previous state-of-the-art of 22% to an unprecedented 74%.
while sub-$10k lidars achieve a 120m range.

Mono-camera solutions like Tesla's get their "two eyes" from motion along the direction of travel, so the effective baseline straight ahead is tiny. They therefore have good 3D object detection to the sides (they can reach the 30m range mentioned above), but their performance straight ahead is worse - meaning less than 30m.
(Additionally, mono cameras can't do 3D at all when stopped. A rough sketch of the baseline geometry is below.)
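To make the baseline/parallax argument concrete, here is a minimal sketch of the textbook depth-from-disparity relation, Z = f·B/d, and how depth error grows with distance. The focal length, disparity error, and the two baselines are illustrative assumptions, not values from the cited paper or from Tesla's hardware:

```python
# Why baseline matters for 3D detection range: depth from disparity is
# Z = f * B / d, so a disparity error delta_d maps to a depth error of
# roughly delta_Z ~ Z**2 * delta_d / (f * B). Error grows quadratically
# with distance and shrinks with a wider baseline.
# All numbers below are illustrative assumptions.

def depth_error(distance_m, baseline_m, focal_px=1000.0, disparity_err_px=0.5):
    """Approximate depth uncertainty (m) at a given distance."""
    return distance_m ** 2 * disparity_err_px / (focal_px * baseline_m)

for z in (10, 30, 60, 120):
    wide = depth_error(z, baseline_m=0.30)    # ~1 ft baseline, like the stereo rig
    narrow = depth_error(z, baseline_m=0.05)  # a few cm, closer to a single housing
    print(f"{z:4d} m: wide baseline +/-{wide:5.1f} m, narrow baseline +/-{narrow:6.1f} m")
```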

Based on this, I think current good-weather, camera-only solutions might be good up to 35mph, but not higher (you would need to increase the cameras' resolution and keep the glass clean). If one drives faster and the neural network fails to recognize a stationary object on the road 50m away, the car won't be able to stop in time (rough stopping-distance numbers below). Radars have a lot of false positives for stationary objects due to their coarse resolution, so they can't be used reliably for this.
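As a quick sanity check of the 35 mph / 50 m claim, here is the standard stopping-distance arithmetic; the 0.8 g deceleration and 1-second perception delay are my assumptions, not figures from the post:

```python
# Stopping distance vs. speed: reaction distance plus braking distance.
# Assumes ~0.8 g of braking and a 1 s perception/reaction delay (both are
# assumptions, not figures from the post).
G = 9.81  # m/s^2

def stopping_distance_m(speed_mph, decel_g=0.8, reaction_s=1.0):
    v = speed_mph * 0.44704  # mph -> m/s
    return v * reaction_s + v ** 2 / (2 * decel_g * G)

for mph in (35, 50, 65, 80):
    print(f"{mph} mph -> ~{stopping_distance_m(mph):.0f} m to stop")
# 35 mph needs roughly 31 m, which fits inside a ~30-35 m detection range;
# 50 mph already needs ~54 m, beyond a 50 m detection range.
```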
 
But does having lidar lead to a better system by whatever metrics you care about: safety, drive speed, accessible % of the world? Or does it lead to faster development time? Just because others are using it doesn't mean they think vision + radar can't work.


Lidar leads to the illusion of faster development time. It’s perfect if the goal is to convince investors you are “almost there”. The downside is that “almost there” is where they hit the wall.
 
I am perplexed why intelligent well-informed people still imagine lidar to be useful in complex visual environments.

We might be slightly better off were we to emphasize when lidar DOES work exceptionally well. Rather than list those cases, we have Elon describing the use of lidar at SpaceX to help in docking operations at the ISS. In geophysics there are some excellent examples:
What is Lidar and what is it used for?

As in most geophysical applications, the accuracy of results depends on distinguishing between ‘noise’ and the signal one wants to measure. I use this example because, like navigating a vehicle, geophysics presents extraneous information far denser than the useful information. In geophysics one can devote long processing times and huge computing capacity to distinguishing signals.

Navigation is inherently time-sensitive. Lidar cannot resolve extraneous information quickly. Even more important, lidar cannot penetrate anything at all other than air without generating huge reflections (by definition), so it will not operate effectively in polluted air, much less in snow, rain and slush.

Since lidar is used to excellent effect to map sea floors, it is quite worthwhile to distinguish between lidar's ability to map solids under liquid and the ability to do so rapidly and compactly.

People accustomed to using radar in airborne environments know how effective radar can be at expanding visual range while distinguishing between a wide variety of reflections.

I am no expert, but I do have experience using radar and have looked at several applications of lidar. That makes me think lidar can help on the factory floor, maybe even in warehousing. In driving, no chance!

People are very prone, all of us, to confirmation bias. Some continue to waste billions on lidar. When the weather is bad nothing will do a perfect job, but lidar is the least effective of the sensor lot in poor conditions.


I find it useful to imagine I am a bat. Bats see the world as a lidar system does. Every object, every position, but colourless. Would I rather drive my car as a human, with vision, or as a bat? And if I had vision, how useful would it be to have the additional capabilities of a bat? In a world with light, and headlights, bat capability is not required. Bats evolved for caves with zero light, which is not where we drive.

Edit: tldr, lidar aficionados are batty.
 
Now let's look at 3D object detection with cameras only.

https://arxiv.org/pdf/1812.07179.pdf
The state-of-the-art stereo-camera solution in good weather (cameras at least a foot apart) claims:

On the popular KITTI benchmark, our approach achieves impressive improvements over the existing state-of-the-art in image-based performance — raising the detection accuracy of objects within the 30m range from the previous state-of-the-art of 22% to an unprecedented 74%.

while sub-$10k lidars achieve a 120m range.

You are making the major mistake of trying to extrapolate Tesla's vision-network-based 3D object distance detection ability from academic papers.

In reality Tesla's neural networks are way ahead, which is obvious if you look at annotated videos of Autopilot object detection:


In that (older, HW2-based) video there are several examples of accurate tracking of cars 100-150m away - and consistent tracking of cars beyond 50m. The HW3 networks will likely be significantly better.

So your claim is simply false - not to mention that the angular resolution and field of view of $10k LIDAR units are much poorer than those of $50k+ Velodyne units.

So Tesla Autopilot object detection is getting better with time - while LIDAR units are getting significantly worse as makers try to reduce costs - and that doesn't even address the poor LIDAR performance in common driving scenarios that @KarenRei outlined (winter, rain).

There's also the point @MarcusMaximus made:

"Teslas have 3 forward-facing cameras, with different FOV’s simulating multiple vantage points, and correctly identify objects and distances >100m away."
You are mixing up 3D object detection with 2D neural network image recognition.

I'm not: the Autopilot video I cited shows that Autopilot is fully aware of the 3D location of detected cars. The position and distance labels are 3D coordinates in essence, and Autopilot uses this 3D object data to estimate relative velocities as well (a minimal sketch of that step is below).
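For what it's worth, once detections are tracked in 3D, estimating relative velocity is a small extra step. Here is a minimal finite-difference sketch, not a description of Tesla's actual pipeline; the frame rate and smoothing factor are assumptions:

```python
# Relative velocity from successive 3D object positions (finite differences
# with simple exponential smoothing). Illustrative only - not Tesla's pipeline.
import numpy as np

def relative_velocity(positions_m, fps=36.0, alpha=0.5):
    """Smoothed velocity estimate (m/s) from a per-frame 3D position track."""
    positions = np.asarray(positions_m, dtype=float)
    raw = np.diff(positions, axis=0) * fps  # per-frame displacement * frame rate
    v = raw[0]
    for step in raw[1:]:
        v = alpha * step + (1 - alpha) * v  # exponential smoothing
    return v

# Example: a lead car 50 m ahead, pulling away at ~2 m/s along the forward axis.
track = [(50.0 + 2.0 * t / 36.0, 0.0, 0.0) for t in range(10)]
print(relative_velocity(track))  # ~ [2. 0. 0.]
```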

(Anyway, futures markets open soon so this is getting very OT.)
 
MODERATOR:

Good news: It now is mid-day, and FC's post #98771 is THE LAST ONE in this thread concerning Lidar, etc., that will be permitted to remain. Although the disinterested observer will likely conclude that the topic is (far more than) exhaustively covered, any further points you think are somehow of some utility to the world as we know it will happily be welcomed in some other thread.

Bad news: as previously alluded to, at long last as of a few minutes ago Lord Vetinari now has REAL internet at REAL (>260mbps) speeds, so all o' youz are to proceed henceforth only on your very, very best behaviour. It's really not difficult: prior to writing or especially responding to anything, stop and think: But WWVD?

~~~Vetinari~~~
 
*Facepalm*

How do you expect Tesla to deliver cars before they actually get there and are unloaded? Europe was almost entirely drained of inventory at the end of last quarter.

Why do you have to be dismissive?

I didn't say anything that contradicts your points. Your own plot shows that the quarter so far was much slower than previous ones until the recent tick-up. I was initially a bit worried about that, I admit, and a drained delivery pipeline makes sense. I just hadn't been aware that it had been drained even more than in previous quarters; sorry I'm a little slow & thanks for enlightening me ;).
 
MODERATOR:

Good news: It now is mid-day, and FC's post #98771 is THE LAST ONE in this thread concerning Lidar, etc., that will be permitted to remain. Although the disinterested observer will likely conclude that the topic is (far more than) exhaustively covered, any further points you think are of some utility to the world as we know it will happily be welcomed in some other thread.

Bad news: as previously alluded to, at long last as of a few minutes ago Lord Vetinari now has REAL internet at REAL (>260mbps) speeds, so all o' youz are to proceed henceforth on your very, very best behaviour. It's really not difficult: prior to writing or especially responding to anything, stop and think: But WWVD?

~~~Vetinari~~~
So... Starlink?

(In which case we'll just wait for a hole in the constellation :))
 
My question is why NHTSA isn't investigating all gas cars. There seems to be a problem at the back end where the tailpipe leaks poison gas directly into the air, which people then breathe in and get sick and die. NHTSA is forcing EV makers to add a noise-creating device to protect pedestrians, yet it isn't doing anything to protect pedestrians from breathing in poison gas, which is obviously a much more serious risk than quiet EVs.
 
My question is why NHTSA isn't investigating all gas cars. There seems to be a problem at the back end where the tailpipe leaks poison gas directly into the air, which people then breathe in and get sick and die. NHTSA is forcing EV makers to add a noise-creating device to protect pedestrians, yet it isn't doing anything to protect pedestrians from breathing in poison gas, which is obviously a much more serious risk than quiet EVs.
Only the unintended exposures...

NHTSA Widens Investigation Into Ford Explorer Exhaust Leaks