Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Neural Networks

Quick! You better inform Mobileye of this.

Autonomous Driving - Mobileye

I didn't see anything there that disagreed with anything I said. Just to be clear, when I say "driver monitoring" I'm not talking about the Model 3's driver-monitoring camera. I'm talking about the driver being required to monitor the environment and babysit the system.

Mobileye's EyeQ4 potentially supports up to Level 5 self-driving.

Their chips are currently used in the Delphi/Aptiv system.


There is also Nissan's full-speed L3 highway system for Japan, which, according to Amnon Shashua, has been delayed to early 2019.
 
I have no empirical evidence, this is just my opinion, but I think FSD is about a year away, maybe two. I believe Tesla will indeed demonstrate the cross-country drive by the end of this year and will make it available fleet-wide by mid-2019. Legislative limitations may certainly restrict the use of that technology, but I think the technology will be there. I believe LIDAR-based systems will be limited to specific areas or routes that can be more easily mapped; given their inability to read road signs and otherwise interpret changes in the driving environment, I don't see them being used as a universal self-driving platform.

Again, just my opinion, but I think the image-recognition route will be more accurate and sustainable in the long run. LIDAR systems will be the first deployed, as has recently been demonstrated, but I think their limitations will ultimately be their downfall.

As with all things "self driving" time will tell.

Dan


As I have said before, there is a misconception that other self-driving companies use lidar-only systems and that Tesla is the only one using cameras. This is completely far from the truth. I know it's not your fault; the blame lies squarely on Elon and his continued trash talking. I just hope people will actually analyze his claims before accepting them going forward.

But it's a fact that most SDC companies have a more mature camera system and vision software than Tesla. Tesla has only just started building the foundations of its vision system.

The biggest misconceptions: 1) only Tesla uses deep-learning neural networks; 2) only Tesla uses cameras.

At this point it seems like I'm simply repeating myself.

GM Cruise's Computer Vision Lead Engineer gave one of the best self-driving talks at Stanford University.
I almost want to call it THE BEST self-driving presentation.
I think everyone participating in this thread should watch the entire talk.
The talk starts at 3:00; the computer vision / machine learning part starts at 10:00.

 
Hi all,

With the latest 10.4 update, does anyone have a sense for how much of the "capacity" of the system is being consumed?

E.g. Code size for the NN, data size, processing power

I suspect clock cycles would be a poor metric for this kind of processor, but some sense of how much idle processing power is left...?
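As the question suggests, raw clock cycles say little on their own; a more useful framing is how many operations per second the network demands versus what the hardware can sustain. Here's a back-of-envelope sketch of that kind of estimate. All numbers (layer sizes, frame rate, hardware throughput) are invented placeholders for illustration, not actual Tesla HW2 figures:

```python
# Rough utilization estimate for a vision NN on fixed hardware.
# Every number below is a hypothetical placeholder, not a real Tesla figure.

def conv_macs(h, w, c_in, c_out, k):
    """Multiply-accumulate count for one conv layer over an h x w feature map."""
    return h * w * c_in * c_out * k * k

# Hypothetical small camera network: three conv layers on a downscaled frame
layers = [
    conv_macs(416, 640, 3, 32, 3),
    conv_macs(208, 320, 32, 64, 3),
    conv_macs(104, 160, 64, 128, 3),
]
macs_per_frame = sum(layers)
flops_per_frame = 2 * macs_per_frame  # 1 MAC = 2 floating-point ops

fps = 30          # frames per second per camera (assumed)
cameras = 2       # cameras currently processed, per the thread
hw_flops = 1e12   # assumed sustained FLOP/s of the compute platform

utilization = flops_per_frame * fps * cameras / hw_flops
print(f"~{utilization:.1%} of the assumed compute budget")
```

The point of the exercise: utilization depends on network size, frame rate, and camera count together, which is why "idle processing power" is a more meaningful question than clock cycles.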
 
As I have said before, there is a misconception that other self-driving companies use lidar-only systems and that Tesla is the only one using cameras. This is completely far from the truth. I know it's not your fault; the blame lies squarely on Elon and his continued trash talking. I just hope people will actually analyze his claims before accepting them going forward.

But it's a fact that most SDC companies have a more mature camera system and vision software than Tesla. Tesla has only just started building the foundations of its vision system.

The biggest misconceptions: 1) only Tesla uses deep-learning neural networks; 2) only Tesla uses cameras.

At this point it seems like I'm simply repeating myself.

GM Cruise's Computer Vision Lead Engineer gave one of the best self-driving talks at Stanford University.
I almost want to call it THE BEST self-driving presentation.
I think everyone participating in this thread should watch the entire talk.
The talk starts at 3:00; the computer vision / machine learning part starts at 10:00.

Like I said...time will tell.

Maybe the more pertinent question is whether the cost of LIDAR-based systems will come down fast enough to make them a viable option in the time it takes Tesla to master its system. I don't think many people will be willing to pay $50-100K extra for a Chevy Bolt or equivalent. Being late to the party will certainly be a deciding factor when the time comes.

Dan
 
Hi all,

With the latest 10.4 update, does anyone have a sense for how much of the "capacity" of the system is being consumed?

E.g. Code size for the NN, data size, processing power

I suspect clock cycles would be a poor metric for this kind of processor, but some sense of how much idle processing power is left...?
Considering that, as far as I know, it is still using just 2 of its 8 cameras, that alone would lead me to believe the system is only at about 25% of its capacity.

Dan
 
Considering that, as far as I know, it is still using just 2 of its 8 cameras, that alone would lead me to believe the system is only at about 25% of its capacity.

Dan

I don't think that conclusion necessarily follows from the premise.

For example, it may very well be that the core image-processing libraries take 30% of the code space, while an additional processing element (like rain sensing) adds only 3%.
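In other words, load doesn't scale linearly with camera count: there is a fixed shared cost plus a smaller marginal cost per camera. A toy model makes the point; the 30%/3% figures below are just the illustrative numbers from the example above, not measurements:

```python
# Toy model of why "2 of 8 cameras" does not imply "25% of capacity":
# a fixed shared cost plus a small marginal per-camera cost. Numbers invented.

fixed_cost = 0.30        # e.g. core image-processing libraries / shared backbone
per_camera_cost = 0.03   # e.g. one additional processing element per camera

def load(cameras):
    """Fraction of total capacity consumed for a given camera count."""
    return fixed_cost + per_camera_cost * cameras

print(f"2 cameras: {load(2):.0%}")  # 36% under this model
print(f"8 cameras: {load(8):.0%}")  # 54%, nowhere near 4x the 2-camera load
```

Under this (made-up) model, going from 2 to 8 cameras adds only 18 percentage points of load, so the headroom question can't be answered from camera count alone.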
 
Like I said...time will tell.

Maybe the more pertinent question is whether the cost of LIDAR-based systems will come down fast enough to make them a viable option in the time it takes Tesla to master its system. I don't think many people will be willing to pay $50-100K extra for a Chevy Bolt or equivalent. Being late to the party will certainly be a deciding factor when the time comes.

Dan

Watch this short clip

 
I don't think that conclusion necessarily follows from the premise.

For example, it may very well be that the core image-processing libraries take 30% of the code space, while an additional processing element (like rain sensing) adds only 3%.

Yes, @BigD0g or @verygreen or perhaps @jimmy_d can answer whether the system is taxed at present, but it's almost certain the current hardware is insufficient for actual FSD.
 
Watch this short clip


Although I believe it's too early to tell which technological approach will ultimately succeed, I strongly believe that nature has spent millions of years testing what works best with the least effort, and the verdict is overwhelming.

Have you ever tried to count how many animals use vision and how many use radar?

This almost convinces me that radar may have advantages in some very special situations, but in the vast majority of situations we find on roads today, vision has a much better results-to-effort ratio and therefore may very well win out.
 
Watch this short clip

Great! I'll look forward to seeing it.

Not being sarcastic at all, I really do look forward to seeing the technology provide inexpensive, accurate, safe systems that don't ruin the look of the car. I will have no problem being proven wrong. For now though, I still believe that what Tesla is doing will provide all of those things sooner than their competitors.

Dan
 
There are threads about HW2 cars suddenly getting much better at EAP with the latest software downloads. There are also reports of large downloads of software and files related to self-driving. Side cameras are calibrating. All this points to something major coming soon. If this is foreshadowing an FSD rollout, it will be interesting to see how Tesla handles it. They tend to trust their customers much more than most companies do. AP was rolled out in a very beta state and no one was killed. I'm betting they do the same thing shortly with FSD. Even my HW1 car has gotten much smarter lately. I think someone at Tesla has solved a big problem.
 
I am really happy about this update, which, based on the 2016 marketing, is only 16-plus months late. It's still not the full EAP; I hope that comes soon. So there is no reason to gloat: these 16 months brought a lot of frustration for a lot of owners. We also paid double for an inferior system.

My speculation is that they needed three complete rebuilds. The first attempt was in production until summer 2017, when "silky smooth" replaced it, based on iteration II. At that point my car ping-ponged less on straight roads but developed a lot of issues with swerving over hills and not recognizing curbs. Another half year later, we have iteration III, 18.10.4. Finally...

Did some more local-road driving today, but it's still far from FSD. It did not manage this road, which of course is a huge challenge.
[Image: upload_2018-3-18_21-32-4.png]


On the other hand, it did manage to recognize how the lane widened past the bus stop, then steered away from the roundabout and found the correct lane (not the bus lane) on the other side. See the pictures below. That was pretty impressive, even if the turn was pretty abrupt; a perfect ride would have reduced the speed a bit more. So hope is high now for summer 2018. I'll try to grab a video on a day when traffic is scarce again.
[Image: upload_2018-3-18_21-36-29.png]

[Image: upload_2018-3-18_21-37-45.png]
 
Please tell us more about what happened at 15 seconds. You mean a new feature was released?
I think that's where the speeding Mercedes (aka "AssMaster 2000", per the credits) hauled ass by, and @buttershrimp's car sensed it and alerted him to the potential threat, right as he was signaling to change lanes (basically quite similar to what the intro Tesla promo cartoon video shows happening).
 
I think that's where the speeding Mercedes (aka "AssMaster 2000", per the credits) hauled ass by, and @buttershrimp's car sensed it and alerted him to the potential threat, right as he was signaling to change lanes (basically quite similar to what the intro Tesla promo cartoon video shows happening).
I thought shrimpy was signaling left? The Merc came in hot, but to see that you'd need a rear corner radar. I can't really tell from the IC what it warned about.
 
I thought shrimpy was signaling left? The Merc came in hot, but to see that you'd need a rear corner radar. I can't really tell from the IC what it warned about.
I think @buttershrimp was in fact signaling left, just coincidentally, right before the AM2000 came by. I don't think the alert particularly changed his decision-making process, but it perhaps accelerated it a little.
 
Tried 10.4 on roundabouts last evening.

I'm afraid it's going to need 10x the processing power and better decision-making than humans to successfully navigate one of those. :(

With that said, I was amazed at how well it did in typical non-roundabout driving.