
Full Autonomy All But Confirmed for 3, "We're doing the obvious thing"

What do you think? Will Model 3 have full autonomy?

  • Most Definitely, It's Obvious!: 56 votes (24.7%)
  • Possibly... Still Not Sure: 76 votes (33.5%)
  • No Way, You'll have to drive that sucker yourself!: 95 votes (41.9%)

  Total voters: 227
That's not something I'd ever even considered with autonomous driving. So how would the system respond to that object in the road? That's the next question, especially if there's a vehicle ahead obstructing the car's view of the object until it's much closer than 50 m away.
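Here's a back-of-the-envelope way to frame that 50 m question: how much road does a car need to come to a complete stop from various speeds? The 50 m detection range comes from the discussion above; the 1.0 s reaction delay and ~6 m/s² braking figure are just assumptions for the sketch, not Tesla or MobilEye numbers.

```python
# Rough check: is ~50 m of forward visibility enough to stop in time?
# Assumptions (not from any Tesla/MobilEye spec): 1.0 s total reaction
# delay and ~6 m/s^2 braking deceleration on dry pavement.

REACTION_TIME_S = 1.0      # assumed perception + actuation delay
DECEL_MPS2 = 6.0           # assumed hard-but-controlled braking
DETECTION_RANGE_M = 50.0   # figure mentioned in the discussion above

def stopping_distance_m(speed_kph: float) -> float:
    """Distance covered during the reaction delay plus braking to a stop."""
    v = speed_kph / 3.6  # km/h -> m/s
    return v * REACTION_TIME_S + v ** 2 / (2 * DECEL_MPS2)

for kph in (50, 80, 100, 120):
    d = stopping_distance_m(kph)
    verdict = "stops in time" if d <= DETECTION_RANGE_M else "cannot stop within 50 m"
    print(f"{kph:>3} km/h: needs {d:5.1f} m -> {verdict}")
```

At city speeds a 50 m horizon is comfortable; at highway speeds the system, like a human driver, has to count on the car ahead braking too, or on steering around the object rather than stopping for it.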
 
I'm citing an interview with the CEO saying there were two using 5 EyeQ3 chips in late 2015. I posted the video in one of these threads.
What the CTO is saying in that talk is that there's going to be a production vehicle using 3 EyeQ3 chips to do 3D vehicle detection; he's not saying it's the first-ever 3-chip vehicle.

I searched your previous posts and found the video:


Great info and another MASSIVE piece of the puzzle. At 5:15 he indicates that Tesla is CURRENTLY using 5 EyeQ3s with 8 cameras, a single radar and 4 ultrasonics. That matches up perfectly with Amnon's presentation from CES posted by favo in relation to "full vision".


That would mean that all current Teslas already have everything necessary for full autopilot; all they would need to do is switch out the processor. That's something Tesla could offer for a relatively low fee in the future.
 
Would you please repost the video from 2015 where he mentions the use of 5 EyeQ3s? I think that video would be very informative, and I would really like to see it.
I can't find a video of MobilEye CEO Ziv Aviram's presentation at Citi’s 2015 Global Technology Conference (9/8/2015), but here's an Electrek article covering it that quotes him.
Ziv Aviram said:
Today we are already preparing with one of the OEM, a first vehicle based on 8 cameras, one radar and ultrasonic around the vehicle. So this is much wider implementation of the first introduction of semi-autonomous driving and the trifocal is going to be here as we planned, but additional 4 cameras around the vehicle and one camera looking back. The system will run on 5 EyeQ3 chips and all of them will be connected.
Another pertinent article on Electrek from 3/29/2016: Elon Musk reportedly visited Mobileye to test tech for next gen Tesla Autopilot
 
That would mean that all current Teslas already have everything necessary for full autopilot; all they would need to do is switch out the processor. That's something Tesla could offer for a relatively low fee in the future.
The Tesla Model S & Model X Autopilot hardware sensors haven't been updated since they were first introduced. Sorry. A single front camera, a front-facing radar, and ultrasonic sensors around the car (limited to about 16 feet). This is Autopilot 1.0, and it's nowhere near what's needed for full autonomy or significantly better semi-autonomy. Stay tuned for Autopilot 2.0 later this year, hopefully.
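Purely as a recap of the two configurations being discussed in this thread (the numbers are the ones quoted in these posts and in the Aviram quote above, not official Tesla specs), here's a quick side-by-side:

```python
# Side-by-side of the two sensor suites described in this thread.
# Counts come from the posts/quotes above, not from official spec sheets.

autopilot_1_0 = {
    "cameras": "1 front camera",
    "radar": "1 front-facing radar",
    "ultrasonics": "around the car, roughly 16 ft range",
    "vision_processing": "single MobilEye EyeQ3 (as generally reported)",
}

described_8_camera_suite = {   # per the Ziv Aviram quote / CES slide above
    "cameras": "8 (trifocal front + 4 surround + 1 rear-looking)",
    "radar": "1",
    "ultrasonics": "around the vehicle",
    "vision_processing": "5 networked MobilEye EyeQ3 chips",
}

for name, suite in (("Autopilot 1.0", autopilot_1_0),
                    ("Described 8-camera suite", described_8_camera_suite)):
    print(name)
    for part, detail in suite.items():
        print(f"  {part}: {detail}")
```

The gap between those two columns is the point of the post above: the current cars simply don't carry the surround cameras or the extra EyeQ3s, so a processor swap alone wouldn't get them there.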
 
Personally, I think you are giving us humans too much credit when it comes to morals. Look around these days. We are not exactly the most moral crowd, especially when it comes to driving.

You are only saying that because it is so ubiquitous as to be invisible. In the US, 100 million of us get behind the controls of a deadly weapon every day. We interact, in often stressful ways, with thousands of complete strangers who are not in our family or tribe. A mere moment's unconcern for all those fellow humans would result in an accident. For all that, surprisingly few accidents occur: around 20 deaths per billion miles traveled (and that number has been dropping for the entire time we have had cars; if you think things are getting worse, that is perception, not reality). The number that involve actual malice is probably much lower. I can't remember the last murder by car I read about. That really is quite a remarkable feat. We are moral to a degree that is basically inexplicable.

Thank you kindly.
 
In the latest shareholder meeting, Elon talked about not wanting to shove too much new technology into the 3 like they did with the X. At the same time, he talked about putting new technology in the S and X lines first (largely due to the high cost of first iterations, though probably also so it can be tested on a smaller audience first, though they likely won't say that in public).

So... if there is ever to be autonomous driving, I would expect to see it in the S and/or X first. And then maybe it will make its way to the 3 a year or two after that. In other words, no autonomous driving for the 3. At least, not at first.

You can read just about anything into the "doing the obvious thing" comment, as what Elon finds obvious, many others will not. I might think the obvious thing would be to drop the electric drivetrain for the 3 and use a gas engine. Obviously, I don't really think that. Or do I? You don't know!
 
In the latest shareholder meeting, Elon talked about not wanting to shove too much new technology into the 3 like they did with the X. At the same time, he talked about putting new technology in the S and X lines first (largely due to the high cost of first iterations, though probably also so it can be tested on a smaller audience first, though they likely won't say that in public).

So... if there is ever to be autonomous driving, I would expect to see it in the S and/or X first. And then maybe it will make its way to the 3 a year or two after that. In other words, no autonomous driving for the 3. At least, not at first.

You can read just about anything into the "doing the obvious thing" comment, as what Elon finds obvious, many others will not. I might think the obvious thing would be to drop the electric drivetrain for the 3 and use a gas engine. Obviously, I don't really think that. Or do I? You don't know!

First, the actual cost of autopilot hardware is really low if you aren't doing lidar. The MobilEye chips are under $50 apiece and camera modules are a few dollars at most. Second, companies left and right are already starting to make commercial deals for developing autonomous vehicles. I'd say we'll see an advanced version of Autopilot at the Model 3 part 2 reveal late this year. Shortly after that we'll see the hardware on the Model S and X, then around ~8 months later Model 3 deliveries may start with the new hardware, and the promise that the Model S and X would get the tech first would be kept.
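Taking the post's own figures at face value (the chip and camera prices are the poster's claims, and the radar price below is my own placeholder, not a supplier quote), the sensing bill of materials for the 8-camera, 5-chip suite discussed earlier stays in the hundreds of dollars:

```python
# Rough bill-of-materials sketch using the figures claimed in this thread.
# Unit prices are assumptions/claims from the posts, not sourced quotes.

EYEQ3_UNIT_USD = 50     # "under $50 apiece" per the post above
CAMERA_UNIT_USD = 5     # "a few dollars at most"
RADAR_UNIT_USD = 100    # placeholder assumption; automotive radars vary widely

N_EYEQ3 = 5             # per the Ziv Aviram quote earlier in the thread
N_CAMERAS = 8
N_RADAR = 1

total_usd = (N_EYEQ3 * EYEQ3_UNIT_USD
             + N_CAMERAS * CAMERA_UNIT_USD
             + N_RADAR * RADAR_UNIT_USD)
print(f"Estimated sensing BOM: about ${total_usd}")  # roughly $390 under these assumptions
```

Even doubling that for wiring, mounts, and integration keeps the hardware cost far below a lidar rig, which is why offering the upgrade at a relatively low price seems plausible.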
 
So... if there is ever to be autonomous driving, I would expect to see it in the S and/or X first. And then maybe it will make its way to the 3 a year or two after that. In other words, no autonomous driving for the 3. At least, not at first.

There are two parts to autonomous driving: hardware and software. The hardware exists on the S and X, and will be put on the ≡. Will it be enough? No one knows. If Tesla figures out what IS enough in the next 18 months or so, will they put it on the S and X? Yes, as soon as possible. Will they wait to add it to the ≡? No reason they would; that just increases their service load later to no good purpose. Once the hardware is complete, the software will be incrementally improved. Again, the S and X might get it slightly sooner, but there is some liability in leaving less functional safety software in cars when you could just update them.

Thank you kindly.
 
Really? How many times this week did you put poison in someone's coffee? No survival instinct or repair cost there.
There's also nothing to gain but a lot to lose. There's a clear risk/reward trade-off.

I can't remember the last murder by car I read about.

This is also kind of interesting. I suggest listening to this Freakonomics episode for a slant on the subject.
 
I think our survival instinct plays a larger role than morality in explaining the relatively low accident rate. That, and the cost of repairs.
And what it can do to insurance rates in the future.

But - I have to say - when people start doing dangerous things very often, they get sort of numb to it. They start getting careless. Just look at all the texting and other stuff people do in cars - even people who are otherwise risk-averse.
 
Really? How many times this week did you put poison in someone's coffee? No survival instinct or repair cost there.
Hmmm... I don't follow what that has to do with the point I made, which was that a major reason human drivers avoid hitting other cars is that they don't want to suffer injury or death. Also, being involved in an accident incurs financial costs and inconvenience. The moral reason for avoiding accidents -- not wanting to injure or kill another human -- is of course also a factor.

Bottom line is, almost everyone wants to avoid pain. Avoiding car accidents therefore seems like a good idea.
 
Hmmm... I don't follow what that has to do with the point I made,

You said that survival instinct and repair cost played more of a role than morality. But given a situation without those elements, we don't see immoral behavior. So perhaps morality is the major influence, unpopular as it might be to think of people as moral.

Thank you kindly.
 
You said that survival instinct and repair cost played more of a role than morality. But given a situation without those elements, we don't see immoral behavior. So perhaps morality is the major influence, unpopular as it might be to think of people as moral.
In "a situation" like driving on roads at speed with other cars, which is what we are talking about in this thread, the survival instinct is always present whether drivers consciously acknowledge it or not. That is the primary motivator that keeps us from randomly crashing into other vehicles: we don't want to get hurt.
 
In "a situation" like driving on roads at speed with other cars, which is what we are talking about in this thread, the survival instinct is always present whether drivers consciously acknowledge it or not. That is the primary motivator that keeps us from randomly crashing into other vehicles: we don't want to get hurt.

Both are true. Will people try to avoid hitting a dog in the road? The damage to the car would be minimal. Some won't, but most would.

However, I would argue that decisions about proper behavior in cars are driven more by ethics than morals, and ethics is definitely something that can be programmed into a computer, probably with better results than the split-second, mostly instinct-driven decisions made just before a crash (yes, it is better to damage the car than to hit a person, but not to swerve wildly and flip the car, killing yourself as well as the person in question).
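As a minimal sketch of how that kind of ethical preference could be made explicit in software rather than left to instinct: rank candidate maneuvers by a cost order in which harming a person, including flipping the car onto its own occupants, always outweighs damaging property. The maneuver names and cost numbers below are entirely hypothetical, just to show the shape of such a rule, not anything from a real Autopilot planner.

```python
# Toy illustration of an explicit, auditable preference order:
# harming people costs vastly more than damaging the car, and a maneuver
# likely to flip the vehicle counts as harming people (the occupants).
# All names and values are hypothetical, not from any real system.

COST_HARM_PERSON = 1_000_000
COST_DAMAGE_CAR = 1_000

candidate_maneuvers = [
    {"name": "brake hard and hit the obstacle",   "harms_person": False, "damages_car": True},
    {"name": "swerve wildly and risk a rollover", "harms_person": True,  "damages_car": True},
    {"name": "swerve toward the sidewalk",        "harms_person": True,  "damages_car": False},
]

def maneuver_cost(m: dict) -> int:
    """People first, property second; lower cost is better."""
    return (COST_HARM_PERSON * m["harms_person"]
            + COST_DAMAGE_CAR * m["damages_car"])

best = min(candidate_maneuvers, key=maneuver_cost)
print("Chosen maneuver:", best["name"])   # -> brake hard and hit the obstacle
```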
 