Full Autonomy All But Confirmed for 3, "We're doing the obvious thing"

What do you think? Will Model 3 have full autonomy?

  • Most Definitely, It's Obvious!

    Votes: 56 24.7%
  • Possibly... Still Not Sure

    Votes: 76 33.5%
  • No Way, You'll have to drive that sucker yourself!

    Votes: 95 41.9%

  • Total voters
    227
I think Tesla has leapfrogged ahead of Google. With the shift to machine learning, the company with the most data wins... and Tesla is getting A LOT more data than Google, and accumulating more every day. Google's only shot now is to license their tech to car companies that already have sensor-equipped vehicles on the road, and Elon mentioned this is what he thinks they'll do...

Yes, this *may* be true.. but from what I've researched about Google, they're already nearly there. I'd say 90% baked. But yes, their main issue is they never took the leap into production like Tesla has.

I think they're more than happy to own and license the software end of it to ride-sharing companies like Lyft and Uber, but it will take quite a while for regulations to allow these cars to drive themselves around without a driver inside.

I'd be very curious to know if they have any plans to license directly to car manufacturers, along with their hardware... that would make more sense, since the driver would need to be present and alert, etc. I could see it now -

"The all-new 2018 fully electric Chevy Ohm, featuring Android Auto with voice activated control and Google Autopilot."
 
  • Like
Reactions: DrivingTheFuture
I'd be very curious to know if they have any plans to license directly to car manufacturers, along with their hardware... that would make more sense, since the driver would need to be present and alert, etc. I could see it now -

"The all-new 2018 fully electric Chevy Ohm, featuring Android Auto with voice activated control and Google Autopilot."
I'd bet anything that's exactly what Google will do. They're going to create a software platform, let's call it "Android Autopilot" and make it available to any car manufacturer. And it will either come with, or require, a specific set of sensors.

For Tesla, I think this year's announcement will be a much-upgraded highway autopilot, that recent S/X cars are already shipping with the hardware, and that the 3 will get the same features.

The hardware capable of full autonomy should arrive in 2018, per Musk's prediction and coinciding with the delivery of Mobileye's next-generation chip.
 
  • Like
Reactions: Siciliano
I think the obvious thing is: "It does what the Model S does".
That is beyond obvious: it's boring. No need to...
They've already announced this... why would they hold a second event to announce something they already announced??
Exactly. Elon won't hold an "event" unless there is something important to publicize. A significant step towards full autonomy -- but not full autonomy -- would be worth having an event for.
I think both the 3 and the S will have "near" full autonomy; it's simply a matter of the software being sufficiently baked within 2 years - hardware is the easy end of it.
Yes. It will likely be a major improvement in AP -- new hardware with software coming "soon" -- that will extend AP to streets with cross traffic, but as @Siciliano went on to say, the driver will still be fully responsible for the operation of the vehicle.

In addition, I think it will be announced that AP will follow the navigation route shown by the car once the driver enters a destination.

By the end of this year, it will be just over 2 years since the first version of AP was announced in Sep. 2014. It's clearly time for another big step forward.

Then in another 2 years, full autonomous driving capability if allowed by law.

I think AP will continue to be labeled as "beta" until the law covers it.
 
I doubt regulations/liability will allow level 3 in most places. So I think the announcement is the S/X getting new AP and a path to level 3.

The new AP may allow level 3 by map coordinates. I doubt this announcement is the Model 3 reveal part 2, but it does show the AP all cars will get.

The "obvious thing" to me is a new AP/major firmware release including a new UI, which keeps S/X sales strong going into the release of the Model 3.
 
  • Like
Reactions: DrivingTheFuture
Based on the NHTSA definition of Level 3, laws in several US states allow it, including California, and more are sure to follow. Also, many states have no laws restricting autonomous driving of any type.

Level 3 means a licensed driver is present in the vehicle and ready to take control at any time.

U.S. Department of Transportation Releases Policy on Automated Vehicle Development | National Highway Traffic Safety Administration (NHTSA)

Limited Self-Driving Automation (Level 3): Vehicles at this level of automation enable the driver to cede full control of all safety-critical functions under certain traffic or environmental conditions and in those conditions to rely heavily on the vehicle to monitor for changes in those conditions requiring transition back to driver control. The driver is expected to be available for occasional control, but with sufficiently comfortable transition time. The Google car is an example of limited self-driving automation.
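
For reference, a quick sketch of those 2013 NHTSA levels as a Python enum (the level names follow the policy quoted above; the helper function is just illustrative):

```python
from enum import IntEnum

class NHTSALevel(IntEnum):
    """Automation levels from NHTSA's 2013 preliminary policy statement."""
    NO_AUTOMATION = 0          # driver does everything
    FUNCTION_SPECIFIC = 1      # a single automated function (e.g., cruise control)
    COMBINED_FUNCTION = 2      # two or more functions combined (roughly today's AP)
    LIMITED_SELF_DRIVING = 3   # car drives itself in some conditions, driver must be ready to take over
    FULL_SELF_DRIVING = 4      # car handles the entire trip on its own

def driver_must_stay_available(level: NHTSALevel) -> bool:
    """At Level 3 and below, a licensed driver still has to be ready to take control."""
    return level <= NHTSALevel.LIMITED_SELF_DRIVING

print(driver_must_stay_available(NHTSALevel.LIMITED_SELF_DRIVING))  # True
print(driver_must_stay_available(NHTSALevel.FULL_SELF_DRIVING))     # False
```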
 
  • Like
Reactions: DrivingTheFuture
The chance of full autonomy in this timeframe is just about zero. However, there are a large number of hugely significant steps between where autopilot is now and full autonomy. A big one would be the "crash proof" technology like BMW demonstrated (no matter what the driver does, the car doesn't allow itself to be crashed).
 
  • Like
Reactions: grommet
The chance of full autonomy in this timeframe is just about zero. However, there are a large number of hugely significant steps between where autopilot is now and full autonomy. A big one would be the "crash proof" technology like BMW demonstrated (no matter what the driver does, the car doesn't allow itself to be crashed).

Any idea how it responds to bad vs worse situations? For example, it might be better to steer off road in order to avoid a moose - will it allow it?
 
  • Like
Reactions: Shawn Snider
Any idea how it responds to bad vs worse situations? For example, it might be better to steer off road in order to avoid a moose - will it allow it?

In such an instance, yes, I think it will prioritize avoiding the collision even if it means venturing off the road.

The really interesting situation, of course, is when the system knows in advance that it is going to collide with either A or B and is in a position to make a choice (just keeping going is also a choice), and both A and B are identified with high probability as pedestrians.
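
Just to make that dilemma concrete, here's a toy sketch of how a planner might score the options (the maneuvers, probabilities, and severity weights are entirely made up):

```python
# Toy example only: score each available maneuver by expected harm and pick the lowest.
# The maneuvers, probabilities, and severity weights are all made up for illustration.
maneuvers = {
    "keep_going":     {"p_collision": 0.95, "severity": 1.0},  # likely hits pedestrian A
    "swerve_left":    {"p_collision": 0.90, "severity": 1.0},  # likely hits pedestrian B
    "swerve_offroad": {"p_collision": 0.30, "severity": 0.4},  # risk mostly to the occupants
}

def expected_harm(option):
    return option["p_collision"] * option["severity"]

best = min(maneuvers, key=lambda name: expected_harm(maneuvers[name]))
print(best)  # -> "swerve_offroad" with these made-up numbers
```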
 
What is the likelihood that Tesla builds the Model 3 with Mobileye's next-gen chip and sensors BEFORE they master autonomous driving?

Most people agree it's very unlikely they will have level 4 by the time the 3 rolls out, but the hardware that's expected to be able to handle it has already been announced. Smart engineers will build with the best hardware they can and ramp up the software to match it as time goes on.

Am I wrong in assuming they already know which sensors they need, and it's more a journey to optimize the software in the car to detect & determine correct actions?
 
Yes, this *may* be true.. but from what I've researched about Google, they're already nearly there. I'd say 90% baked. But yes, their main issue is they never took the leap into production like Tesla has.
That, and from my understanding, Google is banking on their 3D mapping. It would be fantastic if you used the future Google product in the Bay Area; not so much in Nebraska. Tesla's system, instead, is going toward the computer being able to control the car even if it's never seen that specific road before.
 
I think Google and Tesla are approaching AP from opposite ends.
One is "think in the moment" (Tesla), the other is "I already know all of it, so I know what to do." Of course those are the extremes of the concept, but the point is: if we draw a line with "think in the moment" on the left and "use the database" on the right, Tesla starts on the left and is moving right, while Google starts on the right and is moving left.
That is the better way for each of them: Tesla gets a system that keeps improving but is already usable, while Google has no need to rush and has a huge platform for storing and analyzing the data.
In the end, Google's is the better approach if you want fully autonomous driving, but for now Tesla simply wants to keep getting better and better until the Google approach is ready and well solved, so the two can be merged.
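
Very roughly, something like this toy sketch (the function, weights, and numbers are all invented, just to illustrate the "think in the moment" vs. "use the database" blend):

```python
def plan_lane(live_estimate, map_prior, map_weight):
    """Blend a live perception estimate with a stored map prior.

    map_weight = 0.0 -> pure "think in the moment" (the Tesla end of the line)
    map_weight = 1.0 -> pure "use the database"    (the Google end of the line)
    Inputs are hypothetical lane-center offsets in meters.
    """
    if map_prior is None:  # a road the database has never seen: fall back to live perception
        return live_estimate
    return map_weight * map_prior + (1.0 - map_weight) * live_estimate

# Mapped Bay Area street vs. an unmapped Nebraska road:
print(plan_lane(live_estimate=0.20, map_prior=0.05, map_weight=0.8))  # leans on the map
print(plan_lane(live_estimate=0.20, map_prior=None, map_weight=0.8))  # map unavailable, live only
```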

So, on topic: I think they know by now what hardware is needed, and they're reasonably confident about it, but I don't think it can be definitive. I mean, when the law is ready, maybe they'll want an additional CPU/sensor/something, perhaps standard across all cars as a safeguard to let the car drive itself, and of course they can't know that now because the law isn't set. So for an AP that requires you to stay alert and take responsibility, surely they know what to do; for the rest...
And of course we're talking about hardware; the software could take another couple of years. Who cares? The most important thing is that we have the final hardware and don't end up at "ah, we're sorry, but you can't have AP v2.3 since you don't have that sensor; if you want it, it's $4,000." That would make me very, very sad.
 
The chance of full autonomy in this timeframe is just about zero. However, there are a large number of hugely significant steps between where autopilot is now and full autonomy. A big one would be the "crash proof" technology like BMW demonstrated (no matter what the driver does, the car doesn't allow itself to be crashed).
I was thinking of the only drawback to this capability and autopilot the other day... what if you are being chased by ill-intentioned criminals, or there's an emergency situation, and in either case the only escape route is by driving through a fence or barricade... and instead of plowing through said obstacle, the emergency braking safety feature keeps slamming on the brakes every time you try to drive forward, haha. I hope they install an emergency off switch for those rare instances...
 
I was thinking of the only drawback to this capability and autopilot the other day... what if you are being chased by ill-intentioned criminals, or there's an emergency situation, and in either case the only escape route is by driving through a fence or barricade... and instead of plowing through said obstacle, the emergency braking safety feature keeps slamming on the brakes every time you try to drive forward, haha. I hope they install an emergency off switch for those rare instances...
It supposedly disengages any automatic braking when you hit the gas.
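
If that's how it works, the gate would look something like this toy sketch (the threshold and signal names are my assumptions, not anything Tesla has published):

```python
def should_auto_brake(collision_imminent: bool, accel_pedal_pct: float,
                      override_threshold: float = 80.0) -> bool:
    """Toy automatic-emergency-braking gate.

    Assumption for illustration only: a hard accelerator press (above
    override_threshold percent) is treated as a deliberate driver override
    and suppresses automatic braking, as the post above describes.
    """
    if not collision_imminent:
        return False
    if accel_pedal_pct >= override_threshold:
        return False  # driver is flooring it on purpose (e.g., escaping through a barricade)
    return True

print(should_auto_brake(True, 10.0))   # True  -> the car brakes for the obstacle
print(should_auto_brake(True, 100.0))  # False -> the driver override wins
```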
 
What is the likelihood that Tesla builds the Model 3 with Mobileye's next-gen chip and sensors BEFORE they master autonomous driving?
The Mobileye EyeQ4 is expected to be released in volume production in 2018.

Mobileye said it secured the first design win for EyeQ4 from a European car manufacturer, with production to start in early 2018. The EyeQ4 is expected to be available in volume in 2018.

So, unfortunately, this won't be available for start-of-production for the Model 3.

I think, depending on cost of course, that Tesla would ship the hardware earlier than the software -- but I still wouldn't expect to see the hardware for full-autonomy until the 2018 timeframe.

But, I do think we'll get our peek at Autopilot 2.0 at the end of this year and that it'll be shipping in the S/X before then.
 
It supposedly disengages any automatic braking when you hit the gas.

Braking is seen as a defensive act, but accelerating (hitting the gas) as an offensive manoeuvre. It's then common for humans, in this case the Autopilot engineers, to program things so that the offensive act overrides the defensive one. This is probably a form of cognitive slippage on the engineers' part. I'm quite certain that real-world data will show there's no inherent reason to allow the human to override the autonomous safety system - it seems to me that if/when the artificial system consistently outperforms the human, it would be dumb to let the human override it.