Ford CEO says FSD is hard and applications will be "narrow"


diplomat33

Average guy who loves autonomous vehicles
"Ford Chief Executive Jim Hackett recently admitted that the company has “overestimated” the arrival of full self-driving vehicles. While speaking at the Detroit Economic Club on Tuesday, the Ford CEO noted that while the company’s first autonomous car is still coming in 2021, the applications of the vehicle’s self-driving technology will be limited.

“We overestimated the arrival of autonomous vehicles. Its applications will be narrow, what we call geo-fenced, because the problem is so complex,” the Ford executive said."

Ford is realizing that the self-driving car market is not as simple as it thought
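For anyone wondering what "geo-fenced" means in practice: the car only offers its autonomous mode inside a pre-mapped service area and hands control back (or refuses to engage) outside of it. Below is a toy sketch in Python of what such a check might look like; the coordinates, the function name, and the idea that a single polygon test is all that's involved are my own simplifications for illustration, not anything Ford has published.

def inside_geofence(lat, lon, polygon):
    """Ray-casting point-in-polygon test over a list of (lat, lon) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        # Toggle 'inside' each time a horizontal ray from the point crosses an edge.
        if (lon1 > lon) != (lon2 > lon):
            crossing = lat1 + (lon - lon1) * (lat2 - lat1) / (lon2 - lon1)
            if lat < crossing:
                inside = not inside
    return inside

# Invented service area, roughly a rectangle over downtown Detroit.
service_area = [(42.32, -83.08), (42.32, -83.02), (42.36, -83.02), (42.36, -83.08)]

print(inside_geofence(42.34, -83.05, service_area))   # True  -> autonomous mode available
print(inside_geofence(42.40, -83.15, service_area))   # False -> driver keeps control

A real operational design domain also restricts things like weather, speed and road type, which is exactly why the applications end up "narrow."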

I would venture that nobody, including Tesla, will actually get to true L4 autonomy anytime soon. FSD is indeed very, very difficult. The fact that the leaders in self-driving, who have the best self-driving test cars, still say they are not ready should tell us something.

I think the real race is who can get the closest thing to L4 onto the most cars on the road, and that is a race where Tesla is poised to do very well. The traditional automakers like Ford might have good self-driving test cars, but they can't push the tech out quickly to the public. The best they can do is put the tech on next year's production models and hope the cars sell. Tesla has already deployed a very competent L2 driver assist to all their cars. And with OTA updates, Tesla can push new features to all the cars quickly, getting the cars a little bit closer to L4 each time. The fact is that Tesla does not really need to get to L4. They just need to develop better features and then push them to all their cars via OTA updates to gain an advantage over the competition. We are already seeing this happen now. Tesla improves Nav on AP, adds autosteer stop light warning and enhanced summon, and can push future updates like traffic light detection, and more, especially with AP3. Meanwhile, Ford says that they plan to have an "autonomous car" in 3 years but it will be limited. :rolleyes:
 
@diplomat33 Are you saying my “Level 5 capable hardware” in my Autopilot 2 Tesla is... not? :)

L5? No. However, I do think that the current hardware with AP3 should eventually get pretty close to L4 with the right software. But the software development will probably take a while before it gets there.

Again, I am just trying to take a realistic approach. I am trying to ignore the bombastic marketing hype of "L5 capable hardware" and focus on what Tesla can actually deliver. As I wrote in the HW3 thread, I do expect Tesla to make some decent self-driving progress with AP3 in the next couple of years. My car does not need to be L5 to make me happy. Heck, just highway L3 with no nags would please me greatly.
 
They just need to develop better features and then push them to all their cars via OTA updates to gain an advantage over the competition. We are already seeing this happen now. Tesla improves Nav on AP, adds autosteer stop light warning and enhanced summon, and can push future updates like traffic light detection, and more, especially with AP3.

The main problem with pushing better and better autonomous driving is that at some point the cars become too good to keep drivers attentive, but are still too bad to be trusted without supervision.

While we are still talking about heavily supervised Level 2 plus assistance features (stop light warning, etc.), any improvement is great and makes driving the car safer. But at some point the car might be good enough that people stop paying attention, yet still not as good as a human driver in most circumstances.

The problem here is that people get bored easily and are really bad at estimating what the computer will do and where it will have problems. How a neural network actually perceives a scene is already something we can't really relate to, and everyone who has experienced phantom braking knows what I mean: often it's just a shadow that makes the car behave in a way it shouldn't.

Another good example is the Chinese research firm that tricked Tesla's Autopilot with a few white stickers on the road; as a driver, white marks on the pavement are about the last thing you would anticipate causing trouble.

Right now NOA is not good enough that you could stop constantly watching it. But as it gets better, you might have 1,000-mile drives on the highway where it behaves totally normally. Yet causing one accident per 1,000 miles of highway driving on average would be a catastrophe. On the other hand, if you had already had 500 miles of smooth driving without stepping in once, no one would blame you for taking your eyes off the road.
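To put rough numbers on that: the human baseline below (about one police-reported crash per 500,000 miles) is an assumed ballpark, and treating every mile as an independent failure chance is a simplification, but it shows how a system hundreds of times worse than a human can still feel flawless for an entire drive.

# Assumed ballpark for human drivers, for illustration only.
human_miles_per_crash = 500_000
# The hypothetical "one accident per 1,000 highway miles" system from above.
system_miles_per_crash = 1_000
# A stretch of smooth driving long enough for attention to drift.
flawless_stretch = 500

print(f"~{human_miles_per_crash / system_miles_per_crash:.0f}x more crash-prone per mile than the assumed human baseline")

# Chance the system gets through the whole stretch without a single incident:
p_clean = (1 - 1 / system_miles_per_crash) ** flawless_stretch
print(f"Chance of a spotless {flawless_stretch}-mile drive anyway: {p_clean:.0%}")

That works out to roughly a 61% chance of a spotless 500-mile drive from a system that is about 500 times more crash-prone per mile than the assumed human baseline, which is exactly how the false sense of security builds up.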

And it will get even harder when the car works perfectly on highways and rural roads but only OK in cities and suburbs. The false sense of security will become even more of a problem, because there are tons of weird circumstances: driving at dusk, children running onto the road, red or green neon signs in cities.

So at some point there will be a big gap that has to be bridged, and that might become a real problem.