
Made the right call upgrading from EAP to FSD for 2k

Personally I do not believe that full self driving with the current sensor package will ever be feasible. It cannot detect marginal things like potholes, moderate-sized animals, etc. The driver must always be “road aware” to maintain safety. Full stop.
 

I see many say this, and I'm not necessarily disagreeing, but your point of view directly goes against Elon's stance that all you need is computer vision (since that's what human drivers mostly rely on in new situations as well). The cameras can see everything you just mentioned - it just depends on whether the software is programmed to notice and respond to such things, and whether there is enough processing power to do so.
 
I guess I found out about the $2,000 FSD offer after it had expired. I never received an email or message on the Tesla app or anything directly from Tesla about it. So, the only way to find out about special deals is through blogs like this?

Tesla has been a little turbulent with their pricing and strategy. It's not like it was a planned "sale," though it is being perceived that way. Essentially:
1. They lowered pricing (possibly to stimulate demand before end of quarter). They planned to close most stores in order to help lower prices. This made earlier buyers angry. It made affected employees angry.
2. When the blowback got to be too much, they reversed course on both the stores and prices. The end result was a temporary "sale". You can debate whether that was part of the masterplan or not.

If you were following Tesla on EV news sites, forums like this, Facebook, or Reddit, you knew what was going on. If you weren't, you didn't. It really seems like it was more about shifting strategies and the company being unable to settle on exactly what it wanted to do in a short period of time, creating a buying window.
 
I see many say this, and I'm not necessarily disagreeing, but your point of view directly goes against Elon's stance that all you need is computer vision (since that's what human drivers mostly rely on in new situations as well). The cameras can see everything you just mentioned - it just depends on whether the software is programmed to notice and respond to such things, and whether there is enough processing power to do so.

There are some calculations that equate human vision to 576 megapixels. What are the HW3 cameras? A couple of megapixels each?

That also includes hearing (in 3D) and 3D visual perception.

How many MIPS is a human brain?

That also doesn't include the eye's huge dynamic range and probably more than 64-bit color depth. Nor does it include a great gyro, or even something like smell: the smell of burning rubber might put you on guard for a blowout ahead of you.

Just because the car can see doesn't mean it comes even close to ALL human sensors and processing power.

That said, I’m optimistic FSD on approved highways is feasible. Humans hit potholes and roadkill too.

Humans do have latency issues :)
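For a very rough sense of scale, here's a back-of-envelope comparison (a sketch only: the ~1.2 MP per-camera figure and the 8-camera count are assumptions based on commonly cited specs, and the 576 MP human figure is itself a loose estimate):

```python
# Back-of-envelope pixel-count comparison; all figures are rough assumptions,
# not official specs.
HUMAN_VISION_MP = 576        # loose estimate often quoted for the human eye
CAMERA_MP = 1.2              # assumed per-camera resolution
NUM_CAMERAS = 8              # assumed camera count on the car

car_total_mp = CAMERA_MP * NUM_CAMERAS
print(f"Car cameras combined: ~{car_total_mp:.1f} MP")
print(f"Human vision estimate: ~{HUMAN_VISION_MP} MP")
print(f"Ratio: roughly {HUMAN_VISION_MP / car_total_mp:.0f}x")
```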
 


Latency issues - also, humans can't look in several different directions at once, or sense close proximity to within inches of accuracy in 12 different directions at once, and we don't have radar, so there's that.

I'd expect the "anticipatory human thinking" part to be a lot harder to get right than the vision part... like "Ok, that douchebag half a mile ahead is gonna do this stupid thing here when the idiot behind HIM tries to pass him, and then the guy in the other lane will do this..."

That's apart from more basic stuff like knowing which lane is best beyond the range the radar/camera can see.
 

The social aspect of driving is probably the biggest barrier in my opinion, though I agree with other limitations you list. There are subtle ways we communicate with one another and adjust our behavior consciously and unconsciously. Before we transition to completely autonomous vehicles communicating/coordinating only with one another, we will have a messy (and possibly unnecessary) period of computers attempting to integrate with humans. There's an argument to be made that the safest way to jump to autonomous vehicles (when they are ready) should be sudden and complete.

In other words, the computers have a lot more to worry about now than they will in the future.
 

Right, you can make up for some lack of anticipatory capability with lower latency. But it's hard to predict what the end result will be until it statistically exceeds a human (which it eventually will).
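To put the latency point in rough numbers, here's a quick sketch (the reaction times below are illustrative assumptions, not measured values):

```python
# Distance covered during the reaction delay at highway speed.
# Reaction times are illustrative assumptions, not measured values.
speed_mph = 70
speed_mps = speed_mph * 0.44704   # convert mph to meters per second

human_reaction_s = 1.5            # commonly used planning figure for an alert driver
computer_reaction_s = 0.1         # assumed end-to-end perception/actuation delay

for label, t in [("human", human_reaction_s), ("computer", computer_reaction_s)]:
    print(f"{label}: ~{speed_mps * t:.0f} m traveled before reacting")
```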

Mobileye’s Demo shows what might be doable with cameras.

Mobileye - Most Impressive Self Driving Demo Yet (CES 2019)
 
My car routinely slams the brakes on the highway near overpasses because its "database" thinks I am on the overpass (which has a lower speed limit).

That is false. The braking for overpasses is a result of the low resolution of the radar. It's detecting a stationary object and it can't be sure if the object is on the road or not. The visual system is responsible for making the call, and if it can't within a certain window, the car will brake for safety's sake, just in case.

Tesla tweaked the AI to err more on the side of caution after the Joshua Brown fatal crash with a tractor trailer, and people started to report a lot more false positives with braking for overpasses.

HW3 likely will solve this issue, since it gives the visual system more data per unit time to process what the object is and conclude that it's an overpass.
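A minimal sketch of that decision logic as described above (the function, parameters, and the one-second window are hypothetical, purely to illustrate the idea of vision getting a limited window to overrule the radar):

```python
# Illustrative sketch only: hypothetical names and thresholds, not Tesla's actual code.
def should_brake(radar_sees_stationary_object: bool,
                 vision_says_overhead_structure: bool,
                 time_waiting_for_vision_s: float,
                 decision_window_s: float = 1.0) -> bool:
    """Brake if radar reports a stationary object ahead and vision has not
    ruled it out (e.g. classified it as an overpass) within the window."""
    if not radar_sees_stationary_object:
        return False
    if vision_says_overhead_structure:
        return False  # vision resolved it: it's an overpass, keep going
    # Vision hasn't made the call yet; err on the side of caution once the window expires.
    return time_waiting_for_vision_s >= decision_window_s
```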
 
Good thing AIs can't figure out stuff like this:

[image: 1476456832178.png]
 
I don't see anything on the Tesla order site that mentions customers paying to be beta testers. Nor do I ever recall paying for the promise of a PC that might be delivered at some undefined time in the future, or possibly never.

It may not say on the website that we're beta testers, but Musk has stated several times that the company continues to innovate, and that they are modeled after a software development firm. They bring a basic product to market, customers use what's there and report issues (or data gets fed back automatically), then they design improvements. Updates are sent out over the air. Repeat.

It's intrinsic in their business model that all customers are testers whether we agreed to it or not.
 
That is false. The braking for overpasses is a result of the low resolution of the radar. It's detecting a stationary object and it can't be sure if the object is on the road or not. The visual system is responsible for making the call, and if it can't within a certain window, the car will brake for safety's sake, just in case.

In my case, however, the speed limit indicator on the display dropped from 70 mph to 45 mph... what does the radar have to do with that? And that just happened to be the speed limit on the overpass.
 
Are you in the rightmost lane? This happens to me when the car thinks I’m exiting, but I’m actually staying on the highway. In ABQ on I-25, they recently added an extra lane on the right, and this problem happens consistently now, but only in the new lane. So I just stay in the faster lanes now. ;-)
 
Getting EAP + FSD for $5k, and also getting the HW3 upgrade for free, was a no-brainer for me. Given Tesla's track record, additional discounts on software will happen again, but I'm not sure they will be as drastic as this last one, since Elon went on record and said it was a mistake to do so.