
Traffic aware cruise control and FSD

Traffic-Aware Cruise Control: Must one have FSD for Traffic-Aware Cruise Control to come to a complete stop when traffic has stopped and then resume when traffic resumes? This was one of my favorite features in the test drive. FSD at $10,000 is a hard sell, but it's an even harder sell to part with $200 every month and then part with even
more money when Hardware 4 is needed to have actual FSD when it is "accomplished."
 
It is included as part of basic Autopilot. So, to answer your question: no, you don't need FSD.

@reidyn @RidgeRunner To be clear: without FSD, the car will only stop at a red light if there is a vehicle ahead of you that stops for the light; with just Autopilot, your car will otherwise drive right through the red light without stopping. So yes, it's an available feature, but only if you're tracking a vehicle ahead of you that stops. Likewise, the resume-on-green will only happen if you are tracking that vehicle ahead of you at the red light and it pulls away on green.
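In rough Python, the behavior described above works out to something like the sketch below. The names are made up for illustration, not Tesla's actual logic:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch: without FSD, the car stops at a red light only
# indirectly, by following a tracked lead vehicle that stops for it.

@dataclass
class LeadVehicle:
    is_stopped: bool

def will_stop_at_red(fsd_enabled: bool, lead: Optional[LeadVehicle]) -> bool:
    if fsd_enabled:
        return True        # FSD's traffic-light control reacts to the light itself
    if lead is not None and lead.is_stopped:
        return True        # basic TACC simply matches the stopped lead car
    return False           # no tracked lead car: TACC sails through the red

print(will_stop_at_red(False, LeadVehicle(is_stopped=True)))  # True
print(will_stop_at_red(False, None))                          # False, the danger case
```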
 
This could be a safety problem. One gets used to slowing down and stopping on a city street because the car in front of you slows down and stops. But then, at the 79th stop light, when there is no car in front of you, TACC takes you right through the red light. I just don't know if one could overcome that "creature of habit" syndrome. FSD is becoming more attractive, but then again, the existing state of FSD probably has its own set of "habit" problems too. The answer is to use AP and TACC only on freeways and interstates.
 
...The answer is to use AP and TACC only on freeways and interstates...

I used to think that way until, shortly after the 2014 Autopilot release, a Model S on Autopilot on Interstate 5 at Tejon Pass, California, collided with a stationary car ahead of it in a stop-and-go traffic jam.

The answer is to get your car equipped with LIDAR, as if yours were a Waymo. Waymo has never collided with a stationary obstacle or vehicle.
 
I used to think that way until, shortly after the 2014 Autopilot release, a Model S on Autopilot on Interstate 5 at Tejon Pass, California, collided with a stationary car ahead of it in a stop-and-go traffic jam.

The answer is to get your car equipped with LIDAR, as if yours were a Waymo. Waymo has never collided with a stationary obstacle or vehicle.
Easy to state this, but is it really feasible, and how would it integrate with Tesla's existing software? Uber ended up disabling emergency braking on its lidar-equipped self-driving cars in the Phoenix area because it made the car too jerky, and that killed a pedestrian because the safety driver wasn't paying attention.
 
Easy to state this, but is it really feasible, and how would it integrate with Tesla's existing software? Uber ended up disabling emergency braking on its lidar-equipped self-driving cars in the Phoenix area because it made the car too jerky, and that killed a pedestrian because the safety driver wasn't paying attention.
Exactly. Uber installed LIDAR but didn't actually use it.

Those who installed LIDAR and actually used it have had no problem with stationary vehicles.

Waymo started using LIDAR in 2009. Three years later, in 2012, it let a blind man ride behind the wheel of a Prius.

Four years after that, in 2016, it built the pedal-less, steering-wheel-less "Firefly" and let a blind man ride alone in it.

Uber and Tesla have not used it, claiming it's too hard. Both Uber and Tesla have had fatalities because it's "too hard."

Yes, it's hard, and you can't finish writing a sensor fusion algorithm overnight. But after 3 years, you should have mastered LIDAR the way Waymo did in 2012, 9 years ago.
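To give a feel for what "writing a sensor fusion algorithm" means, here is a minimal one-step sketch: it combines a lidar range and a radar range for the same object by inverse-variance weighting. The sensor noise figures are made-up assumptions, not real specs:

```python
def fuse_ranges(z_lidar, var_lidar, z_radar, var_radar):
    """Variance-weighted fusion of two independent range measurements."""
    total = var_lidar + var_radar
    fused = (var_radar * z_lidar + var_lidar * z_radar) / total
    fused_var = (var_lidar * var_radar) / total
    return fused, fused_var

# Example: lidar reads 52.1 m (sigma ~0.1 m), radar reads 51.4 m (sigma ~0.5 m).
# The fused estimate leans heavily toward the more precise lidar.
range_m, var = fuse_ranges(52.1, 0.1**2, 51.4, 0.5**2)
print(f"fused range: {range_m:.2f} m, std: {var**0.5:.2f} m")
```

Real systems repeat this kind of step across hundreds of tracked objects per frame, which is why it takes years, not nights.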
 
My 2019 BMW 540i has many of FSD's bells and whistles: top-notch adaptive cruise control, self-parking, etc. I wouldn't rely solely on FSD or AP, especially in their current iterations. Bad drivers pose a danger. I wonder if future iterations of FSD will save bad drivers from harming the public and themselves, or spawn a new generation of bad drivers who are even riskier?
 
...I wonder if future iterations of FSD will save bad drivers from harming the public and themselves, or spawn a new generation of bad drivers who are even riskier?

Tesla bet that it would have 1 million robotaxis on the road by 2020, with no drivers at all. So Tesla's plan doesn't care whether the driver is bad or good: owners would be able to sleep at home while FSD drove off on its own to pick up rides in the middle of the night and earn money for them. It's a "bargain."

That said, Waymo has been picking up riders with no drivers or safety operators onboard since 2019, with no collisions.

Its task of avoiding collisions was accomplished years ago; the difficult part is intelligence.

It can get stuck in scenarios that need only rudimentary human intelligence to get the car going.
 
There is no doubt that TACC and AP can get you into trouble when not supervised. It happens on both highways and back roads.
It seems Tesla's AP/FSD today is an assistant, not a pilot. Every driver must be "ready to take over at any moment" and "please keep your hands on the wheel"... we all see it on screen. IMO, Tesla has one of the best systems on the market, even if it needs some supervision.
 
What do I owe anyone for my disagreement? Do you also ask others when they agree with you?
Unless someone has hacked into an account to input an icon, that action belongs to its owner.

As a matter of fact, I have asked for clarification very often over the past 9 years, because an icon can be misinterpreted.

This week it's yours; 3 weeks ago it was:

 
That said, Waymo has been picking up riders with no drivers or safety operators onboard since 2019, with no collisions.

Its task of avoiding collisions was accomplished years ago; the difficult part is intelligence.

It can get stuck in scenarios that need only rudimentary human intelligence to get the car going.
I question whether autonomous driving can become reliably safe until it interacts with infrastructure -- transportation, roads, lights, bridges, exits/onramps, etc. I use but don't rely on AD. And I like to drive!
 
I question whether autonomous driving can become reliably safe until it interacts with infrastructure -- transportation, roads, lights, bridges, exits/onramps, etc.

That can be done in a controlled environment, such as the shuttles that run on rails between airport terminals without a conductor; it's just like an elevator with a predictable path. The same goes for the train industry, but the railroad industry hasn't made trains autonomous, and it's still cheaper to have a human conductor.

Waymo has proven that it's been safe since 2009 with a backup human driver, and since 2019 with no human from the company onboard at all in Chandler, AZ.

Safety has been achieved but intelligence has not. Thus, Waymo is scaling up its business by running L4 on dedicated routes, from one known depot to another, for commercial companies such as Ryder and J.B. Hunt across TX, AZ, and CA.


I use but don't rely on AD. And I like to drive!

I too like to drive, but on a long trip I want to hand that task over to the machine from time to time, if it can be done safely.
 
@AlexHung:

Please explain your Disagree icon. What does that disagreement imply? Does it imply that it's false to claim "Waymo has never collided with a stationary obstacle or vehicle"? Do you have any references to support that disagreement?
The disagreement is with the statement about the accident. At the time, radar could not detect stationary objects; this is no longer the case. In fact, Tesla is abandoning radar in part because of this behavior.

Disagree for the misleading statement.
 
The disagreement is with the statement about the accident. At the time, radar could not detect stationary objects; this is no longer the case. In fact, Tesla is abandoning radar in part because of this behavior.

Disagree for the misleading statement.

Your explanation of "could not detect" and "no longer the case" confuses me!

My understanding is: sensors detect objects, including stationary objects, just fine. For example, the camera can faithfully display a picture of a stationary object, and the radar can measure the speed of a stationary object, since its signal bounces back and records the object's speed as zero.

So, compare your statement that "radar could not detect stationary objects" with what Tesla wrote in its 9/11/2016 blog post "Upgrading Autopilot: Seeing the World in Radar": the radar can even detect a soda can with its concave bottom:

"On the other hand, any metal surface with a dish shape is not only reflective, but also amplifies the reflected signal to many times its actual size. A discarded soda can on the road, with its concave bottom facing towards you can appear to be a large and dangerous obstacle, but you would definitely not want to slam on the brakes to avoid it."

It's not about an inability to detect. Cameras and radars can detect objects, including stationary objects, just fine.

The issue here is how humans can write an algorithm that makes those camera pictures and radar signals usable in the car industry.
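A small sketch of the point: from a moving car, a stationary object's Doppler return shows a closing speed equal to your own speed, so its speed over the ground works out to zero. Detection is easy; deciding what that return actually is remains the hard part:

```python
def ground_speed(ego_speed_mps, doppler_closing_speed_mps):
    """Object speed over the ground, from ego speed and the radar Doppler reading."""
    return ego_speed_mps - doppler_closing_speed_mps

# At 30 m/s (about 67 mph), a stopped car, a manhole cover, and an overpass
# all produce the same Doppler signature:
for obstacle in ("stopped car", "manhole cover", "overpass"):
    print(f"{obstacle}: {ground_speed(30.0, 30.0)} m/s over the ground")
```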
 
Your explanation of "could not detect" and "no longer the case" confuses me!

My understanding is: sensors detect objects, including stationary objects, just fine. For example, the camera can faithfully display a picture of a stationary object, and the radar can measure the speed of a stationary object, since its signal bounces back and records the object's speed as zero.

So, compare your statement that "radar could not detect stationary objects" with what Tesla wrote in its 9/11/2016 blog post "Upgrading Autopilot: Seeing the World in Radar": the radar can even detect a soda can with its concave bottom:

"On the other hand, any metal surface with a dish shape is not only reflective, but also amplifies the reflected signal to many times its actual size. A discarded soda can on the road, with its concave bottom facing towards you can appear to be a large and dangerous obstacle, but you would definitely not want to slam on the brakes to avoid it."

It's not about an inability to detect. Cameras and radars can detect objects, including stationary objects, just fine.

The issue here is how humans can write an algorithm that makes those camera pictures and radar signals usable in the car industry.
Perhaps I should have said stationary vehicles, since that's how the cited accident occurred.

Radar is being abandoned because it could not reliably differentiate stationary objects from stationary vehicles; thus the phantom-braking phenomenon.
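A hedged sketch of the trade-off just described (the names are hypothetical, not Tesla's actual code): a common heuristic is to distrust stationary radar targets unless another cue, such as vision, confirms a vehicle. That suppresses phantom braking on overpasses and soda cans, but it also means a genuinely stopped car can be ignored when the confirmation fails:

```python
def treat_as_real_obstacle(target_is_stationary: bool,
                           vision_confirms_vehicle: bool) -> bool:
    if not target_is_stationary:
        return True                    # moving target: radar alone is trusted
    return vision_confirms_vehicle     # stationary: require a second cue

print(treat_as_real_obstacle(True, False))  # overpass / soda can: ignored
print(treat_as_real_obstacle(True, True))   # stopped car confirmed by camera: brake
```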
 