So now that we've got a CNN review from a "skittish" first-time Tesla driver (!), I thought I'd weigh in based on my own experiences.
First, is it possible to do a drive with no interventions? Yes. But there are a number of problems, and this post will focus on those.
Camera Suite Needs Upgrading
My neighborhood has a lot of 2 lane roads with funky intersections, making left turns challenging even for human drivers. Here's an intersection that gives FSD problems.
You are turning left as shown by the yellow arrow. Traffic coming from the left is obscured by a bend in the road. FSD actually recognizes this, since it will creep into the intersection to try to get a better view. Unfortunately, it creeps so far that traffic coming from the left actually has to swerve to avoid clipping the front of my car. Chuck Cook did a good video analysis of how the Tesla B pillar cameras (the only ones that give you a left/right 90 degree view) aren't up to the task.
IMHO, Tesla needs better left/right 90 degree vision. Maybe A pillar cameras, or bumper cameras as Chuck simulated, or both. You may not realize it, but when turning at an intersection you often move your entire body to get a better look. You do it so automatically that you don't even notice, but good human drivers do reposition themselves to get clear vision. A fixed B pillar camera just can't do that.
Perception/Planner Needs Smoothing
This is something that has bothered me since the very first AP v1.0. The system that creates a 3D view of the world has little to no temporal smoothing. You see cars wink in and out of existence in the visualizer, or shift sideways by a foot and then shift right back. While this is mildly annoying to see in the visualizer, it has real-world negative consequences when the planner does the same.
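To make the idea concrete, here's a minimal sketch of the kind of temporal smoothing I'm describing. This is hypothetical Python of my own, with made-up names and parameters - nothing here is Tesla's actual code. Each new detection is blended with the previous estimate, and a track the detector drops is coasted for a few frames instead of vanishing instantly:

```python
from dataclasses import dataclass


@dataclass
class SmoothedTrack:
    """One tracked object with an exponentially smoothed position."""
    x: float
    y: float
    missed_frames: int = 0


class TemporalSmoother:
    """Blend each new detection with the previous estimate, and keep a
    track alive for a few frames when the detector drops it, so objects
    neither teleport sideways nor wink in and out of existence."""

    def __init__(self, alpha: float = 0.3, max_missed: int = 5):
        self.alpha = alpha          # weight given to the new measurement
        self.max_missed = max_missed
        self.tracks = {}            # object id -> SmoothedTrack

    def update(self, detections):
        """detections: dict of object id -> (x, y) position in meters."""
        for oid, (x, y) in detections.items():
            t = self.tracks.get(oid)
            if t is None:
                self.tracks[oid] = SmoothedTrack(x, y)
            else:
                # Low-pass filter: a one-frame sideways jump of 1.0 m
                # moves the estimate only alpha * 1.0 m.
                t.x += self.alpha * (x - t.x)
                t.y += self.alpha * (y - t.y)
                t.missed_frames = 0
        # Coast tracks the detector missed this frame rather than
        # deleting them immediately.
        for oid in list(self.tracks):
            if oid not in detections:
                t = self.tracks[oid]
                t.missed_frames += 1
                if t.missed_frames > self.max_missed:
                    del self.tracks[oid]
        return {oid: (t.x, t.y) for oid, t in self.tracks.items()}
```

With this, a car that "jumps" a foot sideways for one frame barely moves the estimate, and a car the detector loses for a frame or two stays in the world model instead of blinking out.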
You can see the effects in the CNN review ("Tesla owner lends FSD Beta to inexperienced operator--to chaotic results") when the car quickly turns the steering wheel toward the path of an oncoming UPS truck in the next lane. My guess is that the planner thought the UPS truck was stopped, or moving slowly enough, that the car could partially move into the truck's lane to dodge the scooter (rather than just slow down). This points to an inadequacy of both the planner (a human would never assume an oncoming truck is stopped) and the vision system (which isn't as accurate as radar at determining vehicle speeds).
But beyond those problems, the system reacts too quickly. Isn't that a good thing, you ask? No, not when the result is moving toward an oncoming vehicle. The CNN reviewer was skittish, so he immediately took over control. From personal experience, I am pretty sure the car would have realized its mistake a split second later, turned back, and slowed down.
But this behavior gives drivers heart attacks. In beta 10.4 it isn't uncommon for the steering wheel to whip side to side or jitter at something like 10 Hz. Both the perception system and the path planner could use a dose of real-world physics in their code: cars cannot jump sideways a foot, and nothing useful comes from jittering the steering wheel back and forth at 10 Hz. When something unexpected occurs, take a moment to gather more data before reaching a hasty (and often wrong) conclusion.
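The "take a moment" idea can be expressed as a simple rate limiter on the steering command. Again, this is a hypothetical sketch with illustrative numbers, not anything from Tesla's stack. A 10 Hz wheel jitter can't get through it, because each control cycle may only move the commanded angle a bounded amount:

```python
def limit_steering_rate(prev_angle_deg: float, target_angle_deg: float,
                        dt_s: float, max_rate_deg_s: float = 90.0) -> float:
    """Return the next commanded steering angle, clamped so the wheel
    can turn no faster than max_rate_deg_s (an illustrative limit)."""
    max_step = max_rate_deg_s * dt_s          # largest allowed change this cycle
    step = target_angle_deg - prev_angle_deg  # what the planner asked for
    step = max(-max_step, min(max_step, step))
    return prev_angle_deg + step
```

At a 40 Hz control rate (dt of 0.025 s), a sudden request for 45 degrees of steering moves the wheel only 2.25 degrees this cycle - so if the perception blip that triggered it vanishes a frame later, the car has barely deviated.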
Whoa - Dual Left Turn Lanes
Here's an example of a major intersection where there are dual side by side left hand turn lanes.
On a recent FSD drive, my car was in the leftmost position in this picture, waiting to turn left. When making the turn, the car beside me was slow to start their turn, and my car drifted across lanes mid-turn and ended up in the leftmost lane heading away from the view in this picture.
I don't know what would have happened if the driver beside me had been keeping up - would my car have noticed the car beside me and turned into the correct lane? Probably, I hope. Nonetheless, this is simply bad driving, and it no doubt freaked out the guy beside me, since I went into his lane in the middle of the intersection.
Another time, to avoid some traffic ahead of me across an intersection, my car signaled to change lanes and would have changed lanes in the middle of the intersection. Again, it would have worked since there wasn't a car beside me, but this is either illegal or just horrible driving (never change lanes in the middle of an intersection).
Right Turn Lanes
This same intersection, like a lot of large intersections, has a mini right turn "lane" that cars pull into to turn right, assuming the drivers going straight give them space.
FSD does not recognize these lanes and does not use them. This does not work well in practice and causes all sorts of problems.
Mode Confusion
There are two software-coded behaviors of FSD which are unsafe, and one of them has been there forever. I am appalled that Tesla hasn't fixed it.
When you take over control by grabbing the steering wheel, FSD turns off and tells you so by playing a chime. BUT IT KEEPS TACC ON. So now you naturally think you've got full control of the car - and remember, you took control because of something ahead that demands your full attention outside the car. So you drive for a while, and eventually you ease off the go pedal to take a corner, BUT TACC IS STILL ON and keeps the car driving at the full speed limit through the corner! This is massively unsafe; Tesla needed to fix it years ago and still hasn't.
Here's a new bad bug. Try turning on FSD with NO ROUTE SET in the nav. You'd think the system wouldn't allow you to turn it on - where is the car going to go? The car does just sit there at first, so, thinking you have a route set (I'm going home, after all), you give the car a press of the go pedal. The car now starts driving by itself. It really does. I haven't fully characterized it, but it seems to follow the cars in front of it, turning when they do, etc. Since the driver typically thinks the car should be going somewhere else, this causes a bit of angst. Anyway, FSD should give you an error rather than allow itself to be turned on with no route set.
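Both bugs are, at heart, missing guards in a mode state machine. Here's a toy sketch - hypothetical Python of my own, with invented names, not Tesla's design - of the two rules this section argues for: a steering-wheel takeover should drop you all the way to manual, and FSD should refuse to engage without a route:

```python
from enum import Enum, auto


class DriveMode(Enum):
    MANUAL = auto()   # human controls speed and steering
    TACC = auto()     # car controls speed only
    FSD = auto()      # car controls speed and steering


class ModeManager:
    """Toy mode state machine with the two guards argued for above."""

    def __init__(self):
        self.mode = DriveMode.MANUAL

    def engage_fsd(self, route_set: bool) -> bool:
        """Guard 2: refuse to engage with no navigation route set."""
        if not route_set:
            return False   # surface an error instead of silently engaging
        self.mode = DriveMode.FSD
        return True

    def steering_override(self) -> None:
        """Guard 1: torque on the wheel means the human wants the car.
        Hand back speed *and* steering - never drop to TACC alone."""
        self.mode = DriveMode.MANUAL
```

The point of the sketch is that both fixes are a few lines of state logic, not a perception or planning problem.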
Hugs Center Lane
Way too often on 2-lane roads, the car will hug the double yellow lane divider, sometimes driving for 100 feet over the cat-eye reflectors in the road (bump, bump, bump). So no, not technically over the yellow divider, but on narrow roads this is actually unsafe. I have no idea why FSD has a left-of-lane bias, but it does. Of note, the visualizer shows the car driving right next to the center divider, so the car knows it is doing this and yet doesn't re-center in the lane on long sweeping turns.
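For what it's worth, correcting from a known lateral offset is about the simplest control problem there is. The sketch below (my own hypothetical names and gains, not Tesla's) nudges the lateral target back toward lane center by a bounded amount each control cycle, which is all the "hug the divider" behavior would seem to need:

```python
def recenter_nudge_m(lane_center_m: float, car_lateral_m: float,
                     gain: float = 0.5, max_nudge_m: float = 0.2) -> float:
    """Lateral correction (meters) to apply this control cycle, pulling
    the car back toward lane center. Gain and clamp are illustrative."""
    error_m = lane_center_m - car_lateral_m   # signed distance off-center
    nudge_m = gain * error_m                  # proportional correction
    # Clamp so the correction is gradual, never a sideways lurch.
    return max(-max_nudge_m, min(max_nudge_m, nudge_m))
```

A car sitting 0.3 m left of center gets pulled back 0.15 m per cycle, and even a large error is corrected smoothly because of the clamp.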
Conclusion
In its current state, I only use FSD when I decide to spend the time to give Tesla feedback. I don't use it for regular driving - it just doesn't work well enough for that. Contrast that with freeway autopilot which I do use routinely. I have a 2019 Model X with radar, so freeway autopilot, which uses the older software stack, still works great.
Based on all the "no intervention" videos I'd seen around, I thought FSD was further along than it is.
Tesla still has a bunch of routine behavior and perception issues it needs to fix. Most concerning, I suspect it will need a new camera suite to work properly.