
FSD Beta Attempts to Kill Me; Causes Accident

Long-time lurker, first-time poster. I have been trying to work with Tesla to resolve this issue outside the public eye, but they have been characteristically terrible and honestly don't seem to care. Over the last 3 weeks, I have sent multiple emails, followed up via phone calls, and escalated through my local service center, and nobody from Tesla corporate has even emailed or called to say they are looking into this. One of my local service center technicians opened a case with engineering, which she said would take 90 days to review. I find that absurd, especially when Tesla is releasing new versions every 2 weeks. I think it's important for people to be extra cautious about which roads they engage FSD Beta on, especially since Tesla seems to be ignoring my report entirely.




This incident happened almost 3 weeks ago, on Monday, November 22nd, at around 6:15 in the evening, shortly after the sun had set. I was driving my Tesla Model Y on a two-lane rural road and had FSD engaged. The car was still on version 10.4 at the time. It was a clear night, with no rain or adverse weather conditions. Everything was going fine, and I had used FSD Beta on this stretch of road before without a problem. There was some occasional phantom braking, but that had been fairly common with 10.4.

A banked right-hand curve came up on this two-lane road, with a vehicle coming around the curve in the opposite direction. The Model Y slowed slightly and began making the turn properly and without cause for concern. Suddenly, about 40% of the way through the turn, the Model Y straightened the wheel and crossed over the center line into the direct path of the oncoming vehicle. I reacted as quickly as I could, trying to pull the vehicle back into the lane. I did not have a lot of time to react, so I chose to override FSD by turning the steering wheel, since my hands were already on the wheel and I felt this would be the fastest way to avoid a front-overlap collision with the oncoming vehicle. When I attempted to pull the vehicle back into my lane, I lost control and skidded off into a ditch and through the woods.

I was pretty shaken up and the car was in pieces. I called for a tow, but I live in a pretty rural area and could not find a tow truck driver who would touch a Tesla. I tried moving the car and heard underbody shields and covers rubbing against the moving wheels. I ended up getting out with a utility knife, climbing under the car, and cutting out several shields, wheel well liners, and other plastic bits that were lodged into the wheels. Surprisingly, the car was drivable and I was able to drive it to the body shop.

Right after the accident, I made the mistake of putting the car in park and getting out to check the situation before I hit the dashcam save button. The drive to the body shop was over an hour long, so the footage was overwritten. Luckily, I was able to use some forensic file-recovery software to recover the footage from the external drive I had plugged in.
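For anyone who ends up in the same boat: this kind of recovery works because deleting a clip normally just removes the directory entry, and the MP4 data sits on the drive until it gets overwritten. Recovery tools basically scan the raw device for MP4 signatures. Here's a rough Python sketch of the idea (illustrative only -- I'm not claiming this is what my recovery software does internally, and the image path is hypothetical):

# Minimal file-carving sketch: find offsets in a raw drive image where an
# MP4 recording likely begins. Real tools (PhotoRec, etc.) also handle
# fragmentation and figure out where each file ends.

CHUNK = 64 * 1024 * 1024   # scan the image 64 MiB at a time
SIG = b"ftyp"              # MP4 'ftyp' box tag, located 4 bytes into the file

def carve_mp4_offsets(image_path: str):
    """Yield byte offsets where an MP4 file plausibly starts."""
    pos_in_file = 0   # file offset of the start of the current chunk
    tail = b""        # carry a few bytes across chunk boundaries
    with open(image_path, "rb") as f:
        while chunk := f.read(CHUNK):
            data = tail + chunk
            base = pos_in_file - len(tail)   # file offset of data[0]
            i = data.find(SIG)
            while i != -1:
                start = base + i - 4         # 4-byte size field precedes 'ftyp'
                if start >= 0:
                    yield start
                i = data.find(SIG, i + 1)
            tail = data[-(len(SIG) - 1):]    # too short to hide a full match
            pos_in_file += len(chunk)

for start in carve_mp4_offsets("/path/to/drive.img"):  # hypothetical image path
    print(f"possible MP4 at byte {start}")

You'd run something like this against a raw image of the drive (made with dd or similar), never against the live device you're trying to recover from.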

In the footage, you can see the vehicle leave the lane, and within about 10 frames, I had already begun pulling back into the lane before losing control and skidding off the road. Since TeslaCam records at about 36 frames per second, this means I reacted within roughly 280ms of the lane departure. I understand it is my responsibility to pay attention and maintain control of the vehicle, which I agreed to when I enrolled in FSD Beta. I was paying attention, but human reaction does not get much faster than this, and I am not sure how I could have otherwise avoided this incident. The speed limit on this road is 55mph. I would estimate FSD was probably going about 45-50mph, but I have no way to confirm. I think the corrective steering I applied was too sharp given the speed the vehicle was going, and I lost grip with the pavement. On the version of the clip slowed to 40% speed, you can see the back end of the car break loose in the way the front end starts to wiggle as the mailbox moves toward the left side of the frame.
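If you want to check my math (the grip coefficient here is an assumption for dry asphalt, not something I measured), the quick back-of-envelope version:

frames = 10          # frames between lane departure and my correction
fps = 36             # approximate TeslaCam frame rate
print(f"reaction time ~ {frames / fps * 1000:.0f} ms")       # ~278 ms

mph_to_ms = 0.44704
v = 50 * mph_to_ms   # assumed speed in m/s (~22.4 m/s)
g = 9.81
mu = 0.9             # assumed dry-asphalt tire grip coefficient

a_max = mu * g              # max lateral acceleration before the tires slide
r_min = v ** 2 / a_max      # tightest radius the tires can hold at this speed
print(f"max lateral accel ~ {a_max:.1f} m/s^2")              # ~8.8 m/s^2
print(f"tightest no-skid radius at 50 mph ~ {r_min:.0f} m")  # ~57 m

Any steering input that demands a tighter arc than roughly 57 m saturates the tires, which is consistent with the back end breaking loose during my correction.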

Surprisingly, I somehow managed to steer this flying car through a mini-forest, avoiding several trees (although I did knock off the driver's side mirror). There is no side panel damage whatsoever. The bumper cover is ruined and the car sustained fairly severe structural/suspension damage, both front and rear suspension components.

Luckily, nobody was hurt (except my poor car). I could not imagine the weight on my conscience if I had been too slow to intervene and ended up striking that oncoming vehicle. Front overlap collisions are some of the most deadly ways to crash a car, and bodily injury would have been very likely.

I have a perfect driving record and have never had an at-fault accident in the over 10 years I have been licensed. The thought of filing an insurance claim and increasing my premiums over this incident makes me sick. I am considering legal action against Tesla, but I'm not going to get into that here. Just wanted to make everyone aware and hyper-vigilant about FSD. I thought I was, but then this happened. I am going to be much more careful about the situations in which I decide to engage it. There is too much at stake, it is not mature enough, and frankly, Tesla's apathy and lack of communication around this incident really concerns me, as both an owner and a road-user.


tl;dr: Be careful with FSD, folks. And if you get into an accident, hit the dashcam save button or honk your horn before you put it in park.



"Display of a Tesla car on autopilot mode showing current speed, remaining estimated range, speed limit and presence of vehicles on motorway lanes" by Marco Verch is licensed under CC BY 2.0.
 
Yes, all of that was added later, not from the debut, with more and more frequent nags added after some serious accidents we all remember.
Sounds like FUD to me. I clearly remember that all "hands on" warnings were in place in 2019 when I got my car, and no, we don't all remember accidents related to the lack of warnings. I can recall just two accidents where AP drove inattentive owners to their deaths.

If autonomous cars crashed and killed a passenger once a month across all the cars on the road, resulting in 12 certain deaths per year, should we allow autonomous vehicles, considering that human drivers in the US kill over 40,000 people a year? This is not a question about Tesla, but about engineering ethics.
 
If autonomous cars crashed and killed a passenger once a month across all the cars on the road, resulting in 12 certain deaths per year, should we allow autonomous vehicles, considering that human drivers in the US kill over 40,000 people a year? This is not a question about Tesla, but about engineering ethics.
Not just that - if "even one death" is unacceptable, how do we accept any kind of medicine, risky activity, or vaccine? Or, for that matter, even seatbelts or airbags?

All these strict rules and the sacredness of every life come into play only for Tesla AP/FSD. For all other situations it becomes "we don't want a nanny state".

PS:

The bar should be simple: better than average drivers. Below are the crash statistics I got, IIRC, from a white paper/article by Cruise.

[Attached image: crash statistics table]


With 20,000 beta testers, even if everyone drives only 5 miles per day on FSD Beta, we are clocking 100k miles a day. The alleged crash would be considered an L2 crash, so on average one would expect such a crash every day given that testers are driving 100k miles a day on FSD Beta. So we already know FSD Beta + attentive drivers are better than average drivers in the US.
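Rough math behind that (the 1-per-100k-miles L2 rate is my reading of the Cruise table above, so treat it as an assumption):

testers = 20_000
miles_per_tester_per_day = 5
fleet_miles_per_day = testers * miles_per_tester_per_day   # 100,000 miles/day

l2_rate_per_mile = 1 / 100_000   # assumed average-driver L2 crash rate

expected_l2_per_day = fleet_miles_per_day * l2_rate_per_mile
print(f"expected L2 crashes per day: {expected_l2_per_day:.1f}")   # ~1.0

So one L2-severity crash per day across the whole beta fleet would merely match the average-driver baseline; seeing far fewer than that is what the better-than-average claim rests on.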

PPS: Found the article.


Here is a paper with details about crash level descriptions.

 
Sounds like FUD to me. I clearly remember that all "hands on" warnings were in place in 2019 when I got my car, and no, we don't all remember accidents related to the lack of warnings.
Not FUD. In the AP1 days, you could go for 20 minutes at highway speeds without any nags, and you got unlimited audio warnings with no AP jail. Joshua Brown was killed in an underride accident with a semi crossing a highway, which triggered the first NHTSA investigation of AP and resulted in Tesla increasing nags and eventually adding AP "jail" after a few more high-profile events.

Tesla was perfectly fine with virtually no nags and a one camera AP system. There were several owners here who refused updates for years to avoid having nags added to their cars. It was quite a controversy at the time.
 
Your posts aren't remotely logical...(moderator edit) Hint: In-cabin cameras and (new) radar, seat and seatbelt sensors, etc.

Your previous post:
Tesla has rather designed their system as "hands off".

Yes, all of that was added later, not from the debut, with more and more frequent nags added after some serious accidents we all remember. One could almost believe they designed it without the proper sensors, thinking no nags would ever be necessary, the torque sensor being a workaround.

Anyway, point was, it still doesn't let you steer with the car. Either the car is steering or you are steering. Not both, it is binary.
 
Did the OP or someone else die in this incident ("stacking bodies on the pyre", "Tesla needs to put a stop to this before someone dies")?

Advocate? Sure. Sounds like an advocate for TSLAQ and shorties.

Many people die daily in automobile accidents. It's been like that for many decades, since long before Tesla was around. Very few fatal accidents involve Teslas. All Teslas have very high (top) crash ratings. We hear all about it if a Tesla even drove by an accident where someone died.

And here is where you show that you're disingenuous, and willing to ignore hard evidence to further a false narrative. There are a dozen or more links to FSD doing this exact same thing in all types of conditions and you're just ignoring them like it doesn't exist. This video ABSOLUTELY shows a continuation of a clear pattern of behavior. But none of that matters to the zealot, and why would it? You're completely happy to stack bodies on the pyre if it means Elon looks smart.



This is the hilarious part. I'm literally feet away from my Model 3, and I've made great gains on my shares, which I've been holding since 2016. Have you ever stopped to think that maybe some of us care about the outcome? Maybe some of us want Tesla to be better, and not the bare-minimum effort that's attracting more and more scrutiny every day? Can you not conceive of advocates actually advocating for something better to be done here? Again, in the face of countless videos of FSD doing this exact same thing, your reaction is "NUH UH!!!!!!" and mine is "Tesla needs to put a stop to this before someone dies". And I'm the gullible hater? You need to rethink how you value human life, dude.
 
Your posts aren't remotely logical...
Eh, I think @daktari has a point. Tesla DID design Autopilot to be as hands-free as possible even back in the AP1 days when it was essentially hands free. They only added the various safety items after it became obvious people were misusing the system. It's always been reactionary, never proactive. Now they are trying to solve the human factors problem with a system that wasn't really designed for that.

The interior camera is a good example. If that camera was designed and intended for driver monitoring, why was it placed in a location so poorly suited to eye tracking? My guess is the camera was originally intended as CYA for Tesla and/or for the whole Tesla Network plan that never came to fruition. Remember that Tesla was originally going to buy back every single leased Model 3 for the Tesla Network, as stated in those original lease contracts with no buy-out options.
 
We hear all about it if a Tesla even drove by an accident where someone died.

Yes, and we would have heard about it if there had been anyone standing in the path of the Model Y here.

I own a lot of Tesla stock. As long as they are conducting this glorious experiment on public roads, I would like them to do everything in their power to make their automation safe, as an accident that their automation contributes to will not be good for business or their market cap. They should:

1) Study blended steering and see if it is safer for avoiding overcorrection than the current implementation. They have data on this already.
2) Find a way to ensure drivers keep at least one hand on the wheel in a high leverage position (i.e. not the bottom where you have a short lever arm) at all times. Use new sensors if you must. Retrofit if you must.
3) Kick people out for not paying attention even in the slightest way. Make this much more strict. It’s actually quite good now, but now that they have the capability, make it better. Tell the owner what they are doing, and tell them to stop. But don’t give too much leeway. Retrofit vehicles that do not have cabin cameras.
4) Stop steering into oncoming traffic. This is probably the hardest task as it likely involves perception, but I believe they can do better.
5) Address any other known safety issues highlighted by their access to the crash data on their vehicles.
6) Find a way to prevent road departure accidents like this through the use of active driver assistance. Why did the car allow itself to be steered off the road? Why does the capability to avoid such accidents not exist? It’s a very difficult problem, because you don’t want to override driver authority, but I am not convinced it is more difficult than FSD. Can they improve on their emergency lane departure feature? What if someone has a seizure or heart attack and tries to drive the car off the road?

Incessant nags that require precise and consistent attentive use of the test feature are no big deal for FSD beta - it’s beta and no one has to use it. Let’s do it, Tesla.

They are never going to satisfy the people who do not believe beta testing should be done on public roads. (I’m personally on the fence…I doubt it is helping Tesla that much but I think I understand why they are doing it.) But Tesla can take concrete steps to address shortcomings in their safety systems to avoid injury to their customers and other road users. And they’ll have a better product as a side effect.

There’s a lot to be learned from incidents like this, and it’s very fortunate that nothing of significant value was lost in this accident. Tesla should address this and narrow the ability of owners to be a danger to themselves and others. This accident was not strictly speaking Tesla’s fault, but FSD Beta was likely a contributing factor, and Tesla should view it as a problem to be solved. After all, they want their cars to be the safest on the road - and eliminating human factors is part of that.
 
Let me make it clear - the whole set of circumstances surrounding this allegation by the OP (and that is exactly what it is at this point, not "proof" but an allegation) stinks.

I'm a little confused about how you can say that when you're an FSD Beta tester yourself.

Have you not experienced FSD Beta suddenly glitching and momentarily turning into the oncoming lane? If you haven't, I absolutely promise you that it will eventually, so be ready for it.

To me the easiest explanation is the OP didn't expect that, and simply wasn't ready.

The removal of the video also has a fairly easy explanation: the first thing a lawyer is going to do is tell you to shut up about a case.

I do feel like the OP wasn't as honest as they could have been about where their hands were, why they didn't immediately stop it, and why they had to resort to the overcorrection. So I'm not saying I believe the entirety of the claim, but I do see how it could easily happen, based on my own experience with FSD Beta.
 
Regulators don't start percolating until there are deaths or serious injuries. I'm thinking we can have dozens of fender benders and it will dust right off. You also have to compare against the number of accidents avoided. If Tesla comes out and says FSD drivers were in fewer accidents than non-FSD drivers, then the number of incidents doesn't matter, since the system would have prevented more incidents than it caused.

Normally I would say you are right, but Biden appointed an NHTSA senior advisor, Missy Cummings, who walked in with her guns loaded and safety off, aimed squarely at FSD, just waiting for the moment she can open fire.
 
Wow, you are a terrible driver 🤣
I'm sure the OP thanks you greatly for your expert analysis.
The CNN guy just so happened to have the exact same experience as many FSD testers, but somehow he's a phony?
I'm not going to disagree with you on that point. Just to say I have lost all respect for most media (especially CNN) even if they were reporting about dog food. We all need better reporting these days. :)
 
You linked to a bunch of them with no timestamps.

This is an outright lie, and we've been over it already. Don't lie to make a scene and try to pretend you're a hero. The videos I posted are linked with YouTube timecodes. Anybody who hovers over the links can see you're a liar.

That is why establishing facts, rather than jumping to conclusions like you are doing, is important.

Fact: FSD is jerking the wheel toward oncoming traffic and there are multiple people uploading footage of exactly that behavior.
Fact: You're pretending no such footage exists even though there are dozens of examples. Almost all of them from known Tesla fans.

I HATE this hypocrisy.

I hate it when people defend indefensible software behavior that can obviously lead to a major collision with an innocent party.

Silicon Desert:

Just to say I have lost all respect for most media (especially CNN) even if they were reporting about dog food.

Just for you, I posted about a dozen links showing the exact same behaviors by owners.
 
Did the OP or someone else die in this incident

So you quoted where I said "before" but didn't bother to consider the fact that I said "before". Neat.

Sounds like an advocate for TSLAQ and shorties.

Of course, this tired, ignorant comment again. And once again I'm going to tell you I'm still holding shares from 2016 and I own the most expensive Model 3 configuration ever offered. But yep. Keep deflecting. That's going to work out great for you.

Many people die daily in automobile accidents.

Not because their vehicle attempted to jerk the wheel toward an oncoming car. And that's the problem we're seeing here. Why is it that people like you work so hard to ignore the evidence staring you right in the face? Why deflect so hard? Do you think that if the person from the original video had crashed head-on into that oncoming vehicle, and NHTSA was able to prove that FSD was enabled at the time of the crash, that would be a good thing for Tesla? Take two seconds to think critically about something here. Someone almost died because FSD steered TOWARD an oncoming car, which I've posted dozens of videos of it doing at all times of day. And you think that's acceptable behavior from a system whose most basic function is supposed to be NOT hitting another car?

This is why the die hard fans are considered toxic. You're like a parent that can't accept their kid is a serial killer or something and you've deluded yourself into calling him a good boy still.
 
[Attached images: off-piste warning signs]

This never stopped me (and clearly many others).

My point is, regardless of the truth of the story: this is beta software, a lot of money is involved, and human lives (including your own) are at stake. We as beta testers need to be (for lack of a better word) responsible for any action/decision the car makes. I have never been more tired than when driving with FSD. With FSD on, my senses are dialed to the max; I am watching everything everywhere and anticipating that FSD will just drive into another object for no reason.
 
[Attached images: off-piste warning signs]
This never stopped me (and clearly many others).

My point is, regardless of the truth of the story: this is beta software, a lot of money is involved, and human lives (including your own) are at stake. We as beta testers need to be (for lack of a better word) responsible for any action/decision the car makes. I have never been more tired than when driving with FSD. With FSD on, my senses are dialed to the max; I am watching everything everywhere and anticipating that FSD will just drive into another object for no reason.

Cool, so, what do you tell the two ruined families when a Tesla on FSD jerks the wheel into an oncoming car at the last second, causing a high speed and fatal head-on collision? Sorry, but Steve was beta testing software with a known poor behavior? That's not going to cut it. And since the deepest pockets here are Tesla's, who do you think is going to face the lawsuit first?

At this point, FSD is basically an attractive nuisance. It's a pool without a fence, and you're playing the role of the neighbor kid that falls in and drowns.

As for the pictures, you are only exposing yourself to danger of death with off piste snow sports. When you drive your 1 ton vehicle on the road, you're exposing the public at large to the potential danger. That's why we have traffic laws and FMVSS. Nobody cares if you hurt yourself or worse doing something dumb or dangerous, but we do care if you hurt someone else while doing it.
 
Cool, so, what do you tell the two ruined families when a Tesla on FSD jerks the wheel into an oncoming car at the last second, causing a high speed and fatal head-on collision? Sorry, but Steve was beta testing software with a known poor behavior? That's not going to cut it. And since the deepest pockets here are Tesla's, who do you think is going to face the lawsuit first?

At this point, FSD is basically an attractive nuisance. It's a pool without a fence, and you're playing the role of the neighbor kid that falls in and drowns.

As for the pictures, you are only exposing yourself to danger of death with off piste snow sports. When you drive your 1 ton vehicle on the road, you're exposing the public at large to the potential danger. That's why we have traffic laws and FMVSS. Nobody cares if you hurt yourself or worse doing something dumb or dangerous, but we do care if you hurt someone else while doing it.
We will all die. If minimizing death is the priority, then ALL semis must be banned from highways. A semi killed my friend's family, but you know what, I blame the driver, not Volvo or whatever company made the vehicle that was unable to stop without crossing the median.
 
Idiots are ingenious.

People have been killing themselves doing dumb things for centuries. How long did it take seat-belts to become law and mandatory? How about airbags? Who would have thought people would do virtually anything to get views and likes on Tiktok/FB/Insta/YouTube/..etc..?

Tesla is definitely doing a lot of CYA. There's been a concerted effort to harm and kill Tesla since it started. They know people, including journalists/bloggers/cops/government regulators/hedgies/anonymous nicks/..etc.., will lie and blame the other driver/the car/weather/G*d/..etc.. Tesla will point to the data and facts.

 
Nothing is going to be perfect. FSD will never be done and it'll be updated constantly.

1, 6) Liability. I think the driver should always have precedence. The car's internal stability control is probably better at recovering if the driver lets go and allows it to take over.
4) This has never happened to me. I've only had a few months of experience with FSD Beta since 10.3.1. If anything, it always pulls away from oncoming traffic.
5) Tesla gets top ratings on their crashworthiness for all of their vehicles.

You're absolutely right that there are people who will never be convinced.

Tesla is well aware that bad stuff can happen and are trying to avoid mishaps as they know it'll be very public and amplified.

 
Just more nonsense and gibberish from you.

Show me the evidence. NHTSA recovered the logs from a burnt-out MS in Texas which wound up clearing Tesla's AP. Doing that should be easy in this case.

OP has an extremely valuable vehicle. Here's my advice for OP:

1) OP should auction his vehicle. Highest offer wins. Vehicle comes as-is, no guarantees, no refunds.
2) Bidders need to place a deposit of $75k in order to place an offer/bid. No refunds.
3) Offers/bids must be more than value of new Tesla vehicle that crashed. So minimum winning offer/bid will be at least $75k+cost of new Tesla vehicle.


 
6) Find a way to prevent road departure accidents like this through the use of active driver assistance. Why did the car allow itself to be steered off the road? Why does the capability to avoid such accidents not exist? It’s a very difficult problem, because you don’t want to override driver authority, but I am not convinced it is more difficult than FSD. Can they improve on their emergency lane departure feature? What if someone has a seizure or heart attack and tries to drive the car off the road?

Don't we already have a feature where the car attempts to stay in the lane if it detects deviation? I know I turned that setting off because the car freaked out when I gave a cyclist a wide berth and swerved back in. The cyclist thought I was trying to push him off the road.

Fact: FSD is jerking the wheel toward oncoming traffic

Having watched the OP's video, it was clear to me that FSD Beta did not jerk across the double yellow. It happened quickly, yes, but I'd hardly call it a swerve or a jerk. I remain convinced that a vigilant tester would not have overcorrected as much as this person did. Likewise, the earlier first known collision on FSD Beta was someone who wasn't fully paying attention during a left turn and let the car collide with someone also turning left parallel to him.

Both of these scenarios have happened to me multiple times, and by being alert, I was easily able to keep the car under my control. People who can't do this, or can't maintain the vigilance needed to stay safe, should not be beta testers. Either have the self-awareness to realize this and voluntarily bail from the program, or Tesla will have to keep being more of a nanny to keep things safe.
 