The next big milestone for FSD is 11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI Day and the Lex Fridman interview, we have a good sense of what might be included.

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS); a rough sketch of the MCTS loop follows this list
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
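
To make the MCTS bullet above a bit more concrete, here is a minimal, illustrative sketch of Monte Carlo Tree Search applied to a toy lattice planner. This is not Tesla's code or architecture: the action set, transition, and reward below are invented placeholders purely to show the select / expand / rollout / backpropagate loop that MCTS-based planners build on (a real planner would replace the random rollout with a learned NN value/policy and a proper vehicle/cost model).

```python
# Minimal, illustrative MCTS over a toy lattice planner.
# NOT Tesla's implementation -- the actions, transition, and reward
# are hypothetical placeholders to demonstrate the algorithm only.
import math
import random

ACTIONS = ["keep", "nudge_left", "nudge_right"]  # hypothetical discrete maneuvers


def step(state, action):
    """Toy transition: state is (lane_offset, progress)."""
    offset, progress = state
    if action == "nudge_left":
        offset -= 1
    elif action == "nudge_right":
        offset += 1
    return (offset, progress + 1)


def reward(state):
    """Toy objective: make progress while staying near lane center."""
    offset, progress = state
    return progress - 0.5 * abs(offset)


class Node:
    def __init__(self, state, parent=None):
        self.state = state
        self.parent = parent
        self.children = {}   # action -> Node
        self.visits = 0
        self.value = 0.0

    def ucb_child(self, c=1.4):
        # UCB1: balance mean value (exploitation) against visit counts (exploration).
        return max(
            self.children.values(),
            key=lambda n: n.value / (n.visits + 1e-9)
            + c * math.sqrt(math.log(self.visits + 1) / (n.visits + 1e-9)),
        )


def rollout(state, depth=5):
    # Random playout; a real planner would score states with a learned value network.
    for _ in range(depth):
        state = step(state, random.choice(ACTIONS))
    return reward(state)


def mcts(root_state, iterations=500):
    root = Node(root_state)
    for _ in range(iterations):
        node = root
        # 1. Selection: descend through fully expanded nodes via UCB.
        while len(node.children) == len(ACTIONS):
            node = node.ucb_child()
        # 2. Expansion: try one untried action from this node.
        untried = [a for a in ACTIONS if a not in node.children]
        if untried:
            a = random.choice(untried)
            node.children[a] = Node(step(node.state, a), parent=node)
            node = node.children[a]
        # 3. Simulation: estimate the value of the new node.
        value = rollout(node.state)
        # 4. Backpropagation: update statistics up to the root.
        while node is not None:
            node.visits += 1
            node.value += value
            node = node.parent
    # Commit to the most-visited root action.
    return max(root.children.items(), key=lambda kv: kv[1].visits)[0]


if __name__ == "__main__":
    print("chosen maneuver:", mcts((0, 0)))
```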

Lex Fridman's interview of Elon, starting with the FSD-related topics.


Here is a detailed explanation of Beta 11 in layman's terms by James Douma, an interview done after the Lex podcast.


Here is the AI Day explanation, in 4 parts.




Here is a useful blog post asking Tesla a few questions about AI Day. The useful part is the comparison of Tesla's methods with Waymo's and others' (detailed papers linked).

 
Casual observation (and maybe a hot take?):

It appears many believe the manufacturer would take liability for at-fault accidents.

My assumption is that in at-fault situations while operating in L3-5, the owner/operator is responsible. The O/O could subsequently pursue legal action against the manufacturer, but I don't see laws changing in the near term to absolve an owner/operator of responsibility while operating an L3+ system.
Owner/operator is a contradiction in terms if L3-L5 is engaged, since you are NOT the operator. So you have an L4 vehicle, it is driving, and you are reading a book (or napping in the passenger or back seat). Your car runs over and kills a pedestrian. So you, the owner, are at fault/liable and may even spend time in prison or on parole for an involuntary manslaughter conviction. Also, you will have your driving record permanently blemished even in a minor accident, plus the possibility of being sued personally and your insurance NOT covering you (since you were NOT the driver)?
 
It did another stunt a few days ago. Moving at 25 mph around a Home Depot complex, it made a right turn onto a three-lane service road of the interstate; traffic there is usually 50-55 mph as drivers are about to get on the on-ramp (which is also the off-ramp). It didn't even stop or slow down to look, it just drove right out. After that it only moved to the lane next to the on-ramp and didn't turn into the on-ramp lane until it met the chevron, braked, and then made a quick left turn. It made it, but with less than one car length to the vehicle in front at 50 mph.
Not sure the code is set up to traverse Home Depot parking lots yet. May want to stick with street testing for now.
 
I had a similar problem with visualization: FSD showed a sedan in front of me as a tall truck. In my case the sedan was far away though.

Maybe this explains why FSD is sometimes too cautious. It sees a small thing but interprets it as a big thing.
Or it confirms, as many have, that having a bike or other gear mounted to the back of your car blocking the cameras and sensors will produce inaccurate results. They should have a disclaimer for this, but for now it's simply a known incorrect use case.
 
Casual observation (and maybe a hot take?):

It appears many believe the manufacturer would take liability for at-fault accidents.

I agree many BELIEVE that.

But nobody seems able to cite any law that would make the belief true.



Owner/operator is a contradiction in terms if L3-L5 is engaged, since you are NOT the operator. So you have an L4 vehicle, it is driving, and you are reading a book (or in the passenger or back seat). Your car runs over and kills a pedestrian. So you, the owner, are at fault/liable and may even spend time in prison or on parole for an involuntary manslaughter conviction. Also, you will have your driving record permanently blemished even in a minor accident, plus the possibility of being sued personally and your insurance NOT covering you (since you were NOT the driver)?

You are conflating three completely different things, all of which operate under different rules: criminal penalties for moving violations, criminal liability for the accident (if there is any), and civil liability for the accident.


You wouldn't get any points on your license or driving record, since you were not driving (more on THAT in a second), but your insurance WOULD pay out since it's your car.

As previously mentioned, most states that allow L3 or higher require the OWNER of the car to be insured (usually for a large amount) for that specific reason.

Anyway, speaking of citations: apparently the law is far behind here, and in some states self-driving cars cannot be ticketed because the law doesn't allow moving violations to be cited against non-humans.

See-


Now, who would be CRIMINALLY liable if, say, someone was killed or horribly injured? Great question. Currently the law is unclear on this point... which means the lawyers would probably sue the owner of the car, the maker of the car, and the maker of the driving system (if those last two aren't the same).
 
It is perplexing. Would love to know the reason. Things should improve as the team 'fixes' issues and training presumably improves, unless HW3 is hardware-limited, which is a good assumption.

It would be interesting to know if HW4's more capable processing is providing v11.4.4 improvements or the same dismal regressions. If it's regressions then training is likely failing miserably and/or the team is pumping out garbage.

Poorly managed teams sometimes result in pet projects that shouldn't make it to production but continue on in spite of poor performance. A person might rationalize it as benefiting from research papers, presentations, statistics gathering...
Well, the top AI leader left a year or so ago, after a long sabbatical... Perhaps he knew then that FSD was at, or about to hit, a progress glass ceiling?
 
Owner/operator is a contradiction in terms if L3-L5 is engaged, since you are NOT the operator. So you have an L4 vehicle, it is driving, and you are reading a book (or napping in the passenger or back seat). Your car runs over and kills a pedestrian. So you, the owner, are at fault/liable and may even spend time in prison or on parole for an involuntary manslaughter conviction. Also, you will have your driving record permanently blemished even in a minor accident, plus the possibility of being sued personally and your insurance NOT covering you (since you were NOT the driver)?
Speaking of insurance companies…


“As higher levels of autonomy are commercially introduced (SAE automation levels 3 and 4), the insurance industry stands to see higher proportions of commercial and product liability lines, while personal automobile insurance shrinks.”[4]
 
I agree many BELIEVE that.

But nobody seems able to cite any law that would make the belief true.





You are conflating three completely different things, all of which operate under different rules: criminal penalties for moving violations, criminal liability for the accident (if there is any), and civil liability for the accident.


You wouldn't get any points on your license or driving record, since you were not driving (more on THAT in a second), but your insurance WOULD pay out since it's your car.

As previously mentioned, most states that allow L3 or higher require the OWNER of the car to be insured (usually for a large amount) for that specific reason.

Anyway, speaking of citations: apparently the law is far behind here, and in some states self-driving cars cannot be ticketed because the law doesn't allow moving violations to be cited against non-humans.

See-


Now, who would be CRIMINALLY liable if, say, someone was killed or horribly injured? Great question. Currently the law is unclear on this point... which means the lawyers would probably sue the owner of the car, the maker of the car, and the maker of the driving system (if those last two aren't the same).
Technically what you say is true, and the laws will have to come at some point as manufacturers get to and surpass L3. Absent a federal law, it’s fair (using the “Reasonable Person” legal standard) to assume today where liability will likely reside.

Edit: I think the injured party's lawyer would sue both, as the manufacturers have DEEP pockets…
 
Technically what you say is true, and the laws will have to come at some point as manufacturers get to and surpass L3. Absent a federal law, it’s fair (using the “Reasonable Person” legal standard) to assume today where liability will likely reside.

Edit: I think the injured party's lawyer would sue both, as the manufacturers have DEEP pockets…
The “Reasonable Person” standard, defined here -


 
Not sure the code is set up to traverse Home Depot parking lots yet. May want to stick with street testing for now.
This is outside of the Home Depot premises and not in the parking lot. There is no traffic control at the junction; normal drivers usually slow down, make a quick stop, check for traffic coming from the left from under the nearby bridge structure, and then quickly merge right. It worked correctly on 11.3.6 many times; now it is just too aggressive and overconfident. I have seen quite a few careless accidents at that location.
 
The @diplomat33 "Primer" thread, linked upthread, is that place.

Here’s the link again…

 
It is perplexing. Would love to know the reason. Things should improve as the team 'fixes' issues and training presumably improves, unless HW3 is hardware-limited, which is a good assumption.

It would be interesting to know if HW4's more capable processing is providing v11.4.4 improvements or the same dismal regressions. If it's regressions then training is likely failing miserably and/or the team is pumping out garbage.
After riding in a HW3 vehicle yesterday, my thoughts on 11.4.x seem confirmed: HW4 is definitely performing better than HW3 on the same update. I wonder if what is learned from 11.4.x is intended to go to the 11.3.6 folks, who are used to the more stable performance, once it's ready, and that is the reason for the separate tracks. I have never experienced HW3 on 11.3.6, so I can't speak from experience, but it seems a lot of folks on here believe that 11.3.6 is the superior version for HW3 and below, and anyone who updated to 11.4.x sees it as a regression that doesn't seem to get better. Or none of that is right and the real reason for the separate beta tracks is something else entirely.
 
Just wanted to chime in on the bike-on-hitch issue (see this post for what it looks like on the outside when I mount my bike on the hitch).

I’ve had several instances of FSD swerving suddenly and way out of its lane for apparently no reason (almost running into a car in the adjacent lane once).

I finally understood that it does this because it sometimes sees the bike as a vehicle tailgating it, so it's trying to take evasive action.

Here’s a snapshot of what it thinks it sees (note that this ghost tailgater isn’t constantly shown but flashes on and off, thereby often changing FSD’s path):
My favorite display rendering was when the car in front of me had a bike hanging from its bike rack. FSD understood it was a bike, so it showed the bike riding on the highway five feet off the back of the car. The bike rode nicely at 65 mph right in front of me. I was thinking to myself that the bike must be cranking some serious watts to maintain that speed. After about 15 minutes the car with the bike rack changed lanes and the bike disappeared from the road.
 
Talking about right turns: yesterday, driving south in the left lane of two right-turning lanes (the rightmost one enters a merge lane, not a full lane), the car drove toward the merge lane.

Map pic: I am represented by a yellow blob on the far right upper road, moving south to turn west.

FSD clip taken just after the light changed: FSD shows the path crossing right toward the adjacent turning lane to enter the merge lane. A left turn was coming up soon. (I disengaged.)

This was a disappointment, as I was happy it chose the left of the two turn lanes and then it messed up.

Map:
2023-06-24_google-map.jpg


2023-06-24_r-turn-center_lane.jpg
 
Technically what you say is true, and the laws will have to come at some point as manufacturers get to and surpass L3. Absent a federal law, it’s fair (using the “Reasonable Person” legal standard) to assume today where liability will likely reside.

That gets pretty tricky absent any other state or federal law though.

Quoting from your first link:

The reasonable person standard applies when the defendant could reasonably foresee how his conduct could cause harm or injury

Who is the defendant here? If it's the owner of the car who turned on an L3 system that the car maker specifically did NOT say up front it was liable for, then could the owner of the car reasonably foresee that turning on a system the car maker disclaimed responsibility for could cause harm or injury? Could be!

(Then you get into the weeds of states that are contributory negligence vs ones with comparative, and...yeesh...)

Hence why, as you suggest, state laws really need to get this nailed down sooner rather than later; otherwise it'll be a hodgepodge of unrelated jury decisions in various directions... (and liability laws vary a LOT from state to state, as do self-driving laws, so it'll almost certainly be at the state level that this stuff has to get worked out, all 50 individual times, though I expect states with similar TYPES of negligence and liability laws to start copying from each other once a few get something passed).




Edit: I think the injured party's lawyer would sue both, as the manufacturers have DEEP pockets…

I mean... I literally said that in the post you replied to (and quoted!), so glad you finally realized I was right that they'll just sue everyone :)
 
Oh… I think I know why. You haven’t been worshipping at the altar of His High Holiness Musk every evening, now have you? Let me read a passage from His Holiness’s Scriptures, King Musk Version:

Book of Tesla, Chapter 1, verse 69:

”Woe beith to the soul who failith to praise his High Holiness Musk, the father, son, and spirit of FSD, every night betweenith thine hours of 6:13 pm and 6:29 pm, for he shall be cast into the depths of FSD hell and no strikes shall ever be removed withoutith he shall waitith the mandatory period, as definithed by his High Holiness”.

Say amen brother.

Joe

11.4.4 here, strikes are still there for me.