I don't disagree about stopping for parked cars in general but I'm not sure how you can say that TACC knew not to follow the Prius anymore. Neither of us knows for sure what the TACC actually thought here. We can't even see what the indicator was on the dash. Based on the video and my experiences with TACC, it looks to me like the Prius was sufficiently in front of the car for TACC to stay locked on to it, thinking it was just going around a bend past some stopped/parked cars. Either way, it doesn't really matter. The point is that this is simply another example scenario (out of many) for why we should not expect TACC to work well on surface streets. We can use it on surface streets at our own risk but we should be very prepared to intervene often.
I don't see the difference between surface streets and highways.
When I think highway, I think CA-17, I-280, US-101, I-880, I-680, and, further east, CA-152, I-580, I-5, CA-120, and CA-99. These are often jam-packed, often fast-moving, often many lanes wide, often full of people merging on and off, and some of them have cross traffic and construction crews moving about in odd directions. CA-152 has both stoplights and freeway portions; CA-132 is the same. You can be driving down a large freeway and suddenly hit a metering light at a transfer to another freeway, with traffic solid, creeping, or stopped. Farm areas blur the distinction even further.
Many surface streets have long expanses that allow fast driving without much cross traffic, and then you reach areas where there is cross traffic.
Any kind of autopilot would have to know what the driver intends to do. In the example here, if turning right like the Prius was the direction you wanted to go, then autopilot should follow that vehicle so as not to hit the Prius or any other vehicle. But if you actually wanted to continue toward the stopped traffic at the light ("straight," in driving nomenclature), the autopilot would need to know that instead. TACC is not able to interpret your intentions and future actions, if for no other reason than the programmers can't know whether you even realize your turn signal is on (or, for that matter, whether you're a good signaler). Obviously, a good autopilot should predict probable outcomes by taking in all sensory input and interpreting movements, signals, visuals, attitudes, etc. But I think this is a key difference between the current infantile version of TACC and what autopilot would be.
Let me state that distinction more plainly: if the autopilot is driving, it knows where it intends to go and will try to go there. If the autopilot isn't driving, it (TACC, assist, autopilot, etc.) doesn't know what you, the driver, intend, will decide to do, or will end up doing.* This is a HUGE difference between what we've all seen Google's driverless cars do and the Autopilot features Tesla currently offers. It's easy to watch all the YouTube videos Google puts out and assume Tesla is at least that modern (it isn't**).
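To make that distinction concrete, here's a toy sketch in Python. Everything in it is invented for illustration — the function names, the cues, and the confidence numbers are mine, not anything Tesla or Google actually implements:

```python
# Toy illustration of "driving" vs. "assisting" -- NOT real autopilot logic.

def planner_intent(route_next_turn):
    # Self-driving case: the system picked the route itself,
    # so its intent is known by construction, with no guessing.
    return route_next_turn

def infer_driver_intent(turn_signal, lane_position):
    # Driver-assist case: intent must be guessed from cues the driver
    # may or may not use correctly (e.g., a forgotten turn signal).
    # Returns a (guess, confidence) pair; numbers are made up.
    if turn_signal == "right" and lane_position == "right":
        return ("right", 0.9)    # signal on and positioned for the turn
    if turn_signal == "right":
        return ("right", 0.6)    # signal on, but not positioned for it
    if lane_position == "right":
        return ("right", 0.4)    # drifting right with no signal: ambiguous
    return ("straight", 0.5)     # no cues at all: essentially a coin flip

# The planner is certain; the assist system is never certain.
print(planner_intent("right"))
print(infer_driver_intent("off", "center"))
```

The point of the sketch is the asymmetry: the first function has nothing to infer, while the second can never return a confidence of 1.0 no matter how good the cues are — which is exactly the Prius-at-the-light ambiguity above.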
I think the programmer-to-driver communication from Tesla ought to be better. You'd think the manual writer in the Palo Alto software division would just put it in the owner's manual, all the Silicon Valley Tesla customers would read that manual, and everything would be easy. But instead I have this foreboding fear that the driver's manual is actually written in a foreign language dozens of time zones away by insular software programmers, routed through Southern California, East Coast Lawyerville (Iowa/Connecticut), Palo Alto, and Fremont before being sent to translators in China and India and then back to Palo Alto, many months after the actual software is released.
---
* What if you intended to go right, had your turn signal on, and there wasn't space in the lane for your fat Tesla to fit between the stopped car and the right curb, even though the skinny Prius easily made it? What if there was room, but you didn't pull far enough right because you thought there wasn't? What is driver assist supposed to do then? Autopilot would (supposedly) figure it out, but driver assist would have to be far more capable even than that to guess whether you were going to try to squeeze through, assuming it even knew with confidence that you were turning right to begin with.
** They say the LIDAR units Google's driverless cars use cost $150,000 apiece.