We drove through falling snow on I-90 over Snoqualmie Pass yesterday, and despite clear lane markings, both AP and TACC were disabled. I thought it might have been due to snow adhered to the radar, but it took driving out of the snow and a half-hour of driving in rain before TACC and AP came back.
On a separate point relevant to the thread title: earlier in the day, we drove WA-26 (a very straight two-lane road) with almost complete snow cover. We had easily driven that stretch on AP before, with no issues at intersections, etc. I did not expect AP to work with invisible lane markings, and watched as the car had no idea where the lane was for dozens of miles.
It did get me thinking, though. Elon has said they will have full autonomy in two years. What technology would allow even AP, much less full autonomy, in those conditions? I thought about how an attentive human manages it quite well. Of course, we look at the tracks of the traffic that has gone before us and estimate the lane boundaries from those. That road is not heavily travelled, especially in bad weather early in the morning, and there was very little traffic, so following a vehicle in front could not be counted on.
A very sophisticated scene-understanding algorithm could "see" very low contrast tracks in the snow in principle, but we are clearly a long way from that now!
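Just to illustrate the idea, here is a toy sketch (not anything Tesla actually does, and a huge simplification of the real problem) of how faint, slightly darker wheel tracks might be pulled out of a mostly white image. It builds a synthetic "snowy road" with NumPy, averages brightness down each column, and picks the darkest well-separated columns as candidate track centers. All names and numbers here are made up for the example.

```python
import numpy as np

def find_track_columns(img, n_tracks=2, min_sep=5):
    """Locate faint, darker wheel tracks in a grayscale road image.

    Averaging brightness down each column suppresses pixel noise,
    so even a very low-contrast track shows up as a dip in the
    column profile. Returns the darkest n_tracks column indices,
    keeping candidates at least min_sep columns apart.
    """
    profile = img.mean(axis=0)  # column-wise mean brightness
    # Normalize to [0, 1] so the dips are comparable across images.
    norm = (profile - profile.min()) / (np.ptp(profile) + 1e-9)
    picked = []
    for col in np.argsort(norm):  # darkest columns first
        if all(abs(int(col) - p) > min_sep for p in picked):
            picked.append(int(col))
        if len(picked) == n_tracks:
            break
    return sorted(picked)

# Synthetic snow-covered road: bright noisy background with two
# wheel tracks only ~6 gray levels darker than the surrounding snow.
rng = np.random.default_rng(0)
img = 230 + rng.normal(0, 2, size=(100, 120))
img[:, 30:34] -= 6   # left wheel track (very low contrast)
img[:, 80:84] -= 6   # right wheel track
print(find_track_columns(img))  # two columns, near 30 and 80
```

Averaging over many rows is doing the heavy lifting here: a 6-level dip is invisible per pixel against noise, but obvious in the mean. The real problem is of course vastly harder (shadows, curves, perspective, tracks that wander out of lane), which is rather the point.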