After a 100 km test drive and sleeping on it: this AP update really is a step change.
I had been thinking that AP's approach to driving would struggle to ever reach parity with human driving. Until now it has been about mechanically tracking lane markings and radar distances.
That is not how you drive. Good driving is based on situational awareness and the ability to predict the next few seconds ahead. You also use more than visual recognition of lane markings. Take hill crests, for example: a person learns to trust that the markings continue beyond the crest, and the inner ear confirms this with the sense of balance when approaching it. AP would panic at the crest because the visual markings gave unexpected input for a second.
Now with 2018.10.4, the car seems to have a basic perception of what it is doing. I base this anecdote on letting it drive a narrow, very winding country road with no white lane markings, where you really do need to understand the shape of the road and where the oncoming traffic is. I activated AP before the unmarked stretch began, and it continued completely confidently for the short 1-2 km of it, until a final 90-degree bend where it would have cut the corner and was happy to hand over. But that is a bend where humans have to be extra careful as well. I did not expect it to manage this, maybe ever, given the missing lane markings and the logic that has depended on them until now.
Later I was also able to activate AP on a stretch where patches of snow and ice covered the asphalt, and it drove even there. Not perfectly, but what stood out was that when confidence was lower, it didn't panic; it even smoothly corrected its driving line. AP also avoided a big pool of water on the shoulder of the road and in the lane.
This is much less an AP that simply follows road markings (even though it still requires markings to activate). It seems to have some clue about the mission it is on.
Not to hype parity yet or to discuss the singularity, but... the car can hear us; it just cannot talk back yet.