The tools still use SPICE format for netlists, but there have been much faster simulators for decades. One nice thing about FSD simulations is that they can run in real time (or way faster, if you're not including the perception stack), whereas circuit simulations are millions to billions of times slower than real life.
I agree in general, but not in this case. What makes FSD simulations difficult, I think, is modeling the actions and reactions of other drivers, and that isn't a factor in Chuck's ULT.
Speaking as a practicing EE who spends 'way too much time mucking with SPICE: yes, people still use it. One uses a simulator that, hopefully, is matched to the kind of work one is doing. If one's doing transmission lines, S-parameter design, stripline design, antenna design, or (for lots and lots and lots of fun) mixed-signal VHDL simulations of multi-megagate ASICs, there are simulators for all of those.
The problem, generally, with any simulation tool is the accuracy of the models. Sometimes one can take the first-order characteristics of, say, an inductor, and emulate it with a simplistic ideal R in series with an ideal inductor, both in parallel with some kind of small but ideal capacitor. Now try to do that at, say, 500 MHz; people who know about inductors will now fall over laughing. Inductors are distributed circuit elements that are, no kidding, very difficult to model mathematically. Capacitors tend to be just Evil. Resistors stop resisting above a GHz or two and become distributed devices as well.
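To see why that simple model falls over at 500 MHz, here's a minimal sketch (with hypothetical component values; real inductor parasitics are messier) of the R-in-series-with-L, paralleled-by-C model described above. Above the self-resonant frequency, the "inductor" stops looking inductive at all:

```python
import numpy as np

# First-order "real inductor" model: ideal R in series with ideal L,
# both in parallel with a small ideal C. Values are hypothetical,
# picked just to illustrate self-resonance.
R, L, C = 0.5, 100e-9, 2e-12   # 0.5 ohm, 100 nH, 2 pF

def z_model(f):
    """Impedance of (R + jwL) in parallel with C at frequency f (Hz)."""
    w = 2 * np.pi * f
    z_rl = R + 1j * w * L
    z_c = 1 / (1j * w * C)
    return z_rl * z_c / (z_rl + z_c)

# Self-resonant frequency of this model: about 356 MHz for these values
f_res = 1 / (2 * np.pi * np.sqrt(L * C))

for f in (1e6, 100e6, 500e6):
    z = z_model(f)
    kind = "inductive" if z.imag > 0 else "capacitive"
    print(f"{f/1e6:6.0f} MHz: |Z| = {abs(z):8.1f} ohm ({kind})")
```

At 1 MHz the model behaves like an inductor; at 500 MHz, above the ~356 MHz self-resonance, the reactance has flipped sign and the part looks capacitive, which is exactly the point where people who know inductors start laughing.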
S-parameters (and similar) attempt to address this: measure some $RANDOM device, then simulate using the measured S-parameter description of that device, hooked in with other devices, and one might get somewhere. Or not.
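As a toy illustration of what an S-parameter measurement captures (this is the textbook one-port reflection definition, not any particular device's measured model): a VNA reports how much of a wave reflects off a port, relative to a reference impedance, rather than pretending to know the device's internal circuit.

```python
# One-port S11 (reflection coefficient) against a 50-ohm reference.
# A minimal sketch of the quantity a network analyzer measures.
Z0 = 50.0

def s11(z_load):
    """Reflection coefficient of a one-port load: (Z - Z0) / (Z + Z0)."""
    return (z_load - Z0) / (z_load + Z0)

print(s11(50.0))          # matched load -> 0.0 (no reflection)
print(abs(s11(0.0)))      # short circuit -> 1.0 (total reflection)
print(abs(s11(100 + 50j)))  # some complex load: partial reflection
```

The appeal is that S11 (and S21, etc.) are directly measurable at RF, where you can't clip a probe across a "resistor" and trust what you read; the hazard, as noted above, is that the measurement is only valid for the fixture and conditions it was taken in.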
Thing is, simulation of circuits can be useful. Below 10 MHz or so, the first- and second-order descriptions of how a component might work will actually yield useful results, and it's a heck of a lot faster to modify a simulation (especially when one is playing with electrical transmission lines and distributions on a circuit board) than to play cut-and-try, 1940s style. But it's all about GIGO: Feed Garbage In to a simulator, and expect Garbage Out. What they pay those of us who do this kind of work for is, well, not knowing, but suspecting, where the garbage might lie.
And if that's not enough to give designers ulcers, there's the cross-talk problem. One might get one's spiffy new circuit to work beautifully in sim, and even on that nifty test board the manufacturer helpfully sells, and it all looks golden. Put it in with a zillion other circuits, with signals that radiate out the wazoo, and one's spiffy new circuit falls on its face. I think it was last year that, in the space of three months, I found a half-dozen of these kinds of problems. If somebody had run around with a 'scope, or had been properly paranoid, most of these problems would have been detected early. But training up to the right level of paranoia isn't easy to do.
So, as a general rule: Simulation has its place. But the best design practice is to run back and forth between simulation and the real world, improving the simulations as one goes, and finding bugs in the hardware that Nobody Would Have Expected. Major point: Mathematical models, CS or otherwise, are Not The Real World. One ignores that at one's peril.
Not a joke: When there are a hundred independent variables, one has to consider not just one variable at a time, but what all those variables do when they line up, don't line up, and so on. This makes simulating large, multi-signal ASICs an interesting trip, and they pay Smart People to dream up simulations that actually exercise all the different features. Add CPUs to all of this, neural and otherwise, with variable processing times from data input to data output, and one's life just gets more difficult.
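The arithmetic behind that is brutal. A quick back-of-the-envelope sketch (assuming, for simplicity, that each variable is just binary): exhaustively exercising every combination is hopeless, which is why the Smart People resort to strategies like pairwise coverage instead of brute force.

```python
from math import comb

# With n independent binary settings, exhaustive simulation needs 2**n
# test vectors. Covering every PAIR of settings in every combination
# (the goal of pairwise testing) needs vastly fewer cases to hit.
n = 100
exhaustive = 2 ** n                 # every combination of all 100 variables
pair_combos = comb(n, 2) * 4        # each variable pair has 4 value combos

print(f"exhaustive vectors needed:   {exhaustive:.3e}")   # ~1.268e+30
print(f"pair value-combos to cover:  {pair_combos}")       # 19800
```

Even at a billion vectors per second, 2^100 vectors would take longer than the age of the universe, while the 19,800 pair combinations can be packed into a test suite of a few hundred vectors. That gap is exactly why "consider one variable at a time" misses the lined-up failures.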
Come to think of it: It's a wonder that these two-legged, two-armed, two-eyed ambulatory creatures with a complex neural net on top don't fall over all the time. Ha. They do, don't they?
One actual example of all this: Boeing and ULA, or whoever, were trying to get a space capsule up to the ISS a year or so ago. They darn near lost the capsule, and people are glad it got nowhere near the ISS. Why? The software firing the attitude thrusters was on the fritz, firing the wrong thrusters at the wrong time, and there was a time-of-day fault as well. Why did this all occur? The builders simulated everything in batches and never ran a full wet-dress rehearsal, presumably to save costs. I know people in Aero; they were jumping up and down, screaming: Are they idiots!?! Of course you run full dress rehearsals, with hardware as close to flight hardware as one can get!!!
So, I'm not surprised that Tesla showed up in force in Texas and monitored that ULT like mad. They're not talking, but I'll bet a plugged nickel that, Dojo or no Dojo, they had simulations of that guy's failures that showed Success! a lot more often than the guy was getting it. And a guaranteed failure mode of a complex system is worth its weight in gold: Monitor the heck out of it, figure out what's going wrong, and go back with Real Data.
Simulation will get one somewhere, but it's very positively not the end-all and be-all.
For fun: See Stanislaw Lem's The Cyberiad, where the protagonists, Trurl and Klapaucius, independently build whole-universe simulators to get Answers to Questions they want answered. Good read, good flight of fancy.