Ouch. Well, I suppose I should be glad to have given @Artful Dodger and @wtlloyd a laugh. Neither my ex nor I found it amusing that the neural net PhD I received 30 years ago proved less valuable than expected in my career. Ironic that the "AI" (actually NN) talent war now going on is the fiercest talent war Elon says he has seen.
Of course I wasn't trained as an engineer; I was in the inaugural classes of a couple of Cognitive Science etc. programs, whence neural networks arose (literally in my case; hope I don't dox myself here). Though much of NNs seems to have been renamed or reframed by engineering departments, perhaps to claim the field as their own, to have their own jargon, and to sweep "artificial intelligence"'s ignominious roots in rule-based systems under the rug. (Who here remembers Marvin Minsky's "sterile" comment from the rules-vs-NNs war, or even who Minsky was and that he was on the wrong side of history?) Still, I hear the core is not all that much changed from the late nineties.
Trying to share knowledge from the junction of neuroscience, neural modeling, non-linear dynamics, psychophysics,… is difficult when others don't speak the lingo. Indeed, my posts here using such referents usually went unremarked or, in some cases, were even deleted.
Fortunately, having a grasp of how vision works computationally (e.g., yet again, FCS/BCS) gave me confidence that a camera-based approach would work, even if Tesla used just a big backpropagation network with lots of layers, or a variant thereof. Tesla could, after all, pick up a few tricks from the brain, even though the brain predominantly uses unsupervised learning.
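For anyone who hasn't seen the machinery up close, a plain backpropagation network of the sort I mean can be sketched in a few lines. This is a toy of my own construction (the sizes, learning rate, and XOR task are illustrative assumptions, and have nothing to do with Tesla's actual stack): one hidden layer, trained by the chain rule, layer by layer.

```python
# Toy backpropagation sketch (my own illustration, not any production system):
# a one-hidden-layer network learning XOR by batch gradient descent.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Random initial weights; hidden width of 8 is an arbitrary choice.
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
losses = []
for step in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(((out - y) ** 2).mean()))
    # Backward pass: propagate the error gradient through each layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates.
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

preds = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
```

The "lots of layers" part is just this same chain-rule step repeated deeper; everything else (convolutions, better optimizers, etc.) is refinement on top.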
In any case, and as with motor learning, we often learn fastest when we miss. And I did, in fact, do one or two things with NNs professionally. Though no great shakes in themselves, those experiences provided calibration for estimating the progress of the field.
In investing, it is helpful to know approximately when to expect the real deal versus, say, a shiny object.