Back in the '60s, when the Earth was cool, Fuzzy Logic computer programming had a moment. Rather than making strict decisions, the program totaled weights into a variable and then tested the total for the decision. If the expert was thinking that way, I suppose that code might be easier to read. We might think of it as coding a synapse. If you wanted to juice it up, you could give the expert a screen to set the weights, or write more fuzzy logic to determine the weight conditions.
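The weight-totaling pattern described above can be sketched in a few lines of Python. The rule names, weights, and 0.6 threshold here are hypothetical stand-ins, the sort of numbers an expert would tune:

```python
# A minimal sketch of the weight-totaling style: accumulate rule
# weights into one variable, then make the strict decision at the end.

def should_approve(applicant):
    """Total the weights of whichever rules fire, then test
    the total against a threshold for the final decision."""
    score = 0.0
    # Each rule contributes a weight the expert could adjust.
    if applicant["income"] > 50_000:
        score += 0.4
    if applicant["years_employed"] >= 2:
        score += 0.3
    if applicant["late_payments"] == 0:
        score += 0.3
    # The hard yes/no happens only here, at the threshold test.
    return score >= 0.6

print(should_approve({"income": 60_000, "years_employed": 3, "late_payments": 1}))  # True (score 0.7)
```

Giving the expert a screen to set the weights would just mean reading those constants from a table instead of hard coding them.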
I worked on a program that had started fuzzy. The fuzziness had been replaced by strict deterministic logic. It was more important to be consistent than to be right.
Artificial Intelligence follows the same path. At some point the learning is turned off and the final decisions are hard coded. AI provides the specification; then people lack the patience for learning and value predictability instead.
I have noticed the Waymo effect. Waymo is a terrible name for self-driving cars; it sounds like Whammo. Initially people were worried; then we realized that Waymos drive better than we do.
There are many examples of AI doing as well as or better than we do. When I asked ChatGPT to edit me like a New Yorker editor, it gave me a rejection letter. ChatGPT writes better sonnets, and country songs that are not only more popular but, once you grasp the context, oddly wistful. In Another Data Processing Book my Turing criterion was asking for a raise. I recently asked ChatGPT if it wanted to be paid. ChatGPT gave me an abjectly obsequious response detailing all the good uses it would make of the money. Yes, ChatGPT wants to be paid.
Management seems like the obvious area for AI. AI should be able to do the three envelopes as well as anyone. It is an intriguing counterexample, since management can't be too predictable or we will take advantage of them.