When a system becomes complex and our knowledge peters out, we’re tempted to assert, in the words of Gilbert Ryle, that there’s a ‘ghost in the machine.’
“How does the stoplight work?” “Well, it knows that there’s a break in the traffic so it switches from green to red.”
Actually, it doesn’t ‘know’ anything.
Professionals can answer questions about how it works. All the way down.
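For illustration only: at bottom, a stoplight controller is something like the pure transition function sketched below. The sensor name (`gap_detected`) and the timings are made up, but the shape is the point: states, timers, and conditions, with no 'knowing' anywhere.

```python
# A minimal sketch of a sensor-actuated stoplight controller.
# The gap_detected input and all timings are hypothetical;
# it's explicit rules all the way down.

GREEN, YELLOW, RED = "green", "yellow", "red"

def next_state(state, elapsed, gap_detected,
               min_green=10.0, yellow_time=3.0, red_time=20.0):
    """Pure transition function: current state + inputs -> next state."""
    if state == GREEN and elapsed >= min_green and gap_detected:
        return YELLOW  # a break in the traffic: begin the change
    if state == YELLOW and elapsed >= yellow_time:
        return RED
    if state == RED and elapsed >= red_time:
        return GREEN
    return state  # otherwise, hold the current light

# Ask it "how?" and it answers deterministically:
assert next_state(GREEN, elapsed=12.0, gap_detected=True) == YELLOW
assert next_state(GREEN, elapsed=12.0, gap_detected=False) == GREEN
```

Every question about its behavior has a mechanical answer; nothing in it resembles knowing.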
[This is one reason the LLM AI tech stack is so confounding: there are no experts who can tell you exactly what's going to happen next. It turns out that there might be a ghost after all, or at least that's the easiest way to explain it.]