The ghost in the machine

When a system becomes complex and our knowledge peters out, we’re tempted to assert, in the words of Gilbert Ryle, that there’s a ‘ghost in the machine.’

“How does the stoplight work?” “Well, it knows that there’s a break in the traffic, so it switches from green to red.”

Actually, it doesn’t ‘know’ anything.

Professionals can answer questions about how. All the way down.
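To make that concrete, here’s a minimal sketch of a stoplight controller, written in Python. Everything in it (the sensor, the state names, the timings) is invented for illustration, but it shows the point: no knowing, just a sensor, a clock and a handful of rules.

```python
# A minimal sketch of a stoplight controller. The sensor, state names,
# and timings are all invented for illustration; a real signal cabinet
# is more involved. The point: there is no 'knowing' here, just inputs,
# state, and rules. All the way down.
import time

GREEN, YELLOW, RED = "green", "yellow", "red"
MIN_GREEN = 10    # seconds green must hold before it may change
YELLOW_TIME = 3   # seconds of yellow before red
RED_TIME = 12     # seconds of red before cycling back to green

def gap_in_traffic() -> bool:
    """Stand-in for the induction loop buried in the asphalt.
    Returns True when no car has crossed the loop recently."""
    return True  # hypothetical sensor read

def run_light() -> None:
    state, changed_at = GREEN, time.monotonic()
    while True:
        elapsed = time.monotonic() - changed_at
        if state == GREEN and elapsed >= MIN_GREEN and gap_in_traffic():
            state, changed_at = YELLOW, time.monotonic()
        elif state == YELLOW and elapsed >= YELLOW_TIME:
            state, changed_at = RED, time.monotonic()
        elif state == RED and elapsed >= RED_TIME:
            state, changed_at = GREEN, time.monotonic()
        time.sleep(0.1)  # poll the clock; real controllers are event-driven
```

Every ‘decision’ the light makes is right there in the branches. A traffic engineer could answer ‘how’ at every line.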

[This is one reason the LLM AI tech stack is so confounding: there are no experts who can tell you exactly what’s going to happen next. It turns out that there might be a ghost, or at least that’s the easiest way to explain it.]

After you make a strategic error

Of course, we make strategic errors all the time.

Not enough time. Incomplete information. A fast-changing system.

Sooner or later, a significant strategic error occurs. Don’t beat yourself up.

Now what?

The real problems occur after the error is made.

Don’t follow a strategic error with an investment error, an effort error, or a time error. Don’t follow it with an emotional one, either.

Sticking with our original error, devoting our savings, well-being and future to proving ourselves right: that’s the real error. Don’t invest in the cover-up.

After you make a strategic error, announce it. Own it. And then move on.