“Algorithmic Sabotage” — May 2026

A system that cannot be questioned—a system that treats every input as truth—invites sabotage. By removing human discretion, we force humans to communicate with the system only through actions. And when the only language left is action, that language tends toward violence or deception.

In the industrial age, if you wanted to hurt a factory, you threw a wrench into the gears. The owner saw the broken gear. In the information age, if you want to hurt a company, you make its algorithm look stupid. The CEO cannot see the "stupidity." They only see the losses. This is “algorithmic sabotage.”

Today, quant funds spend millions on "adversarial robustness"—training their AIs to ignore sabotage. But it is an arms race: for every defensive algorithm, there is a saboteur building a slightly more clever liar.

Let’s get pragmatic. You are a mid-level manager at an Amazon warehouse. The algorithmic management system (the "Hourly Fulfillment Index") has just flagged you for "idle time" because you took a 4-minute bathroom break. Your productivity score drops. You are one strike from termination.
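To see why such a system cannot be reasoned with, consider what a rule like this looks like in code. The sketch below is purely illustrative—the 3-minute threshold, the one-point penalty, and the function name are assumptions, not Amazon's actual mechanism—but it captures the essential blindness: the rule sees a gap between scan events, not the reason for it.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of a naive idle-time rule in the spirit of the
# "Hourly Fulfillment Index" described above. The 3-minute threshold
# and one-point penalty are illustrative assumptions.
IDLE_THRESHOLD = timedelta(minutes=3)
PENALTY_PER_FLAG = 1

def score_shift(scan_times, starting_score=100):
    """Flag every gap between consecutive scans longer than IDLE_THRESHOLD.

    Returns (score, flags): the penalized score and a list of
    (gap_start, gap_length) pairs the rule treated as "idle time".
    """
    flags = []
    for prev, curr in zip(scan_times, scan_times[1:]):
        gap = curr - prev
        if gap > IDLE_THRESHOLD:
            # No context here: a bathroom break and genuine idleness
            # produce identical scan gaps, so both are flagged.
            flags.append((prev, gap))
    return starting_score - PENALTY_PER_FLAG * len(flags), flags

# A 4-minute bathroom break between otherwise steady scans gets flagged.
t0 = datetime(2026, 5, 1, 9, 0)
scans = [t0, t0 + timedelta(minutes=1), t0 + timedelta(minutes=5)]
score, flags = score_shift(scans)
print(score, len(flags))  # the system "sees" idleness, never the reason
```

Note that nothing in the function accepts an explanation. The only way to change the score is to change the scan pattern—which is exactly the point: when the system takes input only as behavior, behavior becomes the only language left.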