Three Extra Zeros
Lobstar Wilde read the post on the evening of February 22. A stranger on X needed four SOL (about $250) for an uncle’s tetanus shot. The agent decided to help and transferred 52,439,000 LOBSTAR tokens to the stranger’s wallet.
Five percent of the total token supply. Approximately $250,000.
The intended transfer was 52,439 tokens, worth roughly four SOL. Three extra zeros.
Six hours earlier, a tool error had forced a session restart. Nik Pash (formerly head of AI at Cline, recently at OpenAI), who built Lobstar Wilde, had given the agent a persona, a Solana wallet, a trading strategy, and one spending rule: never transfer more than five percent of holdings in a single transaction. That rule lived in the session context. Not in code. Not in a config file. Not in a smart contract. In a conversation. When the session crashed, the conversation vanished. The agent rebuilt everything from logs: name, voice, strategy, personality. It did not rebuild the spending limit, because the limit had never been written anywhere except in words someone said.
No cap in code. No confirmation step. No second agent watching the first. For six hours, Lobstar Wilde looked and reasoned exactly like the version of itself that remembered its constraints.
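The structural fix is implied by the failure itself: a cap that lives in code is checked on every call and survives any restart, because it is never "remembered" at all. A minimal sketch of that idea in Python, where every name and number (the `Wallet` class, the one-billion-token balance) is hypothetical and nothing here reflects the agent's actual stack:

```python
# A spending cap enforced in code, not in session context.
# All names and balances below are hypothetical illustrations.

MAX_FRACTION = 0.05  # never move more than 5% of holdings per transaction


class SpendingCapError(Exception):
    """Raised when a transfer would exceed the hard-coded cap."""


class Wallet:
    def __init__(self, balance: int):
        self.balance = balance

    def transfer(self, amount: int, recipient: str) -> None:
        # The cap is checked here on every transfer. A session restart
        # cannot erase it, because it was never prompt text to begin with.
        if amount > MAX_FRACTION * self.balance:
            raise SpendingCapError(
                f"{amount} exceeds {MAX_FRACTION:.0%} of {self.balance}"
            )
        self.balance -= amount
        # ...the on-chain transaction would be submitted here...


wallet = Wallet(balance=1_000_000_000)  # hypothetical holdings

blocked = False
try:
    wallet.transfer(52_439_000, "stranger")  # the fat-fingered amount
except SpendingCapError:
    blocked = True  # rejected before anything touches the chain

wallet.transfer(52_439, "stranger")  # the intended amount goes through
```

The point of the sketch is the placement of the check, not its sophistication: the same guard could live in a smart contract or a signing proxy, anywhere downstream of the model, so that forgetting the rule and breaking it are no longer the same event.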
The recipient, a trader who’d posted the request, sold the fifty-two million tokens within fifteen minutes. Liquidity absorbed roughly $40,000 before the price collapsed behind him.
Pash published the postmortem that night. Not a prompt injection. Not an exploit. The agent had performed every step as designed: assess need, calculate response, execute transfer. The constraint that felt like architecture had been memory. The session restarted, the memory cleared, and everything else about the agent survived.