Notes on token engineering, from before it was the consensus.

PhD candidate in computational pathology at Swansea, currently finishing my thesis on whole-slide image classification. I write about token engineering — the craft of building harnesses around language models so that the tokens coming out are grounded, useful, and aimed at something that matters.

I am the specific ADHD-and-autistic combination where the two halves spent twenty years pulling against each other — the scatter and the system, the branching and the bracing — until the tooling finally arrived that lets them pull in the same direction. Token engineering is the first thing that has ever felt like coming home to both of them at once.

Read the first one: The Token Economy →