Learn AI by Building It
“What I cannot create, I do not understand.” — Richard Feynman
text → tokens → vectors → attention → GPT
transformer.py
input
"The cat sat on the mat"|
↓
tokenize
The | cat | sat | on | the | mat
embed
[ 0.82, -0.31, 0.67, ...]
[-0.45, 0.91, 0.23, ...]
self-attention
generate
"...and purred softly."
Courses
Guided paths from first principles to working systems.
Build GPT from Scratch
From raw text to a working GPT. You'll implement every component of a transformer-based language model from the ground up.
Transformers · PyTorch · GPT
Start Building
Coming Soon
Mechanistic Interpretability
Reverse-engineer neural networks. Dissect transformers layer by layer, neuron by neuron.
Interpretability · Transformers · Research
Locked