Learn AI by Building It
“What I cannot create, I do not understand.” — Richard Feynman
text → tokens → vectors → attention → GPT
transformer.py
input
"The cat sat on the mat"
↓
tokenize
The · cat · sat · on · the · mat
embed
[ 0.82, -0.31, 0.67, ...]
[-0.45, 0.91, 0.23, ...]
self-attention
generate
"...and purred softly."
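The pipeline above — tokenize, embed, self-attention, generate — can be sketched in a few dozen lines of plain Python. This is a toy illustration, not the course's implementation: the whitespace tokenizer, the character-code embeddings, and the single attention head with no learned projections are all simplifications (a real GPT uses subword tokenization and learns its embedding and projection weights during training).

```python
import math

def tokenize(text):
    # Naive whitespace tokenizer; real GPTs use a subword scheme such as BPE.
    return text.split()

def embed(tokens, dim=4):
    # Toy deterministic embeddings derived from character codes;
    # a real model learns these vectors during training.
    vectors = []
    for token in tokens:
        seed = sum(ord(ch) for ch in token)
        vectors.append([math.sin(seed * (i + 1)) for i in range(dim)])
    return vectors

def self_attention(vectors):
    # Single-head scaled dot-product attention with Q = K = V = input
    # (no learned projection matrices).
    dim = len(vectors[0])
    output = []
    for query in vectors:
        # Similarity of this token's query to every token's key.
        scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(dim)
                  for key in vectors]
        # Softmax over the scores (shifted by the max for stability).
        peak = max(scores)
        exps = [math.exp(s - peak) for s in scores]
        total = sum(exps)
        weights = [e / total for e in exps]
        # Weighted sum of all value vectors.
        output.append([sum(w * v[i] for w, v in zip(weights, vectors))
                       for i in range(dim)])
    return output

tokens = tokenize("The cat sat on the mat")
attended = self_attention(embed(tokens))
print(tokens)
print(len(attended), len(attended[0]))  # 6 attended vectors of dimension 4
```

The missing step is generation: a real model would project each attended vector to a distribution over the vocabulary and sample the next token — that is what the course builds out in full.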
Courses
Pick a path and start building.
Build GPT from Scratch
From raw text to a working GPT. You'll implement every component of a transformer-based language model from the ground up.
Transformers · PyTorch · GPT
11 chapters · 20 problems
Start Building
Coming Soon
Mechanistic Interpretability
Reverse-engineer neural networks. Dissect transformers layer by layer, neuron by neuron.
Interpretability · Transformers · Research
Locked
How It Works
01
Read the Lesson
Start with a clear explanation of the concept.
02
Explore the Visual
Use interactive visuals to understand how it works.
03
Solve the Challenge
Practice by writing code and applying the idea yourself.