microdiffusion

A single Python file (~300 lines, zero dependencies) that trains and samples from a diffusion model. Inspired by Andrej Karpathy's microgpt.

What it does

Implements the complete DDPM (Denoising Diffusion Probabilistic Models) algorithm in pure Python:

  • Autograd engine (from microgpt, with iterative topological sort)
  • Cosine noise schedule
  • Forward diffusion
  • 3-layer MLP denoiser with skip connections
  • Label conditioning with learned embeddings
  • Classifier-free guidance (CFG)
  • DDPM sampling with ASCII art output
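The forward process and cosine schedule above can be sketched in pure Python with only the standard library (the function names here are illustrative, not necessarily the ones used in microdiffusion.py):

```python
import math
import random

T = 20  # diffusion timesteps, matching the default below

def cosine_alpha_bar(t, T, s=0.008):
    """Cumulative signal fraction alpha_bar(t) under the cosine noise schedule
    (Nichol & Dhariwal); decreases from 1 at t=0 toward 0 at t=T."""
    f = math.cos((t / T + s) / (1 + s) * math.pi / 2) ** 2
    f0 = math.cos(s / (1 + s) * math.pi / 2) ** 2
    return f / f0

def forward_diffuse(x0, t, T):
    """q(x_t | x_0): blend the clean image with Gaussian noise at timestep t.
    Returns the noised image and the noise, which the MLP learns to predict."""
    ab = cosine_alpha_bar(t, T)
    noise = [random.gauss(0.0, 1.0) for _ in x0]
    xt = [math.sqrt(ab) * x + math.sqrt(1 - ab) * n for x, n in zip(x0, noise)]
    return xt, noise

# example: noising a flat 8x8 image halfway through the schedule
x0 = [1.0] * 64
xt, eps = forward_diffuse(x0, T // 2, T)
```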

Trains on 62 hand-crafted 8×8 binary images (digits, letters, geometric patterns) stored in data.txt.

Run

python microdiffusion.py

No pip install. No dependencies. ~15 minutes on a modern laptop.

Output

num images: 62
num params: 50688

--- training (1500 steps) ---
step    1/1500 | loss 1.042816
...
step 1500/1500 | loss 0.295117

--- conditional sampling (label-guided) ---
generate 'heart':
  * * . * *
* * * . * * *
* * * * * * * *
. * * * * * * .
. . * * * * . .
. . . * * . . .

Configuration

Hyperparameters are at the top of microdiffusion.py:

| Parameter | Default | Description |
|---|---|---|
| NUM_STEPS | 1500 | Training steps |
| HIDDEN_DIM | 128 | MLP hidden-layer width |
| T | 20 | Diffusion timesteps |
| GUIDANCE_SCALE | 3.0 | CFG strength during sampling |
| LEARNING_RATE | 0.001 | Initial learning rate |
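GUIDANCE_SCALE controls classifier-free guidance: at each sampling step the denoiser is evaluated twice, once with the label embedding and once unconditionally, and the two noise predictions are combined. A minimal sketch of that combination step (the function name is hypothetical):

```python
GUIDANCE_SCALE = 3.0

def guided_noise(eps_cond, eps_uncond, scale=GUIDANCE_SCALE):
    """Classifier-free guidance: extrapolate from the unconditional noise
    prediction toward the label-conditioned one by `scale`."""
    return [eu + scale * (ec - eu) for ec, eu in zip(eps_cond, eps_uncond)]
```

A scale of 1.0 recovers the plain conditional prediction; larger values push samples harder toward the requested label at some cost in diversity.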

Dataset

Add or edit images in data.txt. Format:

# label_name
.#####..
......#.
......#.
..####..
......#.
......#.
.#####..
........

# = black (0.0), . = white (1.0). Each image is 8×8.
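A loader for this format might look like the following sketch; the actual parsing code lives in microdiffusion.py and may differ:

```python
def load_images(path="data.txt"):
    """Parse data.txt into (label, 64-float list) pairs.
    '#' maps to 0.0 (black) and '.' to 1.0 (white), as described above."""
    images, label, rows = [], None, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line.startswith("#") and not set(line) <= {"#", "."}:
                # a comment line like "# label_name" starts a new image
                label, rows = line[1:].strip(), []
            elif line:
                rows.append(line)
                if len(rows) == 8:  # an image is complete after 8 rows
                    pixels = [0.0 if c == "#" else 1.0
                              for r in rows for c in r]
                    images.append((label, pixels))
    return images
```

Note the small wrinkle that `#` is both the comment marker and the black-pixel character; the sketch distinguishes them by checking whether a line contains anything other than `#` and `.`.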
