🧬 Genesis 1B — Live Training Playground

A 1-billion-parameter language model being pre-trained from scratch on 2× RTX 4090s. No cloud compute. No corporate backing. One person, one workstation.

Built by Kroonen AI — read the technical blog post for the full story.

This is a raw base model — it completes text; it doesn't chat. Select a checkpoint and watch the model evolve from random noise to coherent language as training progresses: earlier checkpoints produce mostly noise, later checkpoints show emergent structure and fluency.

[Interactive controls: training checkpoint selector and generation sliders]
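The generation sliders above presumably map to standard decoding knobs such as temperature and nucleus (top-p) sampling. As an illustrative sketch only — not code from this Space — here is how those two knobs act on a model's output logits when picking the next token:

```python
import math
import random

def sample_token(logits, temperature=1.0, top_p=1.0, rng=None):
    """Sample a token id from raw logits with temperature scaling and
    nucleus (top-p) filtering. Purely illustrative; a real model would
    supply a logit per vocabulary entry."""
    rng = rng or random.Random()
    # Temperature scaling: values < 1 sharpen the distribution,
    # values > 1 flatten it toward uniform.
    scaled = [l / temperature for l in logits]
    # Numerically stable softmax.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Nucleus filtering: keep the smallest set of highest-probability
    # tokens whose cumulative mass reaches top_p.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, mass = [], 0.0
    for i in order:
        kept.append(i)
        mass += probs[i]
        if mass >= top_p:
            break
    # Renormalize over the kept set and draw one token.
    z = sum(probs[i] for i in kept)
    r = rng.random() * z
    acc = 0.0
    for i in kept:
        acc += probs[i]
        if r <= acc:
            return i
    return kept[-1]
```

With a very small top-p the filter keeps only the single most likely token, which is why low-temperature, low-top-p settings make a base model's completions nearly deterministic.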

Genesis is being trained live — new checkpoints appear as training progresses. Hit 🔄 Refresh to check for new ones.
Inference runs on Hugging Face ZeroGPU (NVIDIA H200). The base model weights are currently private while the model undergoes alignment training (SFT/DPO). The final aligned model will be released under Apache 2.0, with continued training producing improved revisions over time.