Atomic GPT uses a Go-based autograd engine to compute gradients for a minimalist Transformer.
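The core idea of a scalar autograd engine can be sketched as follows. This is an illustrative micro-version, not Atomic GPT's actual API: the `Value` type, `Mul`, and `Backward` are hypothetical names. Each node records its inputs and a closure that pushes gradients backward through the chain rule.

```go
package main

import "fmt"

// Value is a scalar node in the computation graph: it holds the data,
// the accumulated gradient, and a closure that propagates gradients
// to its inputs. (Hypothetical sketch, not Atomic GPT's real types.)
type Value struct {
	Data, Grad float64
	backward   func()
	prev       []*Value
}

// Mul builds a product node and records how gradients flow back:
// d(ab)/da = b and d(ab)/db = a.
func Mul(a, b *Value) *Value {
	out := &Value{Data: a.Data * b.Data, prev: []*Value{a, b}}
	out.backward = func() {
		a.Grad += b.Data * out.Grad
		b.Grad += a.Data * out.Grad
	}
	return out
}

// Backward runs reverse-mode autodiff from this node: build a
// topological order, seed the output gradient with 1, then call each
// node's backward closure in reverse order.
func (v *Value) Backward() {
	var topo []*Value
	seen := map[*Value]bool{}
	var build func(n *Value)
	build = func(n *Value) {
		if seen[n] {
			return
		}
		seen[n] = true
		for _, p := range n.prev {
			build(p)
		}
		topo = append(topo, n)
	}
	build(v)
	v.Grad = 1
	for i := len(topo) - 1; i >= 0; i-- {
		if topo[i].backward != nil {
			topo[i].backward()
		}
	}
}

func main() {
	x := &Value{Data: 3}
	y := &Value{Data: 4}
	z := Mul(x, y) // z = x*y = 12
	z.Backward()
	fmt.Println(z.Data, x.Grad, y.Grad) // gradients: dz/dx = 4, dz/dy = 3
}
```

A full engine adds more operations (add, tanh, exp, matmul built from scalars), but the graph-plus-closure pattern is the same.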
Guide Explanation:
Line = Loss trend
Dot color = Target Confidence
Model Stats
Params: 0
Language: Go 1.21+
Optimizer
Type: Adam
Learning Rate: 0.05
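One Adam update for a single parameter can be sketched like this. The learning rate 0.05 matches the rate shown above; the beta and epsilon constants are the usual Adam defaults and are assumptions here, not values read from Atomic GPT's source.

```go
package main

import (
	"fmt"
	"math"
)

// adamStep applies one Adam update to a single scalar parameter.
// m and v are the running first- and second-moment estimates; t is
// the 1-based step count used for bias correction.
func adamStep(p, grad, m, v float64, t int, lr float64) (pNew, mNew, vNew float64) {
	const beta1, beta2, eps = 0.9, 0.999, 1e-8 // assumed defaults
	mNew = beta1*m + (1-beta1)*grad            // EMA of gradients
	vNew = beta2*v + (1-beta2)*grad*grad       // EMA of squared gradients
	mHat := mNew / (1 - math.Pow(beta1, float64(t))) // bias-corrected
	vHat := vNew / (1 - math.Pow(beta2, float64(t)))
	pNew = p - lr*mHat/(math.Sqrt(vHat)+eps)
	return
}

func main() {
	// Minimize f(p) = p^2 starting from p = 1; gradient is 2p.
	p, m, v := 1.0, 0.0, 0.0
	for t := 1; t <= 3; t++ {
		p, m, v = adamStep(p, 2*p, m, v, t, 0.05)
	}
	fmt.Printf("p after 3 steps: %.4f\n", p) // p moves toward 0
}
```

In a real model the same triple (m, v, t) is kept per parameter tensor rather than per scalar.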
Train Controls
Live Token Feedback
Context: N/A -> Target: N/A
Model Guess: N/A (p=0.0000)
Target Confidence: 0.0000
Sampling Controls
The model predicts the next character based on patterns learned during training.