Talkie vintage language model pre-1931
3 items · 1 source · updated 1 day ago
Talkie is a new language model trained exclusively on 260 billion tokens of pre-1931 English text, developed by a team that includes Alec Radford. The model is small enough to run on-device, and it opens up experiments into whether a model trained only on historical text can independently arrive at post-1931 concepts, such as later inventions, or learn to code from examples alone.
- Trained on 260B tokens of historical English text from before 1931
- Team includes Alec Radford, the OpenAI researcher behind the original GPT models
- Small enough to run on-device, enabling local deployment
- Can be used to create vintage versions of voice assistants like Siri
- Enables experiments on whether a model trained only on pre-1931 data can learn to code, or independently develop post-1931 inventions
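As a toy illustration of what "small enough to run on-device" means (this is not Talkie itself, which is a full transformer model), here is a dependency-free character-level bigram sampler over a public-domain, pre-1931 snippet. The point of the sketch is only that a small model's entire inference loop can run locally in plain code:

```python
import random

# Toy illustration only: a character-level bigram model over a short
# public-domain (pre-1931) snippet. Talkie is a real transformer LM;
# this sketch just shows the kind of lightweight, fully local
# inference loop that on-device deployment makes possible.
TEXT = ("It is a truth universally acknowledged, that a single man in "
        "possession of a good fortune, must be in want of a wife.")

def build_bigrams(text):
    """Map each character to the list of characters that follow it."""
    table = {}
    for a, b in zip(text, text[1:]):
        table.setdefault(a, []).append(b)
    return table

def sample(table, seed_char, length, rng):
    """Generate up to `length` characters by sampling successors."""
    out = [seed_char]
    for _ in range(length - 1):
        successors = table.get(out[-1])
        if not successors:
            break
        out.append(rng.choice(successors))
    return "".join(out)

table = build_bigrams(TEXT)
print(sample(table, "I", 40, random.Random(0)))
```

A real on-device deployment would swap this toy table for the model's actual weights and a proper sampler, but the shape of the loop — look up the current context, sample the next token, repeat — is the same.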
Bluesky · 3 posts
Some notes on talkie, a new "vintage language model" from a team including Alec Radford (yes, that Alec Radford) "trained on 260B tokens of historical pre-1931 English text" simonwillison.net/2026/Apr/28/...
The new LLM trained only on pre-1931 text is small enough that it can potentially run on device, so, with the right tools, you can get a fully vintage version of Siri, but from the era of Downton Abbey (also a small model).
Here is an AI trained just using text from 1931 or earlier, which leads to a lot of interesting experiments: can the model independently develop later inventions? Can it learn to code from examples alone?