
Talkie vintage language model pre-1931

3 items · 1 source · updated 1d ago

Talkie is a new language model trained exclusively on 260 billion tokens of pre-1931 English text, developed by a team that includes Alec Radford. The model is small enough to run on-device, and its time-locked training corpus enables experiments testing whether a model can independently arrive at post-1931 concepts, such as later inventions or computer programming, from historical text alone.

  • Trained on 260B tokens of historical English text from before 1931
  • Team includes Alec Radford, known for his foundational GPT work at OpenAI
  • Small enough to run on-device, enabling local deployment
  • Can be used to create vintage versions of voice assistants like Siri
  • Enables experiments on whether the model can learn coding or derive post-1931 inventions from pre-1931 data alone
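
An experiment of the kind the last bullet describes could be sketched as a simple anachronism probe: prompt the model and check completions for post-1931 vocabulary. Everything below is a hypothetical illustration under stated assumptions, not the team's actual methodology; the word list, prompts, and `generate` interface are stand-ins for whatever on-device completion function (e.g. a llama.cpp or transformers wrapper) one would actually use.

```python
# Hypothetical sketch of an anachronism probe for a time-locked LM.
# The term list and prompts are illustrative assumptions, not from
# the Talkie release.

ANACHRONISMS = {"television", "transistor", "internet", "laser"}

def probe(generate, prompts):
    """Return the prompts whose completions mention post-1931 terms."""
    flagged = []
    for prompt in prompts:
        completion = generate(prompt).lower()
        if any(term in completion for term in ANACHRONISMS):
            flagged.append(prompt)
    return flagged

# Stub standing in for the real model: answers in period vocabulary.
def stub_generate(prompt):
    return "a wireless telegraphy apparatus of remarkable ingenuity"

prompts = ["Describe a machine that shows moving pictures at home."]
print(probe(stub_generate, prompts))  # → [] (no anachronisms surfaced)
```

The interesting outcome is the opposite case: if completions ever trip the filter without post-1931 text in training, the model has recombined historical concepts into something genuinely later.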