I just might have cracked tokenizer-free LLMs. No vocab, no softmax.
I'm training a 22M-param LLM rn to test this "thing" and it's able to formulate coherent sentences 🤯
Bear in mind, this is a completely new, tokenizer-free LLM architecture with built-in language universality.
Check the explainer video to understand what's happening. Feedback welcome on this approach!