A 1B-parameter LLM that thinks: one that doesn't memorize facts but reasons, and can use tools to look up what it doesn't know. Long-term project, target 2030.
According to Andrej Karpathy, current ~1-trillion-parameter models spend much of their capacity memorizing; he believes 1B parameters is enough for a model to know how to think and to look up the information it doesn't have.
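A toy sketch of the split this note describes, with a small "reasoning core" that holds no facts itself and defers factual recall to an external lookup tool (all function names and the in-memory knowledge base here are hypothetical stand-ins, not a real API):

```python
def lookup_tool(query: str) -> str:
    """Stand-in for an external knowledge source (search engine, database, ...)."""
    # Hypothetical toy knowledge base standing in for the outside world.
    knowledge_base = {
        "capital of France": "Paris",
        "speed of light": "299792458 m/s",
    }
    return knowledge_base.get(query, "unknown")

def reasoning_core(question: str) -> str:
    """Toy 'reasons, doesn't memorize' loop: decide whether a fact is
    needed, fetch it via the tool, then compose an answer."""
    if question.startswith("what is the "):
        # Factual question: delegate recall to the tool instead of
        # relying on parameters to have memorized the answer.
        fact = lookup_tool(question.removeprefix("what is the ").rstrip("?"))
        return f"Looked it up: {fact}"
    return "I can reason about that without a lookup."

print(reasoning_core("what is the capital of France?"))
```

The point of the sketch is only the division of labor: parameters carry the how-to-think part, while the what-is-true part lives outside the model.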