An independent implementation of LLaMA that is fully open source under the Apache 2.0 license.
This implementation builds on nanoGPT.
The original LLaMA weights are distributed by Meta under a research-only license, while new Apache 2.0-licensed weights are being released as part of the OpenLLaMA project. Both sets of weights can be loaded in Lit-LLaMA.