About Flan-UL2

Google, 20 billion parameters, downloadable from HuggingFace

Flan-UL2 is an encoder-decoder model: at its core, it is a souped-up version of T5 that has been instruction-tuned on the Flan collection. It outperforms the prior Flan-T5 models. Flan-UL2 carries an Apache-2.0 license and is our pick for a self-hosted or fine-tunable model, as the details of its usage and training have been released.

If Flan-UL2's 20 billion parameters are a little too much, consider the earlier Flan-T5, which comes in five sizes and might be more suitable for your needs.
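Since both models are downloadable from Hugging Face, here is a minimal sketch of loading them with the `transformers` library. The hub IDs are the published ones; the `demo` helper, its default prompt handling, and generation settings are illustrative assumptions, not official documentation.

```python
# Published Hugging Face hub IDs for Flan-UL2 and the five Flan-T5 sizes.
FLAN_UL2 = "google/flan-ul2"
FLAN_T5_SIZES = ("small", "base", "large", "xl", "xxl")


def flan_t5_id(size: str) -> str:
    """Build the hub ID for one of the five Flan-T5 sizes."""
    if size not in FLAN_T5_SIZES:
        raise ValueError(f"unknown Flan-T5 size: {size!r}")
    return f"google/flan-t5-{size}"


def demo(prompt: str, model_id: str = FLAN_UL2) -> str:
    """Generate a completion; requires `pip install transformers torch`.

    Note: the full Flan-UL2 checkpoint weighs tens of gigabytes, so try a
    small Flan-T5 first, e.g. demo(prompt, flan_t5_id("small")).
    """
    from transformers import AutoTokenizer, T5ForConditionalGeneration

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = T5ForConditionalGeneration.from_pretrained(model_id)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Both models share the T5 architecture, so the same loading code works for either; only the checkpoint ID changes.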

Visit Official Website

https://huggingface.co/google/flan-ul2
