About mt0-large

We present BLOOMZ & mT0, a family of models capable of following human instructions in dozens of languages zero-shot. We finetune the BLOOM & mT5 pretrained multilingual language models on our crosslingual task mixture (xP3) and find the resulting models capable of crosslingual generalization to unseen tasks & languages.

Repository: bigscience-workshop/xmtf
Paper: Crosslingual Generalization through Multitask Finetuning
Point of Contact: Niklas Muennighoff
Languages: Refer to mc4 for pretraining & xP3 for finetuning language proportions; the model understands both pretraining & finetuning languages.
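
As a quick usage sketch: mt0-large is an encoder-decoder (seq2seq) model, so it loads with AutoModelForSeq2SeqLM from the Hugging Face transformers library. The snippet below assumes transformers and PyTorch are installed and runs a single zero-shot instruction on CPU; the prompt can be written in any of the supported languages.

    # Minimal sketch; install dependencies with: pip install transformers torch
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    checkpoint = "bigscience/mt0-large"

    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

    # A zero-shot instruction; the prompt language need not match the output language.
    inputs = tokenizer.encode("Translate to English: Je t'aime.", return_tensors="pt")
    outputs = model.generate(inputs)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))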

Model page: https://huggingface.co/bigscience/mt0-large
