GPT-4: MosaicML releases MPT (MosaicML Pretrained Transformer), a new open-source large language model series meant to address limitations in existing models and provide a commercially usable alternative. The base MPT-7B model is trained on 1 trillion tokens of text and code, can handle very long inputs thanks to ALiBi position encoding, and is optimized for fast training and inference. The release includes the base model and three finetuned variants: MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, which supports contexts of 65k+ tokens. MosaicML also open-sources the entire codebase for pretraining, finetuning, and evaluating MPT via its new MosaicML LLM Foundry.
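For readers who want to try the release, here is a minimal sketch of loading MPT-7B through the Hugging Face `transformers` library. The `mosaicml/mpt-7b` repo name, the `trust_remote_code=True` requirement, and the GPT-NeoX tokenizer match the public release; the extended `max_seq_len` is an illustrative use of ALiBi's ability to extrapolate past the 2048-token training context, not a tuned setting.

```python
# Minimal sketch (not MosaicML's official snippet) of running the released
# MPT-7B weights from the Hugging Face Hub.
import transformers

# MPT ships custom modeling code, so trust_remote_code=True is required.
config = transformers.AutoConfig.from_pretrained(
    "mosaicml/mpt-7b", trust_remote_code=True
)
# Illustrative: ALiBi lets MPT extrapolate beyond its 2048-token training context.
config.max_seq_len = 4096

model = transformers.AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-7b", config=config, trust_remote_code=True
)
# MPT uses the EleutherAI GPT-NeoX-20B tokenizer.
tokenizer = transformers.AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

inputs = tokenizer("MosaicML's MPT models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```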