MPT-30B: Raising the bar for open-source foundation models
Description
Introducing MPT-30B, a new, more powerful member of our Foundation Series of open-source models, trained with an 8k context length on NVIDIA H100 Tensor Core GPUs.
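As a rough back-of-the-envelope sketch (the figures below are derived from the parameter count, not stated in the announcement), a ~30-billion-parameter model implies the following weight-memory footprints at common precisions, which is why the model can fit on a single modern data-center GPU in 16-bit precision:

```python
# Back-of-the-envelope weight-memory estimate for a ~30B-parameter model.
# Assumption (not from the announcement): only weights are counted;
# activation memory and the KV cache for the 8k context are ignored.

PARAMS = 30e9  # MPT-30B has roughly 30 billion parameters

BYTES_PER_PARAM = {
    "fp32": 4,
    "fp16/bf16": 2,
    "int8": 1,
    "int4": 0.5,
}

def weight_gb(precision: str, params: float = PARAMS) -> float:
    """Approximate weight memory in gigabytes (1 GB = 1e9 bytes)."""
    return params * BYTES_PER_PARAM[precision] / 1e9

for p in BYTES_PER_PARAM:
    print(f"{p:>9}: ~{weight_gb(p):.0f} GB")
# fp32 ~120 GB, fp16/bf16 ~60 GB, int8 ~30 GB, int4 ~15 GB
```

At 16-bit precision the weights alone come to about 60 GB, leaving headroom on an 80 GB accelerator for activations and the KV cache.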
The History of Open-Source LLMs: Better Base Models (Part Two), by Cameron R. Wolfe, Ph.D.
How to Use MosaicML MPT Large Language Model on Vultr Cloud GPU
Is Mosaic's MPT-30B Ready For Our Commercial Use? by Yeyu Huang
12 Open Source LLMs to Watch
Training-Free Long-Context Scaling of Large Language Models
GPT-4: 38 Latest AI Tools & News You Shouldn't Miss, by SM Raiyyan
The List of 11 Most Popular Open Source LLMs of 2023 | Lakera – Protecting AI teams that disrupt the world.