MPT-7B MosaicML’s Apache-licensed 7B family (base/instruct/long-context), widely used as a fine-tuning base. Open-source Models #Apache-2.0 #instruct #long-context
DBRX (Databricks) Open-license MoE model (Base & Instruct) with strong general performance and active community tooling. Open-source Models #Databricks #DBRX #instruct
Qwen2.5 (Alibaba Qwen) Latest Qwen series, spanning 0.5B–72B; dense decoder-only models with strong general, coding, and math variants for local use (see the loading sketch below). Open-source Models #32B #72B #7B
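All three families can be run locally through the standard Hugging Face transformers API. Below is a minimal sketch, assuming the hub ID Qwen/Qwen2.5-7B-Instruct; the other entries load the same way from their own hub IDs (e.g. mosaicml/mpt-7b-instruct, databricks/dbrx-instruct), though MPT checkpoints may additionally need trust_remote_code=True.

```python
# Minimal local-inference sketch (illustrative, not part of the listing).
# Requires: pip install transformers accelerate torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-7B-Instruct"  # swap in another size or model family here

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # spread across available GPUs, or fall back to CPU
)

messages = [{"role": "user", "content": "Summarize mixture-of-experts in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

The same pattern covers fine-tuning-oriented use of the base checkpoints: load the base model ID instead of the instruct one and skip the chat template.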