DBRX (Databricks): Open-license MoE model (Base & Instruct) with strong general performance and active community tooling. Tags: Open-source Models, Databricks, DBRX, instruct
Snowflake Arctic: Enterprise-focused open models and embeddings with Apache-2.0 licensing and an efficient MoE architecture. Tags: Open-source Models, Arctic, enterprise, MoE
DeepSeek-R1 (and distilled checkpoints): Open, MIT-licensed reasoning model with distilled 1.5B–70B checkpoints for local use; known for chain-of-thought quality (see the local-inference sketch after this list). Tags: Open-source Models, DeepSeek-R1, distilled, local
Gemma 2: Google’s open-weight 9B/27B models; practical to run locally and widely adopted for fine-tuning and applications. Tags: Open-source Models, Gemma 2, 9B, 27B
Mixtral 8x22B (Mistral): Sparse MoE open model delivering strong cost/performance among community LLMs; widely fine-tuned and quantized. Tags: Open-source Models, Mixtral 8x22B, Mistral, Apache-2.0
Qwen2.5 (Alibaba Qwen): Latest Qwen series spanning 0.5B–72B; dense decoder-only models with strong general, coding, and math variants for local use. Tags: Open-source Models, 7B, 32B, 72B
Llama 3.2 (Meta): Family of open-weight models (1B–90B, some with vision) designed for local and edge deployment, with strong instruction following. Tags: Open-source Models, Llama 3.2, edge AI, local inference
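
Several of the entries above (the DeepSeek-R1 distills, Gemma 2, Qwen2.5, Llama 3.2) publish open weights intended for local inference. Below is a minimal sketch of one way to run the smallest DeepSeek-R1 distill with Hugging Face transformers; it assumes the transformers, torch, and accelerate packages are installed and that the repo id deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B matches the Hub listing. Any other local-friendly checkpoint listed above can be swapped in via model_id.

    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Assumed Hub repo id for the smallest distilled R1 checkpoint mentioned above.
    model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # load in the checkpoint's native precision
        device_map="auto",    # requires `accelerate`; drop this arg to load on CPU
    )

    # Build a chat-formatted prompt with the model's own chat template.
    messages = [{"role": "user", "content": "Explain the Monty Hall problem briefly."}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    # Distilled R1 checkpoints emit their chain of thought before the final
    # answer, so leave a generous generation budget.
    output_ids = model.generate(input_ids, max_new_tokens=512)
    print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))

The 1.5B distill fits comfortably on a single consumer GPU or CPU; the larger checkpoints in these families typically call for quantized builds or multi-GPU device maps.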