Gemma 2: Google’s open-weight 9B/27B models; practical to run locally and widely adopted for fine-tuning and apps. Tags: 27B, 9B
Falcon 2 11B: TII’s newer open series (text and VLM variants) optimized for efficient inference; released under a permissive, Apache-style license. Tags: 11B, local
MPT-7B: MosaicML’s Apache-licensed 7B family (base, instruct, and long-context variants), widely used as a fine-tuning base; see the LoRA sketch after this list. Tags: Apache-2.0, instruct, long context
Apple OpenELM: Open Efficient Language Models (270M–3B) with code, weights, and training recipes for reproducible local use. Tags: Apple, open
MiniCPM-V 2.6: Edge-friendly 8B multimodal model (images and video) with quantized variants for low-VRAM local inference; a loading sketch follows the list. Tags: edge, int4, local
TinyLlama 1.1B: Compact Llama-compatible model; popular GGUF quantizations make it fast on local CPUs and GPUs (see the llama.cpp example after this list). Tags: GGUF, Llama-compatible
SmolLM2: Ultra-small open models (135M/360M/1.7B) tailored for on-device and resource-constrained local deployments; a transformers example follows the list. Tags: on-device, open weights, small LLM
InternVL 2.5: Open multimodal family (1B–78B); the 78B variant surpasses 70% on MMMU; broad image and video understanding. Tags: MMMU, multimodal
Yi-1.5: 01.AI’s upgraded Yi family (6B/34B) with improved instruction following and coding; widely used in local stacks. Tags: 01.AI, 34B, instruction
IBM Granite 3.1: Apache-licensed 2B–8B family with long-context options, optimized for enterprise and local deployments. Tags: 8B, Apache-2.0
Phi-4 Reasoning: Microsoft’s 14B open-weight reasoning model delivering strong performance on complex tasks with modest hardware. Tags: 14B, local, open weights
LLaVA-OneVision 1.5: Fully open multimodal models (images/video plus text) and training stack; strong results and reproducible recipes. Tags: LLaVA, local, multimodal
Snowflake Arctic: Enterprise-focused open models and embeddings with Apache-style licensing and an efficient MoE architecture. Tags: enterprise, MoE
OLMo 2 (AI2): Fully open training data, code, and checkpoints; a transparent 7B/13B/32B family for reproducible research and apps. Tags: fully open, local
DBRX Instruct (HF): Hugging Face repository hosting Databricks’ DBRX Instruct checkpoints under an open license for local inference and fine-tuning; a loading sketch follows the list. Tags: Hugging Face, local
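For MPT-7B’s role as a fine-tuning base, here is a minimal LoRA setup with the peft library, assuming the mosaicml/mpt-7b checkpoint; targeting "Wqkv" is our assumption about MPT’s fused attention projection, so check the actual module names before relying on it.

```python
# Minimal LoRA fine-tuning sketch for MPT-7B (not the official recipe).
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-7b",
    trust_remote_code=True,  # MPT ships custom modeling code
)
lora = LoraConfig(
    r=8, lora_alpha=16, lora_dropout=0.05,
    target_modules=["Wqkv"],  # assumed name of MPT's fused attention projection
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # only the adapter weights are trainable
```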
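For MiniCPM-V 2.6, a hedged sketch of loading the int4-quantized weights; the repo id openbmb/MiniCPM-V-2_6-int4 is an assumption to verify on the Hub, and the checkpoint ships its own modeling code, hence trust_remote_code.

```python
# Hedged loading sketch for the low-VRAM int4 variant of MiniCPM-V 2.6.
from transformers import AutoModel, AutoTokenizer

repo = "openbmb/MiniCPM-V-2_6-int4"  # assumed repo id; confirm on the Hub
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModel.from_pretrained(repo, trust_remote_code=True).eval()
# The model card documents a chat() helper for image/video + text prompts.
```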
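For TinyLlama 1.1B, a minimal local-inference sketch with llama-cpp-python; the GGUF filename is a placeholder for whichever quantization you downloaded.

```python
# Local CPU inference sketch using a TinyLlama GGUF quantization.
from llama_cpp import Llama

llm = Llama(model_path="tinyllama-1.1b-chat.Q4_K_M.gguf", n_ctx=2048)  # placeholder path
out = llm("Q: Name the planets in the solar system. A:", max_tokens=64, stop=["Q:"])
print(out["choices"][0]["text"])
```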
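For SmolLM2, a quick transformers sketch; HuggingFaceTB/SmolLM2-360M-Instruct is the repo id we would expect for the 360M instruct variant, so verify it before relying on it.

```python
# On-device text-generation sketch with a small SmolLM2 checkpoint.
from transformers import pipeline

pipe = pipeline("text-generation", model="HuggingFaceTB/SmolLM2-360M-Instruct")  # assumed repo id
print(pipe("Summarize why small models suit edge devices:", max_new_tokens=48)[0]["generated_text"])
```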
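For DBRX Instruct, a sketch of pulling the checkpoints from the Hub for local inference; it assumes the databricks/dbrx-instruct repo id and substantial accelerator memory for a large MoE (quantized or multi-GPU setups are typical), and older transformers releases may also need trust_remote_code=True.

```python
# Local-inference sketch for DBRX Instruct (a large MoE; plan memory accordingly).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "databricks/dbrx-instruct"  # assumed repo id; confirm on the Hub
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo, device_map="auto", torch_dtype=torch.bfloat16  # spread across available GPUs
)
inputs = tokenizer("Write one sentence about mixture-of-experts.", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=40)[0], skip_special_tokens=True))
```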