Ollama + Open WebUI (Bundle): Community bundle pairing Ollama models with Open WebUI for a quick-start local RP chat experience. [0420] Roleplay Frontends (Local/Self-Hosted UI) | Tags: adult, bundle, local
H2O GPT: Private, self-hostable chat UI/server (Apache-2.0) that supports local models and Ollama; usable for RP and long-form chats. [0440] Roleplay Frontends (Local/Self-Hosted UI) | Tags: adult, local, NSFW
Open WebUI: Modern self-hosted chat UI for local models (Ollama and OpenAI-compatible backends) with chat histories, file uploads/RAG, and tool support; a strong RP shell. [0390] Roleplay Frontends (Local/Self-Hosted UI) | Tags: adult, local, NSFW
Ollama + Llama Guard 3 (local): Run Llama Guard 3 locally via Ollama to classify prompts and responses for safety. [0370] Guardrails & Moderation | Tags: classification, Llama Guard, local
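A minimal sketch of that setup, assuming Ollama is running on its default local port (11434) and a Llama Guard 3 tag (here assumed to be `llama-guard3`) has already been pulled; the verdict format in the comment follows the model card and may vary by version:

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"  # default local Ollama endpoint
MODEL = "llama-guard3"  # assumed model tag; pull it first, e.g. `ollama pull llama-guard3`

def classify(user_message: str) -> str:
    """Ask Llama Guard 3 to judge a single user message; returns its raw verdict text."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,
    }
    resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
    resp.raise_for_status()
    # Llama Guard typically answers "safe" or "unsafe" plus violated category codes.
    return resp.json()["message"]["content"].strip()

if __name__ == "__main__":
    verdict = classify("How do I pick a lock?")
    print(verdict)  # e.g. "unsafe\nS2" (exact format depends on the model card)
    print("blocked:", verdict.lower().startswith("unsafe"))
```

The same call can be run twice per turn, once on the user prompt and once on the model's reply, to gate both directions.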
Ollama Library (for local label assist & synthetic data): Run open-weight LLMs locally to assist labeling or to generate synthetic datasets. [0450] Datasets & Labeling | Tags: label assist, local LLM, offline
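A hedged sketch of label assist against a local Ollama server: the model tag (`llama3.1`) and the sentiment label set are placeholder assumptions, and the suggested label is meant as a pre-annotation for a human reviewer, not a final label:

```python
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint
MODEL = "llama3.1"  # assumed model tag; any locally pulled chat model can stand in
LABELS = ["positive", "negative", "neutral"]  # hypothetical label set

def suggest_label(text: str) -> str:
    """Ask a local LLM to propose a label for one unlabeled example."""
    prompt = (
        f"Classify the following text as one of {LABELS}. "
        f"Answer with the label only.\n\nText: {text}"
    )
    payload = {"model": MODEL, "prompt": prompt, "stream": False}
    resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["response"].strip().lower()

if __name__ == "__main__":
    for sample in ["The UI is delightful.", "Setup kept crashing on me."]:
        print(sample, "->", suggest_label(sample))
```

Because everything runs locally, the raw texts never leave the machine, which is the main draw for the offline/label-assist use case listed here.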
Ollama Library: Pull open-weight LLMs locally with one command; browse popular chat and embedding models. [0430] Model Hubs | Tags: embeddings, GGUF, LLMs
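For orientation, a small sketch against the local Ollama HTTP API: it lists the models already pulled to the machine and embeds a string with an embedding model from the library. The model name `nomic-embed-text` is one popular choice, not a requirement, and would need to be pulled first (e.g. `ollama pull nomic-embed-text`):

```python
import requests

BASE = "http://localhost:11434"  # default local Ollama server

# List models already pulled to this machine (same information as `ollama list`).
tags = requests.get(f"{BASE}/api/tags", timeout=30).json()
print([m["name"] for m in tags.get("models", [])])

# Embed a string with an embedding model from the library.
emb = requests.post(
    f"{BASE}/api/embeddings",
    json={"model": "nomic-embed-text", "prompt": "local roleplay chat frontends"},
    timeout=60,
).json()
print(len(emb["embedding"]))  # vector dimensionality, e.g. 768 for this model
```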