oobabooga/text-generation-webui: The Premier Local LLM Interface

Summary

oobabooga/text-generation-webui is a powerful and versatile web UI for running large language models (LLMs) locally. It offers a 100% offline and private environment for text generation, vision, tool-calling, and even training, all accessible through an intuitive interface and API.

Repository Info

Updated on May 1, 2026

Introduction

oobabooga/text-generation-webui is recognized as the original and leading local interface for Large Language Models (LLMs). The project provides a comprehensive solution for interacting with LLMs, offering text generation, vision integration, tool-calling, and model training. Designed for complete privacy and offline operation, it features both a user-friendly web UI and a robust API. With 46,914 stars and 5,970 forks, it stands as a highly popular choice for local AI experimentation and development.

Installation

Getting started with oobabooga/text-generation-webui is designed to be straightforward, offering several installation methods to suit different user needs.

Portable Builds

For the quickest setup, portable builds are available. These require zero setup: simply download, unzip, and run. They include all dependencies and are compatible with GGUF (llama.cpp) models.

Portable builds can be downloaded from the project's GitHub Releases page.

One-Click Installer

For users requiring additional backends (such as ExLlamaV3 or Transformers), training capabilities, image generation, or extensions, the one-click installer is recommended. It simplifies setup by handling dependencies such as PyTorch automatically.

  • Clone the repository or download its source code and extract it.
  • Run the startup script for your OS: start_windows.bat, start_linux.sh, or start_macos.sh.
  • Follow the prompts to select your GPU vendor.

Manual Portable Install with venv

For those who prefer a manual setup within a Python virtual environment, follow these steps:

# Clone repository
git clone https://github.com/oobabooga/text-generation-webui
cd text-generation-webui

# Create virtual environment
python -m venv venv

# Activate virtual environment (macOS/Linux; on Windows use venv\Scripts\activate)
source venv/bin/activate

# Install dependencies (choose appropriate file under requirements/portable)
pip install -r requirements/portable/requirements.txt --upgrade

# Launch server
python server.py --portable --api --auto-launch

For detailed instructions and other installation methods, including Conda and Docker, please refer to the official documentation.

Examples

The web UI provides a rich set of features for interacting with LLMs:

  • Chat & Generation: Engage with models in instruct mode for instruction-following, chat-instruct/chat for custom characters, or use the notebook tab for free-form text generation.
  • Multimodal Capabilities: Attach images to messages for visual understanding and upload text, PDF, or .docx documents to discuss their contents.
  • Flexible Backends: Seamlessly switch between various LLM backends, including llama.cpp, Transformers, ExLlamaV3, and TensorRT-LLM, without restarting the application.
  • OpenAI/Anthropic-compatible API: Utilize a local API that mimics OpenAI/Anthropic endpoints, complete with tool-calling support, making it a drop-in replacement for many applications.
  • Tool-Calling: Enable models to execute custom functions, such as web search or math operations, defined as simple Python files.
  • Training & Image Generation: Fine-tune LoRAs on datasets and generate images using diffusers models like Z-Image-Turbo, all within the same interface.
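As a concrete illustration of the API feature above, the sketch below builds an OpenAI-style chat-completion request for the local server. It assumes the server was launched with the --api flag (serving on port 5000 by default) and treats the mode value and example prompt as illustrative choices rather than project-verified defaults.

```python
import json
from urllib import request

# Assumed endpoint: a server started with `python server.py --api`
# exposes an OpenAI-compatible API (default port 5000).
API_URL = "http://127.0.0.1:5000/v1/chat/completions"

def build_chat_request(prompt, mode="instruct", max_tokens=200):
    """Build an OpenAI-style chat completion payload.

    The `mode` field ("instruct", "chat", or "chat-instruct") mirrors
    the generation modes described above.
    """
    return {
        "messages": [{"role": "user", "content": prompt}],
        "mode": mode,
        "max_tokens": max_tokens,
    }

def ask(prompt):
    """POST a prompt to the local server and return the reply text.

    Requires a running server; shown for illustration only.
    """
    req = request.Request(
        API_URL,
        data=json.dumps(build_chat_request(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]

# Build (but do not send) a request payload
payload = build_chat_request("Name three benefits of running an LLM locally.")
print(json.dumps(payload, indent=2))
```

Because the endpoint mimics the OpenAI schema, existing OpenAI client libraries can typically be pointed at the local base URL as a drop-in replacement.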

Why Use It

oobabooga/text-generation-webui stands out for several compelling reasons:

  • Complete Privacy: Operates 100% offline with zero telemetry, ensuring your data and interactions remain private.
  • Versatility: Supports a wide array of LLM backends, multimodal inputs, tool-calling, and even training and image generation, making it a comprehensive AI toolkit.
  • Ease of Use: Offers portable builds and a one-click installer for quick setup, alongside a user-friendly web interface.
  • Active Community: Benefits from a vibrant community, providing support and contributing to its continuous development.
