Hollama: A Minimal In-Browser LLM Chat App

Summary
Hollama is a lightweight LLM chat application designed to run entirely within your web browser. It offers support for both Ollama and OpenAI servers, providing a private and feature-rich environment for interacting with large language models. Users can enjoy a responsive interface, local data storage, and advanced customization options for their chat sessions.
Introduction
Hollama is a minimal, yet powerful, LLM chat application that operates entirely within your web browser. It connects to both Ollama and OpenAI servers, so you can work with local or hosted models through a single interface.
Features
Hollama comes packed with features designed to enhance your LLM chat experience:
- Support for Ollama & OpenAI servers
- Multi-server support
- Text & vision models
- Large prompt fields
- Support for reasoning models
- Markdown rendering with syntax highlighting
- KaTeX math notation
- Code editor features
- Customizable system prompts & advanced Ollama parameters
- Copy code snippets, messages or entire sessions
- Edit & retry messages
- Stores data locally in your browser
- Import & export stored data
- Responsive layout
- Light & dark themes
- Multi-language interface
- Download Ollama models directly from the UI
Get Started
There are several ways to get started with Hollama: try the live demo instantly, download a desktop application, or self-host with Docker.
- Live Demo: Experience Hollama directly in your browser with no sign-up required: Live demo
- Desktop Applications: Download native versions for macOS, Windows, and Linux from the releases page.
- Self-hosting: For more control, you can self-host Hollama using Docker. Refer to the self-hosting guide for instructions.
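As a sketch of the self-hosting path, the container can be started with a single `docker run` command. The image name and port below are assumptions based on the project's published container image; confirm both against the self-hosting guide before relying on them.

```shell
# Run the Hollama container in the background and publish its web UI port.
# --rm removes the container on exit; -d detaches it.
docker run --rm -d -p 4173:4173 ghcr.io/fmaclen/hollama:latest

# The app should now be reachable at http://localhost:4173.

# If you point Hollama at a local Ollama server, the browser makes
# cross-origin requests to it, so Ollama must allow the Hollama origin:
OLLAMA_ORIGINS="http://localhost:4173" ollama serve
```

The `OLLAMA_ORIGINS` step matters because Hollama runs client-side: requests to the Ollama API come from the browser, and Ollama rejects cross-origin requests from unlisted origins by default.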
Why Use Hollama?
Hollama stands out for its commitment to privacy and local control, running entirely in your browser. It offers a rich set of features, including multi-server support, advanced customization, and a user-friendly interface, making it an excellent choice for local LLM interactions. The ability to store data locally and import/export sessions further enhances user control and data privacy.
Links
- GitHub Repository: fmaclen/hollama
- Live Demo: hollama.fernando.is
- Releases: Download Hollama