
Repository History

Explore all analyzed open source repositories

Topic: Local Inference
oobabooga/text-generation-webui: The Premier Local LLM Interface

oobabooga/text-generation-webui is a powerful and versatile web UI for running large language models (LLMs) locally. It offers a 100% offline and private environment for text generation, vision, tool-calling, and even training, all accessible through an intuitive interface and API.

May 1, 2026
llama-cpp-python: Python Bindings for llama.cpp

llama-cpp-python provides robust Python bindings for the popular llama.cpp library, enabling efficient local inference with large language models. It offers a high-level Python API for in-process use and also ships an OpenAI-compatible web server for local deployment, making integration into existing applications straightforward. Various hardware-acceleration backends, such as CUDA and Metal, are supported.

Nov 11, 2025
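Because llama-cpp-python's bundled server mirrors OpenAI's chat-completions endpoint, a client needs nothing beyond the Python standard library to talk to it. A minimal sketch, assuming the server has been started locally (e.g. via `python -m llama_cpp.server --model <path-to-gguf>`) and is listening on the default `http://localhost:8000`; the model name and prompt below are illustrative:

```python
import json
import urllib.request


def build_chat_request(prompt, base_url="http://localhost:8000", model="local-model"):
    """Build an OpenAI-style chat-completions request for a local
    llama-cpp-python server. Only constructs the request; nothing is sent."""
    payload = {
        # With a single loaded model, the server does not require a specific name.
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
        "temperature": 0.7,
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


if __name__ == "__main__":
    req = build_chat_request("Name three uses of local LLM inference.")
    # Requires a running llama-cpp-python server on localhost:8000.
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    print(body["choices"][0]["message"]["content"])
```

The same request shape works against any OpenAI-compatible endpoint, so code written this way can later be pointed at a remote API by changing only `base_url`.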

Analysis and discovery of open source repositories. Find interesting projects and follow their updates.


© 2025 OSRepos. Built with Nuxt 3 and lots of ❤️