OSRepos

Repository History

Explore all analyzed open source repositories

Topic: Inference
Optimum: Accelerate Hugging Face Models with Hardware Optimization

Optimum is an extension of Hugging Face Transformers, Diffusers, TIMM, and Sentence-Transformers that provides a suite of optimization tools for training and running models with maximum efficiency on targeted hardware. By exposing a unified API over hardware-specific backends, it lets developers export, quantize, and otherwise optimize models with minimal code changes, yielding significant performance gains across a range of machine learning workflows.

Jan 6, 2026
LitServe: Build Custom Inference Engines for AI Models

LitServe is a framework from Lightning AI for building custom inference engines for a wide range of AI models and systems. It gives developers fine-grained control over serving, with support for agents, multi-modal systems, RAG, and pipelines, without the typical MLOps overhead. Deployments can be self-hosted or managed on the Lightning AI platform.

Oct 29, 2025

Analysis and discovery of open source repositories. Find interesting projects and follow their updates.


© 2025 OSRepos. Built with Nuxt 3 and lots of ❤️