Repository History

Explore all analyzed open source repositories

Topic: transformers
PEFT: State-of-the-Art Parameter-Efficient Fine-Tuning

PEFT (Parameter-Efficient Fine-Tuning) is a cutting-edge library from Hugging Face designed to efficiently adapt large pretrained models to various downstream applications. It dramatically reduces computational and storage costs by fine-tuning only a small subset of model parameters, yet achieves performance comparable to full fine-tuning, making advanced AI accessible on more modest hardware.
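The core idea behind PEFT methods such as LoRA can be illustrated without the library itself. The toy sketch below (hypothetical shapes and names, not PEFT's API) freezes a full weight matrix and trains only two small low-rank factors, showing why the trainable-parameter count shrinks so dramatically:

```python
import numpy as np

# Toy illustration of the LoRA idea: instead of updating a full d x d
# weight matrix, train two low-rank factors A (r x d) and B (d x r),
# so only 2*d*r parameters are tuned. Shapes here are illustrative.
rng = np.random.default_rng(0)
d, r = 1024, 8
W = rng.standard_normal((d, d))      # frozen pretrained weight
A = rng.standard_normal((r, d)) * 0.01  # trainable low-rank factor
B = np.zeros((d, r))                 # trainable factor, init to zero
alpha = 16                           # scaling hyperparameter

def adapted_forward(x):
    # Effective weight is W + (alpha / r) * B @ A, but W stays frozen;
    # at initialization B is zero, so the adapter is a no-op.
    return x @ W.T + (alpha / r) * (x @ A.T) @ B.T

full_params = d * d
lora_params = 2 * d * r
print(f"trainable fraction: {lora_params / full_params:.2%}")
# → trainable fraction: 1.56%
```

Only the adapter factors receive gradient updates during fine-tuning, which is what keeps both compute and per-task storage small: a saved adapter is a tiny fraction of the full checkpoint.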

May 14, 2026

lagent: A Lightweight Framework for Building LLM-Based Agents

lagent is a lightweight, open-source framework from InternLM for efficiently building large language model (LLM)-based agents. It follows a PyTorch-inspired design philosophy, making it intuitive for developers to create and manage multi-agent applications. The framework simplifies agent communication, memory management, and tool integration.

Jan 18, 2026

PartCrafter: Structured 3D Mesh Generation via Compositional Latent Diffusion Transformers

PartCrafter is an innovative structured 3D generative model that enables the creation of complex 3D meshes and scenes from a single RGB image. Accepted at NeurIPS 2025, this project leverages compositional latent diffusion transformers to jointly generate multiple parts and objects in one shot. It offers powerful capabilities for both 3D object and scene generation, making it a valuable tool for researchers and developers in the field.

Jan 14, 2026

Whisper Web: ML-Powered Speech Recognition Directly in Your Browser

Whisper Web brings powerful, ML-powered speech recognition directly to your browser, leveraging 🤗 Transformers.js. The project processes audio entirely on the client side, offering privacy and efficiency without relying on cloud services. It even includes experimental WebGPU support for accelerated performance.

Dec 5, 2025

LLaMA-Factory: Unified Efficient Fine-Tuning for 100+ LLMs & VLMs

LLaMA-Factory is an open-source project offering a unified, efficient framework for fine-tuning over 100 large language models (LLMs) and vision-language models (VLMs). Recognized at ACL 2024, it provides a comprehensive suite of tools and algorithms for a wide range of training approaches. The repository simplifies the complex process of adapting powerful models to specific tasks, emphasizing ease of use and scalability.

Nov 8, 2025