LAM: Large Avatar Model for One-shot Animatable Gaussian Head

Summary

LAM is a cutting-edge project that enables the creation of ultra-realistic, animatable 3D avatars from just a single image in seconds. Leveraging advanced Gaussian Head technology, it offers super-fast cross-platform rendering and a low-latency SDK for real-time interactive chatting. This innovative model is set to be presented at SIGGRAPH 2025.

Repository Info

Updated on December 7, 2025

Introduction

LAM, or Large Avatar Model, is an innovative project designed to revolutionize the creation of 3D digital humans. It allows users to generate ultra-realistic, animatable 3D avatars from a single image in mere seconds. This technology is particularly notable for its super-fast cross-platform animating and rendering capabilities, making it suitable for various devices. Furthermore, LAM provides a low-latency SDK, enabling the development of real-time interactive chatting avatars. The project is slated for presentation at SIGGRAPH 2025, highlighting its significance in the field of computer graphics and AI.

Installation

Getting started with LAM is straightforward, with options for both Windows and Linux users.

Online Demos

For a quick preview, you can explore the online demos.

Environment Setup

Windows One-Click Installer:

A convenient one-click installation package for Windows (CUDA 12.8) is available, along with a video guide and the download link.

Linux:

git clone https://github.com/aigc3d/LAM.git
cd LAM
# Install with CUDA 12.1
sh ./scripts/install/install_cu121.sh
# Or install with CUDA 11.8
sh ./scripts/install/install_cu118.sh
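If you are unsure which script matches your setup, you can check the local toolkit version with `nvcc --version` and pick accordingly. The helper below is a small illustrative sketch (not part of the repository) that maps a CUDA version string to the matching install script:

```shell
# choose_installer: map a CUDA version string to the matching LAM install
# script (illustrative helper, not shipped with the repository)
choose_installer() {
  case "$1" in
    12.*) echo "./scripts/install/install_cu121.sh" ;;
    11.*) echo "./scripts/install/install_cu118.sh" ;;
    *)    echo "unsupported" ;;
  esac
}

# Detect the local toolkit version (e.g. "12.1") and run the chosen script:
# cuda_ver=$(nvcc --version | sed -n 's/.*release \([0-9.]*\),.*/\1/p')
# sh "$(choose_installer "$cuda_ver")"
choose_installer 12.1
```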

Windows (Manual):

For manual installation on Windows, please refer to the Windows Install Guide.

Model Weights

Model weights can be downloaded from HuggingFace or ModelScope.

HuggingFace Download:

# Download Assets
huggingface-cli download 3DAIGC/LAM-assets --local-dir ./tmp
tar -xf ./tmp/LAM_assets.tar && rm ./tmp/LAM_assets.tar
tar -xf ./tmp/thirdparty_models.tar && rm -r ./tmp/
# Download Model Weights
huggingface-cli download 3DAIGC/LAM-20K --local-dir ./model_zoo/lam_models/releases/lam/lam-20k/step_045500/

ModelScope Download:

pip3 install modelscope
# Download Assets
modelscope download --model "Damo_XR_Lab/LAM-assets" --local_dir "./tmp/"
tar -xf ./tmp/LAM_assets.tar && rm ./tmp/LAM_assets.tar
tar -xf ./tmp/thirdparty_models.tar && rm -r ./tmp/
# Download Model Weights
modelscope download "Damo_XR_Lab/LAM-20K" --local_dir "./model_zoo/lam_models/releases/lam/lam-20k/step_045500/"
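After either download path, it can be worth confirming that the weights landed where the commands above place them. The snippet below is a convenience sketch (not an official script) that checks the target directory exists and is non-empty:

```shell
# Sanity check after downloading: verify the weights directory from the
# commands above exists and contains files (convenience sketch only)
check_weights() {
  dir="$1"
  if [ -d "$dir" ] && [ -n "$(ls -A "$dir" 2>/dev/null)" ]; then
    echo "ok: $dir"
  else
    echo "missing or empty: $dir"
  fi
}

check_weights ./model_zoo/lam_models/releases/lam/lam-20k/step_045500
```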

Examples

After setting up the environment, you can run LAM locally.

Gradio Run

To run the Gradio demo:

python app_lam.py

If you wish to export ZIP files for real-time conversations on OpenAvatarChat, refer to the Avatar Export Guide and run with the Blender path:

python app_lam.py --blender_path /path/blender
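Since the export step shells out to Blender, it can help to confirm the binary you pass to `--blender_path` actually exists and is executable first. A minimal check (the path below is a placeholder, as in the command above):

```shell
# blender_ok: report whether the given Blender path is an executable file
# (the /path/blender argument is a placeholder, as in the command above)
blender_ok() {
  if [ -x "$1" ] && [ -f "$1" ]; then
    echo "found: $1"
  else
    echo "not executable: $1"
  fi
}

blender_ok /path/blender
```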

Inference

For inference, use the provided script:

sh ./scripts/inference.sh ${CONFIG} ${MODEL_NAME} ${IMAGE_PATH_OR_FOLDER} ${MOTION_SEQ}
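A concrete invocation might look like the following. The config, image, and motion paths here are placeholders chosen for illustration (they depend on where your assets were extracted), not files guaranteed to ship with the repository; the helper simply assembles the four positional arguments into the command shown above:

```shell
# build_inference_cmd: assemble the call to scripts/inference.sh from its
# four positional arguments (paths below are illustrative placeholders)
build_inference_cmd() {
  printf 'sh ./scripts/inference.sh %s %s %s %s\n' "$1" "$2" "$3" "$4"
}

build_inference_cmd \
  ./configs/inference/lam-20k.yaml \
  ./model_zoo/lam_models/releases/lam/lam-20k/step_045500/ \
  ./assets/sample_input/portrait.png \
  ./assets/sample_motion/
```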

Why Use LAM?

LAM offers compelling advantages for anyone interested in 3D avatar creation and interactive AI applications:

  • Ultra-realistic 3D avatar creation from a single image in seconds: quickly generate high-fidelity avatars with minimal input.
  • Super-fast cross-platform animating and rendering on any device: smooth performance across hardware, from powerful GPUs to mobile phones.
  • Low-latency SDK for real-time interactive chatting avatars: build responsive, engaging conversational AI experiences with integrated 3D digital humans.
