Toolkit-for-Prompt-Compression: A Unified Toolkit for LLM Prompt Compression
Summary
PCToolkit is a unified, plug-and-play toolkit designed for efficient prompt compression in Large Language Models (LLMs). It provides state-of-the-art compression methods, diverse datasets, and comprehensive metrics for evaluating performance. This modular toolkit simplifies the process of condensing input prompts while preserving crucial information.
Introduction
Prompt compression condenses input prompts while preserving essential information, which is crucial for optimizing LLM inference efficiency. PCToolkit addresses this with a unified, plug-and-play design, bringing together cutting-edge prompt compressors, diverse datasets, and metrics for thorough performance evaluation.
Key features of PCToolkit include:
- State-of-the-art and reproducible methods: It encompasses five distinct mainstream compression techniques: Selective Context, LLMLingua, LongLLMLingua, SCRL, and Keep it Simple.
- User-friendly interfaces: Designed for portability and easy adaptation, allowing for simple integration of new compressors, datasets, and metrics.
- Modular design: Organized into Compressor, Dataset, Metric, and Runner modules, simplifying transitions between different methods and evaluation components.
PCToolkit includes 5 compression methods, 11 datasets, and more than 5 metrics, evaluated across natural language tasks such as reconstruction, summarization, mathematical problem-solving, and question answering.
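To make the idea of prompt compression concrete, here is a toy, frequency-based sketch: it keeps roughly a target fraction of the words, dropping the most frequent (presumed least informative) ones first. This is purely illustrative and is not one of the five methods above; real compressors such as Selective Context use model-based information measures.

```python
from collections import Counter

def toy_compress(prompt: str, ratio: float) -> str:
    """Keep roughly `ratio` of the words, dropping the most frequent
    (least informative) words first. Illustrative only."""
    words = prompt.split()
    keep = max(1, int(len(words) * ratio))
    # Rank word positions: rarer words are assumed to carry more information.
    freq = Counter(w.lower() for w in words)
    ranked = sorted(range(len(words)), key=lambda i: freq[words[i].lower()])
    kept = set(ranked[:keep])
    # Preserve the original word order among the kept positions.
    return " ".join(w for i, w in enumerate(words) if i in kept)

print(toy_compress("the cat sat on the mat near the door", 0.5))
# → "cat sat on mat"
```

Note how the repeated, low-information word "the" is discarded first while content words survive, which is the intuition behind token-pruning compressors.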
Installation
To get started with PCToolkit, follow these steps:
First, clone the repository and enter it:

```bash
git clone https://github.com/3DAgentWorld/Toolkit-for-Prompt-Compression.git
cd Toolkit-for-Prompt-Compression
```

Then, install the required dependencies:

```bash
pip install -r requirements.txt
```
Please note that while most models can be downloaded automatically from the Hugging Face Hub, the models for the SCRL method require manual download. Refer to the guidance in the /models folder for detailed instructions.
Examples
PCToolkit provides straightforward interfaces for both prompt compression and evaluation.
Prompt Compression
To perform prompt compression, you can use the PromptCompressor class:
```python
from pctoolkit.compressors import PromptCompressor

# Instantiate a Selective Context compressor on GPU.
compressor = PromptCompressor(type='SCCompressor', device='cuda')
test_prompt = "test prompt"
ratio = 0.3  # target compression ratio

result = compressor.compressgo(test_prompt, ratio)
print(result)
```
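To sanity-check how aggressively a prompt was actually shortened, a rough word-level helper like the one below can be used. This is a hypothetical utility, not part of PCToolkit; a real setup would count model tokens rather than whitespace-separated words.

```python
def compression_stats(original: str, compressed: str) -> dict:
    """Report a rough word-level compression ratio between two strings.
    Hypothetical helper, not a PCToolkit API."""
    n_orig = len(original.split())
    n_comp = len(compressed.split())
    return {
        "original_words": n_orig,
        "compressed_words": n_comp,
        "achieved_ratio": n_comp / n_orig if n_orig else 0.0,
    }

stats = compression_stats("a b c d e f g h i j", "a c e")
print(stats)  # achieved_ratio: 0.3
```

Comparing the achieved ratio against the requested one helps spot methods that over- or under-compress on a given input.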
Evaluation
For evaluating compressor performance, follow this example:
```python
from pctoolkit.runners import run
from pctoolkit.datasets import load_dataset
from pctoolkit.metrics import load_metrics
from pctoolkit.compressors import PromptCompressor

compressor = PromptCompressor(type='SCCompressor', device='cuda')
dataset_name = 'arxiv'
dataset = load_dataset(dataset_name)

# Evaluate the compressor on the dataset at a 0.1 compression ratio.
run(compressor=compressor, dataset=dataset, metrics=load_metrics, ratio=0.1)
Hint: Remember to fill in your Hugging Face tokens and OpenAI API keys in pctoolkit/runners.py. You can also modify the URLs there if you are using other OpenAI-compatible APIs. If you wish to change the metrics, specifically for the LongBench dataset, modify pctoolkit/metrics.py.
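Conceptually, the runner compresses each sample, sends the compressed prompt to a model, and scores the response against a reference. The sketch below mocks that loop end to end; the names (`mock_compress`, `score_fn`, `run_eval`) and the toy exact-match metric are illustrative assumptions, not PCToolkit's actual API, which uses real LLM calls and metrics such as BLEU or F1.

```python
def mock_compress(prompt: str, ratio: float) -> str:
    """Truncate to roughly `ratio` of the words (stand-in for a compressor)."""
    words = prompt.split()
    return " ".join(words[: max(1, int(len(words) * ratio))])

def score_fn(prediction: str, reference: str) -> float:
    """Toy exact-match metric in place of BLEU/ROUGE/F1."""
    return 1.0 if prediction == reference else 0.0

# (prompt, reference answer) pairs standing in for a loaded dataset.
samples = [("summarize: alpha beta gamma delta", "alpha")]

def run_eval(samples, ratio=0.5):
    total = 0.0
    for prompt, reference in samples:
        compressed = mock_compress(prompt, ratio)
        prediction = compressed.split()[-1]  # stand-in for an LLM call
        total += score_fn(prediction, reference)
    return total / len(samples)

print(run_eval(samples))  # average score over the dataset
```

Swapping in a real compressor, model call, and metric recovers the shape of PCToolkit's Runner module.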
Why Use PCToolkit?
PCToolkit offers a robust and versatile solution for anyone working with Large Language Models and seeking to optimize prompt efficiency. Its key advantages include:
- Comprehensive Coverage: Integrates five state-of-the-art prompt compression methods, providing a wide range of options for different use cases.
- Ease of Use: Features user-friendly interfaces that simplify the process of applying compression techniques and evaluating their impact.
- Modularity and Extensibility: Its modular design allows for easy integration of new datasets, metrics, and even custom compression methods, making it highly adaptable.
- Extensive Evaluation: Supports 11 diverse datasets and more than 5 metrics, enabling thorough and reproducible evaluation across a range of NLP tasks.
- Research and Development Ready: Ideal for researchers and developers looking to experiment with, compare, and build upon existing prompt compression techniques.
Links
- GitHub Repository: https://github.com/3DAgentWorld/Toolkit-for-Prompt-Compression
- Technical Report (Paper): https://arxiv.org/abs/2403.17411
- Hugging Face Demo: https://huggingface.co/spaces/CjangCjengh/Prompt-Compression-Toolbox