agent-service-toolkit: A Comprehensive Toolkit for AI Agent Services with LangGraph

Summary
The agent-service-toolkit is a full-featured repository for building and running AI agent services. It leverages LangGraph for agent logic, FastAPI for a robust service API, and Streamlit for an interactive chat interface, providing a complete template for developing and deploying custom AI agents.
Introduction
The agent-service-toolkit by JoshuaC215 offers a complete solution for running AI agent services. Built with LangGraph, FastAPI, and Streamlit, it provides everything from the agent definition to a user-friendly chat interface. The project serves as an excellent template for developers who want to quickly build and deploy their own agents on the LangGraph framework.
Key components include a LangGraph agent, a FastAPI service to serve it, a client for interaction, and a Streamlit application for a chat interface. Data structures and settings are meticulously built with Pydantic, ensuring reliability and ease of use.
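To illustrate the kind of validated data structures the toolkit builds with Pydantic, here is a rough stdlib stand-in using `dataclasses` (the field names and allowed roles below are hypothetical examples, not the toolkit's actual schema, which lives in the repository):

```python
from dataclasses import dataclass

# Hypothetical message roles, for illustration only
ALLOWED_TYPES = {"human", "ai", "tool"}

@dataclass
class ChatMessage:
    """Illustrative stand-in for a validated chat-message model."""
    type: str
    content: str

    def __post_init__(self):
        # Reject unknown roles at construction time, much as a
        # Pydantic validator would
        if self.type not in ALLOWED_TYPES:
            raise ValueError(f"unknown message type: {self.type!r}")

msg = ChatMessage(type="ai", content="It rings a bell...")
print(msg.type)  # -> ai
```

In the real toolkit, Pydantic models additionally handle serialization to and from the JSON exchanged between the FastAPI service and its clients.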
Installation
Getting started with agent-service-toolkit is straightforward, with options for both Python virtual environments and Docker.
Quickstart with Python
- **Set up environment variables:** Create a `.env` file in the root directory with at least one LLM API key (e.g., `OPENAI_API_KEY=your_openai_api_key`).
- **Install dependencies:** Use `uv` (recommended) or `pip`.

  ```sh
  curl -LsSf https://astral.sh/uv/0.7.19/install.sh | sh
  uv sync --frozen
  source .venv/bin/activate
  ```
- **Run the service:** In one terminal:

  ```sh
  python src/run_service.py
  ```
- **Run the Streamlit app:** In another terminal:

  ```sh
  source .venv/bin/activate
  streamlit run src/streamlit_app.py
  ```
Quickstart with Docker
- **Set up environment variables:** Create a `.env` file with your `OPENAI_API_KEY`:

  ```sh
  echo 'OPENAI_API_KEY=your_openai_api_key' >> .env
  ```
- **Launch services:** Ensure Docker and Docker Compose (>= v2.23.0) are installed, then run:

  ```sh
  docker compose watch
  ```

  This command automatically starts the PostgreSQL database, the FastAPI agent service, and the Streamlit app. Services update automatically on code changes.
Examples
The repository includes a generic `src/client/client.AgentClient` for interacting with the agent service. Here's a quick example of how to use it:
```python
from client import AgentClient

client = AgentClient()
response = client.invoke("Tell me a brief joke?")
response.pretty_print()

# Expected Output:
# ================================== Ai Message ==================================
#
# A man walked into a library and asked the librarian, "Do you have any books on Pavlov's dogs and Schrödinger's cat?"
# The librarian replied, "It rings a bell, but I'm not sure if it's here or not."
```
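The service also exposes streaming endpoints (see the features below). The exact wire format is defined by the toolkit, but streamed responses of this kind are commonly delivered as server-sent events; the following stdlib-only sketch shows how such a stream could be parsed on the client side. The `data:` prefix and `[DONE]` sentinel here are illustrative conventions, not the toolkit's documented protocol:

```python
import json

def parse_sse_lines(lines):
    """Yield parsed JSON payloads from an iterable of SSE-style lines.

    Assumes each event arrives as 'data: <json>' and that a
    'data: [DONE]' sentinel terminates the stream -- both are
    illustrative assumptions, not the toolkit's documented format.
    """
    for raw in lines:
        line = raw.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and comments
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        yield json.loads(payload)

# Example: tokens streamed one per event
sample = [
    'data: {"type": "token", "content": "Hello"}',
    '',
    'data: {"type": "token", "content": " world"}',
    'data: [DONE]',
]
tokens = [evt["content"] for evt in parse_sse_lines(sample)]
print("".join(tokens))  # -> Hello world
```

In practice the `AgentClient` hides this parsing behind its own interface, so most users never touch the raw event stream.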
Why Use It
agent-service-toolkit stands out due to its comprehensive feature set and robust architecture:
- **LangGraph Agent and Latest Features:** Implements LangGraph v1.0 features, including human-in-the-loop with `interrupt()`, flow control with `Command`, long-term memory with `Store`, and `langgraph-supervisor`.
- **FastAPI Service:** Provides both streaming and non-streaming endpoints for serving agents efficiently.
- **Advanced Streaming:** Features a novel approach supporting both token-based and message-based streaming.
- **Streamlit Interface:** Offers a user-friendly chat interface with voice input and output capabilities.
- **Multiple Agent Support:** Allows running and calling multiple agents by URL path, with available agents and models described in `/info`.
- **Asynchronous Design:** Utilizes async/await for efficient handling of concurrent requests.
- **Docker Support:** Includes Dockerfiles and a `docker compose` file for easy development and deployment.
- **Testing:** Comes with robust unit and integration tests for the entire repository.
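The distinction between token-based and message-based streaming can be sketched with plain asyncio. This is a stdlib-only illustration of the idea, with hypothetical sources standing in for the toolkit's actual implementation: a token stream yields small text fragments as they are generated, while a message stream yields whole messages (e.g., tool results) as they complete.

```python
import asyncio

async def token_stream(text):
    """Token-based streaming: yield small fragments as they are 'generated'."""
    for word in text.split():
        await asyncio.sleep(0)  # stand-in for real generation latency
        yield word + " "

async def message_stream(messages):
    """Message-based streaming: yield complete messages as they finish."""
    for msg in messages:
        await asyncio.sleep(0)
        yield msg

async def main():
    tokens = [t async for t in token_stream("Hello from the agent")]
    msgs = [m async for m in message_stream(["tool call result", "final answer"])]
    return "".join(tokens).strip(), msgs

tokens_joined, msgs = asyncio.run(main())
print(tokens_joined)  # -> Hello from the agent
print(msgs)           # -> ['tool call result', 'final answer']
```

Supporting both modes lets a UI render partial text immediately while still receiving structured intermediate messages, which is why the toolkit's API exposes them side by side.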
Links
- GitHub Repository: https://github.com/JoshuaC215/agent-service-toolkit
- Streamlit App: https://agent-service-toolkit.streamlit.app/
- Video Walkthrough: https://www.youtube.com/watch?v=pdYVHw_YCNY