Kong Gateway: The Cloud-Native API and AI Gateway for Microservices

Summary

Kong Gateway is a high-performance, cloud-native API and AI Gateway, distinguished by its extensibility and multi-LLM support. It provides robust functionality for proxying, routing, load balancing, and authentication, serving as a central layer for orchestrating microservices and AI traffic. With native Kubernetes integration and a rich plugin ecosystem, Kong Gateway is a versatile solution for modern API management.

Introduction

Kong Gateway, also known simply as Kong, is a powerful cloud-native, platform-agnostic, and scalable API, LLM, and MCP (Model Context Protocol) Gateway. It stands out for its high performance and extensive extensibility through a rich plugin architecture. Kong Gateway offers advanced AI traffic capabilities, including multi-LLM support, semantic security, and MCP traffic security and analytics.

By centralizing functionalities like proxying, routing, load balancing, health checking, and authentication, Kong serves as the essential layer for orchestrating microservices, conventional API traffic, and agentic LLM and MCP workflows with ease. It runs natively on Kubernetes, thanks to its official Kubernetes Ingress Controller.

Installation

Getting started with Kong Gateway is straightforward. For a cloud-hosted experience, you can sign up for a free trial of Kong Konnect. If you prefer to run Kong on your own infrastructure, the official installation page provides comprehensive instructions for various distributions, including Docker, Kubernetes, and bare metal.

To quickly test drive Kong using Docker Compose, follow these steps:

$ git clone https://github.com/Kong/docker-kong
$ cd docker-kong/compose/
$ KONG_DATABASE=postgres docker-compose --profile database up

The Gateway will be available on localhost at ports 8000 (proxy traffic), 8001 (Admin API), and 8002 (Kong Manager UI).
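
To verify that the Gateway is running, query the Admin API (assuming the default ports above):

$ curl -i http://localhost:8001/

A 200 response with node metadata confirms the Admin API is reachable. Until routes are configured, requests to the proxy port (8000) return a 404, which is expected.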

Examples

Once installed, you can quickly exercise Kong Gateway's capabilities. A common first step is to configure a service and add authentication in under 5 minutes, demonstrating how Kong centralizes API management tasks; a sketch of that workflow follows.
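
As a sketch of that workflow (the service name, route path, consumer, and key below are illustrative, and httpbin.org stands in for your upstream), the Admin API calls look like this:

$ # Register an upstream service
$ curl -i -X POST http://localhost:8001/services \
    --data name=example-service \
    --data url='http://httpbin.org'

$ # Expose the service on a route
$ curl -i -X POST http://localhost:8001/services/example-service/routes \
    --data name=example-route \
    --data 'paths[]=/example'

$ # Require an API key on the service
$ curl -i -X POST http://localhost:8001/services/example-service/plugins \
    --data name=key-auth

$ # Create a consumer and issue a key
$ curl -i -X POST http://localhost:8001/consumers --data username=alice
$ curl -i -X POST http://localhost:8001/consumers/alice/key-auth \
    --data key=my-secret-key

After this, http://localhost:8000/example rejects anonymous requests with a 401 and accepts those carrying the header apikey: my-secret-key.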

For those interested in its AI capabilities, the official AI documentation provides guides on getting started with Kong AI Gateway features, including LLM and MCP functionalities.
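
As an illustrative sketch of that setup (not an official quickstart), the ai-proxy plugin can turn an existing route into an OpenAI-compatible chat endpoint; the route name, model, and API key below are placeholders, and the config fields follow the plugin's documented shape, which may vary across versions:

$ curl -i -X POST http://localhost:8001/routes/example-route/plugins \
    --data name=ai-proxy \
    --data config.route_type=llm/v1/chat \
    --data config.auth.header_name=Authorization \
    --data "config.auth.header_value=Bearer $OPENAI_API_KEY" \
    --data config.model.provider=openai \
    --data config.model.name=gpt-4o

Clients then send chat-completion requests to the route on port 8000, and Kong injects the provider credentials and forwards the traffic; changing config.model.provider is how the same route can target a different LLM backend.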

Why Use Kong Gateway?

Kong Gateway centralizes common API, AI, and MCP functionalities across all your organization's services, allowing engineering teams to focus on core business challenges. Key features include:

  • Advanced Traffic Management: Sophisticated routing, load balancing, and health checking, configurable via a RESTful Admin API or declarative configuration.
  • Security & Authentication: Robust authentication and authorization for APIs using methods like JWT, basic auth, OAuth, and ACLs.
  • Universal LLM API: Route traffic across multiple LLM providers such as OpenAI, Anthropic, GCP Gemini, AWS Bedrock, Azure AI, Databricks, Mistral, and Hugging Face.
  • MCP Governance: Comprehensive MCP traffic governance, security, observability, and autogeneration from any RESTful API.
  • Extensive AI Features: Over 60 AI features, including AI observability, semantic security and caching, and semantic routing.
  • Flexible Connectivity: Proxy, SSL/TLS termination, and connectivity support for both L4 and L7 traffic.
  • Rich Plugin Ecosystem: A vast array of plugins for traffic controls, rate limiting, request/response transformations, logging, and monitoring, with a dedicated Plugin Hub and development kit.
  • Modern Deployment Models: Supports advanced deployment strategies like Declarative Databaseless Deployment and Hybrid Deployment (control plane/data plane separation) without vendor lock-in; a declarative sketch follows this list.
  • Kubernetes Native: Seamless integration with Kubernetes through its native Ingress Controller.
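
As a minimal sketch of the databaseless model mentioned above, the whole Gateway configuration can live in one declarative file; the service, route, and rate-limiting values here are illustrative:

# kong.yml — declarative configuration for DB-less mode
_format_version: "3.0"

services:
  - name: example-service
    url: http://httpbin.org
    routes:
      - name: example-route
        paths:
          - /example
    plugins:
      - name: rate-limiting
        config:
          minute: 5
          policy: local

Starting a node with KONG_DATABASE=off and KONG_DECLARATIVE_CONFIG pointing at this file brings up the same service and rate limit without Postgres.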
