turboseek: An Open-Source AI Search Engine Inspired by Perplexity

Summary

turboseek is an open-source AI search engine developed by Nutlope, inspired by platforms like Perplexity. Built with TypeScript, it combines LLMs and search APIs to deliver comprehensive answers along with related follow-up questions. The project offers a solid foundation for anyone building their own AI-powered search solution.

Repository Info

Updated on January 17, 2026

Introduction

turboseek is an innovative open-source AI search engine, developed by Nutlope, that aims to replicate the intelligent search experience of platforms like Perplexity. Built primarily with TypeScript and the Next.js app router, it integrates powerful LLMs from Together AI and search capabilities from Exa.ai to deliver concise, context-aware answers. The project provides a clear example of how to combine modern web technologies with advanced AI models to create a dynamic search solution.
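To make the streaming pattern concrete, here is a minimal sketch of a Next.js app-router route handler that streams a plain-text response. The route path, request shape, and placeholder token source are assumptions for illustration, not turboseek's actual code; in the real app the tokens would come from Together AI's streaming API.

```typescript
// Hypothetical app/api/answer/route.ts — a minimal streaming route handler sketch.
export async function POST(req: Request): Promise<Response> {
  const { question } = await req.json();

  // Stream the answer back chunk by chunk, as an LLM streaming API would.
  const stream = new ReadableStream({
    async start(controller) {
      const encoder = new TextEncoder();
      // Placeholder tokens; a real implementation would forward LLM output here.
      for (const token of ["Answering: ", question]) {
        controller.enqueue(encoder.encode(token));
      }
      controller.close();
    },
  });

  return new Response(stream, { headers: { "Content-Type": "text/plain" } });
}
```

The app router lets a route handler return a `Response` wrapping a `ReadableStream`, which is what makes token-by-token answer streaming possible without buffering the full reply.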

Installation

To get turboseek up and running locally, follow these steps:

  1. Fork or clone the repository: git clone https://github.com/Nutlope/turboseek.git
  2. Create an account at Together AI for LLM inference.
  3. Create an account at SERP API, or at Azure for the Bing Search API, to enable web search.
  4. Create an account at Helicone for observability.
  5. Create a .env file based on .example.env and populate it with your API keys.
  6. Run npm install to install dependencies.
  7. Run npm run dev to start the development server locally.
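For step 5, the populated .env file might look like the fragment below. The variable names here are assumptions based on the services listed above; check the repository's .example.env for the actual names.

```shell
# Hypothetical .env contents — confirm exact variable names in .example.env
TOGETHER_API_KEY=your_together_ai_key
BING_API_KEY=your_bing_search_key
HELICONE_API_KEY=your_helicone_key
```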

Examples

turboseek answers user queries through a multi-step workflow:

  1. A user submits a question to the engine.
  2. The system makes a request to a search API (e.g., Bing) to retrieve the top 6 relevant results.
  3. Text content is scraped from these 6 links and compiled as context.
  4. An LLM (e.g., OpenAI gpt-oss) processes the user's question along with the gathered context, streaming the answer back to the user.
  5. Concurrently, another LLM (e.g., Llama 3.1 8B) generates three related follow-up questions for the user.
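The steps above can be sketched in TypeScript as follows. The Bing endpoint, response shape, environment variable, and helper names are assumptions for illustration, not turboseek's actual implementation.

```typescript
type SearchResult = { title: string; url: string };

// Step 2: fetch the top 6 results from a search API (Bing shown as an example;
// the endpoint and env var name are assumptions).
async function searchTop6(query: string): Promise<SearchResult[]> {
  const res = await fetch(
    `https://api.bing.microsoft.com/v7.0/search?q=${encodeURIComponent(query)}&count=6`,
    { headers: { "Ocp-Apim-Subscription-Key": process.env.BING_API_KEY ?? "" } }
  );
  const data = await res.json();
  return data.webPages.value.map((p: any) => ({ title: p.name, url: p.url }));
}

// Steps 3–4: compile scraped page text into a single prompt for the answering LLM.
// (Scraping itself is omitted; `sources` holds already-extracted text.)
function buildPrompt(question: string, sources: { url: string; text: string }[]): string {
  const context = sources
    .map((s, i) => `[${i + 1}] ${s.url}\n${s.text.slice(0, 2000)}`)
    .join("\n\n");
  return `Answer the question using only the context below.\n\nContext:\n${context}\n\nQuestion: ${question}`;
}
```

The resulting prompt would then be sent to the answering LLM with streaming enabled (step 4), while a second, smaller model receives the same question to generate the three follow-up suggestions (step 5).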

Why Use

turboseek stands out as an excellent resource for several reasons. It offers a fully open-source implementation of an AI search engine, providing transparency and flexibility for developers. Its architecture demonstrates the integration of modern technologies like Next.js, Together AI, and Exa.ai, making it a valuable learning tool. Furthermore, the project includes a detailed tutorial on how to build such an engine, making it accessible for those looking to understand or extend AI search capabilities. Whether you're building a new search solution or exploring advanced RAG techniques, turboseek provides a solid starting point.

Links

Explore turboseek further: