API User Guide

About This Guide

Welcome to the LLMInspect API User Guide. This guide is designed to help users of all levels interact seamlessly with various language models using the LLMInspect API. With LLMInspect API, you can connect to multiple LLM providers, including OpenAI, Gemini, and your locally deployed InspectGPT (Local LLM). It is compatible with the OpenAI API format, making it easy for users familiar with OpenAI's API to get started quickly.

Key Features

  • Multi-LLM Support: Connect to multiple language model providers such as OpenAI, Gemini, and Local LLMs like InspectGPT.
  • OpenAI API Compatibility: LLMInspect follows the OpenAI Chat Completions format, allowing for a smooth transition if you're already using OpenAI's API.
  • Chat Completions and Image Generation: Send chat completion requests and generate images with supported models; a minimal example follows this list.
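
Because LLMInspect follows the OpenAI format, existing OpenAI client code can usually be pointed at the gateway with only a base-URL change. The sketch below is illustrative only: the gateway URL, API key placeholder, and model names are assumptions, not values defined by this guide.

```python
from openai import OpenAI

# Hypothetical values -- substitute your deployment's gateway URL and API key.
client = OpenAI(
    base_url="https://llminspect.example.com/v1",
    api_key="YOUR_LLMINSPECT_API_KEY",
)

# Chat completion: same request shape as the OpenAI Chat Completions API.
chat = client.chat.completions.create(
    model="gpt-4o",  # example model name; use whatever your deployment exposes
    messages=[{"role": "user", "content": "Summarize the LLMInspect API in one sentence."}],
)
print(chat.choices[0].message.content)

# Image generation uses the same client for models that support it.
image = client.images.generate(
    model="dall-e-3",  # example model name
    prompt="A simple line drawing of an API gateway",
)
print(image.data[0].url)
```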

By default, all requests sent through the LLMInspect API are directed to OpenAI. To interact with Gemini or InspectGPT, you can specify the desired provider by using the appropriate headers, which will be explained in detail later in this guide.
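
As a rough illustration of this routing, the sketch below sends one request with no extra headers (routed to OpenAI by default) and one with a provider header. The header name and value shown here are placeholders; the actual headers are documented later in this guide.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://llminspect.example.com/v1",  # hypothetical gateway URL
    api_key="YOUR_LLMINSPECT_API_KEY",
)

# No extra headers: the request is routed to OpenAI, the default provider.
default_reply = client.chat.completions.create(
    model="gpt-4o",  # example model name
    messages=[{"role": "user", "content": "Hello from the default provider."}],
)

# To target another provider, attach the routing header described later in
# this guide. "X-LLM-Provider" is a placeholder, not the real header name.
gemini_reply = client.chat.completions.create(
    model="gemini-pro",  # example model name
    messages=[{"role": "user", "content": "Hello from Gemini."}],
    extra_headers={"X-LLM-Provider": "gemini"},
)
```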

[Figure: LLMInspect API]

Contents