

Pezzo supports integration with LangChain for observability and monitoring. Integration is as simple as configuring the LLM client to proxy its requests through Pezzo. To learn more, see the Pezzo Proxy documentation.

Example: LangChain with OpenAI

Below is an example using ChatOpenAI. The same configuration applies to chains and agents.

import { ChatOpenAI } from "langchain/chat_models/openai";

const llm = new ChatOpenAI({
  openAIApiKey: process.env.OPENAI_API_KEY,
  temperature: 0,
  configuration: {
    // Route all requests through the Pezzo Proxy instead of api.openai.com
    baseURL: "https://proxy.pezzo.ai/openai/v1",
    // These headers tell Pezzo which project and environment to report to
    defaultHeaders: {
      "X-Pezzo-Api-Key": "<Your API Key>",
      "X-Pezzo-Project-Id": "<Your Project ID>",
      "X-Pezzo-Environment": "Production",
    },
  },
});

const llmResult = await llm.predict("Tell me 5 fun facts about yourself!");
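The only Pezzo-specific parts of the configuration above are the `baseURL` and the three `X-Pezzo-*` headers. If you configure several models, a small helper can build the header object from your credentials. The helper below is a hypothetical convenience sketch, not part of the Pezzo SDK:

```typescript
// Hypothetical helper (not part of the Pezzo SDK): builds the
// defaultHeaders object expected by the Pezzo Proxy.
function pezzoHeaders(
  apiKey: string,
  projectId: string,
  environment: string = "Production"
): Record<string, string> {
  return {
    "X-Pezzo-Api-Key": apiKey,
    "X-Pezzo-Project-Id": projectId,
    "X-Pezzo-Environment": environment,
  };
}

// Usage with ChatOpenAI's `configuration` option:
// configuration: {
//   baseURL: "https://proxy.pezzo.ai/openai/v1",
//   defaultHeaders: pezzoHeaders(apiKey, projectId),
// }
console.log(pezzoHeaders("key-123", "proj-456"));
```

The environment defaults to "Production" to match the example above; pass a different value to report to another Pezzo environment.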