Using OpenAI With Pezzo
Ensure that you have the latest version of the Pezzo Client installed, as well as the OpenAI NPM package.

Initialize Pezzo and PezzoOpenAI

Learn more about how to initialize the Pezzo Client in the Pezzo Client documentation.
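A minimal initialization sketch is shown below. It assumes the @pezzo/client package and placeholder values for the API key, project ID, and environment; check the Pezzo Client documentation for the exact constructor options supported by your version:

```typescript
import { Pezzo, PezzoOpenAI } from "@pezzo/client";

// Initialize the Pezzo client with your project credentials.
const pezzo = new Pezzo({
  apiKey: "<your-pezzo-api-key>",
  projectId: "<your-pezzo-project-id>",
  environment: "Production",
});

// Wrap the OpenAI client with Pezzo so that requests are reported to your project.
const openai = new PezzoOpenAI(pezzo);
```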
Making Requests to OpenAI

Option 1: With Prompt Management (Recommended)
We recommend managing your AI prompts through Pezzo. This allows you to easily manage your prompts and keep track of your AI requests. See the Prompt Management documentation to learn more. Below is an example of how you can use Pezzo to retrieve a prompt and then use it to make a request to OpenAI.
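The sketch below assumes the pezzo and openai instances created above and a deployed prompt named "HelloWorld" (a placeholder); the exact signature of createChatCompletion may vary between Pezzo Client versions:

```typescript
// Fetch the deployed prompt (content, model, and settings) from Pezzo.
const prompt = await pezzo.getPrompt("HelloWorld");

// Make the request to OpenAI using the managed prompt.
const response = await openai.createChatCompletion(prompt);
```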
Option 2: Without Prompt Management

If you don’t want to manage your prompts through Pezzo, you can still use Pezzo to make requests to OpenAI and benefit from Pezzo’s observability features. You will make requests to the OpenAI API exactly as you normally would. The only difference is that you will use the PezzoOpenAI instance we created above. Here is an example:
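The sketch below assumes the openai (PezzoOpenAI) instance created above; the model and message content are placeholders:

```typescript
// A regular OpenAI chat completion request, made through the PezzoOpenAI
// instance so it still appears in Pezzo's Requests page.
const response = await openai.createChatCompletion({
  model: "gpt-3.5-turbo",
  temperature: 0,
  messages: [
    {
      role: "user",
      content: "Tell me 5 fun facts about yourself",
    },
  ],
});
```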
Additional Capabilities
The Pezzo client enhances your developer experience by providing additional functionality to the OpenAI API. This is done through the second argument of the createChatCompletion method.
Variables
You can specify variables that will be interpolated by the Pezzo client before sending the request to OpenAI. This is useful if you want to use the same prompt for multiple requests, but with different variables. Simply provide them in the variables object. Here is an example:
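The sketch below assumes a curly-brace interpolation syntax in the prompt content; the variable names numFacts and topic are placeholders:

```typescript
const response = await openai.createChatCompletion(
  {
    model: "gpt-3.5-turbo",
    temperature: 0,
    messages: [
      {
        role: "user",
        // {numFacts} and {topic} are interpolated from the variables below
        // before the request is sent to OpenAI.
        content: "Tell me {numFacts} fun facts about {topic}",
      },
    ],
  },
  {
    variables: {
      numFacts: 3,
      topic: "AI",
    },
  }
);
```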
Custom Properties
You can also specify custom properties that will be sent to Pezzo. This is useful if you want to add additional information to your request, such as the user ID or the request ID. This information will be visible in the Requests page of your Pezzo project, and you will be able to filter requests based on these properties. Here is an example:
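In the sketch below, the properties key of the second argument and the userId value are illustrative placeholders; check your Pezzo Client version for the exact option name:

```typescript
const response = await openai.createChatCompletion(
  {
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Hello, world!" }],
  },
  {
    // Custom properties show up in the Requests page and can be used as filters.
    properties: {
      userId: "some-user-id",
    },
  }
);
```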
Request Caching

Utilizing request caching can sometimes save up to 90% on your API costs and execution time. You can enable caching by setting cache to true in the second argument of the createChatCompletion method. Here is an example:
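A minimal sketch, assuming the same openai instance as above:

```typescript
const response = await openai.createChatCompletion(
  {
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Tell me a joke about cats" }],
  },
  {
    // Identical requests are served from Pezzo's cache instead of hitting OpenAI.
    cache: true,
  }
);
```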