PromptLayer is a platform for prompt engineering. It also helps with LLM observability: visualizing requests, versioning prompts, and tracking usage.
While PromptLayer does have LLMs that integrate directly with LangChain (e.g. PromptLayerOpenAI), using a callback is the recommended way to integrate PromptLayer with LangChain.
In this guide, we will go over how to set up the PromptLayerCallbackHandler.
See PromptLayer docs for more information.
Installation and Setup
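To follow along, the promptlayer package is required, plus the LangChain integration packages used in the examples below. A typical install might look like:

```shell
pip install promptlayer langchain-community langchain-openai
```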
Getting API Credentials
If you do not have a PromptLayer account, create one on promptlayer.com. Then get an API key by clicking on the settings cog in the navbar, and set it as an environment variable called PROMPTLAYER_API_KEY.
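For example, you can set the variable from Python before constructing any handlers (the key value below is a placeholder, not a real credential):

```python
import os

# Set the PromptLayer API key for this process. The value shown is a
# placeholder; in practice, export PROMPTLAYER_API_KEY in your shell or
# load it from a .env file instead of hardcoding it.
os.environ.setdefault("PROMPTLAYER_API_KEY", "pl_placeholder_key")

print("PROMPTLAYER_API_KEY" in os.environ)
```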
Usage
Getting started with PromptLayerCallbackHandler is fairly simple. It takes two optional arguments:
pl_tags
- an optional list of strings that will be tracked as tags on PromptLayer.
pl_id_callback
- an optional function that takes promptlayer_request_id
as an argument. This ID can be used with all of PromptLayer's tracking features to track metadata, scores, and prompt usage.
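For instance, a minimal pl_id_callback might simply record the ID; a sketch (the handler would then be constructed as PromptLayerCallbackHandler(pl_tags=["langchain"], pl_id_callback=pl_id_callback), and the "req_12345" ID below is made up for illustration):

```python
# A minimal pl_id_callback sketch: PromptLayer invokes this function with
# the request ID after each logged request. Here we just store and print it.
seen_request_ids = []

def pl_id_callback(promptlayer_request_id):
    seen_request_ids.append(promptlayer_request_id)
    print("PromptLayer request id:", promptlayer_request_id)

# Simulate one callback invocation with a made-up ID (illustration only).
pl_id_callback("req_12345")
```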
Simple OpenAI Example
In this simple example we use PromptLayerCallbackHandler with ChatOpenAI. We add a PromptLayer tag named chatopenai.
GPT4All Example
Full Featured Example
In this example, we unlock more of the power of PromptLayer.
PromptLayer allows you to visually create, version, and track prompt templates. Using the Prompt Registry, we can programmatically fetch the prompt template called example.
We also define a pl_id_callback function which takes in the promptlayer_request_id and logs a score and metadata, and links the prompt template used. Read more about tracking on our docs.