Portkey
This guide shows how to use Portkey to add logging, tracing, and other production features in your LangChain app.
First, let’s import Portkey, OpenAI, and the Agent tools.
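A minimal sketch of those imports, assuming the classic langchain package layout (newer releases may expose these modules elsewhere, e.g. under langchain_community):

```python
import os

from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.llms import OpenAI
from langchain.utilities import Portkey
```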
Get Portkey API Key
- Sign up for Portkey here
- On your dashboard, click on the profile icon on the bottom left, then click on “Copy API Key”
- Paste it below
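For example, with placeholder values (the OpenAI key is still required because requests are forwarded to OpenAI):

```python
os.environ["OPENAI_API_KEY"] = "<OPENAI_API_KEY>"  # requests still go to OpenAI

PORTKEY_API_KEY = "<PORTKEY_API_KEY>"  # paste your Portkey API Key here
```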
Set Trace ID
- Set the trace id for your request below
- The Trace ID can be common for all API calls originating from a single request
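For instance, with an illustrative trace id:

```python
TRACE_ID = "portkey_langchain_demo"  # any string; reuse it across related calls
```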
Generate Portkey Headers
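A sketch of building the headers with the Portkey utility imported above and attaching them to the LLM; the Portkey.Config signature shown here follows the LangChain utility at the time of writing and may differ across versions:

```python
# Build Portkey headers carrying the API key and trace id
headers = Portkey.Config(
    api_key=PORTKEY_API_KEY,
    trace_id=TRACE_ID,
)

# Pass the headers to the LLM so every request is routed through Portkey
llm = OpenAI(temperature=0, headers=headers)
```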
How Logging & Tracing Works on Portkey
Logging: Sending your request through Portkey ensures that all of the requests are logged by default
- Each request log contains the timestamp, model name, total cost, request time, request JSON, response JSON, and additional Portkey features

Tracing: The trace id is passed along with each request and is visible on the logs on the Portkey dashboard
- You can also set a distinct trace id for each request if you want
- You can append user feedback to a trace id as well. More info on this here
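As an illustration, running an agent with the Portkey-enabled LLM from above is enough for every underlying OpenAI call to be logged under the trace id (a sketch; it assumes the llm-math tool plus the headers and llm defined earlier):

```python
# Run the agent as usual; the only change is that the LLM carries Portkey headers
tools = load_tools(["llm-math"], llm=llm)
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True
)

agent.run("What is 2 raised to the 0.23 power?")
# The OpenAI calls made by the agent now appear in the Portkey logs,
# grouped under the trace id set above.
```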

Advanced LLMOps Features - Caching, Tagging, Retries
In addition to logging and tracing, Portkey provides more features that add production capabilities to your existing workflows:

Caching
Respond to previously served customers' queries from cache instead of sending them again to OpenAI. Match exact strings OR semantically similar strings. Cache can save costs and reduce latencies by 20x. Docs

Retries
Automatically reprocess any unsuccessful API requests up to 5 times. Uses an exponential backoff strategy, which spaces out retry attempts to prevent network overload. Docs

Tagging
Track and audit each user interaction in high detail with predefined tags. Docs
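A hedged sketch of enabling these features through the same Portkey.Config headers; the parameter names (cache, cache_age, retry_count, and the metadata tags) are assumptions based on the Portkey integration docs at the time of writing and should be verified against the current Portkey documentation:

```python
headers = Portkey.Config(
    # Mandatory
    api_key=PORTKEY_API_KEY,
    # Caching: "simple" matches exact strings, "semantic" matches similar strings
    cache="semantic",
    cache_age=1729,  # cache TTL in seconds
    # Retries: reprocess failed requests up to 5 times with exponential backoff
    retry_count=5,
    # Tracing
    trace_id=TRACE_ID,
    # Tagging: metadata used to track and audit each user interaction
    environment="production",
    user="john",
    organisation="acme",
)

llm = OpenAI(temperature=0, headers=headers)
```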