LLMOps for LangChain
Portkey brings production readiness to LangChain. With Portkey, you can:
- Connect to 150+ models through a unified API,
- View 42+ metrics & logs for all requests,
- Enable semantic cache to reduce latency & costs,
- Implement automatic retries & fallbacks for failed requests,
- Add custom tags to requests for better tracking and analysis, and more.
Quickstart - Portkey & LangChain
Since Portkey is fully compatible with the OpenAI signature, you can connect to the Portkey AI Gateway through the ChatOpenAI interface.
- Set the base_url as PORTKEY_GATEWAY_URL
- Add default_headers to consume the headers needed by Portkey using the createHeaders helper method.
You can then use the regular ChatOpenAI model in LangChain, with the provider header telling the gateway which LLM to route the request to. Portkey will also start logging all the requests in your account, which makes debugging extremely simple.
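A minimal sketch of this setup, assuming the langchain-openai and portkey-ai packages and placeholder keys:

```python
from langchain_openai import ChatOpenAI
from portkey_ai import createHeaders, PORTKEY_GATEWAY_URL

PORTKEY_API_KEY = "..."  # placeholder: your Portkey API key

# Headers consumed by the Portkey gateway; provider tells it where to route the call
portkey_headers = createHeaders(api_key=PORTKEY_API_KEY, provider="openai")

llm = ChatOpenAI(
    api_key="OPENAI_API_KEY",      # placeholder: your OpenAI key, forwarded by the gateway
    base_url=PORTKEY_GATEWAY_URL,  # send requests through Portkey instead of api.openai.com
    default_headers=portkey_headers,
)

print(llm.invoke("What is the meaning of life, the universe and everything?"))
```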

Using 150+ models through the AI Gateway
The power of the AI gateway comes when you’re able to use the above code snippet to connect with 150+ models across 20+ providers supported through the AI gateway. Let’s modify the code above to make a call to Anthropic’s claude-3-opus-20240229 model.
Portkey supports Virtual Keys which are an easy way to store and manage API keys in a secure vault. Let’s try using a Virtual Key to make LLM calls. You can navigate to the Virtual Keys tab in Portkey and create a new key for Anthropic.
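A sketch of the same setup with a Virtual Key, assuming placeholder key values:

```python
from langchain_openai import ChatOpenAI
from portkey_ai import createHeaders, PORTKEY_GATEWAY_URL

PORTKEY_API_KEY = "..."        # placeholder: your Portkey API key
ANTHROPIC_VIRTUAL_KEY = "..."  # placeholder: the Anthropic Virtual Key created above

# The virtual key carries the Anthropic credentials, so no provider API key is passed here
portkey_headers = createHeaders(api_key=PORTKEY_API_KEY, virtual_key=ANTHROPIC_VIRTUAL_KEY)

llm = ChatOpenAI(
    api_key="X",                   # dummy value; authentication comes from the virtual key
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=portkey_headers,
    model="claude-3-opus-20240229",
)

print(llm.invoke("What is the meaning of life, the universe and everything?"))
```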
The virtual_key parameter sets the authentication and provider for the AI provider being used. In our case we’re using the Anthropic Virtual Key. Notice that the api_key can be left blank, as that authentication won’t be used.
The AI gateway extends the ChatOpenAI class, making it a single interface to call any provider and any model.
Advanced Routing - Load Balancing, Fallbacks, Retries
The Portkey AI Gateway brings capabilities like load-balancing, fallbacks, experimentation and canary testing to LangChain through a configuration-first approach. Let’s take an example where we might want to split traffic between gpt-4 and claude-opus 50:50 to test the two large models. The gateway configuration for this would look like the following:
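A sketch of such a config, assuming two Virtual Keys (placeholder IDs) for OpenAI and Anthropic:

```python
config = {
    "strategy": {"mode": "loadbalance"},
    "targets": [
        {
            "virtual_key": "openai-virtual-key",  # placeholder: your OpenAI Virtual Key
            "override_params": {"model": "gpt-4"},
            "weight": 0.5,
        },
        {
            "virtual_key": "anthropic-virtual-key",  # placeholder: your Anthropic Virtual Key
            "override_params": {"model": "claude-3-opus-20240229"},
            "weight": 0.5,
        },
    ],
}
```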
When this config is attached to the request (as sketched below), the gateway splits traffic between gpt-4 and claude-3-opus-20240229 in the ratio of the defined weights.
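A sketch of attaching the config to the request headers, with a placeholder Portkey key:

```python
from langchain_openai import ChatOpenAI
from portkey_ai import createHeaders, PORTKEY_GATEWAY_URL

PORTKEY_API_KEY = "..."  # placeholder: your Portkey API key

# The config is passed to the gateway as a header; a saved config ID string also works
portkey_headers = createHeaders(api_key=PORTKEY_API_KEY, config=config)

llm = ChatOpenAI(
    api_key="X",                   # dummy; each target's virtual key handles authentication
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=portkey_headers,
)

print(llm.invoke("What is the meaning of life, the universe and everything?"))
```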
You can find more config examples here.
Tracing Chains & Agents
Portkey’s LangChain integration gives you full visibility into the running of an agent. Let’s take an example of a popular agentic workflow. We only need to modify the ChatOpenAI class to use the AI Gateway as above.
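A minimal sketch, using a simple prompt-to-model chain as a stand-in for the agent; the trace_id header here is an assumption for grouping a run’s requests under one trace in Portkey:

```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from portkey_ai import createHeaders, PORTKEY_GATEWAY_URL

PORTKEY_API_KEY = "..."  # placeholder: your Portkey API key

portkey_headers = createHeaders(
    api_key=PORTKEY_API_KEY,
    provider="openai",
    trace_id="agent-run-1",  # assumed option: groups every LLM call of this run in one trace
)

llm = ChatOpenAI(
    api_key="OPENAI_API_KEY",      # placeholder: forwarded to OpenAI by the gateway
    base_url=PORTKEY_GATEWAY_URL,
    default_headers=portkey_headers,
)

# Any chain or agent built on this llm is logged by Portkey
prompt = ChatPromptTemplate.from_template("Tell me a short joke about {topic}")
chain = prompt | llm

print(chain.invoke({"topic": "observability"}))
```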

- Observability - portkey.ai/docs/product/observability-modern-monitoring-for-llms
- AI Gateway - portkey.ai/docs/product/ai-gateway-streamline-llm-integrations
- Prompt Library - portkey.ai/docs/product/prompt-library