ChatXAI
For detailed documentation of all ChatXAI features and configurations, head to the API reference.
xAI offers an API to interact with Grok models.
Overview
Integration details
| Class | Package | Local | Serializable | JS support | Downloads | Version |
| --- | --- | --- | --- | --- | --- | --- |
| ChatXAI | langchain-xai | ❌ | beta | ✅ | | |
Model features
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ✅ | ❌ | ✅ | ✅ |
Setup
To access xAI models, you’ll need to create an xAI account, get an API key, and install the langchain-xai integration package.
Credentials
Head to this page to sign up for xAI and generate an API key. Once you’ve done this, set the XAI_API_KEY environment variable:
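For example, the key can be read interactively if it isn’t already set, as sketched below with Python’s standard os and getpass modules:

```python
import getpass
import os

# Prompt for the key only if it isn't already present in the environment
if not os.environ.get("XAI_API_KEY"):
    os.environ["XAI_API_KEY"] = getpass.getpass("Enter your xAI API key: ")
```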
Installation
The LangChain xAI integration lives in the langchain-xai package:
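For example, with pip:

```bash
pip install -U langchain-xai
```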
Instantiation
Now we can instantiate our model object and generate chat completions:

Invocation
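A minimal sketch of instantiating ChatXAI and generating a completion; the model name and keyword arguments below are illustrative, so adjust them to your needs:

```python
from langchain_xai import ChatXAI

# Model name and parameter values here are illustrative.
llm = ChatXAI(
    model="grok-4",
    temperature=0,
    max_retries=2,
)

messages = [
    ("system", "You are a helpful assistant that translates English to French."),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
print(ai_msg.content)
```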
Chaining
We can chain our model with a prompt template like so:
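For example, composing a ChatPromptTemplate with the model via the pipe operator (a sketch that reuses the llm object from above):

```python
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("human", "{input}"),
    ]
)

# Compose the prompt and model into a runnable chain
chain = prompt | llm
result = chain.invoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)
print(result.content)
```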
Tool calling

ChatXAI has a tool-calling API (we use “tool calling” and “function calling” interchangeably here) that lets you describe tools and their arguments, and have the model return a JSON object with a tool to invoke and the inputs to that tool. Tool calling is extremely useful for building tool-using chains and agents, and for getting structured outputs from models more generally.

ChatXAI.bind_tools()

With ChatXAI.bind_tools, we can easily pass in Pydantic classes, dict schemas, LangChain tools, or even functions as tools to the model. Under the hood, these are converted to an OpenAI tool schema, which looks like:
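```python
# Schematic outline of the converted function schema
{
    "name": "...",  # name of the tool
    "description": "...",  # description of what the tool does
    "parameters": {...},  # JSON Schema describing the tool's arguments
}
```

For example, a sketch of binding a hypothetical GetWeather tool defined as a Pydantic class and inspecting the resulting tool calls (the model name is illustrative):

```python
from pydantic import BaseModel, Field

from langchain_xai import ChatXAI


class GetWeather(BaseModel):
    """Get the current weather in a given location."""

    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")


llm = ChatXAI(model="grok-4")  # illustrative model name
llm_with_tools = llm.bind_tools([GetWeather])

ai_msg = llm_with_tools.invoke("What is the weather like in San Francisco?")
print(ai_msg.tool_calls)
```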
Live Search
xAI supports a Live Search feature that enables Grok to ground its answers using results from web searches:
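A sketch of enabling Live Search, assuming ChatXAI exposes xAI’s search_parameters option; the model name and parameter values are illustrative, so consult the API reference for the exact fields supported:

```python
from langchain_xai import ChatXAI

# Assumed: ChatXAI forwards search_parameters to xAI's Live Search API.
llm = ChatXAI(
    model="grok-3-latest",  # illustrative model name
    search_parameters={
        "mode": "auto",  # let the model decide when to search
        "max_search_results": 3,
    },
)

response = llm.invoke("Provide me a digest of world news from the last 24 hours.")
print(response.content)
```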
API reference

For detailed documentation of all ChatXAI features and configurations, head to the API reference.