This section is relevant for those using the LangSmith JS SDK version 0.2.0 and higher. If you are tracing using LangChain.js or LangGraph.js in serverless environments, see this guide.
- Set an environment variable named `LANGSMITH_TRACING_BACKGROUND` to `"false"`. This will cause your traced functions to wait for tracing to complete before returning.
  - Note that this is named differently from the environment variable in LangChain.js because LangSmith can be used without LangChain.
- Pass a custom client into your traced runs and `await` the `client.awaitPendingTraceBatches();` method.
For example, you can call `awaitPendingTraceBatches` alongside the `traceable` method before your function returns.
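A minimal sketch, assuming a generic serverless entry point (names like `handler` and `langsmithClient` are illustrative):

```typescript
import { Client } from "langsmith";
import { traceable } from "langsmith/traceable";

// Create the client explicitly so it can be awaited before the function returns.
const langsmithClient = new Client();

const tracedFn = traceable(
  async (input: string) => {
    // Your application logic here...
    return `Processed: ${input}`;
  },
  { name: "my-traced-function", client: langsmithClient }
);

// Illustrative serverless entry point.
export async function handler(input: string) {
  try {
    return await tracedFn(input);
  } finally {
    // Wait for pending trace batches to finish sending before the
    // serverless runtime freezes or tears down this execution environment.
    await langsmithClient.awaitPendingTraceBatches();
  }
}
```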
Rate limits at high concurrency
By default, the LangSmith client batches operations as your traced runs execute, sending a new batch every few milliseconds. This works well in most situations, but if your traced function is long-running and you have very high concurrency, you may also hit rate limits related to overall request count. If you are seeing rate limit errors related to this, you can try setting `manualFlushMode: true` when constructing your client.
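A minimal sketch, assuming the `Client` constructor from the `langsmith` package:

```typescript
import { Client } from "langsmith";

// With manual flush mode enabled, the client will not send trace batches
// in the background; you are responsible for flushing them yourself.
const client = new Client({
  manualFlushMode: true,
});
```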
You will then need to manually call `await client.flush()` before your serverless function closes; otherwise, buffered traces may never be sent.
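A sketch that reuses the manually flushed client from above (the `handler` wrapper is illustrative):

```typescript
import { Client } from "langsmith";
import { traceable } from "langsmith/traceable";

// Client created with manual flush mode, as shown above.
const client = new Client({ manualFlushMode: true });

const tracedFn = traceable(
  async (input: string) => {
    // Your application logic here...
    return `Processed: ${input}`;
  },
  { name: "my-traced-function", client }
);

// Illustrative serverless entry point.
export async function handler(input: string) {
  try {
    return await tracedFn(input);
  } finally {
    // Send any buffered trace batches before the runtime suspends or
    // tears down this execution environment.
    await client.flush();
  }
}
```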