How-to guides: Tracing
This section contains how-to guides related to tracing.
📄️ Annotate code for tracing
There are several ways to log traces to LangSmith.
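For example, the simplest approach in the Python SDK is the `@traceable` decorator (a minimal sketch; the function name and logic are illustrative):

```python
from langsmith import traceable

@traceable  # each call to this function is logged as a run in LangSmith
def format_prompt(subject: str) -> str:
    return f"Tell me a joke about {subject}."

format_prompt("rabbits")
```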
📄️ Toggle tracing on and off
Learn how to turn tracing on and off for your application.
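As a quick sketch, tracing is typically toggled through environment variables (assuming the `LANGCHAIN_TRACING_V2` variable; see the guide for the full set of options):

```python
import os

# Enable tracing globally; set to "false" (or unset) to turn it off.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "<your-api-key>"  # placeholder
```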
📄️ Log traces to specific project
You can change the destination project of your traces both statically through environment variables and dynamically at runtime.
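A minimal sketch of both approaches, assuming the `LANGCHAIN_PROJECT` environment variable and the `langsmith_extra` keyword:

```python
import os
from langsmith import traceable

# Static: all traces go to this project unless overridden.
os.environ["LANGCHAIN_PROJECT"] = "my-project"

@traceable
def my_function(text: str) -> str:
    return text.upper()

# Dynamic: override the destination project for a single call.
my_function("hello", langsmith_extra={"project_name": "my-other-project"})
```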
📄️ Set a sampling rate for traces
This section is relevant for those using the LangSmith SDK or LangChain, not for those logging directly with the LangSmith API.
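For illustration, sampling is configured with an environment variable (assuming `LANGCHAIN_TRACING_SAMPLING_RATE`; check the guide for the exact name in your SDK version):

```python
import os

# Log roughly 10% of traces; the rest are dropped client-side.
os.environ["LANGCHAIN_TRACING_SAMPLING_RATE"] = "0.1"
```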
📄️ Add metadata and tags to traces
LangSmith supports sending arbitrary metadata and tags along with traces.
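A sketch of both the static and per-call forms, assuming the `tags`/`metadata` arguments to `@traceable` and `langsmith_extra`:

```python
from langsmith import traceable

# Attach tags and metadata statically at decoration time...
@traceable(tags=["my-tag"], metadata={"variant": "abc123"})
def my_function(text: str) -> str:
    return text.swapcase()

# ...or dynamically at call time.
my_function(
    "hello",
    langsmith_extra={"tags": ["runtime-tag"], "metadata": {"user_id": "42"}},
)
```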
📄️ Implement distributed tracing
Sometimes, you need to trace a request across multiple services.
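A rough sketch of the idea, assuming `RunTree.to_headers()` and `tracing_context(parent=...)` from the Python SDK (the HTTP call and service URL are illustrative):

```python
import httpx
from langsmith import traceable
from langsmith.run_helpers import get_current_run_tree, tracing_context

@traceable
def call_downstream() -> None:
    # Serialize the current trace context into HTTP headers
    # (assumes tracing is enabled, so a run tree exists).
    headers = get_current_run_tree().to_headers()
    httpx.post("https://my-service.example.com/work", headers=headers)

@traceable
def do_work() -> str:
    return "done"

def request_handler(request_headers: dict) -> str:
    # In the downstream service, resume the trace from the headers.
    with tracing_context(parent=request_headers):
        return do_work()
```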
📄️ Access the current run (span) within a traced function
In some cases, you will want to access the current run (span) within a traced function, for example to extract its UUID, tags, or other information.
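For instance, a minimal sketch using `get_current_run_tree()` from the Python SDK:

```python
from langsmith import traceable
from langsmith.run_helpers import get_current_run_tree

@traceable
def my_function(text: str) -> str:
    run = get_current_run_tree()  # None if tracing is disabled
    if run is not None:
        print(run.id, run.tags)   # e.g. extract the run's UUID or tags
    return text
```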
📄️ Log multimodal traces
LangSmith supports logging and rendering images as part of traces. This is currently supported for multimodal LLM runs.
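As an illustrative sketch, a base64-encoded image sent through a traced OpenAI client (assuming `wrap_openai`; the file name and model are placeholders):

```python
import base64
from openai import OpenAI
from langsmith.wrappers import wrap_openai

client = wrap_openai(OpenAI())  # traced OpenAI client

with open("my_image.png", "rb") as f:  # placeholder image file
    image_b64 = base64.b64encode(f.read()).decode()

# The image is logged with the trace and rendered in the LangSmith UI.
client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image."},
            {"type": "image_url",
             "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
        ],
    }],
)
```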
📄️ Log retriever traces
Nothing will break if you don't log retriever traces in the correct format, and the data will still be logged. However, it will not be rendered in a way that is specific to retriever steps.
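A sketch of the expected shape, assuming `run_type="retriever"` and document dicts with `page_content`, `type`, and `metadata`:

```python
from langsmith import traceable

# run_type="retriever" tells LangSmith to render the output as documents.
@traceable(run_type="retriever")
def retrieve_docs(query: str) -> list[dict]:
    return [
        {
            "page_content": "LangSmith traces LLM applications.",
            "type": "Document",
            "metadata": {"source": "docs"},
        }
    ]
```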
📄️ Log custom LLM traces
Nothing will break if you don't log LLM traces in the correct format, and the data will still be logged. However, it will not be processed or rendered in a way that is specific to LLMs.
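A sketch of the chat-model shape, assuming `run_type="llm"` with OpenAI-style `messages` in and `choices` out (the reply content is canned for illustration):

```python
from langsmith import traceable

# run_type="llm" plus OpenAI-style inputs/outputs gets LLM-specific rendering.
@traceable(run_type="llm")
def my_chat_model(messages: list[dict]) -> dict:
    return {
        "choices": [
            {"message": {"role": "assistant", "content": "Hello, world!"}}
        ]
    }

my_chat_model([{"role": "user", "content": "Say hello."}])
```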
📄️ Prevent logging of sensitive data in traces
In some situations, you may need to prevent the inputs and outputs of your traces from being logged for privacy or security reasons. LangSmith provides a way to filter the inputs and outputs of your traces before they are sent to the LangSmith backend.
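One way to do this, assuming the `hide_inputs`/`hide_outputs` hooks on the `Client` (the redaction logic here is deliberately blunt):

```python
from langsmith import Client, traceable

# Transform inputs/outputs before they leave the process.
client = Client(
    hide_inputs=lambda inputs: {},    # drop all inputs
    hide_outputs=lambda outputs: {},  # drop all outputs
)

@traceable(client=client)
def process_payment(card_number: str) -> str:
    return "ok"
```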
📄️ Export traces
Learn how to export trace data from LangSmith.
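For example, a minimal sketch using `Client.list_runs` (the project name is a placeholder; see the guide for filters and available fields):

```python
from langsmith import Client

client = Client()

# Fetch runs from a project for offline analysis.
for run in client.list_runs(project_name="my-project", run_type="llm"):
    print(run.name, run.start_time)
```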
📄️ Share or unshare a trace publicly
Sharing a trace publicly will make it accessible to anyone with the link. Make sure you're not sharing sensitive information.
📄️ Compare traces
To compare traces, click the Compare button in the upper right-hand corner of any trace view.
📄️ Trace generator functions
In most LLM applications, you will want to stream outputs to minimize the time to the first token seen by the user.
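A sketch of tracing a generator, assuming the `reduce_fn` argument, which aggregates the streamed chunks into a single traced output:

```python
from langsmith import traceable

def concat_chunks(chunks: list) -> dict:
    # Combine streamed chunks into one output for the trace.
    return {"output": "".join(chunks)}

@traceable(reduce_fn=concat_chunks)
def stream_tokens(prompt: str):
    for token in ["Hello", ", ", "world", "!"]:
        yield token

for chunk in stream_tokens("hi"):
    print(chunk, end="")
```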
📄️ Trace with LangChain (Python and JS/TS)
LangSmith integrates seamlessly with LangChain (Python and JS), the popular open-source framework for building LLM applications.
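A minimal sketch: with the environment variables set, LangChain invocations are traced with no code changes (the model name and key are placeholders):

```python
import os

os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "<your-api-key>"  # placeholder

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")
llm.invoke("Hello, world!")  # appears as a trace in LangSmith
```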
📄️ Trace with LangGraph (Python and JS/TS)
LangSmith integrates smoothly with LangGraph (Python and JS).
📄️ Trace with Instructor (Python only)
We provide a convenient integration with Instructor, a popular open-source library for generating structured outputs with LLMs.
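Roughly, the pattern is to wrap the OpenAI client for tracing, then hand it to Instructor (a sketch assuming `instructor.from_openai` from Instructor 1.x; the model is a placeholder):

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel
from langsmith.wrappers import wrap_openai

class UserInfo(BaseModel):
    name: str
    age: int

# Trace the underlying client, then let Instructor add structured outputs.
client = instructor.from_openai(wrap_openai(OpenAI()))

client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John is 30 years old."}],
)
```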
📄️ Trace with the Vercel AI SDK (JS/TS only)
This feature is currently in beta while Vercel rolls out official telemetry support.
📄️ Trace without setting environment variables
As mentioned in other guides, environment variables are the usual way to configure whether tracing is enabled, the API endpoint, the API key, and the tracing project. This guide shows how to configure tracing without setting them.
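A sketch of the alternative: configure a `Client` explicitly and pass it to `@traceable` (assuming the `client` keyword; the key is a placeholder):

```python
from langsmith import Client, traceable

# Configure the client in code instead of via environment variables.
langsmith_client = Client(
    api_url="https://api.smith.langchain.com",
    api_key="<your-api-key>",  # placeholder
)

@traceable(client=langsmith_client)
def my_function(text: str) -> str:
    return text
```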
📄️ Trace using the LangSmith REST API
It is highly recommended to use our Python or TypeScript SDKs to send traces to LangSmith.
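If you must log directly, a run is created with a POST to the runs endpoint (a minimal sketch; field names follow the public API, but consult the guide for the full payload):

```python
import datetime
import os
import uuid

import requests

run_id = str(uuid.uuid4())

# Create the run; PATCH /runs/{run_id} later with outputs and end_time.
requests.post(
    "https://api.smith.langchain.com/runs",
    headers={"x-api-key": os.environ["LANGSMITH_API_KEY"]},
    json={
        "id": run_id,
        "name": "my_run",
        "run_type": "chain",
        "start_time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "inputs": {"text": "hello"},
    },
)
```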
📄️ Calculate token-based costs for traces
Learn how to calculate token-based costs for your traces.
📄️ Troubleshoot trace nesting
When tracing with the LangSmith SDK, LangGraph, or LangChain, the correct context should propagate automatically so that code executed within a parent trace is rendered in the expected location in the UI.