LangDB integrates seamlessly with libraries like LangChain, adding tracing and logging to your workflows so you can keep a detailed record of every run without slowing development. If you're already familiar with LangChain, adding LangDB to your workflow can offer enhanced functionality without adding complexity.
In this blog post, we'll walk through how to use LangDB with LangChain, including a practical example. By the end, you'll understand how to capture detailed logs and take advantage of LangDB’s features in your own LangChain projects.
Prerequisites
Tavily API token
OpenAI API token
Python v3.11
Pip packages: langchain (at least v0.1.0), openai, wikipedia, langchain-community, tavily-python, langchainhub, langchain-openai, python-dotenv
pip install langchain wikipedia langchain-community tavily-python langchainhub langchain-openai openai python-dotenv
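The package list above includes python-dotenv, so instead of hard-coding API keys (as the example below does for brevity) you could keep them in a local .env file. A minimal sketch, assuming a .env file in your project root with the same variable names the example uses:

# .env (hypothetical file; fill in your own keys)
# OPENAI_API_KEY=xxxx        # your LangDB API key
# TAVILY_API_KEY=tvly-xxxx

from dotenv import load_dotenv

load_dotenv()  # populates os.environ from .env, keeping keys out of source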
Example: Using LangDB with LangChain
Below is an example of how you can integrate LangDB into your LangChain workflow. The integration is designed to be as simple as possible, letting you focus on writing logic without worrying about setup complexities.
from langchain import hub
from langchain.agents import (
    AgentExecutor,
    create_tool_calling_agent,
)
from langchain_openai import ChatOpenAI
import os
from langchain_community.agent_toolkits.load_tools import load_tools
from langchain_community.tools.tavily_search.tool import TavilySearchResults
from langchain_community.utilities.tavily_search import TavilySearchAPIWrapper
import uuid

api_base = "https://api.us-east-1.langdb.ai"  ### LangDB API base URL

pre_defined_run_id = uuid.uuid4()
default_headers = {
    "x-project-id": "xxxxx",  ### LangDB Project ID
    "x-thread-id": str(pre_defined_run_id),
}

os.environ["OPENAI_API_KEY"] = "xxxx"  ### LangDB API key
os.environ["TAVILY_API_KEY"] = "tvly-xxxx"


def get_function_tools():
    # Combine Tavily web search with the Wikipedia tool
    search = TavilySearchAPIWrapper()
    tavily_tool = TavilySearchResults(api_wrapper=search)
    tools = [tavily_tool]
    tools.extend(load_tools(["wikipedia"]))
    return tools


def init_action():
    # Route OpenAI-compatible calls through LangDB via openai_api_base,
    # passing the LangDB project and thread IDs as default headers
    llm = ChatOpenAI(
        model_name="gpt-4o-mini",
        temperature=0.3,
        openai_api_base=api_base,
        default_headers=default_headers,
        disable_streaming=True,
    )

    prompt = hub.pull("hwchase17/openai-functions-agent")
    tools = get_function_tools()

    agent = create_tool_calling_agent(llm, tools, prompt)
    agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

    agent_executor.invoke(
        {"input": "Who is the owner of Tesla company? Let me know details about owner."}
    )


init_action()
In this example:
We define an API base URL for LangDB (api_base), which routes the model calls through the LangDB API instead of OpenAI directly.
We add an x-project-id header to specify the LangDB project being used, and an x-thread-id header to create a unique thread for tracking and logging the execution within LangDB (see the sketch after this list).
Once executed, the agent processes the input query ("Who is the owner of Tesla company? Let me know details about owner.") using the tools integrated into the agent, like Tavily search and Wikipedia.
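Since each run carrying the same x-thread-id is tracked under the same thread, you can reuse one id across several invocations to group a multi-turn exchange. A minimal sketch, assuming LangDB groups runs by x-thread-id and that agent_executor is built exactly as in init_action() above with these headers:

import uuid

# Reuse a single thread id so related runs group under one LangDB thread
thread_id = str(uuid.uuid4())
default_headers = {
    "x-project-id": "xxxxx",   # LangDB Project ID
    "x-thread-id": thread_id,  # shared across all calls below
}

# agent_executor: constructed as in init_action(), using these headers
for question in [
    "Who is the owner of Tesla company?",
    "What other companies does he run?",
]:
    agent_executor.invoke({"input": question})  # both runs land in one thread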
Dynamic Model Switching
One of LangDB’s most powerful features is the ability to seamlessly switch between different models without major changes in the existing codebase. In the above example, if you want to use Claude 3.5 Sonnet, all you need to do is update the model name in your configuration:
# Switch to Anthropic's Claude model
llm = ChatOpenAI(
    model_name="claude-3-5-sonnet-20240620",  # Change model here
    temperature=0.3,
    openai_api_base=api_base,
    default_headers=default_headers,
)
With this small change, LangDB takes care of the rest, ensuring that your application can dynamically adapt to new models without the need for rewriting large parts of your codebase.
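To make the switch a single-parameter change, you could wrap model construction in a small helper. A minimal sketch based on the configuration above (api_base and default_headers as defined in the earlier example; model names are illustrative):

def make_llm(model_name: str) -> ChatOpenAI:
    # Build a ChatOpenAI client routed through LangDB
    return ChatOpenAI(
        model_name=model_name,
        temperature=0.3,
        openai_api_base=api_base,
        default_headers=default_headers,
    )

llm = make_llm("gpt-4o-mini")                   # OpenAI via LangDB
# llm = make_llm("claude-3-5-sonnet-20240620")  # Anthropic via LangDB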
Tracing in LangDB
LangDB’s tracing feature provides real-time visualizations of your AI workflows, breaking down the time spent in each stage. The trace shows key stages like:
API Invoke: Total time for the request.
Model Call: Time spent interacting with the model.
Tool Usage: Duration of specific tool calls.
The trace visualization below highlights these stages, helping you identify bottlenecks and optimize your workflow.
This detailed view makes it easier to diagnose performance issues and fine-tune your LangChain integrations.
Conclusion
Using LangDB with LangChain is a powerful yet straightforward way to manage and trace your AI workflows. By leveraging LangDB’s capabilities, you can focus on developing complex workflows without worrying about the operational overhead. The ability to seamlessly switch between models also ensures that you can stay agile as new AI technologies emerge.
Start integrating LangDB with LangChain today, and enjoy the flexibility and scalability it offers. Check out LangDB!