
LangChain

Integrate AgentSuite with LangChain via the Virtue Gateway for AgentGuard runtime enforcement, access control, MCP server scanning (where enabled), and session observability in the dashboard.

Installation

pip install agentsuite-sdk[langchain]

The [langchain] extra does not include an LLM provider — install the one you need separately (e.g. langchain-openai, langchain-anthropic, langchain-google-genai). The model string follows LangChain's init_chat_model format (e.g. openai:gpt-4o).
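The provider prefix in the model string selects which integration package LangChain loads. As a quick illustration of the "provider:model" convention (the helper below is hypothetical, not part of the SDK or of LangChain):

```python
def split_model_string(model: str) -> tuple[str, str]:
    # "openai:gpt-4o" -> ("openai", "gpt-4o"); the provider prefix maps to
    # an installed integration package such as langchain-openai.
    provider, _, name = model.partition(":")
    return provider, name

print(split_model_string("openai:gpt-4o"))
```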

How It Works

  • adapter.get_tools() — returns gateway tools to pass into create_agent(tools=[...]).
  • adapter.create_callback() — pass to config={"callbacks": [...]} for session tracking.
  • Use the adapter as an async context manager (async with adapter) to manage the MCP connection lifecycle.
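The lifecycle the async context manager handles can be sketched with a stand-in class. `DemoAdapter` below is hypothetical and only mimics the connect/teardown behavior described above; the real adapter comes from `client.langchain()`:

```python
import asyncio

class DemoAdapter:
    """Hypothetical stand-in illustrating the lifecycle `async with` manages."""

    def __init__(self):
        self.connected = False

    async def __aenter__(self):
        self.connected = True  # stands in for opening the MCP connection
        return self

    async def __aexit__(self, exc_type, exc, tb):
        self.connected = False  # stands in for closing the MCP connection

    async def get_tools(self):
        # Tools are only available while the connection is open.
        assert self.connected, "use the adapter inside `async with`"
        return ["ticket_search"]

async def main():
    adapter = DemoAdapter()
    async with adapter:
        tools = await adapter.get_tools()
    return tools, adapter.connected

tools, still_connected = asyncio.run(main())
print(tools, still_connected)
```

Calling `get_tools()` outside the `async with` block fails, which is why the quickstart keeps all gateway calls inside it.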

Quickstart

import asyncio

from langchain.agents import create_agent
from langchain_core.messages import HumanMessage
from agentsuite import GatewayClient

client = GatewayClient(url="...", api_key="sk-vai-...")
adapter = client.langchain()

async def main():
    async with adapter:
        tools = await adapter.get_tools()
        graph = create_agent(
            model="openai:gpt-4o",
            tools=tools,
            system_prompt="You are a helpful assistant.",
        )
        result = await graph.ainvoke(
            {"messages": [HumanMessage(content="What are my open tickets?")]},
            config={"callbacks": [adapter.create_callback()]},
        )

asyncio.run(main())

Full runnable example: demo_langchain.py

Example Output

The agent responds to the query and prints the session ID:

[Screenshot: LangChain demo terminal output]

View the full session trace in the VirtueAgent dashboard (Observability → Sessions):

[Screenshot: VirtueAgent Sessions tab showing the session overview and execution trace]

Environment Variables

Variable            Description
VIRTUE_GATEWAY_URL  Gateway MCP endpoint URL
VIRTUE_API_KEY      VirtueAI API key
OPENAI_API_KEY      OpenAI API key (only when using an openai: model)
AGENT_MODEL         Optional; init_chat_model-style string (demo default: openai:gpt-4o)
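A minimal sketch of reading these variables before constructing the client. The URL and key values below are placeholders for illustration; in practice, export the real values in your shell or a .env file:

```python
import os

# Placeholder values so the snippet runs standalone; replace with real ones.
os.environ.setdefault("VIRTUE_GATEWAY_URL", "https://gateway.example.com/mcp")
os.environ.setdefault("VIRTUE_API_KEY", "sk-vai-example")

gateway_url = os.environ["VIRTUE_GATEWAY_URL"]  # required; raises KeyError if unset
api_key = os.environ["VIRTUE_API_KEY"]          # required
model = os.environ.get("AGENT_MODEL", "openai:gpt-4o")  # optional, demo default
```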