LangChain + Govyn — Govern Your LangChain Agents

LangChain agents powered by Anthropic's Claude models can rack up costs quickly, especially with long-context completions. Without centralized control, you're trusting each agent to self-regulate — and there's no built-in way to enforce spending limits or audit what your agents are sending to Claude.

How it works

Your LangChain agents send requests over HTTPS to the Govyn proxy, which applies policy, budget, and logging rules before forwarding them to the Anthropic API.

Step-by-step setup

1. Start the Govyn proxy

```bash
npx govyn start --config govyn.yaml
```
2. Point LangChain at Govyn

```python
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(
    model="claude-sonnet-4-20250514",
    anthropic_api_url="http://localhost:4111",
    anthropic_api_key="gvn_agent_langchain_claude_01",
)
```
3. Run your chain as usual

```python
from langchain.agents import AgentExecutor, create_tool_calling_agent

# `llm` comes from the previous step; `tools` and `prompt`
# are whatever your agent already uses.
agent = create_tool_calling_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools)
result = executor.invoke({"input": "Draft a contract summary"})
```

Example policy

Define governance rules for your LangChain agents in a simple YAML file.

```yaml
# govyn.yaml
agents:
  langchain_claude_01:
    budget:
      daily: $10.00
      monthly: $200.00
    models:
      allow: [claude-sonnet-4-20250514, claude-haiku-4-5-20251001]
      deny: [claude-opus-4-20250514]
    rate_limit:
      requests_per_minute: 20
    context:
      max_input_tokens: 50000
    logging:
      replay: true
```
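To make the budget section concrete, here is a minimal sketch of the kind of check a proxy applies before forwarding a request. Govyn's internals are not shown on this page, so `check_budget` and its signature are illustrative, not Govyn's API:

```python
# Illustrative only: a daily budget-cap check like the one the
# `budget.daily` policy above implies. `check_budget` is a
# hypothetical name, not part of Govyn.

def check_budget(spent_today: float, request_cost: float, daily_cap: float) -> bool:
    """Return True if the request still fits under the daily cap."""
    return spent_today + request_cost <= daily_cap

# Against the example $10.00 daily cap:
print(check_budget(9.00, 0.75, 10.00))  # True: $9.75 total stays under cap
print(check_budget(9.50, 0.75, 10.00))  # False: $10.25 would exceed it
```

Requests that fail the check are rejected at the proxy, so the agent never reaches Anthropic and never incurs the cost.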

Why use Govyn with LangChain?

- Budget caps tuned for Claude's token pricing
- Model allowlists — keep agents on Sonnet, block Opus
- Input token limits to control long-context costs
- Full request/response replay for debugging
- Works with Claude's tool-use and streaming APIs
- Swap providers without changing agent code

Get started in 5 minutes

Add governance to your LangChain agents with a single config change. No code rewrites.

Read the docs

Frequently asked questions

Does Govyn support Claude's tool-use API through LangChain?
Yes. Govyn proxies the full Anthropic Messages API including tool-use calls, streaming, and multi-turn conversations. LangChain's tool-calling agent works identically through the proxy.
Can I limit the context window size for cost control?
Yes. Govyn lets you set max_input_tokens per agent. If a LangChain chain tries to send a completion request exceeding the token limit, Govyn rejects it before it reaches Anthropic — saving you from surprise bills on long-context calls.
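As a rough sketch of what such a pre-flight limit looks like, the check below estimates input size before sending. The ~4-characters-per-token figure is only a coarse heuristic for Claude-style tokenizers, and `estimate_tokens` / `within_limit` are illustrative names, not Govyn functions:

```python
# Hedged sketch of a client-side pre-flight check mirroring the
# max_input_tokens policy. Names and the chars/4 heuristic are
# assumptions for illustration, not Govyn's implementation.

MAX_INPUT_TOKENS = 50_000  # matches the example policy

def estimate_tokens(text: str) -> int:
    # Very rough heuristic: about 4 characters per token.
    return max(1, len(text) // 4)

def within_limit(prompt: str, limit: int = MAX_INPUT_TOKENS) -> bool:
    return estimate_tokens(prompt) <= limit

print(within_limit("Summarize this contract."))  # True
print(within_limit("x" * 300_000))               # False: ~75k tokens
```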
Can I switch between OpenAI and Anthropic without code changes?
With Govyn's smart routing, you can configure routing rules to send different agents to different providers. Your LangChain code only talks to the Govyn proxy — switching providers is a config change, not a code change.
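The routing idea can be sketched as a simple prefix match on the agent's Govyn key. The route table and `route_for` function below are hypothetical illustrations of the concept, not Govyn's actual configuration format:

```python
# Illustrative sketch of provider routing by API-key prefix.
# ROUTES and route_for are assumed names for illustration only.

ROUTES = {
    "gvn_agent_langchain_claude": "anthropic",
    "gvn_agent_langchain_oai": "openai",
}

def route_for(api_key: str, default: str = "anthropic") -> str:
    """Pick a provider by the longest matching key prefix."""
    for prefix, provider in sorted(
        ROUTES.items(), key=lambda kv: len(kv[0]), reverse=True
    ):
        if api_key.startswith(prefix):
            return provider
    return default

print(route_for("gvn_agent_langchain_claude_01"))  # anthropic
print(route_for("gvn_agent_langchain_oai_02"))     # openai
```

Because the LangChain code only ever holds the Govyn key and the proxy URL, changing the route table moves an agent to a different provider with no code change.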
