Framework Integrations
Drop NocturnusAI into your agent stack in one line. Cut token costs by 82–90% (measured on live APIs) with native integrations for the most popular AI frameworks.
Installation
pip install nocturnusai # core SDK
pip install nocturnusai[langchain] # + LangChain tools
pip install nocturnusai[crewai] # + CrewAI BaseTool subclasses
pip install nocturnusai[autogen] # + AutoGen tool functions + Memory
pip install nocturnusai[langgraph] # + LangGraph checkpoint saver
pip install nocturnusai[openai-agents] # + OpenAI Agents SDK tools
pip install nocturnusai[all] # everything
LangChain
Seven pre-built tools that plug directly into any LangChain agent: assert facts, teach rules, query, infer, retrieve salience-ranked context, run goal-driven optimization, and extract facts with an LLM.
from nocturnusai import SyncNocturnusAIClient
from nocturnusai.langchain import get_nocturnusai_tools
client = SyncNocturnusAIClient("http://localhost:9300")
tools = get_nocturnusai_tools(client)
# Pass tools to any LangChain agent
CrewAI
Five BaseTool subclasses with Pydantic input schemas, plus a Storage backend for crew-level knowledge persistence.
from nocturnusai import SyncNocturnusAIClient
from nocturnusai.crewai import get_nocturnusai_tools, NocturnusAIStorage
client = SyncNocturnusAIClient("http://localhost:9300")
tools = get_nocturnusai_tools(client)
storage = NocturnusAIStorage(client=client)
AutoGen
Five plain Python tool functions and an async Memory protocol implementation. Works with or without autogen-agentchat.
from nocturnusai import SyncNocturnusAIClient
from nocturnusai.autogen import get_nocturnusai_tools, NocturnusAIMemory
client = SyncNocturnusAIClient("http://localhost:9300")
tools = get_nocturnusai_tools(client)
memory = NocturnusAIMemory(client=client)
LangGraph
Checkpoint saver that persists graph state as NocturnusAI facts. Maps threads to scopes for isolation.
from nocturnusai import SyncNocturnusAIClient
from nocturnusai.langgraph import NocturnusAICheckpointSaver
client = SyncNocturnusAIClient("http://localhost:9300")
saver = NocturnusAICheckpointSaver(client=client)
app = graph.compile(checkpointer=saver)
OpenAI Agents SDK
Five tool functions, auto-decorated with @function_tool when the package is installed. Falls back to plain functions without it.
from nocturnusai import SyncNocturnusAIClient
from nocturnusai.openai_agents import get_nocturnusai_tools
client = SyncNocturnusAIClient("http://localhost:9300")
tools = get_nocturnusai_tools(client)
# Agent(name="reasoner", tools=tools)
Anthropic SDK
JSON schema tool definitions and a dispatcher for the Anthropic Messages API. Zero framework dependencies.
from nocturnusai import SyncNocturnusAIClient
from nocturnusai.anthropic_tools import get_nocturnusai_tool_definitions, handle_tool_call
client = SyncNocturnusAIClient("http://localhost:9300")
tools = get_nocturnusai_tool_definitions()
# response = anthropic.messages.create(tools=tools, ...)
Any Framework
Don't see your framework? NocturnusAI works with any Python or TypeScript agent via HTTP API, MCP protocol, or the direct SDK.
Running OpenClaw? Use the dedicated OpenClaw Context Integration guide for MCP registry setup and Context Engine plugin wiring.
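For the raw HTTP route, a minimal standard-library sketch. The only endpoint spelled out on this page is POST /context/diff; the JSON payload shape below is an assumption, not a documented contract:

```python
import json
import urllib.request

BASE_URL = "http://localhost:9300"

def build_diff_request(session_id: str) -> urllib.request.Request:
    """Build a POST /context/diff request (payload shape is an assumption)."""
    url = f"{BASE_URL}/context/diff"
    body = json.dumps({"session_id": session_id}).encode("utf-8")
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )

req = build_diff_request("session-42")
# with urllib.request.urlopen(req) as resp:   # requires a running server
#     diff = json.load(resp)                  # expected: added/removed deltas
```

The same pattern applies from TypeScript or any other language with an HTTP client.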
Context Optimization Across All Frameworks
Every integration above gives your agent access to optimized context retrieval. In Python and TypeScript app code, call the context-management methods directly. In MCP/tool-first flows, call the context tool each turn and pair it with the HTTP Context API when you need goal-driven windows.
| Integration Surface | Salience Context | Goal Context + Diff |
|---|---|---|
| LangChain / CrewAI / AutoGen / OpenAI Agents / Anthropic (Python SDK) | context() or framework context tool (deprecated: context_window()) | context(goals=...), diff_context(), clear_context_session() (deprecated: optimize_context()) |
| MCP clients (Cursor, Claude Desktop, Windsurf) | JSON-RPC tools/call with tool context | MCP context tool (with goals) + HTTP POST /context/diff |
| TypeScript apps | context() (deprecated: contextWindow()) | context({goals}), diffContext(), clearContextSession() (deprecated: optimizeContext()) |
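For MCP clients, the tools/call surface in the table can be sketched as a JSON-RPC 2.0 envelope. The tool name "context" and the goals argument follow the table and the example below; the exact argument key names are assumptions:

```python
import json

# Hypothetical JSON-RPC 2.0 envelope for an MCP tools/call invoking the
# context tool with goals (argument key names are assumptions).
rpc_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "context",
        "arguments": {
            "goals": [{"predicate": "eligible_for_sla", "args": ["acme_corp"]}],
            "session_id": "session-42",
        },
    },
}
wire = json.dumps(rpc_request)  # what the MCP client puts on the wire
```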
# Works with any framework above
from nocturnusai import SyncNocturnusAIClient
client = SyncNocturnusAIClient("http://localhost:9300")
ctx = client.context(
    goals=[{"predicate": "eligible_for_sla", "args": ["acme_corp"]}],
    max_facts=25,
    session_id="session-42",
)
# ctx.entries -> optimized facts for this turn
diff = client.diff_context(session_id="session-42")
# diff.added / diff.removed -> send only deltas on later turns
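Pulling those calls into a turn loop, a minimal sketch — the helper functions here are hypothetical; the client methods context(), diff_context(), and clear_context_session() are the ones listed in the table above:

```python
def context_for_turn(client, session_id: str, goals, first_turn: bool) -> dict:
    """Return what to send this turn: the full optimized window on turn one,
    then only the added/removed deltas afterwards (hypothetical helper)."""
    if first_turn:
        ctx = client.context(goals=goals, max_facts=25, session_id=session_id)
        return {"facts": ctx.entries}
    diff = client.diff_context(session_id=session_id)
    return {"added": diff.added, "removed": diff.removed}

def end_session(client, session_id: str) -> None:
    # Drop server-side diff state once the conversation is over.
    client.clear_context_session(session_id=session_id)
```

Sending only deltas after the first turn is what keeps per-turn token spend flat as the fact base grows.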