Integrations
Wire Neruva into the tool you already use.
One MCP server. Every modern coding agent. One CLI for everything else. Memory and Substrate, behind a single API key.
MCP clients
Claude Code
Anthropic
claude mcp add neruva \
  --env NERUVA_API_KEY=$NERUVA_API_KEY \
  -- npx -y @neruva/mcp

Claude Code docs →
Cursor
Anysphere
# ~/.cursor/mcp.json
{
"mcpServers": {
"neruva": {
"command": "npx",
"args": ["-y", "@neruva/mcp"],
"env": { "NERUVA_API_KEY": "..." }
}
}
}

Cursor docs →
Codex
OpenAI
# ~/.codex/config.toml
[mcp_servers.neruva]
command = "npx"
args = ["-y", "@neruva/mcp"]
env = { NERUVA_API_KEY = "..." }

Codex docs →
Gemini CLI
Google
# ~/.gemini/settings.json
{
"mcpServers": {
"neruva": {
"command": "npx",
"args": ["-y", "@neruva/mcp"],
"env": { "NERUVA_API_KEY": "..." }
}
}
}

Gemini CLI docs →
Any MCP host
open protocol
# stdio transport, JSON-RPC 2.0
npx -y @neruva/mcp

# tools exposed:
#   memory.upsert, memory.query, memory.delete
#   hd.kg.add_fact, hd.kg.query
#   hd.analogy, hd.causal.query, hd.plan

Any MCP host docs →
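Because the server speaks plain JSON-RPC 2.0 over stdio, any MCP host can drive these tools without an SDK. As a rough sketch (the envelope shape and method name follow the MCP specification; the argument names below are illustrative assumptions, not Neruva's documented schema), a tools/call request is one JSON object per line on the server's stdin:

```python
import json

def tools_call(request_id, tool_name, arguments):
    """Build one MCP tools/call frame for the stdio transport.

    The jsonrpc/method/params envelope follows the MCP spec;
    the specific arguments are assumptions for illustration.
    """
    msg = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    # stdio transport: newline-delimited JSON, one message per line
    return json.dumps(msg) + "\n"

frame = tools_call(1, "memory.query", {"index": "agents", "topK": 5})
```

A host writes this frame to the subprocess started by npx -y @neruva/mcp and reads the matching JSON-RPC response from its stdout.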
Python (pip)
for non-Node hosts
pip install neruva-mcp
neruva-mcp              # stdio MCP server
# or run inline:
python -m neruva_mcp

Python (pip) docs →
Neruva CLI
Install
npm i -g @neruva/cli    # or pip install neruva-cli
neruva auth login       # paste API key
neruva keys list
Memory
neruva memory upsert agents/coder \
  --id note-42 --vector @./embedding.json
neruva memory query agents/coder \
  --vector @./query.json --topK 5
neruva memory export agents/coder > agent.nmm
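The --vector @./file flags read the embedding from disk. The file format isn't spelled out on this page; a plain JSON array of floats is the natural assumption, and is easy to produce from whatever embedding model you already run:

```python
import json

# Assumption: @./embedding.json is a flat JSON array of floats.
# In practice this would come from your embedding model.
embedding = [0.12, -0.03, 0.98, 0.44]

with open("embedding.json", "w") as f:
    json.dump(embedding, f)
```

The same shape would serve for the query.json file passed to neruva memory query.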
Substrate
neruva hd kg add team \
  --subject alice --rel works_at --obj acme
neruva hd kg query team --subject alice --rel works_at
neruva hd analogy --a paris --b france --c tokyo
neruva hd causal observe --var smoke=1 --query cancer
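Neruva's substrate internals aren't documented here, but the analogy call matches the classic hyperdimensional-computing pattern: bind capital↔country pairs into a single superposed memory vector, then unbind with the query term and take the nearest neighbor. A toy sketch with random bipolar vectors (the encoding, dimensionality, and names are illustrative, not Neruva's actual implementation):

```python
import random

random.seed(0)
D = 10_000  # high dimensionality keeps cross-talk between pairs negligible

names = ["paris", "france", "berlin", "germany", "tokyo", "japan"]
hv = {n: [random.choice((-1, 1)) for _ in range(D)] for n in names}

def bind(a, b):   # elementwise product: binds two hypervectors
    return [x * y for x, y in zip(a, b)]

def add(a, b):    # elementwise sum: superposes hypervectors
    return [x + y for x, y in zip(a, b)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Superpose the bound capital<->country pairs into one memory vector
memory = add(add(bind(hv["paris"], hv["france"]),
                 bind(hv["berlin"], hv["germany"])),
             bind(hv["tokyo"], hv["japan"]))

# Unbinding with "tokyo" recovers ~japan; the other pairs contribute
# only near-orthogonal noise at this dimensionality
query = bind(memory, hv["tokyo"])
answer = max(names, key=lambda n: dot(hv[n], query))
```

Because binding is its own inverse on bipolar vectors, the tokyo-bound term collapses back to (a noisy copy of) japan, which nearest-neighbor search then cleans up.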
Use as backend
# Drop-in for Pinecone (Python)
from neruva import Pinecone
pc = Pinecone(api_key="...")
idx = pc.Index("agents")
idx.upsert(vectors=[("n1", vec)], namespace="coder")
idx.query(vector=qvec, top_k=5, namespace="coder")

SDKs
TypeScript / Node
npm i @neruva/sdk
import { Neruva } from "@neruva/sdk";
const nv = new Neruva({ apiKey: process.env.NERUVA_API_KEY });
await nv.memory.upsert({ index: "agents", namespace: "coder",
vectors: [{ id: "n1", values: [...] }] });
const r = await nv.hd.analogy({ a: "paris", b: "france", c: "tokyo" });

Python
pip install neruva
from neruva import Neruva
nv = Neruva(api_key=os.environ["NERUVA_API_KEY"])
nv.memory.upsert(index="agents", namespace="coder",
vectors=[{"id": "n1", "values": [...]}])
ans = nv.hd.causal.query(scm="lifestyle",
query_type="intervention",
condition_var=0, condition_value=1,
                         query_var=2, query_value=1)

Under the hood
- One server, many clients. @neruva/mcp is a thin MCP stdio server. It exposes the same tool set to every host, authenticates with your Neruva API key, and forwards calls to api.neruva.io.
- No keys leave your machine. The MCP server reads NERUVA_API_KEY from the local environment. Vectors and queries travel directly to the Neruva API over TLS.
- Same wallet, every surface. Ops from MCP, CLI, SDK, and raw REST all meter against the single wallet you can see in the dashboard. One ledger, sub-cent precision, exportable.
- Pinecone-compatible. Already wired to pinecone-client? Change the import to from neruva import Pinecone and your existing upsert/query/delete code works unchanged.
One key. Every coding agent.
Sign up. Drop one config line into your MCP client. Your agent has persistent memory and HD-native reasoning by lunch.