OUTATIME · MCP server

Plug OUTATIME into any AI client you already use.

We shipped OUTATIME as its own Model Context Protocol server. That means Claude Desktop, Cursor, Claude Code, Windsurf, or any other tool that speaks MCP can talk to OpenMetadata through our nine time-travel tools, without ever opening our web UI.

stdio JSON-RPC · read only · sub-millisecond per call
How it connects
Any MCP client on the left, OpenMetadata on the right, OUTATIME doing the time-travel work in the middle.
What is MCP

Think of it as USB for AI assistants.

MCP stands for Model Context Protocol. It is a small open standard that lets an AI client (like Claude Desktop or Cursor) plug into a server and call its tools, the same way USB lets your laptop talk to a printer, a webcam, or a hard drive without caring who made it.

OUTATIME is a server in that picture. We expose nine tools over MCP. Any compatible client can call those tools to read your metadata, walk lineage, replay history, or simulate a schema change. The AI does the talking, OUTATIME does the work.

In one sentence

“Most teams use MCP. We are an MCP server.”

  • Read only. The server never mutates your graph.
  • Local. Runs on your machine over stdio. Nothing leaves the box.
  • Deterministic. Same input, same output, every time.
  • No keys to hand out. No external API to wake up.
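On the wire, each tool call is one JSON-RPC 2.0 request over stdio. A sketch of what a client sends when it calls describe_entity (the method name and params shape follow the MCP spec; the request id and arguments here are illustrative):

```typescript
// Shape of an MCP tools/call request as it crosses stdio (JSON-RPC 2.0).
// The envelope follows the MCP spec; the id and arguments are illustrative.
const request = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/call",
  params: {
    name: "describe_entity",
    arguments: { id: "tbl:orders" },
  },
};

// Clients write one JSON object per line to the server's stdin.
const wire = JSON.stringify(request);
console.log(wire);
```

The response comes back the same way: one JSON-RPC result object per line on stdout.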
The nine tools

Everything the AI can do, spelled out.

Each tool takes JSON in and returns JSON out. The AI picks which ones to call based on your question.

tool
describe_entity

Get the full record for any table, dashboard, dbt model, or ML feature, plus its owner.

Input
{ "id": "tbl:orders" }
Returns

Entity JSON with columns, tier, cost, and the resolved owner card.

tool
get_lineage_downstream

Walk the lineage graph downstream from any entity. Tells you what breaks if you change the source.

Input
{ "id": "tbl:orders", "max_depth": 4 }
Returns

Ordered list of every entity reachable downstream within the depth.

tool
get_lineage_upstream

Walk the lineage upstream from any entity. Tells you what feeds into it.

Input
{ "id": "dash:exec_finance", "max_depth": 6 }
Returns

Ordered list of every upstream source the entity depends on.
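Under the hood, a lineage walk is a bounded breadth-first traversal over the edge list. A minimal sketch, assuming a simple { from, to } edge shape (the real engine lives in lib/metadata and may differ); flip the edge direction and the same function walks upstream:

```typescript
interface Edge { from: string; to: string }

// Bounded BFS downstream: everything reachable from `start` by following
// from -> to edges, up to maxDepth hops.
function walkDownstream(edges: Edge[], start: string, maxDepth: number): string[] {
  const seen = new Set<string>([start]);
  const out: string[] = [];
  let frontier = [start];
  for (let depth = 0; depth < maxDepth && frontier.length > 0; depth++) {
    const next: string[] = [];
    for (const node of frontier) {
      for (const e of edges) {
        if (e.from === node && !seen.has(e.to)) {
          seen.add(e.to);
          out.push(e.to);
          next.push(e.to);
        }
      }
    }
    frontier = next;
  }
  return out;
}

// Example: orders feeds a revenue model, which feeds an exec dashboard.
const edges: Edge[] = [
  { from: "tbl:orders", to: "mdl:revenue_daily" },
  { from: "mdl:revenue_daily", to: "dash:exec_finance" },
];
console.log(walkDownstream(edges, "tbl:orders", 4));
// → [ 'mdl:revenue_daily', 'dash:exec_finance' ]
```

Because the result list is built in visit order, closer dependents always appear before more distant ones, which is what makes the "what breaks first" reading possible.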

tool
list_dq_failures

Every data quality test that is currently failing or warning, with the table it watches.

Input
{}
Returns

Array of DQ tests where status is not passing.

tool
list_pii_columns

Every column tagged sensitive or restricted across the warehouse.

Input
{}
Returns

Array of { table, column, pii } records.

tool
find_deprecation_candidates

Tables that have zero downstream dashboards or ML features, ranked by monthly cost.

Input
{}
Returns

Tables ranked by spend, lowest-value first.
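The selection behind this tool is a plain filter-and-sort: keep tables with no downstream consumers, then order by monthly cost so the biggest savings surface first. A sketch, with illustrative field names (monthlyCost, downstream):

```typescript
interface TableMeta { id: string; monthlyCost: number; downstream: string[] }

// Tables nobody reads are candidates; the most expensive ones come first
// since they offer the biggest savings. Field names are illustrative.
function deprecationCandidates(tables: TableMeta[]): TableMeta[] {
  return tables
    .filter((t) => t.downstream.length === 0)
    .sort((a, b) => b.monthlyCost - a.monthlyCost);
}

const tables: TableMeta[] = [
  { id: "tbl:orders", monthlyCost: 1200, downstream: ["dash:exec_finance"] },
  { id: "tbl:legacy_sessions", monthlyCost: 800, downstream: [] },
  { id: "tbl:tmp_export", monthlyCost: 50, downstream: [] },
];
console.log(deprecationCandidates(tables).map((t) => t.id));
// → [ 'tbl:legacy_sessions', 'tbl:tmp_export' ]
```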

tool
travel_to_iso

Reconstruct the entire metadata graph as it was at any point in time, plus a diff against today.

Input
{ "iso": "2026-01-15T00:00:00Z" }
Returns

Graph snapshot at that timestamp and the diff to current state.
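Time travel falls out naturally when the graph is event-sourced: replay every recorded event up to the target timestamp and fold the patches into a snapshot. A minimal sketch with a generic event shape (the real event types live in the OUTATIME engines):

```typescript
interface GraphEvent { at: string; entityId: string; patch: Record<string, unknown> }
type Snapshot = Record<string, Record<string, unknown>>;

// Rebuild the graph as of `iso`: apply, in log order, every event recorded
// at or before that instant. Deterministic: same log + same iso = same snapshot.
// ISO-8601 UTC strings compare correctly as plain strings.
function travelTo(log: GraphEvent[], iso: string): Snapshot {
  const snapshot: Snapshot = {};
  for (const ev of log) {
    if (ev.at > iso) continue;
    snapshot[ev.entityId] = { ...(snapshot[ev.entityId] ?? {}), ...ev.patch };
  }
  return snapshot;
}

const log: GraphEvent[] = [
  { at: "2025-12-01T00:00:00Z", entityId: "tbl:orders", patch: { tier: "Gold" } },
  { at: "2026-02-01T00:00:00Z", entityId: "tbl:orders", patch: { tier: "Bronze" } },
];
console.log(travelTo(log, "2026-01-15T00:00:00Z"));
// → { 'tbl:orders': { tier: 'Gold' } }
```

The diff against today is then just a comparison of two such snapshots: one at the requested timestamp, one at "now".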

tool
bisect_metric

Causal-weight root-cause analysis. Give it a broken metric plus the last-good and first-bad dates, and get back the single most likely cause.

Input
{ "metric": "finance.revenue_daily", "lastGoodIso": "2026-02-10", "firstBadIso": "2026-02-20", "affectedEntityIds": ["dash:exec_finance"] }
Returns

The cause event, contributing events, narrative, and recommendation.
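Conceptually, bisect narrows the field to events that happened inside the good-to-bad window and touch something upstream of the affected entities, then ranks what remains. A toy sketch of that filter-and-rank step; the causal weighting itself is OUTATIME's own, so the flat `weight` score here is a stand-in:

```typescript
interface ChangeEvent { at: string; entityId: string; weight: number; summary: string }

// Keep events in the (lastGood, firstBad] window that touch an upstream
// entity, then return the heaviest as the most likely cause.
// `weight` stands in for OUTATIME's causal-weight score.
function bisect(
  events: ChangeEvent[],
  lastGoodIso: string,
  firstBadIso: string,
  upstreamIds: Set<string>,
): ChangeEvent | undefined {
  return events
    .filter((e) => e.at > lastGoodIso && e.at <= firstBadIso && upstreamIds.has(e.entityId))
    .sort((a, b) => b.weight - a.weight)[0];
}

const cause = bisect(
  [
    { at: "2026-02-12", entityId: "tbl:orders", weight: 0.9, summary: "column retyped" },
    { at: "2026-02-15", entityId: "tbl:payments", weight: 0.3, summary: "owner changed" },
  ],
  "2026-02-10",
  "2026-02-20",
  new Set(["tbl:orders", "tbl:payments"]),
);
console.log(cause?.summary);
// → column retyped
```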

tool
simulate_change

Full Flux Capacitor blast radius. Drop, rename, retype, or deprecate, and see exactly what breaks.

Input
{ "kind": "drop_column", "tableId": "tbl:orders", "column": "customer_region" }
Returns

Affected entities, cost delta, SLA risk, migration plan, Slack drafts, and a PR diff.
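Stitched together, the server side of these nine cards is little more than a name-to-handler table. A sketch of the dispatch step with stub handlers (the tool names match the list above; everything inside the handlers is illustrative):

```typescript
type ToolHandler = (args: Record<string, unknown>) => unknown;

// One entry per tool; the MCP layer looks up the handler by name and
// feeds it the parsed `arguments` object. Handlers here are stubs.
const tools: Record<string, ToolHandler> = {
  describe_entity: (args) => ({ id: args.id, kind: "table" }),
  list_dq_failures: () => [],
  // ...seven more in the real server
};

function callTool(name: string, args: Record<string, unknown>): unknown {
  const handler = tools[name];
  if (!handler) throw new Error(`unknown tool: ${name}`);
  return handler(args);
}

console.log(callTool("describe_entity", { id: "tbl:orders" }));
```

Because every handler is a pure function over the metadata graph, the whole table inherits the read-only and deterministic guarantees for free.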

Setup

Three steps. No surprises.

  1. Build the bundle

    Open a terminal in the project folder. The build produces a single self-contained file at mcp-server/dist/index.js.

    cd mcp-server
    npm install
    npm run build
  2. Add to your client config

    Pick your client below. Paste the snippet. Replace the path with the absolute path to mcp-server/dist/index.js on your machine.

  3. Restart the client

    Quit and reopen the AI client. You should see outatime show up in the tools or connectors list with all nine tools listed.

File: claude_desktop_config.json
~/Library/Application Support/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "outatime": {
      "command": "node",
      "args": ["/absolute/path/to/outatime/mcp-server/dist/index.js"]
    }
  }
}

The path above is for macOS. On Windows the file lives at %APPDATA%\Claude\claude_desktop_config.json.

Try these

Eight prompts that prove the wiring works.

Once you have OUTATIME wired into your client, paste any of these into the chat. The AI picks the right tools and calls them for you.

Travel to 2026-01-15 and compare my data platform to today.

Simulate dropping orders.customer_region and tell me what breaks.

finance.revenue_daily was healthy on Feb 10 and broken on Feb 20. Bisect it.

List any failing DQ tests and who owns them.

Which tables hold PII?

Which tables cost the most and have no downstream dashboards?

Walk the lineage downstream from tbl:orders.

What does tbl:payments look like and who owns it?

How it talks to OpenMetadata

One JSON call goes in. One honest answer comes out.

01
Your AI client
Claude Desktop, Cursor, Claude Code, Windsurf

Speaks MCP over stdio. When you ask a question, it picks one of our nine tools and sends a JSON-RPC call.

02
OUTATIME MCP server
mcp-server/dist/index.js

Receives the tool call, runs the request through the same TypeScript engines that power our website, and returns JSON.

03
OUTATIME engines
lib/metadata, lib/simulator

Lineage walks, time-travel replay, bisect, simulate. Pure deterministic functions on the metadata graph.

04
OpenMetadata
live REST APIs (or seeded demo graph)

When live mode is on, the engines pull tables, dashboards, lineage, owners, DQ tests, and events directly from OpenMetadata.
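The glue between steps 01 and 02 is a line-delimited JSON-RPC loop over stdio: one request per line in, one response per line out. A stripped-down sketch of the server side, with no SDK and no error paths (dispatch stands in for the real tool routing; the response envelope follows JSON-RPC 2.0):

```typescript
// Minimal stdio JSON-RPC handling: parse one request line, dispatch it,
// serialize one response line. dispatch() stands in for real tool routing.
function handleLine(
  line: string,
  dispatch: (method: string, params: unknown) => unknown,
): string {
  const req = JSON.parse(line);
  const result = dispatch(req.method, req.params);
  return JSON.stringify({ jsonrpc: "2.0", id: req.id, result });
}

const reply = handleLine(
  '{"jsonrpc":"2.0","id":7,"method":"tools/call","params":{"name":"list_pii_columns","arguments":{}}}',
  (method, _params) => ({ echoed: method }),
);
console.log(reply);
// → {"jsonrpc":"2.0","id":7,"result":{"echoed":"tools/call"}}
```

A production server would wrap this in a read loop on stdin and return JSON-RPC error objects on failure; the request/response pairing by `id` is what lets the client match answers to questions.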

One more thing

Most teams use MCP. We are an MCP server.

Same engine that powers our four pillars, exposed as nine tools to any AI client you already trust.