This page documents the current compatibility story for Provenote and OpenAI Codex.
In plain language: if you use Codex as a coding agent, Provenote can plug into that workflow through MCP so Codex can work with notebook outcomes instead of only ephemeral chat context.
Provenote works with OpenAI Codex via MCP, and this repository now ships a public-ready Codex starter bundle.
That is the full extent of the public claim today:

- repo-owned prep exists: yes
- public-ready package available: yes
- publicly discoverable listing live: no (no official/public Codex directory listing is claimed here)
- official marketplace listing live: no

This page does not claim that Provenote is an official OpenAI integration, a listed Codex plugin, or a marketplace-ready Codex app.
The public repository and docs are discoverable on the web today. That still does not count as a Codex directory or marketplace listing.
Current fit through MCP:

- Provenote ships a first-party stdio MCP entrypoint, `provenote-mcp`.
- The MCP contract is exercised by a repo-owned test, `tests/test_mcp_server.py`.

If you want a checked-in install package instead of a prose-only setup page, use ../../examples/hosts/codex/provenote-outcome-bundle/README.md.
If you want the repo-owned plugin-directory submission materials, which stop one step short of claiming a live listing, use ../../examples/hosts/codex/PLUGIN_DIRECTORY_SUBMISSION.md.
That bundle is public-ready because it includes:

- a `config.toml.example` that matches Codex’s documented config surface
- the same `provenote-mcp` entrypoint this page documents
- wiring that sets `OPEN_NOTEBOOK_URL` and `OPEN_NOTEBOOK_PASSWORD` before the MCP process starts

If you want to verify this page instead of trusting the wording, use this local proof loop:
1. Confirm the `provenote-mcp` entrypoint exposes the expected outcome-tool families: `draft.*`, `research_thread.*`, and `auditable_run.*`.
2. Run the repo-owned MCP contract test: `bash tooling/scripts/runtime/run_uv_managed.sh run pytest tests/test_mcp_server.py -q`
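The tool-family claim above can be checked mechanically once you have a tool listing from the server. A minimal sketch, assuming flat dotted tool names; the concrete tool names in the example are illustrative, not taken from the repo:

```python
from fnmatch import fnmatch

# Outcome-tool families this page claims the server exposes.
FAMILIES = ["draft.*", "research_thread.*", "auditable_run.*"]

def classify(tool_names):
    """Group reported MCP tool names by claimed family.

    Unmatched names land under 'other' so surprises stay visible."""
    groups = {pattern: [] for pattern in FAMILIES}
    groups["other"] = []
    for name in tool_names:
        for pattern in FAMILIES:
            if fnmatch(name, pattern):
                groups[pattern].append(name)
                break
        else:
            groups["other"].append(name)
    return groups

# Hypothetical tool names, for illustration only.
groups = classify(["draft.create", "research_thread.list", "ping"])
```

Feed it the real output of the server's tool listing; anything in `other` is a name this page's claim does not cover.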
Then register `provenote-mcp` as an MCP server in Codex and start with one read-first step.
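For orientation only, a minimal registration sketch, assuming the `[mcp_servers.*]` TOML shape that current Codex CLI documentation describes (`command`, `env`); confirm the exact keys against the official docs linked at the end of this page, and treat the URL and password values as placeholders, not repo defaults:

```toml
# Hypothetical ~/.codex/config.toml fragment; the bundle's
# config.toml.example is the checked-in source of truth.
[mcp_servers.provenote]
command = "provenote-mcp"
env = { OPEN_NOTEBOOK_URL = "http://localhost:8502", OPEN_NOTEBOOK_PASSWORD = "change-me" }
```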
If you want a checked-in local host artifact instead of only this page, start with ../../examples/hosts/codex/provenote-outcome-bundle/README.md, continue with ../../examples/hosts/codex/PLUGIN_DIRECTORY_SUBMISSION.md, and then use ../../examples/hosts/README.md as the broader host-artifact index.
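If you would rather probe the stdio handshake than trust any host, note that MCP's stdio transport speaks JSON-RPC 2.0 and opens with an `initialize` request. A minimal sketch that only builds and serializes that message; no server is spawned, and the client name, version, and protocol-version string are illustrative assumptions:

```python
import json

def initialize_request(request_id=1):
    # First message an MCP client sends over stdio; the shape follows
    # the JSON-RPC 2.0 framing the MCP spec uses. Field values here
    # are illustrative, not copied from the repo.
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "handshake-probe", "version": "0.0.1"},
        },
    }

# One JSON object per line is how the stdio transport frames messages.
wire = json.dumps(initialize_request())
```

Piping `wire` (plus a newline) into a running `provenote-mcp` process and reading one line back is the cheapest end-to-end liveness check short of the contract test.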
| Surface | Why it matters |
|---|---|
| ../../pyproject.toml | Proves the repo actually ships provenote-mcp |
| ../../packages/core/mcp/server.py | Shows the concrete outcome-tool families Codex can call |
| ../../packages/core/mcp/schemas.py | Shows the typed request shapes behind those tools |
| ../../tests/test_mcp_server.py | Shows the repo keeps a real MCP contract test |
| ../mcp.md | Keeps this host page anchored to the broader MCP truth |
| ../proof.md | Maps the compatibility wording back to inspectable repo surfaces |

| Bucket | What this page can say |
|---|---|
| Safe now | Provenote works with OpenAI Codex via MCP; Codex can register MCP servers; Provenote ships a first-party stdio MCP entrypoint; a public-ready Codex starter bundle and plugin-directory submission pack are available in this repository |
| Not claimed | official OpenAI integration, listed Codex plugin, marketplace-ready Codex app, or OpenAI endorsement |
| Deferred / proof gap | public Skills distribution, OpenClaw support, a generic "works with every MCP host" claim, hosted/team/autopilot surfaces |
Use the latest official docs for the exact Codex MCP configuration syntax: