Meridian as a Grok connector — orbital skill routing inside Grok

Meridian Skills · custom MCP connector
https://mcp.ask-meridian.uk/mcp · OAuth 2.1 + PKCE · zero install

Grok shipped custom MCP connectors: paste an MCP server URL, complete an OAuth dance, and Grok auto-discovers whatever tools the server exposes. We just wired Meridian's orbital skill router into it.

One tool: route_task(task). Ask Grok something like "route a task to ranked skills: build a presence indicator over WebSocket with reconnection and per-room rate limiting". Grok calls Meridian, which asks Llama-3.3-70B to author 5 candidate skills, runs them through the orbital classifier, and returns ranked entries — each with a celestial class (planet/moon/trojan/asteroid/comet/irregular), a parent skill, a star-system affinity, and a one-line decision rule. Grok lifts the markdown bodies straight into its context window.
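On the wire this is a standard MCP tools/call (JSON-RPC 2.0). A request for the example task above would look roughly like the following; the id and framing come from the MCP client:

```javascript
// Illustrative JSON-RPC payload for the route_task tool.
const call = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "route_task",
    arguments: {
      task: "build a presence indicator over WebSocket with reconnection and per-room rate limiting",
    },
  },
};
```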

The two interesting bits: how the auth works (zero PAT pasting for end users), and why the URL even exists when the rest of Meridian is GitHub-only.

The auth flow — single button, no GitHub jargon

Most remote MCP setups in 2026 ask the user to "paste your fine-grained PAT with Models: read permission". That's fine for developers; for a Grok user trying to add a skill router to their chat, it's a wall.

The Meridian connector flips this. Grok hits our authorization endpoint with a standard OAuth 2.1 + PKCE handshake. The user sees:

┌──────────────────────────────────────────┐
│  🪐                                      │
│  Authorize grok to use Meridian          │
│                                          │
│  Meridian routes your task to the most   │
│  relevant AI skills. Click below and     │
│  grok will be able to call:              │
│   • route_task — ranks candidate skills  │
│                                          │
│  ┌────────────────────────────────────┐  │
│  │             Authorize              │  │
│  └────────────────────────────────────┘  │
└──────────────────────────────────────────┘

One click. No PAT field. No "go to github.com/settings/tokens and create a fine-grained one with the Models permission". The Cloudflare Worker holds a single GitHub Models PAT in a secret and uses it for every inference call — operator-pays, like Notion's MCP or Linear's MCP. The user's identity inside Grok is enough.

The mechanics:

Grok                Meridian Worker         GitHub Models
 │                       │                       │
 │── /authorize ────────▶│                       │
 │  ?code_challenge=…    │ (HTML "Authorize")    │
 │◀── 302 ?code=… ───────│                       │
 │                       │                       │
 │── /token ────────────▶│ verify PKCE,          │
 │  code_verifier=…      │ issue opaque token    │
 │◀── access_token ──────│ (KV, 1h TTL)          │
 │                       │                       │
 │── /mcp tools/call ───▶│                       │
 │  Bearer <token>       │── Authorization: ────▶│
 │                       │   Bearer <OUR PAT>    │
 │                       │◀── candidate skills ──│
 │                       │ orbital classify ~5ms │
 │◀── ranked skills ─────│                       │

The access token is a 32-byte opaque random string. It's just a marker in Workers KV that says "this caller completed the OAuth flow." The actual GitHub PAT lives in the Worker secret, never in the token, never in a JWT, never in transit beyond worker → models.github.ai.

For each call, the Worker:

  1. Looks up the bearer in KV (~1 ms)
  2. If found, uses MERIDIAN_GITHUB_TOKEN from the Worker secret to call models.github.ai/inference/chat/completions
  3. Runs the bundled orbital classifier on the LLM's 5 candidate skills (~5 ms, same JS as the browser miniapp)
  4. Returns ranked markdown
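Stitched together, the per-call path might look like the sketch below. The env bindings (TOKENS, MERIDIAN_GITHUB_TOKEN), the model id, the prompt, and classifyOrbits() are assumptions standing in for the real code:

```javascript
// Hedged sketch of the /mcp call path, not the actual Meridian Worker.
async function handleRouteTask(request, env, task) {
  // 1. KV lookup on the opaque bearer token
  const bearer = (request.headers.get("Authorization") ?? "").replace(/^Bearer /, "");
  if (!(await env.TOKENS.get(bearer))) {
    return new Response("unauthorized", { status: 401 });
  }

  // 2. Operator-pays inference call with the Worker-held PAT
  const llm = await fetch("https://models.github.ai/inference/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${env.MERIDIAN_GITHUB_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "meta/Llama-3.3-70B-Instruct", // assumed model id
      messages: [{ role: "user", content: `Author 5 candidate skills for: ${task}` }],
    }),
  });

  // 3-4. Classify the candidates and return ranked markdown
  const candidates = await llm.json();
  return Response.json(classifyOrbits(candidates));
}
```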

Why a hosted URL exists at all

Meridian's stdio MCP (npm install -g meridian-skills-mcp) is the primary product — works in Claude Code, Cursor, Windsurf, Goose, every stdio-speaking client. But Grok, ChatGPT custom MCPs, and Claude.ai connectors don't spawn processes. They want a URL.

We tried GitHub-only hosting, but GitHub doesn't host servers. Pages serves static files; Actions runs CI; Codespaces auto-stops. There's no GitHub product that receives a POST /mcp over HTTPS and forwards to GitHub Models on behalf of an authenticated caller. So the connector URL needs a runtime somewhere.

Cloudflare Workers became the smallest possible "somewhere": ~250 lines of pure-fetch JavaScript, no Dockerfile, no VM, no cold start, and the free tier covers thousands of users. The orbital classifier itself runs unchanged — same JS file as the npm package, same JS file as the browser miniapp. Only the transport adapter is CF-specific.
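Concretely, the "pure-fetch" shape is one router over three routes. The handler bodies below are one-line stand-ins, not the actual Meridian code; the real handlers are most of the ~250 lines:

```javascript
// Illustrative stand-ins for the three route handlers.
const handleAuthorize = () =>
  new Response("consent page", { headers: { "Content-Type": "text/html" } });
const handleToken = () => Response.json({ access_token: "example" });
const handleMcp = () => Response.json({ jsonrpc: "2.0" });

const worker = {
  async fetch(request, env) {
    const { pathname } = new URL(request.url);
    switch (pathname) {
      case "/authorize": return handleAuthorize(request, env); // HTML "Authorize" page
      case "/token":     return handleToken(request, env);     // PKCE check, mint token
      case "/mcp":       return handleMcp(request, env);       // MCP over HTTP
      default:           return new Response("not found", { status: 404 });
    }
  },
};
// export default worker;  // Workers entry point
```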

Updates ship via GitHub: every push to main that touches cf-worker/** or mcp/_lib/** triggers a workflow (.github/workflows/deploy-worker.yml) that runs wrangler deploy. So even though the runtime lives on Cloudflare, the operational surface is just GitHub — code, image, deploy events, observability all flow through the repo.

How to add it to Grok yourself

In Grok's "Add custom connector" dialog, paste:

Field                    Value
Server URL               https://mcp.ask-meridian.uk/mcp
Authorization endpoint   https://mcp.ask-meridian.uk/authorize
Token endpoint           https://mcp.ask-meridian.uk/token
Client ID                grok
Client secret            (empty)
Token auth method        none (PKCE only)
Scopes                   route_task

Grok stores the connector and shows an Authorize button. Click it, approve on the single-button page, and you're back in Grok with route_task in the tool list.

Same URL works in ChatGPT custom MCPs and Claude.ai connectors — the OAuth dance is standard, the MCP transport is standard. One Worker, three host integrations.

What to ask

The router shines when the question is concrete enough that "skills" makes sense as a unit: a build task like the presence-indicator prompt above, rather than an open-ended chat question.

You'll get back 2–5 ranked entries with full markdown bodies, decision rules, and class assignments — usually a planet (the anchor skill), one or two moons/trojans (parent-bound companions), and occasionally an irregular if the task spans multiple star systems. Lift the bodies into context, follow the Workflow sections.

Source

Worker code: github.com/LuuOW/meridian-mcp/tree/main/cf-worker. Shared classifier core: mcp/_lib. The npm package (meridian-skills-mcp) ships both stdio and Node-HTTP variants, so you can self-host if you'd rather control the credential.

Want the same plumbing for your own MCP?

The full guide ships every line: stdio + Streamable HTTP transports, OAuth 2.1 + PKCE on Cloudflare Workers, KV-backed auth codes, GitHub Actions auto-deploy, and the connector branding hooks (logo_uri, _meta.iconUrl, favicon endpoints). $29 on Gumroad.

Build Your Own MCP Server — $29 →