Windsurf (by Codeium) is an agentic IDE with a powerful 'Cascade' agent that shines on multi-step tasks. Configure MCPs in `~/.codeium/windsurf/mcp_config.json` — same JSON shape as Claude Code. Pairs especially well with GitHub, Supabase, and Linear MCPs for full-feature development flows.
Codeium's agentic IDE — a deeply AI-integrated editor with Cascade agent and MCP support
`~/.codeium/windsurf/mcp_config.json`:

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_..."
      }
    },
    "supabase": {
      "command": "npx",
      "args": ["-y", "@supabase/mcp-server-supabase@latest"],
      "env": {
        "SUPABASE_URL": "https://your-project.supabase.co",
        "SUPABASE_SERVICE_ROLE_KEY": "sbp_..."
      }
    }
  }
}
```

Real secrets should never be committed — use environment variables or your editor's secrets manager.
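One way to follow that advice is to generate the config rather than hand-edit it, pulling the token from the shell environment at write time. A minimal Python sketch — the `build_config` and `write_config` helper names are hypothetical, and only the GitHub entry from the example above is included:

```python
import json
import os
from pathlib import Path


def build_config(token: str) -> dict:
    # Mirrors the example mcp_config.json above, with the token injected
    # instead of hardcoded in a committed file.
    return {
        "mcpServers": {
            "github": {
                "command": "npx",
                "args": ["-y", "@modelcontextprotocol/server-github"],
                "env": {"GITHUB_PERSONAL_ACCESS_TOKEN": token},
            }
        }
    }


def write_config(path: Path) -> None:
    # Read the secret from the environment; an empty string if unset.
    token = os.environ.get("GITHUB_PERSONAL_ACCESS_TOKEN", "")
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(build_config(token), indent=2))


# Usage:
# write_config(Path.home() / ".codeium/windsurf/mcp_config.json")
```

Run it from a shell where `GITHUB_PERSONAL_ACCESS_TOKEN` is exported, and the generated file never needs to live in version control.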
- **GitHub**: Cascade's agentic mode pairs beautifully with GitHub MCP — full 'plan → branch → code → PR' flow in one session.
- **Supabase**: Windsurf's long-context Cascade keeps schema context loaded, so Supabase queries chain cleanly across many steps.
- **Postgres**: Same reasoning as Supabase — Cascade's memory of past queries makes Postgres MCP especially powerful here.
- **Stripe**: SaaS builders love Windsurf + Stripe MCP for implementing and testing billing flows end-to-end.
- **Linear**: Cascade can break a Linear ticket into subtasks, implement each, open a PR, and close the ticket.
- **Notion**: Spec-driven development — point Cascade at a Notion spec and it implements iteratively with context.
- **Firecrawl**: Scrape docs and API references into Cascade's context. Pairs well with Windsurf's long-context window.
- **Perplexity**: Research during implementation — 'how does library X handle Y?' gets a cited answer without leaving Cascade.
- **Vercel**: Deploy and debug Vercel projects inline. Cascade can follow a deploy from commit to live URL.
- E2E test writing and execution. Cascade's step-by-step agent mode is ideal for iterative test writing.
GitHub MCP is first for most users. After that, pick based on workflow: Supabase/Postgres for data-heavy apps, Linear + Notion for spec-driven teams, Firecrawl + Perplexity for research-heavy work.
The config file lives at `~/.codeium/windsurf/mcp_config.json`. Edit it directly, or go to Settings → Cascade → MCP Servers → 'Add Custom MCP Server' for a guided form.
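If you edit the file by hand, a quick sanity check before restarting the editor can save a round trip. A small sketch — the `check_config` helper is hypothetical and validates only the structure shown in the example config above:

```python
import json
from pathlib import Path


def check_config(path: Path) -> list:
    """Return a list of problems; an empty list means the config looks sane."""
    try:
        data = json.loads(path.read_text())
    except (OSError, json.JSONDecodeError) as exc:
        return [f"unreadable or invalid JSON: {exc}"]
    servers = data.get("mcpServers")
    if not isinstance(servers, dict):
        return ["missing top-level 'mcpServers' object"]
    problems = []
    for name, spec in servers.items():
        # Each server entry needs at least a launch command.
        if not isinstance(spec, dict) or "command" not in spec:
            problems.append(f"server '{name}' has no 'command'")
    return problems


# Usage:
# print(check_config(Path.home() / ".codeium/windsurf/mcp_config.json"))
```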
Per-project MCP configuration isn't officially supported yet — MCP config is global per user. Workaround: use a wrapper script that reads a project-specific env var, then configure the MCP server with that env in the global config.
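A sketch of that wrapper in Python. Point the global config's `command` at this script instead of `npx` directly; the `MCP_PROJECT_ENV` variable name and the dotenv-style file format are assumptions for illustration, not Windsurf conventions:

```python
#!/usr/bin/env python3
"""Launch an MCP server with project-specific environment merged in."""
import os


def load_env_file(path: str) -> dict:
    """Parse a minimal KEY=VALUE file, skipping blanks and # comments."""
    env = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                env[key.strip()] = value.strip()
    return env


def run_server(argv: list) -> None:
    merged = dict(os.environ)
    # MCP_PROJECT_ENV (assumed name) points at the current project's env file.
    env_file = merged.get("MCP_PROJECT_ENV")
    if env_file and os.path.exists(env_file):
        merged.update(load_env_file(env_file))
    # Replace this process with the real server, carrying the merged env.
    os.execvpe(argv[0], argv, merged)


# Usage (referenced from the global mcp_config.json):
# run_server(["npx", "-y", "@modelcontextprotocol/server-github"])
```

Each project then sets `MCP_PROJECT_ENV` (for example in its shell profile or direnv setup) and the single global entry picks up per-project credentials.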
Yes, Cascade can chain MCP tools across a multi-step workflow — that's one of its main strengths. It can plan a 10-step workflow, execute MCP tools at each step, inspect outputs, and replan. Great for non-trivial refactors or migrations.
Yes. Windsurf defaults to Codeium's hosted models but supports bring-your-own Claude/GPT keys. MCP servers work the same regardless of model choice.