---
title: "MCP Setup in Antigravity IDE - Complete Configuration Guide"
description: "Configure Model Context Protocol servers in Antigravity IDE. Advanced context management, multi-model support, and project-wide refactoring with MCP integration."
slug: "antigravity"
category: "ide"
updatedAt: "2026-03-08T00:00:00.000Z"
faqs:
  - q: "What makes Antigravity different from other AI IDEs?"
    a: "Antigravity features advanced context awareness, MCP server orchestration, multi-model support, and intelligent project-wide refactoring capabilities."
  - q: "Can I use multiple MCP servers simultaneously in Antigravity?"
    a: "Yes, Antigravity has built-in MCP orchestration that allows you to run and coordinate multiple MCP servers with intelligent context switching."
---
MCP Setup in Antigravity IDE
Overview
Antigravity is a next-generation AI IDE with advanced context management and native MCP support. It provides intelligent MCP server orchestration, multi-model AI integration, and powerful project-wide refactoring capabilities.
Requirements
- Antigravity IDE (version 1.5+)
- Node.js 18+ or Python 3.10+
- Active Antigravity account
- Administrator access for global MCP configurations
Installation
Step 1: Download Antigravity IDE
- Visit antigravity.dev
- Download the installer for your platform (Mac, Windows, Linux)
- Run the installer and complete setup
- Sign in or create an Antigravity account
Step 2: Install MCP Servers
Antigravity supports both global and project-level MCP installations:
# Global installation (recommended)
npm install -g @modelcontextprotocol/server-filesystem
npm install -g @modelcontextprotocol/server-github
npm install -g @modelcontextprotocol/server-postgres
# Project-level installation
npm install @modelcontextprotocol/server-memory
Configuration
MCP Configuration File
Antigravity uses antigravity.config.json in your project root or global config:
Locations:
- Mac/Linux: ~/.config/antigravity/mcp_config.json
- Windows: %APPDATA%\Antigravity\mcp_config.json
- Project: <project-root>/antigravity.config.json
{
"mcpServers": {
"filesystem": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-filesystem",
"/Users/username/projects"
],
"priority": "high",
"autoStart": true
},
"github": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-github"],
"env": {
"GITHUB_PERSONAL_ACCESS_TOKEN": "${env:GITHUB_TOKEN}"
},
"priority": "medium",
"autoStart": true
},
"postgres": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-postgres"],
"env": {
"POSTGRES_CONNECTION_STRING": "${env:DATABASE_URL}"
},
"priority": "low",
"autoStart": false
}
},
"orchestration": {
"enabled": true,
"maxConcurrent": 5,
"contextSharing": true,
"intelligentRouting": true
},
"ai": {
"models": ["claude-3-5-sonnet", "gpt-4-turbo", "gemini-pro"],
"defaultModel": "claude-3-5-sonnet",
"fallbackModel": "gpt-4-turbo"
}
}
Key Features
1. MCP Server Orchestration
Antigravity intelligently manages multiple MCP servers:
- Priority-based execution - High-priority servers get more resources
- Intelligent routing - Queries automatically routed to appropriate servers
- Context sharing - MCP servers can share context for better responses
- Auto-scaling - Resource allocation based on demand
2. Advanced Context Management
{
"contextManagement": {
"maxTokens": 200000,
"smartPruning": true,
"semanticChunking": true,
"crossFileContext": true,
"gitHistoryDepth": 100
}
}
3. Multi-Model Support
Switch between AI models seamlessly:
- Claude 3.5 Sonnet - Best for complex reasoning
- GPT-4 Turbo - Fast and versatile
- Gemini Pro - Strong at code generation
- Local models - Support for Ollama and local LLMs
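As an illustrative sketch of local-model configuration: the `ollama:llama3` identifier and the `providers` block below are assumptions for illustration, not documented Antigravity settings, so check your version's reference before relying on them.

```json
{
  "ai": {
    "models": ["claude-3-5-sonnet", "ollama:llama3"],
    "defaultModel": "claude-3-5-sonnet",
    "providers": {
      "ollama": { "baseUrl": "http://localhost:11434" }
    }
  }
}
```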
Usage
Command Palette
Access MCP features via Command Palette (Cmd+Shift+P / Ctrl+Shift+P):
- Antigravity: Show MCP Servers - View all active servers
- Antigravity: Start MCP Server - Start a specific server
- Antigravity: Stop All MCP Servers - Stop all running servers
- Antigravity: Reload MCP Configuration - Reload config without restart
- Antigravity: MCP Server Logs - View server output logs
MCP Panel
Access the MCP panel in the sidebar:
- Click the MCP icon in the left sidebar
- View active servers, their status, and resource usage
- Enable/disable servers with one click
- View real-time logs and performance metrics
AI Chat with MCP Context
Use natural language with full MCP context:
"Using the filesystem MCP, find all TypeScript files that import React"
"Query the Postgres database for users created in the last 7 days"
"Check GitHub for open pull requests mentioning 'authentication'"
"Refactor this component to use a Redux store that mirrors the database schema"
Advanced Configuration
Custom MCP Server
Create a project-specific MCP server:
// mcp-server.js (run as an ES module, e.g. "type": "module" in package.json)
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { ListToolsRequestSchema } from "@modelcontextprotocol/sdk/types.js";

const server = new Server({
name: "my-custom-server",
version: "1.0.0"
}, {
capabilities: {
tools: {},
resources: {}
}
});

// Advertise the tools this server provides.
// The SDK dispatches on request schemas, not raw method strings.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
tools: [
{
name: "analyze_code_quality",
description: "Analyzes code quality metrics",
inputSchema: {
type: "object",
properties: {
filePath: { type: "string" }
}
}
}
]
}));

const transport = new StdioServerTransport();
await server.connect(transport);
Add to antigravity.config.json:
{
"mcpServers": {
"custom": {
"command": "node",
"args": ["./mcp-server.js"],
"cwd": "${workspaceFolder}",
"priority": "high",
"autoStart": true
}
}
}
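The server above only advertises its tool; it also needs a handler for actual calls. The sketch below is a hypothetical body for `analyze_code_quality` with a stand-in metric (line count plus a "long line" ratio) - both the metric and the wiring comment are illustrative assumptions, not a documented implementation.

```javascript
// Hypothetical implementation of the analyze_code_quality tool body.
// In the server above it would be wired up roughly as:
//   import { CallToolRequestSchema } from "@modelcontextprotocol/sdk/types.js";
//   server.setRequestHandler(CallToolRequestSchema, async (req) =>
//     analyzeSource(readFileSync(req.params.arguments.filePath, "utf8")));
function analyzeSource(source) {
  const lines = source.split("\n");
  const longLines = lines.filter((line) => line.length > 100).length;
  const report = {
    totalLines: lines.length,
    longLines,
    longLineRatio: lines.length > 0 ? longLines / lines.length : 0,
  };
  // MCP tools return an array of content blocks; "text" is the simplest kind.
  return { content: [{ type: "text", text: JSON.stringify(report) }] };
}
```

Returning structured data as a JSON string inside a text block keeps the tool compatible with any MCP client, since text content is universally supported.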
Environment Variables
Store sensitive data in .env:
GITHUB_TOKEN=ghp_xxxxxxxxxxxxx
DATABASE_URL=postgresql://user:pass@localhost:5432/db
OPENAI_API_KEY=sk-xxxxxxxxxxxxx
ANTHROPIC_API_KEY=sk-ant-xxxxxxxxxxxxx
Reference in config:
{
"env": {
"GITHUB_PERSONAL_ACCESS_TOKEN": "${env:GITHUB_TOKEN}",
"POSTGRES_CONNECTION_STRING": "${env:DATABASE_URL}"
}
}
Performance Optimization
Resource Management
{
"performance": {
"cacheEnabled": true,
"cacheSize": "500MB",
"maxMemoryPerServer": "256MB",
"serverTimeout": 30000,
"retryAttempts": 3
}
}
Intelligent Caching
Antigravity caches MCP responses for:
- Repeated file reads
- Database query results
- GitHub API responses
- Custom tool invocations
Troubleshooting
MCP Server Not Starting
- Check the MCP panel for error messages
- View server logs: Antigravity: MCP Server Logs
- Verify your Node.js/Python installation
- Check the config file syntax with a JSON validator
- Ensure file paths are absolute
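The syntax check above can be scripted. This small Node helper (a sketch; the commented file path assumes the project-level config location from earlier) reports the first JSON syntax error, or null if the text parses cleanly:

```javascript
// Returns null if the text is valid JSON, otherwise the parser's error message.
function validateJson(text) {
  try {
    JSON.parse(text);
    return null;
  } catch (err) {
    return err.message;
  }
}

// Usage (reads the project-level config; adjust the path as needed):
// const text = require("node:fs").readFileSync("antigravity.config.json", "utf8");
// console.log(validateJson(text) ?? "Config syntax OK");
```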
High Memory Usage
- Reduce maxConcurrent in orchestration settings
- Lower the maxMemoryPerServer limit
- Disable unused MCP servers
- Enable smart pruning for context management
Connection Timeouts
- Increase serverTimeout in performance settings
- Check network connectivity for remote servers
- Verify firewall rules
- Use local MCP servers when possible
Best Practices
1. Organize MCP Servers by Priority
- High priority: Frequently used (filesystem, git)
- Medium priority: Occasional use (GitHub, databases)
- Low priority: Specialized tools (analytics, deployment)
2. Use Project-Specific Configurations
Create antigravity.config.json in each project for tailored MCP setups.
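A minimal project-level config might look like the sketch below, reusing the keys from the global example above and the project-installed server-memory package (the priority and autoStart values are illustrative choices, not requirements):

```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"],
      "priority": "medium",
      "autoStart": true
    }
  }
}
```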
3. Enable Context Sharing
Allow MCP servers to share context for better AI responses:
{
"orchestration": {
"contextSharing": true
}
}
4. Monitor Resource Usage
Check the MCP panel regularly to identify resource-intensive servers.
Keyboard Shortcuts
| Action | Mac | Windows/Linux |
|--------|-----|---------------|
| Open MCP Panel | Cmd+Shift+M | Ctrl+Shift+M |
| Quick MCP Command | Cmd+K, M | Ctrl+K, M |
| Toggle MCP Server | Cmd+Shift+T | Ctrl+Shift+T |
| View MCP Logs | Cmd+Shift+L | Ctrl+Shift+L |
Related Guides
- How to Install and Use MCPs in Cursor IDE
- MCPs in VSCode – Complete Setup Guide
- Running MCPs in Windsurf IDE
- Debugging & Testing MCPs with MCP Inspector
FAQ
What makes Antigravity different from other AI IDEs?
Antigravity provides advanced context awareness, intelligent MCP server orchestration, multi-model AI support, and powerful project-wide refactoring. Its MCP orchestration engine can manage multiple servers simultaneously with intelligent routing and context sharing.
Can I use multiple MCP servers simultaneously in Antigravity?
Yes! Antigravity's orchestration engine is designed to run multiple MCP servers concurrently. Configure priority levels and let Antigravity intelligently route requests to the appropriate servers based on context and capability.
Does Antigravity work with self-hosted AI models?
Yes, Antigravity supports local models through Ollama and other local LLM providers. Configure them in the ai.models section of your config file.
How do I share MCP configurations across my team?
Commit antigravity.config.json to your repository (without sensitive env vars) and use .env for secrets. Team members can clone and use the same MCP setup instantly.
Last updated: March 8, 2026