This document describes the Model Context Protocol (MCP) server implementation in the Prompt Optimizer system. The MCP server exposes prompt optimization capabilities as standardized tools that can be consumed by MCP-compatible AI applications such as Claude Desktop. It runs as a standalone Node.js HTTP service, either independently or integrated with the Docker deployment alongside the web application.
For general deployment instructions, see Deployment and Configuration. For Docker-specific setup, see Docker Deployment. For environment variable configuration, see Environment Configuration and API Keys.
MCP Server Deployment Architecture
Sources: packages/mcp-server/package.json:1-53, README.md:182-246, docker-compose.yml:1-44
The MCP server operates as an independent service that bridges external MCP clients with the core prompt optimization engine. It accepts MCP protocol requests over HTTP with Server-Sent Events (SSE) transport, processes them using core services, and returns standardized MCP responses.
The MCP server is implemented as the @prompt-optimizer/mcp-server package within the monorepo. Key characteristics:
| Attribute | Value |
|---|---|
| Package name | @prompt-optimizer/mcp-server |
| Main entry | dist/index.js |
| Binary entry | dist/start.cjs |
| Primary dependency | @prompt-optimizer/core (workspace) |
| MCP SDK | @modelcontextprotocol/sdk v1.24.3 |
| HTTP server | express v4.21.2 |
| Minimum Node.js | 18.0.0 |
Sources: packages/mcp-server/package.json:1-53
Key Scripts:
- `pnpm mcp:build` - Build the MCP server
- `pnpm mcp:dev` - Run in development mode with hot reload
- `pnpm mcp:start` - Run in production mode
- `pnpm mcp:test` - Execute unit tests

Sources: package.json:50-53
Server Startup Sequence
Sources: packages/mcp-server/package.json:12-14, env.local.example:72-91
The server initialization follows this pattern:
1. The `preload-env.js` script loads environment variables from `.env.local` before application code runs
2. Core services (`ModelManager`, `PromptService`, `LLMService`, `TemplateManager`) are instantiated
3. The `setupDefaultModel()` function selects an enabled model based on `MCP_DEFAULT_MODEL_PROVIDER`
4. The HTTP server begins listening on the `/mcp` endpoint at `MCP_HTTP_PORT` (default 3000)

Sources: packages/mcp-server/src/config/models.ts:1-69, env.local.example:75-80
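The startup order above can be sketched as follows. The function names `preloadEnv`, `createCoreServices`, and `listen` are illustrative stand-ins; only `setupDefaultModel()` and the environment variables appear in the source, and the real services live in `@prompt-optimizer/core`.

```typescript
// Sketch of the startup order; the service bodies are stubs that only record
// the sequence in which each phase runs.
const startupLog: string[] = [];

function preloadEnv(): void {
  // In the real server, preload-env.js loads .env.local before anything else.
  startupLog.push("env");
}

function createCoreServices(): void {
  // ModelManager, PromptService, LLMService, TemplateManager are created here.
  startupLog.push("services");
}

function setupDefaultModel(): void {
  // Selects an enabled model, honoring MCP_DEFAULT_MODEL_PROVIDER if set.
  startupLog.push("model");
}

function listen(port: number): void {
  // Express begins serving the /mcp endpoint.
  startupLog.push(`listen:${port}`);
}

preloadEnv();
createCoreServices();
setupDefaultModel();
listen(Number(process.env.MCP_HTTP_PORT ?? 3000));

console.log(startupLog.join(" -> "));
```

The key property is ordering: environment loading must complete before any core service reads configuration.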
The MCP server reuses the core package's model management system. The setupDefaultModel() function implements a priority-based model selection:
Model Selection Algorithm
Sources: packages/mcp-server/src/config/models.ts:12-66
The function prioritizes:
1. If `MCP_DEFAULT_MODEL_PROVIDER` is set, it attempts to match the value against:
   - an exact model key (e.g. `"custom_qwen3"`)
   - `providerMeta.id`
   - entries in `defaultModels`
   - the `custom_<suffix>` pattern
2. Otherwise, it falls back to the first enabled model.

The selected model is registered in `ModelManager` with the key `"mcp-default"`.
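The matching priority can be sketched as a small pure function. The `ModelOption` shape and the `selectDefaultModel` name are illustrative assumptions; only the match targets (exact key, `providerMeta.id`, first-enabled fallback) come from the source.

```typescript
// Illustrative sketch of the priority-based selection in setupDefaultModel().
interface ModelOption {
  key: string;        // registry key, e.g. "openai" or "custom_qwen3"
  providerId: string; // providerMeta.id
  enabled: boolean;
}

function selectDefaultModel(
  models: ModelOption[],
  preferred?: string
): ModelOption | undefined {
  const enabled = models.filter((m) => m.enabled);
  if (preferred) {
    const match =
      enabled.find((m) => m.key === preferred) ??      // exact key, covers custom_<suffix>
      enabled.find((m) => m.providerId === preferred); // providerMeta.id
    if (match) return match;
  }
  // No preference, or no match: fall back to the first enabled model,
  // which is then registered under the "mcp-default" key.
  return enabled[0];
}
```

For example, with `MCP_DEFAULT_MODEL_PROVIDER=custom_qwen3` the exact-key branch wins even when other providers are also enabled.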
Sources: packages/mcp-server/src/config/models.ts:28-55, packages/core/src/services/model/defaults.ts:99-115
The MCP server exposes three tools that wrap core prompt optimization functionality:
| Tool Name | Purpose | Input Schema | Core Service |
|---|---|---|---|
| `optimize-user-prompt` | Optimize user-facing prompts | `original_prompt`, `task_description?`, `language?` | `PromptService.optimizePrompt()` |
| `optimize-system-prompt` | Optimize system-level prompts | `original_prompt`, `task_description?`, `language?` | `PromptService.optimizePrompt()` |
| `iterate-prompt` | Iteratively refine existing prompts | `current_prompt`, `iteration_direction`, `language?` | `PromptService.iteratePrompt()` |
Sources: docs/user/mcp-server.md:5-14, README.md:240-246
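The input schemas in the table imply a small validation step before a tool runs. The following check for `optimize-user-prompt` is illustrative; the real server declares its schemas through the MCP SDK rather than hand-rolled code like this.

```typescript
// Illustrative input check for the optimize-user-prompt tool, mirroring its
// schema: original_prompt (required), task_description? and language? (optional).
interface OptimizeUserPromptInput {
  original_prompt: string;
  task_description?: string;
  language?: "zh" | "en";
}

function parseOptimizeInput(raw: unknown): OptimizeUserPromptInput {
  const obj = raw as Record<string, unknown> | null;
  if (typeof obj?.original_prompt !== "string" || obj.original_prompt.length === 0) {
    throw new Error("original_prompt is required");
  }
  return {
    original_prompt: obj.original_prompt,
    task_description:
      typeof obj.task_description === "string" ? obj.task_description : undefined,
    // When language is omitted, the server falls back to MCP_DEFAULT_LANGUAGE
    // (zh by default).
    language: obj.language === "en" ? "en" : "zh",
  };
}
```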
MCP Tool Execution Sequence
Sources: packages/mcp-server/src/config/models.ts:1-69
Each tool handler:
Each tool handler:
- resolves the response language (falling back to `MCP_DEFAULT_LANGUAGE`)
- delegates to `PromptService` for the core optimization logic

The MCP server uses Server-Sent Events (SSE) over HTTP for bidirectional streaming communication:
HTTP/SSE Transport Flow
Sources: packages/mcp-server/package.json:14, docker-compose.yml:10-17
HTTP Endpoint Configuration:
- Direct access: `http://localhost:3000/mcp`
- Docker: `http://localhost:8081/mcp` (Nginx proxy)
- Transport: Streamable HTTP (SSE-based)

The Express server configuration:
- exposes a single `/mcp` endpoint
- responds with `text/event-stream` for streaming results

Sources: docker-compose.yml:11-13, docs/user/mcp-server.md:28-32
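On a `text/event-stream` response, each JSON-RPC message is framed with a `data:` line and a blank-line terminator. The helper below sketches that wire format; the MCP SDK performs this serialization internally, so `toSseFrame` is purely illustrative.

```typescript
// How a JSON-RPC message is framed on a text/event-stream response:
// a "data:" prefix, the JSON payload, then a blank line ending the event.
function toSseFrame(message: object): string {
  return `data: ${JSON.stringify(message)}\n\n`;
}

const frame = toSseFrame({ jsonrpc: "2.0", id: 1, result: { ok: true } });
console.log(JSON.stringify(frame));
```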
In Docker deployment, Nginx proxies MCP traffic to the internal Node.js server:
Browser/Client → Nginx:80 → /mcp path → Node.js:3000 → MCP Server
↓
/ path → Web static files
The Nginx configuration includes:
- The `/mcp` location forwards to `http://localhost:3000/mcp`
- `auth_basic off;` for the `/mcp` route (the MCP protocol doesn't support Basic Auth)

Sources: docker-compose.yml:10-17, docs/user/mcp-server.md:169-183
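A sketch of what such a proxy rule could look like in Nginx. Only the `proxy_pass` target and `auth_basic off;` come from the source; the remaining directives are common settings for keeping an SSE stream open and are assumptions, not the project's actual config.

```nginx
location /mcp {
    auth_basic off;                       # MCP protocol doesn't support Basic Auth
    proxy_pass http://localhost:3000/mcp;
    proxy_http_version 1.1;
    proxy_set_header Connection "";       # keep the upstream connection open
    proxy_buffering off;                  # flush SSE events immediately
}
```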
The MCP server requires API keys for at least one LLM provider:
| Variable Pattern | Description | Example |
|---|---|---|
| `VITE_OPENAI_API_KEY` | OpenAI API key | `sk-proj-...` |
| `VITE_GEMINI_API_KEY` | Google Gemini API key | `AIza...` |
| `VITE_DEEPSEEK_API_KEY` | DeepSeek API key | `sk-...` |
| `VITE_ZHIPU_API_KEY` | Zhipu AI API key | ... |
| `VITE_CUSTOM_API_KEY_<suffix>` | Multi-custom model key | `ollama-key` |
| `VITE_CUSTOM_API_BASE_URL_<suffix>` | Custom API endpoint | `http://localhost:11434/v1` |
| `VITE_CUSTOM_API_MODEL_<suffix>` | Custom model ID | `qwen2.5:7b` |
Sources: env.local.example:4-76
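Putting the patterns together, a `.env.local` fragment for one standard provider plus a custom (Ollama-style) model might look like this. The suffix `qwen3` and all values are illustrative placeholders taken from the examples above, not required settings.

```
# One standard provider
VITE_OPENAI_API_KEY=sk-proj-...

# One custom model with suffix "qwen3" (Ollama-style endpoint)
VITE_CUSTOM_API_KEY_qwen3=ollama-key
VITE_CUSTOM_API_BASE_URL_qwen3=http://localhost:11434/v1
VITE_CUSTOM_API_MODEL_qwen3=qwen2.5:7b

# Prefer the custom model when several are configured
MCP_DEFAULT_MODEL_PROVIDER=custom_qwen3
```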
| Variable | Default | Options | Purpose |
|---|---|---|---|
| `MCP_DEFAULT_MODEL_PROVIDER` | (first enabled) | `openai`, `gemini`, `anthropic`, `deepseek`, `siliconflow`, `zhipu`, `dashscope`, `openrouter`, `modelscope`, `custom`, `custom_<suffix>` | Select preferred model when multiple are configured |
| `MCP_HTTP_PORT` | 3000 | Any valid port | HTTP server port (Docker ignores this) |
| `MCP_LOG_LEVEL` | `debug` | `debug`, `info`, `warn`, `error` | Logging verbosity |
| `MCP_DEFAULT_LANGUAGE` | `zh` | `zh`, `en` | Default language for prompts |
Sources: env.local.example:75-90, docker-compose.yml:32-35
Docker Deployment Notes:
- `MCP_HTTP_PORT` is fixed at 3000 internally
- External access goes through `NGINX_PORT` (default 80)
- The `/mcp` path is automatically routed to the MCP server

Sources: docker-compose.yml:10-17, docker-compose.yml:32
Claude Desktop requires a services.json configuration file to connect to the MCP server:
Configuration File Locations:
- Windows: `%APPDATA%\Claude\services`
- macOS: `~/Library/Application Support/Claude/services`
- Linux: `~/.config/Claude/services`

Example `services.json`:
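The example itself is not reproduced on this page. A minimal illustrative entry, assuming the common Claude Desktop shape where a local command bridges to an HTTP MCP endpoint via the `mcp-remote` package, might look like the following; the server name and URL are assumptions, so consult README.md:219-238 for the project's actual example.

```json
{
  "mcpServers": {
    "prompt-optimizer": {
      "command": "npx",
      "args": ["mcp-remote", "http://localhost:3000/mcp"]
    }
  }
}
```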
Sources: docs/user/mcp-server.md:93-116, README.md:219-238
Claude Desktop Connection Flow
Sources: docs/user/mcp-server.md:93-116
The connection process:
- Claude Desktop reads `services.json` on startup and connects to the configured server

The official MCP Inspector tool can test the server:
Inspector Configuration:
- Open the Inspector UI (default `http://localhost:5173`)
- Transport type: Streamable HTTP
- Server URL: `http://localhost:3000/mcp` (or `http://localhost:8081/mcp` for Docker)

Sources: docs/user/mcp-server.md:119-135
| Issue | Cause | Solution |
|---|---|---|
| Port occupied | Another service on port 3000 | Change MCP_HTTP_PORT or stop conflicting service |
| No enabled models | Missing or invalid API keys | Configure at least one valid API key |
| Provider mismatch | Wrong MCP_DEFAULT_MODEL_PROVIDER | Verify provider name matches configured keys |
| 401 Authentication error (Docker) | Nginx Basic Auth on /mcp | Fixed in v1.4.0+; older versions must disable ACCESS_PASSWORD |
Sources: docs/user/mcp-server.md:137-193, docs/user/mcp-server_en.md:149-210
Development mode features:
- hot reload via `pnpm mcp:dev`

Sources: docs/user/mcp-server.md:34-53, packages/mcp-server/package.json:13
Production deployment characteristics:
- MCP traffic is served behind Nginx at the `/mcp` path

Sources: docker-compose.yml:1-44, README.md:112-127, docs/user/mcp-server.md:13-32
The docker-compose.yml defines the integrated deployment:
Sources: docker-compose.yml:1-44
The health check validates both:
- the web application at `http://localhost:80/`
- the MCP endpoint at `http://localhost:80/mcp`

This ensures the entire stack is operational before marking the container as healthy.
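A docker-compose healthcheck covering both endpoints could be sketched as below. The exact command, tool (`curl`), and timings are assumptions; the real `docker-compose.yml` may differ.

```yaml
healthcheck:
  # Probe both the web root and the MCP endpoint inside the container.
  test: ["CMD-SHELL", "curl -fsS http://localhost:80/ && curl -fsS http://localhost:80/mcp"]
  interval: 30s
  timeout: 5s
  retries: 3
```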
MCP Server Service Dependencies
Sources: packages/mcp-server/src/config/models.ts:1-69
The MCP server operates entirely through core service APIs:
- `ModelManager` stores and retrieves the selected model configuration
- `PromptService.optimizePrompt()` handles both user and system prompts
- `PromptService.iteratePrompt()` handles refinement workflows
- `TemplateManager` provides optimization templates based on prompt type and language
- `LLMService` handles streaming API calls through provider adapters
- `FileStorageProvider` stores model configurations in the Node.js filesystem

This architecture ensures the MCP server has zero business logic duplication: it purely provides a protocol adapter layer over existing core functionality.
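The adapter relationship can be sketched as a thin table of tool handlers that do nothing but forward to the core service. `PromptServiceLike` is a stub interface (the real `PromptService` from `@prompt-optimizer/core` is asynchronous and richer); only the tool names and their delegation targets come from the source.

```typescript
// The MCP layer adds no business logic: each tool handler just forwards
// its arguments to the corresponding core service method.
interface PromptServiceLike {
  // Synchronous here to keep the sketch short; the real service is async.
  optimizePrompt(prompt: string, type: "user" | "system"): string;
  iteratePrompt(prompt: string, direction: string): string;
}

function makeToolHandlers(service: PromptServiceLike) {
  return {
    "optimize-user-prompt": (args: { original_prompt: string }) =>
      service.optimizePrompt(args.original_prompt, "user"),
    "optimize-system-prompt": (args: { original_prompt: string }) =>
      service.optimizePrompt(args.original_prompt, "system"),
    "iterate-prompt": (args: { current_prompt: string; iteration_direction: string }) =>
      service.iteratePrompt(args.current_prompt, args.iteration_direction),
  };
}
```

Because the handlers are pure pass-throughs, any change to optimization behavior lands in the core package and is picked up by the web app and the MCP server alike.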
Sources: packages/mcp-server/src/config/models.ts:6-69, packages/core/src/services/model/defaults.ts:1-119