Configure Claude Task Master AI for VS Code MCP integration

- Updated .cursor/mcp.json to use local Nix-built Task Master binary
- Configured Task Master to use local Ollama models via OpenAI-compatible API
- Set up three models: qwen3:4b (main), deepseek-r1:1.5b (research), gemma3:4b-it-qat (fallback)
- Created comprehensive integration status documentation
- Task Master successfully running as MCP server with 23+ available tools
- Ready for VS Code/Cursor AI chat integration
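
The three model roles above map onto Task Master's model configuration. As a rough sketch only (the file name, schema, and field names below are assumptions for illustration, not contents of this commit), a .taskmaster/config.json assigning the three Ollama models to their roles might look like:

{
  "_comment": "illustrative sketch; schema and field names are assumed, not taken from this commit",
  "models": {
    "main": { "provider": "openai", "modelId": "qwen3:4b" },
    "research": { "provider": "openai", "modelId": "deepseek-r1:1.5b" },
    "fallback": { "provider": "openai", "modelId": "gemma3:4b-it-qat" }
  }
}

Under that assumption, the OpenAI-compatible base URL and placeholder API key come from the MCP server environment in .cursor/mcp.json (shown in the diff below), so the entries here would only need to carry the role-to-model mapping.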
Author: Geir Okkenhaug Jerstad, 2025-06-14 16:35:09 +02:00
Parent: ae5b0cf8d0
Commit: 13114d7868
4 changed files with 96 additions and 0 deletions

.cursor/mcp.json (new file, +12 lines)

@@ -0,0 +1,12 @@
{
  "mcpServers": {
    "task-master-ai": {
      "command": "/home/geir/Home-lab/result/bin/task-master-ai",
      "args": [],
      "env": {
        "OPENAI_API_KEY": "fake-key-for-local-ollama",
        "OPENAI_BASE_URL": "http://grey-area:11434/v1"
      }
    }
  }
}
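
A generic way to sanity-check this entry (not part of the commit) is to drive the binary directly over stdio: MCP servers speak JSON-RPC 2.0, so piping an initialize handshake followed by a tools/list request into task-master-ai should return a response enumerating the 23+ tools mentioned in the commit message. The client name and version below are placeholders; each line is one JSON-RPC message written to the server's stdin:

{"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {"protocolVersion": "2024-11-05", "capabilities": {}, "clientInfo": {"name": "smoke-test", "version": "0.0.1"}}}
{"jsonrpc": "2.0", "method": "notifications/initialized"}
{"jsonrpc": "2.0", "id": 2, "method": "tools/list"}

The OPENAI_BASE_URL above points at Ollama's OpenAI-compatible endpoint (Ollama exposes /v1 on port 11434), which is why a dummy OPENAI_API_KEY is sufficient for local use.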