
- Updated `.cursor/mcp.json` to use the local Nix-built Task Master binary
- Configured Task Master to use local Ollama models via the OpenAI-compatible API
- Set up three models: qwen3:4b (main), deepseek-r1:1.5b (research), gemma3:4b-it-qat (fallback)
- Created comprehensive integration status documentation
- Task Master now running as an MCP server with 23+ available tools
- Ready for VS Code/Cursor AI chat integration
`.cursor/mcp.json` (12 lines, no EOL, 324 B, JSON):
{
  "mcpServers": {
    "task-master-ai": {
      "command": "/home/geir/Home-lab/result/bin/task-master-ai",
      "args": [],
      "env": {
        "OPENAI_API_KEY": "fake-key-for-local-ollama",
        "OPENAI_BASE_URL": "http://grey-area:11434/v1"
      }
    }
  }
}
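As a quick sanity check, the config above can be parsed and inspected before wiring it into Cursor. This is a minimal sketch that embeds the same JSON as a string (in practice you would read `.cursor/mcp.json` from disk); it only verifies the structure, not that the binary or the Ollama host is reachable.

```python
import json

# The mcp.json content from above, embedded for a self-contained check.
MCP_JSON = """
{
  "mcpServers": {
    "task-master-ai": {
      "command": "/home/geir/Home-lab/result/bin/task-master-ai",
      "args": [],
      "env": {
        "OPENAI_API_KEY": "fake-key-for-local-ollama",
        "OPENAI_BASE_URL": "http://grey-area:11434/v1"
      }
    }
  }
}
"""

config = json.loads(MCP_JSON)
server = config["mcpServers"]["task-master-ai"]

# Print the endpoint Task Master will use for the local Ollama models.
print(server["env"]["OPENAI_BASE_URL"])  # → http://grey-area:11434/v1
```

Note that `OPENAI_API_KEY` is a dummy value: Ollama's OpenAI-compatible endpoint ignores it, but the OpenAI client libraries require the variable to be set.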