- Updated `.cursor/mcp.json` to use the local Nix-built Task Master binary (a sketch of this entry follows the list)
- Configured Task Master to use local Ollama models via the OpenAI-compatible API
- Set up three models: qwen3:4b (main), deepseek-r1:1.5b (research), gemma3:4b-it-qat (fallback) — see the model config sketch at the end of this section
- Created comprehensive integration status documentation
- Task Master is running successfully as an MCP server with 23+ available tools
- Ready for VS Code/Cursor AI chat integration
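A minimal sketch of what the MCP server entry in `.cursor/mcp.json` could look like, assuming the Nix build output is linked at `./result/bin/task-master-ai` and that the local Ollama endpoint is handed to the server through environment variables; the binary path, server key, and environment variable names here are assumptions, not details from the original notes:

```json
{
  "mcpServers": {
    "task-master-ai": {
      "command": "./result/bin/task-master-ai",
      "args": [],
      "env": {
        "OPENAI_BASE_URL": "http://localhost:11434/v1",
        "OPENAI_API_KEY": "ollama"
      }
    }
  }
}
```

Cursor launches each `mcpServers` entry as a local command, so pointing `command` at the Nix-built binary replaces the usual `npx`-based setup, and the `env` block is one way to pass the local Ollama endpoint to the server process.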
Directory contents referenced on this page: `templates/`, `config.json`
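A sketch of the corresponding model configuration for the three roles listed above, assuming it lives in the Task Master `config.json` and that Ollama exposes its OpenAI-compatible API at `http://localhost:11434/v1`; whether Task Master keys this as the `openai` provider with a custom `baseURL` or as a dedicated Ollama provider depends on the version, so the exact field names are assumptions to be checked against the Task Master documentation or its interactive model setup:

```json
{
  "models": {
    "main": {
      "provider": "openai",
      "modelId": "qwen3:4b",
      "baseURL": "http://localhost:11434/v1"
    },
    "research": {
      "provider": "openai",
      "modelId": "deepseek-r1:1.5b",
      "baseURL": "http://localhost:11434/v1"
    },
    "fallback": {
      "provider": "openai",
      "modelId": "gemma3:4b-it-qat",
      "baseURL": "http://localhost:11434/v1"
    }
  }
}
```

Pointing all three roles at the same local endpoint keeps the setup fully offline; only the model IDs differ between the main, research, and fallback roles.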