- Updated `.cursor/mcp.json` to use the local Nix-built Task Master binary
- Configured Task Master to use local Ollama models via the OpenAI-compatible API
- Set up three models: qwen3:4b (main), deepseek-r1:1.5b (research), gemma3:4b-it-qat (fallback)
- Created comprehensive integration status documentation
- Task Master successfully running as an MCP server with 23+ available tools
- Ready for VS Code/Cursor AI chat integration
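The `.cursor/mcp.json` entry described above might look roughly like the sketch below. This is a hypothetical illustration, not the actual committed file: the binary path is a placeholder, and the server name and environment variable values are assumptions. Ollama does expose an OpenAI-compatible endpoint at `http://localhost:11434/v1`, and it accepts any non-empty API key.

```json
{
  "mcpServers": {
    "task-master-ai": {
      "command": "/path/to/nix-built/task-master-mcp",
      "env": {
        "OPENAI_BASE_URL": "http://localhost:11434/v1",
        "OPENAI_API_KEY": "ollama"
      }
    }
  }
}
```

With an entry like this, Cursor launches the locally built binary directly rather than fetching the package from a registry, and the environment variables redirect the model's OpenAI-style API calls to the local Ollama instance.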
- claude-task-master-ai-integration-status.md
- claude-task-master-ai-nix-packaging.md
- deploy-rs.md
- forgejo.md
- gnu-stow.md
- nfs.md
- ollama.md
- RAG-MCP-TaskMaster-Roadmap.md
- RAG-MCP.md
- ssh-forwarding-solutions.md
- taskmaster-ai.md