
# Claude Task Master AI - Nix Package & VS Code MCP Integration

## Current Status

### ✅ Completed

- **Nix Package**: Successfully built and packaged Claude Task Master AI
- **Local Installation**: Binary available at `/home/geir/Home-lab/result/bin/task-master-ai`
- **Ollama Integration**: Configured to use local Ollama models on `grey-area:11434`
- **VS Code MCP Setup**: Configured for integration with Cursor/VS Code

## 🔧 Configuration Files

### Task Master Configuration

- **Location**: `/home/geir/Home-lab/.taskmaster/config.json`
- **Models**:
  - Main: `qwen3:4b` (general tasks)
  - Research: `deepseek-r1:1.5b` (reasoning tasks)
  - Fallback: `gemma3:4b-it-qat` (backup)
- **Provider**: `openai` (using the OpenAI-compatible API)
- **Base URL**: `http://grey-area:11434/v1`

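As a rough illustration of how those settings map onto the file, the sketch below shows a plausible `config.json`. The key names (`models`, `main`, `provider`, `modelId`, `baseURL`) are assumptions about Task Master's schema rather than a copy of the real file, and the `//` comments are annotations that would have to be removed from actual JSON.

```jsonc
{
  "models": {
    // Assumed structure: one entry per role, matching the roles listed above.
    "main":     { "provider": "openai", "modelId": "qwen3:4b",         "baseURL": "http://grey-area:11434/v1" },
    "research": { "provider": "openai", "modelId": "deepseek-r1:1.5b", "baseURL": "http://grey-area:11434/v1" },
    "fallback": { "provider": "openai", "modelId": "gemma3:4b-it-qat", "baseURL": "http://grey-area:11434/v1" }
  }
}
```
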
### VS Code MCP Configuration

- **Location**: `/home/geir/Home-lab/.cursor/mcp.json`
- **Command**: Direct path to the Nix-built binary
- **Environment**: OpenAI-compatible mode with local Ollama

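In outline, the MCP registration likely resembles the sketch below. The `task-master-ai` server name is arbitrary, and the environment variables mirror the ones used in the troubleshooting commands further down; the `//` comment is an annotation only and must be stripped from real JSON.

```jsonc
{
  "mcpServers": {
    // Server name is arbitrary; "command" points at the Nix-built binary.
    "task-master-ai": {
      "command": "/home/geir/Home-lab/result/bin/task-master-ai",
      "env": {
        "OPENAI_API_KEY": "fake-key",
        "OPENAI_BASE_URL": "http://grey-area:11434/v1"
      }
    }
  }
}
```
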
## 🎯 Available MCP Tools

The Task Master MCP server provides these tools for AI assistants:

### Project Management

- `initialize_project` - Set up a new Task Master project
- `models` - Configure AI models and check status
- `parse_prd` - Generate tasks from PRD documents

### Task Operations

- `get_tasks` - List all tasks with filtering
- `get_task` - Get specific task details
- `next_task` - Find the next task to work on
- `add_task` - Create new tasks with AI
- `update_task` - Update existing tasks
- `set_task_status` - Change task status
- `remove_task` - Delete tasks

### Subtask Management

- `add_subtask` - Add subtasks to existing tasks
- `update_subtask` - Update subtask information
- `remove_subtask` - Remove subtasks
- `clear_subtasks` - Clear all subtasks from tasks

### Advanced Features

- `expand_task` - Break down tasks into subtasks
- `expand_all` - Auto-expand all pending tasks
- `analyze_project_complexity` - Complexity analysis
- `complexity_report` - View complexity reports

### Dependencies

- `add_dependency` - Create task dependencies
- `remove_dependency` - Remove dependencies
- `validate_dependencies` - Check for issues
- `fix_dependencies` - Auto-fix dependency problems

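All of these tools are invoked through MCP's JSON-RPC `tools/call` method. As a sketch, a request for `next_task` might look like the following; the `projectRoot` argument name is an assumption and should be checked against the tool's actual input schema (visible via `tools/list`).

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "next_task",
    "arguments": { "projectRoot": "/home/geir/Home-lab" }
  }
}
```
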
## 🚀 Usage in VS Code

- Restart VS Code/Cursor after updating `.cursor/mcp.json`
- Access via AI Chat: Use Claude or GitHub Copilot
- Example commands:
  - "Initialize a new Task Master project in my current directory"
  - "Create a task for setting up a new home lab service"
  - "Show me the next task I should work on"
  - "Expand task 5 into detailed subtasks"

## 🔍 Current Issue

The Task Master binary works as an MCP server but appears to hang when making AI API calls to Ollama. Possible causes:

- Network connectivity between the host and grey-area
- Differences in OpenAI API compatibility (request/response formatting)
- Authentication handling with the fake API key

**Workaround**: Use Task Master through the MCP interface in VS Code, where the AI assistant handles the tool calls without direct API communication.

## 🛠️ Troubleshooting

### Check Ollama Connectivity

```bash
curl -X POST http://grey-area:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "qwen3:4b", "messages": [{"role": "user", "content": "Hello"}]}'
```

### Verify Task Master Tools

```bash
cd /home/geir/Home-lab
echo '{"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}' | \
  OPENAI_API_KEY="fake-key" OPENAI_BASE_URL="http://grey-area:11434/v1" \
  ./result/bin/task-master-ai
```

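A working server should answer on stdout with a `tools/list` result roughly like the sketch below (descriptions and schemas abbreviated). Since listing tools does not require an AI call, a hang here points at the MCP server itself rather than the Ollama connection; note also that some MCP servers expect an `initialize` request before `tools/list`, so an error reply does not necessarily mean the tools are broken.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      { "name": "initialize_project", "description": "...", "inputSchema": { "type": "object" } },
      { "name": "get_tasks", "description": "...", "inputSchema": { "type": "object" } }
    ]
  }
}
```
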
### Check MCP Server Status

In VS Code, open the Output panel and look for MCP server connection logs.

## 📋 Next Steps

- **Test MCP Integration**: Try using Task Master tools through VS Code AI chat
- **Debug API Connectivity**: Investigate why Task Master hangs on API calls
- **Create Sample Project**: Initialize a test project to validate functionality
- **Documentation**: Create user guides for common workflows

## 🏗️ Architecture

```
VS Code/Cursor (AI Chat)
    ↓ MCP Protocol
Task Master AI (Nix Binary)
    ↓ OpenAI-compatible API
Ollama (grey-area:11434)
    ↓ Model Inference
Local Models (qwen3:4b, deepseek-r1:1.5b, gemma3:4b-it-qat)
```

This setup provides a complete local AI-powered task management system integrated with your development environment while maintaining full privacy and control over your data.