- Fix provider configuration from 'openai' to 'ollama' in .taskmaster/config.json
- Remove conflicting MCP configurations (.cursor/mcp.json, packages/.cursor/mcp.json)
- Standardize on a single .vscode/mcp.json configuration for VS Code
- Update environment variables for proper Ollama integration
- Add .env.taskmaster for easy environment setup
- Verify AI functionality: task creation, expansion, and research working
- All models (qwen2.5-coder:7b, deepseek-r1:7b, llama3.1:8b) operational
- Cost: $0 (using local Ollama server at grey-area:11434)

Resolves configuration conflicts and enables full AI-powered task management with local models instead of external API dependencies.
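For illustration, a minimal sketch of the provider switch described above, assuming a per-role model layout in .taskmaster/config.json. The 'ollama' provider, the three model names, and the grey-area:11434 server come from the commit; the key names and role assignments (models, main/research/fallback, modelId, ollamaBaseURL) are assumptions and may differ from the tool's actual schema:

```json
{
  "models": {
    "main":     { "provider": "ollama", "modelId": "qwen2.5-coder:7b" },
    "research": { "provider": "ollama", "modelId": "deepseek-r1:7b" },
    "fallback": { "provider": "ollama", "modelId": "llama3.1:8b" }
  },
  "global": {
    "ollamaBaseURL": "http://grey-area:11434"
  }
}
```

With every role pointed at the local Ollama endpoint, no external API key or paid provider is involved, which is what the "Cost: $0" line above refers to.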
.taskmaster/ contents:

- docs
- reports
- templates
- config.json
- config.json.backup.20250618_125801
- state.json