fix: resolve Taskmaster AI MCP integration with local Ollama models
- Fix provider configuration from 'openai' to 'ollama' in .taskmaster/config.json
- Remove conflicting MCP configurations (.cursor/mcp.json, packages/.cursor/mcp.json)
- Standardize on single .vscode/mcp.json configuration for VS Code
- Update environment variables for proper Ollama integration
- Add .env.taskmaster for easy environment setup
- Verify AI functionality: task creation, expansion, and research working
- All models (qwen2.5-coder:7b, deepseek-r1:7b, llama3.1:8b) operational
- Cost: $0 (using local Ollama server at grey-area:11434)

Resolves configuration conflicts and enables full AI-powered task management with local models instead of external API dependencies.
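As an illustrative sketch only, the corrected provider block in .taskmaster/config.json might look like the following. The model IDs and server address are taken from the commit message above; the exact field names ("models", "main"/"research"/"fallback", "ollamaBaseURL") are assumptions about Taskmaster's config layout, not copied from this diff:

```json
{
  "models": {
    "main":     { "provider": "ollama", "modelId": "qwen2.5-coder:7b" },
    "research": { "provider": "ollama", "modelId": "deepseek-r1:7b" },
    "fallback": { "provider": "ollama", "modelId": "llama3.1:8b" }
  },
  "global": {
    "ollamaBaseURL": "http://grey-area:11434/api"
  }
}
```

The key change is the "provider" value: with 'ollama' instead of 'openai', requests go to the local server rather than to a hosted API that expects a paid key.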
parent 2e193e00e9
commit 54e80f5c13
4 changed files with 25 additions and 45 deletions
@@ -1,23 +0,0 @@
-{
-  "mcpServers": {
-    "task-master-ai": {
-      "command": "npx",
-      "args": [
-        "-y",
-        "--package=task-master-ai",
-        "task-master-ai"
-      ],
-      "env": {
-        "ANTHROPIC_API_KEY": "ANTHROPIC_API_KEY_HERE",
-        "PERPLEXITY_API_KEY": "PERPLEXITY_API_KEY_HERE",
-        "OPENAI_API_KEY": "OPENAI_API_KEY_HERE",
-        "GOOGLE_API_KEY": "GOOGLE_API_KEY_HERE",
-        "XAI_API_KEY": "XAI_API_KEY_HERE",
-        "OPENROUTER_API_KEY": "OPENROUTER_API_KEY_HERE",
-        "MISTRAL_API_KEY": "MISTRAL_API_KEY_HERE",
-        "AZURE_OPENAI_API_KEY": "AZURE_OPENAI_API_KEY_HERE",
-        "OLLAMA_API_KEY": "OLLAMA_API_KEY_HERE"
-      }
-    }
-  }
-}
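For contrast, a minimal sketch of the single .vscode/mcp.json this commit standardizes on, pointing the MCP server at the local Ollama instance instead of a wall of hosted API-key placeholders. The "servers" top-level key and the OLLAMA_BASE_URL variable are assumptions about the VS Code MCP and Taskmaster conventions, not contents taken from this diff:

```json
{
  "servers": {
    "task-master-ai": {
      "command": "npx",
      "args": ["-y", "--package=task-master-ai", "task-master-ai"],
      "env": {
        "OLLAMA_BASE_URL": "http://grey-area:11434/api"
      }
    }
  }
}
```

Keeping only this one file removes the conflict between the three MCP configurations, since editors no longer discover stale copies under .cursor/ with dummy keys.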