
- Fix provider configuration from 'openai' to 'ollama' in .taskmaster/config.json
- Remove conflicting MCP configurations (.cursor/mcp.json, packages/.cursor/mcp.json)
- Standardize on a single .vscode/mcp.json configuration for VS Code
- Update environment variables for proper Ollama integration
- Add .env.taskmaster for easy environment setup
- Verify AI functionality: task creation, expansion, and research all working
- All models (qwen2.5-coder:7b, deepseek-r1:7b, llama3.1:8b) operational
- Cost: $0 (using local Ollama server at grey-area:11434)

Resolves configuration conflicts and enables full AI-powered task management with local models instead of external API dependencies.
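For reference, the relevant portion of .taskmaster/config.json after the provider fix might look like the sketch below. This is a hedged illustration: the field names (provider, modelId, maxTokens, temperature, ollamaBaseURL) are assumptions based on typical Task Master configurations and may differ between versions.

    {
      "models": {
        "main":     { "provider": "ollama", "modelId": "qwen2.5-coder:7b", "maxTokens": 8192, "temperature": 0.3 },
        "research": { "provider": "ollama", "modelId": "deepseek-r1:7b" },
        "fallback": { "provider": "ollama", "modelId": "llama3.1:8b" }
      },
      "global": { "ollamaBaseURL": "http://grey-area:11434/api" }
    }

Likewise, a minimal .vscode/mcp.json could take roughly the following shape. The top-level "servers" key follows VS Code's MCP configuration format; the server name and the task-master-ai package invocation are illustrative assumptions, not confirmed by this commit.

    {
      // VS Code reads mcp.json as JSONC, so comments are permitted here.
      // Server name and package invocation below are illustrative assumptions.
      "servers": {
        "taskmaster-ai": {
          "command": "npx",
          "args": ["-y", "task-master-ai"],
          "env": {
            "OPENAI_API_KEY": "ollama",
            "OPENAI_API_BASE": "http://grey-area:11434/v1"
          }
        }
      }
    }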
.env.taskmaster (14 lines, 549 B):
# Taskmaster AI Environment Variables
# Source this file to set up Ollama integration for taskmaster
export OPENAI_API_BASE="http://grey-area:11434/v1"
export OPENAI_API_KEY="ollama"
export OPENAI_BASE_URL="http://grey-area:11434/v1"
export OLLAMA_BASE_URL="http://grey-area:11434/api"
export MODEL="qwen2.5-coder:7b"
export RESEARCH_MODEL="deepseek-r1:7b"
export FALLBACK_MODEL="llama3.1:8b"
export MAX_TOKENS="8192"
export TEMPERATURE="0.3"

echo "✅ Taskmaster AI environment variables loaded"
echo "🤖 Using Ollama models at grey-area:11434"
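To apply the variables, source the file in the current shell; a quick check against Ollama's standard model-listing endpoint then confirms the grey-area server is reachable:

    # Load the variables into the current shell session
    source .env.taskmaster

    # Sanity check: list the models served by Ollama at grey-area:11434
    curl http://grey-area:11434/api/tags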