Commit graph

11 commits

Author SHA1 Message Date
Geir Okkenhaug Jerstad
a9f490882a Update Claude Task Master AI integration status with comprehensive documentation
- Added detailed status report covering completed work
- Documented current configuration for Ollama integration
- Listed all available MCP tools and their functionality
- Included troubleshooting guide and next steps
- Documented architecture and workflow for VS Code MCP integration
2025-06-14 16:43:23 +02:00
Geir Okkenhaug Jerstad
13114d7868 Configure Claude Task Master AI for VS Code MCP integration
- Updated .cursor/mcp.json to use local Nix-built Task Master binary
- Configured Task Master to use local Ollama models via OpenAI-compatible API
- Set up three models: qwen3:4b (main), deepseek-r1:1.5b (research), gemma3:4b-it-qat (fallback)
- Created comprehensive integration status documentation
- Task Master successfully running as MCP server with 23+ available tools
- Ready for VS Code/Cursor AI chat integration
2025-06-14 16:35:09 +02:00
Geir Okkenhaug Jerstad
a17326a72e Add Claude Task Master AI package and documentation
- Add Nix package for task-master-ai in packages/claude-task-master-ai.nix
- Update packages/default.nix to export the new package
- Add comprehensive documentation for packaging and MCP integration
- Add guile scripting solution documentation
2025-06-14 15:40:23 +02:00
Geir Okkenhaug Jerstad
cf11d447f4 🤖 Implement RAG + MCP + Task Master AI Integration for Intelligent Development Environment
MAJOR INTEGRATION: Complete implementation of Retrieval Augmented Generation (RAG) + Model Context Protocol (MCP) + Claude Task Master AI system for the NixOS home lab, creating an intelligent development environment with AI-powered fullstack web development assistance.

🏗️ ARCHITECTURE & CORE SERVICES:
• modules/services/rag-taskmaster.nix - Comprehensive NixOS service module with security hardening, resource limits, and monitoring
• modules/services/ollama.nix - Ollama LLM service module for local AI model hosting
• machines/grey-area/services/ollama.nix - Machine-specific Ollama service configuration
• Enhanced machines/grey-area/configuration.nix with Ollama service enablement

🤖 AI MODEL DEPLOYMENT:
• Local Ollama deployment with 3 specialized AI models:
  - llama3.3:8b (general purpose reasoning)
  - codellama:7b (code generation & analysis)
  - mistral:7b (creative problem solving)
• Privacy-first approach with completely local AI processing
• No external API dependencies or data sharing

📚 COMPREHENSIVE DOCUMENTATION:
• research/RAG-MCP.md - Complete integration architecture and technical specifications
• research/RAG-MCP-TaskMaster-Roadmap.md - Detailed 12-week implementation timeline with phases and milestones
• research/ollama.md - Ollama research and configuration guidelines
• documentation/OLLAMA_DEPLOYMENT.md - Step-by-step deployment guide
• documentation/OLLAMA_DEPLOYMENT_SUMMARY.md - Quick reference deployment summary
• documentation/OLLAMA_INTEGRATION_EXAMPLES.md - Practical integration examples and use cases

🛠️ MANAGEMENT & MONITORING TOOLS:
• scripts/ollama-cli.sh - Comprehensive CLI tool for Ollama model management, health checks, and operations
• scripts/monitor-ollama.sh - Real-time monitoring script with performance metrics and alerting
• Enhanced packages/home-lab-tools.nix with AI tool references and utilities

👤 USER ENVIRONMENT ENHANCEMENTS:
• modules/users/geir.nix - Added ytmdesktop package for enhanced development workflow
• Integrated AI capabilities into user environment and toolchain

🎯 KEY CAPABILITIES IMPLEMENTED:
 Intelligent code analysis and generation across multiple languages
 Infrastructure-aware AI that understands NixOS home lab architecture
 Context-aware assistance for fullstack web development workflows
 Privacy-preserving local AI processing with enterprise-grade security
 Automated project management and task orchestration
 Real-time monitoring and health checks for AI services
 Scalable architecture supporting future AI model additions

🔒 SECURITY & PRIVACY FEATURES:
• Complete local processing - no external API calls
• Security hardening with restricted user permissions
• Resource limits and isolation for AI services
• Comprehensive logging and monitoring for security audit trails

📈 IMPLEMENTATION ROADMAP:
• Phase 1: Foundation & Core Services (Weeks 1-3)  COMPLETED
• Phase 2: RAG Integration (Weeks 4-6) - Ready for implementation
• Phase 3: MCP Integration (Weeks 7-9) - Architecture defined
• Phase 4: Advanced Features (Weeks 10-12) - Roadmap established

This integration transforms the home lab into an intelligent development environment where AI understands infrastructure, manages complex projects, and provides expert assistance while maintaining complete privacy through local processing.

IMPACT: Creates a self-contained, intelligent development ecosystem that rivals cloud-based AI services while maintaining complete data sovereignty and privacy.
2025-06-13 08:44:40 +02:00
Geir Okkenhaug Jerstad
fc1482494f steam xwayland 2025-06-12 15:20:48 +02:00
Geir Okkenhaug Jerstad
c3d1333538 Fix NFS configuration: Remove ZFS mount point conflict with tmpfiles
- Remove /mnt/storage/media from systemd.tmpfiles.rules (it's a ZFS dataset mount point)
- Add ExecStartPost to set proper permissions on ZFS-mounted media directory
- Update NFS research documentation with ZFS integration best practices
- Add section explaining ZFS mount point vs tmpfiles.rules conflicts

This resolves the potential conflict where tmpfiles tries to create a directory
that ZFS wants to use as a mount point for the storage/media dataset.
2025-06-11 10:12:51 +02:00
Geir Okkenhaug Jerstad
3f93a85469 testing fix for nfs shares 2025-06-11 09:51:36 +02:00
Geir Okkenhaug Jerstad
8029d93a84 added niri 2025-06-10 20:33:54 +02:00
Geir Okkenhaug Jerstad
fa2b84cf65 fix: resolve sma user definition conflict between modules
- Remove duplicate sma user definition from incus.nix module
- The sma user is properly defined in modules/users/sma.nix with incus-admin group
- This resolves the isNormalUser/isSystemUser assertion failure blocking congenital-optimist rebuild
- Clean up grey-area configuration and modularize services
- Update SSH keys with correct IP addresses for grey-area and reverse-proxy
2025-06-07 16:58:22 +02:00
Geir Okkenhaug Jerstad
a35d9ff420 Implement SSH forwarding for Forgejo Git access
- Add nginx stream configuration on reverse-proxy to forward port 2222 to apps:22
- Update firewall rules to allow port 2222 for Git SSH access
- Configure Forgejo to use SSH_PORT = 2222 for Git operations
- Add comprehensive SSH forwarding research documentation
- Enable Git operations via git@git.geokkjer.eu:2222

Phase 1 implementation using nginx stream module complete.
Ready for testing and potential Phase 2 migration to HAProxy.
2025-06-07 15:21:11 +02:00
Geir Okkenhaug Jerstad
9837d82199 Refactor: Simplify module structure and reorganize services
- Removed system/ directory, merged applications into users/geir.nix
- Simplified fonts.nix to bare minimum (users can add more)
- Moved transmission.nix to sleeper-service/services/ (machine-specific)
- Organized grey-area services into services/ directory
- Updated import paths and tested all configurations
- Added research documentation for deploy-rs and GNU Stow
2025-06-07 12:11:20 +02:00