- Rename Dockerfile.optimized to Dockerfile (now the default)
- Keep old Dockerfile as Dockerfile.old for reference
- Update GitHub Actions to use default Dockerfile
- Remove build-full job - only one image variant now
- Remove docker-compose.optimized.yml and other variants
- Update all documentation to reflect single image approach
The optimized 283MB image is now the only n8n-MCP Docker image. This simplifies the user experience: a single image now covers every use case.
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <noreply@anthropic.com>
- Change main build to use Dockerfile.optimized (targets ~200MB image)
- Add separate 'full' build job for development variant (2.6GB)
- Update tagging strategy: 'latest' for optimized, 'full' suffix for full variant
- Update documentation to reflect dual image strategy
- Update docker-compose.yml with variant selection comment
This provides users with two options:
- Optimized (default): Pre-built database, minimal size, for production
- Full: Complete n8n packages, dynamic scanning, for development
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <noreply@anthropic.com>
- Remove all references to workflow execution/management features
- Delete legacy scripts for bidirectional n8n integration
- Update documentation to focus on node documentation serving only
- Remove old docker-compose files for workflow management
- Add simplified docker-compose.yml for documentation server
- Update CHANGELOG.md to reflect v2.0.0 and v2.1.0 changes
- Update Dockerfile to use v2 paths and database
The project is now clearly focused on serving n8n node documentation
to AI assistants, with no workflow execution capabilities.
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <noreply@anthropic.com>
Major features implemented:
- SQLite storage service with FTS5 for fast node search
- Database rebuild mechanism for bulk node extraction
- MCP tools: search_nodes, extract_all_nodes, get_node_statistics (search_nodes is sketched after this list)
- Production Docker deployment with persistent storage
- Management scripts for database operations
- Comprehensive test suite for all functionality
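To make the tool surface concrete, here is a minimal sketch of how search_nodes could be registered, assuming the official TypeScript MCP SDK (@modelcontextprotocol/sdk) and zod; the server name/version, the in-memory sample data, and the response shape are placeholders rather than the project's actual code:
```typescript
// Minimal sketch, assuming the official TypeScript MCP SDK and zod.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Stand-in data; the real server queries the SQLite FTS5 database instead
// (see the sketch under "Database capabilities" below).
const nodes = [
  { name: "HttpRequest", description: "Makes HTTP requests to any URL" },
  { name: "Slack", description: "Sends messages to Slack channels" },
];

const server = new McpServer({ name: "n8n-mcp", version: "0.0.0" });

// Expose full-text node search to MCP clients as the search_nodes tool.
server.tool("search_nodes", { query: z.string() }, async ({ query }) => {
  const q = query.toLowerCase();
  const matches = nodes.filter(
    (n) => n.name.toLowerCase().includes(q) || n.description.toLowerCase().includes(q),
  );
  return { content: [{ type: "text", text: JSON.stringify(matches, null, 2) }] };
});

// Serve over stdio so MCP clients (e.g. Claude Desktop) can connect.
await server.connect(new StdioServerTransport());
```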
Database capabilities:
- Stores node source code and metadata
- Full-text search by node name or content (see the FTS5 sketch after this list)
- No versioning: only the latest node version is stored, per requirements
- Supports complete database rebuilds
- ~4.5MB database with 500+ nodes indexed
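To make the search concrete, here is a rough sketch of the FTS5 storage and query, assuming the better-sqlite3 driver; the database path, table name, and column layout are illustrative and may not match the project's actual schema:
```typescript
import Database from "better-sqlite3";

// Open (or create) the documentation database. Path and schema are illustrative.
const db = new Database("data/nodes.db");

// A single FTS5 virtual table indexing node name, metadata, and source code.
db.exec(`
  CREATE VIRTUAL TABLE IF NOT EXISTS nodes_fts
  USING fts5(name, description, source_code);
`);

// Bulk rebuild: wipe and re-insert every extracted node in one transaction.
export function rebuild(nodes: { name: string; description: string; sourceCode: string }[]) {
  const insert = db.prepare(
    "INSERT INTO nodes_fts (name, description, source_code) VALUES (?, ?, ?)",
  );
  db.transaction(() => {
    db.exec("DELETE FROM nodes_fts;");
    for (const n of nodes) insert.run(n.name, n.description, n.sourceCode);
  })();
}

// Full-text search by node name or content, best matches first.
export function searchNodes(query: string) {
  return db
    .prepare(
      "SELECT name, description FROM nodes_fts WHERE nodes_fts MATCH ? ORDER BY rank LIMIT 20",
    )
    .all(query);
}
```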
Production features:
- Automated deployment script
- Docker Compose production configuration
- Database initialization on first run (sketched after this list)
- Volume persistence for data
- Management utilities for operations
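First-run initialization can be as simple as the following sketch; the database path, environment variable, and `npm run rebuild` script name are assumptions, not the project's actual entry points:
```typescript
import { existsSync } from "node:fs";
import { execSync } from "node:child_process";

// Assumed database location inside the container (mounted as a volume).
const DB_PATH = process.env.NODE_DB_PATH ?? "/data/nodes.db";

// On container start, build the node database only if the mounted volume
// does not already contain one; the script name is an assumption.
if (!existsSync(DB_PATH)) {
  console.log("No node database found, running initial rebuild...");
  execSync("npm run rebuild", { stdio: "inherit" });
}
```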
Documentation:
- Updated README with complete instructions
- Production deployment guide
- Clear troubleshooting section
- API reference for all new tools
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <noreply@anthropic.com>
This commit adds a complete integration between n8n workflow automation and the Model Context Protocol (MCP):
Features:
- MCP server that exposes n8n workflows as tools, resources, and prompts
- Custom n8n node for connecting to MCP servers from workflows
- Bidirectional bridge for data format conversion (sketched after this list)
- Token-based authentication and credential management
- Comprehensive error handling and logging
- Full test coverage for core components
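As an illustration of the bridge in the MCP-to-n8n direction, the sketch below forwards a tool invocation to a workflow's webhook and unwraps the returned items into plain JSON for the MCP client; the base URL, environment variables, Authorization header, and response shape are assumptions (Node 18+ global fetch):
```typescript
// Sketch only: URL, env vars, header, and response shape are assumptions.
const N8N_URL = process.env.N8N_URL ?? "http://localhost:5678";
const N8N_TOKEN = process.env.N8N_API_TOKEN ?? "";

// n8n passes data between nodes as "items": objects wrapped in { json: ... }.
type N8nItem = { json: Record<string, unknown> };

export async function runWorkflow(webhookPath: string, args: Record<string, unknown>) {
  const response = await fetch(`${N8N_URL}/webhook/${webhookPath}`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${N8N_TOKEN}`, // token-based authentication (assumed header)
    },
    body: JSON.stringify(args), // MCP tool arguments become the webhook payload
  });
  if (!response.ok) {
    throw new Error(`n8n webhook failed: ${response.status} ${response.statusText}`);
  }

  // Depending on the webhook's response settings, n8n may return a single
  // object or an array of items; normalize both into plain JSON.
  const body = (await response.json()) as N8nItem[] | Record<string, unknown>;
  const data = Array.isArray(body) ? body.map((item) => item.json) : body;
  return { content: [{ type: "text" as const, text: JSON.stringify(data, null, 2) }] };
}
```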
Infrastructure:
- TypeScript/Node.js project setup with proper build configuration
- Docker support with multi-stage builds
- Development and production docker-compose configurations
- Installation script for n8n custom node deployment
Documentation:
- Detailed README with usage examples and API reference
- Environment configuration templates
- Troubleshooting guide
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <noreply@anthropic.com>