refactor: major cleanup of legacy and temporary files
- Removed ~965MB of temporary directories (temp/, extracted-nodes/, etc) - Deleted outdated database files and backups (.bak files) - Removed legacy shell scripts (mcp-server-v20.sh, rebuild-v20.sh) - Cleaned up orphan test files and debugging scripts - Removed duplicate schema file (src/db/schema.sql) - Deleted old Dockerfile.old and empty database files - Updated documentation structure in README.md - Added n8n_diagnostic tool to documentation - Condensed version history in CLAUDE.md - Created release notes for v2.7.0 Total impact: Removed 34 files, saving ~965MB+ of disk space 🤖 Generated with [Claude Code](https://claude.ai/code) Co-Authored-By: Claude <noreply@anthropic.com>
This commit is contained in:
31
CHANGELOG.md
Normal file
31
CHANGELOG.md
Normal file
@@ -0,0 +1,31 @@
|
||||
# Changelog
|
||||
|
||||
All notable changes to this project will be documented in this file.
|
||||
|
||||
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
|
||||
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
|
||||
|
||||
## [2.7.0] - 2025-06-29
|
||||
|
||||
### Added
|
||||
- New `n8n_diagnostic` tool to help troubleshoot management tools visibility issues
|
||||
- Version utility (`src/utils/version.ts`) to read version from package.json as single source of truth
|
||||
- Script to sync package.runtime.json version (`scripts/sync-runtime-version.js`)
|
||||
|
||||
### Changed
|
||||
- Renamed core MCP files to remove unnecessary suffixes:
|
||||
- `tools-update.ts` → `tools.ts`
|
||||
- `server-update.ts` → `server.ts`
|
||||
- `http-server-fixed.ts` → `http-server.ts`
|
||||
- Updated imports across 21+ files to use the new file names
|
||||
|
||||
### Fixed
|
||||
- Version mismatch issue where version was hardcoded as 2.4.1 instead of reading from package.json (GitHub issue #5)
|
||||
|
||||
### Removed
|
||||
- Legacy HTTP server implementation (`src/http-server.ts`) with known issues
|
||||
- Unused legacy API client (`src/utils/n8n-client.ts`)
|
||||
|
||||
## [Previous versions]
|
||||
|
||||
For changes in versions prior to 2.7.0, please refer to the git history and CLAUDE.md file which contains detailed update notes for versions 2.0.0 through 2.6.3.
|
||||
92
CLAUDE.md
92
CLAUDE.md
@@ -68,94 +68,14 @@ n8n-mcp is a comprehensive documentation and knowledge server that provides AI a
|
||||
- ✅ Smart error handling for API limitations (activation, direct execution)
|
||||
- ✅ Conditional tool registration based on configuration
|
||||
|
||||
## ✅ Previous Updates (v2.5.1)
|
||||
## ✅ Previous Updates
|
||||
|
||||
### Update (v2.5.1) - AI Tool Support Enhancements:
|
||||
- ✅ **NEW: get_node_as_tool_info tool** - Get specific information about using ANY node as an AI tool
|
||||
- ✅ **Enhanced: get_node_info** - Now includes `aiToolCapabilities` section for all nodes
|
||||
- ✅ **Enhanced: list_ai_tools** - Added usage guidance explaining ANY node can be used as a tool
|
||||
- ✅ **Enhanced: WorkflowValidator** - Now validates `ai_tool` connections in workflows
|
||||
- ✅ AI workflow pattern detection - Warns when AI Agents have no tools connected
|
||||
- ✅ Community node detection - Reminds about N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE environment variable
|
||||
- ✅ **NEW: AI Tool TaskTemplates** - Added use_google_sheets_as_tool, use_slack_as_tool, multi_tool_ai_agent
|
||||
- ✅ Comprehensive examples showing how to connect regular nodes as AI tools
|
||||
- ✅ Tool usage documentation with $fromAI() expression examples
|
||||
For a complete history of all updates from v2.0.0 to v2.5.1, please see [CHANGELOG.md](./CHANGELOG.md).
|
||||
|
||||
### Update (v2.5.0) - Complete Workflow Validation:
|
||||
- ✅ **NEW: validate_workflow tool** - Validate entire workflows before deployment
|
||||
- ✅ **NEW: validate_workflow_connections tool** - Check workflow structure and connections
|
||||
- ✅ **NEW: validate_workflow_expressions tool** - Validate all n8n expressions in a workflow
|
||||
- ✅ **NEW: ExpressionValidator** - Comprehensive n8n expression syntax validation
|
||||
- ✅ **NEW: WorkflowValidator** - Complete workflow structure and logic validation
|
||||
- ✅ Detects cycles (infinite loops) in workflows
|
||||
- ✅ Validates node references in expressions ($node["Node Name"])
|
||||
- ✅ Checks for orphaned nodes and missing connections
|
||||
- ✅ Expression syntax validation with common mistake detection
|
||||
- ✅ Workflow best practices analysis with suggestions
|
||||
- ✅ Supports partial validation (nodes only, connections only, expressions only)
|
||||
- ✅ Test coverage for all validation scenarios
|
||||
|
||||
### Update (v2.4.2) - Enhanced Node Configuration Validation:
|
||||
- ✅ **NEW: validate_node_operation tool** - Operation-aware validation with 80%+ fewer false positives
|
||||
- ✅ **NEW: validate_node_minimal tool** - Lightning-fast validation for just required fields
|
||||
- ✅ **NEW: Validation profiles** - Choose between minimal, runtime, ai-friendly, or strict validation
|
||||
- ✅ **NEW: EnhancedConfigValidator** - Smart validation that only checks relevant properties
|
||||
- ✅ **NEW: Node-specific validators** - Custom logic for Slack, Google Sheets, OpenAI, MongoDB, Webhook, Postgres, MySQL
|
||||
- ✅ **NEW: SQL safety features** - Detects SQL injection risks, unsafe DELETE/UPDATE queries
|
||||
- ✅ Added operation context filtering (only validates properties for selected operation)
|
||||
- ✅ Integrated working examples in validation responses when errors found
|
||||
- ✅ Added actionable next steps and auto-fix suggestions
|
||||
- ✅ Basic code syntax validation for JavaScript/Python in Code node
|
||||
- ✅ Dramatic improvement for complex multi-operation nodes
|
||||
- ✅ Test results: Slack validation reduced from 45 errors to 1 error!
|
||||
|
||||
### Update (v2.4.1) - n8n Workflow Templates:
|
||||
- ✅ **NEW: list_node_templates tool** - Find workflow templates using specific nodes
|
||||
- ✅ **NEW: get_template tool** - Get complete workflow JSON for import
|
||||
- ✅ **NEW: search_templates tool** - Search templates by keywords
|
||||
- ✅ **NEW: get_templates_for_task tool** - Get curated templates for common tasks
|
||||
- ✅ Added Templates system with n8n.io API integration
|
||||
- ✅ Templates filtered to last 6 months only (freshness guarantee)
|
||||
- ✅ Manual fetch system - not part of regular rebuild
|
||||
- ✅ Full workflow JSON available for immediate use
|
||||
- ✅ 10 task categories: AI automation, data sync, webhooks, etc.
|
||||
|
||||
### Update (v2.4.0) - AI-Optimized MCP Tools:
|
||||
- ✅ **NEW: get_node_essentials tool** - Returns only 10-20 essential properties (95% size reduction)
|
||||
- ✅ **NEW: search_node_properties tool** - Search for specific properties within nodes
|
||||
- ✅ **NEW: get_node_for_task tool** - Pre-configured settings for 14 common tasks
|
||||
- ✅ **NEW: list_tasks tool** - Discover available task templates
|
||||
- ✅ **NEW: validate_node_config tool** - Validate configurations before use
|
||||
- ✅ **NEW: get_property_dependencies tool** - Analyze property visibility dependencies
|
||||
- ✅ Added PropertyFilter service with curated essential properties for 20+ nodes
|
||||
- ✅ Added ExampleGenerator with working examples for common use cases
|
||||
- ✅ Added TaskTemplates service with 14 pre-configured tasks
|
||||
- ✅ Added ConfigValidator service for comprehensive validation
|
||||
- ✅ Added PropertyDependencies service for dependency analysis
|
||||
- ✅ Enhanced all property descriptions - 100% coverage
|
||||
- ✅ Added version information to essentials response
|
||||
- ✅ Dramatically improved AI agent experience for workflow building
|
||||
- ✅ Response sizes reduced from 100KB+ to <5KB for common nodes
|
||||
|
||||
### Update (v2.3.3) - Automated Dependency Updates & Validation Fixes:
|
||||
- ✅ Implemented automated n8n dependency update system
|
||||
- ✅ Created GitHub Actions workflow for weekly updates
|
||||
- ✅ Fixed validation script to use correct node type format
|
||||
- ✅ Successfully updated to n8n v1.97.1 with all dependencies in sync
|
||||
- ✅ All 525 nodes loading correctly with validation passing
|
||||
|
||||
### Previous Update (v2.3.2) - Complete MCP HTTP Fix:
|
||||
- ✅ Fixed "stream is not readable" error by removing body parsing middleware
|
||||
- ✅ Fixed "Server not initialized" error with direct JSON-RPC implementation
|
||||
- ✅ Created http-server-fixed.ts that bypasses StreamableHTTPServerTransport issues
|
||||
- ✅ Full MCP protocol compatibility without transport complications
|
||||
- ✅ Use `USE_FIXED_HTTP=true` environment variable to enable the fixed server
|
||||
|
||||
### Previous Update (v2.3) - Universal Node.js Compatibility:
|
||||
- ✅ Automatic database adapter fallback system implemented
|
||||
- ✅ Works with ANY Node.js version (no more v20.17.0 requirement)
|
||||
- ✅ Seamless fallback from better-sqlite3 to sql.js
|
||||
- ✅ No manual configuration needed for Claude Desktop
|
||||
Key highlights from recent versions:
|
||||
- **v2.5.x**: AI tool support enhancements, workflow validation, expression validation
|
||||
- **v2.4.x**: AI-optimized tools, workflow templates, enhanced validation profiles
|
||||
- **v2.3.x**: Universal Node.js compatibility, HTTP server fixes, dependency management
|
||||
- ✅ Maintains full functionality with either adapter
|
||||
|
||||
## ✅ Previous Achievements (v2.2)
|
||||
|
||||
@@ -1,87 +0,0 @@
|
||||
# Stage 1: Dependencies
|
||||
FROM node:20-alpine AS deps
|
||||
WORKDIR /app
|
||||
COPY package*.json ./
|
||||
# Configure npm for better reliability in CI
|
||||
RUN npm config set fetch-retries 5 && \
|
||||
npm config set fetch-retry-mintimeout 20000 && \
|
||||
npm config set fetch-retry-maxtimeout 120000 && \
|
||||
npm config set fetch-timeout 300000
|
||||
# Install all dependencies including dev for building
|
||||
RUN npm ci
|
||||
|
||||
# Stage 2: Production Dependencies
|
||||
FROM node:20-alpine AS prod-deps
|
||||
WORKDIR /app
|
||||
COPY package*.json ./
|
||||
# Copy all dependencies from deps stage
|
||||
COPY --from=deps /app/node_modules ./node_modules
|
||||
# Prune to production only (much faster than fresh install)
|
||||
RUN npm prune --omit=dev
|
||||
|
||||
# Stage 3: Builder
|
||||
FROM node:20-alpine AS builder
|
||||
WORKDIR /app
|
||||
COPY package*.json ./
|
||||
COPY --from=deps /app/node_modules ./node_modules
|
||||
COPY . .
|
||||
# Build TypeScript
|
||||
RUN npm run build
|
||||
|
||||
# Stage 4: Runtime
|
||||
FROM node:20-alpine AS runtime
|
||||
WORKDIR /app
|
||||
|
||||
# Install only essential tools (flock is in util-linux)
|
||||
RUN apk add --no-cache curl su-exec util-linux && \
|
||||
rm -rf /var/cache/apk/*
|
||||
|
||||
# Copy production dependencies from prod-deps stage
|
||||
COPY --from=prod-deps /app/node_modules ./node_modules
|
||||
COPY package*.json ./
|
||||
|
||||
# Copy built application
|
||||
COPY --from=builder /app/dist ./dist
|
||||
|
||||
# Create data directory
|
||||
RUN mkdir -p /app/data
|
||||
|
||||
# Copy necessary source files for database initialization
|
||||
COPY src/database/schema.sql ./src/database/
|
||||
COPY scripts ./scripts
|
||||
|
||||
# Copy necessary files
|
||||
COPY .env.example .env.example
|
||||
COPY LICENSE LICENSE
|
||||
COPY README.md README.md
|
||||
|
||||
# Add container labels
|
||||
LABEL org.opencontainers.image.source="https://github.com/czlonkowski/n8n-mcp"
|
||||
LABEL org.opencontainers.image.description="n8n MCP Server - Simple Version"
|
||||
LABEL org.opencontainers.image.licenses="Sustainable-Use-1.0"
|
||||
LABEL org.opencontainers.image.vendor="n8n-mcp"
|
||||
LABEL org.opencontainers.image.title="n8n-mcp"
|
||||
|
||||
# Create data directory and fix permissions
|
||||
RUN mkdir -p /app/data && \
|
||||
addgroup -g 1001 -S nodejs && \
|
||||
adduser -S nodejs -u 1001 && \
|
||||
chown -R nodejs:nodejs /app
|
||||
|
||||
# Copy entrypoint script
|
||||
COPY docker/docker-entrypoint.sh /usr/local/bin/
|
||||
RUN chmod +x /usr/local/bin/docker-entrypoint.sh
|
||||
|
||||
# Switch to non-root user
|
||||
USER nodejs
|
||||
|
||||
# Expose HTTP port
|
||||
EXPOSE 3000
|
||||
|
||||
# Health check
|
||||
HEALTHCHECK --interval=30s --timeout=10s --start-period=10s --retries=3 \
|
||||
CMD curl -f http://127.0.0.1:3000/health || exit 1
|
||||
|
||||
# Entrypoint
|
||||
ENTRYPOINT ["/usr/local/bin/docker-entrypoint.sh"]
|
||||
CMD ["node", "dist/mcp/index.js"]
|
||||
26
README.md
26
README.md
@@ -387,6 +387,7 @@ These powerful tools allow you to manage n8n workflows directly from Claude. The
|
||||
|
||||
#### System Tools
|
||||
- **`n8n_health_check`** - Check n8n API connectivity and features
|
||||
- **`n8n_diagnostic`** - Troubleshoot management tools visibility and configuration issues
|
||||
- **`n8n_list_available_tools`** - List all available management tools
|
||||
|
||||
### Example Usage
|
||||
@@ -469,20 +470,23 @@ npm run dev:http # HTTP dev mode
|
||||
- [Installation Guide](./docs/INSTALLATION.md) - Comprehensive installation instructions
|
||||
- [Claude Desktop Setup](./docs/README_CLAUDE_SETUP.md) - Detailed Claude configuration
|
||||
- [Docker Guide](./docs/DOCKER_README.md) - Advanced Docker deployment options
|
||||
- [MCP Quick Start](./docs/MCP_QUICK_START_GUIDE.md) - Get started quickly with n8n-MCP
|
||||
|
||||
### Usage & Best Practices
|
||||
### Feature Documentation
|
||||
- [Workflow Diff Operations](./docs/workflow-diff-examples.md) - Token-efficient workflow updates (NEW!)
|
||||
- [Transactional Updates](./docs/transactional-updates-example.md) - Two-pass workflow editing
|
||||
- [MCP Essentials](./docs/MCP_ESSENTIALS_README.md) - AI-optimized tools guide
|
||||
- [Validation System](./docs/validation-improvements-v2.4.2.md) - Smart validation profiles
|
||||
|
||||
### Development & Deployment
|
||||
- [HTTP Deployment](./docs/HTTP_DEPLOYMENT.md) - Remote server setup guide
|
||||
- [Dependency Management](./docs/DEPENDENCY_UPDATES.md) - Keeping n8n packages in sync
|
||||
- [Claude's Interview](./docs/CLAUDE_INTERVIEW.md) - Real-world impact of n8n-MCP
|
||||
- [Claude Project Instructions](#claude-project-setup) - Optimal system instructions for n8n workflows
|
||||
- [MCP Tools List](#-available-mcp-tools) - Complete list of available tools
|
||||
|
||||
### Technical Documentation
|
||||
- [HTTP Deployment (Beta)](./docs/HTTP_DEPLOYMENT.md) - Remote server setup
|
||||
- [Dependency Updates](./docs/DEPENDENCY_UPDATES.md) - Managing n8n dependencies
|
||||
- [Validation Improvements](./docs/validation-improvements-v2.4.2.md) - Validation system details
|
||||
|
||||
### Troubleshooting
|
||||
- [Change Log](./docs/CHANGELOG.md) - Version history and updates
|
||||
- [Docker Fixes](./docs/DOCKER_FIX_IMPLEMENTATION.md) - Docker-specific troubleshooting
|
||||
### Project Information
|
||||
- [Change Log](./CHANGELOG.md) - Complete version history
|
||||
- [Claude Instructions](./CLAUDE.md) - AI guidance for this codebase
|
||||
- [MCP Tools Reference](#-available-mcp-tools) - Complete list of available tools
|
||||
|
||||
## 📊 Metrics & Coverage
|
||||
|
||||
|
||||
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -1,136 +0,0 @@
|
||||
#!/bin/bash
|
||||
|
||||
echo "n8n-MCP Claude Desktop Diagnostic"
|
||||
echo "================================="
|
||||
echo ""
|
||||
|
||||
# Check current directory
|
||||
echo "1. Current directory:"
|
||||
pwd
|
||||
echo ""
|
||||
|
||||
# Check git status
|
||||
echo "2. Git status:"
|
||||
git log -1 --oneline
|
||||
git status --short
|
||||
echo ""
|
||||
|
||||
# Check if built
|
||||
echo "3. Build status:"
|
||||
if [ -f "dist/mcp/index.js" ]; then
|
||||
echo "✅ dist/mcp/index.js exists"
|
||||
echo " Last modified: $(stat -f "%Sm" dist/mcp/index.js 2>/dev/null || stat -c "%y" dist/mcp/index.js 2>/dev/null)"
|
||||
else
|
||||
echo "❌ dist/mcp/index.js not found - run 'npm run build'"
|
||||
fi
|
||||
|
||||
if [ -f "dist/mcp/server.js" ]; then
|
||||
echo "✅ dist/mcp/server.js exists"
|
||||
echo " Last modified: $(stat -f "%Sm" dist/mcp/server.js 2>/dev/null || stat -c "%y" dist/mcp/server.js 2>/dev/null)"
|
||||
else
|
||||
echo "❌ dist/mcp/server.js not found"
|
||||
fi
|
||||
|
||||
if [ -f "dist/mcp/tools.js" ]; then
|
||||
echo "✅ dist/mcp/tools.js exists"
|
||||
echo " Last modified: $(stat -f "%Sm" dist/mcp/tools.js 2>/dev/null || stat -c "%y" dist/mcp/tools.js 2>/dev/null)"
|
||||
else
|
||||
echo "❌ dist/mcp/tools.js not found"
|
||||
fi
|
||||
echo ""
|
||||
|
||||
# Check database
|
||||
echo "4. Database status:"
|
||||
if [ -f "data/nodes.db" ]; then
|
||||
echo "✅ data/nodes.db exists"
|
||||
echo " Size: $(du -h data/nodes.db | cut -f1)"
|
||||
echo " Last modified: $(stat -f "%Sm" data/nodes.db 2>/dev/null || stat -c "%y" data/nodes.db 2>/dev/null)"
|
||||
else
|
||||
echo "❌ data/nodes.db not found - run 'npm run rebuild'"
|
||||
fi
|
||||
echo ""
|
||||
|
||||
# Check tools in compiled code
|
||||
echo "5. Compiled tools check:"
|
||||
if [ -f "dist/mcp/tools.js" ]; then
|
||||
TOOL_COUNT=$(grep "name: '" dist/mcp/tools.js | wc -l | tr -d ' ')
|
||||
echo " Total tools found: $TOOL_COUNT"
|
||||
echo " New tools present:"
|
||||
for tool in "get_node_for_task" "validate_node_config" "get_property_dependencies" "list_tasks" "search_node_properties" "get_node_essentials"; do
|
||||
if grep -q "name: '$tool'" dist/mcp/tools.js; then
|
||||
echo " ✅ $tool"
|
||||
else
|
||||
echo " ❌ $tool"
|
||||
fi
|
||||
done
|
||||
else
|
||||
echo "❌ Cannot check - file not found"
|
||||
fi
|
||||
echo ""
|
||||
|
||||
# Test MCP server
|
||||
echo "6. MCP server test:"
|
||||
if [ -f "dist/mcp/index.js" ] && [ -f "data/nodes.db" ]; then
|
||||
echo " Running quick MCP server test..."
|
||||
if node test-mcp-tools.js 2>/dev/null | grep -q "get_node_for_task: ✅ Found"; then
|
||||
echo "✅ MCP server correctly exposes all new tools"
|
||||
else
|
||||
echo "❌ MCP server issue detected"
|
||||
fi
|
||||
else
|
||||
echo "❌ Cannot test - missing required files"
|
||||
fi
|
||||
echo ""
|
||||
|
||||
# Check Node.js version
|
||||
echo "7. Node.js version:"
|
||||
node --version
|
||||
echo ""
|
||||
|
||||
# Claude Desktop config location
|
||||
echo "8. Claude Desktop config location:"
|
||||
if [[ "$OSTYPE" == "darwin"* ]]; then
|
||||
CONFIG_PATH="$HOME/Library/Application Support/Claude/claude_desktop_config.json"
|
||||
elif [[ "$OSTYPE" == "linux-gnu"* ]]; then
|
||||
CONFIG_PATH="$HOME/.config/Claude/claude_desktop_config.json"
|
||||
elif [[ "$OSTYPE" == "msys" ]] || [[ "$OSTYPE" == "win32" ]]; then
|
||||
CONFIG_PATH="$APPDATA/Claude/claude_desktop_config.json"
|
||||
fi
|
||||
|
||||
if [ -f "$CONFIG_PATH" ]; then
|
||||
echo "✅ Config file found at: $CONFIG_PATH"
|
||||
echo " Checking n8n-mcp entry..."
|
||||
if grep -q "n8n-mcp" "$CONFIG_PATH" 2>/dev/null; then
|
||||
echo "✅ n8n-mcp server configured"
|
||||
# Extract the path
|
||||
NODE_PATH=$(grep -A5 "n8n-mcp" "$CONFIG_PATH" | grep -o '"args".*\[.*".*"' | sed 's/.*\[\s*"\(.*\)".*/\1/')
|
||||
if [ ! -z "$NODE_PATH" ]; then
|
||||
echo " Configured path: $NODE_PATH"
|
||||
if [ -f "$NODE_PATH" ]; then
|
||||
echo "✅ Path exists"
|
||||
else
|
||||
echo "❌ Path does not exist!"
|
||||
fi
|
||||
fi
|
||||
else
|
||||
echo "❌ n8n-mcp not found in config"
|
||||
fi
|
||||
else
|
||||
echo "❌ Config file not found at expected location: $CONFIG_PATH"
|
||||
fi
|
||||
echo ""
|
||||
|
||||
# Recommended actions
|
||||
echo "9. Recommended actions:"
|
||||
echo " If any ❌ items above:"
|
||||
echo " 1. git pull # Get latest code"
|
||||
echo " 2. npm install # Install dependencies"
|
||||
echo " 3. npm run build # Build TypeScript"
|
||||
echo " 4. npm run rebuild # Rebuild database"
|
||||
echo " 5. Update Claude Desktop config with absolute path to $(pwd)/dist/mcp/index.js"
|
||||
echo " 6. Completely quit and restart Claude Desktop"
|
||||
echo ""
|
||||
echo " If all ✅ but tools still missing in Claude:"
|
||||
echo " - Try removing and re-adding the n8n-mcp server in Claude Desktop config"
|
||||
echo " - Check Claude Desktop logs: View > Developer > Logs"
|
||||
echo ""
|
||||
@@ -1,424 +0,0 @@
|
||||
# Changelog
|
||||
|
||||
All notable changes to this project will be documented in this file.
|
||||
|
||||
## [2.7.0] - 2025-06-28
|
||||
|
||||
### Added
|
||||
- **Diff-Based Workflow Editing**: Revolutionary token-efficient workflow updates
|
||||
- **NEW**: `n8n_update_partial_workflow` tool - Update workflows using diff operations for precise, incremental changes
|
||||
- **NEW**: WorkflowDiffEngine - Applies targeted edits without sending full workflow JSON
|
||||
- **NEW**: 13 diff operations - addNode, removeNode, updateNode, moveNode, enableNode, disableNode, addConnection, removeConnection, updateConnection, updateSettings, updateName, addTag, removeTag
|
||||
- **NEW**: Transaction safety - Validates all operations before applying any changes
|
||||
- **NEW**: Validation-only mode - Test your diff operations without applying them
|
||||
- **NEW**: Transactional Updates - Two-pass processing allows adding nodes and connections in any order
|
||||
- Smart node references - Use either node ID or name for operations
|
||||
- Order independence - Add connections before nodes, engine handles dependencies automatically
|
||||
- Operation limit - Maximum 5 operations per request ensures reliability
|
||||
- 80-90% token savings - Only send the changes, not the entire workflow
|
||||
- Comprehensive test coverage - All operations and edge cases tested
|
||||
- Example guide - See [workflow-diff-examples.md](./workflow-diff-examples.md) for usage patterns
|
||||
|
||||
### Changed
|
||||
- **RENAMED**: `n8n_update_workflow` → `n8n_update_full_workflow` - Clarifies that it replaces the entire workflow
|
||||
|
||||
### Fixed
|
||||
- **Docker Runtime Dependencies**: Added missing `uuid` package to runtime dependencies
|
||||
- Fixed MODULE_NOT_FOUND error in Docker containers
|
||||
- Updated both package.json and package.runtime.json
|
||||
- Aligned package versions between main and runtime configurations
|
||||
|
||||
## [2.6.2] - 2025-06-26
|
||||
|
||||
### Added
|
||||
- **Enhanced Workflow Creation Validation**: Major improvements to prevent broken workflows
|
||||
- **NEW**: Node type existence validation - Verifies node types actually exist in n8n
|
||||
- **NEW**: Smart suggestions for common mistakes - Detects `webhook` and suggests `n8n-nodes-base.webhook`
|
||||
- **NEW**: Minimum viable workflow validation - Prevents single-node workflows (except webhooks)
|
||||
- **NEW**: Empty connection detection - Catches multi-node workflows with no connections
|
||||
- **NEW**: Helper functions - `getWorkflowStructureExample()` and `getWorkflowFixSuggestions()`
|
||||
- Enhanced error messages with clear guidance and connection examples
|
||||
- Prevents broken workflows that show as question marks in n8n UI
|
||||
|
||||
### Fixed
|
||||
- **Critical**: `nodes-base.webhook` validation - Now caught BEFORE database lookup
|
||||
- Previously, `nodes-base.webhook` would pass validation because it existed in the normalized database
|
||||
- Now explicitly checks for and rejects `nodes-base.` prefix before any database operations
|
||||
- This fixes the exact issue causing workflows to appear broken in n8n UI
|
||||
|
||||
### Changed
|
||||
- Workflow validator now checks node types in order: invalid patterns → database lookup → suggestions
|
||||
- Connection validation messages now include proper format examples
|
||||
- Error messages are more actionable with specific fix instructions
|
||||
|
||||
## [2.6.1] - 2025-06-26
|
||||
|
||||
### Added
|
||||
- **Enhanced typeVersion Validation**: Comprehensive validation for versioned nodes
|
||||
- **NEW**: typeVersion validation enforced on all versioned nodes
|
||||
- Catches missing typeVersion and provides correct version to use
|
||||
- Warns on outdated versions when newer versions are available
|
||||
- Prevents invalid versions that exceed maximum supported
|
||||
- Helps AI agents avoid common workflow creation mistakes
|
||||
- Test coverage for all typeVersion scenarios
|
||||
|
||||
### Changed
|
||||
- WorkflowValidator now includes typeVersion in its validation pipeline
|
||||
- Enhanced error messages for typeVersion issues with actionable suggestions
|
||||
|
||||
## [2.6.0] - 2025-06-26
|
||||
|
||||
### Added
|
||||
- **n8n Management Tools Integration**: Complete workflow lifecycle management via API
|
||||
- **NEW**: 14 n8n management tools for creating, updating, and executing workflows
|
||||
- **NEW**: `n8n_create_workflow` - Create workflows programmatically with validation
|
||||
- **NEW**: `n8n_update_workflow` - Update existing workflows with safety checks
|
||||
- **NEW**: `n8n_trigger_webhook_workflow` - Execute workflows via webhooks
|
||||
- **NEW**: `n8n_list_executions` - Monitor workflow executions with filtering
|
||||
- **NEW**: `n8n_health_check` - Check n8n instance connectivity and features
|
||||
- **NEW**: Workflow management tools with smart error handling
|
||||
- **NEW**: Execution management tools for monitoring and debugging
|
||||
- Integrated n8n-manager-for-ai-agents functionality
|
||||
- Optional feature - only enabled when N8N_API_URL and N8N_API_KEY configured
|
||||
- Complete workflow lifecycle: discover → build → validate → deploy → execute
|
||||
- Smart error handling for API limitations (activation, direct execution)
|
||||
- Conditional tool registration based on configuration
|
||||
|
||||
### Changed
|
||||
- Updated `start_here_workflow_guide` to include n8n management tools documentation
|
||||
- Enhanced MCP server to conditionally register management tools
|
||||
- Added comprehensive integration tests for all management features
|
||||
|
||||
## [2.5.1] - 2025-01-25
|
||||
|
||||
### Added
|
||||
- **AI Tool Support Enhancement**: Major improvements to AI tool integration
|
||||
- **NEW**: `get_node_as_tool_info` tool - Get specific information about using ANY node as an AI tool
|
||||
- **ENHANCED**: `get_node_info` now includes `aiToolCapabilities` section for all nodes
|
||||
- **ENHANCED**: `list_ai_tools` - Added usage guidance explaining ANY node can be used as a tool
|
||||
- **ENHANCED**: `WorkflowValidator` - Now validates `ai_tool` connections in workflows
|
||||
- AI workflow pattern detection - Warns when AI Agents have no tools connected
|
||||
- Community node detection - Reminds about N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE environment variable
|
||||
- **NEW**: AI Tool TaskTemplates - Added use_google_sheets_as_tool, use_slack_as_tool, multi_tool_ai_agent
|
||||
- Comprehensive examples showing how to connect regular nodes as AI tools
|
||||
- Tool usage documentation with $fromAI() expression examples
|
||||
|
||||
### Changed
|
||||
- Clarified that ANY node can be used as an AI tool, not just nodes with usableAsTool property
|
||||
- Enhanced workflow validation to understand and validate ai_tool connections
|
||||
- Improved expression validation to support $fromAI() dynamic AI parameters
|
||||
|
||||
## [2.5.0] - 2025-01-20
|
||||
|
||||
### Added
|
||||
- **Complete Workflow Validation**: Professional-grade workflow validation system
|
||||
- **NEW**: `validate_workflow` tool - Validate entire workflows before deployment
|
||||
- **NEW**: `validate_workflow_connections` tool - Check workflow structure and connections
|
||||
- **NEW**: `validate_workflow_expressions` tool - Validate all n8n expressions in a workflow
|
||||
- **NEW**: `ExpressionValidator` - Comprehensive n8n expression syntax validation
|
||||
- **NEW**: `WorkflowValidator` - Complete workflow structure and logic validation
|
||||
- Detects cycles (infinite loops) in workflows
|
||||
- Validates node references in expressions ($node["Node Name"])
|
||||
- Checks for orphaned nodes and missing connections
|
||||
- Expression syntax validation with common mistake detection
|
||||
- Workflow best practices analysis with suggestions
|
||||
- Supports partial validation (nodes only, connections only, expressions only)
|
||||
- Test coverage for all validation scenarios
|
||||
|
||||
## [2.4.2] - 2025-01-15
|
||||
|
||||
### Added
|
||||
- **Enhanced Node Configuration Validation**: Operation-aware validation with dramatic accuracy improvements
|
||||
- **NEW**: `validate_node_operation` tool - Operation-aware validation with 80%+ fewer false positives
|
||||
- **NEW**: `validate_node_minimal` tool - Lightning-fast validation for just required fields
|
||||
- **NEW**: Validation profiles - Choose between minimal, runtime, ai-friendly, or strict validation
|
||||
- **NEW**: `EnhancedConfigValidator` - Smart validation that only checks relevant properties
|
||||
- **NEW**: Node-specific validators - Custom logic for Slack, Google Sheets, OpenAI, MongoDB, Webhook, Postgres, MySQL
|
||||
- **NEW**: SQL safety features - Detects SQL injection risks, unsafe DELETE/UPDATE queries
|
||||
- Added operation context filtering (only validates properties for selected operation)
|
||||
- Integrated working examples in validation responses when errors found
|
||||
- Added actionable next steps and auto-fix suggestions
|
||||
- Basic code syntax validation for JavaScript/Python in Code node
|
||||
- Dramatic improvement for complex multi-operation nodes
|
||||
- Test results: Slack validation reduced from 45 errors to 1 error!
|
||||
|
||||
### Removed
|
||||
- Deprecated `validate_node_config` tool in favor of new operation-aware validators
|
||||
|
||||
## [2.4.1] - 2025-01-10
|
||||
|
||||
### Added
|
||||
- **n8n Workflow Templates**: Integration with n8n.io template library
|
||||
- **NEW**: `list_node_templates` tool - Find workflow templates using specific nodes
|
||||
- **NEW**: `get_template` tool - Get complete workflow JSON for import
|
||||
- **NEW**: `search_templates` tool - Search templates by keywords
|
||||
- **NEW**: `get_templates_for_task` tool - Get curated templates for common tasks
|
||||
- Added Templates system with n8n.io API integration
|
||||
- Templates filtered to last 6 months only (freshness guarantee)
|
||||
- Manual fetch system - not part of regular rebuild
|
||||
- Full workflow JSON available for immediate use
|
||||
- 10 task categories: AI automation, data sync, webhooks, etc.
|
||||
|
||||
## [2.4.0] - 2025-01-05
|
||||
|
||||
### Added
|
||||
- **AI-Optimized MCP Tools**: Dramatically improved AI agent experience
|
||||
- **NEW**: `get_node_essentials` tool - Returns only 10-20 essential properties (95% size reduction)
|
||||
- **NEW**: `search_node_properties` tool - Search for specific properties within nodes
|
||||
- **NEW**: `get_node_for_task` tool - Pre-configured settings for 14 common tasks
|
||||
- **NEW**: `list_tasks` tool - Discover available task templates
|
||||
- **NEW**: `validate_node_config` tool - Validate configurations before use
|
||||
- **NEW**: `get_property_dependencies` tool - Analyze property visibility dependencies
|
||||
- Added PropertyFilter service with curated essential properties for 20+ nodes
|
||||
- Added ExampleGenerator with working examples for common use cases
|
||||
- Added TaskTemplates service with 14 pre-configured tasks
|
||||
- Added ConfigValidator service for comprehensive validation
|
||||
- Added PropertyDependencies service for dependency analysis
|
||||
- Enhanced all property descriptions - 100% coverage
|
||||
- Added version information to essentials response
|
||||
- Response sizes reduced from 100KB+ to <5KB for common nodes
|
||||
|
||||
### Changed
|
||||
- **License Change**: Changed from Apache 2.0 to MIT License for wider adoption
|
||||
- Fixed missing AI and LangChain node documentation
|
||||
- Improved documentation mapping for better coverage
|
||||
|
||||
## [2.3.3] - 2025-06-16
|
||||
|
||||
### Added
|
||||
- **Automated Dependency Update System**: Comprehensive solution for keeping n8n packages in sync
|
||||
- Custom update script (`scripts/update-n8n-deps.js`) that respects n8n's interdependencies
|
||||
- GitHub Actions workflow for weekly automated updates
|
||||
- Renovate configuration as an alternative solution
|
||||
- Dependency update documentation guide
|
||||
- Support for automatic n8n package version synchronization
|
||||
- Documentation updates reflecting current metrics
|
||||
|
||||
### Fixed
|
||||
- **Validation Script Node Type References**: Fixed node type format issues
|
||||
- Changed from short names (e.g., 'httpRequest') to full names (e.g., 'nodes-base.httpRequest')
|
||||
- Removed versioned check for Code node as it's not consistently detected
|
||||
- All validation tests now pass after dependency updates
|
||||
|
||||
### Changed
|
||||
- Updated n8n dependencies to latest versions:
|
||||
- n8n: 1.14.1 → 1.97.1
|
||||
- n8n-core: 1.14.1 → 1.96.0
|
||||
- n8n-workflow: 1.82.0 → 1.94.0
|
||||
- @n8n/n8n-nodes-langchain: 1.97.0 → 1.96.1
|
||||
- Significant increase in detected nodes and capabilities:
|
||||
- Total nodes: 458 → 525
|
||||
- AI-capable tools: 35 → 263 (major increase due to updated detection)
|
||||
- Nodes with properties: 98.7% → 99%
|
||||
- Nodes with operations: 57.9% → 63.6%
|
||||
|
||||
### Technical Details
|
||||
- Dependency update script now checks n8n's required dependency versions
|
||||
- Validation script uses correct database column names
|
||||
- All critical nodes (httpRequest, code, slack, agent) validate successfully
|
||||
|
||||
## [2.3.2] - 2025-06-14
|
||||
|
||||
### Fixed
|
||||
- **HTTP Server Stream Error**: Complete fix for "stream is not readable" error
|
||||
- Removed Express body parsing middleware that was consuming request streams
|
||||
- Fixed "Server not initialized" error with direct JSON-RPC implementation
|
||||
- Added `USE_FIXED_HTTP=true` environment variable for stable HTTP mode
|
||||
- Bypassed problematic StreamableHTTPServerTransport implementation
|
||||
- HTTP server now works reliably with average response time of ~12ms
|
||||
- Updated all HTTP server implementations to preserve raw streams
|
||||
|
||||
### Added
|
||||
- `http-server-fixed.ts` - Direct JSON-RPC implementation
|
||||
- `ConsoleManager` utility for stream isolation
|
||||
- `MCP Engine` interface for service integration
|
||||
- Comprehensive documentation for HTTP server fixes
|
||||
|
||||
### Changed
|
||||
- Default HTTP mode now uses fixed implementation when `USE_FIXED_HTTP=true`
|
||||
- Updated Docker configuration to use fixed implementation by default
|
||||
- Improved error handling and logging in HTTP mode
|
||||
|
||||
## [2.3.1] - 2025-06-14
|
||||
|
||||
### Added
|
||||
- **Single-Session Architecture**: Initial attempt to fix HTTP server issues
|
||||
- Implemented session reuse across requests
|
||||
- Added console output isolation
|
||||
- Created engine interface for service integration
|
||||
|
||||
### Fixed
|
||||
- Partial fix for "stream is not readable" error (completed in v2.3.2)
|
||||
|
||||
## [2.3.0] - 2024-12-06
|
||||
|
||||
### Added
|
||||
- **HTTP Remote Deployment**: Single-user HTTP server for remote access
|
||||
- Stateless architecture for simple deployments
|
||||
- Bearer token authentication
|
||||
- Compatible with mcp-remote adapter for Claude Desktop
|
||||
- New HTTP mode scripts and deployment helper
|
||||
- **Universal Node.js Compatibility**: Automatic database adapter fallback system
|
||||
- Primary adapter: `better-sqlite3` for optimal performance
|
||||
- Fallback adapter: `sql.js` (pure JavaScript) for version mismatches
|
||||
- Automatic detection and switching between adapters
|
||||
- No manual configuration required
|
||||
- Database adapter abstraction layer (`src/database/database-adapter.ts`)
|
||||
- Version detection and logging for troubleshooting
|
||||
- sql.js dependency for pure JavaScript SQLite implementation
|
||||
- HTTP server implementation (`src/http-server.ts`)
|
||||
- Deployment documentation and scripts
|
||||
|
||||
### Changed
|
||||
- Updated all database operations to use the adapter interface
|
||||
- Removed Node.js v20.17.0 requirement - now works with ANY version
|
||||
- Simplified Claude Desktop setup - no wrapper scripts needed
|
||||
- Enhanced error messages for database initialization
|
||||
- Made all MCP tool handlers async for proper initialization
|
||||
|
||||
### Fixed
|
||||
- NODE_MODULE_VERSION mismatch errors with Claude Desktop
|
||||
- Native module compilation issues in restricted environments
|
||||
- Compatibility issues when running with different Node.js versions
|
||||
- Database initialization race conditions in HTTP mode
|
||||
|
||||
### Technical Details
|
||||
- Better-sqlite3: ~10-50x faster (when compatible)
|
||||
- sql.js: ~2-5x slower but universally compatible
|
||||
- Both adapters maintain identical API and functionality
|
||||
- Automatic persistence for sql.js with 100ms debounced saves
|
||||
- HTTP server uses StreamableHTTPServerTransport for MCP compatibility
|
||||
|
||||
## [2.2.0] - 2024-12-06
|
||||
|
||||
### Added
|
||||
- PropertyExtractor class for dedicated property/operation extraction
|
||||
- NodeRepository for proper JSON serialization/deserialization
|
||||
- Support for @n8n/n8n-nodes-langchain package (59 AI nodes)
|
||||
- AI tool detection (35 tools with usableAsTool property)
|
||||
- Test suite for critical node validation
|
||||
- Comprehensive documentation (README, SETUP, CHANGELOG)
|
||||
- Example configuration files for Claude Desktop
|
||||
- Node.js v20.17.0 wrapper scripts for compatibility
|
||||
|
||||
### Fixed
|
||||
- Empty properties/operations arrays (now 98.7% nodes have properties)
|
||||
- Versioned node detection (HTTPRequest, Code properly identified)
|
||||
- Documentation mapping for nodes with directory-based docs
|
||||
- Critical node validation (httpRequest, slack, code all pass)
|
||||
|
||||
### Changed
|
||||
- Refactored parser to handle instance-level properties
|
||||
- Updated MCP server to use NodeRepository
|
||||
- Improved rebuild script with validation
|
||||
- Enhanced database schema with proper typing
|
||||
|
||||
### Metrics
|
||||
- 458 total nodes (100% success rate)
|
||||
- 452 nodes with properties (98.7%)
|
||||
- 265 nodes with operations (57.9%)
|
||||
- 406 nodes with documentation (88.6%)
|
||||
- 35 AI-capable tools detected
|
||||
- All critical nodes validated
|
||||
|
||||
## [2.1.0] - 2025-01-08
|
||||
|
||||
### Added
|
||||
- Remote deployment capabilities via HTTP/JSON-RPC transport
|
||||
- Domain configuration through environment variables (`MCP_DOMAIN`)
|
||||
- Bearer token authentication for remote access
|
||||
- Comprehensive remote deployment documentation
|
||||
- PM2 and Nginx configuration examples
|
||||
- HTTP server mode (`npm run start:http`)
|
||||
|
||||
### Enhanced
|
||||
- Support for both local (stdio) and remote (HTTP) deployment modes
|
||||
- Production deployment guide for VM/cloud environments
|
||||
- Claude Desktop configuration for remote servers
|
||||
|
||||
## [2.0.0] - 2025-01-08
|
||||
|
||||
### Major Refactoring
|
||||
- **BREAKING CHANGE**: Refocused project to serve only n8n node documentation
|
||||
- Removed all workflow execution and management features
|
||||
- Removed bidirectional n8n-MCP integration
|
||||
- Simplified to be a read-only documentation server
|
||||
|
||||
### Added
|
||||
- SQLite database with full-text search (FTS5) for node documentation
|
||||
- Integration with n8n-docs repository for official documentation
|
||||
- Automatic example workflow generation for each node type
|
||||
- Comprehensive node information including:
|
||||
- Source code
|
||||
- Official documentation
|
||||
- Usage examples
|
||||
- Properties schema
|
||||
- Credential definitions
|
||||
|
||||
### New MCP Tools
|
||||
- `list_nodes` - List available nodes with filtering
|
||||
- `get_node_info` - Get complete node information
|
||||
- `search_nodes` - Full-text search across nodes
|
||||
- `get_node_example` - Get example workflow for a node
|
||||
- `get_node_source_code` - Get only source code
|
||||
- `get_node_documentation` - Get only documentation
|
||||
- `rebuild_database` - Rebuild entire node database
|
||||
- `get_database_statistics` - Database statistics
|
||||
|
||||
### Infrastructure
|
||||
- New database schema optimized for documentation storage
|
||||
- `DocumentationFetcher` for n8n-docs repository integration
|
||||
- `ExampleGenerator` for creating node usage examples
|
||||
- `NodeDocumentationService` for database management
|
||||
|
||||
## [1.1.0] - 2024-01-07
|
||||
|
||||
### Added
|
||||
- Node source code extraction capability via `get_node_source_code` tool
|
||||
- List available nodes functionality with `list_available_nodes` tool
|
||||
- `NodeSourceExtractor` utility for file system access to n8n nodes
|
||||
- Resource endpoint `nodes://source/{nodeType}` for accessing node source code
|
||||
- Docker test environment with mounted n8n node_modules
|
||||
- Comprehensive test suite for AI Agent node extraction
|
||||
- Test documentation in `docs/AI_AGENT_EXTRACTION_TEST.md`
|
||||
|
||||
### Enhanced
|
||||
- MCP server can now access and extract n8n node implementations
|
||||
- Support for extracting credential definitions alongside node code
|
||||
- Package metadata included in extraction results
|
||||
|
||||
## [1.0.0] - 2024-01-07
|
||||
|
||||
### Initial Release
|
||||
- Complete n8n-MCP integration implementation
|
||||
- MCP server exposing n8n workflows as tools, resources, and prompts
|
||||
- Custom n8n node for connecting to MCP servers
|
||||
- Bidirectional data format conversion bridge
|
||||
- Token-based authentication system
|
||||
- Comprehensive error handling and logging
|
||||
- Full test coverage for core components
|
||||
- Docker support with production and development configurations
|
||||
- Installation scripts for n8n custom node deployment
|
||||
|
||||
### MCP Tools
|
||||
- `execute_workflow` - Execute n8n workflows
|
||||
- `list_workflows` - List available workflows
|
||||
- `get_workflow` - Get workflow details
|
||||
- `create_workflow` - Create new workflows
|
||||
- `update_workflow` - Update existing workflows
|
||||
- `delete_workflow` - Delete workflows
|
||||
- `get_executions` - Get execution history
|
||||
- `get_execution_data` - Get execution details
|
||||
|
||||
### MCP Resources
|
||||
- `workflow://active` - Active workflows
|
||||
- `workflow://all` - All workflows
|
||||
- `execution://recent` - Recent executions
|
||||
- `credentials://types` - Credential types
|
||||
- `nodes://available` - Available nodes
|
||||
|
||||
### MCP Prompts
|
||||
- `create_workflow_prompt` - Workflow creation
|
||||
- `debug_workflow_prompt` - Workflow debugging
|
||||
- `optimize_workflow_prompt` - Workflow optimization
|
||||
- `explain_workflow_prompt` - Workflow explanation
|
||||
@@ -1,163 +0,0 @@
|
||||
# Docker stdio Fix Implementation Plan for n8n-MCP
|
||||
|
||||
Based on community research and successful MCP Docker deployments, here's a streamlined fix for the initialization timeout issue.
|
||||
|
||||
## Root Cause
|
||||
|
||||
Docker treats container stdout as a pipe (not TTY), causing block buffering. The MCP server's JSON-RPC responses sit in the buffer instead of being immediately sent to Claude Desktop, causing a 60-second timeout.
|
||||
|
||||
## Implementation Steps
|
||||
|
||||
### Step 1: Test Simple Interactive Mode
|
||||
|
||||
First, verify if just using `-i` flag solves the issue:
|
||||
|
||||
**Update README.md:**
|
||||
```json
|
||||
{
|
||||
"mcpServers": {
|
||||
"n8n-mcp": {
|
||||
"command": "docker",
|
||||
"args": [
|
||||
"run",
|
||||
"-i", // Interactive mode - keeps stdin open
|
||||
"--rm",
|
||||
"-e", "MCP_MODE=stdio",
|
||||
"-e", "LOG_LEVEL=error",
|
||||
"-e", "DISABLE_CONSOLE_OUTPUT=true",
|
||||
"ghcr.io/czlonkowski/n8n-mcp:latest"
|
||||
]
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**Test command:**
|
||||
```bash
|
||||
echo '{"jsonrpc":"2.0","method":"initialize","params":{"protocolVersion":"2024-11-05"},"id":1}' | \
|
||||
docker run -i --rm \
|
||||
-e MCP_MODE=stdio \
|
||||
-e LOG_LEVEL=error \
|
||||
-e DISABLE_CONSOLE_OUTPUT=true \
|
||||
ghcr.io/czlonkowski/n8n-mcp:latest
|
||||
```
|
||||
|
||||
Expected: Should receive a JSON response immediately.
|
||||
|
||||
### Step 2: Add Explicit Stdout Flushing (If Needed)
|
||||
|
||||
If Step 1 doesn't work, add minimal flushing to the Node.js server:
|
||||
|
||||
**File: `src/mcp/server-update.ts`**
|
||||
|
||||
Update the `run()` method:
|
||||
```typescript
|
||||
async run(): Promise<void> {
|
||||
await this.ensureInitialized();
|
||||
|
||||
const transport = new StdioServerTransport();
|
||||
await this.server.connect(transport);
|
||||
|
||||
// Ensure stdout is not buffered in Docker
|
||||
if (!process.stdout.isTTY && process.env.IS_DOCKER) {
|
||||
// Force unbuffered stdout
|
||||
process.stdout.write('');
|
||||
}
|
||||
|
||||
logger.info('n8n Documentation MCP Server running on stdio transport');
|
||||
|
||||
// Keep process alive
|
||||
process.stdin.resume();
|
||||
}
|
||||
```
|
||||
|
||||
**File: `Dockerfile`**
|
||||
|
||||
Add environment variable:
|
||||
```dockerfile
|
||||
ENV IS_DOCKER=true
|
||||
```
|
||||
|
||||
### Step 3: System-Level Unbuffering (Last Resort)
|
||||
|
||||
Only if Steps 1-2 fail, implement stdbuf wrapper:
|
||||
|
||||
**File: `docker-entrypoint.sh`**
|
||||
```bash
|
||||
#!/bin/sh
|
||||
# Force line buffering for stdio communication
|
||||
exec stdbuf -oL -eL node /app/dist/mcp/index.js
|
||||
```
|
||||
|
||||
**File: `Dockerfile`**
|
||||
```dockerfile
|
||||
# Add stdbuf utility
|
||||
RUN apk add --no-cache coreutils
|
||||
|
||||
# Copy and setup entrypoint
|
||||
COPY docker-entrypoint.sh /docker-entrypoint.sh
|
||||
RUN chmod +x /docker-entrypoint.sh
|
||||
|
||||
ENTRYPOINT ["/docker-entrypoint.sh"]
|
||||
```
|
||||
|
||||
## Testing Protocol
|
||||
|
||||
### 1. Local Docker Test
|
||||
```bash
|
||||
# Build test image
|
||||
docker build -t n8n-mcp:test .
|
||||
|
||||
# Test with echo pipe
|
||||
echo '{"jsonrpc":"2.0","method":"initialize","params":{"protocolVersion":"2024-11-05"},"id":1}' | \
|
||||
docker run -i --rm n8n-mcp:test | head -1
|
||||
|
||||
# Should see immediate JSON response
|
||||
```
|
||||
|
||||
### 2. Claude Desktop Test
|
||||
1. Update `claude_desktop_config.json` with new configuration
|
||||
2. Restart Claude Desktop
|
||||
3. Check Developer tab for "running" status
|
||||
4. Test a simple MCP command
|
||||
|
||||
### 3. Debug if Needed
|
||||
```bash
|
||||
# Run with stderr output for debugging
|
||||
docker run -i --rm \
|
||||
-e MCP_MODE=stdio \
|
||||
-e LOG_LEVEL=debug \
|
||||
ghcr.io/czlonkowski/n8n-mcp:latest 2>debug.log
|
||||
```
|
||||
|
||||
## Success Criteria
|
||||
|
||||
- [ ] No timeout errors in Claude Desktop logs
|
||||
- [ ] MCP tools are accessible immediately
|
||||
- [ ] No "Shutting down..." messages in stdout
|
||||
- [ ] Simple `-i` flag configuration works
|
||||
|
||||
## Rollout Plan
|
||||
|
||||
1. **Test locally** with simple `-i` flag first
|
||||
2. **Update Docker image** only if code changes needed
|
||||
3. **Update README** with working configuration
|
||||
4. **Community announcement** with simple Docker instructions
|
||||
|
||||
## Key Insights from Research
|
||||
|
||||
- Most MCP Docker deployments work with just `-i` flag
|
||||
- Complex solutions often unnecessary
|
||||
- Node.js typically doesn't need explicit unbuffering (unlike Python with `-u`)
|
||||
- Claude Desktop only supports stdio for local servers (not HTTP)
|
||||
- Proper testing can quickly identify if buffering is the actual issue
|
||||
|
||||
## What NOT to Do
|
||||
|
||||
- Don't add TTY flag (`-t`) - it's for terminal UI, not needed for MCP
|
||||
- Don't implement complex multi-phase solutions
|
||||
- Don't switch to HTTP transport (Claude Desktop doesn't support it locally)
|
||||
- Don't modify MCP protocol handling
|
||||
- Don't add unnecessary wrapper scripts unless proven necessary
|
||||
|
||||
The solution should be as simple as possible - likely just the `-i` flag in the Docker command.
|
||||
@@ -1,252 +0,0 @@
|
||||
# Docker MCP Initialization Timeout Fix Plan
|
||||
|
||||
## Problem Summary
|
||||
|
||||
The n8n-MCP Docker container fails to work with Claude Desktop due to MCP initialization timeout:
|
||||
|
||||
1. Claude sends `initialize` request
|
||||
2. Server receives it (logs show "Message from client: {"method":"initialize"...}")
|
||||
3. Server appears to connect successfully
|
||||
4. **No response is sent back to Claude**
|
||||
5. Claude times out after 60 seconds
|
||||
6. Container outputs "Shutting down..." which breaks JSON-RPC protocol
|
||||
|
||||
## Root Cause Analysis
|
||||
|
||||
### 1. **Stdout Buffering in Docker**
|
||||
|
||||
Docker containers often buffer stdout, especially when not running with TTY (`-t` flag). This is the most likely culprit:
|
||||
|
||||
- Node.js/JavaScript may buffer stdout when not connected to a TTY
|
||||
- Docker's stdout handling differs from direct execution
|
||||
- The MCP response might be stuck in the buffer
|
||||
|
||||
**Evidence:**
|
||||
- Common Docker issue (moby/moby#1385, docker/compose#1549)
|
||||
- Python requires `-u` flag for unbuffered output in Docker
|
||||
- Different base images have different buffering behavior
|
||||
|
||||
### 2. **MCP SDK Server Connection Issue**
|
||||
|
||||
The server might not be properly completing the connection handshake:
|
||||
|
||||
```typescript
|
||||
const transport = new StdioServerTransport();
|
||||
await server.connect(transport); // This might not complete properly
|
||||
```
|
||||
|
||||
### 3. **Missing Initialize Handler**
|
||||
|
||||
While the MCP SDK should handle `initialize` automatically, there might be an issue with:
|
||||
- Handler registration order
|
||||
- Server capabilities configuration
|
||||
- Transport initialization timing
|
||||
|
||||
### 4. **Process Lifecycle Management**
|
||||
|
||||
The container might be:
|
||||
- Exiting too early
|
||||
- Not keeping the event loop alive
|
||||
- Missing proper signal handling
|
||||
|
||||
## Fixing Plan
|
||||
|
||||
### Phase 1: Immediate Fixes (High Priority)
|
||||
|
||||
#### 1.1 Force Stdout Flushing
|
||||
|
||||
**File:** `src/mcp/server-update.ts`
|
||||
|
||||
Add explicit stdout flushing after server connection:
|
||||
|
||||
```typescript
|
||||
async run(): Promise<void> {
|
||||
await this.ensureInitialized();
|
||||
|
||||
const transport = new StdioServerTransport();
|
||||
await this.server.connect(transport);
|
||||
|
||||
// Force flush stdout
|
||||
if (process.stdout.isTTY === false) {
|
||||
process.stdout.write('', () => {}); // Force flush
|
||||
}
|
||||
|
||||
logger.info('n8n Documentation MCP Server running on stdio transport');
|
||||
|
||||
// Keep process alive
|
||||
process.stdin.resume();
|
||||
}
|
||||
```
|
||||
|
||||
#### 1.2 Add TTY Support to Docker
|
||||
|
||||
**File:** `Dockerfile`
|
||||
|
||||
Add environment variable to detect Docker:
|
||||
|
||||
```dockerfile
|
||||
ENV IS_DOCKER=true
|
||||
ENV NODE_OPTIONS="--max-old-space-size=2048"
|
||||
```
|
||||
|
||||
**File:** Update Docker command in README
|
||||
|
||||
```json
|
||||
{
|
||||
"mcpServers": {
|
||||
"n8n-mcp": {
|
||||
"command": "docker",
|
||||
"args": [
|
||||
"run",
|
||||
"--rm",
|
||||
"-i",
|
||||
"-t", // Add TTY allocation
|
||||
"--init", // Proper signal handling
|
||||
"-e", "MCP_MODE=stdio",
|
||||
"-e", "LOG_LEVEL=error",
|
||||
"-e", "DISABLE_CONSOLE_OUTPUT=true",
|
||||
"ghcr.io/czlonkowski/n8n-mcp:latest"
|
||||
]
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Phase 2: Robust Fixes (Medium Priority)
|
||||
|
||||
#### 2.1 Implement Explicit Initialize Handler
|
||||
|
||||
**File:** `src/mcp/server-update.ts`
|
||||
|
||||
Add explicit initialize handler to ensure response:
|
||||
|
||||
```typescript
|
||||
import {
|
||||
InitializeRequestSchema,
|
||||
InitializeResult
|
||||
} from '@modelcontextprotocol/sdk/types.js';
|
||||
|
||||
private setupHandlers(): void {
|
||||
// Add explicit initialize handler
|
||||
this.server.setRequestHandler(InitializeRequestSchema, async (request) => {
|
||||
logger.debug('Handling initialize request', request);
|
||||
|
||||
const result: InitializeResult = {
|
||||
protocolVersion: "2024-11-05",
|
||||
capabilities: {
|
||||
tools: {}
|
||||
},
|
||||
serverInfo: {
|
||||
name: "n8n-documentation-mcp",
|
||||
version: "1.0.0"
|
||||
}
|
||||
};
|
||||
|
||||
// Force immediate flush
|
||||
if (process.stdout.isTTY === false) {
|
||||
process.stdout.write('', () => {});
|
||||
}
|
||||
|
||||
return result;
|
||||
});
|
||||
|
||||
// ... existing handlers
|
||||
}
|
||||
```
|
||||
|
||||
#### 2.2 Add Docker-Specific Stdio Handling
|
||||
|
||||
**File:** Create `src/utils/docker-stdio.ts`
|
||||
|
||||
```typescript
|
||||
export class DockerStdioTransport extends StdioServerTransport {
|
||||
constructor() {
|
||||
super();
|
||||
|
||||
// Disable buffering for Docker
|
||||
if (process.env.IS_DOCKER === 'true') {
|
||||
process.stdout.setDefaultEncoding('utf8');
|
||||
if (process.stdout._handle && process.stdout._handle.setBlocking) {
|
||||
process.stdout._handle.setBlocking(true);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
protected async writeMessage(message: string): Promise<void> {
|
||||
await super.writeMessage(message);
|
||||
|
||||
// Force flush in Docker
|
||||
if (process.env.IS_DOCKER === 'true') {
|
||||
process.stdout.write('', () => {});
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Phase 3: Alternative Approaches (Low Priority)
|
||||
|
||||
#### 3.1 Use Wrapper Script
|
||||
|
||||
Create a Node.js wrapper that ensures proper buffering:
|
||||
|
||||
**File:** `docker-entrypoint.js`
|
||||
|
||||
```javascript
|
||||
#!/usr/bin/env node
|
||||
|
||||
// Disable all buffering
|
||||
process.stdout._handle?.setBlocking?.(true);
|
||||
process.stdin.setRawMode?.(false);
|
||||
|
||||
// Import and run the actual server
|
||||
require('./dist/mcp/index.js');
|
||||
```
|
||||
|
||||
#### 3.2 Switch to HTTP Transport for Docker
|
||||
|
||||
Consider using HTTP transport instead of stdio for Docker deployments, as it doesn't have buffering issues.
|
||||
|
||||
## Testing Plan
|
||||
|
||||
1. **Local Testing:**
|
||||
```bash
|
||||
# Test with Docker TTY
|
||||
docker run -it --rm ghcr.io/czlonkowski/n8n-mcp:latest
|
||||
|
||||
# Test initialize response
|
||||
echo '{"jsonrpc":"2.0","method":"initialize","params":{"protocolVersion":"2024-11-05"},"id":1}' | \
|
||||
docker run -i --rm ghcr.io/czlonkowski/n8n-mcp:latest
|
||||
```
|
||||
|
||||
2. **Claude Desktop Testing:**
|
||||
- Apply fixes incrementally
|
||||
- Test with each configuration change
|
||||
- Monitor Claude Desktop logs
|
||||
|
||||
3. **Debug Output:**
|
||||
Add temporary debug logging to stderr:
|
||||
```typescript
|
||||
console.error('DEBUG: Received initialize');
|
||||
console.error('DEBUG: Sending response');
|
||||
```
|
||||
|
||||
## Implementation Priority
|
||||
|
||||
1. **Immediate:** Add `-t` flag to Docker command (no code changes)
|
||||
2. **High:** Force stdout flushing in server code
|
||||
3. **Medium:** Add explicit initialize handler
|
||||
4. **Low:** Create Docker-specific transport class
|
||||
|
||||
## Success Criteria
|
||||
|
||||
- Claude Desktop connects without timeout
|
||||
- No "Shutting down..." message in JSON stream
|
||||
- Tools are accessible after connection
|
||||
- Connection remains stable
|
||||
|
||||
## References
|
||||
|
||||
- [Docker stdout buffering issue](https://github.com/moby/moby/issues/1385)
|
||||
- [MCP TypeScript SDK](https://github.com/modelcontextprotocol/typescript-sdk)
|
||||
- [Python unbuffered mode in Docker](https://stackoverflow.com/questions/39486327/stdout-being-buffered-in-docker-container)
|
||||
- [MCP initialization timeout issues](https://github.com/modelcontextprotocol/servers/issues/57)
|
||||
@@ -1,132 +0,0 @@
|
||||
# HTTP Server Final Fix Documentation
|
||||
|
||||
## Problem Summary
|
||||
|
||||
The n8n-MCP HTTP server experienced two critical issues:
|
||||
|
||||
1. **"stream is not readable" error** - Caused by Express.json() middleware consuming the request stream
|
||||
2. **"Server not initialized" error** - Caused by StreamableHTTPServerTransport's internal state management
|
||||
|
||||
## Solution Overview
|
||||
|
||||
We implemented a two-phase fix:
|
||||
|
||||
### Phase 1: Stream Preservation (v2.3.2)
|
||||
- Removed global `express.json()` middleware
|
||||
- Allowed StreamableHTTPServerTransport to read raw request stream
|
||||
- This fixed the "stream is not readable" error but revealed the initialization issue
|
||||
|
||||
### Phase 2: Direct JSON-RPC Implementation
|
||||
- Created `http-server-fixed.ts` that bypasses StreamableHTTPServerTransport
|
||||
- Implements JSON-RPC protocol directly
|
||||
- Handles MCP methods: initialize, tools/list, tools/call
|
||||
- Maintains full protocol compatibility
|
||||
|
||||
## Implementation Details
|
||||
|
||||
### The Fixed Server (`http-server-fixed.ts`)
|
||||
|
||||
```javascript
|
||||
// Instead of using StreamableHTTPServerTransport
|
||||
const transport = new StreamableHTTPServerTransport({...});
|
||||
|
||||
// We handle JSON-RPC directly
|
||||
req.on('data', chunk => body += chunk);
|
||||
req.on('end', () => {
|
||||
const jsonRpcRequest = JSON.parse(body);
|
||||
// Handle request based on method
|
||||
});
|
||||
```
|
||||
|
||||
### Key Features:
|
||||
1. **No body parsing middleware** - Preserves raw stream
|
||||
2. **Direct JSON-RPC handling** - Avoids transport initialization issues
|
||||
3. **Persistent MCP server** - Single instance handles all requests
|
||||
4. **Manual method routing** - Implements initialize, tools/list, tools/call
|
||||
|
||||
### Supported Methods:
|
||||
- `initialize` - Returns server capabilities
|
||||
- `tools/list` - Returns available tools
|
||||
- `tools/call` - Executes specific tools
|
||||
|
||||
## Usage
|
||||
|
||||
### Environment Variables:
|
||||
- `MCP_MODE=http` - Enable HTTP mode
|
||||
- `USE_FIXED_HTTP=true` - Use the fixed implementation
|
||||
- `AUTH_TOKEN` - Authentication token (32+ chars recommended)
|
||||
|
||||
### Starting the Server:
|
||||
```bash
|
||||
# Local development
|
||||
MCP_MODE=http USE_FIXED_HTTP=true AUTH_TOKEN=your-secure-token npm start
|
||||
|
||||
# Docker
|
||||
docker run -d \
|
||||
-e MCP_MODE=http \
|
||||
-e USE_FIXED_HTTP=true \
|
||||
-e AUTH_TOKEN=your-secure-token \
|
||||
-p 3000:3000 \
|
||||
ghcr.io/czlonkowski/n8n-mcp:latest
|
||||
```
|
||||
|
||||
### Testing:
|
||||
```bash
|
||||
# Initialize
|
||||
curl -X POST http://localhost:3000/mcp \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer your-secure-token" \
|
||||
-d '{"jsonrpc":"2.0","method":"initialize","params":{},"id":1}'
|
||||
|
||||
# List tools
|
||||
curl -X POST http://localhost:3000/mcp \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer your-secure-token" \
|
||||
-d '{"jsonrpc":"2.0","method":"tools/list","id":2}'
|
||||
|
||||
# Call tool
|
||||
curl -X POST http://localhost:3000/mcp \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer your-secure-token" \
|
||||
-d '{"jsonrpc":"2.0","method":"tools/call","params":{"name":"get_node_info","arguments":{"nodeType":"httpRequest"}},"id":3}'
|
||||
```
|
||||
|
||||
## Technical Details
|
||||
|
||||
### Why StreamableHTTPServerTransport Failed
|
||||
|
||||
1. **Stateful Design**: The transport expects to maintain session state
|
||||
2. **Initialization Sequence**: Requires specific handshake before accepting requests
|
||||
3. **Stream Consumption**: Conflicts with Express middleware patterns
|
||||
4. **Version Issues**: Despite fixes in v1.10.1+, issues persist with stateless usage
|
||||
|
||||
### Why Direct Implementation Works
|
||||
|
||||
1. **No Middleware Conflicts**: We control the entire request lifecycle
|
||||
2. **No State Requirements**: Each request is handled independently
|
||||
3. **Protocol Compliance**: Still implements standard JSON-RPC 2.0
|
||||
4. **Simplicity**: Fewer moving parts mean fewer failure points
|
||||
|
||||
## Performance Characteristics
|
||||
|
||||
- **Memory Usage**: ~10-20MB base, grows with database queries
|
||||
- **Response Time**: <50ms for most operations
|
||||
- **Concurrent Requests**: Handles multiple requests without session conflicts
|
||||
- **Database Access**: Single persistent connection, no connection overhead
|
||||
|
||||
## Future Considerations
|
||||
|
||||
1. **Streaming Support**: Current implementation doesn't support SSE streaming
|
||||
2. **Session Management**: Could add optional session tracking if needed
|
||||
3. **Protocol Extensions**: Easy to add new JSON-RPC methods
|
||||
4. **Migration Path**: Can switch back to StreamableHTTPServerTransport when fixed
|
||||
|
||||
## Conclusion
|
||||
|
||||
The fixed implementation provides a stable, production-ready HTTP server for n8n-MCP that:
|
||||
- Works reliably without stream errors
|
||||
- Maintains MCP protocol compatibility
|
||||
- Simplifies debugging and maintenance
|
||||
- Provides better performance characteristics
|
||||
|
||||
This solution demonstrates that sometimes bypassing problematic libraries and implementing core functionality directly is the most pragmatic approach.
|
||||
@@ -1,32 +0,0 @@
|
||||
# Docker Fix for v2.6.2 - Missing Runtime Dependencies
|
||||
|
||||
## Issue
|
||||
After v2.6.0, the Docker image started failing with:
|
||||
```
|
||||
Error: Cannot find module 'axios'
|
||||
```
|
||||
|
||||
## Root Cause
|
||||
In v2.6.0, we added n8n management tools that require:
|
||||
- `axios` - For HTTP API calls to n8n instances
|
||||
- `zod` - For workflow validation schemas
|
||||
|
||||
However, our ultra-optimized Docker image uses `package.runtime.json` which didn't include these new dependencies.
|
||||
|
||||
## Fix
|
||||
Updated `package.runtime.json` to include:
|
||||
```json
|
||||
"axios": "^1.10.0",
|
||||
"zod": "^3.25.32"
|
||||
```
|
||||
|
||||
## Impact
|
||||
- Docker image size increases slightly (~5MB) but remains much smaller than full n8n dependencies
|
||||
- No changes needed to Dockerfile itself - just the runtime package list
|
||||
- Users need to pull the latest Docker image after rebuild
|
||||
|
||||
## Prevention
|
||||
When adding new features that require additional npm packages, always check:
|
||||
1. Is the package needed at runtime?
|
||||
2. If yes, add it to `package.runtime.json` for Docker builds
|
||||
3. Test the Docker image before release
|
||||
@@ -1,135 +0,0 @@
|
||||
# Phase 2 Improvements - v2.4.2
|
||||
|
||||
## 🎯 Overview
|
||||
|
||||
Following the successful implementation of operation-aware validation, Phase 2 adds professional-grade features that make the validation system even more powerful and flexible.
|
||||
|
||||
## ✅ Implemented Features
|
||||
|
||||
### 1. **Validation Profiles** 🎨
|
||||
|
||||
Different validation levels for different use cases:
|
||||
|
||||
```typescript
|
||||
validate_node_operation({
|
||||
nodeType: "nodes-base.slack",
|
||||
config: { ... },
|
||||
profile: "minimal" // or "runtime", "ai-friendly", "strict"
|
||||
})
|
||||
```
|
||||
|
||||
#### Available Profiles:
|
||||
|
||||
| Profile | Purpose | What it checks |
|
||||
|---------|---------|----------------|
|
||||
| **minimal** | Quick check | Only missing required fields |
|
||||
| **runtime** | Pre-execution | Critical errors + security warnings |
|
||||
| **ai-friendly** | Balanced (default) | Errors + helpful warnings |
|
||||
| **strict** | Code review | Everything + best practices |
|
||||
|
||||
### 2. **New Node Validators** 🔧
|
||||
|
||||
Added comprehensive validators for commonly used nodes:
|
||||
|
||||
#### **Webhook Validator**
|
||||
- Path format validation (no spaces, special chars)
|
||||
- Response mode checks
|
||||
- HTTP method validation
|
||||
- Authentication warnings
|
||||
|
||||
#### **PostgreSQL Validator**
|
||||
- SQL injection detection
|
||||
- DELETE/UPDATE without WHERE warnings
|
||||
- Operation-specific validation (insert, update, delete, execute)
|
||||
- Query safety checks
|
||||
|
||||
#### **MySQL Validator**
|
||||
- Similar to PostgreSQL
|
||||
- MySQL-specific syntax checks
|
||||
- Timezone configuration suggestions
|
||||
|
||||
### 3. **validate_node_minimal Tool** ⚡
|
||||
|
||||
Lightning-fast validation for just required fields:
|
||||
|
||||
```json
|
||||
{
|
||||
"nodeType": "nodes-base.slack",
|
||||
"displayName": "Slack",
|
||||
"valid": false,
|
||||
"missingRequiredFields": ["Channel"]
|
||||
}
|
||||
```
|
||||
|
||||
- No warnings
|
||||
- No suggestions
|
||||
- No examples
|
||||
- Just missing required fields
|
||||
- Perfect for quick checks
|
||||
|
||||
### 4. **SQL Safety Features** 🛡️
|
||||
|
||||
Comprehensive SQL query validation:
|
||||
- Detects template expressions that could be vulnerable
|
||||
- Warns about DELETE/UPDATE without WHERE
|
||||
- Catches dangerous operations (DROP, TRUNCATE)
|
||||
- Suggests parameterized queries
|
||||
- Database-specific checks (PostgreSQL $$ quotes, MySQL backticks)
|
||||
|
||||
## 📊 Impact
|
||||
|
||||
### Before Phase 2:
|
||||
- Single validation mode
|
||||
- Limited node coverage (4 nodes)
|
||||
- No SQL safety checks
|
||||
- Fixed validation behavior
|
||||
|
||||
### After Phase 2:
|
||||
- 4 validation profiles for different needs
|
||||
- 7+ nodes with specific validators
|
||||
- Comprehensive SQL injection prevention
|
||||
- Flexible validation based on use case
|
||||
- Ultra-fast minimal validation option
|
||||
|
||||
## 🚀 Usage Examples
|
||||
|
||||
### Using Validation Profiles:
|
||||
```javascript
|
||||
// Quick check - just required fields
|
||||
validate_node_minimal({
|
||||
nodeType: "nodes-base.webhook",
|
||||
config: { responseMode: "lastNode" }
|
||||
})
|
||||
// Result: Missing required field "path"
|
||||
|
||||
// Pre-execution validation
|
||||
validate_node_operation({
|
||||
nodeType: "nodes-base.postgres",
|
||||
config: {
|
||||
operation: "execute",
|
||||
query: "DELETE FROM users WHERE id = ${userId}"
|
||||
},
|
||||
profile: "runtime"
|
||||
})
|
||||
// Result: SQL injection warning
|
||||
|
||||
// Strict validation for code review
|
||||
validate_node_operation({
|
||||
nodeType: "nodes-base.slack",
|
||||
config: { /* valid config */ },
|
||||
profile: "strict"
|
||||
})
|
||||
// Result: Suggestions for best practices
|
||||
```
|
||||
|
||||
## 🎉 Summary
|
||||
|
||||
Phase 2 transforms the validation system from a simple checker into a comprehensive validation framework:
|
||||
|
||||
1. **Flexibility** - Choose validation level based on your needs
|
||||
2. **Safety** - SQL injection detection and prevention
|
||||
3. **Speed** - Minimal validation for quick checks
|
||||
4. **Coverage** - More nodes with specific validation logic
|
||||
5. **Intelligence** - Context-aware suggestions and warnings
|
||||
|
||||
The validation system now provides professional-grade safety and flexibility while maintaining the simplicity that makes it useful for AI agents.
|
||||
@@ -1,42 +0,0 @@
|
||||
#!/bin/bash
|
||||
|
||||
# n8n-MCP Server Wrapper Script for Node v20.17.0
|
||||
# This ensures the server runs with the correct Node version
|
||||
|
||||
# Get the directory where this script is located
|
||||
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
|
||||
|
||||
# Change to the script directory
|
||||
cd "$SCRIPT_DIR"
|
||||
|
||||
# Use Node v20.17.0 specifically (what Claude Desktop uses)
|
||||
# Update this path to match your nvm installation
|
||||
export PATH="$HOME/.nvm/versions/node/v20.17.0/bin:$PATH"
|
||||
|
||||
# Verify we're using the right version
|
||||
NODE_VERSION=$(node --version)
|
||||
if [ "$NODE_VERSION" != "v20.17.0" ]; then
|
||||
echo "Error: Wrong Node.js version. Expected v20.17.0, got $NODE_VERSION" >&2
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Check if node_modules exists
|
||||
if [ ! -d "node_modules" ]; then
|
||||
echo "Error: node_modules not found. Please run 'npm install' first." >&2
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Check if database exists
|
||||
if [ ! -f "data/nodes.db" ]; then
|
||||
echo "Error: Database not found. Please run 'npm run rebuild' first." >&2
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Check if dist directory exists
|
||||
if [ ! -d "dist" ]; then
|
||||
echo "Error: dist directory not found. Please run 'npm run build' first." >&2
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Run the MCP server
|
||||
exec node "$SCRIPT_DIR/dist/mcp/index.js"
|
||||
60
release-notes-v2.7.0.md
Normal file
60
release-notes-v2.7.0.md
Normal file
@@ -0,0 +1,60 @@
|
||||
# n8n-MCP v2.7.0 Release Notes
|
||||
|
||||
## 🎉 What's New
|
||||
|
||||
### 🔧 File Refactoring & Version Management
|
||||
- **Renamed core MCP files** to remove unnecessary suffixes for cleaner codebase:
|
||||
- `tools-update.ts` → `tools.ts`
|
||||
- `server-update.ts` → `server.ts`
|
||||
- `http-server-fixed.ts` → `http-server.ts`
|
||||
- **Fixed version management** - Now reads from package.json as single source of truth (fixes #5)
|
||||
- **Updated imports** across 21+ files to use the new file names
|
||||
|
||||
### 🔍 New Diagnostic Tool
|
||||
- **Added `n8n_diagnostic` tool** - Helps troubleshoot why n8n management tools might not be appearing
|
||||
- Shows environment variable status, API connectivity, and tool availability
|
||||
- Provides step-by-step troubleshooting guidance
|
||||
- Includes verbose mode for additional debug information
|
||||
|
||||
### 🧹 Code Cleanup
|
||||
- Removed legacy HTTP server implementation with known issues
|
||||
- Removed unused legacy API client
|
||||
- Added version utility for consistent version handling
|
||||
- Added script to sync runtime package version
|
||||
|
||||
## 📦 Installation
|
||||
|
||||
### Docker (Recommended)
|
||||
```bash
|
||||
docker pull ghcr.io/czlonkowski/n8n-mcp:2.7.0
|
||||
```
|
||||
|
||||
### Claude Desktop
|
||||
Update your configuration to use the latest version:
|
||||
```json
|
||||
{
|
||||
"mcpServers": {
|
||||
"n8n-mcp": {
|
||||
"command": "docker",
|
||||
"args": ["run", "-i", "--rm", "ghcr.io/czlonkowski/n8n-mcp:2.7.0"]
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## 🐛 Bug Fixes
|
||||
- Fixed version mismatch where version was hardcoded as 2.4.1 instead of reading from package.json
|
||||
- Improved error messages for better debugging
|
||||
|
||||
## 📚 Documentation Updates
|
||||
- Condensed version history in CLAUDE.md
|
||||
- Updated documentation structure in README.md
|
||||
- Removed outdated documentation files
|
||||
- Added n8n_diagnostic tool to documentation
|
||||
|
||||
## 🙏 Acknowledgments
|
||||
Thanks to all contributors and users who reported issues!
|
||||
|
||||
---
|
||||
|
||||
**Full Changelog**: https://github.com/czlonkowski/n8n-mcp/blob/main/CHANGELOG.md
|
||||
@@ -1,88 +0,0 @@
|
||||
#!/usr/bin/env node
|
||||
/**
|
||||
* Direct test of the server functionality without MCP protocol
|
||||
*/
|
||||
|
||||
const { N8NDocumentationMCPServer } = require('../dist/mcp/server');
|
||||
|
||||
async function testDirect() {
|
||||
console.log('🧪 Direct server test\n');
|
||||
|
||||
try {
|
||||
// Initialize server
|
||||
console.log('Initializing server...');
|
||||
const server = new N8NDocumentationMCPServer();
|
||||
|
||||
// Wait for initialization
|
||||
await new Promise(resolve => setTimeout(resolve, 1000));
|
||||
|
||||
console.log('Server initialized successfully\n');
|
||||
|
||||
// Test get_node_essentials
|
||||
console.log('Testing get_node_essentials...');
|
||||
try {
|
||||
const result = await server.executeTool('get_node_essentials', {
|
||||
nodeType: 'nodes-base.httpRequest'
|
||||
});
|
||||
|
||||
console.log('✅ Success!');
|
||||
console.log('Result type:', typeof result);
|
||||
console.log('Result keys:', Object.keys(result || {}));
|
||||
console.log('Node type:', result?.nodeType);
|
||||
console.log('Required props:', result?.requiredProperties?.length || 0);
|
||||
console.log('Common props:', result?.commonProperties?.length || 0);
|
||||
console.log('Has examples:', !!result?.examples);
|
||||
|
||||
// Check sizes
|
||||
const size = JSON.stringify(result).length;
|
||||
console.log(`\nResponse size: ${(size / 1024).toFixed(1)} KB`);
|
||||
|
||||
} catch (error) {
|
||||
console.error('❌ Error executing get_node_essentials:', error);
|
||||
console.error('Error stack:', error.stack);
|
||||
}
|
||||
|
||||
// Test search_node_properties
|
||||
console.log('\n\nTesting search_node_properties...');
|
||||
try {
|
||||
const result = await server.executeTool('search_node_properties', {
|
||||
nodeType: 'nodes-base.httpRequest',
|
||||
query: 'auth'
|
||||
});
|
||||
|
||||
console.log('✅ Success!');
|
||||
console.log('Matches found:', result?.totalMatches || 0);
|
||||
console.log('First match:', result?.matches?.[0]?.name);
|
||||
|
||||
} catch (error) {
|
||||
console.error('❌ Error executing search_node_properties:', error);
|
||||
}
|
||||
|
||||
// Test get_node_info for comparison
|
||||
console.log('\n\nTesting get_node_info for comparison...');
|
||||
try {
|
||||
const result = await server.executeTool('get_node_info', {
|
||||
nodeType: 'nodes-base.httpRequest'
|
||||
});
|
||||
|
||||
const size = JSON.stringify(result).length;
|
||||
console.log('✅ Success!');
|
||||
console.log(`Full node info size: ${(size / 1024).toFixed(1)} KB`);
|
||||
console.log('Properties count:', result?.properties?.length || 0);
|
||||
|
||||
} catch (error) {
|
||||
console.error('❌ Error executing get_node_info:', error);
|
||||
}
|
||||
|
||||
} catch (error) {
|
||||
console.error('\n❌ Fatal error:', error);
|
||||
console.error('Stack:', error.stack);
|
||||
process.exit(1);
|
||||
}
|
||||
|
||||
console.log('\n✨ Direct test completed');
|
||||
process.exit(0);
|
||||
}
|
||||
|
||||
// Run the test
|
||||
testDirect().catch(console.error);
|
||||
@@ -1,225 +0,0 @@
|
||||
#!/usr/bin/env node
|
||||
/**
|
||||
* Simple test script for validating the essentials implementation
|
||||
* This version runs the MCP server as a subprocess to test real behavior
|
||||
*/
|
||||
|
||||
const { spawn } = require('child_process');
|
||||
const path = require('path');
|
||||
|
||||
const colors = {
|
||||
reset: '\x1b[0m',
|
||||
bright: '\x1b[1m',
|
||||
green: '\x1b[32m',
|
||||
red: '\x1b[31m',
|
||||
yellow: '\x1b[33m',
|
||||
blue: '\x1b[34m',
|
||||
cyan: '\x1b[36m'
|
||||
};
|
||||
|
||||
function log(message, color = colors.reset) {
|
||||
console.log(`${color}${message}${colors.reset}`);
|
||||
}
|
||||
|
||||
function runMCPRequest(request) {
|
||||
return new Promise((resolve, reject) => {
|
||||
const mcp = spawn('npm', ['start'], {
|
||||
cwd: path.join(__dirname, '..'),
|
||||
stdio: ['pipe', 'pipe', 'pipe'],
|
||||
env: { ...process.env, NODE_ENV: 'production' }
|
||||
});
|
||||
|
||||
let output = '';
|
||||
let error = '';
|
||||
let timeout;
|
||||
|
||||
// Set timeout
|
||||
timeout = setTimeout(() => {
|
||||
mcp.kill();
|
||||
reject(new Error('Request timed out after 10 seconds'));
|
||||
}, 10000);
|
||||
|
||||
mcp.stdout.on('data', (data) => {
|
||||
output += data.toString();
|
||||
});
|
||||
|
||||
mcp.stderr.on('data', (data) => {
|
||||
error += data.toString();
|
||||
});
|
||||
|
||||
mcp.on('close', (code) => {
|
||||
clearTimeout(timeout);
|
||||
|
||||
if (code !== 0) {
|
||||
reject(new Error(`Process exited with code ${code}: ${error}`));
|
||||
return;
|
||||
}
|
||||
|
||||
try {
|
||||
// Parse JSON-RPC response
|
||||
const lines = output.split('\n');
|
||||
for (const line of lines) {
|
||||
if (line.trim() && line.includes('"jsonrpc"')) {
|
||||
const response = JSON.parse(line);
|
||||
if (response.result) {
|
||||
const content = response.result.content[0].text;
|
||||
resolve(JSON.parse(content));
|
||||
return;
|
||||
} else if (response.error) {
|
||||
reject(new Error(response.error.message));
|
||||
return;
|
||||
}
|
||||
}
|
||||
}
|
||||
reject(new Error('No valid response found in output:\n' + output));
|
||||
} catch (err) {
|
||||
reject(new Error(`Failed to parse response: ${err.message}\nOutput: ${output}`));
|
||||
}
|
||||
});
|
||||
|
||||
// Send request
|
||||
mcp.stdin.write(JSON.stringify(request) + '\n');
|
||||
mcp.stdin.end();
|
||||
});
|
||||
}
|
||||
|
||||
async function testEssentials() {
|
||||
log('\n🚀 Testing n8n MCP Essentials Implementation\n', colors.bright + colors.cyan);
|
||||
|
||||
try {
|
||||
// Test 1: Get node essentials
|
||||
log('1️⃣ Testing get_node_essentials for HTTP Request...', colors.yellow);
|
||||
const essentialsRequest = {
|
||||
jsonrpc: '2.0',
|
||||
method: 'tools/call',
|
||||
params: {
|
||||
name: 'get_node_essentials',
|
||||
arguments: {
|
||||
nodeType: 'nodes-base.httpRequest'
|
||||
}
|
||||
},
|
||||
id: 1
|
||||
};
|
||||
|
||||
const essentials = await runMCPRequest(essentialsRequest);
|
||||
|
||||
log('✅ Success! Got essentials:', colors.green);
|
||||
log(` Node Type: ${essentials.nodeType}`);
|
||||
log(` Display Name: ${essentials.displayName}`);
|
||||
log(` Required properties: ${essentials.requiredProperties?.map(p => p.name).join(', ') || 'None'}`);
|
||||
log(` Common properties: ${essentials.commonProperties?.map(p => p.name).join(', ') || 'None'}`);
|
||||
log(` Examples: ${Object.keys(essentials.examples || {}).join(', ')}`);
|
||||
|
||||
const essentialsSize = JSON.stringify(essentials).length;
|
||||
log(` Response size: ${(essentialsSize / 1024).toFixed(1)} KB`, colors.green);
|
||||
|
||||
// Test 2: Compare with full node info
|
||||
log('\n2️⃣ Getting full node info for comparison...', colors.yellow);
|
||||
const fullInfoRequest = {
|
||||
jsonrpc: '2.0',
|
||||
method: 'tools/call',
|
||||
params: {
|
||||
name: 'get_node_info',
|
||||
arguments: {
|
||||
nodeType: 'nodes-base.httpRequest'
|
||||
}
|
||||
},
|
||||
id: 2
|
||||
};
|
||||
|
||||
const fullInfo = await runMCPRequest(fullInfoRequest);
|
||||
const fullSize = JSON.stringify(fullInfo).length;
|
||||
|
||||
log('✅ Got full node info:', colors.green);
|
||||
log(` Properties count: ${fullInfo.properties?.length || 0}`);
|
||||
log(` Response size: ${(fullSize / 1024).toFixed(1)} KB`);
|
||||
|
||||
const reduction = ((fullSize - essentialsSize) / fullSize * 100).toFixed(1);
|
||||
log(`\n📊 Size Comparison:`, colors.bright);
|
||||
log(` Full response: ${(fullSize / 1024).toFixed(1)} KB`);
|
||||
log(` Essential response: ${(essentialsSize / 1024).toFixed(1)} KB`);
|
||||
log(` Size reduction: ${reduction}% 🎉`, colors.bright + colors.green);
|
||||
|
||||
// Test 3: Search properties
|
||||
log('\n3️⃣ Testing search_node_properties...', colors.yellow);
|
||||
const searchRequest = {
|
||||
jsonrpc: '2.0',
|
||||
method: 'tools/call',
|
||||
params: {
|
||||
name: 'search_node_properties',
|
||||
arguments: {
|
||||
nodeType: 'nodes-base.httpRequest',
|
||||
query: 'auth'
|
||||
}
|
||||
},
|
||||
id: 3
|
||||
};
|
||||
|
||||
const searchResults = await runMCPRequest(searchRequest);
|
||||
log('✅ Search completed:', colors.green);
|
||||
log(` Query: "${searchResults.query}"`);
|
||||
log(` Matches found: ${searchResults.totalMatches}`);
|
||||
if (searchResults.matches && searchResults.matches.length > 0) {
|
||||
log(' Top matches:');
|
||||
searchResults.matches.slice(0, 3).forEach(match => {
|
||||
log(` - ${match.name}: ${match.description || 'No description'}`);
|
||||
});
|
||||
}
|
||||
|
||||
// Summary
|
||||
log('\n✨ All tests passed successfully!', colors.bright + colors.green);
|
||||
log('\n📋 Summary:', colors.bright);
|
||||
log(` - get_node_essentials works correctly`);
|
||||
log(` - Size reduction achieved: ${reduction}%`);
|
||||
log(` - Property search functioning`);
|
||||
log(` - Examples included in response`);
|
||||
|
||||
// Test more nodes
|
||||
log('\n4️⃣ Testing additional nodes...', colors.yellow);
|
||||
const additionalNodes = ['nodes-base.webhook', 'nodes-base.code', 'nodes-base.postgres'];
|
||||
const results = [];
|
||||
|
||||
for (const nodeType of additionalNodes) {
|
||||
try {
|
||||
const req = {
|
||||
jsonrpc: '2.0',
|
||||
method: 'tools/call',
|
||||
params: {
|
||||
name: 'get_node_essentials',
|
||||
arguments: { nodeType }
|
||||
},
|
||||
id: Math.random()
|
||||
};
|
||||
|
||||
const result = await runMCPRequest(req);
|
||||
const size = JSON.stringify(result).length;
|
||||
results.push({
|
||||
nodeType,
|
||||
success: true,
|
||||
propCount: (result.requiredProperties?.length || 0) + (result.commonProperties?.length || 0),
|
||||
size: (size / 1024).toFixed(1)
|
||||
});
|
||||
log(` ✅ ${nodeType}: ${results[results.length - 1].propCount} properties, ${results[results.length - 1].size} KB`);
|
||||
} catch (error) {
|
||||
results.push({ nodeType, success: false, error: error.message });
|
||||
log(` ❌ ${nodeType}: ${error.message}`, colors.red);
|
||||
}
|
||||
}
|
||||
|
||||
log('\n🎯 Implementation validated successfully!', colors.bright + colors.green);
|
||||
|
||||
} catch (error) {
|
||||
log(`\n❌ Test failed: ${error.message}`, colors.red);
|
||||
if (error.stack) {
|
||||
log('Stack trace:', colors.red);
|
||||
log(error.stack, colors.red);
|
||||
}
|
||||
process.exit(1);
|
||||
}
|
||||
}
|
||||
|
||||
// Run the test
|
||||
testEssentials().catch(error => {
|
||||
console.error('Unhandled error:', error);
|
||||
process.exit(1);
|
||||
});
|
||||
@@ -1,101 +0,0 @@
|
||||
#!/usr/bin/env node
|
||||
/**
|
||||
* Final validation test
|
||||
*/
|
||||
|
||||
const { N8NDocumentationMCPServer } = require('../dist/mcp/server');
|
||||
|
||||
const colors = {
|
||||
green: '\x1b[32m',
|
||||
red: '\x1b[31m',
|
||||
yellow: '\x1b[33m',
|
||||
cyan: '\x1b[36m',
|
||||
reset: '\x1b[0m',
|
||||
bright: '\x1b[1m'
|
||||
};
|
||||
|
||||
async function testNode(server, nodeType) {
|
||||
console.log(`\n${colors.cyan}Testing ${nodeType}...${colors.reset}`);
|
||||
|
||||
try {
|
||||
// Get essentials
|
||||
const essentials = await server.executeTool('get_node_essentials', { nodeType });
|
||||
|
||||
// Get full info for comparison
|
||||
const fullInfo = await server.executeTool('get_node_info', { nodeType });
|
||||
|
||||
const essentialSize = JSON.stringify(essentials).length;
|
||||
const fullSize = JSON.stringify(fullInfo).length;
|
||||
const reduction = ((fullSize - essentialSize) / fullSize * 100).toFixed(1);
|
||||
|
||||
console.log(`✅ ${nodeType}:`);
|
||||
console.log(` Required: ${essentials.requiredProperties?.map(p => p.name).join(', ') || 'none'}`);
|
||||
console.log(` Common: ${essentials.commonProperties?.map(p => p.name).join(', ') || 'none'}`);
|
||||
console.log(` Size: ${(fullSize / 1024).toFixed(1)}KB → ${(essentialSize / 1024).toFixed(1)}KB (${reduction}% reduction)`);
|
||||
console.log(` Examples: ${Object.keys(essentials.examples || {}).length}`);
|
||||
|
||||
return { success: true, reduction: parseFloat(reduction) };
|
||||
} catch (error) {
|
||||
console.log(`❌ ${nodeType}: ${error.message}`);
|
||||
return { success: false };
|
||||
}
|
||||
}
|
||||
|
||||
async function main() {
|
||||
console.log(`${colors.bright}${colors.cyan}🎯 Final Validation Test${colors.reset}\n`);
|
||||
|
||||
try {
|
||||
const server = new N8NDocumentationMCPServer();
|
||||
await new Promise(resolve => setTimeout(resolve, 500));
|
||||
|
||||
const nodes = [
|
||||
'nodes-base.httpRequest',
|
||||
'nodes-base.webhook',
|
||||
'nodes-base.code',
|
||||
'nodes-base.set',
|
||||
'nodes-base.postgres',
|
||||
'nodes-base.slack',
|
||||
'nodes-base.openAi',
|
||||
'nodes-base.googleSheets'
|
||||
];
|
||||
|
||||
const results = [];
|
||||
|
||||
for (const node of nodes) {
|
||||
const result = await testNode(server, node);
|
||||
results.push(result);
|
||||
}
|
||||
|
||||
// Summary
|
||||
console.log(`\n${colors.bright}📊 Summary${colors.reset}`);
|
||||
const successful = results.filter(r => r.success);
|
||||
const avgReduction = successful.reduce((sum, r) => sum + r.reduction, 0) / successful.length;
|
||||
|
||||
console.log(`✅ Successful: ${successful.length}/${results.length}`);
|
||||
console.log(`📉 Average size reduction: ${avgReduction.toFixed(1)}%`);
|
||||
|
||||
// Test property search
|
||||
console.log(`\n${colors.bright}🔍 Testing Property Search${colors.reset}`);
|
||||
const searchResult = await server.executeTool('search_node_properties', {
|
||||
nodeType: 'nodes-base.httpRequest',
|
||||
query: 'auth'
|
||||
});
|
||||
console.log(`✅ Found ${searchResult.totalMatches} properties matching "auth"`);
|
||||
searchResult.matches.slice(0, 3).forEach(m => {
|
||||
console.log(` - ${m.name}: ${m.type}`);
|
||||
});
|
||||
|
||||
console.log(`\n${colors.bright}${colors.green}✨ Implementation validated successfully!${colors.reset}`);
|
||||
console.log('\nThe MCP essentials tools are working correctly with:');
|
||||
console.log(`- ${avgReduction.toFixed(1)}% average size reduction`);
|
||||
console.log('- Property filtering working');
|
||||
console.log('- Examples included');
|
||||
console.log('- Search functionality operational');
|
||||
|
||||
} catch (error) {
|
||||
console.error(`${colors.red}Fatal error: ${error.message}${colors.reset}`);
|
||||
process.exit(1);
|
||||
}
|
||||
}
|
||||
|
||||
main().catch(console.error);
|
||||
@@ -1,143 +0,0 @@
|
||||
-- Enhanced n8n Node Documentation Database Schema
|
||||
-- This schema stores comprehensive node information including source code,
|
||||
-- documentation, operations, API methods, examples, and metadata
|
||||
|
||||
-- Main nodes table with rich documentation
|
||||
CREATE TABLE IF NOT EXISTS nodes (
|
||||
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||
node_type TEXT UNIQUE NOT NULL,
|
||||
name TEXT NOT NULL,
|
||||
display_name TEXT,
|
||||
description TEXT,
|
||||
category TEXT,
|
||||
subcategory TEXT,
|
||||
icon TEXT,
|
||||
|
||||
-- Source code
|
||||
source_code TEXT NOT NULL,
|
||||
credential_code TEXT,
|
||||
code_hash TEXT NOT NULL,
|
||||
code_length INTEGER NOT NULL,
|
||||
|
||||
-- Documentation
|
||||
documentation_markdown TEXT,
|
||||
documentation_url TEXT,
|
||||
documentation_title TEXT,
|
||||
|
||||
-- Enhanced documentation fields (stored as JSON)
|
||||
operations TEXT,
|
||||
api_methods TEXT,
|
||||
documentation_examples TEXT,
|
||||
templates TEXT,
|
||||
related_resources TEXT,
|
||||
required_scopes TEXT,
|
||||
|
||||
-- Example usage
|
||||
example_workflow TEXT,
|
||||
example_parameters TEXT,
|
||||
properties_schema TEXT,
|
||||
|
||||
-- Metadata
|
||||
package_name TEXT NOT NULL,
|
||||
version TEXT,
|
||||
codex_data TEXT,
|
||||
aliases TEXT,
|
||||
|
||||
-- Flags
|
||||
has_credentials INTEGER DEFAULT 0,
|
||||
is_trigger INTEGER DEFAULT 0,
|
||||
is_webhook INTEGER DEFAULT 0,
|
||||
|
||||
-- Timestamps
|
||||
extracted_at DATETIME DEFAULT CURRENT_TIMESTAMP,
|
||||
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
|
||||
);
|
||||
|
||||
-- Indexes for performance
|
||||
CREATE INDEX IF NOT EXISTS idx_nodes_package_name ON nodes(package_name);
|
||||
CREATE INDEX IF NOT EXISTS idx_nodes_category ON nodes(category);
|
||||
CREATE INDEX IF NOT EXISTS idx_nodes_code_hash ON nodes(code_hash);
|
||||
CREATE INDEX IF NOT EXISTS idx_nodes_name ON nodes(name);
|
||||
CREATE INDEX IF NOT EXISTS idx_nodes_is_trigger ON nodes(is_trigger);
|
||||
CREATE INDEX IF NOT EXISTS idx_nodes_has_credentials ON nodes(has_credentials);
|
||||
|
||||
-- Full Text Search table
|
||||
CREATE VIRTUAL TABLE IF NOT EXISTS nodes_fts USING fts5(
|
||||
node_type,
|
||||
name,
|
||||
display_name,
|
||||
description,
|
||||
category,
|
||||
documentation_markdown,
|
||||
aliases,
|
||||
content=nodes,
|
||||
content_rowid=id
|
||||
);
|
||||
|
||||
-- Triggers to keep FTS in sync
|
||||
CREATE TRIGGER IF NOT EXISTS nodes_ai AFTER INSERT ON nodes
|
||||
BEGIN
|
||||
INSERT INTO nodes_fts(rowid, node_type, name, display_name, description, category, documentation_markdown, aliases)
|
||||
VALUES (new.id, new.node_type, new.name, new.display_name, new.description, new.category, new.documentation_markdown, new.aliases);
|
||||
END;
|
||||
|
||||
CREATE TRIGGER IF NOT EXISTS nodes_ad AFTER DELETE ON nodes
|
||||
BEGIN
|
||||
DELETE FROM nodes_fts WHERE rowid = old.id;
|
||||
END;
|
||||
|
||||
CREATE TRIGGER IF NOT EXISTS nodes_au AFTER UPDATE ON nodes
|
||||
BEGIN
|
||||
DELETE FROM nodes_fts WHERE rowid = old.id;
|
||||
INSERT INTO nodes_fts(rowid, node_type, name, display_name, description, category, documentation_markdown, aliases)
|
||||
VALUES (new.id, new.node_type, new.name, new.display_name, new.description, new.category, new.documentation_markdown, new.aliases);
|
||||
END;
|
||||
|
||||
-- Documentation sources tracking
|
||||
CREATE TABLE IF NOT EXISTS documentation_sources (
|
||||
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||
source TEXT NOT NULL,
|
||||
commit_hash TEXT,
|
||||
fetched_at DATETIME DEFAULT CURRENT_TIMESTAMP
|
||||
);
|
||||
|
||||
-- Statistics tracking
|
||||
CREATE TABLE IF NOT EXISTS extraction_stats (
|
||||
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||
total_nodes INTEGER NOT NULL,
|
||||
nodes_with_docs INTEGER NOT NULL,
|
||||
nodes_with_examples INTEGER NOT NULL,
|
||||
total_code_size INTEGER NOT NULL,
|
||||
total_docs_size INTEGER NOT NULL,
|
||||
extraction_date DATETIME DEFAULT CURRENT_TIMESTAMP
|
||||
);
|
||||
|
||||
-- Views for common queries
|
||||
CREATE VIEW IF NOT EXISTS nodes_summary AS
|
||||
SELECT
|
||||
node_type,
|
||||
name,
|
||||
display_name,
|
||||
description,
|
||||
category,
|
||||
package_name,
|
||||
CASE WHEN documentation_markdown IS NOT NULL THEN 1 ELSE 0 END as has_documentation,
|
||||
CASE WHEN documentation_examples IS NOT NULL THEN 1 ELSE 0 END as has_examples,
|
||||
CASE WHEN operations IS NOT NULL THEN 1 ELSE 0 END as has_operations,
|
||||
has_credentials,
|
||||
is_trigger,
|
||||
is_webhook
|
||||
FROM nodes;
|
||||
|
||||
CREATE VIEW IF NOT EXISTS package_summary AS
|
||||
SELECT
|
||||
package_name,
|
||||
COUNT(*) as node_count,
|
||||
SUM(CASE WHEN documentation_markdown IS NOT NULL THEN 1 ELSE 0 END) as nodes_with_docs,
|
||||
SUM(CASE WHEN documentation_examples IS NOT NULL THEN 1 ELSE 0 END) as nodes_with_examples,
|
||||
SUM(has_credentials) as nodes_with_credentials,
|
||||
SUM(is_trigger) as trigger_nodes,
|
||||
SUM(is_webhook) as webhook_nodes
|
||||
FROM nodes
|
||||
GROUP BY package_name
|
||||
ORDER BY node_count DESC;
|
||||
@@ -1,50 +0,0 @@
|
||||
#!/usr/bin/env node
|
||||
|
||||
// Simple test to verify node extraction works
|
||||
const { NodeSourceExtractor } = require('./dist/utils/node-source-extractor');
|
||||
|
||||
async function testExtraction() {
|
||||
const extractor = new NodeSourceExtractor();
|
||||
|
||||
console.log('🧪 Testing n8n Node Extraction\n');
|
||||
|
||||
// Test cases
|
||||
const testNodes = [
|
||||
'n8n-nodes-base.Function',
|
||||
'n8n-nodes-base.Webhook',
|
||||
'@n8n/n8n-nodes-langchain.Agent'
|
||||
];
|
||||
|
||||
for (const nodeType of testNodes) {
|
||||
console.log(`\n📦 Testing: ${nodeType}`);
|
||||
console.log('-'.repeat(40));
|
||||
|
||||
try {
|
||||
const result = await extractor.extractNodeSource(nodeType);
|
||||
|
||||
console.log('✅ Success!');
|
||||
console.log(` Size: ${result.sourceCode.length} bytes`);
|
||||
console.log(` Location: ${result.location}`);
|
||||
console.log(` Has package info: ${result.packageInfo ? 'Yes' : 'No'}`);
|
||||
|
||||
if (result.packageInfo) {
|
||||
console.log(` Package: ${result.packageInfo.name} v${result.packageInfo.version}`);
|
||||
}
|
||||
|
||||
} catch (error) {
|
||||
console.log('❌ Failed:', error.message);
|
||||
}
|
||||
}
|
||||
|
||||
console.log('\n\n📋 Listing available nodes...');
|
||||
const allNodes = await extractor.listAvailableNodes();
|
||||
console.log(`Found ${allNodes.length} total nodes`);
|
||||
|
||||
// Show first 5
|
||||
console.log('\nFirst 5 nodes:');
|
||||
allNodes.slice(0, 5).forEach(node => {
|
||||
console.log(` - ${node.name} (${node.displayName || 'no display name'})`);
|
||||
});
|
||||
}
|
||||
|
||||
testExtraction().catch(console.error);
|
||||
@@ -1,81 +0,0 @@
|
||||
#!/usr/bin/env node
|
||||
|
||||
const { spawn } = require('child_process');
|
||||
|
||||
// Start the MCP server
|
||||
const server = spawn('node', ['dist/mcp/index.js'], {
|
||||
env: {
|
||||
...process.env,
|
||||
MCP_MODE: 'stdio',
|
||||
LOG_LEVEL: 'error',
|
||||
DISABLE_CONSOLE_OUTPUT: 'true'
|
||||
},
|
||||
stdio: ['pipe', 'pipe', 'inherit']
|
||||
});
|
||||
|
||||
// Send list_tools request
|
||||
const request = {
|
||||
jsonrpc: "2.0",
|
||||
method: "tools/list",
|
||||
id: 1
|
||||
};
|
||||
|
||||
server.stdin.write(JSON.stringify(request) + '\n');
|
||||
|
||||
// Collect response
|
||||
let responseData = '';
|
||||
server.stdout.on('data', (data) => {
|
||||
responseData += data.toString();
|
||||
|
||||
// Try to parse each line
|
||||
const lines = responseData.split('\n');
|
||||
for (const line of lines) {
|
||||
if (line.trim()) {
|
||||
try {
|
||||
const response = JSON.parse(line);
|
||||
if (response.result && response.result.tools) {
|
||||
console.log('MCP Server Tools Report');
|
||||
console.log('======================');
|
||||
console.log(`Total tools: ${response.result.tools.length}`);
|
||||
console.log('\nTools list:');
|
||||
response.result.tools.forEach(tool => {
|
||||
console.log(`- ${tool.name}`);
|
||||
});
|
||||
|
||||
// Check for specific tools
|
||||
const expectedTools = [
|
||||
'get_node_for_task',
|
||||
'validate_node_config',
|
||||
'get_property_dependencies',
|
||||
'list_tasks',
|
||||
'search_node_properties',
|
||||
'get_node_essentials'
|
||||
];
|
||||
|
||||
console.log('\nNew tools check:');
|
||||
expectedTools.forEach(toolName => {
|
||||
const found = response.result.tools.find(t => t.name === toolName);
|
||||
console.log(`- ${toolName}: ${found ? '✅ Found' : '❌ Missing'}`);
|
||||
});
|
||||
|
||||
server.kill();
|
||||
process.exit(0);
|
||||
}
|
||||
} catch (e) {
|
||||
// Not a complete JSON response yet
|
||||
}
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
// Timeout after 5 seconds
|
||||
setTimeout(() => {
|
||||
console.error('Timeout: No response from MCP server');
|
||||
server.kill();
|
||||
process.exit(1);
|
||||
}, 5000);
|
||||
|
||||
server.on('error', (err) => {
|
||||
console.error('Failed to start MCP server:', err);
|
||||
process.exit(1);
|
||||
});
|
||||
@@ -1,113 +0,0 @@
|
||||
{
|
||||
"timestamp": "2025-06-16T09:08:37.822Z",
|
||||
"summary": {
|
||||
"totalTests": 10,
|
||||
"successful": 0,
|
||||
"failed": 10,
|
||||
"averageReduction": null,
|
||||
"totalFullSize": 0,
|
||||
"totalEssentialSize": 0
|
||||
},
|
||||
"results": [
|
||||
{
|
||||
"nodeType": "nodes-base.httpRequest",
|
||||
"fullSize": 0,
|
||||
"essentialSize": 0,
|
||||
"sizeReduction": 0,
|
||||
"fullPropCount": 0,
|
||||
"essentialPropCount": 0,
|
||||
"success": false,
|
||||
"error": "Unexpected end of JSON input"
|
||||
},
|
||||
{
|
||||
"nodeType": "nodes-base.webhook",
|
||||
"fullSize": 0,
|
||||
"essentialSize": 0,
|
||||
"sizeReduction": 0,
|
||||
"fullPropCount": 0,
|
||||
"essentialPropCount": 0,
|
||||
"success": false,
|
||||
"error": "Unexpected end of JSON input"
|
||||
},
|
||||
{
|
||||
"nodeType": "nodes-base.code",
|
||||
"fullSize": 0,
|
||||
"essentialSize": 0,
|
||||
"sizeReduction": 0,
|
||||
"fullPropCount": 0,
|
||||
"essentialPropCount": 0,
|
||||
"success": false,
|
||||
"error": "Unexpected end of JSON input"
|
||||
},
|
||||
{
|
||||
"nodeType": "nodes-base.set",
|
||||
"fullSize": 0,
|
||||
"essentialSize": 0,
|
||||
"sizeReduction": 0,
|
||||
"fullPropCount": 0,
|
||||
"essentialPropCount": 0,
|
||||
"success": false,
|
||||
"error": "Unexpected end of JSON input"
|
||||
},
|
||||
{
|
||||
"nodeType": "nodes-base.if",
|
||||
"fullSize": 0,
|
||||
"essentialSize": 0,
|
||||
"sizeReduction": 0,
|
||||
"fullPropCount": 0,
|
||||
"essentialPropCount": 0,
|
||||
"success": false,
|
||||
"error": "Unexpected end of JSON input"
|
||||
},
|
||||
{
|
||||
"nodeType": "nodes-base.postgres",
|
||||
"fullSize": 0,
|
||||
"essentialSize": 0,
|
||||
"sizeReduction": 0,
|
||||
"fullPropCount": 0,
|
||||
"essentialPropCount": 0,
|
||||
"success": false,
|
||||
"error": "Unexpected token 'o', \"[object Obj\"... is not valid JSON"
|
||||
},
|
||||
{
|
||||
"nodeType": "nodes-base.openAi",
|
||||
"fullSize": 0,
|
||||
"essentialSize": 0,
|
||||
"sizeReduction": 0,
|
||||
"fullPropCount": 0,
|
||||
"essentialPropCount": 0,
|
||||
"success": false,
|
||||
"error": "\"[object Object]\" is not valid JSON"
|
||||
},
|
||||
{
|
||||
"nodeType": "nodes-base.googleSheets",
|
||||
"fullSize": 0,
|
||||
"essentialSize": 0,
|
||||
"sizeReduction": 0,
|
||||
"fullPropCount": 0,
|
||||
"essentialPropCount": 0,
|
||||
"success": false,
|
||||
"error": "Unexpected token 'o', \"[object Obj\"... is not valid JSON"
|
||||
},
|
||||
{
|
||||
"nodeType": "nodes-base.slack",
|
||||
"fullSize": 0,
|
||||
"essentialSize": 0,
|
||||
"sizeReduction": 0,
|
||||
"fullPropCount": 0,
|
||||
"essentialPropCount": 0,
|
||||
"success": false,
|
||||
"error": "Unexpected token 'o', \"[object Obj\"... is not valid JSON"
|
||||
},
|
||||
{
|
||||
"nodeType": "nodes-base.merge",
|
||||
"fullSize": 0,
|
||||
"essentialSize": 0,
|
||||
"sizeReduction": 0,
|
||||
"fullPropCount": 0,
|
||||
"essentialPropCount": 0,
|
||||
"success": false,
|
||||
"error": "Unexpected end of JSON input"
|
||||
}
|
||||
]
|
||||
}
|
||||
@@ -1,94 +0,0 @@
|
||||
#!/usr/bin/env node
|
||||
|
||||
const { NodeDocumentationService } = require('../dist/services/node-documentation-service');
|
||||
|
||||
async function testCompleteFix() {
|
||||
console.log('=== Testing Complete Documentation Fix ===\n');
|
||||
|
||||
const service = new NodeDocumentationService('./data/test-nodes-v2.db');
|
||||
|
||||
try {
|
||||
// First check if we have any nodes
|
||||
const existingNodes = await service.listNodes();
|
||||
console.log(`📊 Current database has ${existingNodes.length} nodes`);
|
||||
|
||||
if (existingNodes.length === 0) {
|
||||
console.log('\n🔄 Rebuilding database with fixed documentation fetcher...');
|
||||
const stats = await service.rebuildDatabase();
|
||||
console.log(`\n✅ Rebuild complete:`);
|
||||
console.log(` - Total nodes found: ${stats.total}`);
|
||||
console.log(` - Successfully processed: ${stats.successful}`);
|
||||
console.log(` - Failed: ${stats.failed}`);
|
||||
|
||||
if (stats.errors.length > 0) {
|
||||
console.log('\n⚠️ Errors encountered:');
|
||||
stats.errors.slice(0, 5).forEach(err => console.log(` - ${err}`));
|
||||
}
|
||||
}
|
||||
|
||||
// Test specific nodes
|
||||
console.log('\n📋 Testing specific nodes:');
|
||||
|
||||
const testNodes = ['slack', 'if', 'httpRequest', 'webhook'];
|
||||
|
||||
for (const nodeName of testNodes) {
|
||||
const nodeInfo = await service.getNodeInfo(`n8n-nodes-base.${nodeName}`);
|
||||
|
||||
if (nodeInfo) {
|
||||
console.log(`\n✅ ${nodeInfo.displayName || nodeName}:`);
|
||||
console.log(` - Type: ${nodeInfo.nodeType}`);
|
||||
console.log(` - Description: ${nodeInfo.description?.substring(0, 80)}...`);
|
||||
console.log(` - Has source code: ${!!nodeInfo.sourceCode}`);
|
||||
console.log(` - Has documentation: ${!!nodeInfo.documentation}`);
|
||||
console.log(` - Documentation URL: ${nodeInfo.documentationUrl || 'N/A'}`);
|
||||
console.log(` - Has example: ${!!nodeInfo.exampleWorkflow}`);
|
||||
console.log(` - Category: ${nodeInfo.category || 'N/A'}`);
|
||||
|
||||
// Check if it's getting the right documentation
|
||||
if (nodeInfo.documentation) {
|
||||
const isCredentialDoc = nodeInfo.documentation.includes('credentials') &&
|
||||
!nodeInfo.documentation.includes('node documentation');
|
||||
console.log(` - Is credential doc: ${isCredentialDoc} ${isCredentialDoc ? '❌' : '✅'}`);
|
||||
}
|
||||
} else {
|
||||
console.log(`\n❌ ${nodeName}: Not found in database`);
|
||||
}
|
||||
}
|
||||
|
||||
// Test search functionality
|
||||
console.log('\n🔍 Testing search functionality:');
|
||||
|
||||
const searchTests = [
|
||||
{ query: 'webhook', label: 'Webhook nodes' },
|
||||
{ query: 'http', label: 'HTTP nodes' },
|
||||
{ query: 'slack', label: 'Slack nodes' }
|
||||
];
|
||||
|
||||
for (const test of searchTests) {
|
||||
const results = await service.searchNodes({ query: test.query });
|
||||
console.log(`\n ${test.label}: ${results.length} results`);
|
||||
results.slice(0, 3).forEach(node => {
|
||||
console.log(` - ${node.displayName} (${node.nodeType})`);
|
||||
});
|
||||
}
|
||||
|
||||
// Get final statistics
|
||||
console.log('\n📊 Final database statistics:');
|
||||
const stats = service.getStatistics();
|
||||
console.log(` - Total nodes: ${stats.totalNodes}`);
|
||||
console.log(` - Nodes with documentation: ${stats.nodesWithDocs}`);
|
||||
console.log(` - Nodes with examples: ${stats.nodesWithExamples}`);
|
||||
console.log(` - Trigger nodes: ${stats.triggerNodes}`);
|
||||
console.log(` - Webhook nodes: ${stats.webhookNodes}`);
|
||||
|
||||
console.log('\n✅ All tests completed!');
|
||||
|
||||
} catch (error) {
|
||||
console.error('\n❌ Test failed:', error);
|
||||
process.exit(1);
|
||||
} finally {
|
||||
service.close();
|
||||
}
|
||||
}
|
||||
|
||||
testCompleteFix().catch(console.error);
|
||||
@@ -1,38 +0,0 @@
|
||||
#!/usr/bin/env node
|
||||
|
||||
const { EnhancedDocumentationFetcher } = require('../dist/utils/enhanced-documentation-fetcher');
|
||||
|
||||
async function debugTest() {
|
||||
console.log('=== Debug Enhanced Documentation ===\n');
|
||||
|
||||
const fetcher = new EnhancedDocumentationFetcher();
|
||||
|
||||
try {
|
||||
await fetcher.ensureDocsRepository();
|
||||
|
||||
// Test Slack documentation parsing
|
||||
console.log('Testing Slack documentation...');
|
||||
const slackDoc = await fetcher.getEnhancedNodeDocumentation('n8n-nodes-base.slack');
|
||||
|
||||
if (slackDoc) {
|
||||
console.log('\nSlack Documentation:');
|
||||
console.log('- Operations found:', slackDoc.operations?.length || 0);
|
||||
|
||||
// Show raw markdown around operations section
|
||||
const operationsIndex = slackDoc.markdown.indexOf('## Operations');
|
||||
if (operationsIndex > -1) {
|
||||
console.log('\nRaw markdown around Operations section:');
|
||||
console.log('---');
|
||||
console.log(slackDoc.markdown.substring(operationsIndex, operationsIndex + 1000));
|
||||
console.log('---');
|
||||
}
|
||||
}
|
||||
|
||||
} catch (error) {
|
||||
console.error('Error:', error);
|
||||
} finally {
|
||||
await fetcher.cleanup();
|
||||
}
|
||||
}
|
||||
|
||||
debugTest().catch(console.error);
|
||||
@@ -1,57 +0,0 @@
|
||||
#!/usr/bin/env node
|
||||
|
||||
const { DocumentationFetcher } = require('../dist/utils/documentation-fetcher');
|
||||
const { NodeSourceExtractor } = require('../dist/utils/node-source-extractor');
|
||||
|
||||
async function testDocsFix() {
|
||||
console.log('=== Testing Documentation Fix ===\n');
|
||||
|
||||
const docsFetcher = new DocumentationFetcher();
|
||||
const extractor = new NodeSourceExtractor();
|
||||
|
||||
try {
|
||||
// Test nodes
|
||||
const testNodes = [
|
||||
'n8n-nodes-base.slack',
|
||||
'n8n-nodes-base.if',
|
||||
'n8n-nodes-base.httpRequest',
|
||||
'n8n-nodes-base.webhook'
|
||||
];
|
||||
|
||||
for (const nodeType of testNodes) {
|
||||
console.log(`\n📋 Testing ${nodeType}:`);
|
||||
|
||||
// Test documentation fetching
|
||||
const docs = await docsFetcher.getNodeDocumentation(nodeType);
|
||||
if (docs) {
|
||||
console.log(` ✅ Documentation found`);
|
||||
console.log(` 📄 URL: ${docs.url}`);
|
||||
const titleMatch = docs.markdown.match(/title:\s*(.+)/);
|
||||
if (titleMatch) {
|
||||
console.log(` 📝 Title: ${titleMatch[1]}`);
|
||||
}
|
||||
console.log(` 📏 Length: ${docs.markdown.length} characters`);
|
||||
console.log(` 🔧 Has examples: ${docs.examples && docs.examples.length > 0}`);
|
||||
} else {
|
||||
console.log(` ❌ No documentation found`);
|
||||
}
|
||||
|
||||
// Test source extraction
|
||||
try {
|
||||
const source = await extractor.extractNodeSource(nodeType);
|
||||
console.log(` ✅ Source code found at: ${source.location}`);
|
||||
} catch (error) {
|
||||
console.log(` ❌ Source extraction failed: ${error.message}`);
|
||||
}
|
||||
}
|
||||
|
||||
console.log('\n✅ Test completed!');
|
||||
|
||||
} catch (error) {
|
||||
console.error('\n❌ Test failed:', error);
|
||||
} finally {
|
||||
await docsFetcher.cleanup();
|
||||
}
|
||||
}
|
||||
|
||||
testDocsFix().catch(console.error);
|
||||
@@ -1,156 +0,0 @@
|
||||
#!/usr/bin/env node
|
||||
|
||||
const { EnhancedDocumentationFetcher } = require('../dist/utils/enhanced-documentation-fetcher');
|
||||
const { EnhancedSQLiteStorageService } = require('../dist/services/enhanced-sqlite-storage-service');
|
||||
const { NodeSourceExtractor } = require('../dist/utils/node-source-extractor');
|
||||
|
||||
async function testEnhancedDocumentation() {
|
||||
console.log('=== Enhanced Documentation Parser Test ===\n');
|
||||
|
||||
const fetcher = new EnhancedDocumentationFetcher();
|
||||
const extractor = new NodeSourceExtractor();
|
||||
|
||||
try {
|
||||
// Test 1: Parse Slack documentation
|
||||
console.log('1. Parsing Slack node documentation...');
|
||||
const slackDoc = await fetcher.getEnhancedNodeDocumentation('n8n-nodes-base.slack');
|
||||
|
||||
if (slackDoc) {
|
||||
console.log('\n✓ Slack Documentation Parsed:');
|
||||
console.log(` Title: ${slackDoc.title}`);
|
||||
console.log(` Description: ${slackDoc.description?.substring(0, 100)}...`);
|
||||
console.log(` URL: ${slackDoc.url}`);
|
||||
console.log(` Operations: ${slackDoc.operations?.length || 0} found`);
|
||||
console.log(` API Methods: ${slackDoc.apiMethods?.length || 0} found`);
|
||||
console.log(` Related Resources: ${slackDoc.relatedResources?.length || 0} found`);
|
||||
|
||||
// Show sample operations
|
||||
if (slackDoc.operations && slackDoc.operations.length > 0) {
|
||||
console.log('\n Sample Operations (first 10):');
|
||||
slackDoc.operations.slice(0, 10).forEach((op, i) => {
|
||||
console.log(` ${i + 1}. ${op.resource}.${op.operation}: ${op.description}`);
|
||||
});
|
||||
}
|
||||
|
||||
// Show sample API mappings
|
||||
if (slackDoc.apiMethods && slackDoc.apiMethods.length > 0) {
|
||||
console.log('\n Sample API Method Mappings (first 5):');
|
||||
slackDoc.apiMethods.slice(0, 5).forEach((api, i) => {
|
||||
console.log(` ${i + 1}. ${api.resource}.${api.operation} → ${api.apiMethod} (${api.apiUrl})`);
|
||||
});
|
||||
}
|
||||
|
||||
// Show related resources
|
||||
if (slackDoc.relatedResources && slackDoc.relatedResources.length > 0) {
|
||||
console.log('\n Related Resources:');
|
||||
slackDoc.relatedResources.forEach((res, i) => {
|
||||
console.log(` ${i + 1}. ${res.title} (${res.type}): ${res.url}`);
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
// Test 2: Parse HTTP Request documentation (if available)
|
||||
console.log('\n\n2. Parsing HTTP Request node documentation...');
|
||||
const httpDoc = await fetcher.getEnhancedNodeDocumentation('n8n-nodes-base.httpRequest');
|
||||
|
||||
if (httpDoc) {
|
||||
console.log('\n✓ HTTP Request Documentation Parsed:');
|
||||
console.log(` Title: ${httpDoc.title}`);
|
||||
console.log(` Examples: ${httpDoc.examples?.length || 0} found`);
|
||||
|
||||
if (httpDoc.examples && httpDoc.examples.length > 0) {
|
||||
console.log('\n Code Examples:');
|
||||
httpDoc.examples.forEach((ex, i) => {
|
||||
console.log(` ${i + 1}. ${ex.title || 'Example'} (${ex.type}): ${ex.code.length} characters`);
|
||||
});
|
||||
}
|
||||
} else {
|
||||
console.log(' HTTP Request documentation not found');
|
||||
}
|
||||
|
||||
// Test 3: Database storage test with smaller database
|
||||
console.log('\n\n3. Testing enhanced database storage...');
|
||||
const storage = new EnhancedSQLiteStorageService('./data/demo-enhanced.db');
|
||||
|
||||
try {
|
||||
// Store Slack node with documentation
|
||||
const slackNodeInfo = await extractor.extractNodeSource('n8n-nodes-base.slack');
|
||||
if (slackNodeInfo) {
|
||||
const storedNode = await storage.storeNodeWithDocumentation(slackNodeInfo);
|
||||
|
||||
console.log('\n✓ Slack node stored with documentation:');
|
||||
console.log(` Node Type: ${storedNode.nodeType}`);
|
||||
console.log(` Documentation: ${storedNode.documentationTitle || 'No title'}`);
|
||||
console.log(` Operations stored: ${storedNode.operationCount}`);
|
||||
console.log(` API methods stored: ${storedNode.apiMethodCount}`);
|
||||
console.log(` Examples stored: ${storedNode.exampleCount}`);
|
||||
console.log(` Resources stored: ${storedNode.resourceCount}`);
|
||||
}
|
||||
|
||||
// Store a few more nodes
|
||||
const nodeTypes = ['n8n-nodes-base.if', 'n8n-nodes-base.webhook'];
|
||||
for (const nodeType of nodeTypes) {
|
||||
try {
|
||||
const nodeInfo = await extractor.extractNodeSource(nodeType);
|
||||
if (nodeInfo) {
|
||||
await storage.storeNodeWithDocumentation(nodeInfo);
|
||||
console.log(` ✓ Stored ${nodeType}`);
|
||||
}
|
||||
} catch (e) {
|
||||
console.log(` ✗ Failed to store ${nodeType}: ${e.message}`);
|
||||
}
|
||||
}
|
||||
|
||||
// Test search functionality
|
||||
console.log('\n\n4. Testing enhanced search...');
|
||||
|
||||
const searchTests = [
|
||||
{ query: 'slack', description: 'Search for "slack"' },
|
||||
{ query: 'message send', description: 'Search for "message send"' },
|
||||
{ query: 'webhook', description: 'Search for "webhook"' }
|
||||
];
|
||||
|
||||
for (const test of searchTests) {
|
||||
const results = await storage.searchNodes({ query: test.query });
|
||||
console.log(`\n ${test.description}: ${results.length} results`);
|
||||
if (results.length > 0) {
|
||||
const first = results[0];
|
||||
console.log(` Top result: ${first.displayName || first.name} (${first.nodeType})`);
|
||||
if (first.documentationTitle) {
|
||||
console.log(` Documentation: ${first.documentationTitle}`);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Get final statistics
|
||||
console.log('\n\n5. Database Statistics:');
|
||||
const stats = await storage.getEnhancedStatistics();
|
||||
|
||||
console.log(` Total Nodes: ${stats.totalNodes}`);
|
||||
console.log(` Nodes with Documentation: ${stats.nodesWithDocumentation} (${stats.documentationCoverage}% coverage)`);
|
||||
console.log(` Total Operations: ${stats.totalOperations}`);
|
||||
console.log(` Total API Methods: ${stats.totalApiMethods}`);
|
||||
console.log(` Total Examples: ${stats.totalExamples}`);
|
||||
console.log(` Total Resources: ${stats.totalResources}`);
|
||||
|
||||
if (stats.topDocumentedNodes && stats.topDocumentedNodes.length > 0) {
|
||||
console.log('\n Best Documented Nodes:');
|
||||
stats.topDocumentedNodes.forEach((node, i) => {
|
||||
console.log(` ${i + 1}. ${node.display_name || node.name}: ${node.operation_count} operations, ${node.example_count} examples`);
|
||||
});
|
||||
}
|
||||
|
||||
} finally {
|
||||
storage.close();
|
||||
}
|
||||
|
||||
} catch (error) {
|
||||
console.error('\nError:', error);
|
||||
} finally {
|
||||
await fetcher.cleanup();
|
||||
console.log('\n\n✓ Test completed and cleaned up');
|
||||
}
|
||||
}
|
||||
|
||||
// Run the test
|
||||
testEnhancedDocumentation().catch(console.error);
|
||||
@@ -1,133 +0,0 @@
|
||||
#!/usr/bin/env node
|
||||
|
||||
const { DocumentationFetcher } = require('../dist/utils/documentation-fetcher');
|
||||
const { NodeSourceExtractor } = require('../dist/utils/node-source-extractor');
|
||||
const { execSync } = require('child_process');
|
||||
const path = require('path');
|
||||
const fs = require('fs');
|
||||
|
||||
async function investigateSlackDocs() {
|
||||
console.log('=== Investigating Slack Node Documentation Issue ===\n');
|
||||
|
||||
const docsFetcher = new DocumentationFetcher();
|
||||
const extractor = new NodeSourceExtractor();
|
||||
|
||||
try {
|
||||
// 1. Ensure docs repo is available
|
||||
console.log('1️⃣ Ensuring documentation repository...');
|
||||
await docsFetcher.ensureDocsRepository();
|
||||
|
||||
// 2. Check what files exist for Slack
|
||||
console.log('\n2️⃣ Searching for Slack documentation files...');
|
||||
const docsPath = path.join(process.cwd(), 'temp', 'n8n-docs');
|
||||
|
||||
try {
|
||||
const slackFiles = execSync(
|
||||
`find ${docsPath} -name "*slack*" -type f | grep -v ".git"`,
|
||||
{ encoding: 'utf-8' }
|
||||
).trim().split('\n').filter(Boolean);
|
||||
|
||||
console.log(`Found ${slackFiles.length} files with "slack" in the name:`);
|
||||
slackFiles.forEach(file => {
|
||||
const relPath = path.relative(docsPath, file);
|
||||
console.log(` - ${relPath}`);
|
||||
});
|
||||
|
||||
// Check content of each file
|
||||
console.log('\n3️⃣ Checking content of Slack-related files...');
|
||||
for (const file of slackFiles.slice(0, 5)) { // Check first 5 files
|
||||
if (file.endsWith('.md')) {
|
||||
const content = fs.readFileSync(file, 'utf-8');
|
||||
const firstLine = content.split('\n')[0];
|
||||
const isCredential = content.includes('credential') || content.includes('authentication');
|
||||
console.log(`\n 📄 ${path.basename(file)}`);
|
||||
console.log(` First line: ${firstLine}`);
|
||||
console.log(` Is credential doc: ${isCredential}`);
|
||||
|
||||
// Check if it mentions being a node or credential
|
||||
if (content.includes('# Slack node')) {
|
||||
console.log(' ✅ This is the Slack NODE documentation!');
|
||||
console.log(` Path: ${file}`);
|
||||
} else if (content.includes('# Slack credentials')) {
|
||||
console.log(' ⚠️ This is the Slack CREDENTIALS documentation');
|
||||
}
|
||||
}
|
||||
}
|
||||
} catch (error) {
|
||||
console.log('Error searching for Slack files:', error.message);
|
||||
}
|
||||
|
||||
// 4. Test the getNodeDocumentation method
|
||||
console.log('\n4️⃣ Testing getNodeDocumentation for Slack...');
|
||||
const slackDocs = await docsFetcher.getNodeDocumentation('n8n-nodes-base.slack');
|
||||
|
||||
if (slackDocs) {
|
||||
console.log(' ✅ Found documentation for Slack node');
|
||||
console.log(` URL: ${slackDocs.url}`);
|
||||
console.log(` Content preview: ${slackDocs.markdown.substring(0, 200)}...`);
|
||||
|
||||
// Check if it's credential or node docs
|
||||
const isCredentialDoc = slackDocs.markdown.includes('credential') ||
|
||||
slackDocs.markdown.includes('authentication') ||
|
||||
slackDocs.markdown.includes('# Slack credentials');
|
||||
const isNodeDoc = slackDocs.markdown.includes('# Slack node') ||
|
||||
slackDocs.markdown.includes('## Properties');
|
||||
|
||||
console.log(` Is credential doc: ${isCredentialDoc}`);
|
||||
console.log(` Is node doc: ${isNodeDoc}`);
|
||||
} else {
|
||||
console.log(' ❌ No documentation found for Slack node');
|
||||
}
|
||||
|
||||
// 5. Extract the Slack node source to understand its structure
|
||||
console.log('\n5️⃣ Extracting Slack node source code...');
|
||||
try {
|
||||
const slackNode = await extractor.extractNodeSource('n8n-nodes-base.slack');
|
||||
console.log(' ✅ Successfully extracted Slack node');
|
||||
console.log(` Location: ${slackNode.location}`);
|
||||
console.log(` Has credential code: ${!!slackNode.credentialCode}`);
|
||||
|
||||
// Parse the node definition
|
||||
const descMatch = slackNode.sourceCode.match(/description\s*[:=]\s*({[\s\S]*?})\s*[,;]/);
|
||||
if (descMatch) {
|
||||
console.log(' Found node description in source');
|
||||
}
|
||||
} catch (error) {
|
||||
console.log(' ❌ Failed to extract Slack node:', error.message);
|
||||
}
|
||||
|
||||
// 6. Check documentation structure
|
||||
console.log('\n6️⃣ Checking n8n-docs repository structure...');
|
||||
const docStructure = [
|
||||
'docs/integrations/builtin/app-nodes',
|
||||
'docs/integrations/builtin/core-nodes',
|
||||
'docs/integrations/builtin/trigger-nodes',
|
||||
'docs/integrations/builtin/credentials'
|
||||
];
|
||||
|
||||
for (const dir of docStructure) {
|
||||
const fullPath = path.join(docsPath, dir);
|
||||
try {
|
||||
const files = fs.readdirSync(fullPath);
|
||||
const slackFile = files.find(f => f.toLowerCase().includes('slack'));
|
||||
console.log(`\n 📁 ${dir}:`);
|
||||
if (slackFile) {
|
||||
console.log(` Found: ${slackFile}`);
|
||||
} else {
|
||||
console.log(` No Slack files found`);
|
||||
}
|
||||
} catch (error) {
|
||||
console.log(` Directory doesn't exist`);
|
||||
}
|
||||
}
|
||||
|
||||
} catch (error) {
|
||||
console.error('\n❌ Investigation failed:', error);
|
||||
} finally {
|
||||
// Cleanup
|
||||
await docsFetcher.cleanup();
|
||||
}
|
||||
}
|
||||
|
||||
// Run investigation
|
||||
investigateSlackDocs().catch(console.error);
|
||||
@@ -1,119 +0,0 @@
|
||||
#!/usr/bin/env node
|
||||
|
||||
const { NodeDocumentationService } = require('../dist/services/node-documentation-service');
|
||||
const { NodeSourceExtractor } = require('../dist/utils/node-source-extractor');
|
||||
const { DocumentationFetcher } = require('../dist/utils/documentation-fetcher');
|
||||
|
||||
async function testSlackFix() {
|
||||
console.log('=== Testing Slack Node Fix ===\n');
|
||||
|
||||
const extractor = new NodeSourceExtractor();
|
||||
const docsFetcher = new DocumentationFetcher();
|
||||
|
||||
try {
|
||||
// Test 1: Node source extraction
|
||||
console.log('1️⃣ Testing Slack node source extraction...');
|
||||
const slackSource = await extractor.extractNodeSource('n8n-nodes-base.slack');
|
||||
console.log(` ✅ Source code found at: ${slackSource.location}`);
|
||||
console.log(` 📏 Source length: ${slackSource.sourceCode.length} bytes`);
|
||||
|
||||
// Extract display name from source
|
||||
const displayNameMatch = slackSource.sourceCode.match(/displayName\s*[:=]\s*['"`]([^'"`]+)['"`]/);
|
||||
console.log(` 📛 Display name: ${displayNameMatch ? displayNameMatch[1] : 'Not found'}`);
|
||||
|
||||
// Test 2: Documentation fetching
|
||||
console.log('\n2️⃣ Testing Slack documentation fetching...');
|
||||
const slackDocs = await docsFetcher.getNodeDocumentation('n8n-nodes-base.slack');
|
||||
|
||||
if (slackDocs) {
|
||||
console.log(` ✅ Documentation found`);
|
||||
console.log(` 📄 URL: ${slackDocs.url}`);
|
||||
|
||||
// Extract title from markdown
|
||||
const titleMatch = slackDocs.markdown.match(/title:\s*(.+)/);
|
||||
console.log(` 📝 Title: ${titleMatch ? titleMatch[1] : 'Not found'}`);
|
||||
|
||||
// Check if it's the correct documentation
|
||||
const isNodeDoc = slackDocs.markdown.includes('Slack node') ||
|
||||
slackDocs.markdown.includes('node documentation');
|
||||
const isCredentialDoc = slackDocs.markdown.includes('Slack credentials') &&
|
||||
!slackDocs.markdown.includes('node documentation');
|
||||
|
||||
console.log(` ✅ Is node documentation: ${isNodeDoc}`);
|
||||
console.log(` ❌ Is credential documentation: ${isCredentialDoc}`);
|
||||
|
||||
if (isNodeDoc && !isCredentialDoc) {
|
||||
console.log('\n🎉 SUCCESS: Slack node documentation is correctly fetched!');
|
||||
} else {
|
||||
console.log('\n⚠️ WARNING: Documentation may not be correct');
|
||||
}
|
||||
|
||||
// Show first few lines of content
|
||||
console.log('\n📋 Documentation preview:');
|
||||
const lines = slackDocs.markdown.split('\n').slice(0, 15);
|
||||
lines.forEach(line => console.log(` ${line}`));
|
||||
|
||||
} else {
|
||||
console.log(' ❌ No documentation found');
|
||||
}
|
||||
|
||||
// Test 3: Complete node info using NodeDocumentationService
|
||||
console.log('\n3️⃣ Testing complete node info storage...');
|
||||
const service = new NodeDocumentationService('./data/test-slack-fix.db');
|
||||
|
||||
try {
|
||||
// Parse node definition
|
||||
const nodeDefinition = {
|
||||
displayName: displayNameMatch ? displayNameMatch[1] : 'Slack',
|
||||
description: 'Send messages to Slack channels, users and conversations',
|
||||
category: 'Communication',
|
||||
icon: 'file:slack.svg',
|
||||
version: 2
|
||||
};
|
||||
|
||||
// Store node info
|
||||
await service.storeNode({
|
||||
nodeType: 'n8n-nodes-base.slack',
|
||||
name: 'slack',
|
||||
displayName: nodeDefinition.displayName,
|
||||
description: nodeDefinition.description,
|
||||
category: nodeDefinition.category,
|
||||
icon: nodeDefinition.icon,
|
||||
sourceCode: slackSource.sourceCode,
|
||||
credentialCode: slackSource.credentialCode,
|
||||
documentation: slackDocs?.markdown,
|
||||
documentationUrl: slackDocs?.url,
|
||||
packageName: 'n8n-nodes-base',
|
||||
version: nodeDefinition.version,
|
||||
hasCredentials: !!slackSource.credentialCode,
|
||||
isTrigger: false,
|
||||
isWebhook: false
|
||||
});
|
||||
|
||||
console.log(' ✅ Node info stored successfully');
|
||||
|
||||
// Retrieve and verify
|
||||
const retrievedNode = await service.getNodeInfo('n8n-nodes-base.slack');
|
||||
if (retrievedNode) {
|
||||
console.log(' ✅ Node retrieved successfully');
|
||||
console.log(` 📛 Display name: ${retrievedNode.displayName}`);
|
||||
console.log(` 📝 Has documentation: ${!!retrievedNode.documentation}`);
|
||||
console.log(` 📄 Documentation URL: ${retrievedNode.documentationUrl || 'N/A'}`);
|
||||
}
|
||||
|
||||
service.close();
|
||||
} catch (error) {
|
||||
console.error(' ❌ Error with node service:', error.message);
|
||||
service.close();
|
||||
}
|
||||
|
||||
console.log('\n✅ All tests completed!');
|
||||
|
||||
} catch (error) {
|
||||
console.error('\n❌ Test failed:', error);
|
||||
} finally {
|
||||
await docsFetcher.cleanup();
|
||||
}
|
||||
}
|
||||
|
||||
testSlackFix().catch(console.error);
|
||||
Reference in New Issue
Block a user