refactor: reorganize documentation and configuration files for improved clarity
@@ -1,170 +0,0 @@
|
||||
# AI Agent Node Extraction Test Guide
|
||||
|
||||
This document describes how to test the n8n Documentation MCP Server's ability to extract and provide node source code, including the AI Agent node from n8n.
|
||||
|
||||
## Overview
|
||||
|
||||
The n8n Documentation MCP Server provides comprehensive node information including:
|
||||
- Source code extraction
|
||||
- Official documentation
|
||||
- Usage examples
|
||||
- Node metadata
|
||||
|
||||
## Test Scenario
|
||||
|
||||
An MCP client (like Claude) requests the source code for n8n's AI Agent node, and the documentation server successfully extracts and returns it.
|
||||
|
||||
## Implementation Overview
|
||||
|
||||
### 1. Available MCP Tools
|
||||
|
||||
- **`get_node_source_code`**: Extracts source code for any n8n node
|
||||
- **`get_node_info`**: Gets complete node information including docs and examples
|
||||
- **`list_nodes`**: Lists all available n8n nodes
|
||||
- **`search_nodes`**: Search nodes by name or content
|
||||
- **`get_node_documentation`**: Gets only the documentation for a node
|
||||
- **`get_node_example`**: Gets example workflow for a node
|
||||
|
||||
### 2. Key Components
|
||||
|
||||
- **`NodeSourceExtractor`** (`src/utils/node-source-extractor.ts`): Handles file system access to extract node source code
|
||||
- **`NodeDocumentationService`** (`src/services/node-documentation-service.ts`): Manages SQLite database with node information
|
||||
- **`DocumentationFetcher`** (`src/utils/documentation-fetcher.ts`): Fetches docs from n8n-docs repository
|
||||
|
||||
### 3. Test Infrastructure
|
||||
|
||||
- **Docker setup** (`docker-compose.test.yml`): Mounts n8n's node_modules for source access
|
||||
- **Test scripts**: Multiple test approaches for different scenarios
|
||||
|
||||
## Running the Tests
|
||||
|
||||
### Option 1: Docker-based Test
|
||||
|
||||
```bash
|
||||
# Build the project
|
||||
npm run build
|
||||
|
||||
# Run the comprehensive test
|
||||
./scripts/test-ai-agent-extraction.sh
|
||||
```
|
||||
|
||||
This script will:
|
||||
1. Build Docker containers
|
||||
2. Start n8n and MCP server
|
||||
3. Check for AI Agent node availability
|
||||
4. Test source code extraction
|
||||
|
||||
### Option 2: Standalone MCP Test
|
||||
|
||||
```bash
|
||||
# Build the project
|
||||
npm run build
|
||||
|
||||
# Ensure n8n is running (locally or in Docker)
|
||||
docker-compose -f docker-compose.test.yml up -d n8n
|
||||
|
||||
# Run the MCP client test
|
||||
node tests/test-mcp-extraction.js
|
||||
```
|
||||
|
||||
### Option 3: Manual Testing
|
||||
|
||||
1. Start the environment:
|
||||
```bash
|
||||
docker-compose -f docker-compose.test.yml up -d
|
||||
```
|
||||
|
||||
2. Use any MCP client to connect and request:
|
||||
```json
|
||||
{
|
||||
"method": "tools/call",
|
||||
"params": {
|
||||
"name": "get_node_source_code",
|
||||
"arguments": {
|
||||
"nodeType": "@n8n/n8n-nodes-langchain.Agent",
|
||||
"includeCredentials": true
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Expected Results
|
||||
|
||||
### Successful Extraction Response
|
||||
|
||||
```json
|
||||
{
|
||||
"nodeType": "@n8n/n8n-nodes-langchain.Agent",
|
||||
"sourceCode": "/* AI Agent node JavaScript code */",
|
||||
"location": "/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/agents/Agent/Agent.node.js",
|
||||
"credentialCode": "/* Optional credential code */",
|
||||
"packageInfo": {
|
||||
"name": "@n8n/n8n-nodes-langchain",
|
||||
"version": "1.x.x",
|
||||
"description": "LangChain nodes for n8n"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## How It Works
|
||||
|
||||
1. **MCP Client Request**: Client calls `get_node_source_code` tool with node type
|
||||
2. **Server Processing**: MCP server receives request and invokes `NodeSourceExtractor`
|
||||
3. **File System Search**: Extractor searches known n8n paths for the node file (see the sketch after this list)
|
||||
4. **Source Extraction**: Reads the JavaScript source code and optional credential files
|
||||
5. **Response Formation**: Returns structured data with source code and metadata
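
The extraction itself is handled by `NodeSourceExtractor`. As a rough illustration of steps 3 and 4, the TypeScript sketch below walks a hard-coded list of candidate n8n install locations and reads the first `<Name>.node.js` file it finds. The path list, helper names, and directory layout are assumptions for illustration only, not the extractor's actual API.

```typescript
import { promises as fs } from 'fs';
import * as path from 'path';

// Candidate locations where an n8n install may keep its node packages
// (illustrative; the real extractor checks more locations).
const KNOWN_N8N_PATHS = [
  '/usr/local/lib/node_modules/n8n/node_modules',
  '/usr/lib/node_modules/n8n/node_modules',
];

// Recursively list all files below a directory.
async function walk(dir: string): Promise<string[]> {
  const files: string[] = [];
  for (const entry of await fs.readdir(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      files.push(...(await walk(full)));
    } else {
      files.push(full);
    }
  }
  return files;
}

// "@n8n/n8n-nodes-langchain.Agent" -> package "@n8n/n8n-nodes-langchain", node "Agent"
function splitNodeType(nodeType: string): { packageName: string; nodeName: string } {
  const lastDot = nodeType.lastIndexOf('.');
  return { packageName: nodeType.slice(0, lastDot), nodeName: nodeType.slice(lastDot + 1) };
}

async function findNodeSource(
  nodeType: string
): Promise<{ location: string; sourceCode: string } | null> {
  const { packageName, nodeName } = splitNodeType(nodeType);

  for (const basePath of KNOWN_N8N_PATHS) {
    const distDir = path.join(basePath, packageName, 'dist', 'nodes');
    let files: string[];
    try {
      files = await walk(distDir);
    } catch {
      continue; // this install location does not exist
    }

    // Compiled nodes follow the <Name>.node.js convention, e.g. Agent.node.js
    const location = files.find((f) => f.endsWith(`${nodeName}.node.js`));
    if (location) {
      const sourceCode = await fs.readFile(location, 'utf8');
      return { location, sourceCode };
    }
  }

  return null; // not found in any known path
}
```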
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Node Not Found
|
||||
|
||||
If the AI Agent node is not found:
|
||||
|
||||
1. Check if langchain nodes are installed:
|
||||
```bash
|
||||
docker exec n8n-test ls /usr/local/lib/node_modules/n8n/node_modules/@n8n/
|
||||
```
|
||||
|
||||
2. Install langchain nodes:
|
||||
```bash
|
||||
docker exec n8n-test npm install -g @n8n/n8n-nodes-langchain
|
||||
```
|
||||
|
||||
### Permission Issues
|
||||
|
||||
Ensure the MCP container has read access to n8n's node_modules:
|
||||
```yaml
|
||||
volumes:
|
||||
- n8n_modules:/usr/local/lib/node_modules/n8n/node_modules:ro
|
||||
```
|
||||
|
||||
### Alternative Node Types
|
||||
|
||||
You can test with other built-in nodes:
|
||||
- `n8n-nodes-base.HttpRequest`
|
||||
- `n8n-nodes-base.Code`
|
||||
- `n8n-nodes-base.If`
|
||||
|
||||
## Success Criteria
|
||||
|
||||
The test is successful when:
|
||||
1. ✅ MCP server starts and accepts connections
|
||||
2. ✅ Client can discover the `get_node_source_code` tool
|
||||
3. ✅ Server locates the AI Agent node in the file system
|
||||
4. ✅ Complete source code is extracted and returned
|
||||
5. ✅ Response includes metadata (location, package info)
|
||||
|
||||
## Security Considerations
|
||||
|
||||
- Source code extraction is read-only
|
||||
- Access is limited to n8n's node_modules directory
|
||||
- Authentication token required for MCP server access
|
||||
- No modification of files is possible
|
||||
|
||||
## Next Steps
|
||||
|
||||
After successful testing:
|
||||
1. Deploy to production environment
|
||||
2. Configure proper authentication
|
||||
3. Set up monitoring for extraction requests
|
||||
4. Document available node types for users
|
||||
@@ -1,175 +0,0 @@
|
||||
# Quick Deployment Guide for n8ndocumentation.aiservices.pl
|
||||
|
||||
This guide walks through deploying the n8n Documentation MCP Server to a VM.
|
||||
|
||||
## Prerequisites
|
||||
|
||||
- Ubuntu 20.04+ VM with root access
|
||||
- Domain pointing to your VM (e.g., n8ndocumentation.aiservices.pl)
|
||||
- Node.js 18+ installed on your local machine
|
||||
- Git installed locally
|
||||
|
||||
## Step 1: Prepare Local Environment
|
||||
|
||||
```bash
|
||||
# Clone the repository
|
||||
git clone https://github.com/yourusername/n8n-mcp.git
|
||||
cd n8n-mcp
|
||||
|
||||
# Install dependencies
|
||||
npm install
|
||||
|
||||
# Copy and configure environment
|
||||
cp .env.example .env
|
||||
```
|
||||
|
||||
## Step 2: Configure for Production
|
||||
|
||||
Edit `.env` file:
|
||||
|
||||
```bash
|
||||
# Change these values:
|
||||
NODE_ENV=production
|
||||
MCP_DOMAIN=n8ndocumentation.aiservices.pl
|
||||
MCP_CORS=true
|
||||
|
||||
# Generate and set auth token:
|
||||
openssl rand -hex 32
|
||||
# Copy the output and set:
|
||||
MCP_AUTH_TOKEN=your-generated-token-here
|
||||
```
|
||||
|
||||
## Step 3: Build and Deploy
|
||||
|
||||
```bash
|
||||
# Build the project
|
||||
npm run build
|
||||
|
||||
# Run the deployment script
|
||||
./scripts/deploy-to-vm.sh
|
||||
```
|
||||
|
||||
If you don't have SSH key authentication set up, you'll be prompted for the server password.
|
||||
|
||||
## Step 4: Server Setup (First Time Only)
|
||||
|
||||
SSH into your server and run:
|
||||
|
||||
```bash
|
||||
# Install Node.js 18+
|
||||
curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -
|
||||
sudo apt install -y nodejs
|
||||
|
||||
# Install PM2
|
||||
sudo npm install -g pm2
|
||||
|
||||
# Install Nginx
|
||||
sudo apt install -y nginx
|
||||
|
||||
# Copy Nginx configuration
|
||||
sudo cp /opt/n8n-mcp/scripts/nginx-n8n-mcp.conf /etc/nginx/sites-available/n8n-mcp
|
||||
sudo ln -s /etc/nginx/sites-available/n8n-mcp /etc/nginx/sites-enabled/
|
||||
sudo nginx -t
|
||||
sudo systemctl reload nginx
|
||||
|
||||
# Get SSL certificate
|
||||
sudo apt install -y certbot python3-certbot-nginx
|
||||
sudo certbot --nginx -d n8ndocumentation.aiservices.pl
|
||||
|
||||
# Setup PM2 startup
|
||||
pm2 startup
|
||||
# Follow the instructions it provides
|
||||
```
|
||||
|
||||
## Step 5: Configure Claude Desktop
|
||||
|
||||
Add to your Claude Desktop configuration:
|
||||
|
||||
```json
|
||||
{
|
||||
"mcpServers": {
|
||||
"n8n-nodes-remote": {
|
||||
"command": "npx",
|
||||
"args": [
|
||||
"-y",
|
||||
"@modelcontextprotocol/client-http",
|
||||
"https://n8ndocumentation.aiservices.pl/mcp"
|
||||
],
|
||||
"env": {
|
||||
"MCP_AUTH_TOKEN": "your-auth-token-from-env-file"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Verify Deployment
|
||||
|
||||
Test the endpoints:
|
||||
|
||||
```bash
|
||||
# Health check
|
||||
curl https://n8ndocumentation.aiservices.pl/health
|
||||
|
||||
# Statistics (public)
|
||||
curl https://n8ndocumentation.aiservices.pl/stats
|
||||
|
||||
# MCP endpoint (requires auth)
|
||||
curl -X POST https://n8ndocumentation.aiservices.pl/mcp \
|
||||
-H "Content-Type: application/json" \
|
||||
-H "Authorization: Bearer your-auth-token" \
|
||||
-d '{"jsonrpc": "2.0", "method": "tools/list", "id": 1}'
|
||||
```
|
||||
|
||||
## Management Commands
|
||||
|
||||
On the server:
|
||||
|
||||
```bash
|
||||
# View logs
|
||||
pm2 logs n8n-docs-mcp
|
||||
|
||||
# Restart service
|
||||
pm2 restart n8n-docs-mcp
|
||||
|
||||
# View status
|
||||
pm2 status
|
||||
|
||||
# Rebuild database
|
||||
cd /opt/n8n-mcp
|
||||
npm run db:rebuild:v2
|
||||
pm2 restart n8n-docs-mcp
|
||||
```
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Service won't start
|
||||
```bash
|
||||
# Check logs
|
||||
pm2 logs n8n-docs-mcp --lines 50
|
||||
|
||||
# Check if port is in use
|
||||
sudo lsof -i :3000
|
||||
```
|
||||
|
||||
### SSL issues
|
||||
```bash
|
||||
# Renew certificate
|
||||
sudo certbot renew
|
||||
```
|
||||
|
||||
### Database issues
|
||||
```bash
|
||||
# Rebuild database
|
||||
cd /opt/n8n-mcp
|
||||
rm data/nodes-v2.db
|
||||
npm run db:rebuild:v2
|
||||
```
|
||||
|
||||
## Security Notes
|
||||
|
||||
1. Keep your `MCP_AUTH_TOKEN` secret
|
||||
2. Regularly update dependencies: `npm update`
|
||||
3. Monitor logs for suspicious activity
|
||||
4. Use fail2ban to prevent brute force attacks
|
||||
5. Keep your VM updated: `sudo apt update && sudo apt upgrade`
|
||||
@@ -1,133 +0,0 @@
|
||||
# Enhanced Documentation Parser for n8n-MCP
|
||||
|
||||
## Overview
|
||||
|
||||
We have enhanced the markdown parser in DocumentationFetcher to extract rich, structured content from n8n documentation. This gives AI agents a deeper understanding of n8n nodes, their operations, API mappings, and usage patterns.
|
||||
|
||||
## Key Features Implemented
|
||||
|
||||
### 1. Enhanced Documentation Structure
|
||||
|
||||
The `EnhancedDocumentationFetcher` class extracts and structures documentation into:
|
||||
|
||||
```typescript
|
||||
interface EnhancedNodeDocumentation {
|
||||
markdown: string; // Raw markdown content
|
||||
url: string; // Documentation URL
|
||||
title?: string; // Node title
|
||||
description?: string; // Node description
|
||||
operations?: OperationInfo[]; // Structured operations
|
||||
apiMethods?: ApiMethodMapping[]; // API endpoint mappings
|
||||
examples?: CodeExample[]; // Code examples
|
||||
templates?: TemplateInfo[]; // Template references
|
||||
relatedResources?: RelatedResource[]; // Related docs
|
||||
requiredScopes?: string[]; // OAuth scopes
|
||||
metadata?: DocumentationMetadata; // Frontmatter data
|
||||
}
|
||||
```
|
||||
|
||||
### 2. Operations Extraction
|
||||
|
||||
The parser correctly identifies and extracts hierarchical operations:
|
||||
|
||||
- **Resource Level**: e.g., "Channel", "Message", "User"
|
||||
- **Operation Level**: e.g., "Archive", "Send", "Get"
|
||||
- **Descriptions**: Detailed operation descriptions
|
||||
|
||||
Example from Slack node:
|
||||
- Channel.Archive: "a channel"
|
||||
- Message.Send: "a message"
|
||||
- User.Get: "information about a user"
|
||||
|
||||
### 3. API Method Mapping
|
||||
|
||||
Extracts mappings between n8n operations and actual API endpoints from markdown tables:
|
||||
|
||||
```
|
||||
Channel.Archive → conversations.archive (https://api.slack.com/methods/conversations.archive)
|
||||
Message.Send → chat.postMessage (https://api.slack.com/methods/chat.postMessage)
|
||||
```
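
How a single table row might be turned into such a mapping is sketched below; the pipe-delimited row format and the `ApiMethodMapping` shape are assumptions for illustration and may not match the actual parser in `enhanced-documentation-fetcher.ts`.

```typescript
interface ApiMethodMapping {
  resource: string;   // e.g. "Channel"
  operation: string;  // e.g. "Archive"
  apiMethod: string;  // e.g. "conversations.archive"
  apiUrl?: string;    // e.g. "https://api.slack.com/methods/conversations.archive"
}

// Assumes rows like:
// "| Channel | Archive | [conversations.archive](https://api.slack.com/methods/conversations.archive) |"
// (header and "---" separator rows should be filtered out before calling this).
function parseApiMappingRow(row: string): ApiMethodMapping | null {
  const cells = row.split('|').map((c) => c.trim()).filter(Boolean);
  if (cells.length < 3) return null;

  const [resource, operation, methodCell] = cells;
  // Markdown link: [label](url); fall back to plain text if no link is present.
  const link = methodCell.match(/\[([^\]]+)\]\(([^)]+)\)/);

  return {
    resource,
    operation,
    apiMethod: link ? link[1] : methodCell,
    apiUrl: link ? link[2] : undefined,
  };
}

// Example: yields { resource: 'Channel', operation: 'Archive', apiMethod: 'conversations.archive', apiUrl: 'https://api.slack.com/methods/conversations.archive' }
console.log(parseApiMappingRow(
  '| Channel | Archive | [conversations.archive](https://api.slack.com/methods/conversations.archive) |'
));
```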
|
||||
|
||||
### 4. Enhanced Database Schema
|
||||
|
||||
Created a new schema to store the rich documentation:
|
||||
|
||||
- `nodes` table: Extended with documentation fields
|
||||
- `node_operations`: Stores all operations for each node
|
||||
- `node_api_methods`: Maps operations to API endpoints
|
||||
- `node_examples`: Stores code examples
|
||||
- `node_resources`: Related documentation links
|
||||
- `node_scopes`: Required OAuth scopes
|
||||
|
||||
### 5. Full-Text Search Enhancement
|
||||
|
||||
The FTS index now includes (see the query sketch after this list):
|
||||
- Documentation title and description
|
||||
- Operations and their descriptions
|
||||
- API method names
|
||||
- Full markdown content
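
As a rough illustration, the snippet below builds and queries a hypothetical FTS5 index with better-sqlite3. The table and column names here are assumptions; the real definition lives in `enhanced-schema.sql`.

```typescript
import Database from 'better-sqlite3';

const db = new Database('./data/nodes-v2.db'); // path assumed

// Hypothetical FTS5 index covering the fields listed above.
db.exec(`
  CREATE VIRTUAL TABLE IF NOT EXISTS nodes_fts USING fts5(
    node_type,
    title,
    description,
    operations,
    api_methods,
    markdown
  );
`);

// Query the index: rank matches for "archive channel" across all indexed fields.
const results = db
  .prepare(`SELECT node_type FROM nodes_fts WHERE nodes_fts MATCH ? ORDER BY rank LIMIT 10`)
  .all('archive channel');

console.log(results);
```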
|
||||
|
||||
## Usage Examples
|
||||
|
||||
### Basic Usage
|
||||
|
||||
```javascript
|
||||
const fetcher = new EnhancedDocumentationFetcher();
|
||||
const doc = await fetcher.getEnhancedNodeDocumentation('n8n-nodes-base.slack');
|
||||
|
||||
// Access structured data
|
||||
console.log(`Operations: ${doc.operations.length}`);
|
||||
console.log(`API Methods: ${doc.apiMethods.length}`);
|
||||
```
|
||||
|
||||
### With Database Storage
|
||||
|
||||
```javascript
|
||||
const storage = new EnhancedSQLiteStorageService();
|
||||
const nodeInfo = await extractor.extractNodeSource('n8n-nodes-base.slack');
|
||||
const storedNode = await storage.storeNodeWithDocumentation(nodeInfo);
|
||||
|
||||
// Access counts
|
||||
console.log(`Stored ${storedNode.operationCount} operations`);
|
||||
console.log(`Stored ${storedNode.apiMethodCount} API methods`);
|
||||
```
|
||||
|
||||
## Benefits for AI Agents
|
||||
|
||||
1. **Comprehensive Understanding**: AI agents can now understand not just what a node does, but exactly which operations are available and how they map to API endpoints.
|
||||
|
||||
2. **Better Search**: Enhanced FTS allows searching across operations, descriptions, and documentation content.
|
||||
|
||||
3. **Structured Data**: Operations and API methods are stored as structured data, making it easier for AI to reason about node capabilities.
|
||||
|
||||
4. **Rich Context**: Related resources, examples, and metadata provide additional context for better AI responses.
|
||||
|
||||
## Implementation Files
|
||||
|
||||
- `/src/utils/enhanced-documentation-fetcher.ts`: Main parser implementation
|
||||
- `/src/services/enhanced-sqlite-storage-service.ts`: Database storage with rich schema
|
||||
- `/src/db/enhanced-schema.sql`: Enhanced database schema
|
||||
- `/tests/demo-enhanced-documentation.js`: Working demonstration
|
||||
|
||||
## Future Enhancements
|
||||
|
||||
1. **Example Extraction**: Improve code example extraction from documentation
|
||||
2. **Parameter Parsing**: Extract operation parameters and their types
|
||||
3. **Credential Requirements**: Parse specific credential field requirements
|
||||
4. **Version Tracking**: Track documentation versions and changes
|
||||
5. **Caching**: Implement smart caching for documentation fetches
|
||||
|
||||
## Testing
|
||||
|
||||
Run the demo to see the enhanced parser in action:
|
||||
|
||||
```bash
|
||||
npm run build
|
||||
node tests/demo-enhanced-documentation.js
|
||||
```
|
||||
|
||||
This will show:
|
||||
- Extraction of 40+ operations from Slack node
|
||||
- API method mappings with URLs
|
||||
- Resource grouping and organization
|
||||
- Related documentation links
|
||||
@@ -1,232 +0,0 @@
|
||||
# Production Deployment Guide
|
||||
|
||||
This guide covers deploying the n8n Documentation MCP Server in production environments.
|
||||
|
||||
## Overview
|
||||
|
||||
The n8n Documentation MCP Server provides node documentation and source code to AI assistants. It can be deployed:
|
||||
- **Locally** - Using stdio transport for Claude Desktop on the same machine
|
||||
- **Remotely** - Using HTTP transport for access over the internet
|
||||
|
||||
For remote deployment with full VM setup instructions, see [REMOTE_DEPLOYMENT.md](./REMOTE_DEPLOYMENT.md).
|
||||
|
||||
## Local Production Deployment
|
||||
|
||||
### Prerequisites
|
||||
|
||||
- Node.js 18+ installed
|
||||
- Git installed
|
||||
- 500MB available disk space
|
||||
|
||||
### Quick Start
|
||||
|
||||
1. **Clone and setup**
|
||||
```bash
|
||||
git clone https://github.com/yourusername/n8n-mcp.git
|
||||
cd n8n-mcp
|
||||
npm install
|
||||
npm run build
|
||||
```
|
||||
|
||||
2. **Initialize database**
|
||||
```bash
|
||||
npm run db:rebuild:v2
|
||||
```
|
||||
|
||||
3. **Configure Claude Desktop**
|
||||
Edit Claude Desktop config (see README.md for paths):
|
||||
```json
|
||||
{
|
||||
"mcpServers": {
|
||||
"n8n-nodes": {
|
||||
"command": "node",
|
||||
"args": ["/absolute/path/to/n8n-mcp/dist/index-v2.js"],
|
||||
"env": {
|
||||
"NODE_DB_PATH": "/absolute/path/to/n8n-mcp/data/nodes-v2.db"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Docker Deployment
|
||||
|
||||
### Using Docker Compose
|
||||
|
||||
1. **Create docker-compose.yml**
|
||||
```yaml
|
||||
version: '3.8'
|
||||
services:
|
||||
n8n-docs-mcp:
|
||||
build: .
|
||||
volumes:
|
||||
- ./data:/app/data
|
||||
environment:
|
||||
- NODE_ENV=production
|
||||
- NODE_DB_PATH=/app/data/nodes-v2.db
|
||||
command: node dist/index-v2.js
|
||||
```
|
||||
|
||||
2. **Build and run**
|
||||
```bash
|
||||
docker-compose up -d
|
||||
```
|
||||
|
||||
### Using Dockerfile
|
||||
|
||||
```dockerfile
|
||||
FROM node:18-alpine
|
||||
|
||||
WORKDIR /app
|
||||
|
||||
# Copy package files
|
||||
COPY package*.json ./
|
||||
RUN npm ci --only=production
|
||||
|
||||
# Copy built files
|
||||
COPY dist/ ./dist/
|
||||
COPY data/ ./data/
|
||||
|
||||
# Set environment
|
||||
ENV NODE_ENV=production
|
||||
ENV NODE_DB_PATH=/app/data/nodes-v2.db
|
||||
|
||||
# Run the server
|
||||
CMD ["node", "dist/index-v2.js"]
|
||||
```
|
||||
|
||||
## Database Management
|
||||
|
||||
### Automatic Rebuilds
|
||||
|
||||
Schedule regular database updates to pick up the latest node documentation:
|
||||
|
||||
```bash
|
||||
# Add to crontab
|
||||
0 2 * * * cd /path/to/n8n-mcp && npm run db:rebuild:v2
|
||||
```
|
||||
|
||||
### Manual Rebuild
|
||||
|
||||
```bash
|
||||
npm run db:rebuild:v2
|
||||
```
|
||||
|
||||
### Database Location
|
||||
|
||||
The SQLite database is stored at: `data/nodes-v2.db`
|
||||
|
||||
### Backup
|
||||
|
||||
```bash
|
||||
# Simple backup
|
||||
cp data/nodes-v2.db data/nodes-v2.db.backup
|
||||
|
||||
# Timestamped backup
|
||||
cp data/nodes-v2.db "data/nodes-v2-$(date +%Y%m%d-%H%M%S).db"
|
||||
```
|
||||
|
||||
## Monitoring
|
||||
|
||||
### Database Statistics
|
||||
|
||||
Check the database status:
|
||||
|
||||
```bash
|
||||
# Using SQLite directly
|
||||
sqlite3 data/nodes-v2.db "SELECT COUNT(*) as total_nodes FROM nodes;"
|
||||
|
||||
# Using the MCP tool (in Claude)
|
||||
# "Get database statistics for n8n nodes"
|
||||
```
|
||||
|
||||
### Logs
|
||||
|
||||
For local deployment:
|
||||
```bash
|
||||
# Run with logging
|
||||
NODE_ENV=production node dist/index-v2.js 2>&1 | tee app.log
|
||||
```
|
||||
|
||||
## Performance Optimization
|
||||
|
||||
### SQLite Optimization
|
||||
|
||||
The database uses these optimizations by default (see the sketch after this list):
|
||||
- WAL mode for better concurrency
|
||||
- Memory-mapped I/O
|
||||
- Full-text search indexes
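
A minimal sketch of how such settings are typically applied with better-sqlite3 is shown below; the exact pragma values used by the project may differ, and the 256 MB mmap size is illustrative.

```typescript
import Database from 'better-sqlite3';

const db = new Database('./data/nodes-v2.db');

// Write-Ahead Logging lets readers proceed while a write is in progress.
db.pragma('journal_mode = WAL');

// Memory-map part of the database file for faster reads (value illustrative: 256 MB).
db.pragma('mmap_size = 268435456');

// Durability/speed trade-off commonly paired with WAL.
db.pragma('synchronous = NORMAL');
```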
|
||||
|
||||
### System Requirements
|
||||
|
||||
- **Minimum**: 256MB RAM, 500MB disk
|
||||
- **Recommended**: 512MB RAM, 1GB disk
|
||||
- **CPU**: Minimal requirements (mostly I/O bound)
|
||||
|
||||
## Security
|
||||
|
||||
### Local Deployment
|
||||
|
||||
- No network exposure (stdio only)
|
||||
- File system permissions control access
|
||||
- No authentication needed
|
||||
|
||||
### Remote Deployment
|
||||
|
||||
See [REMOTE_DEPLOYMENT.md](./REMOTE_DEPLOYMENT.md) for:
|
||||
- HTTPS configuration
|
||||
- Authentication setup
|
||||
- Firewall rules
|
||||
- Security best practices
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Database Issues
|
||||
|
||||
If the database is missing or corrupted:
|
||||
```bash
|
||||
# Rebuild from scratch
|
||||
rm data/nodes-v2.db
|
||||
npm run db:rebuild:v2
|
||||
```
|
||||
|
||||
### Memory Issues
|
||||
|
||||
If running on limited memory:
|
||||
```bash
|
||||
# Limit Node.js memory usage
|
||||
NODE_OPTIONS="--max-old-space-size=256" node dist/index-v2.js
|
||||
```
|
||||
|
||||
### Permission Issues
|
||||
|
||||
Ensure proper file permissions:
|
||||
```bash
|
||||
chmod 644 data/nodes-v2.db
|
||||
chmod 755 data/
|
||||
```
|
||||
|
||||
## Updates
|
||||
|
||||
To update to the latest version:
|
||||
|
||||
```bash
|
||||
# Pull latest code
|
||||
git pull
|
||||
|
||||
# Install dependencies
|
||||
npm install
|
||||
|
||||
# Rebuild
|
||||
npm run build
|
||||
|
||||
# Rebuild database
|
||||
npm run db:rebuild:v2
|
||||
```
|
||||
|
||||
## Support
|
||||
|
||||
For issues and questions:
|
||||
- GitHub Issues: https://github.com/yourusername/n8n-mcp/issues
|
||||
- Check logs for error messages
|
||||
- Verify database integrity
|
||||
@@ -1,434 +0,0 @@
|
||||
# Remote Deployment Guide
|
||||
|
||||
This guide explains how to deploy the n8n Documentation MCP Server to a remote VM (such as Hetzner) and connect to it from Claude Desktop.
|
||||
|
||||
**Quick Start**: For a streamlined deployment to n8ndocumentation.aiservices.pl, see [DEPLOYMENT_QUICKSTART.md](./DEPLOYMENT_QUICKSTART.md).
|
||||
|
||||
## Overview
|
||||
|
||||
The n8n Documentation MCP Server can be deployed as a remote HTTP service, allowing Claude Desktop to access n8n node documentation over the internet. This is useful for:
|
||||
|
||||
- Centralized documentation serving for teams
|
||||
- Accessing documentation without local n8n installation
|
||||
- Cloud-based AI development workflows
|
||||
|
||||
## Architecture
|
||||
|
||||
```
|
||||
Claude Desktop → Internet → MCP Server (HTTPS) → SQLite Database
|
||||
↓
|
||||
n8n Documentation
|
||||
```
|
||||
|
||||
## Prerequisites
|
||||
|
||||
- A VM with Ubuntu 20.04+ or similar Linux distribution
|
||||
- Node.js 18+ installed
|
||||
- A domain name (e.g., `n8ndocumentation.aiservices.pl`)
|
||||
- SSL certificate (Let's Encrypt recommended)
|
||||
- Basic knowledge of Linux server administration
|
||||
|
||||
## Deployment Steps
|
||||
|
||||
### 1. Server Setup
|
||||
|
||||
SSH into your VM and prepare the environment:
|
||||
|
||||
```bash
|
||||
# Update system packages
|
||||
sudo apt update && sudo apt upgrade -y
|
||||
|
||||
# Install Node.js 18+ (if not already installed)
|
||||
curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -
|
||||
sudo apt install -y nodejs
|
||||
|
||||
# Install git and build essentials
|
||||
sudo apt install -y git build-essential
|
||||
|
||||
# Install PM2 for process management
|
||||
sudo npm install -g pm2
|
||||
|
||||
# Create application directory
|
||||
sudo mkdir -p /opt/n8n-mcp
|
||||
sudo chown $USER:$USER /opt/n8n-mcp
|
||||
```
|
||||
|
||||
### 2. Clone and Build
|
||||
|
||||
```bash
|
||||
cd /opt
|
||||
git clone https://github.com/yourusername/n8n-mcp.git
|
||||
cd n8n-mcp
|
||||
|
||||
# Install dependencies
|
||||
npm install
|
||||
|
||||
# Build the project
|
||||
npm run build
|
||||
|
||||
# Initialize database
|
||||
npm run db:rebuild:v2
|
||||
```
|
||||
|
||||
### 3. Configure Environment
|
||||
|
||||
Create the production environment file:
|
||||
|
||||
```bash
|
||||
cp .env.example .env
|
||||
nano .env
|
||||
```
|
||||
|
||||
Configure with your domain and security settings:
|
||||
|
||||
```env
|
||||
# Remote Server Configuration
|
||||
MCP_PORT=3000
|
||||
MCP_HOST=0.0.0.0
|
||||
MCP_DOMAIN=n8ndocumentation.aiservices.pl
|
||||
|
||||
# Authentication - REQUIRED for production
|
||||
# Generate secure token: openssl rand -hex 32
|
||||
MCP_AUTH_TOKEN=your-generated-secure-token-here
|
||||
|
||||
# Enable CORS for browser access
|
||||
MCP_CORS=true
|
||||
|
||||
# Database path
|
||||
NODE_DB_PATH=/opt/n8n-mcp/data/nodes-v2.db
|
||||
|
||||
# Production environment
|
||||
NODE_ENV=production
|
||||
MCP_LOG_LEVEL=info
|
||||
```
|
||||
|
||||
### 4. Setup SSL with Nginx
|
||||
|
||||
Install and configure Nginx as a reverse proxy with SSL:
|
||||
|
||||
```bash
|
||||
# Install Nginx and Certbot
|
||||
sudo apt install -y nginx certbot python3-certbot-nginx
|
||||
|
||||
# Create Nginx configuration
|
||||
sudo nano /etc/nginx/sites-available/n8n-mcp
|
||||
```
|
||||
|
||||
Add the following configuration:
|
||||
|
||||
```nginx
|
||||
server {
|
||||
listen 80;
|
||||
server_name n8ndocumentation.aiservices.pl;
|
||||
|
||||
location / {
|
||||
return 301 https://$server_name$request_uri;
|
||||
}
|
||||
}
|
||||
|
||||
server {
|
||||
listen 443 ssl;
|
||||
server_name n8ndocumentation.aiservices.pl;
|
||||
|
||||
# SSL will be configured by Certbot
|
||||
|
||||
# Security headers
|
||||
add_header X-Content-Type-Options nosniff;
|
||||
add_header X-Frame-Options DENY;
|
||||
add_header X-XSS-Protection "1; mode=block";
|
||||
|
||||
# Proxy settings
|
||||
location / {
|
||||
proxy_pass http://localhost:3000;
|
||||
proxy_http_version 1.1;
|
||||
proxy_set_header Upgrade $http_upgrade;
|
||||
proxy_set_header Connection 'upgrade';
|
||||
proxy_set_header Host $host;
|
||||
proxy_set_header X-Real-IP $remote_addr;
|
||||
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
|
||||
proxy_set_header X-Forwarded-Proto $scheme;
|
||||
proxy_cache_bypass $http_upgrade;
|
||||
|
||||
# Increase timeouts for MCP operations
|
||||
proxy_connect_timeout 60s;
|
||||
proxy_send_timeout 60s;
|
||||
proxy_read_timeout 60s;
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
Enable the site and obtain SSL certificate:
|
||||
|
||||
```bash
|
||||
# Enable the site
|
||||
sudo ln -s /etc/nginx/sites-available/n8n-mcp /etc/nginx/sites-enabled/
|
||||
sudo nginx -t
|
||||
sudo systemctl reload nginx
|
||||
|
||||
# Obtain SSL certificate
|
||||
sudo certbot --nginx -d n8ndocumentation.aiservices.pl
|
||||
```
|
||||
|
||||
### 5. Start with PM2
|
||||
|
||||
Create PM2 ecosystem file:
|
||||
|
||||
```bash
|
||||
nano /opt/n8n-mcp/ecosystem.config.js
|
||||
```
|
||||
|
||||
```javascript
|
||||
module.exports = {
|
||||
apps: [{
|
||||
name: 'n8n-mcp',
|
||||
script: './dist/index-http.js',
|
||||
cwd: '/opt/n8n-mcp',
|
||||
instances: 1,
|
||||
autorestart: true,
|
||||
watch: false,
|
||||
max_memory_restart: '1G',
|
||||
env: {
|
||||
NODE_ENV: 'production'
|
||||
},
|
||||
error_file: '/opt/n8n-mcp/logs/error.log',
|
||||
out_file: '/opt/n8n-mcp/logs/out.log',
|
||||
log_file: '/opt/n8n-mcp/logs/combined.log',
|
||||
time: true
|
||||
}]
|
||||
};
|
||||
```
|
||||
|
||||
Start the application:
|
||||
|
||||
```bash
|
||||
# Create logs directory
|
||||
mkdir -p /opt/n8n-mcp/logs
|
||||
|
||||
# Start with PM2
|
||||
pm2 start ecosystem.config.js
|
||||
|
||||
# Save PM2 configuration
|
||||
pm2 save
|
||||
|
||||
# Setup PM2 to start on boot
|
||||
pm2 startup
|
||||
```
|
||||
|
||||
### 6. Configure Firewall
|
||||
|
||||
```bash
|
||||
# Allow SSH, HTTP, and HTTPS
|
||||
sudo ufw allow 22/tcp
|
||||
sudo ufw allow 80/tcp
|
||||
sudo ufw allow 443/tcp
|
||||
sudo ufw enable
|
||||
```
|
||||
|
||||
## Claude Desktop Configuration
|
||||
|
||||
### 1. Get your auth token
|
||||
|
||||
From your server, get the configured auth token:
|
||||
|
||||
```bash
|
||||
grep MCP_AUTH_TOKEN /opt/n8n-mcp/.env
|
||||
```
|
||||
|
||||
### 2. Configure Claude Desktop
|
||||
|
||||
Edit your Claude Desktop configuration:
|
||||
|
||||
**macOS**: `~/Library/Application Support/Claude/claude_desktop_config.json`
|
||||
**Windows**: `%APPDATA%\Claude\claude_desktop_config.json`
|
||||
**Linux**: `~/.config/Claude/claude_desktop_config.json`
|
||||
|
||||
Add the remote MCP server:
|
||||
|
||||
```json
|
||||
{
|
||||
"mcpServers": {
|
||||
"n8n-nodes-remote": {
|
||||
"command": "npx",
|
||||
"args": [
|
||||
"-y",
|
||||
"@modelcontextprotocol/client-http",
|
||||
"https://n8ndocumentation.aiservices.pl/mcp"
|
||||
],
|
||||
"env": {
|
||||
"MCP_AUTH_TOKEN": "your-auth-token-here"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### 3. Restart Claude Desktop
|
||||
|
||||
Quit and restart Claude Desktop to load the new configuration.
|
||||
|
||||
## Server Management
|
||||
|
||||
### Viewing Logs
|
||||
|
||||
```bash
|
||||
# View real-time logs
|
||||
pm2 logs n8n-mcp
|
||||
|
||||
# View error logs
|
||||
tail -f /opt/n8n-mcp/logs/error.log
|
||||
|
||||
# View Nginx logs
|
||||
sudo tail -f /var/log/nginx/access.log
|
||||
sudo tail -f /var/log/nginx/error.log
|
||||
```
|
||||
|
||||
### Rebuilding Database
|
||||
|
||||
To update the node documentation database:
|
||||
|
||||
```bash
|
||||
cd /opt/n8n-mcp
|
||||
|
||||
# Stop the server
|
||||
pm2 stop n8n-mcp
|
||||
|
||||
# Rebuild database
|
||||
npm run db:rebuild:v2
|
||||
|
||||
# Restart server
|
||||
pm2 restart n8n-mcp
|
||||
```
|
||||
|
||||
### Updating the Server
|
||||
|
||||
```bash
|
||||
cd /opt/n8n-mcp
|
||||
|
||||
# Pull latest changes
|
||||
git pull
|
||||
|
||||
# Install dependencies
|
||||
npm install
|
||||
|
||||
# Build
|
||||
npm run build
|
||||
|
||||
# Restart
|
||||
pm2 restart n8n-mcp
|
||||
```
|
||||
|
||||
## Security Considerations
|
||||
|
||||
1. **Authentication Token**: Always use a strong, randomly generated token
|
||||
```bash
|
||||
openssl rand -hex 32
|
||||
```
|
||||
|
||||
2. **HTTPS**: Always use HTTPS in production. The setup above includes automatic SSL with Let's Encrypt.
|
||||
|
||||
3. **Firewall**: Only open necessary ports (22, 80, 443)
|
||||
|
||||
4. **Updates**: Keep the system and Node.js updated regularly
|
||||
|
||||
5. **Monitoring**: Set up monitoring for the service:
|
||||
```bash
|
||||
# PM2 monitoring
|
||||
pm2 install pm2-logrotate
|
||||
pm2 set pm2-logrotate:max_size 10M
|
||||
pm2 set pm2-logrotate:retain 7
|
||||
```
|
||||
|
||||
## API Endpoints
|
||||
|
||||
Once deployed, your server provides the following endpoints (a usage sketch follows the list):
|
||||
|
||||
- `GET https://n8ndocumentation.aiservices.pl/` - Server information
|
||||
- `GET https://n8ndocumentation.aiservices.pl/health` - Health check
|
||||
- `GET https://n8ndocumentation.aiservices.pl/stats` - Database statistics
|
||||
- `POST https://n8ndocumentation.aiservices.pl/mcp` - MCP protocol endpoint
|
||||
- `POST https://n8ndocumentation.aiservices.pl/rebuild` - Rebuild database (requires auth)
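
As a small TypeScript sketch of calling these endpoints programmatically (Node 18+ ships a global `fetch`); the token value is a placeholder read from the environment.

```typescript
const BASE = 'https://n8ndocumentation.aiservices.pl';
const TOKEN = process.env.MCP_AUTH_TOKEN ?? 'your-auth-token-here';

async function main() {
  // Public endpoints
  const health = await fetch(`${BASE}/health`).then((r) => r.json());
  console.log('health:', health);

  // Authenticated MCP call: list the available tools
  const res = await fetch(`${BASE}/mcp`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${TOKEN}`,
    },
    body: JSON.stringify({ jsonrpc: '2.0', method: 'tools/list', id: 1 }),
  });
  console.log('tools/list:', await res.json());
}

main().catch(console.error);
```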
|
||||
|
||||
## Troubleshooting
|
||||
|
||||
### Connection Issues
|
||||
|
||||
1. Check if the server is running:
|
||||
```bash
|
||||
pm2 status
|
||||
curl https://n8ndocumentation.aiservices.pl/health
|
||||
```
|
||||
|
||||
2. Verify Nginx is working:
|
||||
```bash
|
||||
sudo nginx -t
|
||||
sudo systemctl status nginx
|
||||
```
|
||||
|
||||
3. Check firewall:
|
||||
```bash
|
||||
sudo ufw status
|
||||
```
|
||||
|
||||
### Authentication Failures
|
||||
|
||||
1. Verify the token matches in both `.env` and Claude config
|
||||
2. Check server logs for auth errors:
|
||||
```bash
|
||||
pm2 logs n8n-mcp --lines 100
|
||||
```
|
||||
|
||||
### Database Issues
|
||||
|
||||
1. Check database exists:
|
||||
```bash
|
||||
ls -la /opt/n8n-mcp/data/nodes-v2.db
|
||||
```
|
||||
|
||||
2. Rebuild if necessary:
|
||||
```bash
|
||||
cd /opt/n8n-mcp
|
||||
npm run db:rebuild:v2
|
||||
```
|
||||
|
||||
## Monitoring and Maintenance
|
||||
|
||||
### Health Monitoring
|
||||
|
||||
Set up external monitoring (e.g., UptimeRobot) to check:
|
||||
- `https://n8ndocumentation.aiservices.pl/health`
|
||||
|
||||
### Backup
|
||||
|
||||
Regular backups of the database:
|
||||
|
||||
```bash
|
||||
# Create backup script
|
||||
cat > /opt/n8n-mcp/backup.sh << 'EOF'
|
||||
#!/bin/bash
|
||||
BACKUP_DIR="/opt/n8n-mcp/backups"
|
||||
mkdir -p $BACKUP_DIR
|
||||
cp /opt/n8n-mcp/data/nodes-v2.db "$BACKUP_DIR/nodes-v2-$(date +%Y%m%d-%H%M%S).db"
|
||||
# Keep only last 7 backups
|
||||
find $BACKUP_DIR -name "nodes-v2-*.db" -mtime +7 -delete
|
||||
EOF
|
||||
|
||||
chmod +x /opt/n8n-mcp/backup.sh
|
||||
|
||||
# Add to crontab (daily at 2 AM)
|
||||
(crontab -l 2>/dev/null; echo "0 2 * * * /opt/n8n-mcp/backup.sh") | crontab -
|
||||
```
|
||||
|
||||
## Cost Optimization
|
||||
|
||||
For a small Hetzner VM (CX11 - 1 vCPU, 2GB RAM):
|
||||
- Monthly cost: ~€4-5
|
||||
- Sufficient for serving documentation to multiple Claude instances
|
||||
- Can handle hundreds of concurrent connections
|
||||
|
||||
## Support
|
||||
|
||||
For issues specific to remote deployment:
|
||||
1. Check server logs first
|
||||
2. Verify network connectivity
|
||||
3. Ensure all dependencies are installed
|
||||
4. Check GitHub issues for similar problems
|
||||
@@ -1,81 +0,0 @@
|
||||
# Slack Documentation Fix Summary
|
||||
|
||||
## Issues Fixed
|
||||
|
||||
### 1. Documentation Fetcher Was Getting Wrong Files
|
||||
**Problem**: When searching for Slack node documentation, the fetcher was finding credential documentation instead of node documentation.
|
||||
|
||||
**Root Cause**:
|
||||
- Documentation files in n8n-docs repository are named with full node type (e.g., `n8n-nodes-base.slack.md`)
|
||||
- The fetcher was searching for just the node name (e.g., `slack.md`)
|
||||
- This caused it to find `slack.md` in the credentials folder first
|
||||
|
||||
**Fix Applied** (sketch below):
|
||||
- Updated `getNodeDocumentation()` to search for full node type first
|
||||
- Added logic to skip credential documentation files by checking:
|
||||
- If file path includes `/credentials/`
|
||||
- If content has "credentials" in title without "node documentation"
|
||||
- Fixed search order to prioritize correct documentation
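
A simplified sketch of that filtering, assuming a hypothetical `isCredentialDoc()` helper; the exact heuristics in `documentation-fetcher.ts` may differ.

```typescript
// Returns true when a candidate markdown file is credential documentation
// rather than node documentation, so the fetcher should keep searching.
function isCredentialDoc(filePath: string, content: string): boolean {
  // Files under the credentials folder of n8n-docs are credential docs.
  if (filePath.includes('/credentials/')) return true;

  // The first markdown heading is used as the document title.
  const titleMatch = content.match(/^#\s+(.+)$/m);
  const title = (titleMatch?.[1] ?? '').toLowerCase();

  // "Slack credentials"-style titles without "node documentation" are skipped.
  return title.includes('credentials') && !content.toLowerCase().includes('node documentation');
}

// Example: prefer n8n-nodes-base.slack.md over credentials/slack.md
console.log(isCredentialDoc('builtin/credentials/slack.md', '# Slack credentials\n...'));               // true
console.log(isCredentialDoc('builtin/app-nodes/n8n-nodes-base.slack.md', '# Slack node documentation')); // false
```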
|
||||
|
||||
### 2. Node Source Extractor Case Sensitivity
|
||||
**Problem**: Slack node source code wasn't found because the directory is capitalized (`Slack/`) but search was case-sensitive.
|
||||
|
||||
**Root Cause**:
|
||||
- n8n node directories use capitalized names (e.g., `Slack/`, `If/`)
|
||||
- Extractor was searching with lowercase names from node type
|
||||
|
||||
**Fix Applied** (see the sketch after this list):
|
||||
- Added case variants to try when searching:
|
||||
- Original case
|
||||
- Capitalized first letter
|
||||
- All lowercase
|
||||
- All uppercase
|
||||
- Now properly finds nodes regardless of directory naming convention
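
A minimal sketch of the case-variant lookup; the helper names are illustrative rather than the extractor's real API.

```typescript
import { existsSync } from 'fs';
import * as path from 'path';

// Directory-name variants tried when locating a node, e.g. "slack" -> ["slack", "Slack", "SLACK"].
function caseVariants(name: string): string[] {
  const variants = [
    name,                                          // original case
    name.charAt(0).toUpperCase() + name.slice(1),  // capitalized first letter
    name.toLowerCase(),                            // all lowercase
    name.toUpperCase(),                            // all uppercase
  ];
  return [...new Set(variants)]; // drop duplicates
}

// Try each variant until a node directory exists.
function findNodeDir(nodesRoot: string, nodeName: string): string | null {
  for (const variant of caseVariants(nodeName)) {
    const candidate = path.join(nodesRoot, variant);
    if (existsSync(candidate)) return candidate;
  }
  return null;
}

// e.g. findNodeDir('.../n8n-nodes-base/dist/nodes', 'slack') resolves to the 'Slack' directory
```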
|
||||
|
||||
### 3. Missing Information in Database
|
||||
**Problem**: Node definitions weren't being properly parsed from compiled JavaScript.
|
||||
|
||||
**Fix Applied** (illustrated after the list):
|
||||
- Improved `parseNodeDefinition()` to extract individual fields using regex
|
||||
- Added extraction for:
|
||||
- displayName
|
||||
- description
|
||||
- icon
|
||||
- category/group
|
||||
- version
|
||||
- trigger/webhook detection
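
A condensed sketch of this regex-based extraction, assuming a simplified `parseNodeDefinition()`; the real implementation handles more patterns and edge cases.

```typescript
interface ParsedNodeDefinition {
  displayName?: string;
  description?: string;
  icon?: string;
  group?: string;
  version?: number;
  isTrigger: boolean;
  isWebhook: boolean;
}

// Pull individual description fields out of compiled JavaScript source.
function parseNodeDefinition(source: string): ParsedNodeDefinition {
  // Matches fields like displayName: 'Slack' in the compiled output.
  const str = (field: string): string | undefined =>
    source.match(new RegExp(`${field}:\\s*['"\`]([^'"\`]+)['"\`]`))?.[1];

  return {
    displayName: str('displayName'),
    description: str('description'),
    icon: str('icon'),
    group: source.match(/group:\s*\[\s*['"]([^'"]+)['"]/)?.[1],
    version: Number(source.match(/version:\s*(\d+)/)?.[1]) || undefined,
    isTrigger: /trigger:\s*true/.test(source) || /polling:\s*true/.test(source),
    isWebhook: /webhooks:\s*\[/.test(source),
  };
}
```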
|
||||
|
||||
## Test Results
|
||||
|
||||
After applying fixes:
|
||||
- ✅ Slack node source code is correctly extracted
|
||||
- ✅ Slack node documentation (not credentials) is fetched
|
||||
- ✅ Documentation URL points to correct page
|
||||
- ✅ All information is properly stored in database
|
||||
|
||||
## Files Modified
|
||||
|
||||
1. `/src/utils/documentation-fetcher.ts`
|
||||
- Fixed path searching logic
|
||||
- Added credential documentation filtering
|
||||
- Improved search order
|
||||
|
||||
2. `/src/utils/node-source-extractor.ts`
|
||||
- Added case-insensitive directory searching
|
||||
- Improved path detection for different node structures
|
||||
|
||||
3. `/src/services/node-documentation-service.ts`
|
||||
- Enhanced node definition parsing
|
||||
- Better extraction of metadata from source code
|
||||
|
||||
## Verification
|
||||
|
||||
Run the test to verify the fix:
|
||||
```bash
|
||||
node tests/test-slack-fix.js
|
||||
```
|
||||
|
||||
This should show:
|
||||
- Source code found at correct location
|
||||
- Documentation is node documentation (not credentials)
|
||||
- All fields properly extracted and stored
|
||||
@@ -1,18 +0,0 @@
|
||||
{
|
||||
"mcpServers": {
|
||||
"puppeteer": {
|
||||
"command": "npx",
|
||||
"args": [
|
||||
"-y",
|
||||
"@modelcontextprotocol/server-puppeteer"
|
||||
]
|
||||
},
|
||||
"n8n-nodes": {
|
||||
"command": "node",
|
||||
"args": ["/Users/romualdczlonkowski/Pliki/n8n-mcp/n8n-mcp/dist/index-v2.js"],
|
||||
"env": {
|
||||
"NODE_DB_PATH": "/Users/romualdczlonkowski/Pliki/n8n-mcp/n8n-mcp/data/nodes-v2.db"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
docs/implementation_plan2.md (new file, 768 lines)
@@ -0,0 +1,768 @@
|
||||
# n8n-MCP Enhancement Implementation Plan v2.2
|
||||
|
||||
## Executive Summary
|
||||
|
||||
This revised plan addresses the core issues discovered during testing: empty properties/operations arrays and missing AI tool detection. We focus on fixing the data extraction and storage pipeline while maintaining the simplicity of v2.1.
|
||||
|
||||
## Key Issues Found & Solutions
|
||||
|
||||
### 1. Empty Properties/Operations Arrays
|
||||
**Problem**: The MCP service returns empty arrays for properties, operations, and credentials despite nodes having this data.
|
||||
|
||||
**Root Cause**: The parser is correctly extracting data, but either:
|
||||
- The data isn't being properly serialized to the database
|
||||
- The MCP server isn't deserializing it correctly
|
||||
- The property structure is more complex than expected
|
||||
|
||||
**Solution**: Enhanced property extraction and proper JSON handling
|
||||
|
||||
### 2. AI Tools Not Detected
|
||||
**Problem**: No nodes are flagged as AI tools despite having the `usableAsTool` property.
|
||||
|
||||
**Root Cause**: The property might be nested or named differently in the actual node classes.
|
||||
|
||||
**Solution**: Deep property search and multiple detection strategies
|
||||
|
||||
### 3. Missing Versioned Node Support
|
||||
**Problem**: Versioned nodes aren't properly handled, leading to incomplete data.
|
||||
|
||||
**Solution**: Explicit version handling for nodes like HTTPRequest and Code
|
||||
|
||||
## Updated Architecture
|
||||
|
||||
```
|
||||
n8n-mcp/
|
||||
├── src/
|
||||
│ ├── loaders/
|
||||
│ │ └── node-loader.ts # Enhanced with better error handling
|
||||
│ ├── parsers/
|
||||
│ │ ├── property-extractor.ts # NEW: Dedicated property extraction
|
||||
│ │ └── node-parser.ts # Updated parser with deep inspection
|
||||
│ ├── mappers/
|
||||
│ │ └── docs-mapper.ts # Existing (working fine)
|
||||
│ ├── database/
|
||||
│ │ └── node-repository.ts # NEW: Proper data serialization
|
||||
│ ├── scripts/
|
||||
│ │ └── rebuild.ts # Enhanced with validation
|
||||
│ └── mcp/
|
||||
│ └── server.ts # Fixed data retrieval
|
||||
└── data/
|
||||
└── nodes.db # Same schema
|
||||
```
|
||||
|
||||
## Week 1: Core Fixes
|
||||
|
||||
### Day 1-2: Property Extractor
|
||||
|
||||
**NEW File**: `src/parsers/property-extractor.ts`
|
||||
|
||||
```typescript
|
||||
export class PropertyExtractor {
|
||||
/**
|
||||
* Extract properties with proper handling of n8n's complex structures
|
||||
*/
|
||||
extractProperties(nodeClass: any): any[] {
|
||||
const properties = [];
|
||||
|
||||
// Handle versioned nodes
|
||||
if (nodeClass.nodeVersions) {
|
||||
const versions = Object.keys(nodeClass.nodeVersions);
|
||||
const latestVersion = Math.max(...versions.map(Number));
|
||||
const versionedNode = nodeClass.nodeVersions[latestVersion];
|
||||
|
||||
if (versionedNode.description?.properties) {
|
||||
return this.normalizeProperties(versionedNode.description.properties);
|
||||
}
|
||||
}
|
||||
|
||||
// Handle regular nodes
|
||||
if (nodeClass.description?.properties) {
|
||||
return this.normalizeProperties(nodeClass.description.properties);
|
||||
}
|
||||
|
||||
return properties;
|
||||
}
|
||||
|
||||
/**
|
||||
* Extract operations from both declarative and programmatic nodes
|
||||
*/
|
||||
extractOperations(nodeClass: any): any[] {
|
||||
const operations = [];
|
||||
|
||||
// Declarative nodes (with routing)
|
||||
if (nodeClass.description?.routing) {
|
||||
const routing = nodeClass.description.routing;
|
||||
|
||||
// Extract from request.resource and request.operation
|
||||
if (routing.request?.resource) {
|
||||
const resources = routing.request.resource.options || [];
|
||||
const operationOptions = routing.request.operation?.options || {};
|
||||
|
||||
resources.forEach(resource => {
|
||||
const resourceOps = operationOptions[resource.value] || [];
|
||||
resourceOps.forEach(op => {
|
||||
operations.push({
|
||||
resource: resource.value,
|
||||
operation: op.value,
|
||||
name: `${resource.name} - ${op.name}`,
|
||||
action: op.action
|
||||
});
|
||||
});
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
// Programmatic nodes - look for operation property
|
||||
const props = this.extractProperties(nodeClass);
|
||||
const operationProp = props.find(p => p.name === 'operation' || p.name === 'action');
|
||||
|
||||
if (operationProp?.options) {
|
||||
operationProp.options.forEach(op => {
|
||||
operations.push({
|
||||
operation: op.value,
|
||||
name: op.name,
|
||||
description: op.description
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
return operations;
|
||||
}
|
||||
|
||||
/**
|
||||
* Deep search for AI tool capability
|
||||
*/
|
||||
detectAIToolCapability(nodeClass: any): boolean {
|
||||
// Direct property check
|
||||
if (nodeClass.description?.usableAsTool === true) return true;
|
||||
|
||||
// Check in actions for declarative nodes
|
||||
if (nodeClass.description?.actions?.some(a => a.usableAsTool === true)) return true;
|
||||
|
||||
// Check versioned nodes
|
||||
if (nodeClass.nodeVersions) {
|
||||
for (const version of Object.values(nodeClass.nodeVersions)) {
|
||||
if ((version as any).description?.usableAsTool === true) return true;
|
||||
}
|
||||
}
|
||||
|
||||
// Check for specific AI-related properties
|
||||
const aiIndicators = ['openai', 'anthropic', 'huggingface', 'cohere', 'ai'];
|
||||
const nodeName = nodeClass.description?.name?.toLowerCase() || '';
|
||||
|
||||
return aiIndicators.some(indicator => nodeName.includes(indicator));
|
||||
}
|
||||
|
||||
/**
|
||||
* Extract credential requirements with proper structure
|
||||
*/
|
||||
extractCredentials(nodeClass: any): any[] {
|
||||
const credentials = [];
|
||||
|
||||
// Handle versioned nodes
|
||||
if (nodeClass.nodeVersions) {
|
||||
const versions = Object.keys(nodeClass.nodeVersions);
|
||||
const latestVersion = Math.max(...versions.map(Number));
|
||||
const versionedNode = nodeClass.nodeVersions[latestVersion];
|
||||
|
||||
if (versionedNode.description?.credentials) {
|
||||
return versionedNode.description.credentials;
|
||||
}
|
||||
}
|
||||
|
||||
// Regular nodes
|
||||
if (nodeClass.description?.credentials) {
|
||||
return nodeClass.description.credentials;
|
||||
}
|
||||
|
||||
return credentials;
|
||||
}
|
||||
|
||||
private normalizeProperties(properties: any[]): any[] {
|
||||
// Ensure all properties have consistent structure
|
||||
return properties.map(prop => ({
|
||||
displayName: prop.displayName,
|
||||
name: prop.name,
|
||||
type: prop.type,
|
||||
default: prop.default,
|
||||
description: prop.description,
|
||||
options: prop.options,
|
||||
required: prop.required,
|
||||
displayOptions: prop.displayOptions,
|
||||
typeOptions: prop.typeOptions,
|
||||
noDataExpression: prop.noDataExpression
|
||||
}));
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Day 3: Updated Parser
|
||||
|
||||
**Updated File**: `src/parsers/node-parser.ts`
|
||||
|
||||
```typescript
|
||||
import { PropertyExtractor } from './property-extractor';
|
||||
|
||||
export class NodeParser {
|
||||
private propertyExtractor = new PropertyExtractor();
|
||||
|
||||
parse(nodeClass: any, packageName: string): ParsedNode {
|
||||
// Get base description (handles versioned nodes)
|
||||
const description = this.getNodeDescription(nodeClass);
|
||||
|
||||
return {
|
||||
style: this.detectStyle(nodeClass),
|
||||
nodeType: this.extractNodeType(description, packageName),
|
||||
displayName: description.displayName || description.name,
|
||||
description: description.description,
|
||||
category: this.extractCategory(description),
|
||||
properties: this.propertyExtractor.extractProperties(nodeClass),
|
||||
credentials: this.propertyExtractor.extractCredentials(nodeClass),
|
||||
isAITool: this.propertyExtractor.detectAIToolCapability(nodeClass),
|
||||
isTrigger: this.detectTrigger(description),
|
||||
isWebhook: this.detectWebhook(description),
|
||||
operations: this.propertyExtractor.extractOperations(nodeClass),
|
||||
version: this.extractVersion(nodeClass),
|
||||
isVersioned: !!nodeClass.nodeVersions
|
||||
};
|
||||
}
|
||||
|
||||
private getNodeDescription(nodeClass: any): any {
|
||||
// For versioned nodes, get the latest version's description
|
||||
if (nodeClass.baseDescription) {
|
||||
return nodeClass.baseDescription;
|
||||
}
|
||||
|
||||
if (nodeClass.nodeVersions) {
|
||||
const versions = Object.keys(nodeClass.nodeVersions);
|
||||
const latestVersion = Math.max(...versions.map(Number));
|
||||
return nodeClass.nodeVersions[latestVersion].description || {};
|
||||
}
|
||||
|
||||
return nodeClass.description || {};
|
||||
}
|
||||
|
||||
private detectStyle(nodeClass: any): 'declarative' | 'programmatic' {
|
||||
const desc = this.getNodeDescription(nodeClass);
|
||||
return desc.routing ? 'declarative' : 'programmatic';
|
||||
}
|
||||
|
||||
private extractNodeType(description: any, packageName: string): string {
|
||||
// Ensure we have the full node type including package prefix
|
||||
const name = description.name;
|
||||
|
||||
if (name.includes('.')) {
|
||||
return name;
|
||||
}
|
||||
|
||||
// Add package prefix if missing
|
||||
const packagePrefix = packageName.replace('@n8n/', '').replace('n8n-', '');
|
||||
return `${packagePrefix}.${name}`;
|
||||
}
|
||||
|
||||
private extractCategory(description: any): string {
|
||||
return description.group?.[0] ||
|
||||
description.categories?.[0] ||
|
||||
description.category ||
|
||||
'misc';
|
||||
}
|
||||
|
||||
private detectTrigger(description: any): boolean {
|
||||
return description.polling === true ||
|
||||
description.trigger === true ||
|
||||
description.eventTrigger === true ||
|
||||
description.name?.toLowerCase().includes('trigger');
|
||||
}
|
||||
|
||||
private detectWebhook(description: any): boolean {
|
||||
return (description.webhooks?.length > 0) ||
|
||||
description.webhook === true ||
|
||||
description.name?.toLowerCase().includes('webhook');
|
||||
}
|
||||
|
||||
private extractVersion(nodeClass: any): string {
|
||||
if (nodeClass.baseDescription?.defaultVersion) {
|
||||
return nodeClass.baseDescription.defaultVersion.toString();
|
||||
}
|
||||
|
||||
if (nodeClass.nodeVersions) {
|
||||
const versions = Object.keys(nodeClass.nodeVersions);
|
||||
return Math.max(...versions.map(Number)).toString();
|
||||
}
|
||||
|
||||
return nodeClass.description?.version || '1';
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Day 4: Node Repository
|
||||
|
||||
**NEW File**: `src/database/node-repository.ts`
|
||||
|
||||
```typescript
|
||||
import Database from 'better-sqlite3';
|
||||
|
||||
export class NodeRepository {
|
||||
constructor(private db: Database.Database) {}
|
||||
|
||||
/**
|
||||
* Save node with proper JSON serialization
|
||||
*/
|
||||
saveNode(node: ParsedNode): void {
|
||||
const stmt = this.db.prepare(`
|
||||
INSERT OR REPLACE INTO nodes (
|
||||
node_type, package_name, display_name, description,
|
||||
category, development_style, is_ai_tool, is_trigger,
|
||||
is_webhook, is_versioned, version, documentation,
|
||||
properties_schema, operations, credentials_required
|
||||
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
|
||||
`);
|
||||
|
||||
stmt.run(
|
||||
node.nodeType,
|
||||
node.packageName,
|
||||
node.displayName,
|
||||
node.description,
|
||||
node.category,
|
||||
node.style,
|
||||
node.isAITool ? 1 : 0,
|
||||
node.isTrigger ? 1 : 0,
|
||||
node.isWebhook ? 1 : 0,
|
||||
node.isVersioned ? 1 : 0,
|
||||
node.version,
|
||||
node.documentation || null,
|
||||
JSON.stringify(node.properties, null, 2),
|
||||
JSON.stringify(node.operations, null, 2),
|
||||
JSON.stringify(node.credentials, null, 2)
|
||||
);
|
||||
}
|
||||
|
||||
/**
|
||||
* Get node with proper JSON deserialization
|
||||
*/
|
||||
getNode(nodeType: string): any {
|
||||
const row = this.db.prepare(`
|
||||
SELECT * FROM nodes WHERE node_type = ?
|
||||
`).get(nodeType);
|
||||
|
||||
if (!row) return null;
|
||||
|
||||
return {
|
||||
nodeType: row.node_type,
|
||||
displayName: row.display_name,
|
||||
description: row.description,
|
||||
category: row.category,
|
||||
developmentStyle: row.development_style,
|
||||
package: row.package_name,
|
||||
isAITool: !!row.is_ai_tool,
|
||||
isTrigger: !!row.is_trigger,
|
||||
isWebhook: !!row.is_webhook,
|
||||
isVersioned: !!row.is_versioned,
|
||||
version: row.version,
|
||||
properties: this.safeJsonParse(row.properties_schema, []),
|
||||
operations: this.safeJsonParse(row.operations, []),
|
||||
credentials: this.safeJsonParse(row.credentials_required, []),
|
||||
hasDocumentation: !!row.documentation
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Get AI tools with proper filtering
|
||||
*/
|
||||
getAITools(): any[] {
|
||||
const rows = this.db.prepare(`
|
||||
SELECT node_type, display_name, description, package_name
|
||||
FROM nodes
|
||||
WHERE is_ai_tool = 1
|
||||
ORDER BY display_name
|
||||
`).all();
|
||||
|
||||
return rows.map(row => ({
|
||||
nodeType: row.node_type,
|
||||
displayName: row.display_name,
|
||||
description: row.description,
|
||||
package: row.package_name
|
||||
}));
|
||||
}
|
||||
|
||||
private safeJsonParse(json: string, defaultValue: any): any {
|
||||
try {
|
||||
return JSON.parse(json);
|
||||
} catch {
|
||||
return defaultValue;
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Day 5: Enhanced Rebuild Script
|
||||
|
||||
**Updated File**: `src/scripts/rebuild.ts`
|
||||
|
||||
```typescript
|
||||
#!/usr/bin/env node
|
||||
import Database from 'better-sqlite3';
|
||||
import { N8nNodeLoader } from '../loaders/node-loader';
|
||||
import { NodeParser } from '../parsers/node-parser';
|
||||
import { DocsMapper } from '../mappers/docs-mapper';
|
||||
import { NodeRepository } from '../database/node-repository';
|
||||
import * as fs from 'fs';
|
||||
import * as path from 'path';
|
||||
|
||||
async function rebuild() {
|
||||
console.log('🔄 Rebuilding n8n node database...\n');
|
||||
|
||||
const db = new Database('./data/nodes.db');
|
||||
const loader = new N8nNodeLoader();
|
||||
const parser = new NodeParser();
|
||||
const mapper = new DocsMapper();
|
||||
const repository = new NodeRepository(db);
|
||||
|
||||
// Initialize database
|
||||
const schema = fs.readFileSync(path.join(__dirname, '../database/schema.sql'), 'utf8');
|
||||
db.exec(schema);
|
||||
|
||||
// Clear existing data
|
||||
db.exec('DELETE FROM nodes');
|
||||
console.log('🗑️ Cleared existing data\n');
|
||||
|
||||
// Load all nodes
|
||||
const nodes = await loader.loadAllNodes();
|
||||
console.log(`📦 Loaded ${nodes.length} nodes from packages\n`);
|
||||
|
||||
// Statistics
|
||||
const stats = {
|
||||
successful: 0,
|
||||
failed: 0,
|
||||
aiTools: 0,
|
||||
triggers: 0,
|
||||
webhooks: 0,
|
||||
withProperties: 0,
|
||||
withOperations: 0,
|
||||
withDocs: 0
|
||||
};
|
||||
|
||||
// Process each node
|
||||
for (const { packageName, nodeName, NodeClass } of nodes) {
|
||||
try {
|
||||
// Parse node
|
||||
const parsed = parser.parse(NodeClass, packageName);
|
||||
|
||||
// Validate parsed data
|
||||
if (!parsed.nodeType || !parsed.displayName) {
|
||||
throw new Error('Missing required fields');
|
||||
}
|
||||
|
||||
// Get documentation
|
||||
const docs = await mapper.fetchDocumentation(parsed.nodeType);
|
||||
parsed.documentation = docs;
|
||||
|
||||
// Save to database
|
||||
repository.saveNode(parsed);
|
||||
|
||||
// Update statistics
|
||||
stats.successful++;
|
||||
if (parsed.isAITool) stats.aiTools++;
|
||||
if (parsed.isTrigger) stats.triggers++;
|
||||
if (parsed.isWebhook) stats.webhooks++;
|
||||
if (parsed.properties.length > 0) stats.withProperties++;
|
||||
if (parsed.operations.length > 0) stats.withOperations++;
|
||||
if (docs) stats.withDocs++;
|
||||
|
||||
console.log(`✅ ${parsed.nodeType} [Props: ${parsed.properties.length}, Ops: ${parsed.operations.length}]`);
|
||||
} catch (error) {
|
||||
stats.failed++;
|
||||
console.error(`❌ Failed to process ${nodeName}: ${error.message}`);
|
||||
}
|
||||
}
|
||||
|
||||
// Validation check
|
||||
console.log('\n🔍 Running validation checks...');
|
||||
const validationResults = validateDatabase(repository);
|
||||
|
||||
// Summary
|
||||
console.log('\n📊 Summary:');
|
||||
console.log(` Total nodes: ${nodes.length}`);
|
||||
console.log(` Successful: ${stats.successful}`);
|
||||
console.log(` Failed: ${stats.failed}`);
|
||||
console.log(` AI Tools: ${stats.aiTools}`);
|
||||
console.log(` Triggers: ${stats.triggers}`);
|
||||
console.log(` Webhooks: ${stats.webhooks}`);
|
||||
console.log(` With Properties: ${stats.withProperties}`);
|
||||
console.log(` With Operations: ${stats.withOperations}`);
|
||||
console.log(` With Documentation: ${stats.withDocs}`);
|
||||
|
||||
if (!validationResults.passed) {
|
||||
console.log('\n⚠️ Validation Issues:');
|
||||
validationResults.issues.forEach(issue => console.log(` - ${issue}`));
|
||||
}
|
||||
|
||||
console.log('\n✨ Rebuild complete!');
|
||||
|
||||
db.close();
|
||||
}
|
||||
|
||||
function validateDatabase(repository: NodeRepository): { passed: boolean; issues: string[] } {
|
||||
const issues = [];
|
||||
|
||||
// Check critical nodes
|
||||
const criticalNodes = ['httpRequest', 'code', 'webhook', 'slack'];
|
||||
|
||||
for (const nodeType of criticalNodes) {
|
||||
const node = repository.getNode(nodeType);
|
||||
|
||||
if (!node) {
|
||||
issues.push(`Critical node ${nodeType} not found`);
|
||||
continue;
|
||||
}
|
||||
|
||||
if (node.properties.length === 0) {
|
||||
issues.push(`Node ${nodeType} has no properties`);
|
||||
}
|
||||
}
|
||||
|
||||
// Check AI tools
|
||||
const aiTools = repository.getAITools();
|
||||
if (aiTools.length === 0) {
|
||||
issues.push('No AI tools found - check detection logic');
|
||||
}
|
||||
|
||||
return {
|
||||
passed: issues.length === 0,
|
||||
issues
|
||||
};
|
||||
}
|
||||
|
||||
// Run if called directly
|
||||
if (require.main === module) {
|
||||
rebuild().catch(console.error);
|
||||
}
|
||||
```
|
||||
|
||||
## Week 2: Testing and MCP Updates
|
||||
|
||||
### Day 6-7: Enhanced MCP Server
|
||||
|
||||
**Updated File**: `src/mcp/server.ts`
|
||||
|
||||
```typescript
|
||||
import { NodeRepository } from '../database/node-repository';
|
||||
|
||||
// In the get_node_info handler
|
||||
async function getNodeInfo(nodeType: string) {
|
||||
const repository = new NodeRepository(db);
|
||||
let node = repository.getNode(nodeType);
|
||||
|
||||
if (!node) {
|
||||
// Try alternative formats
|
||||
const alternatives = [
|
||||
nodeType,
|
||||
nodeType.replace('n8n-nodes-base.', ''),
|
||||
`n8n-nodes-base.${nodeType}`,
|
||||
nodeType.toLowerCase()
|
||||
];
|
||||
|
||||
for (const alt of alternatives) {
|
||||
const found = repository.getNode(alt);
|
||||
if (found) {
|
||||
node = found;
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
if (!node) {
|
||||
throw new Error(`Node ${nodeType} not found`);
|
||||
}
|
||||
}
|
||||
|
||||
return node;
|
||||
}
|
||||
|
||||
// In the list_ai_tools handler
|
||||
async function listAITools() {
|
||||
const repository = new NodeRepository(db);
|
||||
const tools = repository.getAITools();
|
||||
|
||||
return {
|
||||
tools,
|
||||
totalCount: tools.length,
|
||||
requirements: {
|
||||
environmentVariable: 'N8N_COMMUNITY_PACKAGES_ALLOW_TOOL_USAGE=true',
|
||||
nodeProperty: 'usableAsTool: true'
|
||||
}
|
||||
};
|
||||
}
|
||||
```
|
||||
|
||||
### Day 8-9: Test Suite
|
||||
|
||||
**NEW File**: `src/scripts/test-nodes.ts`
|
||||
|
||||
```typescript
#!/usr/bin/env node
import Database from 'better-sqlite3';
import { NodeRepository } from '../database/node-repository';

const TEST_CASES = [
  {
    nodeType: 'httpRequest',
    checks: {
      hasProperties: true,
      minProperties: 5,
      hasDocumentation: true,
      isVersioned: true
    }
  },
  {
    nodeType: 'slack',
    checks: {
      hasOperations: true,
      minOperations: 10,
      style: 'declarative'
    }
  },
  {
    nodeType: 'code',
    checks: {
      hasProperties: true,
      properties: ['mode', 'language', 'jsCode']
    }
  }
];

async function runTests() {
  const db = new Database('./data/nodes.db');
  const repository = new NodeRepository(db);

  console.log('🧪 Running node tests...\n');

  let passed = 0;
  let failed = 0;

  for (const testCase of TEST_CASES) {
    console.log(`Testing ${testCase.nodeType}...`);

    try {
      const node = repository.getNode(testCase.nodeType);

      if (!node) {
        throw new Error('Node not found');
      }

      // Run checks
      for (const [check, expected] of Object.entries(testCase.checks)) {
        switch (check) {
          case 'hasProperties':
            if (expected && node.properties.length === 0) {
              throw new Error('No properties found');
            }
            break;

          case 'minProperties':
            if (node.properties.length < (expected as number)) {
              throw new Error(`Expected at least ${expected} properties, got ${node.properties.length}`);
            }
            break;

          case 'hasOperations':
            if (expected && node.operations.length === 0) {
              throw new Error('No operations found');
            }
            break;

          case 'minOperations':
            if (node.operations.length < (expected as number)) {
              throw new Error(`Expected at least ${expected} operations, got ${node.operations.length}`);
            }
            break;

          case 'properties': {
            const propNames = node.properties.map(p => p.name);
            for (const prop of expected as string[]) {
              if (!propNames.includes(prop)) {
                throw new Error(`Missing property: ${prop}`);
              }
            }
            break;
          }
        }
      }

      console.log(`✅ ${testCase.nodeType} passed all checks\n`);
      passed++;
    } catch (error) {
      console.error(`❌ ${testCase.nodeType} failed: ${(error as Error).message}\n`);
      failed++;
    }
  }

  console.log(`\n📊 Test Results: ${passed} passed, ${failed} failed`);

  db.close();
}

if (require.main === module) {
  runTests().catch(console.error);
}
```

## Key Improvements in v2.2

1. **Dedicated Property Extraction**
   - Handles versioned nodes properly
   - Extracts operations from both declarative and programmatic nodes
   - Deep search for AI tool capabilities

2. **Proper Data Serialization**
   - NodeRepository ensures JSON is properly stored and retrieved
   - Safe JSON parsing with defaults (see the sketch after this list)
   - Consistent data structure

3. **Enhanced Validation**
   - Validation checks in the rebuild script
   - Test suite for critical nodes
   - Statistics tracking for better visibility

4. **Better Error Handling**
   - Alternative node type lookups
   - Graceful fallbacks
   - Detailed error messages

5. **AI Tool Detection**
   - Multiple detection strategies
   - Checks versioned nodes
   - Name-based heuristics as fallback (see the sketch after this list)

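To make items 2 and 5 concrete, here is a minimal sketch of how the two helpers might look. The helper names (`safeJsonParse`, `detectAIToolCapability`) and the exact shape of the raw node data are assumptions for illustration, not the actual implementation:

```typescript
// Minimal sketch only - helper names and data shapes are assumed, not the real implementation.

// Safe JSON parsing with a default: one malformed row should never break a query.
function safeJsonParse<T>(json: string | null | undefined, defaultValue: T): T {
  if (!json) return defaultValue;
  try {
    return JSON.parse(json) as T;
  } catch {
    return defaultValue;
  }
}

// Layered AI tool detection: explicit flag first, then versioned variants, then a name heuristic.
function detectAIToolCapability(description: any, nodeVersions?: Record<number, any>): boolean {
  // 1. Explicit flag on the current description
  if (description?.usableAsTool === true) return true;

  // 2. Any versioned variant that sets the flag
  if (nodeVersions) {
    for (const version of Object.values(nodeVersions)) {
      if (version?.description?.usableAsTool === true) return true;
    }
  }

  // 3. Name-based heuristic as a last resort
  const name = (description?.name ?? '').toLowerCase();
  return ['agent', 'tool', 'chain', 'memory'].some(keyword => name.includes(keyword));
}
```

Reading a node back then becomes something like `safeJsonParse(row.properties, [])`, so callers always receive an array even when a row was stored without data.
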
## Success Metrics Update

1. **Properties/Operations**: >90% of nodes should have non-empty arrays
2. **AI Tools**: Should detect at least 10-20 AI-capable nodes
3. **Critical Nodes**: 100% pass rate on the test suite
4. **Documentation**: Maintain existing 89% coverage
5. **Performance**: Rebuild in <60 seconds (allowing for validation)

## Deployment Steps

```bash
# 1. Update code with v2.2 changes
npm install

# 2. Build TypeScript
npm run build

# 3. Run rebuild with validation
npm run rebuild

# 4. Run test suite
npm run test-nodes

# 5. Verify AI tools
npm run list-ai-tools

# 6. Start MCP server
npm start
```

## Summary

Version 2.2 focuses on fixing the core data extraction issues while maintaining the simplicity of the MVP approach. The key insight is that n8n's node structure is more complex than initially assumed, especially for versioned nodes and AI tool detection. By adding dedicated extraction logic and proper data handling, we can deliver accurate node information while keeping the implementation straightforward.

docs/report.md
@@ -1,108 +0,0 @@
# n8n-MCP Implementation Report

## Summary

Successfully implemented the n8n-MCP Enhancement Plan v2.1 Final, delivering a functional MVP that provides accurate n8n node documentation through the Model Context Protocol (MCP).

## Achievements

### Week 1: Core Implementation ✅

1. **Node Loader** (`src/loaders/node-loader.ts`)
   - Loads nodes from both `n8n-nodes-base` and `@n8n/n8n-nodes-langchain`
   - Handles both array and object formats for node configurations (see the loader sketch after this list)
   - Successfully loads 457 out of 458 nodes

2. **Simple Parser** (`src/parsers/simple-parser.ts`)
   - Parses both declarative and programmatic nodes
   - Detects versioned nodes (both VersionedNodeType and inline versioning)
   - Extracts node metadata, properties, and operations
   - Handles instantiation of nodes to access instance properties

3. **Documentation Mapper** (`src/mappers/docs-mapper.ts`)
   - Maps nodes to their documentation files
   - Handles both file and directory documentation structures
   - Includes known fixes for problematic node names
   - Achieves 89% documentation coverage (405/457 nodes)

4. **Database Schema** (`src/database/schema.sql`)
   - Simple SQLite schema optimized for the MVP
   - Stores all essential node information
   - Includes indexes for performance

5. **Rebuild Script** (`src/scripts/rebuild.ts`)
   - One-command database rebuild (`npm run rebuild`)
   - Provides clear progress and error reporting
   - Completes in under 30 seconds

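As a rough illustration of the array/object handling mentioned in the Node Loader item: a package may export its node list either as an array of node classes or as an object keyed by node name, and the loader has to normalize both. The snippet below is a sketch under that assumption and may differ from the actual loader:

```typescript
// Sketch only: normalizing the two export shapes a package index may use.
// The exact export structure is an assumption for illustration.
type NodeClass = new () => unknown;

function collectNodeClasses(packageExports: unknown): NodeClass[] {
  // Array format: [HttpRequest, Slack, ...]
  if (Array.isArray(packageExports)) {
    return packageExports.filter((entry): entry is NodeClass => typeof entry === 'function');
  }

  // Object format: { HttpRequest: HttpRequest, Slack: Slack, ... }
  if (packageExports && typeof packageExports === 'object') {
    return Object.values(packageExports).filter(
      (entry): entry is NodeClass => typeof entry === 'function'
    );
  }

  return [];
}
```
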
### Week 2: Testing and Integration ✅

1. **Validation Script** (`src/scripts/validate.ts`)
   - Tests critical nodes (HTTP Request, Code, Slack, Agent)
   - Validates documentation coverage
   - Provides database statistics
   - 3 out of 4 critical nodes pass all tests

2. **MCP Server Updates** (`src/mcp/server-update.ts`)
   - Implements all planned MCP tools (a tool-definition sketch follows this list):
     - `list_nodes` - Filter and list nodes
     - `get_node_info` - Detailed node information
     - `search_nodes` - Full-text search
     - `list_ai_tools` - List AI-capable nodes
     - `get_node_documentation` - Fetch node docs
     - `get_database_statistics` - Database stats

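For orientation, the tool surface can be described as plain tool definitions that the server advertises to MCP clients. The shapes below are a simplified sketch: the names match the list above, but the descriptions and input schemas are assumptions, not the exact definitions in `server-update.ts`:

```typescript
// Simplified sketch of the advertised tool definitions; input schemas are assumed, not exact.
interface ToolDefinition {
  name: string;
  description: string;
  inputSchema: Record<string, unknown>;
}

const TOOLS: ToolDefinition[] = [
  {
    name: 'list_nodes',
    description: 'Filter and list nodes',
    inputSchema: { type: 'object', properties: { package: { type: 'string' }, limit: { type: 'number' } } }
  },
  {
    name: 'get_node_info',
    description: 'Detailed node information',
    inputSchema: { type: 'object', properties: { nodeType: { type: 'string' } }, required: ['nodeType'] }
  },
  {
    name: 'search_nodes',
    description: 'Full-text search across node names and documentation',
    inputSchema: { type: 'object', properties: { query: { type: 'string' } }, required: ['query'] }
  },
  {
    name: 'list_ai_tools',
    description: 'List AI-capable nodes',
    inputSchema: { type: 'object', properties: {} }
  },
  {
    name: 'get_node_documentation',
    description: 'Fetch node docs',
    inputSchema: { type: 'object', properties: { nodeType: { type: 'string' } }, required: ['nodeType'] }
  },
  {
    name: 'get_database_statistics',
    description: 'Database stats',
    inputSchema: { type: 'object', properties: {} }
  }
];
```
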
## Key Metrics

- **Nodes Loaded**: 457/458 (99.8%)
- **Documentation Coverage**: 405/457 (88.6%)
- **Versioned Nodes Detected**: 46
- **AI Tools**: 0 (none marked with the `usableAsTool` flag)
- **Triggers**: 10
- **Packages Supported**: 2

## Known Limitations

1. **Slack Operations**: Unable to extract operations from some versioned nodes due to their complex structure (see the sketch after this list for one possible approach)
2. **AI Tools Detection**: No nodes currently have the `usableAsTool` flag set
3. **One Failed Node**: One node from the langchain package fails to load due to a missing dependency

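One way to tackle the first limitation is to unwrap the per-version descriptions before looking for an `operation` property. The sketch below assumes the conventional `nodeVersions` / `description.properties` layout of versioned n8n nodes and is illustrative rather than the implemented parser:

```typescript
// Illustrative sketch: pull operations out of a versioned node by unwrapping
// its per-version descriptions first. Field layout is assumed, not guaranteed.
function extractOperations(nodeInstance: any): Array<{ name: string; value: string }> {
  // Versioned wrappers keep one description per version under nodeVersions.
  const descriptions: any[] = nodeInstance.nodeVersions
    ? Object.values(nodeInstance.nodeVersions).map((v: any) => v.description)
    : [nodeInstance.description];

  const operations: Array<{ name: string; value: string }> = [];
  for (const description of descriptions) {
    for (const prop of description?.properties ?? []) {
      // Programmatic nodes usually expose operations as options on an 'operation' property.
      if (prop.name === 'operation' && Array.isArray(prop.options)) {
        operations.push(...prop.options);
      }
    }
  }
  return operations;
}
```

Deduplicating across versions (for example by `value`) would be a natural follow-up when the same operation appears in several versions.
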
## Usage

```bash
# Setup
git clone https://github.com/n8n-io/n8n-docs.git n8n-docs
npm install

# Build
npm run build

# Rebuild database
npm run rebuild

# Validate
npm run validate

# Start MCP server
npm start
```

## Next Steps (Post-MVP)

1. Improve operations extraction for complex versioned nodes
2. Add real-time monitoring capabilities
3. Implement version history tracking
4. Add support for community nodes
5. Create a web UI for browsing documentation

## Conclusion

The implementation successfully achieves the MVP goals:

- ✅ Accurate node-to-documentation mapping
- ✅ Coverage of official n8n packages
- ✅ Fast rebuild process (<30 seconds)
- ✅ Simple one-command operations
- ✅ Reliable processing of standard nodes
- ✅ Working MCP server with documentation tools

The system is ready for use and provides a solid foundation for future enhancements.