mirror of
https://github.com/czlonkowski/n8n-mcp.git
synced 2026-01-30 22:42:04 +00:00
Compare commits
11 Commits
Commits compared:

- 0e0f0998af
- 08a4be8370
- 3578f2cc31
- 4d3b8fbc91
- 5688384113
- 346fa3c8d2
- 3d5ceae43f
- 1834d474a5
- a4ef1efaf8
- 65f51ad8b5
- af6efe9e88
CHANGELOG.md (244 lines changed)
@@ -7,6 +7,250 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
## [Unreleased]

## [2.22.11] - 2025-11-06

### ✨ New Features

**Issue #399: Workflow Activation via Diff Operations**

Added workflow activation and deactivation as diff operations in `n8n_update_partial_workflow`, using n8n's dedicated API endpoints.

#### Problem

The n8n API provides dedicated `POST /workflows/{id}/activate` and `POST /workflows/{id}/deactivate` endpoints, but these were not accessible through n8n-mcp. Users could not programmatically control workflow activation status, forcing manual activation through the n8n UI.
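For illustration, these endpoints can be exercised directly. This is a hedged sketch, not the n8n-mcp implementation: the `/api/v1` prefix and the `X-N8N-API-KEY` header follow n8n's public API conventions, and the helper names are hypothetical.

```typescript
// Build the activation/deactivation URL (shape per n8n's public API).
function activationUrl(baseUrl: string, id: string, activate: boolean): string {
  return `${baseUrl}/api/v1/workflows/${id}/${activate ? 'activate' : 'deactivate'}`;
}

// Toggle a workflow's active state via the dedicated endpoint.
// Hypothetical helper; error shape and response body are assumptions.
async function setWorkflowActive(
  baseUrl: string,
  apiKey: string,
  id: string,
  activate: boolean
): Promise<unknown> {
  const res = await fetch(activationUrl(baseUrl, id, activate), {
    method: 'POST',
    headers: { 'X-N8N-API-KEY': apiKey },
  });
  if (!res.ok) throw new Error(`Activation toggle failed: HTTP ${res.status}`);
  return res.json(); // n8n returns the updated workflow object
}

console.log(activationUrl('https://n8n.example.com', 'wf_123', true));
```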
#### Solution

Implemented activation/deactivation as diff operations, following the established pattern of metadata operations like `updateSettings` and `updateName`. This keeps the tool count manageable (40 tools, not 42) and provides a consistent interface.

#### Changes

**API Client** (`src/services/n8n-api-client.ts`):
- Added `activateWorkflow(id: string): Promise<Workflow>` method
- Added `deactivateWorkflow(id: string): Promise<Workflow>` method
- Both use POST requests to dedicated n8n API endpoints

**Diff Engine Types** (`src/types/workflow-diff.ts`):
- Added `ActivateWorkflowOperation` interface
- Added `DeactivateWorkflowOperation` interface
- Added `shouldActivate` and `shouldDeactivate` flags to `WorkflowDiffResult`
- Increased supported operations from 15 to 17
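The new types can be pictured roughly as follows. This is a hypothetical sketch based on the names in the changelog; any field beyond the `type` discriminator is an assumption, not the project's actual definition.

```typescript
// Assumed shape of the two new diff operations (discriminated by `type`).
interface ActivateWorkflowOperation {
  type: 'activateWorkflow';
}

interface DeactivateWorkflowOperation {
  type: 'deactivateWorkflow';
}

// Assumed shape of the flags added to WorkflowDiffResult.
interface WorkflowDiffResultFlags {
  shouldActivate?: boolean;   // set when an activateWorkflow op was applied
  shouldDeactivate?: boolean; // set when a deactivateWorkflow op was applied
}

const op: ActivateWorkflowOperation = { type: 'activateWorkflow' };
const result: WorkflowDiffResultFlags = { shouldActivate: true };
console.log(op.type, result.shouldActivate);
```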
**Diff Engine** (`src/services/workflow-diff-engine.ts`):
- Added validation for activation (requires activatable triggers)
- Added operation application logic
- Transfers activation intent from workflow object to result
- Validates workflow has activatable triggers (webhook, schedule, etc.)
- Rejects workflows with only `executeWorkflowTrigger` (cannot activate)

**Handler** (`src/mcp/handlers-workflow-diff.ts`):
- Checks `shouldActivate` and `shouldDeactivate` flags after workflow update
- Calls appropriate API methods
- Includes activation status in response message and details
- Handles activation/deactivation errors gracefully

**Documentation** (`src/mcp/tool-docs/workflow_management/n8n-update-partial-workflow.ts`):
- Updated operation count from 15 to 17
- Added "Workflow Activation Operations" section
- Added activation tip to essentials

**Tool Registration** (`src/mcp/handlers-n8n-manager.ts`):
- Removed "Cannot activate/deactivate workflows via API" from limitations

#### Usage

```javascript
// Activate workflow
n8n_update_partial_workflow({
  id: "workflow_id",
  operations: [{
    type: "activateWorkflow"
  }]
})

// Deactivate workflow
n8n_update_partial_workflow({
  id: "workflow_id",
  operations: [{
    type: "deactivateWorkflow"
  }]
})

// Combine with other operations
n8n_update_partial_workflow({
  id: "workflow_id",
  operations: [
    {type: "updateNode", nodeId: "abc", updates: {name: "Updated"}},
    {type: "activateWorkflow"}
  ]
})
```

#### Validation

- **Activation**: Requires at least one enabled activatable trigger node
- **Deactivation**: Always valid
- **Error Handling**: Clear messages when activation fails due to missing triggers
- **Trigger Detection**: Uses `isActivatableTrigger()` utility (Issue #351 compliance)
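The validation rule above can be sketched as a small predicate. This is illustrative only: the trigger-type list and the `isActivatableTrigger` body are assumptions, not the project's real implementation (which lives in `src/utils/node-type-utils.ts`).

```typescript
interface WorkflowNode {
  type: string;
  disabled?: boolean;
}

// Assumed subset of trigger types that can activate a workflow.
const ACTIVATABLE_TRIGGERS = [
  'n8n-nodes-base.webhook',
  'n8n-nodes-base.scheduleTrigger',
];

function isActivatableTrigger(nodeType: string): boolean {
  // executeWorkflowTrigger is deliberately excluded: such workflows can
  // only be invoked by other workflows, so activation is meaningless.
  return ACTIVATABLE_TRIGGERS.includes(nodeType);
}

// A workflow may be activated only if it has at least one *enabled*
// activatable trigger node.
function canActivate(nodes: WorkflowNode[]): boolean {
  return nodes.some(n => !n.disabled && isActivatableTrigger(n.type));
}

console.log(canActivate([{ type: 'n8n-nodes-base.webhook' }]));
console.log(canActivate([{ type: 'n8n-nodes-base.executeWorkflowTrigger' }]));
```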
#### Benefits

- ✅ Consistent with existing architecture (metadata operations pattern)
- ✅ Keeps tool count at 40 (not 42)
- ✅ Atomic operations - activation happens after workflow update
- ✅ Proper validation - prevents activation without triggers
- ✅ Clear error messages - guides users on trigger requirements
- ✅ Works with other operations - can update and activate in one call

#### Credits

- **@ArtemisAI** - Original investigation and API endpoint discovery
- **@cmj-hub** - Implementation attempt and PR contribution
- Architectural guidance from project maintainer

Resolves #399

Conceived by Romuald Członkowski - [www.aiadvisors.pl/en](https://www.aiadvisors.pl/en)

## [2.22.10] - 2025-11-04

### 🐛 Bug Fixes

**sql.js Fallback: Fixed Database Health Check Crash**

Fixed a critical startup crash when the server falls back to the sql.js adapter (used when better-sqlite3 fails to load, such as with Node.js version mismatches between build and runtime).

#### Problem

When Claude Desktop was configured to use a different Node.js version than the one used to build the project:
- better-sqlite3 fails to load due to a NODE_MODULE_VERSION mismatch (e.g., built with Node v22, running with Node v20)
- The system gracefully falls back to the sql.js adapter (pure JavaScript, no native dependencies)
- **BUT** the database health check crashed with a "no such module: fts5" error
- The server exited immediately after startup, preventing connection

**Error Details:**
```
[ERROR] Database health check failed: Error: no such module: fts5
    at e.handleError (sql-wasm.js:90:371)
    at e.prepare (sql-wasm.js:89:104)
    at SQLJSAdapter.prepare (database-adapter.js:202:30)
    at N8NDocumentationMCPServer.validateDatabaseHealth (server.js:251:42)
```

**Root Cause:** The health check attempted to query the FTS5 (Full-Text Search) table, which is not available in sql.js. The error was not caught, causing the server to exit.

#### Solution

Wrapped the FTS5 health check in a try-catch block to handle sql.js gracefully:

```typescript
// Check if FTS5 table exists (wrap in try-catch for sql.js compatibility)
try {
  const ftsExists = this.db.prepare(`
    SELECT name FROM sqlite_master
    WHERE type='table' AND name='nodes_fts'
  `).get();

  if (!ftsExists) {
    logger.warn('FTS5 table missing - search performance will be degraded...');
  } else {
    const ftsCount = this.db.prepare('SELECT COUNT(*) as count FROM nodes_fts').get();
    if (ftsCount.count === 0) {
      logger.warn('FTS5 index is empty - search will not work properly...');
    }
  }
} catch (ftsError) {
  // FTS5 not supported (e.g., sql.js fallback) - this is OK, just warn
  logger.warn('FTS5 not available - using fallback search. For better performance, ensure better-sqlite3 is properly installed.');
}
```

#### Impact

**Before Fix:**
- ❌ Server crashed immediately when using the sql.js fallback
- ❌ Claude Desktop connection failed with Node.js version mismatches
- ❌ No way to use the MCP server without matching Node.js versions exactly

**After Fix:**
- ✅ Server starts successfully with the sql.js fallback
- ✅ Works with any Node.js version (graceful degradation)
- ✅ Clear warning about FTS5 unavailability in logs
- ✅ Users can choose between sql.js (slower, works everywhere) or rebuilding better-sqlite3 (faster, requires a matching Node version)

#### Performance Notes

When using the sql.js fallback:
- Full-text search (FTS5) is not available; search falls back to LIKE queries
- Slightly slower search performance (~10-30ms vs ~5ms with FTS5)
- All other functionality works identically
- Database operations work correctly
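The two search paths differ only in the SQL they issue. A minimal sketch of that distinction, assuming a `nodes` table with a `node_name` column (both names are illustrative, not the project's actual schema):

```typescript
// Pick a search query depending on whether the FTS5 module is available.
// Parameter placeholders (?) are used instead of string interpolation.
function buildSearchQuery(ftsAvailable: boolean): string {
  if (ftsAvailable) {
    // FTS5 MATCH uses the prebuilt full-text index (~5ms)
    return `SELECT rowid FROM nodes_fts WHERE nodes_fts MATCH ?`;
  }
  // sql.js fallback: LIKE scans the table (~10-30ms) but needs no FTS5 module
  return `SELECT rowid FROM nodes WHERE node_name LIKE '%' || ? || '%'`;
}

console.log(buildSearchQuery(false));
```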
**Recommendation:** For best performance, ensure better-sqlite3 loads successfully by matching Node.js versions or rebuilding:
```bash
# If Node versions mismatch, rebuild better-sqlite3
npm rebuild better-sqlite3
```

#### Files Changed

**Modified (1 file):**
- `src/mcp/server.ts` (lines 299-317) - Added try-catch around FTS5 health check

#### Testing

- ✅ Tested with Node v20.17.0 (Claude Desktop version)
- ✅ Tested with Node v22.17.0 (build version)
- ✅ Server starts successfully in both cases
- ✅ sql.js fallback works correctly with graceful FTS5 degradation
- ✅ All 6 startup checkpoints pass
- ✅ Database health check passes with a warning

Conceived by Romuald Członkowski - [www.aiadvisors.pl/en](https://www.aiadvisors.pl/en)

## [2.22.9] - 2025-11-04

### 🔄 Dependencies Update

**n8n Platform Update to 1.118.1**

Updated n8n and all related dependencies to the latest versions:

- **n8n**: 1.117.2 → 1.118.1
- **n8n-core**: 1.116.0 → 1.117.0
- **n8n-workflow**: 1.114.0 → 1.115.0
- **@n8n/n8n-nodes-langchain**: 1.116.2 → 1.117.0

### 📊 Database Changes

- Rebuilt node database with **542 nodes**
  - 439 nodes from n8n-nodes-base
  - 103 nodes from @n8n/n8n-nodes-langchain
- All node metadata synchronized with the latest n8n release

### 🐛 Bug Fixes

**n8n 1.118.1+ Compatibility: Fixed versionCounter API Rejection**

Fixed integration test failures caused by an n8n 1.118.1 API change in which the `versionCounter` property is returned in GET responses but rejected in PUT requests.

**Impact**:
- Integration tests were failing with a "request/body must NOT have additional properties" error
- Workflow update operations via the n8n API were failing

**Solution**:
- Added `versionCounter` to the property exclusion list in `cleanWorkflowForUpdate()` (src/services/n8n-validation.ts:136)
- Added a `versionCounter?: number` type definition to the Workflow and WorkflowExport interfaces
- Added test coverage to prevent regression
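The exclusion pattern behind this fix can be sketched with rest destructuring. The field names match the changelog, but this is a simplified stand-in: the real `cleanWorkflowForUpdate()` strips many more read-only fields.

```typescript
// Simplified Workflow shape; the real interface has many more fields.
interface Workflow {
  id?: string;
  name: string;
  versionCounter?: number;
  [key: string]: unknown;
}

// Destructure read-only fields out so they never reach the PUT body.
// n8n 1.118.1+ returns versionCounter in GET but rejects it in PUT.
function cleanWorkflowForUpdate(workflow: Workflow): Partial<Workflow> {
  const { id: _omitId, versionCounter: _omitVersionCounter, ...updatable } = workflow;
  return updatable;
}

const cleaned = cleanWorkflowForUpdate({ id: 'wf_1', name: 'Demo', versionCounter: 7 });
console.log('versionCounter' in cleaned);
```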
### ✅ Verification

- Database rebuild completed successfully
- All node types validated
- Documentation mappings updated

Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en

## [2.22.7] - 2025-10-26

### 📝 Documentation Fixes

@@ -1,5 +1,87 @@
# n8n Update Process - Quick Reference

## ⚡ Recommended Fast Workflow (2025-11-04)

**CRITICAL FIRST STEP**: Check existing releases to avoid version conflicts!

```bash
# 1. CHECK EXISTING RELEASES FIRST (prevents version conflicts!)
gh release list | head -5
# Look at the latest version - your new version must be higher!

# 2. Switch to main and pull
git checkout main && git pull

# 3. Check for updates (dry run)
npm run update:n8n:check

# 4. Run update and skip tests (we'll test in CI)
yes y | npm run update:n8n

# 5. Create feature branch
git checkout -b update/n8n-X.X.X

# 6. Update version in package.json (must be HIGHER than latest release!)
# Edit: "version": "2.XX.X" (not the version from the release list!)

# 7. Update CHANGELOG.md
# - Change version number to match package.json
# - Update date to today
# - Update dependency versions

# 8. Update README badge
# Edit line 8: Change n8n version badge to new n8n version

# 9. Commit and push
git add -A
git commit -m "chore: update n8n to X.X.X and bump version to 2.XX.X

- Updated n8n from X.X.X to X.X.X
- Updated n8n-core from X.X.X to X.X.X
- Updated n8n-workflow from X.X.X to X.X.X
- Updated @n8n/n8n-nodes-langchain from X.X.X to X.X.X
- Rebuilt node database with XXX nodes (XXX from n8n-nodes-base, XXX from @n8n/n8n-nodes-langchain)
- Updated README badge with new n8n version
- Updated CHANGELOG with dependency changes

Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>"

git push -u origin update/n8n-X.X.X

# 10. Create PR
gh pr create --title "chore: update n8n to X.X.X" --body "Updates n8n and all related dependencies to the latest versions..."

# 11. After PR is merged, verify the release triggered
gh release list | head -1
# If the new version appears, you're done!
# If not, the version might have already been released - bump the version again and create a new PR
```

### Why This Workflow?

✅ **Fast**: Skips local tests (2-3 min saved) - CI runs them anyway
✅ **Safe**: Unit tests in CI verify compatibility
✅ **Clean**: All changes in one PR with proper tracking
✅ **Automatic**: Release workflow triggers on merge if the version is new

### Common Issues

**Problem**: Release workflow doesn't trigger after merge
**Cause**: Version number was already released (check `gh release list`)
**Solution**: Create a new PR bumping the version by one patch number

**Problem**: Integration tests fail in CI with "unauthorized"
**Cause**: n8n test instance credentials expired (infrastructure issue)
**Solution**: Ignore if unit tests pass - this is not a code problem

**Problem**: CI takes 8+ minutes
**Reason**: Integration tests need a live n8n instance (slow)
**Normal**: Unit tests (~2 min) + integration tests (~6 min) = ~8 min total

## Quick One-Command Update

For a complete update with tests and publish preparation:

@@ -99,12 +181,14 @@ This command:

## Important Notes

1. **Always run on main branch** - Make sure you're on main and it's clean
2. **The update script is smart** - It automatically syncs all n8n dependencies to compatible versions
3. **Tests are required** - The publish script now runs tests automatically
4. **Database rebuild is automatic** - The update script handles this for you
5. **Template sanitization is automatic** - Any API tokens in workflow templates are replaced with placeholders
6. **Docker image builds automatically** - Pushing to GitHub triggers the workflow
1. **ALWAYS check existing releases first** - Use `gh release list` to see what versions are already released. Your new version must be higher!
2. **Release workflow only triggers on version CHANGE** - If you merge a PR with an already-released version (e.g., 2.22.8), the workflow won't run. You'll need to bump to a new version (e.g., 2.22.9) and create another PR.
3. **Integration test failures in CI are usually infrastructure issues** - If unit tests pass but integration tests fail with "unauthorized", this is typically because the test n8n instance credentials need updating. The code itself is fine.
4. **Skip local tests - let CI handle them** - Running tests locally adds 2-3 minutes with no benefit since CI runs them anyway. The fast workflow skips local tests.
5. **The update script is smart** - It automatically syncs all n8n dependencies to compatible versions
6. **Database rebuild is automatic** - The update script handles this for you
7. **Template sanitization is automatic** - Any API tokens in workflow templates are replaced with placeholders
8. **Docker image builds automatically** - Pushing to GitHub triggers the workflow

## GitHub Push Protection

@@ -115,11 +199,27 @@ As of July 2025, GitHub's push protection may block database pushes if they cont
3. If the push is still blocked, use the GitHub web interface to review and allow the push

## Time Estimate

### Fast Workflow (Recommended)
- Local work: ~2-3 minutes
  - npm install and database rebuild: ~2-3 minutes
  - File edits (CHANGELOG, README, package.json): ~30 seconds
  - Git operations (commit, push, create PR): ~30 seconds
- CI testing after PR creation: ~8-10 minutes (runs automatically)
  - Unit tests: ~2 minutes
  - Integration tests: ~6 minutes (may fail with infrastructure issues - ignore if unit tests pass)
  - Other checks: ~1 minute

**Total hands-on time: ~3 minutes** (then wait for CI)

### Full Workflow with Local Tests
- Total time: ~5-7 minutes
  - Test suite: ~2.5 minutes
  - npm install and database rebuild: ~2-3 minutes
  - The rest: seconds

**Note**: The fast workflow is recommended since CI runs the same tests anyway.

## Troubleshooting

If tests fail:

@@ -54,6 +54,10 @@ Collected data is used solely to:
- Identify common error patterns
- Improve tool performance and reliability
- Guide development priorities
- Train machine learning models for workflow generation

All ML training uses sanitized, anonymized data only.
Users can opt out at any time with `npx n8n-mcp telemetry disable`

## Data Retention
- Data is retained for analysis purposes

@@ -66,4 +70,4 @@ We may update this privacy policy from time to time. Updates will be reflected i
For questions about telemetry or privacy, please open an issue on GitHub:
https://github.com/czlonkowski/n8n-mcp/issues

Last updated: 2025-09-25
Last updated: 2025-11-06

@@ -5,7 +5,7 @@
[](https://www.npmjs.com/package/n8n-mcp)
[](https://codecov.io/gh/czlonkowski/n8n-mcp)
[](https://github.com/czlonkowski/n8n-mcp/actions)
[](https://github.com/n8n-io/n8n)
[](https://github.com/n8n-io/n8n)
[](https://github.com/czlonkowski/n8n-mcp/pkgs/container/n8n-mcp)
[](https://railway.com/deploy/n8n-mcp?referralCode=n8n-mcp)
data/nodes.db (binary) - Binary file not shown.

package-lock.json (1147 lines changed, generated) - File diff suppressed because it is too large.
package.json (10 lines changed)
@@ -1,6 +1,6 @@
{
  "name": "n8n-mcp",
  "version": "2.22.8",
  "version": "2.22.11",
  "description": "Integration between n8n workflow automation and Model Context Protocol (MCP)",
  "main": "dist/index.js",
  "types": "dist/index.d.ts",
@@ -140,15 +140,15 @@
  },
  "dependencies": {
    "@modelcontextprotocol/sdk": "^1.20.1",
    "@n8n/n8n-nodes-langchain": "^1.116.2",
    "@n8n/n8n-nodes-langchain": "^1.117.0",
    "@supabase/supabase-js": "^2.57.4",
    "dotenv": "^16.5.0",
    "express": "^5.1.0",
    "express-rate-limit": "^7.1.5",
    "lru-cache": "^11.2.1",
    "n8n": "^1.117.2",
    "n8n-core": "^1.116.0",
    "n8n-workflow": "^1.114.0",
    "n8n": "^1.118.1",
    "n8n-core": "^1.117.0",
    "n8n-workflow": "^1.115.0",
    "openai": "^4.77.0",
    "sql.js": "^1.13.0",
    "tslib": "^2.6.2",
@@ -1,6 +1,6 @@
{
  "name": "n8n-mcp-runtime",
  "version": "2.22.8",
  "version": "2.22.11",
  "description": "n8n MCP Server Runtime Dependencies Only",
  "private": true,
  "dependencies": {
@@ -1561,7 +1561,6 @@ export async function handleListAvailableTools(context?: InstanceContext): Promi
      maxRetries: config.maxRetries
    } : null,
    limitations: [
      'Cannot activate/deactivate workflows via API',
      'Cannot execute workflows directly (must use webhooks)',
      'Cannot stop running executions',
      'Tags and credentials have limited API support'
@@ -245,15 +245,52 @@ export async function handleUpdatePartialWorkflow(
    // Update workflow via API
    try {
      const updatedWorkflow = await client.updateWorkflow(input.id, diffResult.workflow!);

      // Handle activation/deactivation if requested
      let finalWorkflow = updatedWorkflow;
      let activationMessage = '';

      if (diffResult.shouldActivate) {
        try {
          finalWorkflow = await client.activateWorkflow(input.id);
          activationMessage = ' Workflow activated.';
        } catch (activationError) {
          logger.error('Failed to activate workflow after update', activationError);
          return {
            success: false,
            error: 'Workflow updated successfully but activation failed',
            details: {
              workflowUpdated: true,
              activationError: activationError instanceof Error ? activationError.message : 'Unknown error'
            }
          };
        }
      } else if (diffResult.shouldDeactivate) {
        try {
          finalWorkflow = await client.deactivateWorkflow(input.id);
          activationMessage = ' Workflow deactivated.';
        } catch (deactivationError) {
          logger.error('Failed to deactivate workflow after update', deactivationError);
          return {
            success: false,
            error: 'Workflow updated successfully but deactivation failed',
            details: {
              workflowUpdated: true,
              deactivationError: deactivationError instanceof Error ? deactivationError.message : 'Unknown error'
            }
          };
        }
      }

      return {
        success: true,
        data: updatedWorkflow,
        message: `Workflow "${updatedWorkflow.name}" updated successfully. Applied ${diffResult.operationsApplied} operations.`,
        data: finalWorkflow,
        message: `Workflow "${finalWorkflow.name}" updated successfully. Applied ${diffResult.operationsApplied} operations.${activationMessage}`,
        details: {
          operationsApplied: diffResult.operationsApplied,
          workflowId: updatedWorkflow.id,
          workflowName: updatedWorkflow.name,
          workflowId: finalWorkflow.id,
          workflowName: finalWorkflow.name,
          active: finalWorkflow.active,
          applied: diffResult.applied,
          failed: diffResult.failed,
          errors: diffResult.errors,
@@ -296,19 +296,24 @@ export class N8NDocumentationMCPServer {
      throw new Error('Database is empty. Run "npm run rebuild" to populate node data.');
    }

    // Check if FTS5 table exists
    const ftsExists = this.db.prepare(`
      SELECT name FROM sqlite_master
      WHERE type='table' AND name='nodes_fts'
    `).get();
    // Check if FTS5 table exists (wrap in try-catch for sql.js compatibility)
    try {
      const ftsExists = this.db.prepare(`
        SELECT name FROM sqlite_master
        WHERE type='table' AND name='nodes_fts'
      `).get();

    if (!ftsExists) {
      logger.warn('FTS5 table missing - search performance will be degraded. Please run: npm run rebuild');
    } else {
      const ftsCount = this.db.prepare('SELECT COUNT(*) as count FROM nodes_fts').get() as { count: number };
      if (ftsCount.count === 0) {
        logger.warn('FTS5 index is empty - search will not work properly. Please run: npm run rebuild');
      if (!ftsExists) {
        logger.warn('FTS5 table missing - search performance will be degraded. Please run: npm run rebuild');
      } else {
        const ftsCount = this.db.prepare('SELECT COUNT(*) as count FROM nodes_fts').get() as { count: number };
        if (ftsCount.count === 0) {
          logger.warn('FTS5 index is empty - search will not work properly. Please run: npm run rebuild');
        }
      }
    } catch (ftsError) {
      // FTS5 not supported (e.g., sql.js fallback) - this is OK, just warn
      logger.warn('FTS5 not available - using fallback search. For better performance, ensure better-sqlite3 is properly installed.');
    }

    logger.info(`Database health check passed: ${nodeCount.count} nodes loaded`);
@@ -4,7 +4,7 @@ export const n8nUpdatePartialWorkflowDoc: ToolDocumentation = {
  name: 'n8n_update_partial_workflow',
  category: 'workflow_management',
  essentials: {
    description: 'Update workflow incrementally with diff operations. Types: addNode, removeNode, updateNode, moveNode, enable/disableNode, addConnection, removeConnection, rewireConnection, cleanStaleConnections, replaceConnections, updateSettings, updateName, add/removeTag. Supports smart parameters (branch, case) for multi-output nodes. Full support for AI connections (ai_languageModel, ai_tool, ai_memory, ai_embedding, ai_vectorStore, ai_document, ai_textSplitter, ai_outputParser).',
    description: 'Update workflow incrementally with diff operations. Types: addNode, removeNode, updateNode, moveNode, enable/disableNode, addConnection, removeConnection, rewireConnection, cleanStaleConnections, replaceConnections, updateSettings, updateName, add/removeTag, activateWorkflow, deactivateWorkflow. Supports smart parameters (branch, case) for multi-output nodes. Full support for AI connections (ai_languageModel, ai_tool, ai_memory, ai_embedding, ai_vectorStore, ai_document, ai_textSplitter, ai_outputParser).',
    keyParameters: ['id', 'operations', 'continueOnError'],
    example: 'n8n_update_partial_workflow({id: "wf_123", operations: [{type: "rewireConnection", source: "IF", from: "Old", to: "New", branch: "true"}]})',
    performance: 'Fast (50-200ms)',
@@ -19,11 +19,12 @@ export const n8nUpdatePartialWorkflowDoc: ToolDocumentation = {
      'For AI connections, specify sourceOutput type (ai_languageModel, ai_tool, etc.)',
      'Batch AI component connections for atomic updates',
      'Auto-sanitization: ALL nodes auto-fixed during updates (operator structures, missing metadata)',
      'Node renames automatically update all connection references - no manual connection operations needed'
      'Node renames automatically update all connection references - no manual connection operations needed',
      'Activate/deactivate workflows: Use activateWorkflow/deactivateWorkflow operations (requires activatable triggers like webhook/schedule)'
    ]
  },
  full: {
    description: `Updates workflows using surgical diff operations instead of full replacement. Supports 15 operation types for precise modifications. Operations are validated and applied atomically by default - all succeed or none are applied.
    description: `Updates workflows using surgical diff operations instead of full replacement. Supports 17 operation types for precise modifications. Operations are validated and applied atomically by default - all succeed or none are applied.

## Available Operations:
@@ -48,6 +49,10 @@ export const n8nUpdatePartialWorkflowDoc: ToolDocumentation = {
- **addTag**: Add a workflow tag
- **removeTag**: Remove a workflow tag

### Workflow Activation Operations (2 types):
- **activateWorkflow**: Activate the workflow to enable automatic execution via triggers
- **deactivateWorkflow**: Deactivate the workflow to prevent automatic execution

## Smart Parameters for Multi-Output Nodes

For **IF nodes**, use semantic 'branch' parameter instead of technical sourceIndex:
@@ -170,6 +170,24 @@ export class N8nApiClient {
    }
  }

  async activateWorkflow(id: string): Promise<Workflow> {
    try {
      const response = await this.client.post(`/workflows/${id}/activate`);
      return response.data;
    } catch (error) {
      throw handleN8nApiError(error);
    }
  }

  async deactivateWorkflow(id: string): Promise<Workflow> {
    try {
      const response = await this.client.post(`/workflows/${id}/deactivate`);
      return response.data;
    } catch (error) {
      throw handleN8nApiError(error);
    }
  }

  /**
   * Lists workflows from n8n instance.
   *
@@ -133,6 +133,7 @@ export function cleanWorkflowForUpdate(workflow: Workflow): Partial<Workflow> {
    createdAt,
    updatedAt,
    versionId,
    versionCounter, // Added: n8n 1.118.1+ returns this but rejects it in updates
    meta,
    staticData,
    // Remove fields that cause API errors
@@ -25,6 +25,8 @@ import {
  UpdateNameOperation,
  AddTagOperation,
  RemoveTagOperation,
  ActivateWorkflowOperation,
  DeactivateWorkflowOperation,
  CleanStaleConnectionsOperation,
  ReplaceConnectionsOperation
} from '../types/workflow-diff';
@@ -32,6 +34,7 @@ import { Workflow, WorkflowNode, WorkflowConnection } from '../types/n8n-api';
import { Logger } from '../utils/logger';
import { validateWorkflowNode, validateWorkflowConnections } from './n8n-validation';
import { sanitizeNode, sanitizeWorkflowNodes } from './node-sanitizer';
import { isActivatableTrigger } from '../utils/node-type-utils';

const logger = new Logger({ prefix: '[WorkflowDiffEngine]' });
@@ -214,12 +217,23 @@ export class WorkflowDiffEngine {
      }

      const operationsApplied = request.operations.length;

      // Extract activation flags from workflow object
      const shouldActivate = (workflowCopy as any)._shouldActivate === true;
      const shouldDeactivate = (workflowCopy as any)._shouldDeactivate === true;

      // Clean up temporary flags
      delete (workflowCopy as any)._shouldActivate;
      delete (workflowCopy as any)._shouldDeactivate;

      return {
        success: true,
        workflow: workflowCopy,
        operationsApplied,
        message: `Successfully applied ${operationsApplied} operations (${nodeOperations.length} node ops, ${otherOperations.length} other ops)`,
        warnings: this.warnings.length > 0 ? this.warnings : undefined
        warnings: this.warnings.length > 0 ? this.warnings : undefined,
        shouldActivate: shouldActivate || undefined,
        shouldDeactivate: shouldDeactivate || undefined
      };
    }
  } catch (error) {
@@ -262,6 +276,10 @@ export class WorkflowDiffEngine {
      case 'addTag':
      case 'removeTag':
        return null; // These are always valid
      case 'activateWorkflow':
        return this.validateActivateWorkflow(workflow, operation);
      case 'deactivateWorkflow':
        return this.validateDeactivateWorkflow(workflow, operation);
      case 'cleanStaleConnections':
        return this.validateCleanStaleConnections(workflow, operation);
      case 'replaceConnections':
@@ -315,6 +333,12 @@ export class WorkflowDiffEngine {
      case 'removeTag':
        this.applyRemoveTag(workflow, operation);
        break;
      case 'activateWorkflow':
        this.applyActivateWorkflow(workflow, operation);
        break;
      case 'deactivateWorkflow':
        this.applyDeactivateWorkflow(workflow, operation);
        break;
      case 'cleanStaleConnections':
        this.applyCleanStaleConnections(workflow, operation);
        break;
@@ -847,13 +871,46 @@ export class WorkflowDiffEngine {

  private applyRemoveTag(workflow: Workflow, operation: RemoveTagOperation): void {
    if (!workflow.tags) return;

    const index = workflow.tags.indexOf(operation.tag);
    if (index !== -1) {
      workflow.tags.splice(index, 1);
    }
  }

  // Workflow activation operation validators
  private validateActivateWorkflow(workflow: Workflow, operation: ActivateWorkflowOperation): string | null {
    // Check if workflow has at least one activatable trigger
    // Issue #351: executeWorkflowTrigger cannot activate workflows
    const activatableTriggers = workflow.nodes.filter(
      node => !node.disabled && isActivatableTrigger(node.type)
    );

    if (activatableTriggers.length === 0) {
      return 'Cannot activate workflow: No activatable trigger nodes found. Workflows must have at least one enabled trigger node (webhook, schedule, email, etc.). Note: executeWorkflowTrigger cannot activate workflows as they can only be invoked by other workflows.';
    }

    return null;
  }

  private validateDeactivateWorkflow(workflow: Workflow, operation: DeactivateWorkflowOperation): string | null {
    // Deactivation is always valid - any workflow can be deactivated
    return null;
  }

  // Workflow activation operation appliers
  private applyActivateWorkflow(workflow: Workflow, operation: ActivateWorkflowOperation): void {
    // Set flag in workflow object to indicate activation intent
    // The handler will call the API method after workflow update
    (workflow as any)._shouldActivate = true;
  }

  private applyDeactivateWorkflow(workflow: Workflow, operation: DeactivateWorkflowOperation): void {
    // Set flag in workflow object to indicate deactivation intent
    // The handler will call the API method after workflow update
    (workflow as any)._shouldDeactivate = true;
  }

  // Connection cleanup operation validators
  private validateCleanStaleConnections(workflow: Workflow, operation: CleanStaleConnectionsOperation): string | null {
|
||||
// This operation is always valid - it just cleans up what it finds
|
||||
|
||||
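The validator above delegates to `isActivatableTrigger`, which is imported from elsewhere in n8n-mcp and not shown in this diff. A minimal sketch of what such a check could look like, under the assumption that it matches trigger-style node types and excludes `executeWorkflowTrigger` per issue #351 (the set contents and function body here are illustrative, not the actual implementation):

```typescript
// Hypothetical sketch — the real isActivatableTrigger lives outside this diff.
// Triggers that exist but cannot keep a workflow "active":
const NON_ACTIVATABLE_TRIGGERS = new Set([
  'n8n-nodes-base.executeWorkflowTrigger', // only invoked by other workflows
  'n8n-nodes-base.manualTrigger',          // manual runs don't activate a workflow
]);

function isActivatableTrigger(nodeType: string): boolean {
  // Heuristic: trigger node types contain "trigger"; webhook is a special case.
  const isTrigger =
    nodeType.toLowerCase().includes('trigger') ||
    nodeType === 'n8n-nodes-base.webhook';
  return isTrigger && !NON_ACTIVATABLE_TRIGGERS.has(nodeType);
}
```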
@@ -66,6 +66,7 @@ export interface Workflow {
  updatedAt?: string;
  createdAt?: string;
  versionId?: string;
  versionCounter?: number; // Added: n8n 1.118.1+ returns this in GET responses
  meta?: {
    instanceId?: string;
  };

@@ -152,6 +153,7 @@ export interface WorkflowExport {
  tags?: string[];
  pinData?: Record<string, unknown>;
  versionId?: string;
  versionCounter?: number; // Added: n8n 1.118.1+
  meta?: Record<string, unknown>;
}

@@ -114,6 +114,16 @@ export interface RemoveTagOperation extends DiffOperation {
  tag: string;
}

export interface ActivateWorkflowOperation extends DiffOperation {
  type: 'activateWorkflow';
  // No additional properties needed - just activates the workflow
}

export interface DeactivateWorkflowOperation extends DiffOperation {
  type: 'deactivateWorkflow';
  // No additional properties needed - just deactivates the workflow
}

// Connection Cleanup Operations
export interface CleanStaleConnectionsOperation extends DiffOperation {
  type: 'cleanStaleConnections';

@@ -148,6 +158,8 @@ export type WorkflowDiffOperation =
  | UpdateNameOperation
  | AddTagOperation
  | RemoveTagOperation
  | ActivateWorkflowOperation
  | DeactivateWorkflowOperation
  | CleanStaleConnectionsOperation
  | ReplaceConnectionsOperation;

@@ -176,6 +188,8 @@ export interface WorkflowDiffResult {
  applied?: number[]; // Indices of successfully applied operations (when continueOnError is true)
  failed?: number[]; // Indices of failed operations (when continueOnError is true)
  staleConnectionsRemoved?: Array<{ from: string; to: string }>; // For cleanStaleConnections operation
  shouldActivate?: boolean; // Flag to activate workflow after update (for activateWorkflow operation)
  shouldDeactivate?: boolean; // Flag to deactivate workflow after update (for deactivateWorkflow operation)
}

// Helper type for node reference (supports both ID and name)

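With the operation interfaces above, an activation request carries no payload beyond the operation type. A sketch of a `n8n_update_partial_workflow` request an MCP client might build (local copies of the types shown for self-containment; the workflow id is a placeholder):

```typescript
// Local, simplified copies of the operation types from the diff above.
type UpdateNameOperation = { type: 'updateName'; name: string };
type ActivateWorkflowOperation = { type: 'activateWorkflow' };
type WorkflowDiffOperation = UpdateNameOperation | ActivateWorkflowOperation;

interface WorkflowDiffRequest {
  id: string;
  operations: WorkflowDiffOperation[];
}

const request: WorkflowDiffRequest = {
  id: 'wf-123', // placeholder workflow id
  operations: [
    { type: 'updateName', name: 'Nightly Sync' },
    // Activation is applied after the workflow update succeeds.
    { type: 'activateWorkflow' },
  ],
};
```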
@@ -101,7 +101,6 @@ describe('Integration: handleListAvailableTools', () => {

    // Common known limitations
    const limitationsText = data.limitations.join(' ');
    expect(limitationsText).toContain('Cannot activate');
    expect(limitationsText).toContain('Cannot execute workflows directly');
  });
});

@@ -156,9 +156,11 @@ describe('handlers-workflow-diff', () => {
        operationsApplied: 1,
        workflowId: 'test-workflow-id',
        workflowName: 'Test Workflow',
        active: true,
        applied: [0],
        failed: [],
        errors: [],
        warnings: undefined,
      },
    });

@@ -633,5 +635,211 @@ describe('handlers-workflow-diff', () => {
        },
      });
    });

  describe('Workflow Activation/Deactivation', () => {
    it('should activate workflow after successful update', async () => {
      const testWorkflow = createTestWorkflow({ active: false });
      const updatedWorkflow = { ...testWorkflow, active: false };
      const activatedWorkflow = { ...testWorkflow, active: true };

      mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
      mockDiffEngine.applyDiff.mockResolvedValue({
        success: true,
        workflow: updatedWorkflow,
        operationsApplied: 1,
        message: 'Success',
        errors: [],
        shouldActivate: true,
      });
      mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
      mockApiClient.activateWorkflow = vi.fn().mockResolvedValue(activatedWorkflow);

      const result = await handleUpdatePartialWorkflow({
        id: 'test-workflow-id',
        operations: [{ type: 'activateWorkflow' }],
      }, mockRepository);

      expect(result.success).toBe(true);
      expect(result.data).toEqual(activatedWorkflow);
      expect(result.message).toContain('Workflow activated');
      expect(result.details?.active).toBe(true);
      expect(mockApiClient.activateWorkflow).toHaveBeenCalledWith('test-workflow-id');
    });

    it('should deactivate workflow after successful update', async () => {
      const testWorkflow = createTestWorkflow({ active: true });
      const updatedWorkflow = { ...testWorkflow, active: true };
      const deactivatedWorkflow = { ...testWorkflow, active: false };

      mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
      mockDiffEngine.applyDiff.mockResolvedValue({
        success: true,
        workflow: updatedWorkflow,
        operationsApplied: 1,
        message: 'Success',
        errors: [],
        shouldDeactivate: true,
      });
      mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
      mockApiClient.deactivateWorkflow = vi.fn().mockResolvedValue(deactivatedWorkflow);

      const result = await handleUpdatePartialWorkflow({
        id: 'test-workflow-id',
        operations: [{ type: 'deactivateWorkflow' }],
      }, mockRepository);

      expect(result.success).toBe(true);
      expect(result.data).toEqual(deactivatedWorkflow);
      expect(result.message).toContain('Workflow deactivated');
      expect(result.details?.active).toBe(false);
      expect(mockApiClient.deactivateWorkflow).toHaveBeenCalledWith('test-workflow-id');
    });

    it('should handle activation failure after successful update', async () => {
      const testWorkflow = createTestWorkflow({ active: false });
      const updatedWorkflow = { ...testWorkflow, active: false };

      mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
      mockDiffEngine.applyDiff.mockResolvedValue({
        success: true,
        workflow: updatedWorkflow,
        operationsApplied: 1,
        message: 'Success',
        errors: [],
        shouldActivate: true,
      });
      mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
      mockApiClient.activateWorkflow = vi.fn().mockRejectedValue(new Error('Activation failed: No trigger nodes'));

      const result = await handleUpdatePartialWorkflow({
        id: 'test-workflow-id',
        operations: [{ type: 'activateWorkflow' }],
      }, mockRepository);

      expect(result.success).toBe(false);
      expect(result.error).toBe('Workflow updated successfully but activation failed');
      expect(result.details).toEqual({
        workflowUpdated: true,
        activationError: 'Activation failed: No trigger nodes',
      });
    });

    it('should handle deactivation failure after successful update', async () => {
      const testWorkflow = createTestWorkflow({ active: true });
      const updatedWorkflow = { ...testWorkflow, active: true };

      mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
      mockDiffEngine.applyDiff.mockResolvedValue({
        success: true,
        workflow: updatedWorkflow,
        operationsApplied: 1,
        message: 'Success',
        errors: [],
        shouldDeactivate: true,
      });
      mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
      mockApiClient.deactivateWorkflow = vi.fn().mockRejectedValue(new Error('Deactivation failed'));

      const result = await handleUpdatePartialWorkflow({
        id: 'test-workflow-id',
        operations: [{ type: 'deactivateWorkflow' }],
      }, mockRepository);

      expect(result.success).toBe(false);
      expect(result.error).toBe('Workflow updated successfully but deactivation failed');
      expect(result.details).toEqual({
        workflowUpdated: true,
        deactivationError: 'Deactivation failed',
      });
    });

    it('should update workflow without activation when shouldActivate is false', async () => {
      const testWorkflow = createTestWorkflow({ active: false });
      const updatedWorkflow = { ...testWorkflow, active: false };

      mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
      mockDiffEngine.applyDiff.mockResolvedValue({
        success: true,
        workflow: updatedWorkflow,
        operationsApplied: 1,
        message: 'Success',
        errors: [],
        shouldActivate: false,
        shouldDeactivate: false,
      });
      mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
      mockApiClient.activateWorkflow = vi.fn();
      mockApiClient.deactivateWorkflow = vi.fn();

      const result = await handleUpdatePartialWorkflow({
        id: 'test-workflow-id',
        operations: [{ type: 'updateName', name: 'Updated' }],
      }, mockRepository);

      expect(result.success).toBe(true);
      expect(result.message).not.toContain('activated');
      expect(result.message).not.toContain('deactivated');
      expect(mockApiClient.activateWorkflow).not.toHaveBeenCalled();
      expect(mockApiClient.deactivateWorkflow).not.toHaveBeenCalled();
    });

    it('should handle non-Error activation failures', async () => {
      const testWorkflow = createTestWorkflow({ active: false });
      const updatedWorkflow = { ...testWorkflow, active: false };

      mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
      mockDiffEngine.applyDiff.mockResolvedValue({
        success: true,
        workflow: updatedWorkflow,
        operationsApplied: 1,
        message: 'Success',
        errors: [],
        shouldActivate: true,
      });
      mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
      mockApiClient.activateWorkflow = vi.fn().mockRejectedValue('String error');

      const result = await handleUpdatePartialWorkflow({
        id: 'test-workflow-id',
        operations: [{ type: 'activateWorkflow' }],
      }, mockRepository);

      expect(result.success).toBe(false);
      expect(result.error).toBe('Workflow updated successfully but activation failed');
      expect(result.details).toEqual({
        workflowUpdated: true,
        activationError: 'Unknown error',
      });
    });

    it('should handle non-Error deactivation failures', async () => {
      const testWorkflow = createTestWorkflow({ active: true });
      const updatedWorkflow = { ...testWorkflow, active: true };

      mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
      mockDiffEngine.applyDiff.mockResolvedValue({
        success: true,
        workflow: updatedWorkflow,
        operationsApplied: 1,
        message: 'Success',
        errors: [],
        shouldDeactivate: true,
      });
      mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
      mockApiClient.deactivateWorkflow = vi.fn().mockRejectedValue({ code: 'UNKNOWN' });

      const result = await handleUpdatePartialWorkflow({
        id: 'test-workflow-id',
        operations: [{ type: 'deactivateWorkflow' }],
      }, mockRepository);

      expect(result.success).toBe(false);
      expect(result.error).toBe('Workflow updated successfully but deactivation failed');
      expect(result.details).toEqual({
        workflowUpdated: true,
        deactivationError: 'Unknown error',
      });
    });
  });
});
});

@@ -362,19 +362,19 @@ describe('N8nApiClient', () => {

    it('should delete workflow successfully', async () => {
      mockAxiosInstance.delete.mockResolvedValue({ data: {} });

      await client.deleteWorkflow('123');

      expect(mockAxiosInstance.delete).toHaveBeenCalledWith('/workflows/123');
    });

    it('should handle deletion error', async () => {
      const error = {
        message: 'Request failed',
        response: { status: 404, data: { message: 'Not found' } }
      };
      await mockAxiosInstance.simulateError('delete', error);

      try {
        await client.deleteWorkflow('123');
        expect.fail('Should have thrown an error');

@@ -386,6 +386,178 @@ describe('N8nApiClient', () => {
    });
  });

  describe('activateWorkflow', () => {
    beforeEach(() => {
      client = new N8nApiClient(defaultConfig);
    });

    it('should activate workflow successfully', async () => {
      const workflow = { id: '123', name: 'Test', active: false, nodes: [], connections: {} };
      const activatedWorkflow = { ...workflow, active: true };
      mockAxiosInstance.post.mockResolvedValue({ data: activatedWorkflow });

      const result = await client.activateWorkflow('123');

      expect(mockAxiosInstance.post).toHaveBeenCalledWith('/workflows/123/activate');
      expect(result).toEqual(activatedWorkflow);
      expect(result.active).toBe(true);
    });

    it('should handle activation error - no trigger nodes', async () => {
      const error = {
        message: 'Request failed',
        response: { status: 400, data: { message: 'Workflow must have at least one trigger node' } }
      };
      await mockAxiosInstance.simulateError('post', error);

      try {
        await client.activateWorkflow('123');
        expect.fail('Should have thrown an error');
      } catch (err) {
        expect(err).toBeInstanceOf(N8nValidationError);
        expect((err as N8nValidationError).message).toContain('trigger node');
        expect((err as N8nValidationError).statusCode).toBe(400);
      }
    });

    it('should handle activation error - workflow not found', async () => {
      const error = {
        message: 'Request failed',
        response: { status: 404, data: { message: 'Workflow not found' } }
      };
      await mockAxiosInstance.simulateError('post', error);

      try {
        await client.activateWorkflow('non-existent');
        expect.fail('Should have thrown an error');
      } catch (err) {
        expect(err).toBeInstanceOf(N8nNotFoundError);
        expect((err as N8nNotFoundError).message).toContain('not found');
        expect((err as N8nNotFoundError).statusCode).toBe(404);
      }
    });

    it('should handle activation error - workflow already active', async () => {
      const error = {
        message: 'Request failed',
        response: { status: 400, data: { message: 'Workflow is already active' } }
      };
      await mockAxiosInstance.simulateError('post', error);

      try {
        await client.activateWorkflow('123');
        expect.fail('Should have thrown an error');
      } catch (err) {
        expect(err).toBeInstanceOf(N8nValidationError);
        expect((err as N8nValidationError).message).toContain('already active');
        expect((err as N8nValidationError).statusCode).toBe(400);
      }
    });

    it('should handle server error during activation', async () => {
      const error = {
        message: 'Request failed',
        response: { status: 500, data: { message: 'Internal server error' } }
      };
      await mockAxiosInstance.simulateError('post', error);

      try {
        await client.activateWorkflow('123');
        expect.fail('Should have thrown an error');
      } catch (err) {
        expect(err).toBeInstanceOf(N8nServerError);
        expect((err as N8nServerError).message).toBe('Internal server error');
        expect((err as N8nServerError).statusCode).toBe(500);
      }
    });
  });

  describe('deactivateWorkflow', () => {
    beforeEach(() => {
      client = new N8nApiClient(defaultConfig);
    });

    it('should deactivate workflow successfully', async () => {
      const workflow = { id: '123', name: 'Test', active: true, nodes: [], connections: {} };
      const deactivatedWorkflow = { ...workflow, active: false };
      mockAxiosInstance.post.mockResolvedValue({ data: deactivatedWorkflow });

      const result = await client.deactivateWorkflow('123');

      expect(mockAxiosInstance.post).toHaveBeenCalledWith('/workflows/123/deactivate');
      expect(result).toEqual(deactivatedWorkflow);
      expect(result.active).toBe(false);
    });

    it('should handle deactivation error - workflow not found', async () => {
      const error = {
        message: 'Request failed',
        response: { status: 404, data: { message: 'Workflow not found' } }
      };
      await mockAxiosInstance.simulateError('post', error);

      try {
        await client.deactivateWorkflow('non-existent');
        expect.fail('Should have thrown an error');
      } catch (err) {
        expect(err).toBeInstanceOf(N8nNotFoundError);
        expect((err as N8nNotFoundError).message).toContain('not found');
        expect((err as N8nNotFoundError).statusCode).toBe(404);
      }
    });

    it('should handle deactivation error - workflow already inactive', async () => {
      const error = {
        message: 'Request failed',
        response: { status: 400, data: { message: 'Workflow is already inactive' } }
      };
      await mockAxiosInstance.simulateError('post', error);

      try {
        await client.deactivateWorkflow('123');
        expect.fail('Should have thrown an error');
      } catch (err) {
        expect(err).toBeInstanceOf(N8nValidationError);
        expect((err as N8nValidationError).message).toContain('already inactive');
        expect((err as N8nValidationError).statusCode).toBe(400);
      }
    });

    it('should handle server error during deactivation', async () => {
      const error = {
        message: 'Request failed',
        response: { status: 500, data: { message: 'Internal server error' } }
      };
      await mockAxiosInstance.simulateError('post', error);

      try {
        await client.deactivateWorkflow('123');
        expect.fail('Should have thrown an error');
      } catch (err) {
        expect(err).toBeInstanceOf(N8nServerError);
        expect((err as N8nServerError).message).toBe('Internal server error');
        expect((err as N8nServerError).statusCode).toBe(500);
      }
    });

    it('should handle authentication error during deactivation', async () => {
      const error = {
        message: 'Request failed',
        response: { status: 401, data: { message: 'Invalid API key' } }
      };
      await mockAxiosInstance.simulateError('post', error);

      try {
        await client.deactivateWorkflow('123');
        expect.fail('Should have thrown an error');
      } catch (err) {
        expect(err).toBeInstanceOf(N8nAuthenticationError);
        expect((err as N8nAuthenticationError).message).toBe('Invalid API key');
        expect((err as N8nAuthenticationError).statusCode).toBe(401);
      }
    });
  });

  describe('listWorkflows', () => {
    beforeEach(() => {
      client = new N8nApiClient(defaultConfig);

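The tests above assume the client translates HTTP status codes into typed error classes (`N8nAuthenticationError`, `N8nNotFoundError`, `N8nValidationError`, `N8nServerError`). A hedged sketch of such a mapping, with class and function names as assumptions rather than the actual n8n-mcp implementation:

```typescript
// Illustrative sketch only — the real error classes and mapping live in n8n-mcp.
class N8nApiError extends Error {
  constructor(message: string, public statusCode?: number) {
    super(message);
  }
}
class N8nAuthenticationError extends N8nApiError {}
class N8nNotFoundError extends N8nApiError {}
class N8nValidationError extends N8nApiError {}
class N8nServerError extends N8nApiError {}

// Map an HTTP status from an axios-style error response to a typed error.
function mapStatusToError(status: number, message: string): N8nApiError {
  if (status === 401) return new N8nAuthenticationError(message, status);
  if (status === 404) return new N8nNotFoundError(message, status);
  if (status >= 400 && status < 500) return new N8nValidationError(message, status);
  return new N8nServerError(message, status);
}
```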
@@ -313,6 +313,7 @@ describe('n8n-validation', () => {
        createdAt: '2023-01-01',
        updatedAt: '2023-01-01',
        versionId: 'v123',
        versionCounter: 5, // n8n 1.118.1+ field
        meta: { test: 'data' },
        staticData: { some: 'data' },
        pinData: { pin: 'data' },

@@ -333,6 +334,7 @@ describe('n8n-validation', () => {
      expect(cleaned).not.toHaveProperty('createdAt');
      expect(cleaned).not.toHaveProperty('updatedAt');
      expect(cleaned).not.toHaveProperty('versionId');
      expect(cleaned).not.toHaveProperty('versionCounter'); // n8n 1.118.1+ compatibility
      expect(cleaned).not.toHaveProperty('meta');
      expect(cleaned).not.toHaveProperty('staticData');
      expect(cleaned).not.toHaveProperty('pinData');

@@ -349,6 +351,22 @@ describe('n8n-validation', () => {
      expect(cleaned.settings).toEqual({ executionOrder: 'v1' });
    });

    it('should exclude versionCounter for n8n 1.118.1+ compatibility', () => {
      const workflow = {
        name: 'Test Workflow',
        nodes: [],
        connections: {},
        versionId: 'v123',
        versionCounter: 5, // n8n 1.118.1 returns this but rejects it in PUT
      } as any;

      const cleaned = cleanWorkflowForUpdate(workflow);

      expect(cleaned).not.toHaveProperty('versionCounter');
      expect(cleaned).not.toHaveProperty('versionId');
      expect(cleaned.name).toBe('Test Workflow');
    });

    it('should add empty settings object for cloud API compatibility', () => {
      const workflow = {
        name: 'Test Workflow',

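These tests check that `cleanWorkflowForUpdate` strips read-only fields (including `versionCounter`, which n8n 1.118.1+ returns in GET responses but rejects in PUT) before sending an update. A minimal sketch of such a cleaner, assuming a simple blocklist approach (the real implementation in n8n-mcp may strip additional fields):

```typescript
// Illustrative sketch — not the actual cleanWorkflowForUpdate from n8n-mcp.
// Fields n8n returns on GET but rejects (or ignores) on PUT:
const READ_ONLY_FIELDS = [
  'id', 'createdAt', 'updatedAt', 'versionId', 'versionCounter',
  'meta', 'staticData', 'pinData',
] as const;

function cleanWorkflowForUpdate(workflow: Record<string, unknown>): Record<string, unknown> {
  const cleaned: Record<string, unknown> = { ...workflow };
  for (const field of READ_ONLY_FIELDS) {
    delete cleaned[field];
  }
  // Cloud API compatibility: updates need a settings object, even if empty.
  if (cleaned.settings === undefined) {
    cleaned.settings = {};
  }
  return cleaned;
}
```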
@@ -4269,4 +4269,358 @@ describe('WorkflowDiffEngine', () => {
      expect(result.workflow.connections["When clicking 'Execute workflow'"]).toBeDefined();
    });
  });

  describe('Workflow Activation/Deactivation Operations', () => {
    it('should activate workflow with activatable trigger nodes', async () => {
      // Create workflow with webhook trigger (activatable)
      const workflowWithTrigger = createWorkflow('Test Workflow')
        .addWebhookNode({ id: 'webhook-1', name: 'Webhook Trigger' })
        .addHttpRequestNode({ id: 'http-1', name: 'HTTP Request' })
        .connect('webhook-1', 'http-1')
        .build() as Workflow;

      // Fix connections to use node names
      const newConnections: any = {};
      for (const [nodeId, outputs] of Object.entries(workflowWithTrigger.connections)) {
        const node = workflowWithTrigger.nodes.find((n: any) => n.id === nodeId);
        if (node) {
          newConnections[node.name] = {};
          for (const [outputName, connections] of Object.entries(outputs)) {
            newConnections[node.name][outputName] = (connections as any[]).map((conns: any) =>
              conns.map((conn: any) => {
                const targetNode = workflowWithTrigger.nodes.find((n: any) => n.id === conn.node);
                return { ...conn, node: targetNode ? targetNode.name : conn.node };
              })
            );
          }
        }
      }
      workflowWithTrigger.connections = newConnections;

      const operation: any = {
        type: 'activateWorkflow'
      };

      const request: WorkflowDiffRequest = {
        id: 'test-workflow',
        operations: [operation]
      };

      const result = await diffEngine.applyDiff(workflowWithTrigger, request);

      expect(result.success).toBe(true);
      expect(result.shouldActivate).toBe(true);
      expect((result.workflow as any)._shouldActivate).toBeUndefined(); // Flag should be cleaned up
    });

    it('should reject activation if no activatable trigger nodes', async () => {
      // Create workflow with no trigger nodes at all
      const workflowWithoutActivatableTrigger = createWorkflow('Test Workflow')
        .addNode({
          id: 'set-1',
          name: 'Set Node',
          type: 'n8n-nodes-base.set',
          typeVersion: 1,
          position: [100, 100],
          parameters: {}
        })
        .addHttpRequestNode({ id: 'http-1', name: 'HTTP Request' })
        .connect('set-1', 'http-1')
        .build() as Workflow;

      // Fix connections to use node names
      const newConnections: any = {};
      for (const [nodeId, outputs] of Object.entries(workflowWithoutActivatableTrigger.connections)) {
        const node = workflowWithoutActivatableTrigger.nodes.find((n: any) => n.id === nodeId);
        if (node) {
          newConnections[node.name] = {};
          for (const [outputName, connections] of Object.entries(outputs)) {
            newConnections[node.name][outputName] = (connections as any[]).map((conns: any) =>
              conns.map((conn: any) => {
                const targetNode = workflowWithoutActivatableTrigger.nodes.find((n: any) => n.id === conn.node);
                return { ...conn, node: targetNode ? targetNode.name : conn.node };
              })
            );
          }
        }
      }
      workflowWithoutActivatableTrigger.connections = newConnections;

      const operation: any = {
        type: 'activateWorkflow'
      };

      const request: WorkflowDiffRequest = {
        id: 'test-workflow',
        operations: [operation]
      };

      const result = await diffEngine.applyDiff(workflowWithoutActivatableTrigger, request);

      expect(result.success).toBe(false);
      expect(result.errors).toBeDefined();
      expect(result.errors![0].message).toContain('No activatable trigger nodes found');
      expect(result.errors![0].message).toContain('executeWorkflowTrigger cannot activate workflows');
    });

    it('should reject activation if all trigger nodes are disabled', async () => {
      // Create workflow with disabled webhook trigger
      const workflowWithDisabledTrigger = createWorkflow('Test Workflow')
        .addWebhookNode({ id: 'webhook-1', name: 'Webhook Trigger', disabled: true })
        .addHttpRequestNode({ id: 'http-1', name: 'HTTP Request' })
        .connect('webhook-1', 'http-1')
        .build() as Workflow;

      // Fix connections to use node names
      const newConnections: any = {};
      for (const [nodeId, outputs] of Object.entries(workflowWithDisabledTrigger.connections)) {
        const node = workflowWithDisabledTrigger.nodes.find((n: any) => n.id === nodeId);
        if (node) {
          newConnections[node.name] = {};
          for (const [outputName, connections] of Object.entries(outputs)) {
            newConnections[node.name][outputName] = (connections as any[]).map((conns: any) =>
              conns.map((conn: any) => {
                const targetNode = workflowWithDisabledTrigger.nodes.find((n: any) => n.id === conn.node);
                return { ...conn, node: targetNode ? targetNode.name : conn.node };
              })
            );
          }
        }
      }
      workflowWithDisabledTrigger.connections = newConnections;

      const operation: any = {
        type: 'activateWorkflow'
      };

      const request: WorkflowDiffRequest = {
        id: 'test-workflow',
        operations: [operation]
      };

      const result = await diffEngine.applyDiff(workflowWithDisabledTrigger, request);

      expect(result.success).toBe(false);
      expect(result.errors).toBeDefined();
      expect(result.errors![0].message).toContain('No activatable trigger nodes found');
    });

    it('should activate workflow with schedule trigger', async () => {
      // Create workflow with schedule trigger (activatable)
      const workflowWithSchedule = createWorkflow('Test Workflow')
        .addNode({
          id: 'schedule-1',
          name: 'Schedule',
          type: 'n8n-nodes-base.scheduleTrigger',
          typeVersion: 1,
          position: [100, 100],
          parameters: { rule: { interval: [{ field: 'hours', hoursInterval: 1 }] } }
        })
        .addHttpRequestNode({ id: 'http-1', name: 'HTTP Request' })
        .connect('schedule-1', 'http-1')
        .build() as Workflow;

      // Fix connections
      const newConnections: any = {};
      for (const [nodeId, outputs] of Object.entries(workflowWithSchedule.connections)) {
        const node = workflowWithSchedule.nodes.find((n: any) => n.id === nodeId);
        if (node) {
          newConnections[node.name] = {};
          for (const [outputName, connections] of Object.entries(outputs)) {
            newConnections[node.name][outputName] = (connections as any[]).map((conns: any) =>
              conns.map((conn: any) => {
                const targetNode = workflowWithSchedule.nodes.find((n: any) => n.id === conn.node);
                return { ...conn, node: targetNode ? targetNode.name : conn.node };
              })
            );
          }
        }
      }
      workflowWithSchedule.connections = newConnections;

      const operation: any = {
        type: 'activateWorkflow'
      };

      const request: WorkflowDiffRequest = {
        id: 'test-workflow',
        operations: [operation]
      };

      const result = await diffEngine.applyDiff(workflowWithSchedule, request);

      expect(result.success).toBe(true);
      expect(result.shouldActivate).toBe(true);
    });

    it('should deactivate workflow successfully', async () => {
      // Any workflow can be deactivated
      const operation: any = {
        type: 'deactivateWorkflow'
      };

      const request: WorkflowDiffRequest = {
        id: 'test-workflow',
        operations: [operation]
      };

      const result = await diffEngine.applyDiff(baseWorkflow, request);

      expect(result.success).toBe(true);
      expect(result.shouldDeactivate).toBe(true);
      expect((result.workflow as any)._shouldDeactivate).toBeUndefined(); // Flag should be cleaned up
    });

    it('should deactivate workflow without trigger nodes', async () => {
      // Create workflow without any trigger nodes
      const workflowWithoutTrigger = createWorkflow('Test Workflow')
        .addHttpRequestNode({ id: 'http-1', name: 'HTTP Request' })
        .addNode({
          id: 'set-1',
          name: 'Set',
          type: 'n8n-nodes-base.set',
          typeVersion: 1,
          position: [300, 100],
          parameters: {}
        })
        .connect('http-1', 'set-1')
        .build() as Workflow;

      // Fix connections
      const newConnections: any = {};
      for (const [nodeId, outputs] of Object.entries(workflowWithoutTrigger.connections)) {
        const node = workflowWithoutTrigger.nodes.find((n: any) => n.id === nodeId);
        if (node) {
          newConnections[node.name] = {};
          for (const [outputName, connections] of Object.entries(outputs)) {
            newConnections[node.name][outputName] = (connections as any[]).map((conns: any) =>
              conns.map((conn: any) => {
                const targetNode = workflowWithoutTrigger.nodes.find((n: any) => n.id === conn.node);
                return { ...conn, node: targetNode ? targetNode.name : conn.node };
              })
            );
          }
        }
      }
      workflowWithoutTrigger.connections = newConnections;

      const operation: any = {
        type: 'deactivateWorkflow'
      };

      const request: WorkflowDiffRequest = {
        id: 'test-workflow',
        operations: [operation]
      };

      const result = await diffEngine.applyDiff(workflowWithoutTrigger, request);

      expect(result.success).toBe(true);
      expect(result.shouldDeactivate).toBe(true);
    });

    it('should combine activation with other operations', async () => {
      // Create workflow with webhook trigger
      const workflowWithTrigger = createWorkflow('Test Workflow')
        .addWebhookNode({ id: 'webhook-1', name: 'Webhook Trigger' })
        .addHttpRequestNode({ id: 'http-1', name: 'HTTP Request' })
        .connect('webhook-1', 'http-1')
        .build() as Workflow;

      // Fix connections
      const newConnections: any = {};
      for (const [nodeId, outputs] of Object.entries(workflowWithTrigger.connections)) {
        const node = workflowWithTrigger.nodes.find((n: any) => n.id === nodeId);
        if (node) {
          newConnections[node.name] = {};
          for (const [outputName, connections] of Object.entries(outputs)) {
            newConnections[node.name][outputName] = (connections as any[]).map((conns: any) =>
              conns.map((conn: any) => {
                const targetNode = workflowWithTrigger.nodes.find((n: any) => n.id === conn.node);
                return { ...conn, node: targetNode ? targetNode.name : conn.node };
              })
            );
          }
        }
      }
      workflowWithTrigger.connections = newConnections;

      const operations: any[] = [
        {
          type: 'updateName',
          name: 'Updated Workflow Name'
        },
        {
          type: 'addTag',
          tag: 'production'
        },
        {
          type: 'activateWorkflow'
        }
||||
];
|
||||
|
||||
const request: WorkflowDiffRequest = {
|
||||
id: 'test-workflow',
|
||||
operations
|
||||
};
|
||||
|
||||
const result = await diffEngine.applyDiff(workflowWithTrigger, request);
|
||||
|
||||
expect(result.success).toBe(true);
|
||||
expect(result.operationsApplied).toBe(3);
|
||||
expect(result.workflow!.name).toBe('Updated Workflow Name');
|
||||
expect(result.workflow!.tags).toContain('production');
|
||||
expect(result.shouldActivate).toBe(true);
|
||||
});
|
||||
|
||||
it('should reject activation if workflow has executeWorkflowTrigger only', async () => {
|
||||
// Create workflow with executeWorkflowTrigger (not activatable - Issue #351)
|
||||
const workflowWithExecuteTrigger = createWorkflow('Test Workflow')
|
||||
.addNode({
|
||||
id: 'execute-1',
|
||||
name: 'Execute Workflow Trigger',
|
||||
type: 'n8n-nodes-base.executeWorkflowTrigger',
|
||||
typeVersion: 1,
|
||||
position: [100, 100],
|
||||
parameters: {}
|
||||
})
|
||||
.addHttpRequestNode({ id: 'http-1', name: 'HTTP Request' })
|
||||
.connect('execute-1', 'http-1')
|
||||
.build() as Workflow;
|
||||
|
||||
// Fix connections
|
||||
const newConnections: any = {};
|
||||
for (const [nodeId, outputs] of Object.entries(workflowWithExecuteTrigger.connections)) {
|
||||
const node = workflowWithExecuteTrigger.nodes.find((n: any) => n.id === nodeId);
|
||||
if (node) {
|
||||
newConnections[node.name] = {};
|
||||
for (const [outputName, connections] of Object.entries(outputs)) {
|
||||
newConnections[node.name][outputName] = (connections as any[]).map((conns: any) =>
|
||||
conns.map((conn: any) => {
|
||||
const targetNode = workflowWithExecuteTrigger.nodes.find((n: any) => n.id === conn.node);
|
||||
return { ...conn, node: targetNode ? targetNode.name : conn.node };
|
||||
})
|
||||
);
|
||||
}
|
||||
}
|
||||
}
|
||||
workflowWithExecuteTrigger.connections = newConnections;
|
||||
|
||||
const operation: any = {
|
||||
type: 'activateWorkflow'
|
||||
};
|
||||
|
||||
const request: WorkflowDiffRequest = {
|
||||
id: 'test-workflow',
|
||||
operations: [operation]
|
||||
};
|
||||
|
||||
const result = await diffEngine.applyDiff(workflowWithExecuteTrigger, request);
|
||||
|
||||
expect(result.success).toBe(false);
|
||||
expect(result.errors).toBeDefined();
|
||||
expect(result.errors![0].message).toContain('No activatable trigger nodes found');
|
||||
expect(result.errors![0].message).toContain('executeWorkflowTrigger cannot activate workflows');
|
||||
});
|
||||
});
|
||||
});