Compare commits

..

3 Commits

Author SHA1 Message Date
czlonkowski
71170d3f76 fix(tests): close existing connection before reconnecting in MCP protocol tests
MCP SDK 1.27+ enforces single-connection per Server instance, throwing
"Already connected to a transport" when connect() is called twice.
Updated test helper to close existing connections before reconnecting.

Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-14 13:49:23 +01:00
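The close-before-reconnect pattern this commit describes can be sketched in TypeScript. This is a hedged illustration only: `reconnect` and the `Connection` shape are hypothetical names, not the project's actual test-helper API.

```typescript
// Since MCP SDK 1.27+, a Server instance accepts only one transport;
// calling connect() twice throws "Already connected to a transport".
// This illustrative helper closes any live connection before opening
// a new one, mirroring the fix described in the commit message.
interface Connection {
  close(): Promise<void>;
}

async function reconnect(
  current: Connection | null,
  connect: () => Promise<Connection>
): Promise<Connection> {
  if (current) {
    // Release the existing transport before reconnecting.
    await current.close();
  }
  return connect();
}
```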
czlonkowski
fb0a501f1c fix(ci): update MCP SDK version check from 1.20.1 to 1.27.1
The dependency-check workflow was hardcoded to expect MCP SDK 1.20.1,
causing CI failure after the intentional upgrade to 1.27.1.

Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-14 13:33:42 +01:00
czlonkowski
6fc956f77e chore: update n8n to 2.11.4 and bump version to 2.36.2
- Updated n8n from 2.10.3 to 2.11.4
- Updated n8n-core from 2.10.1 to 2.11.1
- Updated n8n-workflow from 2.10.1 to 2.11.1
- Updated @n8n/n8n-nodes-langchain from 2.10.1 to 2.11.2
- Updated @modelcontextprotocol/sdk from 1.20.1 to 1.27.1 (critical security fix)
- Rebuilt node database with 1,239 nodes (809 core + 430 community preserved)
- Updated README badge with new n8n version and node counts
- Updated CHANGELOG with dependency changes

Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-14 13:17:15 +01:00
84 changed files with 5749 additions and 4063 deletions


@@ -7,117 +7,6 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
## [Unreleased]
## [2.40.1] - 2026-03-21
### Fixed
- **`n8n_manage_datatable` row operations broken by MCP transport serialization**: `data` parameter received as string instead of JSON — added `z.preprocess` coercers for array/object/filter params
- **`n8n_manage_datatable` filter/sortBy URL encoding**: n8n API requires URL-encoded query params — added `encodeURIComponent()` for filter and sortBy in getRows and deleteRows
- **`json` column type rejected by n8n API**: Removed `json` from column type enum (n8n only accepts string/number/boolean/date)
- **Garbled 404 error messages**: Fixed `N8nNotFoundError` constructor — API error messages are now passed through cleanly instead of being wrapped in "Resource with ID ... not found"
Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en
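The transport-serialization fix above boils down to accepting either a structured value or its stringified form. A minimal sketch of that coercion idea (`coerceJson` is a hypothetical helper, not the project's actual code):

```typescript
// Some MCP transports deliver object/array parameters as JSON strings.
// Parse strings back into structures; pass everything else through so
// downstream schema validation can still reject genuinely bad input.
function coerceJson(value: unknown): unknown {
  if (typeof value !== "string") return value; // already structured
  try {
    return JSON.parse(value);
  } catch {
    return value; // not JSON: leave it for the schema to reject
  }
}
```

In Zod, a coercer like this would typically be wrapped as `z.preprocess(coerceJson, schema)` so the parse runs before validation.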
## [2.40.0] - 2026-03-21
### Changed
- **`n8n_manage_datatable` MCP tool** (replaces `n8n_create_data_table`): Full data table management covering all 10 n8n data table API endpoints
- **Table operations**: createTable, listTables, getTable, updateTable, deleteTable
- **Row operations**: getRows, insertRows, updateRows, upsertRows, deleteRows
- Filter system with and/or logic and 8 condition operators (eq, neq, like, ilike, gt, gte, lt, lte)
- Dry-run support for updateRows, upsertRows, deleteRows
- Pagination, sorting, and full-text search for row listing
- Shared error handler and consolidated Zod schemas for consistency
- 9 new `N8nApiClient` methods for all data table endpoints
- **`projectId` parameter for `n8n_create_workflow`**: Create workflows directly in a specific team project (enterprise feature)
### Breaking
- `n8n_create_data_table` tool replaced by `n8n_manage_datatable` with `action: "createTable"`
Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en
## [2.38.0] - 2026-03-20
### Added
- **`transferWorkflow` diff operation** (Issue #644): Move workflows between projects via `n8n_update_partial_workflow`
- New `transferWorkflow` operation type with `destinationProjectId` parameter
- Calls `PUT /workflows/{id}/transfer` via dedicated API after workflow update
- Proper error handling: returns `{ success: false, saved: true }` when transfer fails after update
- Transfer executes before activation so workflow is in target project first
- Zod schema validates `destinationProjectId` is non-empty
- Updated tool description and documentation to list the new operation
- `inferIntentFromOperations` returns descriptive intent for transfer operations
- `N8nApiClient.transferWorkflow()` method added
Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en
## [2.37.4] - 2026-03-18
### Changed
- **Updated n8n dependencies**: n8n 2.11.4 → 2.12.3, n8n-core 2.11.1 → 2.12.0, n8n-workflow 2.11.1 → 2.12.0, @n8n/n8n-nodes-langchain 2.11.2 → 2.12.0
- **Rebuilt node database**: 1,239 nodes (809 from n8n-nodes-base and @n8n/n8n-nodes-langchain, 430 community)
Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en
## [2.37.3] - 2026-03-15
### Fixed
- **updateNode `name`/`id` field normalization**: LLMs sending `{type: "updateNode", name: "Code", ...}` instead of `nodeName` no longer get "Node not found" errors. The Zod schema now normalizes `name` → `nodeName` and `id` → `nodeId` for node-targeting operations (updateNode, removeNode, moveNode, enableNode, disableNode)
- **AI connection types in disconnected-node detection** (Issue #581): Replaced hardcoded 7-type list with dynamic iteration over all connection types present in workflow data. Nodes connected via `ai_outputParser`, `ai_document`, `ai_textSplitter`, `ai_agent`, `ai_chain`, `ai_retriever` are no longer falsely flagged as disconnected during save
- **Connection schema and reference validation** (Issue #581): Added `.catchall()` to `workflowConnectionSchema` for unknown AI connection types, and extended connection reference validation to check all connection types (not just `main`)
- **autofix `filterOperationsByFixes` ID-vs-name mismatch**: Typeversion-upgrade operations now include `nodeName` alongside `nodeId`, and the filter checks both fields. Previously, `applyFixes=true` silently dropped all typeversion fixes because `fixedNodes` contained names but the filter only checked `nodeId` (UUID)
Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en
## [2.37.2] - 2026-03-15
### Fixed
- **Code validator `$()` false positive** (Issue #294): `$('Previous Node').first().json` no longer triggers "Invalid $ usage detected" warning. Added `(` and `_` to the regex negative lookahead to support standard n8n cross-node references and valid JS identifiers like `$_var`
- **Code validator helper function return false positive** (Issue #293): `function isValid(item) { return false; }` no longer triggers "Cannot return primitive values directly" error. Added helper function detection to skip primitive return checks when named functions or arrow function assignments are present
- **Null property removal in diff engine** (Issue #611): `{continueOnFail: null}` no longer causes Zod validation error "Expected boolean, received null". The diff engine now treats `null` values as property deletion (`delete` operator), and documentation updated from `undefined` to `null` for property removal
Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en
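The null-as-deletion behavior from Issue #611 can be illustrated with a small merge helper (names are illustrative; the real diff engine is more involved):

```typescript
// Treat null as "remove this property" when applying partial updates,
// so {continueOnFail: null} deletes the key instead of failing a
// boolean check with "Expected boolean, received null".
function applyUpdates(
  target: Record<string, unknown>,
  updates: Record<string, unknown>
): Record<string, unknown> {
  const result: Record<string, unknown> = { ...target };
  for (const [key, value] of Object.entries(updates)) {
    if (value === null) {
      delete result[key]; // null acts as the delete operator
    } else {
      result[key] = value; // everything else overwrites as usual
    }
  }
  return result;
}
```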
## [2.37.1] - 2026-03-14
### Fixed
- **Numeric sourceOutput remapping** (Issue #537): `addConnection` with numeric `sourceOutput` values like `"0"` or `"1"` now correctly maps to `"main"` with the corresponding `sourceIndex`, preventing malformed connection keys
- **IMAP Email Trigger activation** (Issue #538): `n8n-nodes-base.emailReadImap` and other IMAP-based polling triggers are now recognized as activatable triggers, allowing workflow activation
- **AI tool description false positives** (Issue #477): Validators now check `description` and `options.description` in addition to `toolDescription`, fixing false `MISSING_TOOL_DESCRIPTION` errors for toolWorkflow, toolCode, and toolSerpApi nodes
- **n8n_create_workflow undefined ID** (Issue #602): Added defensive check for missing workflow ID in API response with actionable error message
- **Flaky CI performance test**: Relaxed bulk insert ratio threshold from 15 to 20 to accommodate CI runner variability
Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en
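The numeric-`sourceOutput` remapping from Issue #537 can be sketched as follows (the function name and return shape are assumptions for illustration, not the project's actual API):

```typescript
// A numeric sourceOutput such as "0" or "1" is really an index into
// the "main" outputs, so fold it into sourceIndex rather than letting
// it become a malformed connection key like connections["1"].
function normalizeSourceOutput(
  sourceOutput: string | undefined,
  sourceIndex: number | undefined
): { sourceOutput: string; sourceIndex: number } {
  if (sourceOutput !== undefined && /^\d+$/.test(sourceOutput)) {
    return { sourceOutput: "main", sourceIndex: Number(sourceOutput) };
  }
  return { sourceOutput: sourceOutput ?? "main", sourceIndex: sourceIndex ?? 0 };
}
```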
## [2.37.0] - 2026-03-14
### Fixed
- **Unary operator sanitization** (Issue #592): Added missing `empty`, `notEmpty`, `exists`, `notExists` operators to the sanitizer's unary operator list, preventing IF/Switch node corruption during partial updates
- **Positional connection array preservation** (Issue #610): `removeNode` and `cleanStaleConnections` now trim only trailing empty arrays, preserving intermediate positional indices for IF/Switch multi-output nodes
- **Scoped sanitization**: Auto-sanitization now only runs on nodes that were actually added or updated, preventing unrelated nodes (e.g., HTTP Request parameters) from being silently modified
- **Activate/deactivate 415 errors** (Issue #633): Added empty body `{}` to POST calls for workflow activation/deactivation endpoints
- **Zod error readability** (Issue #630): Validation errors now return human-readable `"path: message"` strings instead of raw Zod error objects
- **updateNode error hints** (Issue #623): Improved error message when `updates` parameter is missing, showing correct structure with `nodeId`/`nodeName` and `updates` fields
- **removeConnection after removeNode** (Issue #624): When a node was already removed by a prior `removeNode` operation, the error message now explains that connections were automatically cleaned up
- **Connection type coercion** (Issue #629): `sourceOutput` and `targetInput` are now coerced to strings, handling numeric values (0, 1) passed by MCP clients
### Added
- **`saved` field in responses** (Issue #625): All `n8n_update_partial_workflow` responses now include `saved: true/false` to distinguish whether the workflow was persisted to n8n
- **Tag operations via dedicated API** (Issue #599): `addTag`/`removeTag` now use the n8n tag API (`PUT /workflows/{id}/tags`) instead of embedding tags in the workflow body, fixing silent tag failures. Includes automatic tag creation, case-insensitive name resolution, and last-operation-wins reconciliation for conflicting add/remove
- **`updateWorkflowTags` API client method**: New method on `N8nApiClient` for managing workflow tag associations via the dedicated endpoint
- **`operationsApplied` in top-level response**: Promoted from nested `details` to top-level for easier consumption by MCP clients
Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en
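The Zod-error readability fix (Issue #630) amounts to flattening structured issues into `"path: message"` strings. A sketch, assuming issue objects shaped like Zod's (`path` array plus `message`); the helper itself is illustrative:

```typescript
interface Issue {
  path: (string | number)[];
  message: string;
}

// Turn structured validation issues into human-readable
// "path: message" strings, e.g. "operations.0.nodeName: Required",
// instead of returning raw error objects to MCP clients.
function formatIssues(issues: Issue[]): string[] {
  return issues.map((i) => `${i.path.join(".")}: ${i.message}`);
}
```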
## [2.36.2] - 2026-03-14
### Changed


@@ -5,7 +5,7 @@
[![npm version](https://img.shields.io/npm/v/n8n-mcp.svg)](https://www.npmjs.com/package/n8n-mcp)
[![codecov](https://codecov.io/gh/czlonkowski/n8n-mcp/graph/badge.svg?token=YOUR_TOKEN)](https://codecov.io/gh/czlonkowski/n8n-mcp)
[![Tests](https://img.shields.io/badge/tests-3336%20passing-brightgreen.svg)](https://github.com/czlonkowski/n8n-mcp/actions)
[![n8n version](https://img.shields.io/badge/n8n-2.12.3-orange.svg)](https://github.com/n8n-io/n8n)
[![n8n version](https://img.shields.io/badge/n8n-2.11.4-orange.svg)](https://github.com/n8n-io/n8n)
[![Docker](https://img.shields.io/badge/docker-ghcr.io%2Fczlonkowski%2Fn8n--mcp-green.svg)](https://github.com/czlonkowski/n8n-mcp/pkgs/container/n8n-mcp)
[![Deploy on Railway](https://railway.com/button.svg)](https://railway.com/deploy/n8n-mcp?referralCode=n8n-mcp)

Binary file not shown.


(diff of dist/mcp/handlers-n8n-manager.d.ts.map omitted: single-line generated source map, not human-readable)


@@ -288,15 +288,6 @@ async function handleCreateWorkflow(args, context) {
};
}
const workflow = await client.createWorkflow(input);
if (!workflow || !workflow.id) {
return {
success: false,
error: 'Workflow creation failed: n8n API returned an empty or invalid response. Verify your N8N_API_URL points to the correct /api/v1 endpoint and that the n8n instance supports workflow creation.',
details: {
response: workflow ? { keys: Object.keys(workflow) } : null
}
};
}
telemetry_1.telemetry.trackWorkflowCreation(workflow, true);
return {
success: true,

File diff suppressed because one or more lines are too long


(diff of dist/mcp/handlers-workflow-diff.d.ts.map omitted: single-line generated source map, not human-readable)


@@ -50,9 +50,6 @@ function getValidator(repository) {
}
return cachedValidator;
}
const NODE_TARGETING_OPERATIONS = new Set([
'updateNode', 'removeNode', 'moveNode', 'enableNode', 'disableNode'
]);
const workflowDiffSchema = zod_1.z.object({
id: zod_1.z.string(),
operations: zod_1.z.array(zod_1.z.object({
@@ -67,8 +64,8 @@ const workflowDiffSchema = zod_1.z.object({
target: zod_1.z.string().optional(),
from: zod_1.z.string().optional(),
to: zod_1.z.string().optional(),
sourceOutput: zod_1.z.union([zod_1.z.string(), zod_1.z.number()]).transform(String).optional(),
targetInput: zod_1.z.union([zod_1.z.string(), zod_1.z.number()]).transform(String).optional(),
sourceOutput: zod_1.z.string().optional(),
targetInput: zod_1.z.string().optional(),
sourceIndex: zod_1.z.number().optional(),
targetIndex: zod_1.z.number().optional(),
branch: zod_1.z.enum(['true', 'false']).optional(),
@@ -79,20 +76,6 @@ const workflowDiffSchema = zod_1.z.object({
settings: zod_1.z.any().optional(),
name: zod_1.z.string().optional(),
tag: zod_1.z.string().optional(),
destinationProjectId: zod_1.z.string().min(1).optional(),
id: zod_1.z.string().optional(),
}).transform((op) => {
if (NODE_TARGETING_OPERATIONS.has(op.type)) {
if (!op.nodeName && !op.nodeId && op.name) {
op.nodeName = op.name;
op.name = undefined;
}
if (!op.nodeId && op.id) {
op.nodeId = op.id;
op.id = undefined;
}
}
return op;
})),
validateOnly: zod_1.z.boolean().optional(),
continueOnError: zod_1.z.boolean().optional(),
@@ -184,12 +167,11 @@ async function handleUpdatePartialWorkflow(args, repository, context) {
else {
return {
success: false,
saved: false,
error: 'Failed to apply diff operations',
operationsApplied: diffResult.operationsApplied,
details: {
errors: diffResult.errors,
warnings: diffResult.warnings,
operationsApplied: diffResult.operationsApplied,
applied: diffResult.applied,
failed: diffResult.failed
}
@@ -257,7 +239,6 @@ async function handleUpdatePartialWorkflow(args, repository, context) {
if (!skipValidation) {
return {
success: false,
saved: false,
error: errorMessage,
details: {
errors: structureErrors,
@@ -266,7 +247,7 @@ async function handleUpdatePartialWorkflow(args, repository, context) {
applied: diffResult.applied,
recoveryGuidance: recoverySteps,
note: 'Operations were applied but created an invalid workflow structure. The workflow was NOT saved to n8n to prevent UI rendering errors.',
autoSanitizationNote: 'Auto-sanitization runs on modified nodes during updates to fix operator structures and add missing metadata. However, it cannot fix all issues (e.g., broken connections, branch mismatches). Use the recovery guidance above to resolve remaining issues.'
autoSanitizationNote: 'Auto-sanitization runs on all nodes during updates to fix operator structures and add missing metadata. However, it cannot fix all issues (e.g., broken connections, branch mismatches). Use the recovery guidance above to resolve remaining issues.'
}
};
}
@@ -278,77 +259,6 @@ async function handleUpdatePartialWorkflow(args, repository, context) {
}
try {
const updatedWorkflow = await client.updateWorkflow(input.id, diffResult.workflow);
let tagWarnings = [];
if (diffResult.tagsToAdd?.length || diffResult.tagsToRemove?.length) {
try {
const existingTags = Array.isArray(updatedWorkflow.tags)
? updatedWorkflow.tags.map((t) => typeof t === 'object' ? { id: t.id, name: t.name } : { id: '', name: t })
: [];
const allTags = await client.listTags();
const tagMap = new Map();
for (const t of allTags.data) {
if (t.id)
tagMap.set(t.name.toLowerCase(), t.id);
}
for (const tagName of (diffResult.tagsToAdd || [])) {
if (!tagMap.has(tagName.toLowerCase())) {
try {
const newTag = await client.createTag({ name: tagName });
if (newTag.id)
tagMap.set(tagName.toLowerCase(), newTag.id);
}
catch (createErr) {
tagWarnings.push(`Failed to create tag "${tagName}": ${createErr instanceof Error ? createErr.message : 'Unknown error'}`);
}
}
}
const currentTagIds = new Set();
for (const et of existingTags) {
if (et.id) {
currentTagIds.add(et.id);
}
else {
const resolved = tagMap.get(et.name.toLowerCase());
if (resolved)
currentTagIds.add(resolved);
}
}
for (const tagName of (diffResult.tagsToAdd || [])) {
const tagId = tagMap.get(tagName.toLowerCase());
if (tagId)
currentTagIds.add(tagId);
}
for (const tagName of (diffResult.tagsToRemove || [])) {
const tagId = tagMap.get(tagName.toLowerCase());
if (tagId)
currentTagIds.delete(tagId);
}
await client.updateWorkflowTags(input.id, Array.from(currentTagIds));
}
catch (tagError) {
tagWarnings.push(`Tag update failed: ${tagError instanceof Error ? tagError.message : 'Unknown error'}`);
logger_1.logger.warn('Tag operations failed (non-blocking)', tagError);
}
}
let transferMessage = '';
if (diffResult.transferToProjectId) {
try {
await client.transferWorkflow(input.id, diffResult.transferToProjectId);
transferMessage = ` Workflow transferred to project ${diffResult.transferToProjectId}.`;
}
catch (transferError) {
logger_1.logger.error('Failed to transfer workflow to project', transferError);
return {
success: false,
saved: true,
error: 'Workflow updated successfully but project transfer failed',
details: {
workflowUpdated: true,
transferError: transferError instanceof Error ? transferError.message : 'Unknown error'
}
};
}
}
let finalWorkflow = updatedWorkflow;
let activationMessage = '';
try {
@@ -376,7 +286,6 @@ async function handleUpdatePartialWorkflow(args, repository, context) {
logger_1.logger.error('Failed to activate workflow after update', activationError);
return {
success: false,
saved: true,
error: 'Workflow updated successfully but activation failed',
details: {
workflowUpdated: true,
@@ -394,7 +303,6 @@ async function handleUpdatePartialWorkflow(args, repository, context) {
logger_1.logger.error('Failed to deactivate workflow after update', deactivationError);
return {
success: false,
saved: true,
error: 'Workflow updated successfully but deactivation failed',
details: {
workflowUpdated: true,
@@ -421,7 +329,6 @@ async function handleUpdatePartialWorkflow(args, repository, context) {
}
return {
success: true,
saved: true,
data: {
id: finalWorkflow.id,
name: finalWorkflow.name,
@@ -429,12 +336,12 @@ async function handleUpdatePartialWorkflow(args, repository, context) {
nodeCount: finalWorkflow.nodes?.length || 0,
operationsApplied: diffResult.operationsApplied
},
message: `Workflow "${finalWorkflow.name}" updated successfully. Applied ${diffResult.operationsApplied} operations.${transferMessage}${activationMessage} Use n8n_get_workflow with mode 'structure' to verify current state.`,
message: `Workflow "${finalWorkflow.name}" updated successfully. Applied ${diffResult.operationsApplied} operations.${activationMessage} Use n8n_get_workflow with mode 'structure' to verify current state.`,
details: {
applied: diffResult.applied,
failed: diffResult.failed,
errors: diffResult.errors,
warnings: mergeWarnings(diffResult.warnings, tagWarnings)
warnings: diffResult.warnings
}
};
}
@@ -472,9 +379,7 @@ async function handleUpdatePartialWorkflow(args, repository, context) {
return {
success: false,
error: 'Invalid input',
details: {
errors: error.errors.map(e => `${e.path.join('.')}: ${e.message}`)
}
details: { errors: error.errors }
};
}
logger_1.logger.error('Failed to update partial workflow', error);
@@ -484,13 +389,6 @@ async function handleUpdatePartialWorkflow(args, repository, context) {
};
}
}
function mergeWarnings(diffWarnings, tagWarnings) {
const merged = [
...(diffWarnings || []),
...tagWarnings.map(w => ({ operation: -1, message: w }))
];
return merged.length > 0 ? merged : undefined;
}
function inferIntentFromOperations(operations) {
if (!operations || operations.length === 0) {
return 'Partial workflow update';
@@ -518,8 +416,6 @@ function inferIntentFromOperations(operations) {
return 'Activate workflow';
case 'deactivateWorkflow':
return 'Deactivate workflow';
case 'transferWorkflow':
return `Transfer workflow to project ${op.destinationProjectId || ''}`.trim();
default:
return `Workflow ${op.type}`;
}

File diff suppressed because one or more lines are too long


(diff of dist/mcp/server.d.ts.map omitted: single-line generated source map, not human-readable)

dist/mcp/server.js vendored
View File

@@ -124,7 +124,6 @@ class N8NDocumentationMCPServer {
this.earlyLogger.logCheckpoint(startup_checkpoints_1.STARTUP_CHECKPOINTS.N8N_API_READY); this.earlyLogger.logCheckpoint(startup_checkpoints_1.STARTUP_CHECKPOINTS.N8N_API_READY);
} }
}); });
this.initialized.catch(() => { });
logger_1.logger.info('Initializing n8n Documentation MCP server'); logger_1.logger.info('Initializing n8n Documentation MCP server');
this.server = new index_js_1.Server({ this.server = new index_js_1.Server({
name: 'n8n-documentation-mcp', name: 'n8n-documentation-mcp',

File diff suppressed because one or more lines are too long

View File

@@ -1 +1 @@
{"version":3,"file":"n8n-update-partial-workflow.d.ts","sourceRoot":"","sources":["../../../../src/mcp/tool-docs/workflow_management/n8n-update-partial-workflow.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,iBAAiB,EAAE,MAAM,UAAU,CAAC;AAE7C,eAAO,MAAM,2BAA2B,EAAE,iBAuazC,CAAC"} {"version":3,"file":"n8n-update-partial-workflow.d.ts","sourceRoot":"","sources":["../../../../src/mcp/tool-docs/workflow_management/n8n-update-partial-workflow.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,iBAAiB,EAAE,MAAM,UAAU,CAAC;AAE7C,eAAO,MAAM,2BAA2B,EAAE,iBA+ZzC,CAAC"}

View File

@@ -5,7 +5,7 @@ exports.n8nUpdatePartialWorkflowDoc = {
name: 'n8n_update_partial_workflow', name: 'n8n_update_partial_workflow',
category: 'workflow_management', category: 'workflow_management',
essentials: { essentials: {
description: 'Update workflow incrementally with diff operations. Types: addNode, removeNode, updateNode, moveNode, enable/disableNode, addConnection, removeConnection, rewireConnection, cleanStaleConnections, replaceConnections, updateSettings, updateName, add/removeTag, activateWorkflow, deactivateWorkflow, transferWorkflow. Supports smart parameters (branch, case) for multi-output nodes. Full support for AI connections (ai_languageModel, ai_tool, ai_memory, ai_embedding, ai_vectorStore, ai_document, ai_textSplitter, ai_outputParser).', description: 'Update workflow incrementally with diff operations. Types: addNode, removeNode, updateNode, moveNode, enable/disableNode, addConnection, removeConnection, rewireConnection, cleanStaleConnections, replaceConnections, updateSettings, updateName, add/removeTag, activateWorkflow, deactivateWorkflow. Supports smart parameters (branch, case) for multi-output nodes. Full support for AI connections (ai_languageModel, ai_tool, ai_memory, ai_embedding, ai_vectorStore, ai_document, ai_textSplitter, ai_outputParser).',
keyParameters: ['id', 'operations', 'continueOnError'], keyParameters: ['id', 'operations', 'continueOnError'],
example: 'n8n_update_partial_workflow({id: "wf_123", operations: [{type: "rewireConnection", source: "IF", from: "Old", to: "New", branch: "true"}]})', example: 'n8n_update_partial_workflow({id: "wf_123", operations: [{type: "rewireConnection", source: "IF", from: "Old", to: "New", branch: "true"}]})',
performance: 'Fast (50-200ms)', performance: 'Fast (50-200ms)',
@@ -23,8 +23,7 @@ exports.n8nUpdatePartialWorkflowDoc = {
'Batch AI component connections for atomic updates', 'Batch AI component connections for atomic updates',
'Auto-sanitization: ALL nodes auto-fixed during updates (operator structures, missing metadata)', 'Auto-sanitization: ALL nodes auto-fixed during updates (operator structures, missing metadata)',
'Node renames automatically update all connection references - no manual connection operations needed', 'Node renames automatically update all connection references - no manual connection operations needed',
'Activate/deactivate workflows: Use activateWorkflow/deactivateWorkflow operations (requires activatable triggers like webhook/schedule)', 'Activate/deactivate workflows: Use activateWorkflow/deactivateWorkflow operations (requires activatable triggers like webhook/schedule)'
'Transfer workflows between projects: Use transferWorkflow with destinationProjectId (enterprise feature)'
] ]
}, },
full: { full: {
@@ -57,9 +56,6 @@ exports.n8nUpdatePartialWorkflowDoc = {
- **activateWorkflow**: Activate the workflow to enable automatic execution via triggers - **activateWorkflow**: Activate the workflow to enable automatic execution via triggers
- **deactivateWorkflow**: Deactivate the workflow to prevent automatic execution - **deactivateWorkflow**: Deactivate the workflow to prevent automatic execution
### Project Management Operations (1 type):
- **transferWorkflow**: Transfer the workflow to a different project. Requires \`destinationProjectId\`. Enterprise/cloud feature.
## Smart Parameters for Multi-Output Nodes ## Smart Parameters for Multi-Output Nodes
For **IF nodes**, use semantic 'branch' parameter instead of technical sourceIndex: For **IF nodes**, use semantic 'branch' parameter instead of technical sourceIndex:
@@ -199,12 +195,12 @@ Please choose a different name.
- Can rename a node and add/remove connections using the new name in the same batch - Can rename a node and add/remove connections using the new name in the same batch
- Use \`validateOnly: true\` to preview effects before applying - Use \`validateOnly: true\` to preview effects before applying
## Removing Properties with null ## Removing Properties with undefined
To remove a property from a node, set its value to \`null\` in the updates object. This is essential when migrating from deprecated properties or cleaning up optional configuration fields. To remove a property from a node, set its value to \`undefined\` in the updates object. This is essential when migrating from deprecated properties or cleaning up optional configuration fields.
### Why Use null? ### Why Use undefined?
- **Property removal**: Setting a property to \`null\` removes it completely from the node object - **Property removal vs. null**: Setting a property to \`undefined\` removes it completely from the node object, while \`null\` sets the property to a null value
- **Validation constraints**: Some properties are mutually exclusive (e.g., \`continueOnFail\` and \`onError\`). Simply setting one without removing the other will fail validation - **Validation constraints**: Some properties are mutually exclusive (e.g., \`continueOnFail\` and \`onError\`). Simply setting one without removing the other will fail validation
- **Deprecated property migration**: When n8n deprecates properties, you must remove the old property before the new one will work - **Deprecated property migration**: When n8n deprecates properties, you must remove the old property before the new one will work
@@ -216,7 +212,7 @@ n8n_update_partial_workflow({
operations: [{ operations: [{
type: "updateNode", type: "updateNode",
nodeName: "HTTP Request", nodeName: "HTTP Request",
updates: { onError: null } updates: { onError: undefined }
}] }]
}); });
@@ -226,7 +222,7 @@ n8n_update_partial_workflow({
operations: [{ operations: [{
type: "updateNode", type: "updateNode",
nodeId: "node_abc", nodeId: "node_abc",
updates: { disabled: null } updates: { disabled: undefined }
}] }]
}); });
\`\`\` \`\`\`
@@ -240,7 +236,7 @@ n8n_update_partial_workflow({
operations: [{ operations: [{
type: "updateNode", type: "updateNode",
nodeName: "API Request", nodeName: "API Request",
updates: { "parameters.authentication": null } updates: { "parameters.authentication": undefined }
}] }]
}); });
@@ -250,7 +246,7 @@ n8n_update_partial_workflow({
operations: [{ operations: [{
type: "updateNode", type: "updateNode",
nodeName: "HTTP Request", nodeName: "HTTP Request",
updates: { "parameters.headers": null } updates: { "parameters.headers": undefined }
}] }]
}); });
\`\`\` \`\`\`
@@ -276,7 +272,7 @@ n8n_update_partial_workflow({
type: "updateNode", type: "updateNode",
nodeName: "HTTP Request", nodeName: "HTTP Request",
updates: { updates: {
continueOnFail: null, continueOnFail: undefined,
onError: "continueErrorOutput" onError: "continueErrorOutput"
} }
}] }]
@@ -292,15 +288,15 @@ n8n_update_partial_workflow({
type: "updateNode", type: "updateNode",
nodeName: "Data Processor", nodeName: "Data Processor",
updates: { updates: {
continueOnFail: null, continueOnFail: undefined,
alwaysOutputData: null, alwaysOutputData: undefined,
"parameters.legacy_option": null "parameters.legacy_option": undefined
} }
}] }]
}); });
\`\`\` \`\`\`
### When to Use null ### When to Use undefined
- Removing deprecated properties during migration - Removing deprecated properties during migration
- Cleaning up optional configuration flags - Cleaning up optional configuration flags
- Resolving mutual exclusivity validation errors - Resolving mutual exclusivity validation errors
@@ -346,14 +342,11 @@ n8n_update_partial_workflow({
'// Rewire AI Agent to use different language model\nn8n_update_partial_workflow({id: "ai9", operations: [{type: "rewireConnection", source: "AI Agent", from: "OpenAI Chat Model", to: "Anthropic Chat Model", sourceOutput: "ai_languageModel"}]})', '// Rewire AI Agent to use different language model\nn8n_update_partial_workflow({id: "ai9", operations: [{type: "rewireConnection", source: "AI Agent", from: "OpenAI Chat Model", to: "Anthropic Chat Model", sourceOutput: "ai_languageModel"}]})',
'// Replace all AI tools for an agent\nn8n_update_partial_workflow({id: "ai10", operations: [\n {type: "removeConnection", source: "Old Tool 1", target: "AI Agent", sourceOutput: "ai_tool"},\n {type: "removeConnection", source: "Old Tool 2", target: "AI Agent", sourceOutput: "ai_tool"},\n {type: "addConnection", source: "New HTTP Tool", target: "AI Agent", sourceOutput: "ai_tool"},\n {type: "addConnection", source: "New Code Tool", target: "AI Agent", sourceOutput: "ai_tool"}\n]})', '// Replace all AI tools for an agent\nn8n_update_partial_workflow({id: "ai10", operations: [\n {type: "removeConnection", source: "Old Tool 1", target: "AI Agent", sourceOutput: "ai_tool"},\n {type: "removeConnection", source: "Old Tool 2", target: "AI Agent", sourceOutput: "ai_tool"},\n {type: "addConnection", source: "New HTTP Tool", target: "AI Agent", sourceOutput: "ai_tool"},\n {type: "addConnection", source: "New Code Tool", target: "AI Agent", sourceOutput: "ai_tool"}\n]})',
'\n// ============ REMOVING PROPERTIES EXAMPLES ============', '\n// ============ REMOVING PROPERTIES EXAMPLES ============',
'// Remove a simple property\nn8n_update_partial_workflow({id: "rm1", operations: [{type: "updateNode", nodeName: "HTTP Request", updates: {onError: null}}]})', '// Remove a simple property\nn8n_update_partial_workflow({id: "rm1", operations: [{type: "updateNode", nodeName: "HTTP Request", updates: {onError: undefined}}]})',
'// Migrate from deprecated continueOnFail to onError\nn8n_update_partial_workflow({id: "rm2", operations: [{type: "updateNode", nodeName: "HTTP Request", updates: {continueOnFail: null, onError: "continueErrorOutput"}}]})', '// Migrate from deprecated continueOnFail to onError\nn8n_update_partial_workflow({id: "rm2", operations: [{type: "updateNode", nodeName: "HTTP Request", updates: {continueOnFail: undefined, onError: "continueErrorOutput"}}]})',
'// Remove nested property\nn8n_update_partial_workflow({id: "rm3", operations: [{type: "updateNode", nodeName: "API Request", updates: {"parameters.authentication": null}}]})', '// Remove nested property\nn8n_update_partial_workflow({id: "rm3", operations: [{type: "updateNode", nodeName: "API Request", updates: {"parameters.authentication": undefined}}]})',
'// Remove multiple properties\nn8n_update_partial_workflow({id: "rm4", operations: [{type: "updateNode", nodeName: "Data Processor", updates: {continueOnFail: null, alwaysOutputData: null, "parameters.legacy_option": null}}]})', '// Remove multiple properties\nn8n_update_partial_workflow({id: "rm4", operations: [{type: "updateNode", nodeName: "Data Processor", updates: {continueOnFail: undefined, alwaysOutputData: undefined, "parameters.legacy_option": undefined}}]})',
'// Remove entire array property\nn8n_update_partial_workflow({id: "rm5", operations: [{type: "updateNode", nodeName: "HTTP Request", updates: {"parameters.headers": null}}]})', '// Remove entire array property\nn8n_update_partial_workflow({id: "rm5", operations: [{type: "updateNode", nodeName: "HTTP Request", updates: {"parameters.headers": undefined}}]})'
'\n// ============ PROJECT TRANSFER EXAMPLES ============',
'// Transfer workflow to a different project\nn8n_update_partial_workflow({id: "tf1", operations: [{type: "transferWorkflow", destinationProjectId: "project-abc-123"}]})',
'// Transfer and activate in one call\nn8n_update_partial_workflow({id: "tf2", operations: [{type: "transferWorkflow", destinationProjectId: "project-abc-123"}, {type: "activateWorkflow"}]})'
], ],
useCases: [ useCases: [
'Rewire connections when replacing nodes', 'Rewire connections when replacing nodes',
@@ -371,8 +364,7 @@ n8n_update_partial_workflow({
'Add fallback language models to AI Agents', 'Add fallback language models to AI Agents',
'Configure Vector Store retrieval systems', 'Configure Vector Store retrieval systems',
'Swap language models in existing AI workflows', 'Swap language models in existing AI workflows',
'Batch-update AI tool connections', 'Batch-update AI tool connections'
'Transfer workflows between team projects (enterprise)'
], ],
performance: 'Very fast - typically 50-200ms. Much faster than full updates as only changes are processed.', performance: 'Very fast - typically 50-200ms. Much faster than full updates as only changes are processed.',
bestPractices: [ bestPractices: [
@@ -392,9 +384,9 @@ n8n_update_partial_workflow({
'Use targetIndex for fallback models (primary=0, fallback=1)', 'Use targetIndex for fallback models (primary=0, fallback=1)',
'Batch AI component connections in a single operation for atomicity', 'Batch AI component connections in a single operation for atomicity',
'Validate AI workflows after connection changes to catch configuration errors', 'Validate AI workflows after connection changes to catch configuration errors',
'To remove properties, set them to null in the updates object', 'To remove properties, set them to undefined (not null) in the updates object',
'When migrating from deprecated properties, remove the old property and add the new one in the same operation', 'When migrating from deprecated properties, remove the old property and add the new one in the same operation',
'Use null to resolve mutual exclusivity validation errors between properties', 'Use undefined to resolve mutual exclusivity validation errors between properties',
'Batch multiple property removals in a single updateNode operation for efficiency' 'Batch multiple property removals in a single updateNode operation for efficiency'
], ],
pitfalls: [ pitfalls: [
@@ -416,8 +408,8 @@ n8n_update_partial_workflow({
'**Auto-sanitization runs on ALL nodes**: When ANY update is made, ALL nodes in the workflow are sanitized (not just modified ones)', '**Auto-sanitization runs on ALL nodes**: When ANY update is made, ALL nodes in the workflow are sanitized (not just modified ones)',
'**Auto-sanitization cannot fix everything**: It fixes operator structures and missing metadata, but cannot fix broken connections or branch mismatches', '**Auto-sanitization cannot fix everything**: It fixes operator structures and missing metadata, but cannot fix broken connections or branch mismatches',
'**Corrupted workflows beyond repair**: Workflows in paradoxical states (API returns corrupt, API rejects updates) cannot be fixed via API - must be recreated', '**Corrupted workflows beyond repair**: Workflows in paradoxical states (API returns corrupt, API rejects updates) cannot be fixed via API - must be recreated',
'To remove a property, set it to null in the updates object', 'Setting a property to null does NOT remove it - use undefined instead',
'When properties are mutually exclusive (e.g., continueOnFail and onError), setting only the new property will fail - you must remove the old one with null', 'When properties are mutually exclusive (e.g., continueOnFail and onError), setting only the new property will fail - you must remove the old one with undefined',
'Removing a required property may cause validation errors - check node documentation first', 'Removing a required property may cause validation errors - check node documentation first',
'Nested property removal with dot notation only removes the specific nested field, not the entire parent object', 'Nested property removal with dot notation only removes the specific nested field, not the entire parent object',
'Array index notation (e.g., "parameters.headers[0]") is not supported - remove the entire array property instead' 'Array index notation (e.g., "parameters.headers[0]") is not supported - remove the entire array property instead'
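The null-to-undefined change in the documentation above rests on a JavaScript distinction the new text spells out: setting a key to `undefined` removes it, while `null` stores a null value. A minimal sketch of an update applier under that contract (the `applyNodeUpdates` helper and its dot-path handling are hypothetical, not the actual n8n-mcp implementation):

```javascript
// Apply an `updates` object to a node, treating `undefined` as "delete this
// key" and supporting dot-notation paths such as "parameters.headers".
function applyNodeUpdates(node, updates) {
  for (const [path, value] of Object.entries(updates)) {
    const keys = path.split('.');
    const last = keys.pop();
    // Walk to the parent object of the target key, creating objects as needed.
    let parent = node;
    for (const key of keys) {
      if (typeof parent[key] !== 'object' || parent[key] === null) {
        parent[key] = {};
      }
      parent = parent[key];
    }
    if (value === undefined) {
      delete parent[last]; // undefined removes the property entirely
    } else {
      parent[last] = value; // any other value, including null, is stored as-is
    }
  }
  return node;
}
```

This mirrors why mutually exclusive properties need an explicit `undefined`: only deletion clears the old key, so `{ continueOnFail: undefined, onError: "continueErrorOutput" }` can succeed where setting `onError` alone would fail validation.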

View File

@@ -1 +1 @@
{"version":3,"file":"n8n-update-partial-workflow.js","sourceRoot":"","sources":["../../../../src/mcp/tool-docs/workflow_management/n8n-update-partial-workflow.ts"],"names":[],"mappings":";;;AAEa,QAAA,2BAA2B,GAAsB;IAC5D,IAAI,EAAE,6BAA6B;IACnC,QAAQ,EAAE,qBAAqB;IAC/B,UAAU,EAAE;QACV,WAAW,EAAE,khBAAkhB;QAC/hB,aAAa,EAAE,CAAC,IAAI,EAAE,YAAY,EAAE,iBAAiB,CAAC;QACtD,OAAO,EAAE,6IAA6I;QACtJ,WAAW,EAAE,iBAAiB;QAC9B,IAAI,EAAE;YACJ,gJAAgJ;YAChJ,oGAAoG;YACpG,mDAAmD;YACnD,wCAAwC;YACxC,6BAA6B;YAC7B,6DAA6D;YAC7D,uDAAuD;YACvD,0DAA0D;YAC1D,kCAAkC;YAClC,iFAAiF;YACjF,mDAAmD;YACnD,gGAAgG;YAChG,sGAAsG;YACtG,yIAAyI;YACzI,0GAA0G;SAC3G;KACF;IACD,IAAI,EAAE;QACJ,WAAW,EAAE;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;iCAqRgB;QAC7B,UAAU,EAAE;YACV,EAAE,EAAE,EAAE,IAAI,EAAE,QAAQ,EAAE,QAAQ,EAAE,IAAI,EAAE,WAAW,EAAE,uBAAuB,EAAE;YAC5E,UAAU,EAAE;gBACV,IAAI,EAAE,OAAO;gBACb,QAAQ,EAAE,IAAI;gBACd,WAAW,EAAE,iIAAiI;aAC/I;YACD,YAAY,EAAE,EAAE,IAAI,EAAE,SAAS,EAAE,WAAW,EAAE,yDAAyD,EAAE;YACzG,eAAe,EAAE,EAAE,IAAI,EAAE,SAAS,EAAE,WAAW,EAAE,6IAA6I,EAAE;YAChM,MAAM,EAAE,EAAE,IAAI,EAAE,QAAQ,EAAE,WAAW,EAAE,qIAAqI,EAAE;SAC/K;QACD,OAAO,EAAE,uNAAuN;QAChO,QAAQ,EAAE;YACR,mOAAmO;YACnO,wNAAwN;YACxN,kTAAkT;YAClT,0VAA0V;YAC1V,gMAAgM;YAChM,mLAAmL;YACnL,mLAAmL;YACnL,6UAA6U;YAC7U,oMAAoM;YACpM,oYAAoY;YACpY,qJAAqJ;YACrJ,+MAA+M;YAC/M,kSAAkS;YAClS,0LAA0L;YAC1L,wJAAwJ;YACxJ,uDAAuD;YACvD,2MAA2M;YAC3M,wLAAwL;YACxL,+LAA+L;YAC/L,gNAAgN;YAChN,4hBAA4hB;YAC5hB,+WAA+W;YAC/W,qWAAqW;YACrW,uVAAuV;YACvV,qPAAqP;YACrP,0eAA0e;YAC1e,6DAA6D;YAC7D,+JAA+J;YAC/J,+NAA+N;YAC/N,gLAAgL;YAChL,oOAAoO;YACpO,gLAAgL;YAChL,0DAA0D;YAC1D,0KAA0K;YAC1K,+LAA+L;SAChM;QACD,QAAQ,EAAE;YACR,yCAAyC;YACzC,uDAAuD;YACvD,wDAAwD;YACxD,+CAA+C;YAC/C,+BAA+B;YAC/B,iCAAiC;YACjC,8CAA8C;YAC9C,sBAAsB;YACtB,2BAA2B;YAC3B,yBAAyB;YACzB,iEAAiE;YACjE,+CAA+C;YAC/C,2CAA2C;YAC
3C,0CAA0C;YAC1C,+CAA+C;YAC/C,kCAAkC;YAClC,uDAAuD;SACxD;QACD,WAAW,EAAE,8FAA8F;QAC3G,aAAa,EAAE;YACb,kPAAkP;YAClP,iEAAiE;YACjE,+DAA+D;YAC/D,oDAAoD;YACpD,yDAAyD;YACzD,iDAAiD;YACjD,gEAAgE;YAChE,qDAAqD;YACrD,mCAAmC;YACnC,wCAAwC;YACxC,gDAAgD;YAChD,8FAA8F;YAC9F,2EAA2E;YAC3E,6DAA6D;YAC7D,oEAAoE;YACpE,8EAA8E;YAC9E,8DAA8D;YAC9D,8GAA8G;YAC9G,6EAA6E;YAC7E,kFAAkF;SACnF;QACD,QAAQ,EAAE;YACR,uGAAuG;YACvG,wEAAwE;YACxE,6DAA6D;YAC7D,sFAAsF;YACtF,4DAA4D;YAC5D,yEAAyE;YACzE,yFAAyF;YACzF,wFAAwF;YACxF,mGAAmG;YACnG,iFAAiF;YACjF,iNAAiN;YACjN,kKAAkK;YAClK,4EAA4E;YAC5E,yFAAyF;YACzF,4LAA4L;YAC5L,oIAAoI;YACpI,wJAAwJ;YACxJ,+JAA+J;YAC/J,4DAA4D;YAC5D,4JAA4J;YAC5J,2FAA2F;YAC3F,gHAAgH;YAChH,kHAAkH;SACnH;QACD,YAAY,EAAE,CAAC,0BAA0B,EAAE,kBAAkB,EAAE,mBAAmB,EAAE,qBAAqB,CAAC;KAC3G;CACF,CAAC"} {"version":3,"file":"n8n-update-partial-workflow.js","sourceRoot":"","sources":["../../../../src/mcp/tool-docs/workflow_management/n8n-update-partial-workflow.ts"],"names":[],"mappings":";;;AAEa,QAAA,2BAA2B,GAAsB;IAC5D,IAAI,EAAE,6BAA6B;IACnC,QAAQ,EAAE,qBAAqB;IAC/B,UAAU,EAAE;QACV,WAAW,EAAE,ggBAAggB;QAC7gB,aAAa,EAAE,CAAC,IAAI,EAAE,YAAY,EAAE,iBAAiB,CAAC;QACtD,OAAO,EAAE,6IAA6I;QACtJ,WAAW,EAAE,iBAAiB;QAC9B,IAAI,EAAE;YACJ,gJAAgJ;YAChJ,oGAAoG;YACpG,mDAAmD;YACnD,wCAAwC;YACxC,6BAA6B;YAC7B,6DAA6D;YAC7D,uDAAuD;YACvD,0DAA0D;YAC1D,kCAAkC;YAClC,iFAAiF;YACjF,mDAAmD;YACnD,gGAAgG;YAChG,sGAAsG;YACtG,yIAAyI;SAC1I;KACF;IACD,IAAI,EAAE;QACJ,WAAW,EAAE;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;iCAkRgB;QAC7B,UAAU,EAAE;YACV,EAAE,EAAE,EAAE,IAAI,EAAE,QAAQ,EAAE,QAAQ,EAAE,IAAI,EAAE,WAAW,EAAE,uBAAuB,EAAE;YAC5E,UAAU,EAAE;gBACV,IAAI,EAAE,OAAO;gBACb,QAAQ,EAAE,IAAI;gBACd,WAAW,EAAE,iIAAiI;aAC/I;YACD,YAAY,EAAE,EAAE,IAAI,EAAE,SAAS,EAAE,WAAW,EAAE,yDAAyD,EAAE;YACzG,eAAe,EAAE,EAAE,IAAI,EAAE,SAAS,EAAE,WAAW,EAAE,6IAA6I,EAAE;YAChM,MAA
M,EAAE,EAAE,IAAI,EAAE,QAAQ,EAAE,WAAW,EAAE,qIAAqI,EAAE;SAC/K;QACD,OAAO,EAAE,uNAAuN;QAChO,QAAQ,EAAE;YACR,mOAAmO;YACnO,wNAAwN;YACxN,kTAAkT;YAClT,0VAA0V;YAC1V,gMAAgM;YAChM,mLAAmL;YACnL,mLAAmL;YACnL,6UAA6U;YAC7U,oMAAoM;YACpM,oYAAoY;YACpY,qJAAqJ;YACrJ,+MAA+M;YAC/M,kSAAkS;YAClS,0LAA0L;YAC1L,wJAAwJ;YACxJ,uDAAuD;YACvD,2MAA2M;YAC3M,wLAAwL;YACxL,+LAA+L;YAC/L,gNAAgN;YAChN,4hBAA4hB;YAC5hB,+WAA+W;YAC/W,qWAAqW;YACrW,uVAAuV;YACvV,qPAAqP;YACrP,0eAA0e;YAC1e,6DAA6D;YAC7D,oKAAoK;YACpK,oOAAoO;YACpO,qLAAqL;YACrL,mPAAmP;YACnP,qLAAqL;SACtL;QACD,QAAQ,EAAE;YACR,yCAAyC;YACzC,uDAAuD;YACvD,wDAAwD;YACxD,+CAA+C;YAC/C,+BAA+B;YAC/B,iCAAiC;YACjC,8CAA8C;YAC9C,sBAAsB;YACtB,2BAA2B;YAC3B,yBAAyB;YACzB,iEAAiE;YACjE,+CAA+C;YAC/C,2CAA2C;YAC3C,0CAA0C;YAC1C,+CAA+C;YAC/C,kCAAkC;SACnC;QACD,WAAW,EAAE,8FAA8F;QAC3G,aAAa,EAAE;YACb,kPAAkP;YAClP,iEAAiE;YACjE,+DAA+D;YAC/D,oDAAoD;YACpD,yDAAyD;YACzD,iDAAiD;YACjD,gEAAgE;YAChE,qDAAqD;YACrD,mCAAmC;YACnC,wCAAwC;YACxC,gDAAgD;YAChD,8FAA8F;YAC9F,2EAA2E;YAC3E,6DAA6D;YAC7D,oEAAoE;YACpE,8EAA8E;YAC9E,8EAA8E;YAC9E,8GAA8G;YAC9G,kFAAkF;YAClF,kFAAkF;SACnF;QACD,QAAQ,EAAE;YACR,uGAAuG;YACvG,wEAAwE;YACxE,6DAA6D;YAC7D,sFAAsF;YACtF,4DAA4D;YAC5D,yEAAyE;YACzE,yFAAyF;YACzF,wFAAwF;YACxF,mGAAmG;YACnG,iFAAiF;YACjF,iNAAiN;YACjN,kKAAkK;YAClK,4EAA4E;YAC5E,yFAAyF;YACzF,4LAA4L;YAC5L,oIAAoI;YACpI,wJAAwJ;YACxJ,+JAA+J;YAC/J,uEAAuE;YACvE,iKAAiK;YACjK,2FAA2F;YAC3F,gHAAgH;YAChH,kHAAkH;SACnH;QACD,YAAY,EAAE,CAAC,0BAA0B,EAAE,kBAAkB,EAAE,mBAAmB,EAAE,qBAAqB,CAAC;KAC3G;CACF,CAAC"}

View File

@@ -137,7 +137,7 @@ exports.n8nManagementTools = [
}, },
{ {
name: 'n8n_update_partial_workflow', name: 'n8n_update_partial_workflow',
description: `Update workflow incrementally with diff operations. Types: addNode, removeNode, updateNode, moveNode, enable/disableNode, addConnection, removeConnection, updateSettings, updateName, add/removeTag, activate/deactivateWorkflow, transferWorkflow. See tools_documentation("n8n_update_partial_workflow", "full") for details.`, description: `Update workflow incrementally with diff operations. Types: addNode, removeNode, updateNode, moveNode, enable/disableNode, addConnection, removeConnection, updateSettings, updateName, add/removeTag. See tools_documentation("n8n_update_partial_workflow", "full") for details.`,
inputSchema: { inputSchema: {
type: 'object', type: 'object',
additionalProperties: true, additionalProperties: true,

File diff suppressed because one or more lines are too long

View File

@@ -33,8 +33,8 @@ export declare function validateVectorStoreTool(node: WorkflowNode, reverseConne
export declare function validateWorkflowTool(node: WorkflowNode, reverseConnections?: Map<string, ReverseConnection[]>): ValidationIssue[]; export declare function validateWorkflowTool(node: WorkflowNode, reverseConnections?: Map<string, ReverseConnection[]>): ValidationIssue[];
export declare function validateAIAgentTool(node: WorkflowNode, reverseConnections: Map<string, ReverseConnection[]>): ValidationIssue[]; export declare function validateAIAgentTool(node: WorkflowNode, reverseConnections: Map<string, ReverseConnection[]>): ValidationIssue[];
export declare function validateMCPClientTool(node: WorkflowNode): ValidationIssue[]; export declare function validateMCPClientTool(node: WorkflowNode): ValidationIssue[];
export declare function validateCalculatorTool(_node: WorkflowNode): ValidationIssue[]; export declare function validateCalculatorTool(node: WorkflowNode): ValidationIssue[];
export declare function validateThinkTool(_node: WorkflowNode): ValidationIssue[]; export declare function validateThinkTool(node: WorkflowNode): ValidationIssue[];
export declare function validateSerpApiTool(node: WorkflowNode): ValidationIssue[]; export declare function validateSerpApiTool(node: WorkflowNode): ValidationIssue[];
export declare function validateWikipediaTool(node: WorkflowNode): ValidationIssue[]; export declare function validateWikipediaTool(node: WorkflowNode): ValidationIssue[];
export declare function validateSearXngTool(node: WorkflowNode): ValidationIssue[]; export declare function validateSearXngTool(node: WorkflowNode): ValidationIssue[];

View File

@@ -1 +1 @@
{"version":3,"file":"ai-tool-validators.d.ts","sourceRoot":"","sources":["../../src/services/ai-tool-validators.ts"],"names":[],"mappings":"AAmBA,MAAM,WAAW,YAAY;IAC3B,EAAE,EAAE,MAAM,CAAC;IACX,IAAI,EAAE,MAAM,CAAC;IACb,IAAI,EAAE,MAAM,CAAC;IACb,QAAQ,EAAE,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;IAC3B,UAAU,EAAE,GAAG,CAAC;IAChB,WAAW,CAAC,EAAE,GAAG,CAAC;IAClB,QAAQ,CAAC,EAAE,OAAO,CAAC;IACnB,WAAW,CAAC,EAAE,MAAM,CAAC;CACtB;AAiBD,MAAM,WAAW,YAAY;IAC3B,IAAI,CAAC,EAAE,MAAM,CAAC;IACd,KAAK,EAAE,YAAY,EAAE,CAAC;IACtB,WAAW,EAAE,MAAM,CAAC,MAAM,EAAE,GAAG,CAAC,CAAC;IACjC,QAAQ,CAAC,EAAE,GAAG,CAAC;CAChB;AAED,MAAM,WAAW,iBAAiB;IAChC,UAAU,EAAE,MAAM,CAAC;IACnB,UAAU,EAAE,MAAM,CAAC;IACnB,IAAI,EAAE,MAAM,CAAC;IACb,KAAK,EAAE,MAAM,CAAC;CACf;AAED,MAAM,WAAW,eAAe;IAC9B,QAAQ,EAAE,OAAO,GAAG,SAAS,GAAG,MAAM,CAAC;IACvC,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,QAAQ,CAAC,EAAE,MAAM,CAAC;IAClB,OAAO,EAAE,MAAM,CAAC;IAChB,IAAI,CAAC,EAAE,MAAM,CAAC;CACf;AAMD,wBAAgB,uBAAuB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CAuJ7E;AAMD,wBAAgB,gBAAgB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CAoCtE;AAMD,wBAAgB,uBAAuB,CACrC,IAAI,EAAE,YAAY,EAClB,kBAAkB,EAAE,GAAG,CAAC,MAAM,EAAE,iBAAiB,EAAE,CAAC,EACpD,QAAQ,EAAE,YAAY,GACrB,eAAe,EAAE,CAmCnB;AAMD,wBAAgB,oBAAoB,CAAC,IAAI,EAAE,YAAY,EAAE,kBAAkB,CAAC,EAAE,GAAG,CAAC,MAAM,EAAE,iBAAiB,EAAE,CAAC,GAAG,eAAe,EAAE,CA0BjI;AAMD,wBAAgB,mBAAmB,CACjC,IAAI,EAAE,YAAY,EAClB,kBAAkB,EAAE,GAAG,CAAC,MAAM,EAAE,iBAAiB,EAAE,CAAC,GACnD,eAAe,EAAE,CAmCnB;AAMD,wBAAgB,qBAAqB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CA0B3E;AAMD,wBAAgB,sBAAsB,CAAC,KAAK,EAAE,YAAY,GAAG,eAAe,EAAE,CAG7E;AAED,wBAAgB,iBAAiB,CAAC,KAAK,EAAE,YAAY,GAAG,eAAe,EAAE,CAGxE;AAMD,wBAAgB,mBAAmB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CAyBzE;AAED,wBAAgB,qBAAqB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CA4B3E;AAED,wBAAgB,mBAAmB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CA0BzE;AAED,wBAAgB,wBAAwB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CAyB9E;AAKD,eAAO,MAAM,kBAAkB;;;;;;;;;;;;;CAarB,CAAC;AAKX,wBAAgB,eAAe,CAAC,QAAQ,EAAE,MAAM,GAAG,OAAO,CAGzD;AAKD,wBAAgB,qBAAqB,CACnC,IAAI,EAAE,YAAY,EAClB,QAAQ,EAAE,MAAM,EAChB,kBAAkB,EAAE,GA
AG,CAAC,MAAM,EAAE,iBAAiB,EAAE,CAAC,EACpD,QAAQ,EAAE,YAAY,GACrB,eAAe,EAAE,CAgCnB"} {"version":3,"file":"ai-tool-validators.d.ts","sourceRoot":"","sources":["../../src/services/ai-tool-validators.ts"],"names":[],"mappings":"AAmBA,MAAM,WAAW,YAAY;IAC3B,EAAE,EAAE,MAAM,CAAC;IACX,IAAI,EAAE,MAAM,CAAC;IACb,IAAI,EAAE,MAAM,CAAC;IACb,QAAQ,EAAE,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;IAC3B,UAAU,EAAE,GAAG,CAAC;IAChB,WAAW,CAAC,EAAE,GAAG,CAAC;IAClB,QAAQ,CAAC,EAAE,OAAO,CAAC;IACnB,WAAW,CAAC,EAAE,MAAM,CAAC;CACtB;AAED,MAAM,WAAW,YAAY;IAC3B,IAAI,CAAC,EAAE,MAAM,CAAC;IACd,KAAK,EAAE,YAAY,EAAE,CAAC;IACtB,WAAW,EAAE,MAAM,CAAC,MAAM,EAAE,GAAG,CAAC,CAAC;IACjC,QAAQ,CAAC,EAAE,GAAG,CAAC;CAChB;AAED,MAAM,WAAW,iBAAiB;IAChC,UAAU,EAAE,MAAM,CAAC;IACnB,UAAU,EAAE,MAAM,CAAC;IACnB,IAAI,EAAE,MAAM,CAAC;IACb,KAAK,EAAE,MAAM,CAAC;CACf;AAED,MAAM,WAAW,eAAe;IAC9B,QAAQ,EAAE,OAAO,GAAG,SAAS,GAAG,MAAM,CAAC;IACvC,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,QAAQ,CAAC,EAAE,MAAM,CAAC;IAClB,OAAO,EAAE,MAAM,CAAC;IAChB,IAAI,CAAC,EAAE,MAAM,CAAC;CACf;AAMD,wBAAgB,uBAAuB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CAuJ7E;AAMD,wBAAgB,gBAAgB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CAoCtE;AAMD,wBAAgB,uBAAuB,CACrC,IAAI,EAAE,YAAY,EAClB,kBAAkB,EAAE,GAAG,CAAC,MAAM,EAAE,iBAAiB,EAAE,CAAC,EACpD,QAAQ,EAAE,YAAY,GACrB,eAAe,EAAE,CAmCnB;AAMD,wBAAgB,oBAAoB,CAAC,IAAI,EAAE,YAAY,EAAE,kBAAkB,CAAC,EAAE,GAAG,CAAC,MAAM,EAAE,iBAAiB,EAAE,CAAC,GAAG,eAAe,EAAE,CA0BjI;AAMD,wBAAgB,mBAAmB,CACjC,IAAI,EAAE,YAAY,EAClB,kBAAkB,EAAE,GAAG,CAAC,MAAM,EAAE,iBAAiB,EAAE,CAAC,GACnD,eAAe,EAAE,CAmCnB;AAMD,wBAAgB,qBAAqB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CA0B3E;AAMD,wBAAgB,sBAAsB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CAM5E;AAED,wBAAgB,iBAAiB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CAMvE;AAMD,wBAAgB,mBAAmB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CAyBzE;AAED,wBAAgB,qBAAqB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CA4B3E;AAED,wBAAgB,mBAAmB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CA0BzE;AAED,wBAAgB,wBAAwB,CAAC,IAAI,EAAE,YAAY,GAAG,eAAe,EAAE,CAyB9E;AAKD,eAAO,MAAM,kBAAkB;;;;;;;;;;;;;CAarB,CAAC;AAKX,wBAAgB,eAAe,CAAC,QAAQ,EAAE,MAAM,GAAG,OAAO,CAGzD;A
AKD,wBAAgB,qBAAqB,CACnC,IAAI,EAAE,YAAY,EAClB,QAAQ,EAAE,MAAM,EAChB,kBAAkB,EAAE,GAAG,CAAC,MAAM,EAAE,iBAAiB,EAAE,CAAC,EACpD,QAAQ,EAAE,YAAY,GACrB,eAAe,EAAE,CAgCnB"}

View File

@@ -21,14 +21,9 @@ const MIN_DESCRIPTION_LENGTH_MEDIUM = 15;
const MIN_DESCRIPTION_LENGTH_LONG = 20; const MIN_DESCRIPTION_LENGTH_LONG = 20;
const MAX_ITERATIONS_WARNING_THRESHOLD = 50; const MAX_ITERATIONS_WARNING_THRESHOLD = 50;
const MAX_TOPK_WARNING_THRESHOLD = 20; const MAX_TOPK_WARNING_THRESHOLD = 20;
function getToolDescription(node) {
return (node.parameters.toolDescription ||
node.parameters.description ||
node.parameters.options?.description);
}
function validateHTTPRequestTool(node) { function validateHTTPRequestTool(node) {
const issues = []; const issues = [];
if (!getToolDescription(node)) { if (!node.parameters.toolDescription) {
issues.push({ issues.push({
severity: 'error', severity: 'error',
nodeId: node.id, nodeId: node.id,
@@ -37,7 +32,7 @@ function validateHTTPRequestTool(node) {
code: 'MISSING_TOOL_DESCRIPTION' code: 'MISSING_TOOL_DESCRIPTION'
}); });
} }
-    else if (getToolDescription(node).trim().length < MIN_DESCRIPTION_LENGTH_MEDIUM) {
+    else if (node.parameters.toolDescription.trim().length < MIN_DESCRIPTION_LENGTH_MEDIUM) {
         issues.push({
             severity: 'warning',
             nodeId: node.id,
@@ -159,7 +154,7 @@ function validateHTTPRequestTool(node) {
 }
 function validateCodeTool(node) {
     const issues = [];
-    if (!getToolDescription(node)) {
+    if (!node.parameters.toolDescription) {
         issues.push({
             severity: 'error',
             nodeId: node.id,
@@ -189,7 +184,7 @@ function validateCodeTool(node) {
 }
 function validateVectorStoreTool(node, reverseConnections, workflow) {
     const issues = [];
-    if (!getToolDescription(node)) {
+    if (!node.parameters.toolDescription) {
         issues.push({
             severity: 'error',
             nodeId: node.id,
@@ -221,7 +216,7 @@ function validateVectorStoreTool(node, reverseConnections, workflow) {
 }
 function validateWorkflowTool(node, reverseConnections) {
     const issues = [];
-    if (!getToolDescription(node)) {
+    if (!node.parameters.toolDescription) {
         issues.push({
             severity: 'error',
             nodeId: node.id,
@@ -243,7 +238,7 @@ function validateWorkflowTool(node, reverseConnections) {
 }
 function validateAIAgentTool(node, reverseConnections) {
     const issues = [];
-    if (!getToolDescription(node)) {
+    if (!node.parameters.toolDescription) {
         issues.push({
             severity: 'error',
             nodeId: node.id,
@@ -275,7 +270,7 @@ function validateAIAgentTool(node, reverseConnections) {
 }
 function validateMCPClientTool(node) {
     const issues = [];
-    if (!getToolDescription(node)) {
+    if (!node.parameters.toolDescription) {
         issues.push({
             severity: 'error',
             nodeId: node.id,
@@ -295,15 +290,17 @@ function validateMCPClientTool(node) {
     }
     return issues;
 }
-function validateCalculatorTool(_node) {
-    return [];
+function validateCalculatorTool(node) {
+    const issues = [];
+    return issues;
 }
-function validateThinkTool(_node) {
-    return [];
+function validateThinkTool(node) {
+    const issues = [];
+    return issues;
 }
 function validateSerpApiTool(node) {
     const issues = [];
-    if (!getToolDescription(node)) {
+    if (!node.parameters.toolDescription) {
         issues.push({
             severity: 'error',
             nodeId: node.id,
@@ -324,7 +321,7 @@ function validateSerpApiTool(node) {
 }
 function validateWikipediaTool(node) {
     const issues = [];
-    if (!getToolDescription(node)) {
+    if (!node.parameters.toolDescription) {
         issues.push({
             severity: 'error',
             nodeId: node.id,
@@ -348,7 +345,7 @@ function validateWikipediaTool(node) {
 }
 function validateSearXngTool(node) {
     const issues = [];
-    if (!getToolDescription(node)) {
+    if (!node.parameters.toolDescription) {
         issues.push({
             severity: 'error',
             nodeId: node.id,
@@ -379,7 +376,7 @@ function validateWolframAlphaTool(node) {
             code: 'MISSING_CREDENTIALS'
         });
     }
-    if (!getToolDescription(node)) {
+    if (!node.parameters.description && !node.parameters.toolDescription) {
         issues.push({
             severity: 'info',
             nodeId: node.id,
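One side of every hunk above reads the tool description through a `getToolDescription(node)` helper instead of poking at `node.parameters.toolDescription` directly. The helper itself is not shown in this diff; the sketch below is a hypothetical reconstruction inferred from the inline checks (including the WolframAlpha case, which also accepts a plain `description` parameter), not the project's actual implementation:

```javascript
// Hypothetical helper matching the getToolDescription() calls in the hunks
// above; reconstructed from the inline checks, not the project's real code.
function getToolDescription(node) {
    const params = node.parameters || {};
    // Most tool nodes use `toolDescription`; WolframAlpha-style nodes use `description`.
    return params.toolDescription || params.description || '';
}

console.log(getToolDescription({ parameters: { toolDescription: 'Searches Wikipedia' } })); // 'Searches Wikipedia'
console.log(getToolDescription({ parameters: { description: 'Queries WolframAlpha' } }));   // 'Queries WolframAlpha'
console.log(getToolDescription({ parameters: {} }));                                        // ''
```

Centralizing the lookup this way keeps the `!getToolDescription(node)` checks uniform across all the validators shown here.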

File diff suppressed because one or more lines are too long

View File (n8n-api-client.d.ts)

@@ -20,7 +20,6 @@ export declare class N8nApiClient {
     getWorkflow(id: string): Promise<Workflow>;
     updateWorkflow(id: string, workflow: Partial<Workflow>): Promise<Workflow>;
     deleteWorkflow(id: string): Promise<Workflow>;
-    transferWorkflow(id: string, destinationProjectId: string): Promise<void>;
     activateWorkflow(id: string): Promise<Workflow>;
     deactivateWorkflow(id: string): Promise<Workflow>;
     listWorkflows(params?: WorkflowListParams): Promise<WorkflowListResponse>;
@@ -37,7 +36,6 @@ export declare class N8nApiClient {
     createTag(tag: Partial<Tag>): Promise<Tag>;
     updateTag(id: string, tag: Partial<Tag>): Promise<Tag>;
     deleteTag(id: string): Promise<void>;
-    updateWorkflowTags(workflowId: string, tagIds: string[]): Promise<Tag[]>;
     getSourceControlStatus(): Promise<SourceControlStatus>;
     pullSourceControl(force?: boolean): Promise<SourceControlPullResult>;
     pushSourceControl(message: string, fileNames?: string[]): Promise<SourceControlPushResult>;

View File (n8n-api-client.d.ts.map)

@@ -1 +1 @@
[single-line sourcemap for n8n-api-client.d.ts regenerated; the old and new "mappings" strings are omitted here as unreadable generated data]

View File (n8n-api-client.js)

@@ -194,17 +194,9 @@ class N8nApiClient {
             throw (0, n8n_errors_1.handleN8nApiError)(error);
         }
     }
-    async transferWorkflow(id, destinationProjectId) {
-        try {
-            await this.client.put(`/workflows/${id}/transfer`, { destinationProjectId });
-        }
-        catch (error) {
-            throw (0, n8n_errors_1.handleN8nApiError)(error);
-        }
-    }
     async activateWorkflow(id) {
         try {
-            const response = await this.client.post(`/workflows/${id}/activate`, {});
+            const response = await this.client.post(`/workflows/${id}/activate`);
             return response.data;
         }
         catch (error) {
@@ -213,7 +205,7 @@ class N8nApiClient {
     }
     async deactivateWorkflow(id) {
         try {
-            const response = await this.client.post(`/workflows/${id}/deactivate`, {});
+            const response = await this.client.post(`/workflows/${id}/deactivate`);
             return response.data;
         }
         catch (error) {
@@ -373,15 +365,6 @@ class N8nApiClient {
             throw (0, n8n_errors_1.handleN8nApiError)(error);
         }
     }
-    async updateWorkflowTags(workflowId, tagIds) {
-        try {
-            const response = await this.client.put(`/workflows/${workflowId}/tags`, tagIds.filter(id => id).map(id => ({ id })));
-            return response.data;
-        }
-        catch (error) {
-            throw (0, n8n_errors_1.handleN8nApiError)(error);
-        }
-    }
     async getSourceControlStatus() {
         try {
             const response = await this.client.get('/source-control/status');

File diff suppressed because one or more lines are too long

View File (n8n-validation.d.ts)

@@ -144,227 +144,79 @@ export declare const workflowConnectionSchema: z.ZodRecord<z.ZodString, z.ZodObj
     node: string;
     index: number;
 }>, "many">, "many">>;
-}, "strip", z.ZodArray<z.ZodArray<z.ZodObject<{
-    node: z.ZodString;
-    type: z.ZodString;
-    index: z.ZodNumber;
-}, "strip", z.ZodTypeAny, {
-    type: string;
-    node: string;
-    index: number;
-}, {
-    type: string;
-    node: string;
-    index: number;
-}>, "many">, "many">, z.objectOutputType<{
-    main: z.ZodOptional<z.ZodArray<z.ZodArray<z.ZodObject<{
-        node: z.ZodString;
-        type: z.ZodString;
-        index: z.ZodNumber;
-    }, "strip", z.ZodTypeAny, {
-        type: string;
-        node: string;
-        index: number;
-    }, {
-        type: string;
-        node: string;
-        index: number;
-    }>, "many">, "many">>;
-    error: z.ZodOptional<z.ZodArray<z.ZodArray<z.ZodObject<{
-        node: z.ZodString;
-        type: z.ZodString;
-        index: z.ZodNumber;
-    }, "strip", z.ZodTypeAny, {
-        type: string;
-        node: string;
-        index: number;
-    }, {
-        type: string;
-        node: string;
-        index: number;
-    }>, "many">, "many">>;
-    ai_tool: z.ZodOptional<z.ZodArray<z.ZodArray<z.ZodObject<{
-        node: z.ZodString;
-        type: z.ZodString;
-        index: z.ZodNumber;
-    }, "strip", z.ZodTypeAny, {
-        type: string;
-        node: string;
-        index: number;
-    }, {
-        type: string;
-        node: string;
-        index: number;
-    }>, "many">, "many">>;
-    ai_languageModel: z.ZodOptional<z.ZodArray<z.ZodArray<z.ZodObject<{
-        node: z.ZodString;
-        type: z.ZodString;
-        index: z.ZodNumber;
-    }, "strip", z.ZodTypeAny, {
-        type: string;
-        node: string;
-        index: number;
-    }, {
-        type: string;
-        node: string;
-        index: number;
-    }>, "many">, "many">>;
-    ai_memory: z.ZodOptional<z.ZodArray<z.ZodArray<z.ZodObject<{
-        node: z.ZodString;
-        type: z.ZodString;
-        index: z.ZodNumber;
-    }, "strip", z.ZodTypeAny, {
-        type: string;
-        node: string;
-        index: number;
-    }, {
-        type: string;
-        node: string;
-        index: number;
-    }>, "many">, "many">>;
-    ai_embedding: z.ZodOptional<z.ZodArray<z.ZodArray<z.ZodObject<{
-        node: z.ZodString;
-        type: z.ZodString;
-        index: z.ZodNumber;
-    }, "strip", z.ZodTypeAny, {
-        type: string;
-        node: string;
-        index: number;
-    }, {
-        type: string;
-        node: string;
-        index: number;
-    }>, "many">, "many">>;
-    ai_vectorStore: z.ZodOptional<z.ZodArray<z.ZodArray<z.ZodObject<{
-        node: z.ZodString;
-        type: z.ZodString;
-        index: z.ZodNumber;
-    }, "strip", z.ZodTypeAny, {
-        type: string;
-        node: string;
-        index: number;
-    }, {
-        type: string;
-        node: string;
-        index: number;
-    }>, "many">, "many">>;
-}, z.ZodArray<z.ZodArray<z.ZodObject<{
-    node: z.ZodString;
-    type: z.ZodString;
-    index: z.ZodNumber;
-}, "strip", z.ZodTypeAny, {
-    type: string;
-    node: string;
-    index: number;
-}, {
-    type: string;
-    node: string;
-    index: number;
-}>, "many">, "many">, "strip">, z.objectInputType<{
-    main: z.ZodOptional<z.ZodArray<z.ZodArray<z.ZodObject<{
-        node: z.ZodString;
-        type: z.ZodString;
-        index: z.ZodNumber;
-    }, "strip", z.ZodTypeAny, {
-        type: string;
-        node: string;
-        index: number;
-    }, {
-        type: string;
-        node: string;
-        index: number;
-    }>, "many">, "many">>;
-    error: z.ZodOptional<z.ZodArray<z.ZodArray<z.ZodObject<{
-        node: z.ZodString;
-        type: z.ZodString;
-        index: z.ZodNumber;
-    }, "strip", z.ZodTypeAny, {
-        type: string;
-        node: string;
-        index: number;
-    }, {
-        type: string;
-        node: string;
-        index: number;
-    }>, "many">, "many">>;
-    ai_tool: z.ZodOptional<z.ZodArray<z.ZodArray<z.ZodObject<{
-        node: z.ZodString;
-        type: z.ZodString;
-        index: z.ZodNumber;
-    }, "strip", z.ZodTypeAny, {
-        type: string;
-        node: string;
-        index: number;
-    }, {
-        type: string;
-        node: string;
-        index: number;
-    }>, "many">, "many">>;
-    ai_languageModel: z.ZodOptional<z.ZodArray<z.ZodArray<z.ZodObject<{
-        node: z.ZodString;
-        type: z.ZodString;
-        index: z.ZodNumber;
-    }, "strip", z.ZodTypeAny, {
-        type: string;
-        node: string;
-        index: number;
-    }, {
-        type: string;
-        node: string;
-        index: number;
-    }>, "many">, "many">>;
-    ai_memory: z.ZodOptional<z.ZodArray<z.ZodArray<z.ZodObject<{
-        node: z.ZodString;
-        type: z.ZodString;
-        index: z.ZodNumber;
-    }, "strip", z.ZodTypeAny, {
-        type: string;
-        node: string;
-        index: number;
-    }, {
-        type: string;
-        node: string;
-        index: number;
-    }>, "many">, "many">>;
-    ai_embedding: z.ZodOptional<z.ZodArray<z.ZodArray<z.ZodObject<{
-        node: z.ZodString;
-        type: z.ZodString;
-        index: z.ZodNumber;
-    }, "strip", z.ZodTypeAny, {
-        type: string;
-        node: string;
-        index: number;
-    }, {
-        type: string;
-        node: string;
-        index: number;
-    }>, "many">, "many">>;
-    ai_vectorStore: z.ZodOptional<z.ZodArray<z.ZodArray<z.ZodObject<{
-        node: z.ZodString;
-        type: z.ZodString;
-        index: z.ZodNumber;
-    }, "strip", z.ZodTypeAny, {
-        type: string;
-        node: string;
-        index: number;
-    }, {
-        type: string;
-        node: string;
-        index: number;
-    }>, "many">, "many">>;
-}, z.ZodArray<z.ZodArray<z.ZodObject<{
-    node: z.ZodString;
-    type: z.ZodString;
-    index: z.ZodNumber;
-}, "strip", z.ZodTypeAny, {
-    type: string;
-    node: string;
-    index: number;
-}, {
-    type: string;
-    node: string;
-    index: number;
-}>, "many">, "many">, "strip">>>;
+}, "strip", z.ZodTypeAny, {
+    error?: {
+        type: string;
+        node: string;
+        index: number;
+    }[][] | undefined;
+    main?: {
+        type: string;
+        node: string;
+        index: number;
+    }[][] | undefined;
+    ai_tool?: {
+        type: string;
+        node: string;
+        index: number;
+    }[][] | undefined;
+    ai_languageModel?: {
+        type: string;
+        node: string;
+        index: number;
+    }[][] | undefined;
+    ai_memory?: {
+        type: string;
+        node: string;
+        index: number;
+    }[][] | undefined;
+    ai_embedding?: {
+        type: string;
+        node: string;
+        index: number;
+    }[][] | undefined;
+    ai_vectorStore?: {
+        type: string;
+        node: string;
+        index: number;
+    }[][] | undefined;
+}, {
+    error?: {
+        type: string;
+        node: string;
+        index: number;
+    }[][] | undefined;
+    main?: {
+        type: string;
+        node: string;
+        index: number;
+    }[][] | undefined;
+    ai_tool?: {
+        type: string;
+        node: string;
+        index: number;
+    }[][] | undefined;
+    ai_languageModel?: {
+        type: string;
+        node: string;
+        index: number;
+    }[][] | undefined;
+    ai_memory?: {
+        type: string;
+        node: string;
+        index: number;
+    }[][] | undefined;
+    ai_embedding?: {
+        type: string;
+        node: string;
+        index: number;
+    }[][] | undefined;
+    ai_vectorStore?: {
+        type: string;
+        node: string;
+        index: number;
+    }[][] | undefined;
+}>>;
 export declare const workflowSettingsSchema: z.ZodObject<{
     executionOrder: z.ZodDefault<z.ZodEnum<["v0", "v1"]>>;
     timezone: z.ZodOptional<z.ZodString>;

View File (n8n-validation.d.ts.map)

@@ -1 +1 @@
[single-line sourcemap for n8n-validation.d.ts regenerated; the old and new "mappings" strings are omitted here as unreadable generated data]

View File (n8n-validation.js)

@@ -47,7 +47,7 @@ exports.workflowConnectionSchema = zod_1.z.record(zod_1.z.object({
     ai_memory: connectionArraySchema.optional(),
     ai_embedding: connectionArraySchema.optional(),
     ai_vectorStore: connectionArraySchema.optional(),
-}).catchall(connectionArraySchema));
+}));
 exports.workflowSettingsSchema = zod_1.z.object({
     executionOrder: zod_1.z.enum(['v0', 'v1']).default('v1'),
     timezone: zod_1.z.string().optional(),
@@ -152,10 +152,11 @@ function validateWorkflowStructure(workflow) {
     }
     else if (connectionCount > 0 || executableNodes.length > 1) {
         const connectedNodes = new Set();
+        const ALL_CONNECTION_TYPES = ['main', 'error', 'ai_tool', 'ai_languageModel', 'ai_memory', 'ai_embedding', 'ai_vectorStore'];
         Object.entries(workflow.connections).forEach(([sourceName, connection]) => {
             connectedNodes.add(sourceName);
-            const connectionRecord = connection;
-            Object.values(connectionRecord).forEach((connData) => {
+            ALL_CONNECTION_TYPES.forEach(connType => {
+                const connData = connection[connType];
                 if (connData && Array.isArray(connData)) {
                     connData.forEach((outputs) => {
                         if (Array.isArray(outputs)) {
@@ -281,28 +282,23 @@ function validateWorkflowStructure(workflow) {
                 errors.push(`Connection references non-existent node: ${sourceName}`);
             }
         }
-        const connectionRecord = connection;
-        Object.values(connectionRecord).forEach((connData) => {
-            if (connData && Array.isArray(connData)) {
-                connData.forEach((outputs, outputIndex) => {
-                    if (Array.isArray(outputs)) {
-                        outputs.forEach((target, targetIndex) => {
-                            if (!target?.node)
-                                return;
-                            if (!nodeNames.has(target.node)) {
-                                if (nodeIds.has(target.node)) {
-                                    const correctName = nodeIdToName.get(target.node);
-                                    errors.push(`Connection target uses node ID '${target.node}' but must use node name '${correctName}' (from ${sourceName}[${outputIndex}][${targetIndex}])`);
-                                }
-                                else {
-                                    errors.push(`Connection references non-existent target node: ${target.node} (from ${sourceName}[${outputIndex}][${targetIndex}])`);
-                                }
-                            }
-                        });
-                    }
-                });
-            }
-        });
+        if (connection.main && Array.isArray(connection.main)) {
+            connection.main.forEach((outputs, outputIndex) => {
+                if (Array.isArray(outputs)) {
+                    outputs.forEach((target, targetIndex) => {
+                        if (!nodeNames.has(target.node)) {
+                            if (nodeIds.has(target.node)) {
+                                const correctName = nodeIdToName.get(target.node);
+                                errors.push(`Connection target uses node ID '${target.node}' but must use node name '${correctName}' (from ${sourceName}[${outputIndex}][${targetIndex}])`);
+                            }
+                            else {
+                                errors.push(`Connection references non-existent target node: ${target.node} (from ${sourceName}[${outputIndex}][${targetIndex}])`);
+                            }
+                        }
+                    });
+                }
+            });
+        }
     });
 }
 return errors;
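One of the hunks above trades `Object.values(connection)` iteration for an explicit `ALL_CONNECTION_TYPES` list. The practical difference is how connection types outside the known set are treated: the fixed list silently skips them, while value iteration visits every key. A simplified sketch of both traversals (toy data shapes, not the project's real workflow objects):

```javascript
const ALL_CONNECTION_TYPES = ['main', 'error', 'ai_tool', 'ai_languageModel', 'ai_memory', 'ai_embedding', 'ai_vectorStore'];

// Fixed-list traversal: only the known connection types are inspected.
function connectedViaList(connections) {
    const connected = new Set();
    for (const [source, byType] of Object.entries(connections)) {
        connected.add(source);
        for (const type of ALL_CONNECTION_TYPES) {
            (byType[type] || []).forEach(outputs => outputs.forEach(t => connected.add(t.node)));
        }
    }
    return connected;
}

// Value traversal: every key on the connection object is inspected.
function connectedViaValues(connections) {
    const connected = new Set();
    for (const [source, byType] of Object.entries(connections)) {
        connected.add(source);
        for (const outputsList of Object.values(byType)) {
            if (Array.isArray(outputsList)) {
                outputsList.forEach(outputs => outputs.forEach(t => connected.add(t.node)));
            }
        }
    }
    return connected;
}

// A non-standard connection type is only seen by the Object.values variant.
const connections = { A: { custom_port: [[{ node: 'B', type: 'custom_port', index: 0 }]] } };
console.log(connectedViaList(connections).has('B'));   // false
console.log(connectedViaValues(connections).has('B')); // true
```

So the explicit list buys type-safety and predictability at the cost of ignoring any connection type it does not enumerate.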

File diff suppressed because one or more lines are too long

View File (node-sanitizer.d.ts.map)

@@ -1 +1 @@
[single-line sourcemap for node-sanitizer.d.ts regenerated; the old and new "mappings" strings are omitted here as unreadable generated data]

View File (node-sanitizer.js)

@@ -17,7 +17,7 @@ function sanitizeWorkflowNodes(workflow) {
     }
     return {
         ...workflow,
-        nodes: workflow.nodes.map(sanitizeNode)
+        nodes: workflow.nodes.map((node) => sanitizeNode(node))
     };
 }
 function isFilterBasedNode(nodeType, typeVersion) {
@@ -66,7 +66,7 @@ function sanitizeFilterConditions(conditions) {
         ...sanitized.options
     };
     if (sanitized.conditions && Array.isArray(sanitized.conditions)) {
-        sanitized.conditions = sanitized.conditions.map(sanitizeCondition);
+        sanitized.conditions = sanitized.conditions.map((condition) => sanitizeCondition(condition));
     }
     return sanitized;
 }
@@ -124,10 +124,6 @@ function inferDataType(operation) {
     if (dateOps.some(op => operation.includes(op))) {
         return 'dateTime';
     }
-    const objectOps = ['empty', 'notEmpty', 'exists', 'notExists'];
-    if (objectOps.includes(operation)) {
-        return 'object';
-    }
     return 'string';
 }
 function isUnaryOperator(operation) {
@@ -136,11 +132,7 @@ function isUnaryOperator(operation) {
         'isNotEmpty',
         'true',
         'false',
-        'isNumeric',
-        'empty',
-        'notEmpty',
-        'exists',
-        'notExists'
+        'isNumeric'
     ];
     return unaryOps.includes(operation);
 }

File diff suppressed because one or more lines are too long

View File (node-specific-validators.d.ts.map)

@@ -1 +1 @@
[single-line sourcemap for node-specific-validators.d.ts regenerated; the old and new "mappings" strings are omitted here as unreadable generated data]

View File (node-specific-validators.js)

@@ -1067,8 +1067,7 @@ class NodeSpecificValidators {
                 fix: 'Wrap in array: return [{json: yourObject}]'
             });
         }
-        const hasHelperFunctions = /(?:function\s+\w+\s*\(|(?:const|let|var)\s+\w+\s*=\s*(?:async\s+)?(?:function|\([^)]*\)\s*=>|\w+\s*=>))/.test(code);
-        if (!hasHelperFunctions && /return\s+(true|false|null|undefined|\d+|['"`])/m.test(code)) {
+        if (/return\s+(true|false|null|undefined|\d+|['"`])/m.test(code)) {
             errors.push({
                 type: 'invalid_value',
                 property: 'jsCode',
@@ -1150,7 +1149,7 @@ class NodeSpecificValidators {
             }
         });
         if (language === 'javaScript') {
-            if (/\$(?![a-zA-Z_(])/.test(code) && !code.includes('${')) {
+            if (/\$(?![a-zA-Z])/.test(code) && !code.includes('${')) {
                 warnings.push({
                     type: 'best_practice',
                     message: 'Invalid $ usage detected',
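The `$`-usage hunk changes only the negative lookahead after the dollar sign: one version also tolerates `_` and `(` after `$` (so expressions like `$("My Node")` and `$_private` pass silently), while the other warns on anything but a letter. A standalone comparison of the two patterns, outside the validator:

```javascript
// Tolerant variant: $ may be followed by a letter, underscore, or "(".
const tolerant = /\$(?![a-zA-Z_(])/;
// Strict variant: $ must be followed by a letter.
const strict = /\$(?![a-zA-Z])/;

for (const sample of ['$json.field', '$("My Node")', '$_private', '$1']) {
    console.log(sample, tolerant.test(sample), strict.test(sample));
}
// $json.field  -> false false  (neither pattern warns)
// $("My Node") -> false true   (only the strict pattern warns)
// $_private    -> false true   (only the strict pattern warns)
// $1           -> true  true   (both warn)
```

The tolerant form accommodates n8n's `$("Node Name")` node-reference syntax; the strict form treats it as suspicious `$` usage.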

File diff suppressed because one or more lines are too long

View File (workflow-auto-fixer.d.ts.map)

@@ -1 +1 @@
[single-line sourcemap for workflow-auto-fixer.d.ts regenerated; the old and new "mappings" strings are omitted here as unreadable generated data]

@@ -405,7 +405,7 @@ class WorkflowAutoFixer {
const hasConnectionFixes = filteredFixes.some(f => exports.CONNECTION_FIX_TYPES.includes(f.type)); const hasConnectionFixes = filteredFixes.some(f => exports.CONNECTION_FIX_TYPES.includes(f.type));
return operations.filter(op => { return operations.filter(op => {
if (op.type === 'updateNode') { if (op.type === 'updateNode') {
return fixedNodes.has(op.nodeName || '') || fixedNodes.has(op.nodeId || ''); return fixedNodes.has(op.nodeId || '');
} }
if (op.type === 'replaceConnections') { if (op.type === 'replaceConnections') {
return hasConnectionFixes; return hasConnectionFixes;
@@ -794,7 +794,6 @@ class WorkflowAutoFixer {
const operation = { const operation = {
type: 'updateNode', type: 'updateNode',
nodeId: node.id, nodeId: node.id,
nodeName: node.name,
updates: { updates: {
typeVersion: parseFloat(latestVersion), typeVersion: parseFloat(latestVersion),
parameters: migrationResult.updatedNode.parameters, parameters: migrationResult.updatedNode.parameters,
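The hunk above narrows the operation filter so `updateNode` operations are matched against fixed nodes by node ID only (the `nodeName` fallback and the `nodeName` field on the emitted operation are removed). A minimal sketch of the new filter, using illustrative stand-in data rather than the project's real types:

```javascript
// Sketch of the post-change filter: updateNode ops survive only when their
// nodeId is in the set of fixed nodes; replaceConnections survives only when
// connection fixes exist. `operations` / `fixedNodes` are illustrative.
function filterOperations(operations, fixedNodes, hasConnectionFixes) {
  return operations.filter(op => {
    if (op.type === 'updateNode') {
      // Before this change, a nodeName match was also accepted here.
      return fixedNodes.has(op.nodeId || '');
    }
    if (op.type === 'replaceConnections') {
      return hasConnectionFixes;
    }
    return true;
  });
}

const fixed = new Set(['node-1']);
const ops = [
  { type: 'updateNode', nodeId: 'node-1' },
  { type: 'updateNode', nodeId: 'node-2' },
  { type: 'replaceConnections' },
];
console.log(filterOperations(ops, fixed, false).length); // 1
```

Only the operation targeting `node-1` survives; the second `updateNode` has no matching ID and `replaceConnections` is dropped because no connection fixes exist.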

File diff suppressed because one or more lines are too long

@@ -3,11 +3,6 @@ import { Workflow } from '../types/n8n-api';
export declare class WorkflowDiffEngine { export declare class WorkflowDiffEngine {
private renameMap; private renameMap;
private warnings; private warnings;
private modifiedNodeIds;
private removedNodeNames;
private tagsToAdd;
private tagsToRemove;
private transferToProjectId;
applyDiff(workflow: Workflow, request: WorkflowDiffRequest): Promise<WorkflowDiffResult>; applyDiff(workflow: Workflow, request: WorkflowDiffRequest): Promise<WorkflowDiffResult>;
private validateOperation; private validateOperation;
private applyOperation; private applyOperation;
@@ -37,8 +32,6 @@ export declare class WorkflowDiffEngine {
private validateDeactivateWorkflow; private validateDeactivateWorkflow;
private applyActivateWorkflow; private applyActivateWorkflow;
private applyDeactivateWorkflow; private applyDeactivateWorkflow;
private validateTransferWorkflow;
private applyTransferWorkflow;
private validateCleanStaleConnections; private validateCleanStaleConnections;
private validateReplaceConnections; private validateReplaceConnections;
private applyCleanStaleConnections; private applyCleanStaleConnections;

@@ -1 +1 @@
{"version":3,"file":"workflow-diff-engine.d.ts","sourceRoot":"","sources":["../../src/services/workflow-diff-engine.ts"],"names":[],"mappings":"AAMA,OAAO,EAEL,mBAAmB,EACnB,kBAAkB,EAuBnB,MAAM,wBAAwB,CAAC;AAChC,OAAO,EAAE,QAAQ,EAAoC,MAAM,kBAAkB,CAAC;AAY9E,qBAAa,kBAAkB;IAE7B,OAAO,CAAC,SAAS,CAAkC;IAEnD,OAAO,CAAC,QAAQ,CAAqC;IAErD,OAAO,CAAC,eAAe,CAAqB;IAE5C,OAAO,CAAC,gBAAgB,CAAqB;IAE7C,OAAO,CAAC,SAAS,CAAgB;IACjC,OAAO,CAAC,YAAY,CAAgB;IAEpC,OAAO,CAAC,mBAAmB,CAAqB;IAK1C,SAAS,CACb,QAAQ,EAAE,QAAQ,EAClB,OAAO,EAAE,mBAAmB,GAC3B,OAAO,CAAC,kBAAkB,CAAC;IAgO9B,OAAO,CAAC,iBAAiB;IA0CzB,OAAO,CAAC,cAAc;IA4DtB,OAAO,CAAC,eAAe;IAwBvB,OAAO,CAAC,kBAAkB;IAuB1B,OAAO,CAAC,kBAAkB;IAoC1B,OAAO,CAAC,gBAAgB;IAQxB,OAAO,CAAC,kBAAkB;IAU1B,OAAO,CAAC,qBAAqB;IAkD7B,OAAO,CAAC,wBAAwB;IA6ChC,OAAO,CAAC,wBAAwB;IAmDhC,OAAO,CAAC,YAAY;IA4BpB,OAAO,CAAC,eAAe;IAwCvB,OAAO,CAAC,eAAe;IA0BvB,OAAO,CAAC,aAAa;IAOrB,OAAO,CAAC,eAAe;IAOvB,OAAO,CAAC,gBAAgB;IAWxB,OAAO,CAAC,sBAAsB;IAwD9B,OAAO,CAAC,kBAAkB;IA6C1B,OAAO,CAAC,qBAAqB;IAuC7B,OAAO,CAAC,qBAAqB;IA0B7B,OAAO,CAAC,mBAAmB;IAW3B,OAAO,CAAC,eAAe;IAIvB,OAAO,CAAC,WAAW;IAYnB,OAAO,CAAC,cAAc;IAatB,OAAO,CAAC,wBAAwB;IAchC,OAAO,CAAC,0BAA0B;IAMlC,OAAO,CAAC,qBAAqB;IAM7B,OAAO,CAAC,uBAAuB;IAO/B,OAAO,CAAC,wBAAwB;IAOhC,OAAO,CAAC,qBAAqB;IAK7B,OAAO,CAAC,6BAA6B;IAKrC,OAAO,CAAC,0BAA0B;IA0BlC,OAAO,CAAC,0BAA0B;IA+ElC,OAAO,CAAC,uBAAuB;IAe/B,OAAO,CAAC,0BAA0B;IAmElC,OAAO,CAAC,iBAAiB;IAkBzB,OAAO,CAAC,QAAQ;IAsChB,OAAO,CAAC,uBAAuB;IAW/B,OAAO,CAAC,iBAAiB;CAoB1B"} 
{"version":3,"file":"workflow-diff-engine.d.ts","sourceRoot":"","sources":["../../src/services/workflow-diff-engine.ts"],"names":[],"mappings":"AAMA,OAAO,EAEL,mBAAmB,EACnB,kBAAkB,EAsBnB,MAAM,wBAAwB,CAAC;AAChC,OAAO,EAAE,QAAQ,EAAoC,MAAM,kBAAkB,CAAC;AAQ9E,qBAAa,kBAAkB;IAE7B,OAAO,CAAC,SAAS,CAAkC;IAEnD,OAAO,CAAC,QAAQ,CAAqC;IAK/C,SAAS,CACb,QAAQ,EAAE,QAAQ,EAClB,OAAO,EAAE,mBAAmB,GAC3B,OAAO,CAAC,kBAAkB,CAAC;IA0M9B,OAAO,CAAC,iBAAiB;IAwCzB,OAAO,CAAC,cAAc;IAyDtB,OAAO,CAAC,eAAe;IAwBvB,OAAO,CAAC,kBAAkB;IAuB1B,OAAO,CAAC,kBAAkB;IAoC1B,OAAO,CAAC,gBAAgB;IAQxB,OAAO,CAAC,kBAAkB;IAU1B,OAAO,CAAC,qBAAqB;IAkD7B,OAAO,CAAC,wBAAwB;IAuChC,OAAO,CAAC,wBAAwB;IAmDhC,OAAO,CAAC,YAAY;IA2BpB,OAAO,CAAC,eAAe;IAkCvB,OAAO,CAAC,eAAe;IAwBvB,OAAO,CAAC,aAAa;IAOrB,OAAO,CAAC,eAAe;IAOvB,OAAO,CAAC,gBAAgB;IAWxB,OAAO,CAAC,sBAAsB;IAgD9B,OAAO,CAAC,kBAAkB;IA4C1B,OAAO,CAAC,qBAAqB;IA2C7B,OAAO,CAAC,qBAAqB;IA0B7B,OAAO,CAAC,mBAAmB;IAW3B,OAAO,CAAC,eAAe;IAIvB,OAAO,CAAC,WAAW;IASnB,OAAO,CAAC,cAAc;IAUtB,OAAO,CAAC,wBAAwB;IAchC,OAAO,CAAC,0BAA0B;IAMlC,OAAO,CAAC,qBAAqB;IAM7B,OAAO,CAAC,uBAAuB;IAO/B,OAAO,CAAC,6BAA6B;IAKrC,OAAO,CAAC,0BAA0B;IA0BlC,OAAO,CAAC,0BAA0B;IA0ElC,OAAO,CAAC,uBAAuB;IAe/B,OAAO,CAAC,0BAA0B;IAkElC,OAAO,CAAC,iBAAiB;IAkBzB,OAAO,CAAC,QAAQ;IAsChB,OAAO,CAAC,uBAAuB;IAW/B,OAAO,CAAC,iBAAiB;CAc1B"}

@@ -10,20 +10,11 @@ class WorkflowDiffEngine {
constructor() { constructor() {
this.renameMap = new Map(); this.renameMap = new Map();
this.warnings = []; this.warnings = [];
this.modifiedNodeIds = new Set();
this.removedNodeNames = new Set();
this.tagsToAdd = [];
this.tagsToRemove = [];
} }
async applyDiff(workflow, request) { async applyDiff(workflow, request) {
try { try {
this.renameMap.clear(); this.renameMap.clear();
this.warnings = []; this.warnings = [];
this.modifiedNodeIds.clear();
this.removedNodeNames.clear();
this.tagsToAdd = [];
this.tagsToRemove = [];
this.transferToProjectId = undefined;
const workflowCopy = JSON.parse(JSON.stringify(workflow)); const workflowCopy = JSON.parse(JSON.stringify(workflow));
const nodeOperationTypes = ['addNode', 'removeNode', 'updateNode', 'moveNode', 'enableNode', 'disableNode']; const nodeOperationTypes = ['addNode', 'removeNode', 'updateNode', 'moveNode', 'enableNode', 'disableNode'];
const nodeOperations = []; const nodeOperations = [];
@@ -82,10 +73,6 @@ class WorkflowDiffEngine {
failed: failedIndices failed: failedIndices
}; };
} }
const shouldActivate = workflowCopy._shouldActivate === true;
const shouldDeactivate = workflowCopy._shouldDeactivate === true;
delete workflowCopy._shouldActivate;
delete workflowCopy._shouldDeactivate;
const success = appliedIndices.length > 0; const success = appliedIndices.length > 0;
return { return {
success, success,
@@ -95,12 +82,7 @@ class WorkflowDiffEngine {
errors: errors.length > 0 ? errors : undefined, errors: errors.length > 0 ? errors : undefined,
warnings: this.warnings.length > 0 ? this.warnings : undefined, warnings: this.warnings.length > 0 ? this.warnings : undefined,
applied: appliedIndices, applied: appliedIndices,
failed: failedIndices, failed: failedIndices
shouldActivate: shouldActivate || undefined,
shouldDeactivate: shouldDeactivate || undefined,
tagsToAdd: this.tagsToAdd.length > 0 ? this.tagsToAdd : undefined,
tagsToRemove: this.tagsToRemove.length > 0 ? this.tagsToRemove : undefined,
transferToProjectId: this.transferToProjectId || undefined
}; };
} }
else { else {
@@ -160,15 +142,8 @@ class WorkflowDiffEngine {
}; };
} }
} }
if (this.modifiedNodeIds.size > 0) { workflowCopy.nodes = workflowCopy.nodes.map((node) => (0, node_sanitizer_1.sanitizeNode)(node));
workflowCopy.nodes = workflowCopy.nodes.map((node) => { logger.debug('Applied full-workflow sanitization to all nodes');
if (this.modifiedNodeIds.has(node.id)) {
return (0, node_sanitizer_1.sanitizeNode)(node);
}
return node;
});
logger.debug(`Sanitized ${this.modifiedNodeIds.size} modified nodes`);
}
if (request.validateOnly) { if (request.validateOnly) {
return { return {
success: true, success: true,
@@ -187,10 +162,7 @@ class WorkflowDiffEngine {
message: `Successfully applied ${operationsApplied} operations (${nodeOperations.length} node ops, ${otherOperations.length} other ops)`, message: `Successfully applied ${operationsApplied} operations (${nodeOperations.length} node ops, ${otherOperations.length} other ops)`,
warnings: this.warnings.length > 0 ? this.warnings : undefined, warnings: this.warnings.length > 0 ? this.warnings : undefined,
shouldActivate: shouldActivate || undefined, shouldActivate: shouldActivate || undefined,
shouldDeactivate: shouldDeactivate || undefined, shouldDeactivate: shouldDeactivate || undefined
tagsToAdd: this.tagsToAdd.length > 0 ? this.tagsToAdd : undefined,
tagsToRemove: this.tagsToRemove.length > 0 ? this.tagsToRemove : undefined,
transferToProjectId: this.transferToProjectId || undefined
}; };
} }
} }
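The hunk above replaces selective sanitization (only nodes tracked in `modifiedNodeIds`) with a single pass over every node in the workflow copy. A hedged sketch of the new behavior, with a stand-in `sanitizeNode` since the real `node_sanitizer_1.sanitizeNode` implementation is not shown in this diff:

```javascript
// Stand-in sanitizer: the real sanitizeNode is imported from node-sanitizer
// and is not visible in this diff; stripping credentials is illustrative only.
const sanitizeNode = (node) => ({ ...node, credentials: undefined });

// New behavior: sanitize every node unconditionally, instead of only the
// nodes whose IDs were recorded as modified during the diff run.
function sanitizeAllNodes(workflowCopy) {
  workflowCopy.nodes = workflowCopy.nodes.map((node) => sanitizeNode(node));
  return workflowCopy;
}

const wf = {
  nodes: [{ id: 'a', credentials: { token: 'x' } }, { id: 'b' }],
};
console.log(sanitizeAllNodes(wf).nodes.every(n => n.credentials === undefined)); // true
```

This trades a little extra work per diff for not having to maintain the modified-ID bookkeeping that the left-hand side carried.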
@@ -229,8 +201,6 @@ class WorkflowDiffEngine {
case 'addTag': case 'addTag':
case 'removeTag': case 'removeTag':
return null; return null;
case 'transferWorkflow':
return this.validateTransferWorkflow(workflow, operation);
case 'activateWorkflow': case 'activateWorkflow':
return this.validateActivateWorkflow(workflow, operation); return this.validateActivateWorkflow(workflow, operation);
case 'deactivateWorkflow': case 'deactivateWorkflow':
@@ -296,9 +266,6 @@ class WorkflowDiffEngine {
case 'replaceConnections': case 'replaceConnections':
this.applyReplaceConnections(workflow, operation); this.applyReplaceConnections(workflow, operation);
break; break;
case 'transferWorkflow':
this.applyTransferWorkflow(workflow, operation);
break;
} }
} }
validateAddNode(workflow, operation) { validateAddNode(workflow, operation) {
@@ -335,7 +302,7 @@ class WorkflowDiffEngine {
return `Invalid parameter 'changes'. The updateNode operation requires 'updates' (not 'changes'). Example: {type: "updateNode", nodeId: "abc", updates: {name: "New Name", "parameters.url": "https://example.com"}}`; return `Invalid parameter 'changes'. The updateNode operation requires 'updates' (not 'changes'). Example: {type: "updateNode", nodeId: "abc", updates: {name: "New Name", "parameters.url": "https://example.com"}}`;
} }
if (!operation.updates) { if (!operation.updates) {
return `Missing required parameter 'updates'. The updateNode operation requires an 'updates' object. Correct structure: {type: "updateNode", nodeId: "abc-123" OR nodeName: "My Node", updates: {name: "New Name", "parameters.url": "https://example.com"}}`; return `Missing required parameter 'updates'. The updateNode operation requires an 'updates' object containing properties to modify. Example: {type: "updateNode", nodeId: "abc", updates: {name: "New Name"}}`;
} }
const node = this.findNode(workflow, operation.nodeId, operation.nodeName); const node = this.findNode(workflow, operation.nodeId, operation.nodeName);
if (!node) { if (!node) {
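The validation message above spells out the required shape of an `updateNode` operation: an `updates` object (not `changes`) keyed by property paths. A small illustrative object matching that shape, with hypothetical ID and values:

```javascript
// Illustrative updateNode operation matching the validator's expected shape.
// The nodeId and parameter values are hypothetical examples.
const op = {
  type: 'updateNode',
  nodeId: 'abc-123',
  updates: { name: 'New Name', 'parameters.url': 'https://example.com' },
};

// The validator rejects 'changes' and requires 'updates'.
console.log('updates' in op && !('changes' in op)); // true
```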
@@ -415,18 +382,12 @@ class WorkflowDiffEngine {
const sourceNode = this.findNode(workflow, operation.source, operation.source); const sourceNode = this.findNode(workflow, operation.source, operation.source);
const targetNode = this.findNode(workflow, operation.target, operation.target); const targetNode = this.findNode(workflow, operation.target, operation.target);
if (!sourceNode) { if (!sourceNode) {
if (this.removedNodeNames.has(operation.source)) {
return `Source node "${operation.source}" was already removed by a prior removeNode operation. Its connections were automatically cleaned up — no separate removeConnection needed.`;
}
const availableNodes = workflow.nodes const availableNodes = workflow.nodes
.map(n => `"${n.name}" (id: ${n.id.substring(0, 8)}...)`) .map(n => `"${n.name}" (id: ${n.id.substring(0, 8)}...)`)
.join(', '); .join(', ');
return `Source node not found: "${operation.source}". Available nodes: ${availableNodes}. Tip: Use node ID for names with special characters.`; return `Source node not found: "${operation.source}". Available nodes: ${availableNodes}. Tip: Use node ID for names with special characters.`;
} }
if (!targetNode) { if (!targetNode) {
if (this.removedNodeNames.has(operation.target)) {
return `Target node "${operation.target}" was already removed by a prior removeNode operation. Its connections were automatically cleaned up — no separate removeConnection needed.`;
}
const availableNodes = workflow.nodes const availableNodes = workflow.nodes
.map(n => `"${n.name}" (id: ${n.id.substring(0, 8)}...)`) .map(n => `"${n.name}" (id: ${n.id.substring(0, 8)}...)`)
.join(', '); .join(', ');
@@ -500,40 +461,34 @@ class WorkflowDiffEngine {
executeOnce: operation.node.executeOnce executeOnce: operation.node.executeOnce
}; };
const sanitizedNode = (0, node_sanitizer_1.sanitizeNode)(newNode); const sanitizedNode = (0, node_sanitizer_1.sanitizeNode)(newNode);
this.modifiedNodeIds.add(sanitizedNode.id);
workflow.nodes.push(sanitizedNode); workflow.nodes.push(sanitizedNode);
} }
applyRemoveNode(workflow, operation) { applyRemoveNode(workflow, operation) {
const node = this.findNode(workflow, operation.nodeId, operation.nodeName); const node = this.findNode(workflow, operation.nodeId, operation.nodeName);
if (!node) if (!node)
return; return;
this.removedNodeNames.add(node.name);
const index = workflow.nodes.findIndex(n => n.id === node.id); const index = workflow.nodes.findIndex(n => n.id === node.id);
if (index !== -1) { if (index !== -1) {
workflow.nodes.splice(index, 1); workflow.nodes.splice(index, 1);
} }
delete workflow.connections[node.name]; delete workflow.connections[node.name];
for (const [sourceName, sourceConnections] of Object.entries(workflow.connections)) { Object.keys(workflow.connections).forEach(sourceName => {
for (const [outputName, outputConns] of Object.entries(sourceConnections)) { const sourceConnections = workflow.connections[sourceName];
sourceConnections[outputName] = outputConns.map(connections => connections.filter(conn => conn.node !== node.name)); Object.keys(sourceConnections).forEach(outputName => {
const trimmed = sourceConnections[outputName]; sourceConnections[outputName] = sourceConnections[outputName].map(connections => connections.filter(conn => conn.node !== node.name)).filter(connections => connections.length > 0);
while (trimmed.length > 0 && trimmed[trimmed.length - 1].length === 0) { if (sourceConnections[outputName].length === 0) {
trimmed.pop();
}
if (trimmed.length === 0) {
delete sourceConnections[outputName]; delete sourceConnections[outputName];
} }
} });
if (Object.keys(sourceConnections).length === 0) { if (Object.keys(sourceConnections).length === 0) {
delete workflow.connections[sourceName]; delete workflow.connections[sourceName];
} }
} });
} }
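The `applyRemoveNode` hunk above shows the new connection cleanup: after deleting the removed node's own outgoing entry, every remaining connection list is filtered to drop references to it, and empty output and source entries are pruned (the right-hand side drops empty inner arrays entirely, where the left-hand side only trimmed trailing ones). A self-contained sketch of the new-side logic:

```javascript
// Sketch of the right-hand-side cleanup in applyRemoveNode: prune every
// reference to the removed node, then delete outputs and sources left empty.
function pruneConnections(connections, removedName) {
  delete connections[removedName];
  Object.keys(connections).forEach(sourceName => {
    const sourceConnections = connections[sourceName];
    Object.keys(sourceConnections).forEach(outputName => {
      sourceConnections[outputName] = sourceConnections[outputName]
        .map(list => list.filter(conn => conn.node !== removedName))
        .filter(list => list.length > 0);
      if (sourceConnections[outputName].length === 0) {
        delete sourceConnections[outputName];
      }
    });
    if (Object.keys(sourceConnections).length === 0) {
      delete connections[sourceName];
    }
  });
  return connections;
}

const conns = {
  A: { main: [[{ node: 'B' }, { node: 'C' }]] },
  B: { main: [[{ node: 'C' }]] },
};
console.log(JSON.stringify(pruneConnections(conns, 'C')));
// {"A":{"main":[[{"node":"B"}]]}}
```

Removing `C` leaves `A → B` intact, while `B` loses its only connection and is pruned from the map entirely.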
applyUpdateNode(workflow, operation) { applyUpdateNode(workflow, operation) {
const node = this.findNode(workflow, operation.nodeId, operation.nodeName); const node = this.findNode(workflow, operation.nodeId, operation.nodeName);
if (!node) if (!node)
return; return;
this.modifiedNodeIds.add(node.id);
if (operation.updates.name && operation.updates.name !== node.name) { if (operation.updates.name && operation.updates.name !== node.name) {
const oldName = node.name; const oldName = node.name;
const newName = operation.updates.name; const newName = operation.updates.name;
@@ -566,13 +521,8 @@ class WorkflowDiffEngine {
} }
resolveSmartParameters(workflow, operation) { resolveSmartParameters(workflow, operation) {
const sourceNode = this.findNode(workflow, operation.source, operation.source); const sourceNode = this.findNode(workflow, operation.source, operation.source);
let sourceOutput = String(operation.sourceOutput ?? 'main'); let sourceOutput = operation.sourceOutput ?? 'main';
let sourceIndex = operation.sourceIndex ?? 0; let sourceIndex = operation.sourceIndex ?? 0;
if (/^\d+$/.test(sourceOutput) && operation.sourceIndex === undefined
&& operation.branch === undefined && operation.case === undefined) {
sourceIndex = parseInt(sourceOutput, 10);
sourceOutput = 'main';
}
if (operation.branch !== undefined && operation.sourceIndex === undefined) { if (operation.branch !== undefined && operation.sourceIndex === undefined) {
if (sourceNode?.type === 'n8n-nodes-base.if') { if (sourceNode?.type === 'n8n-nodes-base.if') {
sourceIndex = operation.branch === 'true' ? 0 : 1; sourceIndex = operation.branch === 'true' ? 0 : 1;
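Both sides of the hunk above keep the `branch`-to-index mapping for IF nodes: when no explicit `sourceIndex` is given, branch `"true"` resolves to output 0 and `"false"` to output 1. A minimal sketch of just that resolution step, with simplified parameters:

```javascript
// Sketch of the branch → sourceIndex resolution retained in both versions.
// Parameters are flattened for illustration; the real method reads them from
// the operation object and the resolved source node.
function resolveBranchIndex(sourceNodeType, branch, explicitIndex) {
  let sourceIndex = explicitIndex ?? 0;
  if (branch !== undefined && explicitIndex === undefined) {
    if (sourceNodeType === 'n8n-nodes-base.if') {
      sourceIndex = branch === 'true' ? 0 : 1;
    }
  }
  return sourceIndex;
}

console.log(resolveBranchIndex('n8n-nodes-base.if', 'false', undefined)); // 1
```

An explicit `sourceIndex` always wins over `branch`, matching the `operation.sourceIndex === undefined` guard in the hunk.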
@@ -606,7 +556,7 @@ class WorkflowDiffEngine {
if (!sourceNode || !targetNode) if (!sourceNode || !targetNode)
return; return;
const { sourceOutput, sourceIndex } = this.resolveSmartParameters(workflow, operation); const { sourceOutput, sourceIndex } = this.resolveSmartParameters(workflow, operation);
const targetInput = String(operation.targetInput ?? sourceOutput); const targetInput = operation.targetInput ?? sourceOutput;
const targetIndex = operation.targetIndex ?? 0; const targetIndex = operation.targetIndex ?? 0;
if (!workflow.connections[sourceNode.name]) { if (!workflow.connections[sourceNode.name]) {
workflow.connections[sourceNode.name] = {}; workflow.connections[sourceNode.name] = {};
@@ -631,9 +581,12 @@ class WorkflowDiffEngine {
const sourceNode = this.findNode(workflow, operation.source, operation.source); const sourceNode = this.findNode(workflow, operation.source, operation.source);
const targetNode = this.findNode(workflow, operation.target, operation.target); const targetNode = this.findNode(workflow, operation.target, operation.target);
if (!sourceNode || !targetNode) { if (!sourceNode || !targetNode) {
if (operation.ignoreErrors) {
return;
}
return; return;
} }
const sourceOutput = String(operation.sourceOutput ?? 'main'); const sourceOutput = operation.sourceOutput || 'main';
const connections = workflow.connections[sourceNode.name]?.[sourceOutput]; const connections = workflow.connections[sourceNode.name]?.[sourceOutput];
if (!connections) if (!connections)
return; return;
@@ -680,21 +633,19 @@ class WorkflowDiffEngine {
workflow.name = operation.name; workflow.name = operation.name;
} }
applyAddTag(workflow, operation) { applyAddTag(workflow, operation) {
const removeIdx = this.tagsToRemove.indexOf(operation.tag); if (!workflow.tags) {
if (removeIdx !== -1) { workflow.tags = [];
this.tagsToRemove.splice(removeIdx, 1);
} }
if (!this.tagsToAdd.includes(operation.tag)) { if (!workflow.tags.includes(operation.tag)) {
this.tagsToAdd.push(operation.tag); workflow.tags.push(operation.tag);
} }
} }
applyRemoveTag(workflow, operation) { applyRemoveTag(workflow, operation) {
const addIdx = this.tagsToAdd.indexOf(operation.tag); if (!workflow.tags)
if (addIdx !== -1) { return;
this.tagsToAdd.splice(addIdx, 1); const index = workflow.tags.indexOf(operation.tag);
} if (index !== -1) {
if (!this.tagsToRemove.includes(operation.tag)) { workflow.tags.splice(index, 1);
this.tagsToRemove.push(operation.tag);
} }
} }
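The tag hunk above moves from deferred bookkeeping (`tagsToAdd`/`tagsToRemove` arrays reconciled later) to direct mutation of `workflow.tags`. A sketch of the right-hand-side behavior:

```javascript
// Sketch of the direct-mutation tag handling on the right-hand side:
// addTag appends if absent (deduplicated), removeTag splices the tag out.
function applyAddTag(workflow, tag) {
  if (!workflow.tags) {
    workflow.tags = [];
  }
  if (!workflow.tags.includes(tag)) {
    workflow.tags.push(tag);
  }
}

function applyRemoveTag(workflow, tag) {
  if (!workflow.tags) return;
  const index = workflow.tags.indexOf(tag);
  if (index !== -1) {
    workflow.tags.splice(index, 1);
  }
}

const wf2 = {};
applyAddTag(wf2, 'prod');
applyAddTag(wf2, 'prod'); // second add is a no-op
applyRemoveTag(wf2, 'prod');
console.log(JSON.stringify(wf2.tags)); // []
```

The left-hand side instead cancelled a pending remove when the same tag was re-added, which the direct-mutation form gets for free by operating on one array.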
validateActivateWorkflow(workflow, operation) { validateActivateWorkflow(workflow, operation) {
@@ -713,15 +664,6 @@ class WorkflowDiffEngine {
applyDeactivateWorkflow(workflow, operation) { applyDeactivateWorkflow(workflow, operation) {
workflow._shouldDeactivate = true; workflow._shouldDeactivate = true;
} }
validateTransferWorkflow(_workflow, operation) {
if (!operation.destinationProjectId) {
return 'transferWorkflow requires a non-empty destinationProjectId string';
}
return null;
}
applyTransferWorkflow(_workflow, operation) {
this.transferToProjectId = operation.destinationProjectId;
}
validateCleanStaleConnections(workflow, operation) { validateCleanStaleConnections(workflow, operation) {
return null; return null;
} }
@@ -792,10 +734,7 @@ class WorkflowDiffEngine {
return false; return false;
} }
return true; return true;
})); })).filter(conns => conns.length > 0);
while (filteredConnections.length > 0 && filteredConnections[filteredConnections.length - 1].length === 0) {
filteredConnections.pop();
}
if (filteredConnections.length === 0) { if (filteredConnections.length === 0) {
delete outputs[outputName]; delete outputs[outputName];
} }
@@ -829,10 +768,9 @@ class WorkflowDiffEngine {
for (let connIndex = 0; connIndex < connectionsAtIndex.length; connIndex++) { for (let connIndex = 0; connIndex < connectionsAtIndex.length; connIndex++) {
const connection = connectionsAtIndex[connIndex]; const connection = connectionsAtIndex[connIndex];
if (renames.has(connection.node)) { if (renames.has(connection.node)) {
const oldTargetName = connection.node;
const newTargetName = renames.get(connection.node); const newTargetName = renames.get(connection.node);
connection.node = newTargetName; connection.node = newTargetName;
logger.debug(`Updated connection: ${sourceName}[${outputType}][${outputIndex}][${connIndex}].node: "${oldTargetName}" → "${newTargetName}"`); logger.debug(`Updated connection: ${sourceName}[${outputType}][${outputIndex}][${connIndex}].node: "${connection.node}" → "${newTargetName}"`);
} }
} }
} }
@@ -880,20 +818,12 @@ class WorkflowDiffEngine {
let current = obj; let current = obj;
for (let i = 0; i < keys.length - 1; i++) { for (let i = 0; i < keys.length - 1; i++) {
const key = keys[i]; const key = keys[i];
if (!(key in current) || typeof current[key] !== 'object' || current[key] === null) { if (!(key in current) || typeof current[key] !== 'object') {
if (value === null)
return;
current[key] = {}; current[key] = {};
} }
current = current[key]; current = current[key];
} }
const finalKey = keys[keys.length - 1]; current[keys[keys.length - 1]] = value;
if (value === null) {
delete current[finalKey];
}
else {
current[finalKey] = value;
}
} }
} }
exports.WorkflowDiffEngine = WorkflowDiffEngine; exports.WorkflowDiffEngine = WorkflowDiffEngine;
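The final hunk simplifies `setNestedValue`: the left-hand side treated a `null` value as a delete and guarded against `null` intermediates, while the right-hand side just creates objects along the dotted path and assigns. A sketch of the simplified right-hand-side form:

```javascript
// Sketch of the simplified setNestedValue on the right-hand side: walk the
// dotted path, creating intermediate objects as needed, then assign the
// final key. (The removed left-hand variant deleted the key when value
// was null; this version assigns null like any other value.)
function setNestedValue(obj, path, value) {
  const keys = path.split('.');
  let current = obj;
  for (let i = 0; i < keys.length - 1; i++) {
    const key = keys[i];
    if (!(key in current) || typeof current[key] !== 'object') {
      current[key] = {};
    }
    current = current[key];
  }
  current[keys[keys.length - 1]] = value;
}

const params = {};
setNestedValue(params, 'parameters.url', 'https://example.com');
console.log(params.parameters.url); // https://example.com
```

This is what lets `updateNode` accept flat keys like `"parameters.url"` in its `updates` object and have them land at the right depth.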

File diff suppressed because one or more lines are too long

@@ -259,7 +259,6 @@ export interface WebhookRequest {
} }
export interface McpToolResponse { export interface McpToolResponse {
success: boolean; success: boolean;
saved?: boolean;
data?: unknown; data?: unknown;
error?: string; error?: string;
message?: string; message?: string;
@@ -267,7 +266,6 @@ export interface McpToolResponse {
details?: Record<string, unknown>; details?: Record<string, unknown>;
executionId?: string; executionId?: string;
workflowId?: string; workflowId?: string;
operationsApplied?: number;
} }
export type ExecutionMode = 'preview' | 'summary' | 'filtered' | 'full' | 'error'; export type ExecutionMode = 'preview' | 'summary' | 'filtered' | 'full' | 'error';
export interface ExecutionPreview { export interface ExecutionPreview {

File diff suppressed because one or more lines are too long

@@ -94,10 +94,6 @@ export interface ActivateWorkflowOperation extends DiffOperation {
export interface DeactivateWorkflowOperation extends DiffOperation { export interface DeactivateWorkflowOperation extends DiffOperation {
type: 'deactivateWorkflow'; type: 'deactivateWorkflow';
} }
export interface TransferWorkflowOperation extends DiffOperation {
type: 'transferWorkflow';
destinationProjectId: string;
}
export interface CleanStaleConnectionsOperation extends DiffOperation { export interface CleanStaleConnectionsOperation extends DiffOperation {
type: 'cleanStaleConnections'; type: 'cleanStaleConnections';
dryRun?: boolean; dryRun?: boolean;
@@ -114,7 +110,7 @@ export interface ReplaceConnectionsOperation extends DiffOperation {
}; };
}; };
} }
export type WorkflowDiffOperation = AddNodeOperation | RemoveNodeOperation | UpdateNodeOperation | MoveNodeOperation | EnableNodeOperation | DisableNodeOperation | AddConnectionOperation | RemoveConnectionOperation | RewireConnectionOperation | UpdateSettingsOperation | UpdateNameOperation | AddTagOperation | RemoveTagOperation | ActivateWorkflowOperation | DeactivateWorkflowOperation | CleanStaleConnectionsOperation | ReplaceConnectionsOperation | TransferWorkflowOperation; export type WorkflowDiffOperation = AddNodeOperation | RemoveNodeOperation | UpdateNodeOperation | MoveNodeOperation | EnableNodeOperation | DisableNodeOperation | AddConnectionOperation | RemoveConnectionOperation | RewireConnectionOperation | UpdateSettingsOperation | UpdateNameOperation | AddTagOperation | RemoveTagOperation | ActivateWorkflowOperation | DeactivateWorkflowOperation | CleanStaleConnectionsOperation | ReplaceConnectionsOperation;
export interface WorkflowDiffRequest { export interface WorkflowDiffRequest {
id: string; id: string;
operations: WorkflowDiffOperation[]; operations: WorkflowDiffOperation[];
@@ -141,9 +137,6 @@ export interface WorkflowDiffResult {
}>; }>;
shouldActivate?: boolean; shouldActivate?: boolean;
shouldDeactivate?: boolean; shouldDeactivate?: boolean;
tagsToAdd?: string[];
tagsToRemove?: string[];
transferToProjectId?: string;
} }
export interface NodeReference { export interface NodeReference {
id?: string; id?: string;

@@ -1 +1 @@
{"version":3,"file":"workflow-diff.d.ts","sourceRoot":"","sources":["../../src/types/workflow-diff.ts"],"names":[],"mappings":"AAKA,OAAO,EAAE,YAAY,EAAsB,MAAM,WAAW,CAAC;AAG7D,MAAM,WAAW,aAAa;IAC5B,IAAI,EAAE,MAAM,CAAC;IACb,WAAW,CAAC,EAAE,MAAM,CAAC;CACtB;AAGD,MAAM,WAAW,gBAAiB,SAAQ,aAAa;IACrD,IAAI,EAAE,SAAS,CAAC;IAChB,IAAI,EAAE,OAAO,CAAC,YAAY,CAAC,GAAG;QAC5B,IAAI,EAAE,MAAM,CAAC;QACb,IAAI,EAAE,MAAM,CAAC;QACb,QAAQ,EAAE,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;KAC5B,CAAC;CACH;AAED,MAAM,WAAW,mBAAoB,SAAQ,aAAa;IACxD,IAAI,EAAE,YAAY,CAAC;IACnB,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,QAAQ,CAAC,EAAE,MAAM,CAAC;CACnB;AAED,MAAM,WAAW,mBAAoB,SAAQ,aAAa;IACxD,IAAI,EAAE,YAAY,CAAC;IACnB,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,QAAQ,CAAC,EAAE,MAAM,CAAC;IAClB,OAAO,EAAE;QACP,CAAC,IAAI,EAAE,MAAM,GAAG,GAAG,CAAC;KACrB,CAAC;CACH;AAED,MAAM,WAAW,iBAAkB,SAAQ,aAAa;IACtD,IAAI,EAAE,UAAU,CAAC;IACjB,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,QAAQ,CAAC,EAAE,MAAM,CAAC;IAClB,QAAQ,EAAE,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;CAC5B;AAED,MAAM,WAAW,mBAAoB,SAAQ,aAAa;IACxD,IAAI,EAAE,YAAY,CAAC;IACnB,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,QAAQ,CAAC,EAAE,MAAM,CAAC;CACnB;AAED,MAAM,WAAW,oBAAqB,SAAQ,aAAa;IACzD,IAAI,EAAE,aAAa,CAAC;IACpB,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,QAAQ,CAAC,EAAE,MAAM,CAAC;CACnB;AAGD,MAAM,WAAW,sBAAuB,SAAQ,aAAa;IAC3D,IAAI,EAAE,eAAe,CAAC;IACtB,MAAM,EAAE,MAAM,CAAC;IACf,MAAM,EAAE,MAAM,CAAC;IACf,YAAY,CAAC,EAAE,MAAM,CAAC;IACtB,WAAW,CAAC,EAAE,MAAM,CAAC;IACrB,WAAW,CAAC,EAAE,MAAM,CAAC;IACrB,WAAW,CAAC,EAAE,MAAM,CAAC;IAErB,MAAM,CAAC,EAAE,MAAM,GAAG,OAAO,CAAC;IAC1B,IAAI,CAAC,EAAE,MAAM,CAAC;CACf;AAED,MAAM,WAAW,yBAA0B,SAAQ,aAAa;IAC9D,IAAI,EAAE,kBAAkB,CAAC;IACzB,MAAM,EAAE,MAAM,CAAC;IACf,MAAM,EAAE,MAAM,CAAC;IACf,YAAY,CAAC,EAAE,MAAM,CAAC;IACtB,WAAW,CAAC,EAAE,MAAM,CAAC;IACrB,YAAY,CAAC,EAAE,OAAO,CAAC;CACxB;AAED,MAAM,WAAW,yBAA0B,SAAQ,aAAa;IAC9D,IAAI,EAAE,kBAAkB,CAAC;IACzB,MAAM,EAAE,MAAM,CAAC;IACf,IAAI,EAAE,MAAM,CAAC;IACb,EAAE,EAAE,MAAM,CAAC;IACX,YAAY,CAAC,EAAE,MAAM,CAAC;IACtB,WAAW,CAAC,EAAE,MAAM,CAAC;IACrB,WAAW,CAAC,EAAE,MAAM,CAAC;IAErB,MAAM,CAAC,EAAE,MAAM,GAAG,OAAO,CAAC;IAC1B,IAA
I,CAAC,EAAE,MAAM,CAAC;CACf;AAGD,MAAM,WAAW,uBAAwB,SAAQ,aAAa;IAC5D,IAAI,EAAE,gBAAgB,CAAC;IACvB,QAAQ,EAAE;QACR,CAAC,GAAG,EAAE,MAAM,GAAG,GAAG,CAAC;KACpB,CAAC;CACH;AAED,MAAM,WAAW,mBAAoB,SAAQ,aAAa;IACxD,IAAI,EAAE,YAAY,CAAC;IACnB,IAAI,EAAE,MAAM,CAAC;CACd;AAED,MAAM,WAAW,eAAgB,SAAQ,aAAa;IACpD,IAAI,EAAE,QAAQ,CAAC;IACf,GAAG,EAAE,MAAM,CAAC;CACb;AAED,MAAM,WAAW,kBAAmB,SAAQ,aAAa;IACvD,IAAI,EAAE,WAAW,CAAC;IAClB,GAAG,EAAE,MAAM,CAAC;CACb;AAED,MAAM,WAAW,yBAA0B,SAAQ,aAAa;IAC9D,IAAI,EAAE,kBAAkB,CAAC;CAE1B;AAED,MAAM,WAAW,2BAA4B,SAAQ,aAAa;IAChE,IAAI,EAAE,oBAAoB,CAAC;CAE5B;AAED,MAAM,WAAW,yBAA0B,SAAQ,aAAa;IAC9D,IAAI,EAAE,kBAAkB,CAAC;IACzB,oBAAoB,EAAE,MAAM,CAAC;CAC9B;AAGD,MAAM,WAAW,8BAA+B,SAAQ,aAAa;IACnE,IAAI,EAAE,uBAAuB,CAAC;IAC9B,MAAM,CAAC,EAAE,OAAO,CAAC;CAClB;AAED,MAAM,WAAW,2BAA4B,SAAQ,aAAa;IAChE,IAAI,EAAE,oBAAoB,CAAC;IAC3B,WAAW,EAAE;QACX,CAAC,QAAQ,EAAE,MAAM,GAAG;YAClB,CAAC,UAAU,EAAE,MAAM,GAAG,KAAK,CAAC,KAAK,CAAC;gBAChC,IAAI,EAAE,MAAM,CAAC;gBACb,IAAI,EAAE,MAAM,CAAC;gBACb,KAAK,EAAE,MAAM,CAAC;aACf,CAAC,CAAC,CAAC;SACL,CAAC;KACH,CAAC;CACH;AAGD,MAAM,MAAM,qBAAqB,GAC7B,gBAAgB,GAChB,mBAAmB,GACnB,mBAAmB,GACnB,iBAAiB,GACjB,mBAAmB,GACnB,oBAAoB,GACpB,sBAAsB,GACtB,yBAAyB,GACzB,yBAAyB,GACzB,uBAAuB,GACvB,mBAAmB,GACnB,eAAe,GACf,kBAAkB,GAClB,yBAAyB,GACzB,2BAA2B,GAC3B,8BAA8B,GAC9B,2BAA2B,GAC3B,yBAAyB,CAAC;AAG9B,MAAM,WAAW,mBAAmB;IAClC,EAAE,EAAE,MAAM,CAAC;IACX,UAAU,EAAE,qBAAqB,EAAE,CAAC;IACpC,YAAY,CAAC,EAAE,OAAO,CAAC;IACvB,eAAe,CAAC,EAAE,OAAO,CAAC;CAC3B;AAGD,MAAM,WAAW,2BAA2B;IAC1C,SAAS,EAAE,MAAM,CAAC;IAClB,OAAO,EAAE,MAAM,CAAC;IAChB,OAAO,CAAC,EAAE,GAAG,CAAC;CACf;AAED,MAAM,WAAW,kBAAkB;IACjC,OAAO,EAAE,OAAO,CAAC;IACjB,QAAQ,CAAC,EAAE,GAAG,CAAC;IACf,MAAM,CAAC,EAAE,2BAA2B,EAAE,CAAC;IACvC,QAAQ,CAAC,EAAE,2BAA2B,EAAE,CAAC;IACzC,iBAAiB,CAAC,EAAE,MAAM,CAAC;IAC3B,OAAO,CAAC,EAAE,MAAM,CAAC;IACjB,OAAO,CAAC,EAAE,MAAM,EAAE,CAAC;IACnB,MAAM,CAAC,EAAE,MAAM,EAAE,CAAC;IAClB,uBAAuB,CAAC,EAAE,KAAK,CAAC;QAAE,IAAI,EAAE,MAAM,CAAC;QAAC,EAAE,EAAE,MAAM,CAAA;KAAE,CAAC,CAAC;IAC9D,cAAc,CAAC,EAAE,OAAO,CAAC;IACzB,gBAAgB,CAAC,EAAE,OAAO,CAAC;IAC
3B,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC;IACrB,YAAY,CAAC,EAAE,MAAM,EAAE,CAAC;IACxB,mBAAmB,CAAC,EAAE,MAAM,CAAC;CAC9B;AAGD,MAAM,WAAW,aAAa;IAC5B,EAAE,CAAC,EAAE,MAAM,CAAC;IACZ,IAAI,CAAC,EAAE,MAAM,CAAC;CACf;AAGD,wBAAgB,eAAe,CAAC,EAAE,EAAE,qBAAqB,GAAG,EAAE,IAC5D,gBAAgB,GAAG,mBAAmB,GAAG,mBAAmB,GAC5D,iBAAiB,GAAG,mBAAmB,GAAG,oBAAoB,CAE/D;AAED,wBAAgB,qBAAqB,CAAC,EAAE,EAAE,qBAAqB,GAAG,EAAE,IAClE,sBAAsB,GAAG,yBAAyB,GAAG,yBAAyB,GAAG,8BAA8B,GAAG,2BAA2B,CAE9I;AAED,wBAAgB,mBAAmB,CAAC,EAAE,EAAE,qBAAqB,GAAG,EAAE,IAChE,uBAAuB,GAAG,mBAAmB,GAAG,eAAe,GAAG,kBAAkB,CAErF"} {"version":3,"file":"workflow-diff.d.ts","sourceRoot":"","sources":["../../src/types/workflow-diff.ts"],"names":[],"mappings":"AAKA,OAAO,EAAE,YAAY,EAAsB,MAAM,WAAW,CAAC;AAG7D,MAAM,WAAW,aAAa;IAC5B,IAAI,EAAE,MAAM,CAAC;IACb,WAAW,CAAC,EAAE,MAAM,CAAC;CACtB;AAGD,MAAM,WAAW,gBAAiB,SAAQ,aAAa;IACrD,IAAI,EAAE,SAAS,CAAC;IAChB,IAAI,EAAE,OAAO,CAAC,YAAY,CAAC,GAAG;QAC5B,IAAI,EAAE,MAAM,CAAC;QACb,IAAI,EAAE,MAAM,CAAC;QACb,QAAQ,EAAE,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;KAC5B,CAAC;CACH;AAED,MAAM,WAAW,mBAAoB,SAAQ,aAAa;IACxD,IAAI,EAAE,YAAY,CAAC;IACnB,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,QAAQ,CAAC,EAAE,MAAM,CAAC;CACnB;AAED,MAAM,WAAW,mBAAoB,SAAQ,aAAa;IACxD,IAAI,EAAE,YAAY,CAAC;IACnB,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,QAAQ,CAAC,EAAE,MAAM,CAAC;IAClB,OAAO,EAAE;QACP,CAAC,IAAI,EAAE,MAAM,GAAG,GAAG,CAAC;KACrB,CAAC;CACH;AAED,MAAM,WAAW,iBAAkB,SAAQ,aAAa;IACtD,IAAI,EAAE,UAAU,CAAC;IACjB,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,QAAQ,CAAC,EAAE,MAAM,CAAC;IAClB,QAAQ,EAAE,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;CAC5B;AAED,MAAM,WAAW,mBAAoB,SAAQ,aAAa;IACxD,IAAI,EAAE,YAAY,CAAC;IACnB,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,QAAQ,CAAC,EAAE,MAAM,CAAC;CACnB;AAED,MAAM,WAAW,oBAAqB,SAAQ,aAAa;IACzD,IAAI,EAAE,aAAa,CAAC;IACpB,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,QAAQ,CAAC,EAAE,MAAM,CAAC;CACnB;AAGD,MAAM,WAAW,sBAAuB,SAAQ,aAAa;IAC3D,IAAI,EAAE,eAAe,CAAC;IACtB,MAAM,EAAE,MAAM,CAAC;IACf,MAAM,EAAE,MAAM,CAAC;IACf,YAAY,CAAC,EAAE,MAAM,CAAC;IACtB,WAAW,CAAC,EAAE,MAAM,CAAC;IACrB,WAAW,CAAC,EAAE,MAAM,CAAC;IACrB,WAAW,CAAC,EAAE,MAAM,CAAC;IAErB,MAAM,CAAC,E
AAE,MAAM,GAAG,OAAO,CAAC;IAC1B,IAAI,CAAC,EAAE,MAAM,CAAC;CACf;AAED,MAAM,WAAW,yBAA0B,SAAQ,aAAa;IAC9D,IAAI,EAAE,kBAAkB,CAAC;IACzB,MAAM,EAAE,MAAM,CAAC;IACf,MAAM,EAAE,MAAM,CAAC;IACf,YAAY,CAAC,EAAE,MAAM,CAAC;IACtB,WAAW,CAAC,EAAE,MAAM,CAAC;IACrB,YAAY,CAAC,EAAE,OAAO,CAAC;CACxB;AAED,MAAM,WAAW,yBAA0B,SAAQ,aAAa;IAC9D,IAAI,EAAE,kBAAkB,CAAC;IACzB,MAAM,EAAE,MAAM,CAAC;IACf,IAAI,EAAE,MAAM,CAAC;IACb,EAAE,EAAE,MAAM,CAAC;IACX,YAAY,CAAC,EAAE,MAAM,CAAC;IACtB,WAAW,CAAC,EAAE,MAAM,CAAC;IACrB,WAAW,CAAC,EAAE,MAAM,CAAC;IAErB,MAAM,CAAC,EAAE,MAAM,GAAG,OAAO,CAAC;IAC1B,IAAI,CAAC,EAAE,MAAM,CAAC;CACf;AAGD,MAAM,WAAW,uBAAwB,SAAQ,aAAa;IAC5D,IAAI,EAAE,gBAAgB,CAAC;IACvB,QAAQ,EAAE;QACR,CAAC,GAAG,EAAE,MAAM,GAAG,GAAG,CAAC;KACpB,CAAC;CACH;AAED,MAAM,WAAW,mBAAoB,SAAQ,aAAa;IACxD,IAAI,EAAE,YAAY,CAAC;IACnB,IAAI,EAAE,MAAM,CAAC;CACd;AAED,MAAM,WAAW,eAAgB,SAAQ,aAAa;IACpD,IAAI,EAAE,QAAQ,CAAC;IACf,GAAG,EAAE,MAAM,CAAC;CACb;AAED,MAAM,WAAW,kBAAmB,SAAQ,aAAa;IACvD,IAAI,EAAE,WAAW,CAAC;IAClB,GAAG,EAAE,MAAM,CAAC;CACb;AAED,MAAM,WAAW,yBAA0B,SAAQ,aAAa;IAC9D,IAAI,EAAE,kBAAkB,CAAC;CAE1B;AAED,MAAM,WAAW,2BAA4B,SAAQ,aAAa;IAChE,IAAI,EAAE,oBAAoB,CAAC;CAE5B;AAGD,MAAM,WAAW,8BAA+B,SAAQ,aAAa;IACnE,IAAI,EAAE,uBAAuB,CAAC;IAC9B,MAAM,CAAC,EAAE,OAAO,CAAC;CAClB;AAED,MAAM,WAAW,2BAA4B,SAAQ,aAAa;IAChE,IAAI,EAAE,oBAAoB,CAAC;IAC3B,WAAW,EAAE;QACX,CAAC,QAAQ,EAAE,MAAM,GAAG;YAClB,CAAC,UAAU,EAAE,MAAM,GAAG,KAAK,CAAC,KAAK,CAAC;gBAChC,IAAI,EAAE,MAAM,CAAC;gBACb,IAAI,EAAE,MAAM,CAAC;gBACb,KAAK,EAAE,MAAM,CAAC;aACf,CAAC,CAAC,CAAC;SACL,CAAC;KACH,CAAC;CACH;AAGD,MAAM,MAAM,qBAAqB,GAC7B,gBAAgB,GAChB,mBAAmB,GACnB,mBAAmB,GACnB,iBAAiB,GACjB,mBAAmB,GACnB,oBAAoB,GACpB,sBAAsB,GACtB,yBAAyB,GACzB,yBAAyB,GACzB,uBAAuB,GACvB,mBAAmB,GACnB,eAAe,GACf,kBAAkB,GAClB,yBAAyB,GACzB,2BAA2B,GAC3B,8BAA8B,GAC9B,2BAA2B,CAAC;AAGhC,MAAM,WAAW,mBAAmB;IAClC,EAAE,EAAE,MAAM,CAAC;IACX,UAAU,EAAE,qBAAqB,EAAE,CAAC;IACpC,YAAY,CAAC,EAAE,OAAO,CAAC;IACvB,eAAe,CAAC,EAAE,OAAO,CAAC;CAC3B;AAGD,MAAM,WAAW,2BAA2B;IAC1C,SAAS,EAAE,MAAM,CAAC;IAClB,OAAO,EAAE,MAAM,CAAC;IAChB,OAAO,CAAC,EAAE,GAAG,CAAC;CACf;AAED,MAAM,WAAW,
kBAAkB;IACjC,OAAO,EAAE,OAAO,CAAC;IACjB,QAAQ,CAAC,EAAE,GAAG,CAAC;IACf,MAAM,CAAC,EAAE,2BAA2B,EAAE,CAAC;IACvC,QAAQ,CAAC,EAAE,2BAA2B,EAAE,CAAC;IACzC,iBAAiB,CAAC,EAAE,MAAM,CAAC;IAC3B,OAAO,CAAC,EAAE,MAAM,CAAC;IACjB,OAAO,CAAC,EAAE,MAAM,EAAE,CAAC;IACnB,MAAM,CAAC,EAAE,MAAM,EAAE,CAAC;IAClB,uBAAuB,CAAC,EAAE,KAAK,CAAC;QAAE,IAAI,EAAE,MAAM,CAAC;QAAC,EAAE,EAAE,MAAM,CAAA;KAAE,CAAC,CAAC;IAC9D,cAAc,CAAC,EAAE,OAAO,CAAC;IACzB,gBAAgB,CAAC,EAAE,OAAO,CAAC;CAC5B;AAGD,MAAM,WAAW,aAAa;IAC5B,EAAE,CAAC,EAAE,MAAM,CAAC;IACZ,IAAI,CAAC,EAAE,MAAM,CAAC;CACf;AAGD,wBAAgB,eAAe,CAAC,EAAE,EAAE,qBAAqB,GAAG,EAAE,IAC5D,gBAAgB,GAAG,mBAAmB,GAAG,mBAAmB,GAC5D,iBAAiB,GAAG,mBAAmB,GAAG,oBAAoB,CAE/D;AAED,wBAAgB,qBAAqB,CAAC,EAAE,EAAE,qBAAqB,GAAG,EAAE,IAClE,sBAAsB,GAAG,yBAAyB,GAAG,yBAAyB,GAAG,8BAA8B,GAAG,2BAA2B,CAE9I;AAED,wBAAgB,mBAAmB,CAAC,EAAE,EAAE,qBAAqB,GAAG,EAAE,IAChE,uBAAuB,GAAG,mBAAmB,GAAG,eAAe,GAAG,kBAAkB,CAErF"}

View File

@@ -1 +1 @@
{"version":3,"file":"workflow-diff.js","sourceRoot":"","sources":["../../src/types/workflow-diff.ts"],"names":[],"mappings":";;AAkNA,0CAIC;AAED,sDAGC;AAED,kDAGC;AAdD,SAAgB,eAAe,CAAC,EAAyB;IAGvD,OAAO,CAAC,SAAS,EAAE,YAAY,EAAE,YAAY,EAAE,UAAU,EAAE,YAAY,EAAE,aAAa,CAAC,CAAC,QAAQ,CAAC,EAAE,CAAC,IAAI,CAAC,CAAC;AAC5G,CAAC;AAED,SAAgB,qBAAqB,CAAC,EAAyB;IAE7D,OAAO,CAAC,eAAe,EAAE,kBAAkB,EAAE,kBAAkB,EAAE,uBAAuB,EAAE,oBAAoB,CAAC,CAAC,QAAQ,CAAC,EAAE,CAAC,IAAI,CAAC,CAAC;AACpI,CAAC;AAED,SAAgB,mBAAmB,CAAC,EAAyB;IAE3D,OAAO,CAAC,gBAAgB,EAAE,YAAY,EAAE,QAAQ,EAAE,WAAW,CAAC,CAAC,QAAQ,CAAC,EAAE,CAAC,IAAI,CAAC,CAAC;AACnF,CAAC"} {"version":3,"file":"workflow-diff.js","sourceRoot":"","sources":["../../src/types/workflow-diff.ts"],"names":[],"mappings":";;AAyMA,0CAIC;AAED,sDAGC;AAED,kDAGC;AAdD,SAAgB,eAAe,CAAC,EAAyB;IAGvD,OAAO,CAAC,SAAS,EAAE,YAAY,EAAE,YAAY,EAAE,UAAU,EAAE,YAAY,EAAE,aAAa,CAAC,CAAC,QAAQ,CAAC,EAAE,CAAC,IAAI,CAAC,CAAC;AAC5G,CAAC;AAED,SAAgB,qBAAqB,CAAC,EAAyB;IAE7D,OAAO,CAAC,eAAe,EAAE,kBAAkB,EAAE,kBAAkB,EAAE,uBAAuB,EAAE,oBAAoB,CAAC,CAAC,QAAQ,CAAC,EAAE,CAAC,IAAI,CAAC,CAAC;AACpI,CAAC;AAED,SAAgB,mBAAmB,CAAC,EAAyB;IAE3D,OAAO,CAAC,gBAAgB,EAAE,YAAY,EAAE,QAAQ,EAAE,WAAW,CAAC,CAAC,QAAQ,CAAC,EAAE,CAAC,IAAI,CAAC,CAAC;AACnF,CAAC"}

View File

@@ -87,10 +87,12 @@ function isTriggerNode(nodeType) {
     if (lowerType.includes('webhook') && !lowerType.includes('respond')) {
         return true;
     }
-    if (lowerType.includes('emailread') || lowerType.includes('emailreadimap')) {
-        return true;
-    }
-    return normalized === 'nodes-base.start';
+    const specificTriggers = [
+        'nodes-base.start',
+        'nodes-base.manualTrigger',
+        'nodes-base.formTrigger'
+    ];
+    return specificTriggers.includes(normalized);
 }
 function isActivatableTrigger(nodeType) {
     return isTriggerNode(nodeType);
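The narrowed trigger check in the hunk above can be sketched as a standalone function. This is a hypothetical sketch: the prefix normalization below stands in for the file's real normalization logic, which lives elsewhere in node-type-utils.

```typescript
// Hypothetical standalone sketch of the narrowed trigger check:
// webhook-style nodes qualify, otherwise only an explicit allow-list does.
function isTriggerNode(nodeType: string): boolean {
  // Stand-in for the file's real type normalization.
  const normalized = nodeType.replace(/^n8n-nodes-base\./, 'nodes-base.');
  const lowerType = normalized.toLowerCase();
  // Webhook-style nodes count as triggers, but "respondToWebhook" does not.
  if (lowerType.includes('webhook') && !lowerType.includes('respond')) {
    return true;
  }
  // Otherwise only specific core trigger types qualify.
  const specificTriggers = [
    'nodes-base.start',
    'nodes-base.manualTrigger',
    'nodes-base.formTrigger'
  ];
  return specificTriggers.includes(normalized);
}

console.log(isTriggerNode('n8n-nodes-base.webhook'));      // true
console.log(isTriggerNode('nodes-base.respondToWebhook')); // false
```

The allow-list trades recall for precision: unknown community trigger types no longer match by accident, at the cost of needing the list updated when new core triggers are added.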

View File

@@ -1 +1 @@
{"version":3,"file":"node-type-utils.js","sourceRoot":"","sources":["../../src/utils/node-type-utils.ts"],"names":[],"mappings":";;AAcA,8CAMC;AASD,kDAQC;AASD,0CASC;AASD,wCASC;AAKD,gCAGC;AAKD,0CAGC;AAMD,sDAaC;AAUD,sDAwBC;AAkBD,sCAsBC;AAqBD,oDAGC;AAQD,8DAiCC;AAzOD,SAAgB,iBAAiB,CAAC,IAAY;IAC5C,IAAI,CAAC,IAAI;QAAE,OAAO,IAAI,CAAC;IAEvB,OAAO,IAAI;SACR,OAAO,CAAC,mBAAmB,EAAE,aAAa,CAAC;SAC3C,OAAO,CAAC,8BAA8B,EAAE,kBAAkB,CAAC,CAAC;AACjE,CAAC;AASD,SAAgB,mBAAmB,CAAC,IAAY,EAAE,WAAiC;IACjF,IAAI,CAAC,IAAI;QAAE,OAAO,IAAI,CAAC;IAEvB,IAAI,WAAW,KAAK,MAAM,EAAE,CAAC;QAC3B,OAAO,IAAI,CAAC,OAAO,CAAC,eAAe,EAAE,iBAAiB,CAAC,CAAC;IAC1D,CAAC;IAED,OAAO,IAAI,CAAC,OAAO,CAAC,oBAAoB,EAAE,2BAA2B,CAAC,CAAC;AACzE,CAAC;AASD,SAAgB,eAAe,CAAC,IAAY;IAC1C,IAAI,CAAC,IAAI;QAAE,OAAO,EAAE,CAAC;IAGrB,MAAM,UAAU,GAAG,iBAAiB,CAAC,IAAI,CAAC,CAAC;IAG3C,MAAM,KAAK,GAAG,UAAU,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;IACpC,OAAO,KAAK,CAAC,KAAK,CAAC,MAAM,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC;AACvC,CAAC;AASD,SAAgB,cAAc,CAAC,IAAY;IACzC,IAAI,CAAC,IAAI,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC;QAAE,OAAO,IAAI,CAAC;IAG9C,MAAM,UAAU,GAAG,iBAAiB,CAAC,IAAI,CAAC,CAAC;IAG3C,MAAM,KAAK,GAAG,UAAU,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;IACpC,OAAO,KAAK,CAAC,CAAC,CAAC,IAAI,IAAI,CAAC;AAC1B,CAAC;AAKD,SAAgB,UAAU,CAAC,IAAY;IACrC,MAAM,UAAU,GAAG,iBAAiB,CAAC,IAAI,CAAC,CAAC;IAC3C,OAAO,UAAU,CAAC,UAAU,CAAC,aAAa,CAAC,CAAC;AAC9C,CAAC;AAKD,SAAgB,eAAe,CAAC,IAAY;IAC1C,MAAM,UAAU,GAAG,iBAAiB,CAAC,IAAI,CAAC,CAAC;IAC3C,OAAO,UAAU,CAAC,UAAU,CAAC,kBAAkB,CAAC,CAAC;AACnD,CAAC;AAMD,SAAgB,qBAAqB,CAAC,IAAY;IAChD,IAAI,CAAC,IAAI,IAAI,OAAO,IAAI,KAAK,QAAQ;QAAE,OAAO,KAAK,CAAC;IAGpD,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC;QAAE,OAAO,KAAK,CAAC;IAEtC,MAAM,KAAK,GAAG,IAAI,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;IAG9B,IAAI,KAAK,CAAC,MAAM,KAAK,CAAC;QAAE,OAAO,KAAK,CAAC;IAGrC,OAAO,KAAK,CAAC,CAAC,CAAC,CAAC,MAAM,GAAG,CAAC,IAAI,KAAK,CAAC,CAAC,CAAC,CAAC,MAAM,GAAG,CAAC,CAAC;AACpD,CAAC;AAUD,SAAgB,qBAAqB,CAAC,IAAY;IAChD,MAAM,UAAU,GAAa,EAAE,CAAC;IAGhC,IAAI,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC,EAAE,CAAC;QACvB,UAAU,CAAC,IAAI,CAAC,iBAAiB,CA
AC,IAAI,CAAC,CAAC,CAAC;QAGzC,MAAM,UAAU,GAAG,iBAAiB,CAAC,IAAI,CAAC,CAAC;QAC3C,IAAI,UAAU,CAAC,UAAU,CAAC,aAAa,CAAC,EAAE,CAAC;YACzC,UAAU,CAAC,IAAI,CAAC,mBAAmB,CAAC,UAAU,EAAE,MAAM,CAAC,CAAC,CAAC;QAC3D,CAAC;aAAM,IAAI,UAAU,CAAC,UAAU,CAAC,kBAAkB,CAAC,EAAE,CAAC;YACrD,UAAU,CAAC,IAAI,CAAC,mBAAmB,CAAC,UAAU,EAAE,WAAW,CAAC,CAAC,CAAC;QAChE,CAAC;IACH,CAAC;SAAM,CAAC;QAEN,UAAU,CAAC,IAAI,CAAC,cAAc,IAAI,EAAE,CAAC,CAAC;QACtC,UAAU,CAAC,IAAI,CAAC,kBAAkB,IAAI,EAAE,CAAC,CAAC;QAC1C,UAAU,CAAC,IAAI,CAAC,mBAAmB,IAAI,EAAE,CAAC,CAAC;QAC3C,UAAU,CAAC,IAAI,CAAC,4BAA4B,IAAI,EAAE,CAAC,CAAC;IACtD,CAAC;IAGD,OAAO,CAAC,GAAG,IAAI,GAAG,CAAC,UAAU,CAAC,CAAC,CAAC;AAClC,CAAC;AAkBD,SAAgB,aAAa,CAAC,QAAgB;IAC5C,MAAM,UAAU,GAAG,iBAAiB,CAAC,QAAQ,CAAC,CAAC;IAC/C,MAAM,SAAS,GAAG,UAAU,CAAC,WAAW,EAAE,CAAC;IAG3C,IAAI,SAAS,CAAC,QAAQ,CAAC,SAAS,CAAC,EAAE,CAAC;QAClC,OAAO,IAAI,CAAC;IACd,CAAC;IAGD,IAAI,SAAS,CAAC,QAAQ,CAAC,SAAS,CAAC,IAAI,CAAC,SAAS,CAAC,QAAQ,CAAC,SAAS,CAAC,EAAE,CAAC;QACpE,OAAO,IAAI,CAAC;IACd,CAAC;IAGD,IAAI,SAAS,CAAC,QAAQ,CAAC,WAAW,CAAC,IAAI,SAAS,CAAC,QAAQ,CAAC,eAAe,CAAC,EAAE,CAAC;QAC3E,OAAO,IAAI,CAAC;IACd,CAAC;IAID,OAAO,UAAU,KAAK,kBAAkB,CAAC;AAC3C,CAAC;AAqBD,SAAgB,oBAAoB,CAAC,QAAgB;IAEnD,OAAO,aAAa,CAAC,QAAQ,CAAC,CAAC;AACjC,CAAC;AAQD,SAAgB,yBAAyB,CAAC,QAAgB;IACxD,MAAM,UAAU,GAAG,iBAAiB,CAAC,QAAQ,CAAC,CAAC;IAC/C,MAAM,SAAS,GAAG,UAAU,CAAC,WAAW,EAAE,CAAC;IAE3C,IAAI,SAAS,CAAC,QAAQ,CAAC,iBAAiB,CAAC,EAAE,CAAC;QAC1C,OAAO,uDAAuD,CAAC;IACjE,CAAC;IAED,IAAI,SAAS,CAAC,QAAQ,CAAC,SAAS,CAAC,EAAE,CAAC;QAClC,OAAO,iCAAiC,CAAC;IAC3C,CAAC;IAED,IAAI,SAAS,CAAC,QAAQ,CAAC,UAAU,CAAC,IAAI,SAAS,CAAC,QAAQ,CAAC,MAAM,CAAC,EAAE,CAAC;QACjE,OAAO,+BAA+B,CAAC;IACzC,CAAC;IAED,IAAI,SAAS,CAAC,QAAQ,CAAC,QAAQ,CAAC,IAAI,UAAU,KAAK,kBAAkB,EAAE,CAAC;QACtE,OAAO,mCAAmC,CAAC;IAC7C,CAAC;IAED,IAAI,SAAS,CAAC,QAAQ,CAAC,OAAO,CAAC,IAAI,SAAS,CAAC,QAAQ,CAAC,MAAM,CAAC,IAAI,SAAS,CAAC,QAAQ,CAAC,OAAO,CAAC,EAAE,CAAC;QAC7F,OAAO,yBAAyB,CAAC;IACnC,CAAC;IAED,IAAI,SAAS,CAAC,QAAQ,CAAC,MAAM,CAAC,EAAE,CAAC;QAC/B,OAAO,iCAAiC,CAAC;IAC3C,CAAC;IAED,IAAI,SAAS,CAAC,QAAQ,CAAC,SAAS,CAAC,EAAE
,CAAC;QAClC,OAAO,uBAAuB,CAAC;IACjC,CAAC;IAED,OAAO,sBAAsB,CAAC;AAChC,CAAC"} {"version":3,"file":"node-type-utils.js","sourceRoot":"","sources":["../../src/utils/node-type-utils.ts"],"names":[],"mappings":";;AAcA,8CAMC;AASD,kDAQC;AASD,0CASC;AASD,wCASC;AAKD,gCAGC;AAKD,0CAGC;AAMD,sDAaC;AAUD,sDAwBC;AAkBD,sCAsBC;AAqBD,oDAGC;AAQD,8DAiCC;AAzOD,SAAgB,iBAAiB,CAAC,IAAY;IAC5C,IAAI,CAAC,IAAI;QAAE,OAAO,IAAI,CAAC;IAEvB,OAAO,IAAI;SACR,OAAO,CAAC,mBAAmB,EAAE,aAAa,CAAC;SAC3C,OAAO,CAAC,8BAA8B,EAAE,kBAAkB,CAAC,CAAC;AACjE,CAAC;AASD,SAAgB,mBAAmB,CAAC,IAAY,EAAE,WAAiC;IACjF,IAAI,CAAC,IAAI;QAAE,OAAO,IAAI,CAAC;IAEvB,IAAI,WAAW,KAAK,MAAM,EAAE,CAAC;QAC3B,OAAO,IAAI,CAAC,OAAO,CAAC,eAAe,EAAE,iBAAiB,CAAC,CAAC;IAC1D,CAAC;IAED,OAAO,IAAI,CAAC,OAAO,CAAC,oBAAoB,EAAE,2BAA2B,CAAC,CAAC;AACzE,CAAC;AASD,SAAgB,eAAe,CAAC,IAAY;IAC1C,IAAI,CAAC,IAAI;QAAE,OAAO,EAAE,CAAC;IAGrB,MAAM,UAAU,GAAG,iBAAiB,CAAC,IAAI,CAAC,CAAC;IAG3C,MAAM,KAAK,GAAG,UAAU,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;IACpC,OAAO,KAAK,CAAC,KAAK,CAAC,MAAM,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC;AACvC,CAAC;AASD,SAAgB,cAAc,CAAC,IAAY;IACzC,IAAI,CAAC,IAAI,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC;QAAE,OAAO,IAAI,CAAC;IAG9C,MAAM,UAAU,GAAG,iBAAiB,CAAC,IAAI,CAAC,CAAC;IAG3C,MAAM,KAAK,GAAG,UAAU,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;IACpC,OAAO,KAAK,CAAC,CAAC,CAAC,IAAI,IAAI,CAAC;AAC1B,CAAC;AAKD,SAAgB,UAAU,CAAC,IAAY;IACrC,MAAM,UAAU,GAAG,iBAAiB,CAAC,IAAI,CAAC,CAAC;IAC3C,OAAO,UAAU,CAAC,UAAU,CAAC,aAAa,CAAC,CAAC;AAC9C,CAAC;AAKD,SAAgB,eAAe,CAAC,IAAY;IAC1C,MAAM,UAAU,GAAG,iBAAiB,CAAC,IAAI,CAAC,CAAC;IAC3C,OAAO,UAAU,CAAC,UAAU,CAAC,kBAAkB,CAAC,CAAC;AACnD,CAAC;AAMD,SAAgB,qBAAqB,CAAC,IAAY;IAChD,IAAI,CAAC,IAAI,IAAI,OAAO,IAAI,KAAK,QAAQ;QAAE,OAAO,KAAK,CAAC;IAGpD,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC;QAAE,OAAO,KAAK,CAAC;IAEtC,MAAM,KAAK,GAAG,IAAI,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;IAG9B,IAAI,KAAK,CAAC,MAAM,KAAK,CAAC;QAAE,OAAO,KAAK,CAAC;IAGrC,OAAO,KAAK,CAAC,CAAC,CAAC,CAAC,MAAM,GAAG,CAAC,IAAI,KAAK,CAAC,CAAC,CAAC,CAAC,MAAM,GAAG,CAAC,CAAC;AACpD,CAAC;AAUD,SAAgB,qBAAqB,CAAC,IAAY;IAChD,MAAM,UAAU,GAAa,EAAE,CAAC;IAGhC,IAAI,
IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC,EAAE,CAAC;QACvB,UAAU,CAAC,IAAI,CAAC,iBAAiB,CAAC,IAAI,CAAC,CAAC,CAAC;QAGzC,MAAM,UAAU,GAAG,iBAAiB,CAAC,IAAI,CAAC,CAAC;QAC3C,IAAI,UAAU,CAAC,UAAU,CAAC,aAAa,CAAC,EAAE,CAAC;YACzC,UAAU,CAAC,IAAI,CAAC,mBAAmB,CAAC,UAAU,EAAE,MAAM,CAAC,CAAC,CAAC;QAC3D,CAAC;aAAM,IAAI,UAAU,CAAC,UAAU,CAAC,kBAAkB,CAAC,EAAE,CAAC;YACrD,UAAU,CAAC,IAAI,CAAC,mBAAmB,CAAC,UAAU,EAAE,WAAW,CAAC,CAAC,CAAC;QAChE,CAAC;IACH,CAAC;SAAM,CAAC;QAEN,UAAU,CAAC,IAAI,CAAC,cAAc,IAAI,EAAE,CAAC,CAAC;QACtC,UAAU,CAAC,IAAI,CAAC,kBAAkB,IAAI,EAAE,CAAC,CAAC;QAC1C,UAAU,CAAC,IAAI,CAAC,mBAAmB,IAAI,EAAE,CAAC,CAAC;QAC3C,UAAU,CAAC,IAAI,CAAC,4BAA4B,IAAI,EAAE,CAAC,CAAC;IACtD,CAAC;IAGD,OAAO,CAAC,GAAG,IAAI,GAAG,CAAC,UAAU,CAAC,CAAC,CAAC;AAClC,CAAC;AAkBD,SAAgB,aAAa,CAAC,QAAgB;IAC5C,MAAM,UAAU,GAAG,iBAAiB,CAAC,QAAQ,CAAC,CAAC;IAC/C,MAAM,SAAS,GAAG,UAAU,CAAC,WAAW,EAAE,CAAC;IAG3C,IAAI,SAAS,CAAC,QAAQ,CAAC,SAAS,CAAC,EAAE,CAAC;QAClC,OAAO,IAAI,CAAC;IACd,CAAC;IAGD,IAAI,SAAS,CAAC,QAAQ,CAAC,SAAS,CAAC,IAAI,CAAC,SAAS,CAAC,QAAQ,CAAC,SAAS,CAAC,EAAE,CAAC;QACpE,OAAO,IAAI,CAAC;IACd,CAAC;IAGD,MAAM,gBAAgB,GAAG;QACvB,kBAAkB;QAClB,0BAA0B;QAC1B,wBAAwB;KACzB,CAAC;IAEF,OAAO,gBAAgB,CAAC,QAAQ,CAAC,UAAU,CAAC,CAAC;AAC/C,CAAC;AAqBD,SAAgB,oBAAoB,CAAC,QAAgB;IAEnD,OAAO,aAAa,CAAC,QAAQ,CAAC,CAAC;AACjC,CAAC;AAQD,SAAgB,yBAAyB,CAAC,QAAgB;IACxD,MAAM,UAAU,GAAG,iBAAiB,CAAC,QAAQ,CAAC,CAAC;IAC/C,MAAM,SAAS,GAAG,UAAU,CAAC,WAAW,EAAE,CAAC;IAE3C,IAAI,SAAS,CAAC,QAAQ,CAAC,iBAAiB,CAAC,EAAE,CAAC;QAC1C,OAAO,uDAAuD,CAAC;IACjE,CAAC;IAED,IAAI,SAAS,CAAC,QAAQ,CAAC,SAAS,CAAC,EAAE,CAAC;QAClC,OAAO,iCAAiC,CAAC;IAC3C,CAAC;IAED,IAAI,SAAS,CAAC,QAAQ,CAAC,UAAU,CAAC,IAAI,SAAS,CAAC,QAAQ,CAAC,MAAM,CAAC,EAAE,CAAC;QACjE,OAAO,+BAA+B,CAAC;IACzC,CAAC;IAED,IAAI,SAAS,CAAC,QAAQ,CAAC,QAAQ,CAAC,IAAI,UAAU,KAAK,kBAAkB,EAAE,CAAC;QACtE,OAAO,mCAAmC,CAAC;IAC7C,CAAC;IAED,IAAI,SAAS,CAAC,QAAQ,CAAC,OAAO,CAAC,IAAI,SAAS,CAAC,QAAQ,CAAC,MAAM,CAAC,IAAI,SAAS,CAAC,QAAQ,CAAC,OAAO,CAAC,EAAE,CAAC;QAC7F,OAAO,yBAAyB,CAAC;IACnC,CAAC;IAED,IAAI,SAAS,CAAC,QAAQ,CAAC,MAAM,CAAC,EAAE,CAAC;QAC/B,OAAO,iCAAiC,CAAC;IAC3C,CAA
C;IAED,IAAI,SAAS,CAAC,QAAQ,CAAC,SAAS,CAAC,EAAE,CAAC;QAClC,OAAO,uBAAuB,CAAC;IACjC,CAAC;IAED,OAAO,sBAAsB,CAAC;AAChC,CAAC"}

package-lock.json (generated): 5333 changed lines

File diff suppressed because it is too large

View File

@@ -1,6 +1,6 @@
 {
   "name": "n8n-mcp",
-  "version": "2.40.1",
+  "version": "2.36.2",
   "description": "Integration between n8n workflow automation and Model Context Protocol (MCP)",
   "main": "dist/index.js",
   "types": "dist/index.d.ts",
@@ -153,16 +153,16 @@
   },
   "dependencies": {
     "@modelcontextprotocol/sdk": "^1.27.1",
-    "@n8n/n8n-nodes-langchain": "^2.12.0",
+    "@n8n/n8n-nodes-langchain": "^2.11.2",
     "@supabase/supabase-js": "^2.57.4",
     "dotenv": "^16.5.0",
     "express": "^5.1.0",
     "express-rate-limit": "^7.1.5",
     "form-data": "^4.0.5",
     "lru-cache": "^11.2.1",
-    "n8n": "^2.12.3",
-    "n8n-core": "^2.12.0",
-    "n8n-workflow": "^2.12.0",
+    "n8n": "^2.11.4",
+    "n8n-core": "^2.11.1",
+    "n8n-workflow": "^2.11.1",
     "openai": "^4.77.0",
     "sql.js": "^1.13.0",
     "tslib": "^2.6.2",

View File

@@ -11,7 +11,6 @@
  * Issue: https://github.com/czlonkowski/n8n-mcp/issues/XXX
  */
-import path from 'path';
 import { DatabaseAdapter, createDatabaseAdapter } from './database-adapter';
 import { NodeRepository } from './node-repository';
 import { TemplateService } from '../templates/template-service';
@@ -44,26 +43,21 @@ let initializationPromise: Promise<SharedDatabaseState> | null = null;
  * @returns Shared database state with connection and services
  */
 export async function getSharedDatabase(dbPath: string): Promise<SharedDatabaseState> {
-  // Normalize to a canonical absolute path so that callers using different
-  // relative or join-based paths (e.g. "./data/nodes.db" vs an absolute path)
-  // resolve to the same string and do not trigger a false "different path" error.
-  const normalizedPath = dbPath === ':memory:' ? dbPath : path.resolve(dbPath);
   // If already initialized with the same path, increment ref count and return
-  if (sharedState && sharedState.initialized && sharedState.dbPath === normalizedPath) {
+  if (sharedState && sharedState.initialized && sharedState.dbPath === dbPath) {
     sharedState.refCount++;
     logger.debug('Reusing shared database connection', {
       refCount: sharedState.refCount,
-      dbPath: normalizedPath
+      dbPath
     });
     return sharedState;
   }
   // If already initialized with a DIFFERENT path, this is a configuration error
-  if (sharedState && sharedState.initialized && sharedState.dbPath !== normalizedPath) {
+  if (sharedState && sharedState.initialized && sharedState.dbPath !== dbPath) {
     logger.error('Attempted to initialize shared database with different path', {
       existingPath: sharedState.dbPath,
-      requestedPath: normalizedPath
+      requestedPath: dbPath
     });
     throw new Error(`Shared database already initialized with different path: ${sharedState.dbPath}`);
   }
@@ -75,7 +69,7 @@ export async function getSharedDatabase(dbPath: string): Promise<SharedDatabaseS
       state.refCount++;
       logger.debug('Reusing shared database (waited for init)', {
         refCount: state.refCount,
-        dbPath: normalizedPath
+        dbPath
       });
       return state;
     } catch (error) {
@@ -86,7 +80,7 @@ export async function getSharedDatabase(dbPath: string): Promise<SharedDatabaseS
   }
   // Start new initialization
-  initializationPromise = initializeSharedDatabase(normalizedPath);
+  initializationPromise = initializeSharedDatabase(dbPath);
   try {
     const state = await initializationPromise;
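The path canonicalization being changed in this hunk is small enough to sketch standalone (the helper name is hypothetical; the logic mirrors the normalization shown in the diff):

```typescript
import path from 'path';

// Hypothetical standalone sketch: map every caller-supplied database path
// to one canonical absolute string, so "./data/nodes.db" and its absolute
// form share a single cache key instead of triggering a false
// "different path" error. The ':memory:' sentinel passes through verbatim.
function canonicalDbPath(dbPath: string): string {
  return dbPath === ':memory:' ? dbPath : path.resolve(dbPath);
}

console.log(canonicalDbPath(':memory:'));
console.log(canonicalDbPath('./data/nodes.db')); // absolute path under cwd
```

Because `path.resolve` is relative to the process working directory, callers that run from different directories would still get different keys; the helper only unifies spellings of the same path, not paths from different cwds.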

View File

@@ -8,7 +8,7 @@ import {
   WebhookRequest,
   McpToolResponse,
   ExecutionFilterOptions,
-  ExecutionMode,
+  ExecutionMode
 } from '../types/n8n-api';
 import type { TriggerType, TestWorkflowInput } from '../triggers/types';
 import {
@@ -383,7 +383,6 @@ const createWorkflowSchema = z.object({
     executionTimeout: z.number().optional(),
     errorWorkflow: z.string().optional(),
   }).optional(),
-  projectId: z.string().optional(),
 });
 const updateWorkflowSchema = z.object({
@@ -520,17 +519,6 @@ export async function handleCreateWorkflow(args: unknown, context?: InstanceCont
     // Create workflow (n8n API expects node types in FULL form)
     const workflow = await client.createWorkflow(input);
-    // Defensive check: ensure the API returned a valid workflow with an ID
-    if (!workflow || !workflow.id) {
-      return {
-        success: false,
-        error: 'Workflow creation failed: n8n API returned an empty or invalid response. Verify your N8N_API_URL points to the correct /api/v1 endpoint and that the n8n instance supports workflow creation.',
-        details: {
-          response: workflow ? { keys: Object.keys(workflow) } : null
-        }
-      };
-    }
     // Track successful workflow creation
     telemetry.trackWorkflowCreation(workflow, true);
@@ -1975,7 +1963,7 @@ export async function handleDiagnostic(request: any, context?: InstanceContext):
     // Check which tools are available
     const documentationTools = 7; // Base documentation tools (after v2.26.0 consolidation)
-    const managementTools = apiConfigured ? 14 : 0; // Management tools requiring API (includes n8n_manage_datatable)
+    const managementTools = apiConfigured ? 13 : 0; // Management tools requiring API (includes n8n_deploy_template)
     const totalTools = documentationTools + managementTools;
     // Check npm version
@@ -2689,258 +2677,3 @@ export async function handleTriggerWebhookWorkflow(args: unknown, context?: Inst
   };
  }
 }
// ========================================================================
// Data Table Handlers
// ========================================================================
// Shared Zod schemas for data table operations
const dataTableFilterConditionSchema = z.object({
columnName: z.string().min(1),
condition: z.enum(['eq', 'neq', 'like', 'ilike', 'gt', 'gte', 'lt', 'lte']),
value: z.any(),
});
const dataTableFilterSchema = z.object({
type: z.enum(['and', 'or']).optional().default('and'),
filters: z.array(dataTableFilterConditionSchema).min(1, 'At least one filter condition is required'),
});
// Shared base schema for actions requiring a tableId
const tableIdSchema = z.object({
tableId: z.string().min(1, 'tableId is required'),
});
// Per-action Zod schemas
const createTableSchema = z.object({
name: z.string().min(1, 'Table name cannot be empty'),
columns: z.array(z.object({
name: z.string().min(1, 'Column name cannot be empty'),
type: z.enum(['string', 'number', 'boolean', 'date']).optional(),
})).optional(),
});
const listTablesSchema = z.object({
limit: z.number().min(1).max(100).optional(),
cursor: z.string().optional(),
});
const updateTableSchema = tableIdSchema.extend({
name: z.string().min(1, 'New table name cannot be empty'),
});
// MCP transports may serialize JSON objects/arrays as strings.
// Parse them back, but return the original value on failure so Zod reports a proper type error.
function tryParseJson(val: unknown): unknown {
if (typeof val !== 'string') return val;
try { return JSON.parse(val); } catch { return val; }
}
const coerceJsonArray = z.preprocess(tryParseJson, z.array(z.record(z.unknown())));
const coerceJsonObject = z.preprocess(tryParseJson, z.record(z.unknown()));
const coerceJsonFilter = z.preprocess(tryParseJson, dataTableFilterSchema);
const getRowsSchema = tableIdSchema.extend({
limit: z.number().min(1).max(100).optional(),
cursor: z.string().optional(),
filter: z.union([coerceJsonFilter, z.string()]).optional(),
sortBy: z.string().optional(),
search: z.string().optional(),
});
const insertRowsSchema = tableIdSchema.extend({
data: coerceJsonArray.pipe(z.array(z.record(z.unknown())).min(1, 'At least one row is required')),
returnType: z.enum(['count', 'id', 'all']).optional(),
});
// Shared schema for update/upsert (identical structure)
const mutateRowsSchema = tableIdSchema.extend({
filter: coerceJsonFilter,
data: coerceJsonObject,
returnData: z.boolean().optional(),
dryRun: z.boolean().optional(),
});
const deleteRowsSchema = tableIdSchema.extend({
filter: coerceJsonFilter,
returnData: z.boolean().optional(),
dryRun: z.boolean().optional(),
});
function handleDataTableError(error: unknown): McpToolResponse {
if (error instanceof z.ZodError) {
return { success: false, error: 'Invalid input', details: { errors: error.errors } };
}
if (error instanceof N8nApiError) {
return {
success: false,
error: getUserFriendlyErrorMessage(error),
code: error.code,
details: error.details as Record<string, unknown> | undefined,
};
}
return { success: false, error: error instanceof Error ? error.message : 'Unknown error occurred' };
}
export async function handleCreateTable(args: unknown, context?: InstanceContext): Promise<McpToolResponse> {
try {
const client = ensureApiConfigured(context);
const input = createTableSchema.parse(args);
const dataTable = await client.createDataTable(input);
if (!dataTable || !dataTable.id) {
return { success: false, error: 'Data table creation failed: n8n API returned an empty or invalid response' };
}
return {
success: true,
data: { id: dataTable.id, name: dataTable.name },
message: `Data table "${dataTable.name}" created with ID: ${dataTable.id}`,
};
} catch (error) {
return handleDataTableError(error);
}
}
export async function handleListTables(args: unknown, context?: InstanceContext): Promise<McpToolResponse> {
try {
const client = ensureApiConfigured(context);
const input = listTablesSchema.parse(args || {});
const result = await client.listDataTables(input);
return {
success: true,
data: {
tables: result.data,
count: result.data.length,
nextCursor: result.nextCursor || undefined,
},
};
} catch (error) {
return handleDataTableError(error);
}
}
export async function handleGetTable(args: unknown, context?: InstanceContext): Promise<McpToolResponse> {
try {
const client = ensureApiConfigured(context);
const { tableId } = tableIdSchema.parse(args);
const dataTable = await client.getDataTable(tableId);
return { success: true, data: dataTable };
} catch (error) {
return handleDataTableError(error);
}
}
export async function handleUpdateTable(args: unknown, context?: InstanceContext): Promise<McpToolResponse> {
try {
const client = ensureApiConfigured(context);
const { tableId, name } = updateTableSchema.parse(args);
const dataTable = await client.updateDataTable(tableId, { name });
return {
success: true,
data: dataTable,
message: `Data table renamed to "${dataTable.name}"`,
};
} catch (error) {
return handleDataTableError(error);
}
}
export async function handleDeleteTable(args: unknown, context?: InstanceContext): Promise<McpToolResponse> {
try {
const client = ensureApiConfigured(context);
const { tableId } = tableIdSchema.parse(args);
await client.deleteDataTable(tableId);
return { success: true, message: `Data table ${tableId} deleted successfully` };
} catch (error) {
return handleDataTableError(error);
}
}
export async function handleGetRows(args: unknown, context?: InstanceContext): Promise<McpToolResponse> {
try {
const client = ensureApiConfigured(context);
const { tableId, filter, sortBy, ...params } = getRowsSchema.parse(args);
const queryParams: Record<string, unknown> = { ...params };
if (filter) {
const filterStr = typeof filter === 'string' ? filter : JSON.stringify(filter);
queryParams.filter = encodeURIComponent(filterStr);
}
if (sortBy) {
queryParams.sortBy = encodeURIComponent(sortBy);
}
const result = await client.getDataTableRows(tableId, queryParams as any);
return {
success: true,
data: {
rows: result.data,
count: result.data.length,
nextCursor: result.nextCursor || undefined,
},
};
} catch (error) {
return handleDataTableError(error);
}
}
export async function handleInsertRows(args: unknown, context?: InstanceContext): Promise<McpToolResponse> {
try {
const client = ensureApiConfigured(context);
const { tableId, ...params } = insertRowsSchema.parse(args);
const result = await client.insertDataTableRows(tableId, params);
return {
success: true,
data: result,
message: `Rows inserted into data table ${tableId}`,
};
} catch (error) {
return handleDataTableError(error);
}
}
export async function handleUpdateRows(args: unknown, context?: InstanceContext): Promise<McpToolResponse> {
try {
const client = ensureApiConfigured(context);
const { tableId, ...params } = mutateRowsSchema.parse(args);
const result = await client.updateDataTableRows(tableId, params);
return {
success: true,
data: result,
message: params.dryRun ? 'Dry run: rows matched (no changes applied)' : 'Rows updated successfully',
};
} catch (error) {
return handleDataTableError(error);
}
}
export async function handleUpsertRows(args: unknown, context?: InstanceContext): Promise<McpToolResponse> {
try {
const client = ensureApiConfigured(context);
const { tableId, ...params } = mutateRowsSchema.parse(args);
const result = await client.upsertDataTableRow(tableId, params);
return {
success: true,
data: result,
message: params.dryRun ? 'Dry run: upsert previewed (no changes applied)' : 'Row upserted successfully',
};
} catch (error) {
return handleDataTableError(error);
}
}
export async function handleDeleteRows(args: unknown, context?: InstanceContext): Promise<McpToolResponse> {
try {
const client = ensureApiConfigured(context);
const { tableId, filter, ...params } = deleteRowsSchema.parse(args);
const queryParams = {
filter: encodeURIComponent(JSON.stringify(filter)),
...params,
};
const result = await client.deleteDataTableRows(tableId, queryParams as any);
return {
success: true,
data: result,
message: params.dryRun ? 'Dry run: rows matched for deletion (no changes applied)' : 'Rows deleted successfully',
};
} catch (error) {
return handleDataTableError(error);
}
}
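The `tryParseJson` helper in the removed block above is the core of the transport fix noted in the changelog (MCP transports may deliver structured params as JSON strings). Its round-trip contract can be exercised on its own, outside zod; this sketch repeats that logic so the behavior is visible:

```typescript
// Coerce values that an MCP transport may have serialized as JSON strings
// back into structured data. Non-strings and unparseable strings are
// returned unchanged, so downstream schema validation reports a normal
// type error instead of a JSON.parse crash.
function tryParseJson(val: unknown): unknown {
  if (typeof val !== 'string') return val;
  try { return JSON.parse(val); } catch { return val; }
}

console.log(tryParseJson('[{"name":"Alice"}]')); // parsed into an array
console.log(tryParseJson('{not json'));          // original string, unchanged
console.log(tryParseJson(42));                   // non-strings pass through
```

Wrapped in `z.preprocess(tryParseJson, schema)`, this lets the same schema accept both a real array/object and its stringified form, which is exactly what the `data`, `filter`, and row-mutation parameters above rely on.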

View File

@@ -5,7 +5,7 @@
 import { z } from 'zod';
 import { McpToolResponse } from '../types/n8n-api';
-import { WorkflowDiffRequest, WorkflowDiffOperation, WorkflowDiffValidationError } from '../types/workflow-diff';
+import { WorkflowDiffRequest, WorkflowDiffOperation } from '../types/workflow-diff';
 import { WorkflowDiffEngine } from '../services/workflow-diff-engine';
 import { getN8nApiClient } from './handlers-n8n-manager';
 import { N8nApiError, getUserFriendlyErrorMessage } from '../utils/n8n-errors';
@@ -31,11 +31,6 @@ function getValidator(repository: NodeRepository): WorkflowValidator {
   return cachedValidator;
 }
-// Operation types that identify nodes by nodeId/nodeName
-const NODE_TARGETING_OPERATIONS = new Set([
-  'updateNode', 'removeNode', 'moveNode', 'enableNode', 'disableNode'
-]);
 // Zod schema for the diff request
 const workflowDiffSchema = z.object({
   id: z.string(),
@@ -53,8 +48,8 @@ const workflowDiffSchema = z.object({
     target: z.string().optional(),
     from: z.string().optional(), // For rewireConnection
     to: z.string().optional(), // For rewireConnection
-    sourceOutput: z.union([z.string(), z.number()]).transform(String).optional(),
-    targetInput: z.union([z.string(), z.number()]).transform(String).optional(),
+    sourceOutput: z.string().optional(),
+    targetInput: z.string().optional(),
     sourceIndex: z.number().optional(),
     targetIndex: z.number().optional(),
     // Smart parameters (Phase 1 UX improvement)
@@ -68,25 +63,6 @@ const workflowDiffSchema = z.object({
     settings: z.any().optional(),
     name: z.string().optional(),
     tag: z.string().optional(),
-    // Transfer operation
-    destinationProjectId: z.string().min(1).optional(),
-    // Aliases: LLMs often use "id" instead of "nodeId" — accept both
-    id: z.string().optional(),
-  }).transform((op) => {
-    // Normalize common field aliases for node-targeting operations:
-    // - "name" → "nodeName" (LLMs confuse the updateName "name" field with node identification)
-    // - "id" → "nodeId" (natural alias)
-    if (NODE_TARGETING_OPERATIONS.has(op.type)) {
-      if (!op.nodeName && !op.nodeId && op.name) {
-        op.nodeName = op.name;
-        op.name = undefined;
-      }
-      if (!op.nodeId && op.id) {
-        op.nodeId = op.id;
-        op.id = undefined;
-      }
-    }
-    return op;
   })),
   validateOnly: z.boolean().optional(),
   continueOnError: z.boolean().optional(),
@@ -202,12 +178,11 @@ export async function handleUpdatePartialWorkflow(
       // Complete failure - return error
       return {
         success: false,
-        saved: false,
         error: 'Failed to apply diff operations',
-        operationsApplied: diffResult.operationsApplied,
         details: {
           errors: diffResult.errors,
           warnings: diffResult.warnings,
+          operationsApplied: diffResult.operationsApplied,
           applied: diffResult.applied,
           failed: diffResult.failed
         }
@@ -290,7 +265,6 @@ export async function handleUpdatePartialWorkflow(
       if (!skipValidation) {
         return {
           success: false,
-          saved: false,
           error: errorMessage,
           details: {
             errors: structureErrors,
@@ -299,7 +273,7 @@ export async function handleUpdatePartialWorkflow(
             applied: diffResult.applied,
             recoveryGuidance: recoverySteps,
             note: 'Operations were applied but created an invalid workflow structure. The workflow was NOT saved to n8n to prevent UI rendering errors.',
-            autoSanitizationNote: 'Auto-sanitization runs on modified nodes during updates to fix operator structures and add missing metadata. However, it cannot fix all issues (e.g., broken connections, branch mismatches). Use the recovery guidance above to resolve remaining issues.'
+            autoSanitizationNote: 'Auto-sanitization runs on all nodes during updates to fix operator structures and add missing metadata. However, it cannot fix all issues (e.g., broken connections, branch mismatches). Use the recovery guidance above to resolve remaining issues.'
           }
         };
       }
@@ -315,83 +289,6 @@ export async function handleUpdatePartialWorkflow(
     try {
       const updatedWorkflow = await client.updateWorkflow(input.id, diffResult.workflow!);
// Handle tag operations via dedicated API (#599)
let tagWarnings: string[] = [];
if (diffResult.tagsToAdd?.length || diffResult.tagsToRemove?.length) {
try {
// Get existing tags from the updated workflow
const existingTags: Array<{ id: string; name: string }> = Array.isArray(updatedWorkflow.tags)
? updatedWorkflow.tags.map((t: any) => typeof t === 'object' ? { id: t.id, name: t.name } : { id: '', name: t })
: [];
// Resolve tag names to IDs
const allTags = await client.listTags();
const tagMap = new Map<string, string>();
for (const t of allTags.data) {
if (t.id) tagMap.set(t.name.toLowerCase(), t.id);
}
// Create any tags that don't exist yet
for (const tagName of (diffResult.tagsToAdd || [])) {
if (!tagMap.has(tagName.toLowerCase())) {
try {
const newTag = await client.createTag({ name: tagName });
if (newTag.id) tagMap.set(tagName.toLowerCase(), newTag.id);
} catch (createErr) {
tagWarnings.push(`Failed to create tag "${tagName}": ${createErr instanceof Error ? createErr.message : 'Unknown error'}`);
}
}
}
// Compute final tag set — resolve string-type tags via tagMap
const currentTagIds = new Set<string>();
for (const et of existingTags) {
if (et.id) {
currentTagIds.add(et.id);
} else {
const resolved = tagMap.get(et.name.toLowerCase());
if (resolved) currentTagIds.add(resolved);
}
}
for (const tagName of (diffResult.tagsToAdd || [])) {
const tagId = tagMap.get(tagName.toLowerCase());
if (tagId) currentTagIds.add(tagId);
}
for (const tagName of (diffResult.tagsToRemove || [])) {
const tagId = tagMap.get(tagName.toLowerCase());
if (tagId) currentTagIds.delete(tagId);
}
// Update workflow tags via dedicated API
await client.updateWorkflowTags(input.id, Array.from(currentTagIds));
} catch (tagError) {
tagWarnings.push(`Tag update failed: ${tagError instanceof Error ? tagError.message : 'Unknown error'}`);
logger.warn('Tag operations failed (non-blocking)', tagError);
}
}
// Handle project transfer if requested (before activation so workflow is in target project first)
let transferMessage = '';
if (diffResult.transferToProjectId) {
try {
await client.transferWorkflow(input.id, diffResult.transferToProjectId);
transferMessage = ` Workflow transferred to project ${diffResult.transferToProjectId}.`;
} catch (transferError) {
logger.error('Failed to transfer workflow to project', transferError);
return {
success: false,
saved: true,
error: 'Workflow updated successfully but project transfer failed',
details: {
workflowUpdated: true,
transferError: transferError instanceof Error ? transferError.message : 'Unknown error'
}
};
}
}
// Handle activation/deactivation if requested // Handle activation/deactivation if requested
let finalWorkflow = updatedWorkflow; let finalWorkflow = updatedWorkflow;
let activationMessage = ''; let activationMessage = '';
@@ -422,7 +319,6 @@ export async function handleUpdatePartialWorkflow(
           logger.error('Failed to activate workflow after update', activationError);
           return {
             success: false,
-            saved: true,
             error: 'Workflow updated successfully but activation failed',
             details: {
               workflowUpdated: true,
@@ -438,7 +334,6 @@ export async function handleUpdatePartialWorkflow(
           logger.error('Failed to deactivate workflow after update', deactivationError);
           return {
             success: false,
-            saved: true,
             error: 'Workflow updated successfully but deactivation failed',
             details: {
               workflowUpdated: true,
@@ -468,7 +363,6 @@ export async function handleUpdatePartialWorkflow(
       return {
         success: true,
-        saved: true,
         data: {
           id: finalWorkflow.id,
           name: finalWorkflow.name,
@@ -476,12 +370,12 @@ export async function handleUpdatePartialWorkflow(
           nodeCount: finalWorkflow.nodes?.length || 0,
           operationsApplied: diffResult.operationsApplied
         },
-        message: `Workflow "${finalWorkflow.name}" updated successfully. Applied ${diffResult.operationsApplied} operations.${transferMessage}${activationMessage} Use n8n_get_workflow with mode 'structure' to verify current state.`,
+        message: `Workflow "${finalWorkflow.name}" updated successfully. Applied ${diffResult.operationsApplied} operations.${activationMessage} Use n8n_get_workflow with mode 'structure' to verify current state.`,
         details: {
           applied: diffResult.applied,
           failed: diffResult.failed,
           errors: diffResult.errors,
-          warnings: mergeWarnings(diffResult.warnings, tagWarnings)
+          warnings: diffResult.warnings
         }
       };
     } catch (error) {
@@ -519,9 +413,7 @@ export async function handleUpdatePartialWorkflow(
       return {
         success: false,
         error: 'Invalid input',
-        details: {
-          errors: error.errors.map(e => `${e.path.join('.')}: ${e.message}`)
-        }
+        details: { errors: error.errors }
       };
     }
@@ -533,21 +425,6 @@ export async function handleUpdatePartialWorkflow(
     }
   }
 }
-
-/**
- * Merge diff engine warnings with tag operation warnings into a single array.
- * Returns undefined when there are no warnings to keep the response clean.
- */
-function mergeWarnings(
-  diffWarnings: WorkflowDiffValidationError[] | undefined,
-  tagWarnings: string[]
-): WorkflowDiffValidationError[] | undefined {
-  const merged: WorkflowDiffValidationError[] = [
-    ...(diffWarnings || []),
-    ...tagWarnings.map(w => ({ operation: -1, message: w }))
-  ];
-  return merged.length > 0 ? merged : undefined;
-}
 /**
  * Infer intent from operations when not explicitly provided
  */
@@ -581,8 +458,6 @@ function inferIntentFromOperations(operations: any[]): string {
       return 'Activate workflow';
     case 'deactivateWorkflow':
       return 'Deactivate workflow';
-    case 'transferWorkflow':
-      return `Transfer workflow to project ${op.destinationProjectId || ''}`.trim();
     default:
       return `Workflow ${op.type}`;
   }


@@ -210,13 +210,6 @@ export class N8NDocumentationMCPServer {
      }
    });
-
-    // Attach a no-op catch handler to prevent Node.js from flagging this as an
-    // unhandled rejection in the interval between construction and the first
-    // await of this.initialized (via ensureInitialized). This does NOT suppress
-    // the error: the original this.initialized promise still rejects, and
-    // ensureInitialized() will re-throw it when awaited.
-    this.initialized.catch(() => {});

    logger.info('Initializing n8n Documentation MCP server');
    this.server = new Server(
@@ -1029,11 +1022,6 @@ export class N8NDocumentationMCPServer {
          ? { valid: true, errors: [] }
          : { valid: false, errors: [{ field: 'action', message: 'action is required' }] };
        break;
-      case 'n8n_manage_datatable':
-        validationResult = args.action
-          ? { valid: true, errors: [] }
-          : { valid: false, errors: [{ field: 'action', message: 'action is required' }] };
-        break;
      case 'n8n_deploy_template':
        // Requires templateId parameter
        validationResult = args.templateId !== undefined
@@ -1501,26 +1489,6 @@ export class N8NDocumentationMCPServer {
        if (!this.repository) throw new Error('Repository not initialized');
        return n8nHandlers.handleDeployTemplate(args, this.templateService, this.repository, this.instanceContext);
-      case 'n8n_manage_datatable': {
-        this.validateToolParams(name, args, ['action']);
-        const dtAction = args.action;
-        // Each handler validates its own inputs via Zod schemas
-        switch (dtAction) {
-          case 'createTable': return n8nHandlers.handleCreateTable(args, this.instanceContext);
-          case 'listTables': return n8nHandlers.handleListTables(args, this.instanceContext);
-          case 'getTable': return n8nHandlers.handleGetTable(args, this.instanceContext);
-          case 'updateTable': return n8nHandlers.handleUpdateTable(args, this.instanceContext);
-          case 'deleteTable': return n8nHandlers.handleDeleteTable(args, this.instanceContext);
-          case 'getRows': return n8nHandlers.handleGetRows(args, this.instanceContext);
-          case 'insertRows': return n8nHandlers.handleInsertRows(args, this.instanceContext);
-          case 'updateRows': return n8nHandlers.handleUpdateRows(args, this.instanceContext);
-          case 'upsertRows': return n8nHandlers.handleUpsertRows(args, this.instanceContext);
-          case 'deleteRows': return n8nHandlers.handleDeleteRows(args, this.instanceContext);
-          default:
-            throw new Error(`Unknown action: ${dtAction}. Valid actions: createTable, listTables, getTable, updateTable, deleteTable, getRows, insertRows, updateRows, upsertRows, deleteRows`);
-        }
-      }
      default:
        throw new Error(`Unknown tool: ${name}`);
    }


@@ -22,8 +22,7 @@ import {
   n8nTestWorkflowDoc,
   n8nExecutionsDoc,
   n8nWorkflowVersionsDoc,
-  n8nDeployTemplateDoc,
-  n8nManageDatatableDoc
+  n8nDeployTemplateDoc
 } from './workflow_management';

 // Combine all tool documentations into a single object
@@ -61,8 +60,7 @@ export const toolsDocumentation: Record<string, ToolDocumentation> = {
   n8n_test_workflow: n8nTestWorkflowDoc,
   n8n_executions: n8nExecutionsDoc,
   n8n_workflow_versions: n8nWorkflowVersionsDoc,
-  n8n_deploy_template: n8nDeployTemplateDoc,
-  n8n_manage_datatable: n8nManageDatatableDoc
+  n8n_deploy_template: n8nDeployTemplateDoc
 };

 // Re-export types


@@ -10,4 +10,3 @@ export { n8nTestWorkflowDoc } from './n8n-test-workflow';
 export { n8nExecutionsDoc } from './n8n-executions';
 export { n8nWorkflowVersionsDoc } from './n8n-workflow-versions';
 export { n8nDeployTemplateDoc } from './n8n-deploy-template';
-export { n8nManageDatatableDoc } from './n8n-manage-datatable';


@@ -1,109 +0,0 @@
-import { ToolDocumentation } from '../types';
-
-export const n8nManageDatatableDoc: ToolDocumentation = {
-  name: 'n8n_manage_datatable',
-  category: 'workflow_management',
-  essentials: {
-    description: 'Manage n8n data tables and rows. Unified tool for table CRUD and row operations with filtering, pagination, and dry-run support.',
-    keyParameters: ['action', 'tableId', 'name', 'data', 'filter'],
-    example: 'n8n_manage_datatable({action: "createTable", name: "Contacts", columns: [{name: "email", type: "string"}]})',
-    performance: 'Fast (100-500ms)',
-    tips: [
-      'Table actions: createTable, listTables, getTable, updateTable, deleteTable',
-      'Row actions: getRows, insertRows, updateRows, upsertRows, deleteRows',
-      'Use dryRun: true to preview update/upsert/delete before applying',
-      'Filter supports: eq, neq, like, ilike, gt, gte, lt, lte conditions',
-      'Use returnData: true to get affected rows back from update/upsert/delete',
-      'Requires n8n enterprise or cloud with data tables feature'
-    ]
-  },
-  full: {
-    description: `**Table Actions:**
-- **createTable**: Create a new data table with optional typed columns
-- **listTables**: List all data tables (paginated)
-- **getTable**: Get table details and column definitions by ID
-- **updateTable**: Rename an existing table
-- **deleteTable**: Permanently delete a table and all its rows
-
-**Row Actions:**
-- **getRows**: List rows with filtering, sorting, search, and pagination
-- **insertRows**: Insert one or more rows (bulk)
-- **updateRows**: Update rows matching a filter condition
-- **upsertRows**: Update matching row or insert if none match
-- **deleteRows**: Delete rows matching a filter condition (filter required)
-
-**Filter System:** Used in getRows, updateRows, upsertRows, deleteRows
-- Combine conditions with "and" (default) or "or"
-- Conditions: eq, neq, like, ilike, gt, gte, lt, lte
-- Example: {type: "and", filters: [{columnName: "status", condition: "eq", value: "active"}]}
-
-**Dry Run:** updateRows, upsertRows, and deleteRows support dryRun: true to preview changes without applying them.`,
-    parameters: {
-      action: { type: 'string', required: true, description: 'Operation to perform' },
-      tableId: { type: 'string', required: false, description: 'Data table ID (required for all except createTable and listTables)' },
-      name: { type: 'string', required: false, description: 'For createTable/updateTable: table name' },
-      columns: { type: 'array', required: false, description: 'For createTable: column definitions [{name, type?}]. Types: string, number, boolean, date' },
-      data: { type: 'array|object', required: false, description: 'For insertRows: array of row objects. For updateRows/upsertRows: object with column values' },
-      filter: { type: 'object', required: false, description: 'Filter: {type?: "and"|"or", filters: [{columnName, condition, value}]}' },
-      limit: { type: 'number', required: false, description: 'For listTables/getRows: max results (1-100)' },
-      cursor: { type: 'string', required: false, description: 'For listTables/getRows: pagination cursor' },
-      sortBy: { type: 'string', required: false, description: 'For getRows: "columnName:asc" or "columnName:desc"' },
-      search: { type: 'string', required: false, description: 'For getRows: full-text search across string columns' },
-      returnType: { type: 'string', required: false, description: 'For insertRows: "count" (default), "id", or "all"' },
-      returnData: { type: 'boolean', required: false, description: 'For updateRows/upsertRows/deleteRows: return affected rows (default: false)' },
-      dryRun: { type: 'boolean', required: false, description: 'For updateRows/upsertRows/deleteRows: preview without applying (default: false)' },
-    },
-    returns: `Depends on action:
-- createTable: {id, name}
-- listTables: {tables, count, nextCursor?}
-- getTable: Full table object with columns
-- updateTable: Updated table object
-- deleteTable: Success message
-- getRows: {rows, count, nextCursor?}
-- insertRows: Depends on returnType (count/ids/rows)
-- updateRows: Update result with optional rows
-- upsertRows: Upsert result with action type
-- deleteRows: Delete result with optional rows`,
-    examples: [
-      '// Create a table\nn8n_manage_datatable({action: "createTable", name: "Contacts", columns: [{name: "email", type: "string"}, {name: "score", type: "number"}]})',
-      '// List all tables\nn8n_manage_datatable({action: "listTables"})',
-      '// Get table details\nn8n_manage_datatable({action: "getTable", tableId: "dt-123"})',
-      '// Rename a table\nn8n_manage_datatable({action: "updateTable", tableId: "dt-123", name: "New Name"})',
-      '// Delete a table\nn8n_manage_datatable({action: "deleteTable", tableId: "dt-123"})',
-      '// Get rows with filter\nn8n_manage_datatable({action: "getRows", tableId: "dt-123", filter: {filters: [{columnName: "status", condition: "eq", value: "active"}]}, limit: 50})',
-      '// Search rows\nn8n_manage_datatable({action: "getRows", tableId: "dt-123", search: "john", sortBy: "name:asc"})',
-      '// Insert rows\nn8n_manage_datatable({action: "insertRows", tableId: "dt-123", data: [{email: "a@b.com", score: 10}], returnType: "all"})',
-      '// Update rows (dry run)\nn8n_manage_datatable({action: "updateRows", tableId: "dt-123", filter: {filters: [{columnName: "score", condition: "lt", value: 5}]}, data: {status: "inactive"}, dryRun: true})',
-      '// Upsert a row\nn8n_manage_datatable({action: "upsertRows", tableId: "dt-123", filter: {filters: [{columnName: "email", condition: "eq", value: "a@b.com"}]}, data: {score: 15}, returnData: true})',
-      '// Delete rows\nn8n_manage_datatable({action: "deleteRows", tableId: "dt-123", filter: {filters: [{columnName: "status", condition: "eq", value: "deleted"}]}})',
-    ],
-    useCases: [
-      'Persist structured workflow data across executions',
-      'Store and query lookup tables for workflow logic',
-      'Bulk insert records from external data sources',
-      'Conditionally update records matching criteria',
-      'Upsert to maintain unique records by key column',
-      'Clean up old or invalid rows with filtered delete',
-      'Preview changes with dryRun before modifying data',
-    ],
-    performance: 'Table operations: 50-300ms. Row operations: 100-500ms depending on data size and filters.',
-    bestPractices: [
-      'Define column types upfront for schema consistency',
-      'Use dryRun: true before bulk updates/deletes to verify filter correctness',
-      'Use returnType: "count" (default) for insertRows to minimize response size',
-      'Use filter with specific conditions to avoid unintended bulk operations',
-      'Use cursor-based pagination for large result sets',
-      'Use sortBy for deterministic row ordering',
-    ],
-    pitfalls: [
-      'Requires N8N_API_URL and N8N_API_KEY configured',
-      'Feature only available on n8n enterprise or cloud plans',
-      'deleteTable permanently deletes all rows — cannot be undone',
-      'deleteRows requires a filter — cannot delete all rows without one',
-      'Column types cannot be changed after table creation via API',
-      'updateTable can only rename the table (no column modifications via public API)',
-      'projectId cannot be set via the public API — use the n8n UI',
-    ],
-    relatedTools: ['n8n_create_workflow', 'n8n_list_workflows', 'n8n_health_check'],
-  },
-};


@@ -4,7 +4,7 @@ export const n8nUpdatePartialWorkflowDoc: ToolDocumentation = {
   name: 'n8n_update_partial_workflow',
   category: 'workflow_management',
   essentials: {
-    description: 'Update workflow incrementally with diff operations. Types: addNode, removeNode, updateNode, moveNode, enable/disableNode, addConnection, removeConnection, rewireConnection, cleanStaleConnections, replaceConnections, updateSettings, updateName, add/removeTag, activateWorkflow, deactivateWorkflow, transferWorkflow. Supports smart parameters (branch, case) for multi-output nodes. Full support for AI connections (ai_languageModel, ai_tool, ai_memory, ai_embedding, ai_vectorStore, ai_document, ai_textSplitter, ai_outputParser).',
+    description: 'Update workflow incrementally with diff operations. Types: addNode, removeNode, updateNode, moveNode, enable/disableNode, addConnection, removeConnection, rewireConnection, cleanStaleConnections, replaceConnections, updateSettings, updateName, add/removeTag, activateWorkflow, deactivateWorkflow. Supports smart parameters (branch, case) for multi-output nodes. Full support for AI connections (ai_languageModel, ai_tool, ai_memory, ai_embedding, ai_vectorStore, ai_document, ai_textSplitter, ai_outputParser).',
     keyParameters: ['id', 'operations', 'continueOnError'],
     example: 'n8n_update_partial_workflow({id: "wf_123", operations: [{type: "rewireConnection", source: "IF", from: "Old", to: "New", branch: "true"}]})',
     performance: 'Fast (50-200ms)',
@@ -22,8 +22,7 @@ export const n8nUpdatePartialWorkflowDoc: ToolDocumentation = {
       'Batch AI component connections for atomic updates',
       'Auto-sanitization: ALL nodes auto-fixed during updates (operator structures, missing metadata)',
       'Node renames automatically update all connection references - no manual connection operations needed',
-      'Activate/deactivate workflows: Use activateWorkflow/deactivateWorkflow operations (requires activatable triggers like webhook/schedule)',
-      'Transfer workflows between projects: Use transferWorkflow with destinationProjectId (enterprise feature)'
+      'Activate/deactivate workflows: Use activateWorkflow/deactivateWorkflow operations (requires activatable triggers like webhook/schedule)'
     ]
   },
   full: {
@@ -56,9 +55,6 @@ export const n8nUpdatePartialWorkflowDoc: ToolDocumentation = {
 - **activateWorkflow**: Activate the workflow to enable automatic execution via triggers
 - **deactivateWorkflow**: Deactivate the workflow to prevent automatic execution
-
-### Project Management Operations (1 type):
-- **transferWorkflow**: Transfer the workflow to a different project. Requires \`destinationProjectId\`. Enterprise/cloud feature.
 
 ## Smart Parameters for Multi-Output Nodes
 For **IF nodes**, use semantic 'branch' parameter instead of technical sourceIndex:
@@ -198,12 +194,12 @@ Please choose a different name.
 - Can rename a node and add/remove connections using the new name in the same batch
 - Use \`validateOnly: true\` to preview effects before applying
 
-## Removing Properties with null
+## Removing Properties with undefined
 
-To remove a property from a node, set its value to \`null\` in the updates object. This is essential when migrating from deprecated properties or cleaning up optional configuration fields.
+To remove a property from a node, set its value to \`undefined\` in the updates object. This is essential when migrating from deprecated properties or cleaning up optional configuration fields.
 
-### Why Use null?
-- **Property removal**: Setting a property to \`null\` removes it completely from the node object
+### Why Use undefined?
+- **Property removal vs. null**: Setting a property to \`undefined\` removes it completely from the node object, while \`null\` sets the property to a null value
 - **Validation constraints**: Some properties are mutually exclusive (e.g., \`continueOnFail\` and \`onError\`). Simply setting one without removing the other will fail validation
 - **Deprecated property migration**: When n8n deprecates properties, you must remove the old property before the new one will work
@@ -215,7 +211,7 @@ n8n_update_partial_workflow({
   operations: [{
     type: "updateNode",
     nodeName: "HTTP Request",
-    updates: { onError: null }
+    updates: { onError: undefined }
   }]
 });
@@ -225,7 +221,7 @@ n8n_update_partial_workflow({
   operations: [{
     type: "updateNode",
     nodeId: "node_abc",
-    updates: { disabled: null }
+    updates: { disabled: undefined }
   }]
 });
 \`\`\`
@@ -239,7 +235,7 @@ n8n_update_partial_workflow({
   operations: [{
     type: "updateNode",
     nodeName: "API Request",
-    updates: { "parameters.authentication": null }
+    updates: { "parameters.authentication": undefined }
   }]
 });
@@ -249,7 +245,7 @@ n8n_update_partial_workflow({
   operations: [{
     type: "updateNode",
     nodeName: "HTTP Request",
-    updates: { "parameters.headers": null }
+    updates: { "parameters.headers": undefined }
   }]
 });
 \`\`\`
@@ -275,7 +271,7 @@ n8n_update_partial_workflow({
     type: "updateNode",
     nodeName: "HTTP Request",
     updates: {
-      continueOnFail: null,
+      continueOnFail: undefined,
       onError: "continueErrorOutput"
     }
   }]
@@ -291,15 +287,15 @@ n8n_update_partial_workflow({
     type: "updateNode",
     nodeName: "Data Processor",
     updates: {
-      continueOnFail: null,
-      alwaysOutputData: null,
-      "parameters.legacy_option": null
+      continueOnFail: undefined,
+      alwaysOutputData: undefined,
+      "parameters.legacy_option": undefined
     }
   }]
 });
 \`\`\`
 
-### When to Use null
+### When to Use undefined
 - Removing deprecated properties during migration
 - Cleaning up optional configuration flags
 - Resolving mutual exclusivity validation errors
@@ -345,14 +341,11 @@ n8n_update_partial_workflow({
     '// Rewire AI Agent to use different language model\nn8n_update_partial_workflow({id: "ai9", operations: [{type: "rewireConnection", source: "AI Agent", from: "OpenAI Chat Model", to: "Anthropic Chat Model", sourceOutput: "ai_languageModel"}]})',
     '// Replace all AI tools for an agent\nn8n_update_partial_workflow({id: "ai10", operations: [\n  {type: "removeConnection", source: "Old Tool 1", target: "AI Agent", sourceOutput: "ai_tool"},\n  {type: "removeConnection", source: "Old Tool 2", target: "AI Agent", sourceOutput: "ai_tool"},\n  {type: "addConnection", source: "New HTTP Tool", target: "AI Agent", sourceOutput: "ai_tool"},\n  {type: "addConnection", source: "New Code Tool", target: "AI Agent", sourceOutput: "ai_tool"}\n]})',
     '\n// ============ REMOVING PROPERTIES EXAMPLES ============',
-    '// Remove a simple property\nn8n_update_partial_workflow({id: "rm1", operations: [{type: "updateNode", nodeName: "HTTP Request", updates: {onError: null}}]})',
-    '// Migrate from deprecated continueOnFail to onError\nn8n_update_partial_workflow({id: "rm2", operations: [{type: "updateNode", nodeName: "HTTP Request", updates: {continueOnFail: null, onError: "continueErrorOutput"}}]})',
-    '// Remove nested property\nn8n_update_partial_workflow({id: "rm3", operations: [{type: "updateNode", nodeName: "API Request", updates: {"parameters.authentication": null}}]})',
-    '// Remove multiple properties\nn8n_update_partial_workflow({id: "rm4", operations: [{type: "updateNode", nodeName: "Data Processor", updates: {continueOnFail: null, alwaysOutputData: null, "parameters.legacy_option": null}}]})',
-    '// Remove entire array property\nn8n_update_partial_workflow({id: "rm5", operations: [{type: "updateNode", nodeName: "HTTP Request", updates: {"parameters.headers": null}}]})',
-    '\n// ============ PROJECT TRANSFER EXAMPLES ============',
-    '// Transfer workflow to a different project\nn8n_update_partial_workflow({id: "tf1", operations: [{type: "transferWorkflow", destinationProjectId: "project-abc-123"}]})',
-    '// Transfer and activate in one call\nn8n_update_partial_workflow({id: "tf2", operations: [{type: "transferWorkflow", destinationProjectId: "project-abc-123"}, {type: "activateWorkflow"}]})'
+    '// Remove a simple property\nn8n_update_partial_workflow({id: "rm1", operations: [{type: "updateNode", nodeName: "HTTP Request", updates: {onError: undefined}}]})',
+    '// Migrate from deprecated continueOnFail to onError\nn8n_update_partial_workflow({id: "rm2", operations: [{type: "updateNode", nodeName: "HTTP Request", updates: {continueOnFail: undefined, onError: "continueErrorOutput"}}]})',
+    '// Remove nested property\nn8n_update_partial_workflow({id: "rm3", operations: [{type: "updateNode", nodeName: "API Request", updates: {"parameters.authentication": undefined}}]})',
+    '// Remove multiple properties\nn8n_update_partial_workflow({id: "rm4", operations: [{type: "updateNode", nodeName: "Data Processor", updates: {continueOnFail: undefined, alwaysOutputData: undefined, "parameters.legacy_option": undefined}}]})',
+    '// Remove entire array property\nn8n_update_partial_workflow({id: "rm5", operations: [{type: "updateNode", nodeName: "HTTP Request", updates: {"parameters.headers": undefined}}]})'
   ],
   useCases: [
     'Rewire connections when replacing nodes',
@@ -370,8 +363,7 @@ n8n_update_partial_workflow({
     'Add fallback language models to AI Agents',
     'Configure Vector Store retrieval systems',
     'Swap language models in existing AI workflows',
-    'Batch-update AI tool connections',
-    'Transfer workflows between team projects (enterprise)'
+    'Batch-update AI tool connections'
   ],
   performance: 'Very fast - typically 50-200ms. Much faster than full updates as only changes are processed.',
   bestPractices: [
@@ -391,9 +383,9 @@ n8n_update_partial_workflow({
     'Use targetIndex for fallback models (primary=0, fallback=1)',
     'Batch AI component connections in a single operation for atomicity',
     'Validate AI workflows after connection changes to catch configuration errors',
-    'To remove properties, set them to null in the updates object',
+    'To remove properties, set them to undefined (not null) in the updates object',
     'When migrating from deprecated properties, remove the old property and add the new one in the same operation',
-    'Use null to resolve mutual exclusivity validation errors between properties',
+    'Use undefined to resolve mutual exclusivity validation errors between properties',
     'Batch multiple property removals in a single updateNode operation for efficiency'
   ],
   pitfalls: [
@@ -415,8 +407,8 @@ n8n_update_partial_workflow({
     '**Auto-sanitization runs on ALL nodes**: When ANY update is made, ALL nodes in the workflow are sanitized (not just modified ones)',
     '**Auto-sanitization cannot fix everything**: It fixes operator structures and missing metadata, but cannot fix broken connections or branch mismatches',
     '**Corrupted workflows beyond repair**: Workflows in paradoxical states (API returns corrupt, API rejects updates) cannot be fixed via API - must be recreated',
-    'To remove a property, set it to null in the updates object',
-    'When properties are mutually exclusive (e.g., continueOnFail and onError), setting only the new property will fail - you must remove the old one with null',
+    'Setting a property to null does NOT remove it - use undefined instead',
+    'When properties are mutually exclusive (e.g., continueOnFail and onError), setting only the new property will fail - you must remove the old one with undefined',
     'Removing a required property may cause validation errors - check node documentation first',
     'Nested property removal with dot notation only removes the specific nested field, not the entire parent object',
     'Array index notation (e.g., "parameters.headers[0]") is not supported - remove the entire array property instead'

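The undefined-vs-null removal semantics contrasted in the hunk above can be exercised with a small standalone merge helper. This is a hypothetical sketch, not the project's implementation; it assumes the convention that `undefined` removes a property while `null` is stored as an ordinary value:

```typescript
// Hypothetical merge helper illustrating the undefined-as-removal
// convention discussed above. Not the project's actual implementation.
function applyUpdates(
  params: Record<string, unknown>,
  updates: Record<string, unknown>
): Record<string, unknown> {
  const result = { ...params };
  for (const [key, value] of Object.entries(updates)) {
    if (value === undefined) {
      delete result[key]; // undefined is the removal sentinel
    } else {
      result[key] = value; // null is stored, not treated as removal
    }
  }
  return result;
}

const updated = applyUpdates({ a: 1, b: 2, c: 3 }, { b: undefined, c: null });
// 'b' is removed; 'c' is kept with the value null
```

One caveat worth noting: `undefined` does not survive `JSON.stringify`, so any transport that serializes arguments to JSON needs extra care with this convention.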
View File

@@ -63,10 +63,6 @@ export const n8nManagementTools: ToolDefinition[] = [
 executionTimeout: { type: 'number' },
 errorWorkflow: { type: 'string' }
 }
-},
-projectId: {
-type: 'string',
-description: 'Optional project ID to create the workflow in (enterprise feature)'
 }
 },
 required: ['name', 'nodes', 'connections']
@@ -147,7 +143,7 @@ export const n8nManagementTools: ToolDefinition[] = [
 },
 {
 name: 'n8n_update_partial_workflow',
-description: `Update workflow incrementally with diff operations. Types: addNode, removeNode, updateNode, moveNode, enable/disableNode, addConnection, removeConnection, updateSettings, updateName, add/removeTag, activate/deactivateWorkflow, transferWorkflow. See tools_documentation("n8n_update_partial_workflow", "full") for details.`,
+description: `Update workflow incrementally with diff operations. Types: addNode, removeNode, updateNode, moveNode, enable/disableNode, addConnection, removeConnection, updateSettings, updateName, add/removeTag. See tools_documentation("n8n_update_partial_workflow", "full") for details.`,
 inputSchema: {
 type: 'object',
 additionalProperties: true, // Allow any extra properties Claude Desktop might add
@@ -606,52 +602,5 @@ export const n8nManagementTools: ToolDefinition[] = [
 destructiveHint: false,
 openWorldHint: true,
 },
-},
-{
-name: 'n8n_manage_datatable',
-description: `Manage n8n data tables and rows. Actions: createTable, listTables, getTable, updateTable, deleteTable, getRows, insertRows, updateRows, upsertRows, deleteRows. Requires n8n enterprise/cloud with data tables feature.`,
-inputSchema: {
-type: 'object',
-properties: {
-action: {
-type: 'string',
-enum: ['createTable', 'listTables', 'getTable', 'updateTable', 'deleteTable', 'getRows', 'insertRows', 'updateRows', 'upsertRows', 'deleteRows'],
-description: 'Operation to perform',
-},
-tableId: { type: 'string', description: 'Data table ID (required for all actions except createTable and listTables)' },
-name: { type: 'string', description: 'For createTable/updateTable: table name' },
-columns: {
-type: 'array',
-description: 'For createTable: column definitions',
-items: {
-type: 'object',
-properties: {
-name: { type: 'string' },
-type: { type: 'string', enum: ['string', 'number', 'boolean', 'date'] },
-},
-required: ['name'],
-},
-},
-data: { description: 'For insertRows: array of row objects. For updateRows/upsertRows: object with column values.' },
-filter: {
-type: 'object',
-description: 'For getRows/updateRows/upsertRows/deleteRows: {type?: "and"|"or", filters: [{columnName, condition, value}]}',
-},
-limit: { type: 'number', description: 'For listTables/getRows: max results (1-100)' },
-cursor: { type: 'string', description: 'For listTables/getRows: pagination cursor' },
-sortBy: { type: 'string', description: 'For getRows: "columnName:asc" or "columnName:desc"' },
-search: { type: 'string', description: 'For getRows: text search across string columns' },
-returnType: { type: 'string', enum: ['count', 'id', 'all'], description: 'For insertRows: what to return (default: count)' },
-returnData: { type: 'boolean', description: 'For updateRows/upsertRows/deleteRows: return affected rows (default: false)' },
-dryRun: { type: 'boolean', description: 'For updateRows/upsertRows/deleteRows: preview without applying (default: false)' },
-},
-required: ['action'],
-},
-annotations: {
-title: 'Manage Data Tables',
-readOnlyHint: false,
-destructiveHint: true,
-openWorldHint: true,
-},
-},
+}
 ];

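The `data` and `filter` parameters of the data-table tool above are the ones the Unreleased changelog entry says can arrive as JSON strings over the MCP transport. The project's fix wraps the schemas with `z.preprocess` coercers; a standalone sketch of the underlying idea (the helper name is hypothetical):

```typescript
// Hypothetical coercer: parse string-encoded JSON back into a structured
// value, leaving everything else untouched. The actual fix applies this
// kind of step via z.preprocess on the array/object/filter params.
function coerceJson(value: unknown): unknown {
  if (typeof value !== 'string') return value;
  try {
    return JSON.parse(value);
  } catch {
    return value; // not JSON: leave the string as-is
  }
}

coerceJson('[{"name":"Ada"}]'); // parsed into an array of objects
coerceJson([{ name: 'Ada' }]);  // already structured, passed through
```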
View File

@@ -28,21 +28,6 @@ export interface WorkflowNode {
 typeVersion?: number;
 }
-/**
- * Get tool description from node, checking all possible property locations.
- * Different n8n tool types store descriptions in different places:
- * - toolDescription: HTTP Request Tool, Vector Store Tool
- * - description: Workflow Tool, Code Tool, AI Agent Tool
- * - options.description: SerpApi, Wikipedia, SearXNG
- */
-function getToolDescription(node: WorkflowNode): string | undefined {
-return (
-node.parameters.toolDescription ||
-node.parameters.description ||
-node.parameters.options?.description
-);
-}
 export interface WorkflowJson {
 name?: string;
 nodes: WorkflowNode[];
@@ -73,7 +58,7 @@ export function validateHTTPRequestTool(node: WorkflowNode): ValidationIssue[] {
 const issues: ValidationIssue[] = [];
 // 1. Check toolDescription (REQUIRED)
-if (!getToolDescription(node)) {
+if (!node.parameters.toolDescription) {
 issues.push({
 severity: 'error',
 nodeId: node.id,
@@ -81,7 +66,7 @@ export function validateHTTPRequestTool(node: WorkflowNode): ValidationIssue[] {
 message: `HTTP Request Tool "${node.name}" has no toolDescription. Add a clear description to help the LLM know when to use this API.`,
 code: 'MISSING_TOOL_DESCRIPTION'
 });
-} else if (getToolDescription(node)!.trim().length < MIN_DESCRIPTION_LENGTH_MEDIUM) {
+} else if (node.parameters.toolDescription.trim().length < MIN_DESCRIPTION_LENGTH_MEDIUM) {
 issues.push({
 severity: 'warning',
 nodeId: node.id,
@@ -229,8 +214,8 @@ export function validateHTTPRequestTool(node: WorkflowNode): ValidationIssue[] {
 export function validateCodeTool(node: WorkflowNode): ValidationIssue[] {
 const issues: ValidationIssue[] = [];
-// 1. Check toolDescription (REQUIRED) - check all possible locations
-if (!getToolDescription(node)) {
+// 1. Check toolDescription (REQUIRED)
+if (!node.parameters.toolDescription) {
 issues.push({
 severity: 'error',
 nodeId: node.id,
@@ -276,7 +261,7 @@ export function validateVectorStoreTool(
 const issues: ValidationIssue[] = [];
 // 1. Check toolDescription (REQUIRED)
-if (!getToolDescription(node)) {
+if (!node.parameters.toolDescription) {
 issues.push({
 severity: 'error',
 nodeId: node.id,
@@ -317,7 +302,7 @@ export function validateWorkflowTool(node: WorkflowNode, reverseConnections?: Ma
 const issues: ValidationIssue[] = [];
 // 1. Check toolDescription (REQUIRED)
-if (!getToolDescription(node)) {
+if (!node.parameters.toolDescription) {
 issues.push({
 severity: 'error',
 nodeId: node.id,
@@ -352,7 +337,7 @@ export function validateAIAgentTool(
 const issues: ValidationIssue[] = [];
 // 1. Check toolDescription (REQUIRED)
-if (!getToolDescription(node)) {
+if (!node.parameters.toolDescription) {
 issues.push({
 severity: 'error',
 nodeId: node.id,
@@ -393,7 +378,7 @@ export function validateMCPClientTool(node: WorkflowNode): ValidationIssue[] {
 const issues: ValidationIssue[] = [];
 // 1. Check toolDescription (REQUIRED)
-if (!getToolDescription(node)) {
+if (!node.parameters.toolDescription) {
 issues.push({
 severity: 'error',
 nodeId: node.id,
@@ -421,14 +406,20 @@
 * 7-8. Simple Tools (Calculator, Think) Validators
 * From spec lines 1868-2009
 */
-export function validateCalculatorTool(_node: WorkflowNode): ValidationIssue[] {
-// Calculator Tool has a built-in description - no validation needed
-return [];
+export function validateCalculatorTool(node: WorkflowNode): ValidationIssue[] {
+const issues: ValidationIssue[] = [];
+// Calculator Tool has a built-in description and is self-explanatory
+// toolDescription is optional - no validation needed
+return issues;
 }
-export function validateThinkTool(_node: WorkflowNode): ValidationIssue[] {
-// Think Tool has a built-in description - no validation needed
-return [];
+export function validateThinkTool(node: WorkflowNode): ValidationIssue[] {
+const issues: ValidationIssue[] = [];
+// Think Tool has a built-in description and is self-explanatory
+// toolDescription is optional - no validation needed
+return issues;
 }
 /**
@@ -439,7 +430,7 @@ export function validateSerpApiTool(node: WorkflowNode): ValidationIssue[] {
 const issues: ValidationIssue[] = [];
 // 1. Check toolDescription (REQUIRED)
-if (!getToolDescription(node)) {
+if (!node.parameters.toolDescription) {
 issues.push({
 severity: 'error',
 nodeId: node.id,
@@ -466,7 +457,7 @@ export function validateWikipediaTool(node: WorkflowNode): ValidationIssue[] {
 const issues: ValidationIssue[] = [];
 // 1. Check toolDescription (REQUIRED)
-if (!getToolDescription(node)) {
+if (!node.parameters.toolDescription) {
 issues.push({
 severity: 'error',
 nodeId: node.id,
@@ -496,7 +487,7 @@ export function validateSearXngTool(node: WorkflowNode): ValidationIssue[] {
 const issues: ValidationIssue[] = [];
 // 1. Check toolDescription (REQUIRED)
-if (!getToolDescription(node)) {
+if (!node.parameters.toolDescription) {
 issues.push({
 severity: 'error',
 nodeId: node.id,
@@ -535,7 +526,7 @@ export function validateWolframAlphaTool(node: WorkflowNode): ValidationIssue[]
 }
 // 2. Check description (INFO)
-if (!getToolDescription(node)) {
+if (!node.parameters.description && !node.parameters.toolDescription) {
 issues.push({
 severity: 'info',
 nodeId: node.id,

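The `getToolDescription` helper shown in this file's diff checks three locations in a fixed order. A self-contained sketch (types simplified for illustration) makes the fallback order easy to verify in isolation:

```typescript
// Simplified re-statement of the fallback chain from the diff above:
// toolDescription, then description, then options.description.
interface ToolNode {
  parameters: {
    toolDescription?: string;
    description?: string;
    options?: { description?: string };
  };
}

function getToolDescription(node: ToolNode): string | undefined {
  return (
    node.parameters.toolDescription ||
    node.parameters.description ||
    node.parameters.options?.description
  );
}

getToolDescription({ parameters: { options: { description: 'search the web' } } });
// → 'search the web'
```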
View File

@@ -22,15 +22,6 @@ import {
 SourceControlStatus,
 SourceControlPullResult,
 SourceControlPushResult,
-DataTable,
-DataTableColumn,
-DataTableListParams,
-DataTableRow,
-DataTableRowListParams,
-DataTableInsertRowsParams,
-DataTableUpdateRowsParams,
-DataTableUpsertRowParams,
-DataTableDeleteRowsParams,
 } from '../types/n8n-api';
 import { handleN8nApiError, logN8nError } from '../utils/n8n-errors';
 import { cleanWorkflowForCreate, cleanWorkflowForUpdate } from './n8n-validation';
@@ -261,17 +252,9 @@ export class N8nApiClient {
 }
 }
-async transferWorkflow(id: string, destinationProjectId: string): Promise<void> {
-try {
-await this.client.put(`/workflows/${id}/transfer`, { destinationProjectId });
-} catch (error) {
-throw handleN8nApiError(error);
-}
-}
 async activateWorkflow(id: string): Promise<Workflow> {
 try {
-const response = await this.client.post(`/workflows/${id}/activate`, {});
+const response = await this.client.post(`/workflows/${id}/activate`);
 return response.data;
 } catch (error) {
 throw handleN8nApiError(error);
@@ -280,7 +263,7 @@
 async deactivateWorkflow(id: string): Promise<Workflow> {
 try {
-const response = await this.client.post(`/workflows/${id}/deactivate`, {});
+const response = await this.client.post(`/workflows/${id}/deactivate`);
 return response.data;
 } catch (error) {
 throw handleN8nApiError(error);
@@ -510,15 +493,6 @@
 }
 }
-async updateWorkflowTags(workflowId: string, tagIds: string[]): Promise<Tag[]> {
-try {
-const response = await this.client.put(`/workflows/${workflowId}/tags`, tagIds.filter(id => id).map(id => ({ id })));
-return response.data;
-} catch (error) {
-throw handleN8nApiError(error);
-}
-}
 // Source Control Management (Enterprise feature)
 async getSourceControlStatus(): Promise<SourceControlStatus> {
 try {
@@ -591,95 +565,6 @@
 }
 }
-async createDataTable(params: { name: string; columns?: DataTableColumn[] }): Promise<DataTable> {
-try {
-const response = await this.client.post('/data-tables', params);
-return response.data;
-} catch (error) {
-throw handleN8nApiError(error);
-}
-}
-async listDataTables(params: DataTableListParams = {}): Promise<{ data: DataTable[]; nextCursor?: string | null }> {
-try {
-const response = await this.client.get('/data-tables', { params });
-return this.validateListResponse<DataTable>(response.data, 'data-tables');
-} catch (error) {
-throw handleN8nApiError(error);
-}
-}
-async getDataTable(id: string): Promise<DataTable> {
-try {
-const response = await this.client.get(`/data-tables/${id}`);
-return response.data;
-} catch (error) {
-throw handleN8nApiError(error);
-}
-}
-async updateDataTable(id: string, params: { name: string }): Promise<DataTable> {
-try {
-const response = await this.client.patch(`/data-tables/${id}`, params);
-return response.data;
-} catch (error) {
-throw handleN8nApiError(error);
-}
-}
-async deleteDataTable(id: string): Promise<void> {
-try {
-await this.client.delete(`/data-tables/${id}`);
-} catch (error) {
-throw handleN8nApiError(error);
-}
-}
-async getDataTableRows(id: string, params: DataTableRowListParams = {}): Promise<{ data: DataTableRow[]; nextCursor?: string | null }> {
-try {
-const response = await this.client.get(`/data-tables/${id}/rows`, { params });
-return this.validateListResponse<DataTableRow>(response.data, 'data-table-rows');
-} catch (error) {
-throw handleN8nApiError(error);
-}
-}
-async insertDataTableRows(id: string, params: DataTableInsertRowsParams): Promise<any> {
-try {
-const response = await this.client.post(`/data-tables/${id}/rows`, params);
-return response.data;
-} catch (error) {
-throw handleN8nApiError(error);
-}
-}
-async updateDataTableRows(id: string, params: DataTableUpdateRowsParams): Promise<any> {
-try {
-const response = await this.client.patch(`/data-tables/${id}/rows/update`, params);
-return response.data;
-} catch (error) {
-throw handleN8nApiError(error);
-}
-}
-async upsertDataTableRow(id: string, params: DataTableUpsertRowParams): Promise<any> {
-try {
-const response = await this.client.post(`/data-tables/${id}/rows/upsert`, params);
-return response.data;
-} catch (error) {
-throw handleN8nApiError(error);
-}
-}
-async deleteDataTableRows(id: string, params: DataTableDeleteRowsParams): Promise<any> {
-try {
-const response = await this.client.delete(`/data-tables/${id}/rows/delete`, { params });
-return response.data;
-} catch (error) {
-throw handleN8nApiError(error);
-}
-}
 /**
 * Validates and normalizes n8n API list responses.
 * Handles both modern format {data: [], nextCursor?: string} and legacy array format.

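The data-table row methods above all follow the same URL pattern. As a sketch, the request that `insertDataTableRows` issues can be assembled like this (the `baseUrl` and payload values are illustrative assumptions, not taken from the project):

```typescript
// Sketch only: assembles the request shape implied by
// insertDataTableRows in the diff (POST /data-tables/{id}/rows).
function buildInsertRowsRequest(
  baseUrl: string,
  tableId: string,
  rows: Array<Record<string, unknown>>
) {
  return {
    method: 'POST' as const,
    url: `${baseUrl}/data-tables/${tableId}/rows`,
    body: { data: rows, returnType: 'count' }, // returnType per the tool schema
  };
}

const req = buildInsertRowsRequest('https://n8n.local/api/v1', 'tbl_1', [{ name: 'Ada' }]);
// req.url: 'https://n8n.local/api/v1/data-tables/tbl_1/rows'
```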
View File

@@ -49,7 +49,7 @@ export const workflowConnectionSchema = z.record(
 ai_memory: connectionArraySchema.optional(),
 ai_embedding: connectionArraySchema.optional(),
 ai_vectorStore: connectionArraySchema.optional(),
-}).catchall(connectionArraySchema) // Allow additional AI connection types (ai_outputParser, ai_document, ai_textSplitter, etc.)
+})
 );
 export const workflowSettingsSchema = z.object({
@@ -248,15 +248,15 @@ export function validateWorkflowStructure(workflow: Partial<Workflow>): string[]
 const connectedNodes = new Set<string>();
 // Collect all nodes that appear in connections (as source or target)
-// Iterate over ALL connection types present in the data — not a hardcoded list —
-// so that every AI connection type (ai_outputParser, ai_document, ai_textSplitter,
-// ai_agent, ai_chain, ai_retriever, etc.) is covered automatically.
+// Check ALL connection types, not just 'main' - AI workflows use ai_tool, ai_languageModel, etc.
+const ALL_CONNECTION_TYPES = ['main', 'error', 'ai_tool', 'ai_languageModel', 'ai_memory', 'ai_embedding', 'ai_vectorStore'] as const;
 Object.entries(workflow.connections).forEach(([sourceName, connection]) => {
 connectedNodes.add(sourceName); // Node has outgoing connection
-// Check every connection type key present on this source node
-const connectionRecord = connection as Record<string, unknown>;
-Object.values(connectionRecord).forEach((connData) => {
+// Check all connection types for target nodes
+ALL_CONNECTION_TYPES.forEach(connType => {
+const connData = (connection as Record<string, unknown>)[connType];
 if (connData && Array.isArray(connData)) {
 connData.forEach((outputs) => {
 if (Array.isArray(outputs)) {
@@ -429,29 +429,24 @@ export function validateWorkflowStructure(workflow: Partial<Workflow>): string[]
 }
 }
-// Check all connection types (main, error, ai_tool, ai_languageModel, etc.)
-const connectionRecord = connection as Record<string, unknown>;
-Object.values(connectionRecord).forEach((connData) => {
-if (connData && Array.isArray(connData)) {
-connData.forEach((outputs: any, outputIndex: number) => {
-if (Array.isArray(outputs)) {
-outputs.forEach((target: any, targetIndex: number) => {
-if (!target?.node) return;
-// Check if target exists by name (correct)
-if (!nodeNames.has(target.node)) {
-// Check if they're using an ID instead of name
-if (nodeIds.has(target.node)) {
-const correctName = nodeIdToName.get(target.node);
-errors.push(`Connection target uses node ID '${target.node}' but must use node name '${correctName}' (from ${sourceName}[${outputIndex}][${targetIndex}])`);
-} else {
-errors.push(`Connection references non-existent target node: ${target.node} (from ${sourceName}[${outputIndex}][${targetIndex}])`);
-}
-}
-});
-}
-});
-}
-});
+if (connection.main && Array.isArray(connection.main)) {
+connection.main.forEach((outputs, outputIndex) => {
+if (Array.isArray(outputs)) {
+outputs.forEach((target, targetIndex) => {
+// Check if target exists by name (correct)
+if (!nodeNames.has(target.node)) {
+// Check if they're using an ID instead of name
+if (nodeIds.has(target.node)) {
+const correctName = nodeIdToName.get(target.node);
+errors.push(`Connection target uses node ID '${target.node}' but must use node name '${correctName}' (from ${sourceName}[${outputIndex}][${targetIndex}])`);
+} else {
+errors.push(`Connection references non-existent target node: ${target.node} (from ${sourceName}[${outputIndex}][${targetIndex}])`);
+}
+}
+});
+}
+});
+}
 });
 }

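The dynamic traversal variant in this file (iterating every connection-type key that is present rather than a hardcoded list) can be sketched standalone; types are simplified and defensive guards omitted for brevity:

```typescript
// Sketch: collect every node referenced in connections by walking
// whatever connection-type keys are present (main, ai_tool,
// ai_outputParser, ...), so new AI types are covered automatically.
type Connections = Record<string, Record<string, Array<Array<{ node: string }>>>>;

function collectConnectedNodes(connections: Connections): Set<string> {
  const connected = new Set<string>();
  for (const [sourceName, byType] of Object.entries(connections)) {
    connected.add(sourceName); // node has an outgoing connection
    for (const outputs of Object.values(byType)) {
      for (const targets of outputs) {
        for (const target of targets) connected.add(target.node);
      }
    }
  }
  return connected;
}

const nodes = collectConnectedNodes({
  Agent: { ai_tool: [[{ node: 'HTTP Tool' }]], main: [[{ node: 'Set' }]] },
});
// nodes contains 'Agent', 'HTTP Tool', and 'Set'
```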
View File

@@ -41,7 +41,7 @@ export function sanitizeWorkflowNodes(workflow: any): any {
 return {
 ...workflow,
-nodes: workflow.nodes.map(sanitizeNode)
+nodes: workflow.nodes.map((node: any) => sanitizeNode(node))
 };
 }
@@ -121,7 +121,9 @@ function sanitizeFilterConditions(conditions: any): any {
 // Sanitize conditions array
 if (sanitized.conditions && Array.isArray(sanitized.conditions)) {
-sanitized.conditions = sanitized.conditions.map(sanitizeCondition);
+sanitized.conditions = sanitized.conditions.map((condition: any) =>
+sanitizeCondition(condition)
+);
 }
 return sanitized;
@@ -212,25 +214,18 @@ function inferDataType(operation: string): string {
 return 'boolean';
 }
-// Number operations (partial match to catch variants like "greaterThan" containing "gt")
+// Number operations
 const numberOps = ['isNumeric', 'gt', 'gte', 'lt', 'lte'];
 if (numberOps.some(op => operation.includes(op))) {
 return 'number';
 }
-// Date operations (partial match to catch variants like "isAfter" containing "after")
+// Date operations
 const dateOps = ['after', 'before', 'afterDate', 'beforeDate'];
 if (dateOps.some(op => operation.includes(op))) {
 return 'dateTime';
 }
-// Object operations: empty/notEmpty/exists/notExists are generic object-level checks
-// (distinct from isEmpty/isNotEmpty which are boolean-typed operations)
-const objectOps = ['empty', 'notEmpty', 'exists', 'notExists'];
-if (objectOps.includes(operation)) {
-return 'object';
-}
 // Default to string
 return 'string';
 }
@@ -244,11 +239,7 @@ function isUnaryOperator(operation: string): boolean {
 'isNotEmpty',
 'true',
 'false',
-'isNumeric',
-'empty',
-'notEmpty',
-'exists',
-'notExists'
+'isNumeric'
 ];
 return unaryOps.includes(operation);
 }

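The operator-to-dataType inference in this file can be condensed into a standalone sketch. Note that the boolean branch shown here is an assumption inferred from the isEmpty/isNotEmpty unary operators listed above, since the diff does not show that branch in full:

```typescript
// Simplified sketch of inferDataType, including the object-level
// operations variant (empty/notEmpty/exists/notExists) from the diff.
function inferDataType(operation: string): string {
  if (['isEmpty', 'isNotEmpty', 'true', 'false'].includes(operation)) return 'boolean';
  if (['isNumeric', 'gt', 'gte', 'lt', 'lte'].some(op => operation.includes(op))) return 'number';
  if (['after', 'before', 'afterDate', 'beforeDate'].some(op => operation.includes(op))) return 'dateTime';
  if (['empty', 'notEmpty', 'exists', 'notExists'].includes(operation)) return 'object';
  return 'string'; // default
}

inferDataType('exists');    // → 'object'
inferDataType('afterDate'); // → 'dateTime'
inferDataType('gte');       // → 'number'
```

The partial-match `includes` checks are what let variants like "greaterThan" (containing "gt") or "isAfter" (containing "after") resolve without being listed explicitly.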
View File

@@ -1374,11 +1374,8 @@ export class NodeSpecificValidators {
 });
 }
-// Skip primitive return check when helper functions are present,
-// since we can't distinguish top-level vs nested returns without AST.
-// Matches: function name(), const/let/var name = [async] function/arrow
-const hasHelperFunctions = /(?:function\s+\w+\s*\(|(?:const|let|var)\s+\w+\s*=\s*(?:async\s+)?(?:function|\([^)]*\)\s*=>|\w+\s*=>))/.test(code);
-if (!hasHelperFunctions && /return\s+(true|false|null|undefined|\d+|['"`])/m.test(code)) {
+// Check for primitive return
+if (/return\s+(true|false|null|undefined|\d+|['"`])/m.test(code)) {
 errors.push({
 type: 'invalid_value',
 property: 'jsCode',
@@ -1490,7 +1487,7 @@
 // Check for common variable mistakes
 if (language === 'javaScript') {
 // Using $ without proper variable
-if (/\$(?![a-zA-Z_(])/.test(code) && !code.includes('${')) {
+if (/\$(?![a-zA-Z])/.test(code) && !code.includes('${')) {
 warnings.push({
 type: 'best_practice',
 message: 'Invalid $ usage detected',

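The two lone-$ lookahead variants in the second hunk differ only in whether `_` and `(` are allowed after `$`; the difference is easy to see in isolation:

```typescript
// The narrower class allows only letters after $, so the valid n8n
// call form $('Node') is flagged; the wider class also permits $_ and $(.
const narrow = /\$(?![a-zA-Z])/;
const wide = /\$(?![a-zA-Z_(])/;

narrow.test("$('Set').item"); // → true (would flag this valid call)
wide.test("$('Set').item");   // → false
narrow.test('$json.name');    // → false
wide.test('$json.name');      // → false
```

In the validator itself this regex is combined with `!code.includes('${')` so that template-literal interpolation is never flagged by either variant.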
View File

@@ -671,13 +671,11 @@ export class WorkflowAutoFixer {
 filteredFixes: FixOperation[],
 allFixes: FixOperation[]
 ): WorkflowDiffOperation[] {
-// fixedNodes contains node names (FixOperation.node = node.name)
 const fixedNodes = new Set(filteredFixes.map(f => f.node));
 const hasConnectionFixes = filteredFixes.some(f => CONNECTION_FIX_TYPES.includes(f.type));
 return operations.filter(op => {
 if (op.type === 'updateNode') {
-// Check both nodeName and nodeId — operations may use either
-return fixedNodes.has(op.nodeName || '') || fixedNodes.has(op.nodeId || '');
+return fixedNodes.has(op.nodeId || '');
 }
 if (op.type === 'replaceConnections') {
 return hasConnectionFixes;
@@ -1188,11 +1186,10 @@
 description: `Upgrade ${node.name} from v${currentVersion} to v${latestVersion}. ${analysis.reason}`
 });
-// Create update operation — both nodeId and nodeName needed for fix filtering
+// Create update operation
 const operation: UpdateNodeOperation = {
 type: 'updateNode',
 nodeId: node.id,
-nodeName: node.name,
 updates: {
 typeVersion: parseFloat(latestVersion),
 parameters: migrationResult.updatedNode.parameters,

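The two filtering rules contrasted in the first hunk (nodeId only, versus nodeName or nodeId) can be compared with a minimal sketch; types are simplified for illustration:

```typescript
// Sketch of the broader filtering rule from the diff: keep an
// updateNode operation if EITHER its nodeName or nodeId matches a
// fixed node, since operations may be keyed by either identifier.
interface Op { type: string; nodeId?: string; nodeName?: string }

function keepOperation(op: Op, fixedNodes: Set<string>): boolean {
  if (op.type !== 'updateNode') return true; // other op types handled elsewhere
  return fixedNodes.has(op.nodeName ?? '') || fixedNodes.has(op.nodeId ?? '');
}

keepOperation({ type: 'updateNode', nodeId: 'abc' }, new Set(['abc']));   // → true
keepOperation({ type: 'updateNode', nodeName: 'Set' }, new Set(['Set'])); // → true
keepOperation({ type: 'updateNode', nodeName: 'Set' }, new Set(['abc'])); // → false
```

The nodeId-only variant returns false in the second case, which is exactly the gap the broader check closes when FixOperation.node holds node names.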
View File

@@ -28,8 +28,7 @@ import {
ActivateWorkflowOperation, ActivateWorkflowOperation,
DeactivateWorkflowOperation, DeactivateWorkflowOperation,
CleanStaleConnectionsOperation, CleanStaleConnectionsOperation,
ReplaceConnectionsOperation, ReplaceConnectionsOperation
TransferWorkflowOperation
} from '../types/workflow-diff'; } from '../types/workflow-diff';
import { Workflow, WorkflowNode, WorkflowConnection } from '../types/n8n-api'; import { Workflow, WorkflowNode, WorkflowConnection } from '../types/n8n-api';
import { Logger } from '../utils/logger'; import { Logger } from '../utils/logger';
@@ -39,24 +38,11 @@ import { isActivatableTrigger } from '../utils/node-type-utils';
const logger = new Logger({ prefix: '[WorkflowDiffEngine]' }); const logger = new Logger({ prefix: '[WorkflowDiffEngine]' });
/**
* Not safe for concurrent use — create a new instance per request.
* Instance state is reset at the start of each applyDiff() call.
*/
export class WorkflowDiffEngine { export class WorkflowDiffEngine {
// Track node name changes during operations for connection reference updates // Track node name changes during operations for connection reference updates
private renameMap: Map<string, string> = new Map(); private renameMap: Map<string, string> = new Map();
// Track warnings during operation processing // Track warnings during operation processing
private warnings: WorkflowDiffValidationError[] = []; private warnings: WorkflowDiffValidationError[] = [];
// Track which nodes were added/updated so sanitization only runs on them
private modifiedNodeIds = new Set<string>();
// Track removed node names for better error messages
private removedNodeNames = new Set<string>();
// Track tag operations for dedicated API calls
private tagsToAdd: string[] = [];
private tagsToRemove: string[] = [];
// Track transfer operation for dedicated API call
private transferToProjectId: string | undefined;
/** /**
* Apply diff operations to a workflow * Apply diff operations to a workflow
@@ -69,11 +55,6 @@ export class WorkflowDiffEngine {
// Reset tracking for this diff operation // Reset tracking for this diff operation
this.renameMap.clear(); this.renameMap.clear();
this.warnings = []; this.warnings = [];
this.modifiedNodeIds.clear();
this.removedNodeNames.clear();
this.tagsToAdd = [];
this.tagsToRemove = [];
this.transferToProjectId = undefined;
// Clone workflow to avoid modifying original // Clone workflow to avoid modifying original
const workflowCopy = JSON.parse(JSON.stringify(workflow)); const workflowCopy = JSON.parse(JSON.stringify(workflow));
@@ -145,12 +126,6 @@ export class WorkflowDiffEngine {
}; };
} }
// Extract and clean up activation flags (same as atomic mode)
const shouldActivate = (workflowCopy as any)._shouldActivate === true;
const shouldDeactivate = (workflowCopy as any)._shouldDeactivate === true;
delete (workflowCopy as any)._shouldActivate;
delete (workflowCopy as any)._shouldDeactivate;
const success = appliedIndices.length > 0; const success = appliedIndices.length > 0;
return { return {
success, success,
@@ -160,12 +135,7 @@ export class WorkflowDiffEngine {
errors: errors.length > 0 ? errors : undefined, errors: errors.length > 0 ? errors : undefined,
warnings: this.warnings.length > 0 ? this.warnings : undefined, warnings: this.warnings.length > 0 ? this.warnings : undefined,
applied: appliedIndices, applied: appliedIndices,
failed: failedIndices, failed: failedIndices
shouldActivate: shouldActivate || undefined,
shouldDeactivate: shouldDeactivate || undefined,
tagsToAdd: this.tagsToAdd.length > 0 ? this.tagsToAdd : undefined,
tagsToRemove: this.tagsToRemove.length > 0 ? this.tagsToRemove : undefined,
transferToProjectId: this.transferToProjectId || undefined
}; };
} else { } else {
// Atomic mode: all operations must succeed // Atomic mode: all operations must succeed
@@ -231,16 +201,12 @@ export class WorkflowDiffEngine {
} }
} }
// Sanitize only modified nodes to avoid breaking unrelated nodes (#592) // Sanitize ALL nodes in the workflow after operations are applied
if (this.modifiedNodeIds.size > 0) { // This ensures existing invalid nodes (e.g., binary operators with singleValue: true)
workflowCopy.nodes = workflowCopy.nodes.map((node: WorkflowNode) => { // are fixed automatically when any update is made to the workflow
if (this.modifiedNodeIds.has(node.id)) { workflowCopy.nodes = workflowCopy.nodes.map((node: WorkflowNode) => sanitizeNode(node));
return sanitizeNode(node);
} logger.debug('Applied full-workflow sanitization to all nodes');
return node;
});
logger.debug(`Sanitized ${this.modifiedNodeIds.size} modified nodes`);
}
// If validateOnly flag is set, return success without applying // If validateOnly flag is set, return success without applying
if (request.validateOnly) { if (request.validateOnly) {
@@ -267,10 +233,7 @@ export class WorkflowDiffEngine {
         message: `Successfully applied ${operationsApplied} operations (${nodeOperations.length} node ops, ${otherOperations.length} other ops)`,
         warnings: this.warnings.length > 0 ? this.warnings : undefined,
         shouldActivate: shouldActivate || undefined,
-        shouldDeactivate: shouldDeactivate || undefined,
-        tagsToAdd: this.tagsToAdd.length > 0 ? this.tagsToAdd : undefined,
-        tagsToRemove: this.tagsToRemove.length > 0 ? this.tagsToRemove : undefined,
-        transferToProjectId: this.transferToProjectId || undefined
+        shouldDeactivate: shouldDeactivate || undefined
       };
     }
   } catch (error) {
@@ -285,6 +248,7 @@ export class WorkflowDiffEngine {
     }
   }
   /**
    * Validate a single operation
    */
@@ -312,8 +276,6 @@ export class WorkflowDiffEngine {
       case 'addTag':
       case 'removeTag':
         return null; // These are always valid
-      case 'transferWorkflow':
-        return this.validateTransferWorkflow(workflow, operation as TransferWorkflowOperation);
       case 'activateWorkflow':
         return this.validateActivateWorkflow(workflow, operation);
       case 'deactivateWorkflow':
@@ -383,9 +345,6 @@ export class WorkflowDiffEngine {
       case 'replaceConnections':
         this.applyReplaceConnections(workflow, operation);
         break;
-      case 'transferWorkflow':
-        this.applyTransferWorkflow(workflow, operation as TransferWorkflowOperation);
-        break;
     }
   }
@@ -446,7 +405,7 @@ export class WorkflowDiffEngine {
     // Check for missing required parameter
     if (!operation.updates) {
-      return `Missing required parameter 'updates'. The updateNode operation requires an 'updates' object. Correct structure: {type: "updateNode", nodeId: "abc-123" OR nodeName: "My Node", updates: {name: "New Name", "parameters.url": "https://example.com"}}`;
+      return `Missing required parameter 'updates'. The updateNode operation requires an 'updates' object containing properties to modify. Example: {type: "updateNode", nodeId: "abc", updates: {name: "New Name"}}`;
     }
     const node = this.findNode(workflow, operation.nodeId, operation.nodeName);
@@ -551,18 +510,12 @@ export class WorkflowDiffEngine {
     const targetNode = this.findNode(workflow, operation.target, operation.target);
     if (!sourceNode) {
-      if (this.removedNodeNames.has(operation.source)) {
-        return `Source node "${operation.source}" was already removed by a prior removeNode operation. Its connections were automatically cleaned up — no separate removeConnection needed.`;
-      }
       const availableNodes = workflow.nodes
         .map(n => `"${n.name}" (id: ${n.id.substring(0, 8)}...)`)
         .join(', ');
       return `Source node not found: "${operation.source}". Available nodes: ${availableNodes}. Tip: Use node ID for names with special characters.`;
     }
     if (!targetNode) {
-      if (this.removedNodeNames.has(operation.target)) {
-        return `Target node "${operation.target}" was already removed by a prior removeNode operation. Its connections were automatically cleaned up — no separate removeConnection needed.`;
-      }
       const availableNodes = workflow.nodes
         .map(n => `"${n.name}" (id: ${n.id.substring(0, 8)}...)`)
         .join(', ');
@@ -661,16 +614,13 @@ export class WorkflowDiffEngine {
     // Sanitize node to ensure complete metadata (filter options, operator structure, etc.)
     const sanitizedNode = sanitizeNode(newNode);
-    this.modifiedNodeIds.add(sanitizedNode.id);
     workflow.nodes.push(sanitizedNode);
   }
   private applyRemoveNode(workflow: Workflow, operation: RemoveNodeOperation): void {
     const node = this.findNode(workflow, operation.nodeId, operation.nodeName);
     if (!node) return;
-    this.removedNodeNames.add(node.name);
     // Remove node from array
     const index = workflow.nodes.findIndex(n => n.id === node.id);
     if (index !== -1) {
@@ -681,36 +631,30 @@ export class WorkflowDiffEngine {
     delete workflow.connections[node.name];
     // Remove all connections to this node
-    for (const [sourceName, sourceConnections] of Object.entries(workflow.connections)) {
-      for (const [outputName, outputConns] of Object.entries(sourceConnections)) {
-        sourceConnections[outputName] = outputConns.map(connections =>
+    Object.keys(workflow.connections).forEach(sourceName => {
+      const sourceConnections = workflow.connections[sourceName];
+      Object.keys(sourceConnections).forEach(outputName => {
+        sourceConnections[outputName] = sourceConnections[outputName].map(connections =>
           connections.filter(conn => conn.node !== node.name)
-        );
-        // Trim trailing empty arrays only (preserve intermediate empty arrays for positional indices)
-        const trimmed = sourceConnections[outputName];
-        while (trimmed.length > 0 && trimmed[trimmed.length - 1].length === 0) {
-          trimmed.pop();
-        }
-        if (trimmed.length === 0) {
+        ).filter(connections => connections.length > 0);
+        // Clean up empty arrays
+        if (sourceConnections[outputName].length === 0) {
           delete sourceConnections[outputName];
         }
-      }
+      });
       // Clean up empty connection objects
       if (Object.keys(sourceConnections).length === 0) {
         delete workflow.connections[sourceName];
       }
-    }
+    });
   }
   private applyUpdateNode(workflow: Workflow, operation: UpdateNodeOperation): void {
     const node = this.findNode(workflow, operation.nodeId, operation.nodeName);
     if (!node) return;
-    this.modifiedNodeIds.add(node.id);
     // Track node renames for connection reference updates
     if (operation.updates.name && operation.updates.name !== node.name) {
       const oldName = node.name;
@@ -762,18 +706,10 @@ export class WorkflowDiffEngine {
   ): { sourceOutput: string; sourceIndex: number } {
     const sourceNode = this.findNode(workflow, operation.source, operation.source);
-    // Start with explicit values or defaults, coercing to correct types
-    let sourceOutput = String(operation.sourceOutput ?? 'main');
+    // Start with explicit values or defaults
+    let sourceOutput = operation.sourceOutput ?? 'main';
     let sourceIndex = operation.sourceIndex ?? 0;
-    // Remap numeric sourceOutput (e.g., "0", "1") to "main" with sourceIndex (#537)
-    // Skip when smart parameters (branch, case) are present — they take precedence
-    if (/^\d+$/.test(sourceOutput) && operation.sourceIndex === undefined
-        && operation.branch === undefined && operation.case === undefined) {
-      sourceIndex = parseInt(sourceOutput, 10);
-      sourceOutput = 'main';
-    }
     // Smart parameter: branch (for IF nodes)
     // IF nodes use 'main' output with index 0 (true) or 1 (false)
     if (operation.branch !== undefined && operation.sourceIndex === undefined) {
@@ -822,8 +758,7 @@ export class WorkflowDiffEngine {
     // Use nullish coalescing to properly handle explicit 0 values
     // Default targetInput to sourceOutput to preserve connection type for AI connections (ai_tool, ai_memory, etc.)
-    // Coerce to string to handle numeric values passed as sourceOutput/targetInput
-    const targetInput = String(operation.targetInput ?? sourceOutput);
+    const targetInput = operation.targetInput ?? sourceOutput;
     const targetIndex = operation.targetIndex ?? 0;
     // Initialize source node connections object
@@ -860,14 +795,18 @@ export class WorkflowDiffEngine {
   private applyRemoveConnection(workflow: Workflow, operation: RemoveConnectionOperation): void {
     const sourceNode = this.findNode(workflow, operation.source, operation.source);
     const targetNode = this.findNode(workflow, operation.target, operation.target);
-    // If ignoreErrors is true, silently succeed even if nodes don't exist
     if (!sourceNode || !targetNode) {
-      return;
+      if (operation.ignoreErrors) {
+        return; // Gracefully handle missing nodes
+      }
+      return; // Should never reach here if validation passed, but safety check
     }
-    const sourceOutput = String(operation.sourceOutput ?? 'main');
+    const sourceOutput = operation.sourceOutput || 'main';
     const connections = workflow.connections[sourceNode.name]?.[sourceOutput];
     if (!connections) return;
     // Remove connection from all indices
     workflow.connections[sourceNode.name][sourceOutput] = connections.map(conns =>
       conns.filter(conn => conn.node !== targetNode.name)
@@ -938,26 +877,20 @@ export class WorkflowDiffEngine {
   }
   private applyAddTag(workflow: Workflow, operation: AddTagOperation): void {
-    // Track for dedicated API call instead of modifying workflow.tags directly
-    // Reconcile: if previously marked for removal, cancel the removal instead
-    const removeIdx = this.tagsToRemove.indexOf(operation.tag);
-    if (removeIdx !== -1) {
-      this.tagsToRemove.splice(removeIdx, 1);
+    if (!workflow.tags) {
+      workflow.tags = [];
     }
-    if (!this.tagsToAdd.includes(operation.tag)) {
-      this.tagsToAdd.push(operation.tag);
+    if (!workflow.tags.includes(operation.tag)) {
+      workflow.tags.push(operation.tag);
     }
   }
   private applyRemoveTag(workflow: Workflow, operation: RemoveTagOperation): void {
-    // Track for dedicated API call instead of modifying workflow.tags directly
-    // Reconcile: if previously marked for addition, cancel the addition instead
-    const addIdx = this.tagsToAdd.indexOf(operation.tag);
-    if (addIdx !== -1) {
-      this.tagsToAdd.splice(addIdx, 1);
-    }
-    if (!this.tagsToRemove.includes(operation.tag)) {
-      this.tagsToRemove.push(operation.tag);
+    if (!workflow.tags) return;
+    const index = workflow.tags.indexOf(operation.tag);
+    if (index !== -1) {
+      workflow.tags.splice(index, 1);
     }
   }
@@ -994,18 +927,6 @@ export class WorkflowDiffEngine {
     (workflow as any)._shouldDeactivate = true;
   }
-  // Transfer operation — uses dedicated API call (PUT /workflows/{id}/transfer)
-  private validateTransferWorkflow(_workflow: Workflow, operation: TransferWorkflowOperation): string | null {
-    if (!operation.destinationProjectId) {
-      return 'transferWorkflow requires a non-empty destinationProjectId string';
-    }
-    return null;
-  }
-  private applyTransferWorkflow(_workflow: Workflow, operation: TransferWorkflowOperation): void {
-    this.transferToProjectId = operation.destinationProjectId;
-  }
   // Connection cleanup operation validators
   private validateCleanStaleConnections(workflow: Workflow, operation: CleanStaleConnectionsOperation): string | null {
     // This operation is always valid - it just cleans up what it finds
@@ -1094,12 +1015,7 @@ export class WorkflowDiffEngine {
         }
         return true;
       })
-    );
-    // Trim trailing empty arrays only (preserve intermediate for positional indices)
-    while (filteredConnections.length > 0 && filteredConnections[filteredConnections.length - 1].length === 0) {
-      filteredConnections.pop();
-    }
+    ).filter(conns => conns.length > 0);
     if (filteredConnections.length === 0) {
       delete outputs[outputName];
@@ -1159,10 +1075,9 @@ export class WorkflowDiffEngine {
         const connection = connectionsAtIndex[connIndex];
         // Check if target node was renamed
         if (renames.has(connection.node)) {
-          const oldTargetName = connection.node;
           const newTargetName = renames.get(connection.node)!;
           connection.node = newTargetName;
-          logger.debug(`Updated connection: ${sourceName}[${outputType}][${outputIndex}][${connIndex}].node: "${oldTargetName}" → "${newTargetName}"`);
+          logger.debug(`Updated connection: ${sourceName}[${outputType}][${outputIndex}][${connIndex}].node: "${connection.node}" → "${newTargetName}"`);
         }
       }
     }
@@ -1269,21 +1184,15 @@ export class WorkflowDiffEngine {
   private setNestedProperty(obj: any, path: string, value: any): void {
     const keys = path.split('.');
     let current = obj;
     for (let i = 0; i < keys.length - 1; i++) {
       const key = keys[i];
-      if (!(key in current) || typeof current[key] !== 'object' || current[key] === null) {
-        if (value === null) return; // parent path doesn't exist, nothing to delete
+      if (!(key in current) || typeof current[key] !== 'object') {
         current[key] = {};
       }
       current = current[key];
     }
-    const finalKey = keys[keys.length - 1];
-    if (value === null) {
-      delete current[finalKey];
-    } else {
-      current[finalKey] = value;
-    }
+    current[keys[keys.length - 1]] = value;
   }
 }

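The base-side `setNestedProperty` above treats a `null` value as a delete marker and bails out when an intermediate path segment is missing or `null`. A standalone sketch of that behavior (a hypothetical re-implementation for illustration, mirroring the base version in the diff above):

```typescript
// Minimal sketch of dot-path assignment where `null` deletes the final key.
// Hypothetical helper for illustration; not the project's exact code.
function setNestedProperty(obj: Record<string, any>, path: string, value: any): void {
  const keys = path.split('.');
  let current = obj;
  for (let i = 0; i < keys.length - 1; i++) {
    const key = keys[i];
    if (!(key in current) || typeof current[key] !== 'object' || current[key] === null) {
      if (value === null) return; // parent path missing: nothing to delete
      current[key] = {};
    }
    current = current[key];
  }
  const finalKey = keys[keys.length - 1];
  if (value === null) {
    delete current[finalKey]; // null acts as an explicit "remove this property"
  } else {
    current[finalKey] = value;
  }
}

const node: Record<string, any> = { parameters: { url: 'https://old.example' } };
setNestedProperty(node, 'parameters.url', 'https://new.example'); // overwrite
setNestedProperty(node, 'parameters.timeout', 30);                // create
setNestedProperty(node, 'parameters.url', null);                  // delete
```

The head-side version drops the `null` handling, so `null` would simply be assigned as a value.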
View File

@@ -311,7 +311,6 @@ export interface WebhookRequest {
 // MCP Tool Response Type
 export interface McpToolResponse {
   success: boolean;
-  saved?: boolean;
   data?: unknown;
   error?: string;
   message?: string;
@@ -319,7 +318,6 @@ export interface McpToolResponse {
   details?: Record<string, unknown>;
   executionId?: string;
   workflowId?: string;
-  operationsApplied?: number;
 }
 // Execution Filtering Types
@@ -454,82 +452,4 @@ export interface ErrorSuggestion {
   title: string;
   description: string;
   confidence: 'high' | 'medium' | 'low';
-}
-// Data Table types
-export interface DataTableColumn {
-  name: string;
-  type?: 'string' | 'number' | 'boolean' | 'date';
-}
-export interface DataTableColumnResponse {
-  id: string;
-  name: string;
-  type: 'string' | 'number' | 'boolean' | 'date';
-  index: number;
-}
-export interface DataTable {
-  id: string;
-  name: string;
-  columns?: DataTableColumnResponse[];
-  projectId?: string;
-  createdAt?: string;
-  updatedAt?: string;
-}
-export interface DataTableRow {
-  id?: number;
-  createdAt?: string;
-  updatedAt?: string;
-  [columnName: string]: unknown;
-}
-export interface DataTableFilterCondition {
-  columnName: string;
-  condition: 'eq' | 'neq' | 'like' | 'ilike' | 'gt' | 'gte' | 'lt' | 'lte';
-  value?: any;
-}
-export interface DataTableFilter {
-  type?: 'and' | 'or';
-  filters: DataTableFilterCondition[];
-}
-export interface DataTableListParams {
-  limit?: number;
-  cursor?: string;
-}
-export interface DataTableRowListParams {
-  limit?: number;
-  cursor?: string;
-  filter?: string;
-  sortBy?: string;
-  search?: string;
-}
-export interface DataTableInsertRowsParams {
-  data: Record<string, unknown>[];
-  returnType?: 'count' | 'id' | 'all';
-}
-export interface DataTableUpdateRowsParams {
-  filter: DataTableFilter;
-  data: Record<string, unknown>;
-  returnData?: boolean;
-  dryRun?: boolean;
-}
-export interface DataTableUpsertRowParams {
-  filter: DataTableFilter;
-  data: Record<string, unknown>;
-  returnData?: boolean;
-  dryRun?: boolean;
-}
-export interface DataTableDeleteRowsParams {
-  filter: string;
-  returnData?: boolean;
-  dryRun?: boolean;
 }

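The Data Table interfaces in the base side above describe a small filter DSL (`and`/`or` over typed conditions). As an illustration of how those shapes compose, here is a hypothetical update-rows payload (the column names and values are made up, only the interface shapes come from the diff):

```typescript
// Shapes copied from the base-side interfaces in the diff above.
interface DataTableFilterCondition {
  columnName: string;
  condition: 'eq' | 'neq' | 'like' | 'ilike' | 'gt' | 'gte' | 'lt' | 'lte';
  value?: any;
}
interface DataTableFilter {
  type?: 'and' | 'or';
  filters: DataTableFilterCondition[];
}
interface DataTableUpdateRowsParams {
  filter: DataTableFilter;
  data: Record<string, unknown>;
  returnData?: boolean;
  dryRun?: boolean;
}

// Hypothetical payload: mark rows where status = 'pending' AND retries >= 3
// as failed, previewing the change first via dryRun.
const params: DataTableUpdateRowsParams = {
  filter: {
    type: 'and',
    filters: [
      { columnName: 'status', condition: 'eq', value: 'pending' },
      { columnName: 'retries', condition: 'gte', value: 3 },
    ],
  },
  data: { status: 'failed' },
  dryRun: true,
};
```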
View File

@@ -124,11 +124,6 @@ export interface DeactivateWorkflowOperation extends DiffOperation {
   // No additional properties needed - just deactivates the workflow
 }
-export interface TransferWorkflowOperation extends DiffOperation {
-  type: 'transferWorkflow';
-  destinationProjectId: string;
-}
 // Connection Cleanup Operations
 export interface CleanStaleConnectionsOperation extends DiffOperation {
   type: 'cleanStaleConnections';
@@ -166,8 +161,7 @@ export type WorkflowDiffOperation =
   | ActivateWorkflowOperation
   | DeactivateWorkflowOperation
   | CleanStaleConnectionsOperation
-  | ReplaceConnectionsOperation
-  | TransferWorkflowOperation;
+  | ReplaceConnectionsOperation;
 // Main diff request structure
 export interface WorkflowDiffRequest {
@@ -196,9 +190,6 @@ export interface WorkflowDiffResult {
   staleConnectionsRemoved?: Array<{ from: string; to: string }>; // For cleanStaleConnections operation
   shouldActivate?: boolean; // Flag to activate workflow after update (for activateWorkflow operation)
   shouldDeactivate?: boolean; // Flag to deactivate workflow after update (for deactivateWorkflow operation)
-  tagsToAdd?: string[];
-  tagsToRemove?: string[];
-  transferToProjectId?: string; // For transferWorkflow operation - uses dedicated API call
 }
 // Helper type for node reference (supports both ID and name)

View File

@@ -22,10 +22,8 @@ export class N8nAuthenticationError extends N8nApiError {
 }
 export class N8nNotFoundError extends N8nApiError {
-  constructor(messageOrResource: string, id?: string) {
-    // If id is provided, format as "resource with ID id not found"
-    // Otherwise, use messageOrResource as-is (it's already a complete message from the API)
-    const message = id ? `${messageOrResource} with ID ${id} not found` : messageOrResource;
+  constructor(resource: string, id?: string) {
+    const message = id ? `${resource} with ID ${id} not found` : `${resource} not found`;
     super(message, 404, 'NOT_FOUND');
     this.name = 'N8nNotFoundError';
   }
@@ -72,7 +70,7 @@ export function handleN8nApiError(error: unknown): N8nApiError {
     case 401:
       return new N8nAuthenticationError(message);
     case 404:
-      return new N8nNotFoundError(message || 'Resource');
+      return new N8nNotFoundError('Resource', message);
     case 400:
       return new N8nValidationError(message, data);
     case 429:

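The constructor change above switches `N8nNotFoundError` between two call patterns: a fixed `resource`/`id` template versus accepting a pre-formatted message when no `id` is given. A minimal sketch of the base-side behavior (hypothetical class body extending plain `Error`, not the project's `N8nApiError` hierarchy):

```typescript
// Sketch of the base-side constructor: the first argument is either a
// resource name (when an id is given) or an already-complete API message.
class N8nNotFoundError extends Error {
  constructor(messageOrResource: string, id?: string) {
    super(id ? `${messageOrResource} with ID ${id} not found` : messageOrResource);
    this.name = 'N8nNotFoundError';
  }
}

const byId = new N8nNotFoundError('Workflow', 'abc123');
const fromApi = new N8nNotFoundError('Webhook "test" is not registered');
```

This is why the base-side `handleN8nApiError` passes `message || 'Resource'` as the first argument, while the head side passes `('Resource', message)` and lets the template reformat it.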
View File

@@ -172,14 +172,14 @@ export function isTriggerNode(nodeType: string): boolean {
     return true;
   }
-  // Check for polling-based triggers that don't have 'trigger' in their name
-  if (lowerType.includes('emailread') || lowerType.includes('emailreadimap')) {
-    return true;
-  }
   // Check for specific trigger types that don't have 'trigger' in their name
-  // (manualTrigger and formTrigger are already caught by the 'trigger' check above)
-  return normalized === 'nodes-base.start';
+  const specificTriggers = [
+    'nodes-base.start',
+    'nodes-base.manualTrigger',
+    'nodes-base.formTrigger'
+  ];
+  return specificTriggers.includes(normalized);
 }
 /**

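The base-side version above relies on the observation that `manualTrigger` and `formTrigger` already contain the substring 'trigger', so the only special cases are the legacy Start node and the IMAP polling node. A simplified sketch of that check (hypothetical, omitting the project's normalization helpers and assuming `nodeType` is already in `nodes-base.x` form):

```typescript
// Simplified sketch of trigger detection; the real code normalizes the
// type string first and derives `lowerType`/`normalized` from it.
function isTriggerNode(nodeType: string): boolean {
  const lowerType = nodeType.toLowerCase();
  if (lowerType.includes('trigger')) return true;        // webhookTrigger, manualTrigger, formTrigger, ...
  if (lowerType.includes('emailreadimap')) return true;  // polling trigger without 'trigger' in the name
  return nodeType === 'nodes-base.start';                // legacy Start node
}
```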
View File

@@ -65,7 +65,7 @@ describe('Database Performance Tests', () => {
   // Adjusted based on actual CI performance measurements + type safety overhead
   // CI environments show ratios of ~7-10 for 1000:100 and ~6-7 for 5000:1000
   // Increased thresholds to account for community node columns (8 additional fields)
-  expect(ratio1000to100).toBeLessThan(20); // Allow for CI variability + community columns (was 15)
+  expect(ratio1000to100).toBeLessThan(15); // Allow for CI variability + community columns (was 12)
   expect(ratio5000to1000).toBeLessThan(12); // Allow for type safety overhead + community columns (was 11)
 });

View File

@@ -105,14 +105,21 @@ describe('MCP Protocol Compliance', () => {
 describe('Message Format Validation', () => {
   it('should reject messages without method', async () => {
-    // MCP SDK 1.27+ enforces single-connection per Server instance,
-    // so use the existing client from beforeEach instead of a new one.
+    // Test by sending raw message through transport
+    const [serverTransport, clientTransport] = InMemoryTransport.createLinkedPair();
+    const testClient = new Client({ name: 'test', version: '1.0.0' }, {});
+    await mcpServer.connectToTransport(serverTransport);
+    await testClient.connect(clientTransport);
     try {
       // This should fail as MCP SDK validates method
-      await (client as any).request({ method: '', params: {} });
+      await (testClient as any).request({ method: '', params: {} });
       expect.fail('Should have thrown an error');
     } catch (error) {
       expect(error).toBeDefined();
+    } finally {
+      await testClient.close();
     }
   });
@@ -243,15 +250,10 @@ describe('MCP Protocol Compliance', () => {
 describe('Transport Layer', () => {
   it('should handle transport disconnection gracefully', async () => {
-    // Use a dedicated server instance so we don't conflict with the
-    // shared mcpServer that beforeEach already connected a transport to.
-    const dedicatedServer = new TestableN8NMCPServer();
-    await dedicatedServer.initialize();
     const [serverTransport, clientTransport] = InMemoryTransport.createLinkedPair();
-    await dedicatedServer.connectToTransport(serverTransport);
     const testClient = new Client({ name: 'test', version: '1.0.0' }, {});
+    await mcpServer.connectToTransport(serverTransport);
     await testClient.connect(clientTransport);
     // Make a request
@@ -268,8 +270,6 @@ describe('MCP Protocol Compliance', () => {
     } catch (error) {
       expect(error).toBeDefined();
     }
-    await dedicatedServer.close();
   });
   it('should handle multiple sequential connections', async () => {

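The test changes above all work around the same constraint: MCP SDK 1.27+ allows one transport per `Server` instance and throws "Already connected to a transport" on a second `connect()`. The pattern, sketched independently of the SDK (the `Connectable` shape below is a hypothetical stand-in, not the SDK's API), is close-before-reconnect:

```typescript
// Generic close-before-reconnect sketch. SingleConnectionServer mimics the
// single-connection rule; the real MCP SDK Server behaves similarly.
interface Connectable {
  connect(transport: string): Promise<void>;
  close(): Promise<void>;
}

class SingleConnectionServer implements Connectable {
  private transport?: string;
  async connect(transport: string): Promise<void> {
    if (this.transport !== undefined) {
      throw new Error('Already connected to a transport');
    }
    this.transport = transport;
  }
  async close(): Promise<void> {
    this.transport = undefined;
  }
}

// Helper in the spirit of the test fixes: always close the previous
// connection before connecting a new transport.
async function reconnect(server: SingleConnectionServer, transport: string): Promise<void> {
  await server.close(); // safe even when not connected
  await server.connect(transport);
}
```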
View File

@@ -73,11 +73,10 @@ describe('MCP Session Management', { timeout: 15000 }, () => {
     const serverInfo = await client.getServerVersion();
     expect(serverInfo).toBeDefined();
     expect(serverInfo?.name).toBe('n8n-documentation-mcp');
-    // Check capabilities via the dedicated method
-    const capabilities = client.getServerCapabilities();
-    if (capabilities) {
-      expect(capabilities).toHaveProperty('tools');
+    // Check capabilities if they exist
+    if (serverInfo?.capabilities) {
+      expect(serverInfo.capabilities).toHaveProperty('tools');
     }
     // Clean up - ensure proper order
@@ -341,9 +340,9 @@ describe('MCP Session Management', { timeout: 15000 }, () => {
   it('should handle different client versions', async () => {
     const mcpServer = new TestableN8NMCPServer();
     await mcpServer.initialize();
-    // MCP SDK 1.27+ enforces single-connection per Server instance,
-    // so we test each version sequentially rather than concurrently.
+    const clients = [];
     for (const version of ['1.0.0', '1.1.0', '2.0.0']) {
       const [serverTransport, clientTransport] = InMemoryTransport.createLinkedPair();
       await mcpServer.connectToTransport(serverTransport);
@@ -354,14 +353,21 @@ describe('MCP Session Management', { timeout: 15000 }, () => {
       }, {});
       await client.connect(clientTransport);
-      const info = await client.getServerVersion();
-      expect(info!.name).toBe('n8n-documentation-mcp');
-      await client.close();
-      await new Promise(resolve => setTimeout(resolve, 50));
+      clients.push(client);
     }
+    // All versions should work
+    const responses = await Promise.all(
+      clients.map(client => client.getServerVersion())
+    );
+    responses.forEach(info => {
+      expect(info!.name).toBe('n8n-documentation-mcp');
+    });
+    // Clean up
+    await Promise.all(clients.map(client => client.close()));
+    await new Promise(resolve => setTimeout(resolve, 100)); // Give time for all clients to fully close
     await mcpServer.close();
   });
 });

View File

@@ -1,7 +1,7 @@
 import { Server } from '@modelcontextprotocol/sdk/server/index.js';
 import { Transport } from '@modelcontextprotocol/sdk/shared/transport.js';
 import {
   CallToolRequestSchema,
   ListToolsRequestSchema,
   InitializeRequestSchema,
 } from '@modelcontextprotocol/sdk/types.js';
@@ -14,30 +14,18 @@ export class TestableN8NMCPServer {
   private mcpServer: N8NDocumentationMCPServer;
   private server: Server;
   private transports = new Set<Transport>();
+  private connections = new Set<any>();
   private static instanceCount = 0;
   private testDbPath: string;
   constructor() {
-    // Use path.resolve to produce a canonical absolute path so the shared
-    // database singleton always sees the exact same string, preventing
-    // "Shared database already initialized with different path" errors.
-    const path = require('path');
-    this.testDbPath = path.resolve(process.cwd(), 'data', 'nodes.db');
+    // Use a unique test database for each instance to avoid conflicts
+    // This prevents concurrent test issues with database locking
+    const instanceId = TestableN8NMCPServer.instanceCount++;
+    this.testDbPath = `/tmp/n8n-mcp-test-${process.pid}-${instanceId}.db`;
     process.env.NODE_DB_PATH = this.testDbPath;
-    this.server = this.createServer();
-    this.mcpServer = new N8NDocumentationMCPServer();
-    this.setupHandlers(this.server);
-  }
-  /**
-   * Create a fresh MCP SDK Server instance.
-   * MCP SDK 1.27+ enforces single-connection per Protocol instance,
-   * so we create a new one each time we need to connect to a transport.
-   */
-  private createServer(): Server {
-    return new Server({
+    this.server = new Server({
       name: 'n8n-documentation-mcp',
       version: '1.0.0'
     }, {
@@ -45,11 +33,14 @@ export class TestableN8NMCPServer {
       tools: {}
     }
   });
+    this.mcpServer = new N8NDocumentationMCPServer();
+    this.setupHandlers();
   }
-  private setupHandlers(server: Server) {
+  private setupHandlers() {
     // Initialize handler
-    server.setRequestHandler(InitializeRequestSchema, async () => {
+    this.server.setRequestHandler(InitializeRequestSchema, async () => {
       return {
         protocolVersion: '2024-11-05',
         capabilities: {
@@ -63,27 +54,27 @@ export class TestableN8NMCPServer {
     });
     // List tools handler
-    server.setRequestHandler(ListToolsRequestSchema, async () => {
+    this.server.setRequestHandler(ListToolsRequestSchema, async () => {
       // Import the tools directly from the tools module
       const { n8nDocumentationToolsFinal } = await import('../../../src/mcp/tools');
       const { n8nManagementTools } = await import('../../../src/mcp/tools-n8n-manager');
       const { isN8nApiConfigured } = await import('../../../src/config/n8n-api');
       // Combine documentation tools with management tools if API is configured
       const tools = [...n8nDocumentationToolsFinal];
       if (isN8nApiConfigured()) {
         tools.push(...n8nManagementTools);
       }
       return { tools };
     });
     // Call tool handler
-    server.setRequestHandler(CallToolRequestSchema, async (request) => {
+    this.server.setRequestHandler(CallToolRequestSchema, async (request) => {
       try {
         // The mcpServer.executeTool returns raw data, we need to wrap it in the MCP response format
         const result = await this.mcpServer.executeTool(request.params.name, request.params.arguments || {});
         return {
           content: [
             {
@@ -107,8 +98,21 @@ export class TestableN8NMCPServer {
   }
   async initialize(): Promise<void> {
-    // The MCP server initializes its database lazily via the shared
-    // database singleton. Trigger initialization by calling executeTool.
+    // Copy production database to test location for realistic testing
+    try {
+      const fs = await import('fs');
+      const path = await import('path');
+      const prodDbPath = path.join(process.cwd(), 'data', 'nodes.db');
+      if (await fs.promises.access(prodDbPath).then(() => true).catch(() => false)) {
+        await fs.promises.copyFile(prodDbPath, this.testDbPath);
+      }
+    } catch (error) {
+      // Ignore copy errors, database will be created fresh
+    }
+    // The MCP server initializes its database lazily
+    // We can trigger initialization by calling executeTool
     try {
       await this.mcpServer.executeTool('tools_documentation', {});
     } catch (error) {
@@ -121,26 +125,33 @@ export class TestableN8NMCPServer {
     if (!transport || typeof transport !== 'object') {
       throw new Error('Invalid transport provided');
     }
-    // MCP SDK 1.27+ enforces single-connection per Protocol instance.
-    // Close the current server and create a fresh one so that _transport
-    // is guaranteed to be undefined. Reusing the same Server after close()
-    // is unreliable because _transport is cleared asynchronously via the
-    // transport onclose callback chain, which can fail in CI.
-    try {
-      await this.server.close();
-    } catch {
-      // Ignore errors during cleanup of previous transport
-    }
-    // Create a brand-new Server instance for this connection
-    this.server = this.createServer();
+    // Set up any missing transport handlers to prevent "Cannot set properties of undefined" errors
+    if (transport && typeof transport === 'object') {
+      const transportAny = transport as any;
+      if (transportAny.serverTransport && !transportAny.serverTransport.onclose) {
+        transportAny.serverTransport.onclose = () => {};
+      }
+    }
+    // MCP SDK 1.27+ enforces single-connection per Server instance.
+    // Close existing connections before connecting a new transport.
this.setupHandlers(this.server); for (const conn of this.connections) {
try {
if (conn && typeof conn.close === 'function') {
await conn.close();
}
} catch {
// Ignore errors during cleanup
}
}
this.connections.clear();
// Track this transport for cleanup // Track this transport for cleanup
this.transports.add(transport); this.transports.add(transport);
await this.server.connect(transport); const connection = await this.server.connect(transport);
this.connections.add(connection);
} }
async close(): Promise<void> { async close(): Promise<void> {
@@ -153,47 +164,78 @@ export class TestableN8NMCPServer {
}); });
const performClose = async () => { const performClose = async () => {
// Close the MCP SDK Server (resets _transport via _onclose) // Close all connections first with timeout protection
try { const connectionPromises = Array.from(this.connections).map(async (connection) => {
await this.server.close(); const connTimeout = new Promise<void>((resolve) => setTimeout(resolve, 500));
} catch {
// Ignore errors during server close try {
} if (connection && typeof connection.close === 'function') {
await Promise.race([connection.close(), connTimeout]);
// Shut down the inner N8NDocumentationMCPServer to release the }
// shared database reference and prevent resource leaks. } catch (error) {
try { // Ignore errors during connection cleanup
await this.mcpServer.shutdown(); }
} catch { });
// Ignore errors during inner server shutdown
} await Promise.allSettled(connectionPromises);
this.connections.clear();
// Close all tracked transports with timeout protection // Close all tracked transports with timeout protection
const transportPromises: Promise<void>[] = []; const transportPromises: Promise<void>[] = [];
for (const transport of this.transports) { for (const transport of this.transports) {
const transportTimeout = new Promise<void>((resolve) => setTimeout(resolve, 500)); const transportTimeout = new Promise<void>((resolve) => setTimeout(resolve, 500));
try { try {
// Force close all transports
const transportAny = transport as any; const transportAny = transport as any;
// Try different close methods
if (transportAny.close && typeof transportAny.close === 'function') { if (transportAny.close && typeof transportAny.close === 'function') {
transportPromises.push( transportPromises.push(
Promise.race([transportAny.close(), transportTimeout]) Promise.race([transportAny.close(), transportTimeout])
); );
} }
} catch { if (transportAny.serverTransport?.close) {
transportPromises.push(
Promise.race([transportAny.serverTransport.close(), transportTimeout])
);
}
if (transportAny.clientTransport?.close) {
transportPromises.push(
Promise.race([transportAny.clientTransport.close(), transportTimeout])
);
}
} catch (error) {
// Ignore errors during transport cleanup // Ignore errors during transport cleanup
} }
} }
// Wait for all transports to close with timeout
await Promise.allSettled(transportPromises); await Promise.allSettled(transportPromises);
// Clear the transports set
this.transports.clear(); this.transports.clear();
// Don't shut down the shared MCP server instance
}; };
// Race between actual close and timeout // Race between actual close and timeout
await Promise.race([performClose(), closeTimeout]); await Promise.race([performClose(), closeTimeout]);
// Clean up test database
if (this.testDbPath) {
try {
const fs = await import('fs');
await fs.promises.unlink(this.testDbPath).catch(() => {});
await fs.promises.unlink(`${this.testDbPath}-shm`).catch(() => {});
await fs.promises.unlink(`${this.testDbPath}-wal`).catch(() => {});
} catch (error) {
// Ignore cleanup errors
}
}
} }
static async shutdownShared(): Promise<void> { static async shutdownShared(): Promise<void> {
if (sharedMcpServer) { if (sharedMcpServer) {
await sharedMcpServer.shutdown(); await sharedMcpServer.shutdown();
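The reconnect logic in the diff above can be illustrated in isolation. The following is a minimal, self-contained sketch of the close-before-reconnect pattern; `SingleConnectionServer` and `Transport` are hypothetical stand-ins modeling the MCP SDK 1.27+ rule that a second `connect()` without an intervening `close()` throws "Already connected to a transport" — they are not the SDK's actual classes.

```typescript
// Hypothetical stand-ins (assumption: not the real MCP SDK types).
interface Transport {
  close(): Promise<void>;
}

class SingleConnectionServer {
  private transport?: Transport;

  async connect(transport: Transport): Promise<void> {
    // Models MCP SDK 1.27+ behavior: one connection per Server instance.
    if (this.transport) {
      throw new Error('Already connected to a transport');
    }
    this.transport = transport;
  }

  async close(): Promise<void> {
    await this.transport?.close();
    this.transport = undefined;
  }
}

// The test-helper fix in miniature: close any existing connection first,
// ignoring cleanup errors, then connect the new transport.
async function reconnect(server: SingleConnectionServer, next: Transport): Promise<void> {
  try {
    await server.close();
  } catch {
    // Ignore errors during cleanup of the previous transport
  }
  await server.connect(next);
}
```

Calling `connect()` twice without the intervening `close()` reproduces the failure the commit fixes.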


@@ -1,728 +0,0 @@
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
import { N8nApiClient } from '@/services/n8n-api-client';
import { N8nApiError } from '@/utils/n8n-errors';
// Mock dependencies
vi.mock('@/services/n8n-api-client');
vi.mock('@/config/n8n-api', () => ({
getN8nApiConfig: vi.fn(),
}));
vi.mock('@/services/n8n-validation', () => ({
validateWorkflowStructure: vi.fn(),
hasWebhookTrigger: vi.fn(),
getWebhookUrl: vi.fn(),
}));
vi.mock('@/utils/logger', () => ({
logger: {
info: vi.fn(),
error: vi.fn(),
debug: vi.fn(),
warn: vi.fn(),
},
Logger: vi.fn().mockImplementation(() => ({
info: vi.fn(),
error: vi.fn(),
debug: vi.fn(),
warn: vi.fn(),
})),
LogLevel: {
ERROR: 0,
WARN: 1,
INFO: 2,
DEBUG: 3,
},
}));
describe('Data Table Handlers (n8n_manage_datatable)', () => {
let mockApiClient: any;
let handlers: any;
let getN8nApiConfig: any;
beforeEach(async () => {
vi.clearAllMocks();
// Setup mock API client with all data table methods
mockApiClient = {
createWorkflow: vi.fn(),
getWorkflow: vi.fn(),
updateWorkflow: vi.fn(),
deleteWorkflow: vi.fn(),
listWorkflows: vi.fn(),
triggerWebhook: vi.fn(),
getExecution: vi.fn(),
listExecutions: vi.fn(),
deleteExecution: vi.fn(),
healthCheck: vi.fn(),
createDataTable: vi.fn(),
listDataTables: vi.fn(),
getDataTable: vi.fn(),
updateDataTable: vi.fn(),
deleteDataTable: vi.fn(),
getDataTableRows: vi.fn(),
insertDataTableRows: vi.fn(),
updateDataTableRows: vi.fn(),
upsertDataTableRow: vi.fn(),
deleteDataTableRows: vi.fn(),
};
// Import mocked modules
getN8nApiConfig = (await import('@/config/n8n-api')).getN8nApiConfig;
// Mock the API config
vi.mocked(getN8nApiConfig).mockReturnValue({
baseUrl: 'https://n8n.test.com',
apiKey: 'test-key',
timeout: 30000,
maxRetries: 3,
});
// Mock the N8nApiClient constructor
vi.mocked(N8nApiClient).mockImplementation(() => mockApiClient);
// Import handlers module after setting up mocks
handlers = await import('@/mcp/handlers-n8n-manager');
});
afterEach(() => {
if (handlers) {
const clientGetter = handlers.getN8nApiClient;
if (clientGetter) {
vi.mocked(getN8nApiConfig).mockReturnValue(null);
clientGetter();
}
}
});
// ========================================================================
// handleCreateTable
// ========================================================================
describe('handleCreateTable', () => {
it('should create data table with name and columns successfully', async () => {
const createdTable = {
id: 'dt-123',
name: 'My Data Table',
columns: [
{ id: 'col-1', name: 'email', type: 'string', index: 0 },
{ id: 'col-2', name: 'age', type: 'number', index: 1 },
],
};
mockApiClient.createDataTable.mockResolvedValue(createdTable);
const result = await handlers.handleCreateTable({
name: 'My Data Table',
columns: [
{ name: 'email', type: 'string' },
{ name: 'age', type: 'number' },
],
});
expect(result).toEqual({
success: true,
data: { id: 'dt-123', name: 'My Data Table' },
message: 'Data table "My Data Table" created with ID: dt-123',
});
expect(mockApiClient.createDataTable).toHaveBeenCalledWith({
name: 'My Data Table',
columns: [
{ name: 'email', type: 'string' },
{ name: 'age', type: 'number' },
],
});
});
it('should create data table with name only (no columns)', async () => {
const createdTable = {
id: 'dt-456',
name: 'Empty Table',
};
mockApiClient.createDataTable.mockResolvedValue(createdTable);
const result = await handlers.handleCreateTable({
name: 'Empty Table',
});
expect(result).toEqual({
success: true,
data: { id: 'dt-456', name: 'Empty Table' },
message: 'Data table "Empty Table" created with ID: dt-456',
});
expect(mockApiClient.createDataTable).toHaveBeenCalledWith({
name: 'Empty Table',
});
});
it('should return error when API returns empty response (null)', async () => {
mockApiClient.createDataTable.mockResolvedValue(null);
const result = await handlers.handleCreateTable({
name: 'Ghost Table',
});
expect(result).toEqual({
success: false,
error: 'Data table creation failed: n8n API returned an empty or invalid response',
});
});
it('should return error when API call fails', async () => {
const apiError = new Error('Data table creation failed on the server');
mockApiClient.createDataTable.mockRejectedValue(apiError);
const result = await handlers.handleCreateTable({
name: 'Broken Table',
});
expect(result).toEqual({
success: false,
error: 'Data table creation failed on the server',
});
});
it('should return Zod validation error when name is missing', async () => {
const result = await handlers.handleCreateTable({});
expect(result.success).toBe(false);
expect(result.error).toBe('Invalid input');
expect(result.details).toHaveProperty('errors');
});
it('should return error when n8n API is not configured', async () => {
vi.mocked(getN8nApiConfig).mockReturnValue(null);
const result = await handlers.handleCreateTable({
name: 'Test Table',
});
expect(result).toEqual({
success: false,
error: 'n8n API not configured. Please set N8N_API_URL and N8N_API_KEY environment variables.',
});
});
it('should return structured error for N8nApiError', async () => {
const apiError = new N8nApiError('Feature not available', 402, 'PAYMENT_REQUIRED', { plan: 'enterprise' });
mockApiClient.createDataTable.mockRejectedValue(apiError);
const result = await handlers.handleCreateTable({
name: 'Enterprise Table',
});
expect(result.success).toBe(false);
expect(result.error).toBeDefined();
expect(result.code).toBe('PAYMENT_REQUIRED');
expect(result.details).toEqual({ plan: 'enterprise' });
});
it('should return Unknown error when a non-Error value is thrown', async () => {
mockApiClient.createDataTable.mockRejectedValue('string-error');
const result = await handlers.handleCreateTable({
name: 'Error Table',
});
expect(result).toEqual({
success: false,
error: 'Unknown error occurred',
});
});
});
// ========================================================================
// handleListTables
// ========================================================================
describe('handleListTables', () => {
it('should list tables successfully', async () => {
const tables = [
{ id: 'dt-1', name: 'Table One' },
{ id: 'dt-2', name: 'Table Two' },
];
mockApiClient.listDataTables.mockResolvedValue({ data: tables, nextCursor: null });
const result = await handlers.handleListTables({});
expect(result).toEqual({
success: true,
data: {
tables,
count: 2,
nextCursor: undefined,
},
});
});
it('should return empty list when no tables exist', async () => {
mockApiClient.listDataTables.mockResolvedValue({ data: [], nextCursor: null });
const result = await handlers.handleListTables({});
expect(result).toEqual({
success: true,
data: {
tables: [],
count: 0,
nextCursor: undefined,
},
});
});
it('should pass pagination params (limit, cursor)', async () => {
mockApiClient.listDataTables.mockResolvedValue({
data: [{ id: 'dt-3', name: 'Page Two' }],
nextCursor: 'cursor-next',
});
const result = await handlers.handleListTables({ limit: 10, cursor: 'cursor-abc' });
expect(mockApiClient.listDataTables).toHaveBeenCalledWith({ limit: 10, cursor: 'cursor-abc' });
expect(result.success).toBe(true);
expect(result.data.nextCursor).toBe('cursor-next');
});
it('should handle API error', async () => {
mockApiClient.listDataTables.mockRejectedValue(new Error('Server down'));
const result = await handlers.handleListTables({});
expect(result.success).toBe(false);
expect(result.error).toBe('Server down');
});
});
// ========================================================================
// handleGetTable
// ========================================================================
describe('handleGetTable', () => {
it('should get table successfully', async () => {
const table = { id: 'dt-1', name: 'My Table', columns: [] };
mockApiClient.getDataTable.mockResolvedValue(table);
const result = await handlers.handleGetTable({ tableId: 'dt-1' });
expect(result).toEqual({
success: true,
data: table,
});
expect(mockApiClient.getDataTable).toHaveBeenCalledWith('dt-1');
});
it('should return error on 404', async () => {
const notFoundError = new N8nApiError('Data table not found', 404, 'NOT_FOUND');
mockApiClient.getDataTable.mockRejectedValue(notFoundError);
const result = await handlers.handleGetTable({ tableId: 'dt-nonexistent' });
expect(result.success).toBe(false);
expect(result.error).toBeDefined();
expect(result.code).toBe('NOT_FOUND');
});
it('should return Zod validation error when tableId is missing', async () => {
const result = await handlers.handleGetTable({});
expect(result.success).toBe(false);
expect(result.error).toBe('Invalid input');
expect(result.details).toHaveProperty('errors');
});
});
// ========================================================================
// handleUpdateTable
// ========================================================================
describe('handleUpdateTable', () => {
it('should rename table successfully', async () => {
const updatedTable = { id: 'dt-1', name: 'Renamed Table' };
mockApiClient.updateDataTable.mockResolvedValue(updatedTable);
const result = await handlers.handleUpdateTable({ tableId: 'dt-1', name: 'Renamed Table' });
expect(result).toEqual({
success: true,
data: updatedTable,
message: 'Data table renamed to "Renamed Table"',
});
expect(mockApiClient.updateDataTable).toHaveBeenCalledWith('dt-1', { name: 'Renamed Table' });
});
it('should return Zod validation error when tableId is missing', async () => {
const result = await handlers.handleUpdateTable({ name: 'New Name' });
expect(result.success).toBe(false);
expect(result.error).toBe('Invalid input');
expect(result.details).toHaveProperty('errors');
});
it('should return error when API call fails', async () => {
mockApiClient.updateDataTable.mockRejectedValue(new Error('Update failed'));
const result = await handlers.handleUpdateTable({ tableId: 'dt-1', name: 'New Name' });
expect(result.success).toBe(false);
expect(result.error).toBe('Update failed');
});
});
// ========================================================================
// handleDeleteTable
// ========================================================================
describe('handleDeleteTable', () => {
it('should delete table successfully', async () => {
mockApiClient.deleteDataTable.mockResolvedValue(undefined);
const result = await handlers.handleDeleteTable({ tableId: 'dt-1' });
expect(result).toEqual({
success: true,
message: 'Data table dt-1 deleted successfully',
});
expect(mockApiClient.deleteDataTable).toHaveBeenCalledWith('dt-1');
});
it('should return error on 404', async () => {
const notFoundError = new N8nApiError('Data table not found', 404, 'NOT_FOUND');
mockApiClient.deleteDataTable.mockRejectedValue(notFoundError);
const result = await handlers.handleDeleteTable({ tableId: 'dt-nonexistent' });
expect(result.success).toBe(false);
expect(result.error).toBeDefined();
expect(result.code).toBe('NOT_FOUND');
});
});
// ========================================================================
// handleGetRows
// ========================================================================
describe('handleGetRows', () => {
it('should get rows with default params', async () => {
const rows = [
{ id: 1, email: 'a@b.com', score: 10 },
{ id: 2, email: 'c@d.com', score: 20 },
];
mockApiClient.getDataTableRows.mockResolvedValue({ data: rows, nextCursor: null });
const result = await handlers.handleGetRows({ tableId: 'dt-1' });
expect(result).toEqual({
success: true,
data: {
rows,
count: 2,
nextCursor: undefined,
},
});
expect(mockApiClient.getDataTableRows).toHaveBeenCalledWith('dt-1', {});
});
it('should pass filter, sort, and search params', async () => {
mockApiClient.getDataTableRows.mockResolvedValue({ data: [], nextCursor: null });
await handlers.handleGetRows({
tableId: 'dt-1',
limit: 50,
sortBy: 'name:asc',
search: 'john',
});
expect(mockApiClient.getDataTableRows).toHaveBeenCalledWith('dt-1', {
limit: 50,
sortBy: encodeURIComponent('name:asc'),
search: 'john',
});
});
it('should serialize object filter to URL-encoded JSON string', async () => {
mockApiClient.getDataTableRows.mockResolvedValue({ data: [], nextCursor: null });
const objectFilter = {
type: 'and' as const,
filters: [{ columnName: 'status', condition: 'eq' as const, value: 'active' }],
};
await handlers.handleGetRows({
tableId: 'dt-1',
filter: objectFilter,
});
expect(mockApiClient.getDataTableRows).toHaveBeenCalledWith('dt-1', {
filter: encodeURIComponent(JSON.stringify(objectFilter)),
});
});
it('should URL-encode string filter', async () => {
mockApiClient.getDataTableRows.mockResolvedValue({ data: [], nextCursor: null });
const filterStr = '{"type":"and","filters":[]}';
await handlers.handleGetRows({
tableId: 'dt-1',
filter: filterStr,
});
expect(mockApiClient.getDataTableRows).toHaveBeenCalledWith('dt-1', {
filter: encodeURIComponent(filterStr),
});
});
});
// ========================================================================
// handleInsertRows
// ========================================================================
describe('handleInsertRows', () => {
it('should insert rows successfully', async () => {
const insertResult = { insertedCount: 2, ids: [1, 2] };
mockApiClient.insertDataTableRows.mockResolvedValue(insertResult);
const result = await handlers.handleInsertRows({
tableId: 'dt-1',
data: [
{ email: 'a@b.com', score: 10 },
{ email: 'c@d.com', score: 20 },
],
});
expect(result).toEqual({
success: true,
data: insertResult,
message: 'Rows inserted into data table dt-1',
});
expect(mockApiClient.insertDataTableRows).toHaveBeenCalledWith('dt-1', {
data: [
{ email: 'a@b.com', score: 10 },
{ email: 'c@d.com', score: 20 },
],
});
});
it('should pass returnType to the API client', async () => {
const insertResult = [{ id: 1, email: 'a@b.com', score: 10 }];
mockApiClient.insertDataTableRows.mockResolvedValue(insertResult);
const result = await handlers.handleInsertRows({
tableId: 'dt-1',
data: [{ email: 'a@b.com', score: 10 }],
returnType: 'all',
});
expect(result.success).toBe(true);
expect(result.data).toEqual(insertResult);
expect(mockApiClient.insertDataTableRows).toHaveBeenCalledWith('dt-1', {
data: [{ email: 'a@b.com', score: 10 }],
returnType: 'all',
});
});
it('should return Zod validation error when data is empty array', async () => {
const result = await handlers.handleInsertRows({
tableId: 'dt-1',
data: [],
});
expect(result.success).toBe(false);
expect(result.error).toBe('Invalid input');
expect(result.details).toHaveProperty('errors');
});
});
// ========================================================================
// handleUpdateRows
// ========================================================================
describe('handleUpdateRows', () => {
it('should update rows successfully', async () => {
const updateResult = { updatedCount: 3 };
mockApiClient.updateDataTableRows.mockResolvedValue(updateResult);
const filter = {
type: 'and' as const,
filters: [{ columnName: 'status', condition: 'eq' as const, value: 'inactive' }],
};
const result = await handlers.handleUpdateRows({
tableId: 'dt-1',
filter,
data: { status: 'active' },
});
expect(result).toEqual({
success: true,
data: updateResult,
message: 'Rows updated successfully',
});
expect(mockApiClient.updateDataTableRows).toHaveBeenCalledWith('dt-1', {
filter,
data: { status: 'active' },
});
});
it('should support dryRun mode', async () => {
const dryRunResult = { matchedCount: 5 };
mockApiClient.updateDataTableRows.mockResolvedValue(dryRunResult);
const filter = {
filters: [{ columnName: 'score', condition: 'lt' as const, value: 5 }],
};
const result = await handlers.handleUpdateRows({
tableId: 'dt-1',
filter,
data: { status: 'low' },
dryRun: true,
});
expect(result.success).toBe(true);
expect(result.message).toBe('Dry run: rows matched (no changes applied)');
expect(mockApiClient.updateDataTableRows).toHaveBeenCalledWith('dt-1', {
filter: { type: 'and', ...filter },
data: { status: 'low' },
dryRun: true,
});
});
it('should return error on API failure', async () => {
mockApiClient.updateDataTableRows.mockRejectedValue(new Error('Conflict'));
const result = await handlers.handleUpdateRows({
tableId: 'dt-1',
filter: { filters: [{ columnName: 'id', condition: 'eq' as const, value: 1 }] },
data: { name: 'test' },
});
expect(result.success).toBe(false);
expect(result.error).toBe('Conflict');
});
});
// ========================================================================
// handleUpsertRows
// ========================================================================
describe('handleUpsertRows', () => {
it('should upsert row successfully', async () => {
const upsertResult = { action: 'updated', row: { id: 1, email: 'a@b.com', score: 15 } };
mockApiClient.upsertDataTableRow.mockResolvedValue(upsertResult);
const filter = {
filters: [{ columnName: 'email', condition: 'eq' as const, value: 'a@b.com' }],
};
const result = await handlers.handleUpsertRows({
tableId: 'dt-1',
filter,
data: { score: 15 },
});
expect(result).toEqual({
success: true,
data: upsertResult,
message: 'Row upserted successfully',
});
expect(mockApiClient.upsertDataTableRow).toHaveBeenCalledWith('dt-1', {
filter: { type: 'and', ...filter },
data: { score: 15 },
});
});
it('should support dryRun mode', async () => {
const dryRunResult = { action: 'would_update', matchedRows: 1 };
mockApiClient.upsertDataTableRow.mockResolvedValue(dryRunResult);
const filter = {
filters: [{ columnName: 'email', condition: 'eq' as const, value: 'a@b.com' }],
};
const result = await handlers.handleUpsertRows({
tableId: 'dt-1',
filter,
data: { score: 20 },
dryRun: true,
});
expect(result.success).toBe(true);
expect(result.message).toBe('Dry run: upsert previewed (no changes applied)');
});
it('should return error on API failure', async () => {
const apiError = new N8nApiError('Server error', 500, 'INTERNAL_ERROR');
mockApiClient.upsertDataTableRow.mockRejectedValue(apiError);
const result = await handlers.handleUpsertRows({
tableId: 'dt-1',
filter: { filters: [{ columnName: 'id', condition: 'eq' as const, value: 1 }] },
data: { name: 'test' },
});
expect(result.success).toBe(false);
expect(result.error).toBeDefined();
expect(result.code).toBe('INTERNAL_ERROR');
});
});
// ========================================================================
// handleDeleteRows
// ========================================================================
describe('handleDeleteRows', () => {
it('should delete rows successfully', async () => {
const deleteResult = { deletedCount: 2 };
mockApiClient.deleteDataTableRows.mockResolvedValue(deleteResult);
const filter = {
filters: [{ columnName: 'status', condition: 'eq' as const, value: 'deleted' }],
};
const result = await handlers.handleDeleteRows({
tableId: 'dt-1',
filter,
});
expect(result).toEqual({
success: true,
data: deleteResult,
message: 'Rows deleted successfully',
});
expect(mockApiClient.deleteDataTableRows).toHaveBeenCalledWith('dt-1', {
filter: encodeURIComponent(JSON.stringify({ type: 'and', ...filter })),
});
});
it('should URL-encode serialized filter for API call', async () => {
mockApiClient.deleteDataTableRows.mockResolvedValue({ deletedCount: 1 });
const filter = {
type: 'or' as const,
filters: [
{ columnName: 'score', condition: 'lt' as const, value: 0 },
{ columnName: 'status', condition: 'eq' as const, value: 'spam' },
],
};
await handlers.handleDeleteRows({ tableId: 'dt-1', filter });
expect(mockApiClient.deleteDataTableRows).toHaveBeenCalledWith('dt-1', {
filter: encodeURIComponent(JSON.stringify(filter)),
});
});
it('should support dryRun mode', async () => {
const dryRunResult = { matchedCount: 4 };
mockApiClient.deleteDataTableRows.mockResolvedValue(dryRunResult);
const filter = {
filters: [{ columnName: 'active', condition: 'eq' as const, value: false }],
};
const result = await handlers.handleDeleteRows({
tableId: 'dt-1',
filter,
dryRun: true,
});
expect(result.success).toBe(true);
expect(result.message).toBe('Dry run: rows matched for deletion (no changes applied)');
expect(mockApiClient.deleteDataTableRows).toHaveBeenCalledWith('dt-1', {
filter: encodeURIComponent(JSON.stringify({ type: 'and', ...filter })),
dryRun: true,
});
});
});
});
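The deleted tests above pin down how the row handlers serialize filters before calling the n8n API: an object filter is JSON-stringified and URL-encoded, a string filter is URL-encoded as-is, and a missing `type` defaults to `'and'`. A minimal sketch of that serialization step follows; the helper names `serializeFilter` and `withDefaultType` are illustrative, not the handlers' actual internals.

```typescript
// Filter shape as used in the tests above.
type DataTableFilter = {
  type?: 'and' | 'or';
  filters: Array<{ columnName: string; condition: string; value: unknown }>;
};

// Illustrative helper (assumption: name and factoring are not from the real handlers).
function serializeFilter(filter: string | DataTableFilter): string {
  // String filters are assumed to already be JSON; object filters are
  // stringified first. Both are URL-encoded for use as a query parameter.
  const json = typeof filter === 'string' ? filter : JSON.stringify(filter);
  return encodeURIComponent(json);
}

// Matches the tested expectation `{ type: 'and', ...filter }`:
// default the combinator to 'and' when the caller omits it.
function withDefaultType(filter: DataTableFilter): DataTableFilter {
  return { type: 'and', ...filter };
}
```

This mirrors assertions such as `expect(...).toHaveBeenCalledWith('dt-1', { filter: encodeURIComponent(JSON.stringify({ type: 'and', ...filter })) })` in the deleted suite.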


@@ -100,16 +100,6 @@ describe('handlers-n8n-manager', () => {
 listExecutions: vi.fn(),
 deleteExecution: vi.fn(),
 healthCheck: vi.fn(),
-createDataTable: vi.fn(),
-listDataTables: vi.fn(),
-getDataTable: vi.fn(),
-updateDataTable: vi.fn(),
-deleteDataTable: vi.fn(),
-getDataTableRows: vi.fn(),
-insertDataTableRows: vi.fn(),
-updateDataTableRows: vi.fn(),
-upsertDataTableRow: vi.fn(),
-deleteDataTableRows: vi.fn(),
 };
 // Setup mock repository
@@ -641,27 +631,6 @@ describe('handlers-n8n-manager', () => {
 expect(result.details.errors[0]).toContain('Webhook');
 });
 });
-it('should pass projectId to API when provided', async () => {
-const testWorkflow = createTestWorkflow();
-const input = {
-name: 'Test Workflow',
-nodes: testWorkflow.nodes,
-connections: testWorkflow.connections,
-projectId: 'project-abc-123',
-};
-mockApiClient.createWorkflow.mockResolvedValue(testWorkflow);
-const result = await handlers.handleCreateWorkflow(input);
-expect(result.success).toBe(true);
-expect(mockApiClient.createWorkflow).toHaveBeenCalledWith(
-expect.objectContaining({
-projectId: 'project-abc-123',
-})
-);
-});
 });
 describe('handleGetWorkflow', () => {
@@ -1112,10 +1081,10 @@ describe('handlers-n8n-manager', () => {
 enabled: true,
 },
 managementTools: {
-count: 14,
+count: 13,
 enabled: true,
 },
-totalAvailable: 21,
+totalAvailable: 20,
 },
 });


@@ -73,10 +73,6 @@ describe('handlers-workflow-diff', () => {
 mockApiClient = {
 getWorkflow: vi.fn(),
 updateWorkflow: vi.fn(),
-listTags: vi.fn().mockResolvedValue({ data: [] }),
-createTag: vi.fn(),
-updateWorkflowTags: vi.fn().mockResolvedValue([]),
-transferWorkflow: vi.fn().mockResolvedValue(undefined),
 };
 // Setup mock diff engine
@@ -154,7 +150,6 @@ describe('handlers-workflow-diff', () => {
 expect(result).toEqual({
 success: true,
-saved: true,
 data: {
 id: 'test-workflow-id',
 name: 'Test Workflow',
@@ -314,12 +309,10 @@ describe('handlers-workflow-diff', () => {
 expect(result).toEqual({
 success: false,
-saved: false,
-operationsApplied: 0,
 error: 'Failed to apply diff operations',
 details: {
 errors: ['Node "non-existent-node" not found'],
-warnings: undefined,
+operationsApplied: 0,
 applied: [],
 failed: [0],
 },
@@ -637,14 +630,10 @@ describe('handlers-workflow-diff', () => {
 expect(result).toEqual({
 success: false,
-saved: false,
-operationsApplied: 1,
 error: 'Failed to apply diff operations',
 details: {
 errors: ['Operation 2 failed: Node "invalid-node" not found'],
-warnings: undefined,
-applied: undefined,
-failed: undefined,
+operationsApplied: 1,
 },
 });
 });
@@ -866,438 +855,5 @@
 });
 });
 });
describe('Tag Operations via Dedicated API', () => {
it('should create a new tag and associate it with the workflow', async () => {
const testWorkflow = createTestWorkflow();
const updatedWorkflow = { ...testWorkflow };
mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
mockDiffEngine.applyDiff.mockResolvedValue({
success: true,
workflow: updatedWorkflow,
operationsApplied: 1,
message: 'Success',
errors: [],
tagsToAdd: ['new-tag'],
});
mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
mockApiClient.listTags.mockResolvedValue({ data: [] });
mockApiClient.createTag.mockResolvedValue({ id: 'tag-123', name: 'new-tag' });
const result = await handleUpdatePartialWorkflow({
id: 'test-workflow-id',
operations: [{ type: 'addTag', tag: 'new-tag' }],
}, mockRepository);
expect(result.success).toBe(true);
expect(mockApiClient.createTag).toHaveBeenCalledWith({ name: 'new-tag' });
expect(mockApiClient.updateWorkflowTags).toHaveBeenCalledWith('test-workflow-id', ['tag-123']);
});
it('should use existing tag ID when tag already exists', async () => {
const testWorkflow = createTestWorkflow();
const updatedWorkflow = { ...testWorkflow };
mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
mockDiffEngine.applyDiff.mockResolvedValue({
success: true,
workflow: updatedWorkflow,
operationsApplied: 1,
message: 'Success',
errors: [],
tagsToAdd: ['existing-tag'],
});
mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
mockApiClient.listTags.mockResolvedValue({ data: [{ id: 'tag-456', name: 'existing-tag' }] });
const result = await handleUpdatePartialWorkflow({
id: 'test-workflow-id',
operations: [{ type: 'addTag', tag: 'existing-tag' }],
}, mockRepository);
expect(result.success).toBe(true);
expect(mockApiClient.createTag).not.toHaveBeenCalled();
expect(mockApiClient.updateWorkflowTags).toHaveBeenCalledWith('test-workflow-id', ['tag-456']);
});
it('should remove a tag from the workflow', async () => {
const testWorkflow = createTestWorkflow({
tags: [{ id: 'tag-789', name: 'old-tag' }],
});
const updatedWorkflow = { ...testWorkflow };
mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
mockDiffEngine.applyDiff.mockResolvedValue({
success: true,
workflow: updatedWorkflow,
operationsApplied: 1,
message: 'Success',
errors: [],
tagsToRemove: ['old-tag'],
});
mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
mockApiClient.listTags.mockResolvedValue({ data: [{ id: 'tag-789', name: 'old-tag' }] });
const result = await handleUpdatePartialWorkflow({
id: 'test-workflow-id',
operations: [{ type: 'removeTag', tag: 'old-tag' }],
}, mockRepository);
expect(result.success).toBe(true);
expect(mockApiClient.updateWorkflowTags).toHaveBeenCalledWith('test-workflow-id', []);
});
it('should produce warning on tag creation failure without failing the operation', async () => {
const testWorkflow = createTestWorkflow();
const updatedWorkflow = { ...testWorkflow };
mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
mockDiffEngine.applyDiff.mockResolvedValue({
success: true,
workflow: updatedWorkflow,
operationsApplied: 1,
message: 'Success',
errors: [],
tagsToAdd: ['fail-tag'],
});
mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
mockApiClient.listTags.mockResolvedValue({ data: [] });
mockApiClient.createTag.mockRejectedValue(new Error('Tag creation failed'));
const result = await handleUpdatePartialWorkflow({
id: 'test-workflow-id',
operations: [{ type: 'addTag', tag: 'fail-tag' }],
}, mockRepository);
expect(result.success).toBe(true);
expect(result.saved).toBe(true);
// Tag creation failure should produce a warning, not block the update
const warnings = (result.details as any)?.warnings;
expect(warnings).toBeDefined();
expect(warnings.some((w: any) => w.message.includes('Failed to create tag'))).toBe(true);
});
it('should not call tag APIs when no tag operations are present', async () => {
const testWorkflow = createTestWorkflow();
const updatedWorkflow = { ...testWorkflow };
mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
mockDiffEngine.applyDiff.mockResolvedValue({
success: true,
workflow: updatedWorkflow,
operationsApplied: 1,
message: 'Success',
errors: [],
});
mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
await handleUpdatePartialWorkflow({
id: 'test-workflow-id',
operations: [{ type: 'updateName', name: 'New Name' }],
}, mockRepository);
expect(mockApiClient.listTags).not.toHaveBeenCalled();
expect(mockApiClient.createTag).not.toHaveBeenCalled();
expect(mockApiClient.updateWorkflowTags).not.toHaveBeenCalled();
});
});
describe('Project Transfer via Dedicated API', () => {
it('should call transferWorkflow when diffResult has transferToProjectId', async () => {
const testWorkflow = createTestWorkflow();
const updatedWorkflow = { ...testWorkflow };
mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
mockDiffEngine.applyDiff.mockResolvedValue({
success: true,
workflow: updatedWorkflow,
operationsApplied: 1,
message: 'Success',
errors: [],
transferToProjectId: 'project-abc-123',
});
mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
const result = await handleUpdatePartialWorkflow({
id: 'test-workflow-id',
operations: [{ type: 'transferWorkflow', destinationProjectId: 'project-abc-123' }],
}, mockRepository);
expect(result.success).toBe(true);
expect(mockApiClient.transferWorkflow).toHaveBeenCalledWith('test-workflow-id', 'project-abc-123');
expect(result.message).toContain('transferred to project');
});
it('should NOT call transferWorkflow when transferToProjectId is absent', async () => {
const testWorkflow = createTestWorkflow();
const updatedWorkflow = { ...testWorkflow };
mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
mockDiffEngine.applyDiff.mockResolvedValue({
success: true,
workflow: updatedWorkflow,
operationsApplied: 1,
message: 'Success',
errors: [],
});
mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
await handleUpdatePartialWorkflow({
id: 'test-workflow-id',
operations: [{ type: 'updateName', name: 'New Name' }],
}, mockRepository);
expect(mockApiClient.transferWorkflow).not.toHaveBeenCalled();
});
it('should return success false with saved true when transfer fails', async () => {
const testWorkflow = createTestWorkflow();
const updatedWorkflow = { ...testWorkflow };
mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
mockDiffEngine.applyDiff.mockResolvedValue({
success: true,
workflow: updatedWorkflow,
operationsApplied: 1,
message: 'Success',
errors: [],
transferToProjectId: 'project-bad-id',
});
mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
mockApiClient.transferWorkflow.mockRejectedValue(new Error('Project not found'));
const result = await handleUpdatePartialWorkflow({
id: 'test-workflow-id',
operations: [{ type: 'transferWorkflow', destinationProjectId: 'project-bad-id' }],
}, mockRepository);
expect(result.success).toBe(false);
expect(result.saved).toBe(true);
expect(result.error).toBe('Workflow updated successfully but project transfer failed');
expect(result.details).toEqual({
workflowUpdated: true,
transferError: 'Project not found',
});
});
it('should return Unknown error when non-Error value is thrown during transfer', async () => {
const testWorkflow = createTestWorkflow();
const updatedWorkflow = { ...testWorkflow };
mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
mockDiffEngine.applyDiff.mockResolvedValue({
success: true,
workflow: updatedWorkflow,
operationsApplied: 1,
message: 'Success',
errors: [],
transferToProjectId: 'project-unknown',
});
mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
mockApiClient.transferWorkflow.mockRejectedValue('string error');
const result = await handleUpdatePartialWorkflow({
id: 'test-workflow-id',
operations: [{ type: 'transferWorkflow', destinationProjectId: 'project-unknown' }],
}, mockRepository);
expect(result.success).toBe(false);
expect(result.saved).toBe(true);
expect(result.details).toEqual({
workflowUpdated: true,
transferError: 'Unknown error',
});
});
it('should call transferWorkflow BEFORE activateWorkflow', async () => {
const testWorkflow = createTestWorkflow({ active: false });
const updatedWorkflow = { ...testWorkflow, active: false };
const activatedWorkflow = { ...testWorkflow, active: true };
const callOrder: string[] = [];
mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
mockDiffEngine.applyDiff.mockResolvedValue({
success: true,
workflow: updatedWorkflow,
operationsApplied: 2,
message: 'Success',
errors: [],
transferToProjectId: 'project-target',
shouldActivate: true,
});
mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
mockApiClient.transferWorkflow.mockImplementation(async () => {
callOrder.push('transfer');
});
mockApiClient.activateWorkflow = vi.fn().mockImplementation(async () => {
callOrder.push('activate');
return activatedWorkflow;
});
const result = await handleUpdatePartialWorkflow({
id: 'test-workflow-id',
operations: [
{ type: 'transferWorkflow', destinationProjectId: 'project-target' },
{ type: 'activateWorkflow' },
],
}, mockRepository);
expect(result.success).toBe(true);
expect(mockApiClient.transferWorkflow).toHaveBeenCalledWith('test-workflow-id', 'project-target');
expect(mockApiClient.activateWorkflow).toHaveBeenCalledWith('test-workflow-id');
expect(callOrder).toEqual(['transfer', 'activate']);
});
it('should skip activation when transfer fails', async () => {
const testWorkflow = createTestWorkflow({ active: false });
const updatedWorkflow = { ...testWorkflow, active: false };
mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
mockDiffEngine.applyDiff.mockResolvedValue({
success: true,
workflow: updatedWorkflow,
operationsApplied: 2,
message: 'Success',
errors: [],
transferToProjectId: 'project-fail',
shouldActivate: true,
});
mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
mockApiClient.transferWorkflow.mockRejectedValue(new Error('Transfer denied'));
mockApiClient.activateWorkflow = vi.fn();
const result = await handleUpdatePartialWorkflow({
id: 'test-workflow-id',
operations: [
{ type: 'transferWorkflow', destinationProjectId: 'project-fail' },
{ type: 'activateWorkflow' },
],
}, mockRepository);
expect(result.success).toBe(false);
expect(result.saved).toBe(true);
expect(result.error).toBe('Workflow updated successfully but project transfer failed');
expect(mockApiClient.activateWorkflow).not.toHaveBeenCalled();
});
});
describe('field name normalization', () => {
it('should normalize "name" to "nodeName" for updateNode operations', async () => {
const testWorkflow = createTestWorkflow();
const updatedWorkflow = { ...testWorkflow };
mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
mockDiffEngine.applyDiff.mockResolvedValue({
success: true,
workflow: updatedWorkflow,
operationsApplied: 1,
message: 'Success',
errors: [],
});
mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
await handleUpdatePartialWorkflow({
id: 'test-workflow-id',
operations: [{
type: 'updateNode',
name: 'HTTP Request', // LLMs often use "name" instead of "nodeName"
updates: { 'parameters.url': 'https://new-url.com' },
}],
}, mockRepository);
// Verify the diff engine received nodeName (normalized from name)
expect(mockDiffEngine.applyDiff).toHaveBeenCalled();
const diffArgs = mockDiffEngine.applyDiff.mock.calls[0][1];
expect(diffArgs.operations[0].nodeName).toBe('HTTP Request');
});
it('should normalize "id" to "nodeId" for removeNode operations', async () => {
const testWorkflow = createTestWorkflow();
const updatedWorkflow = { ...testWorkflow };
mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
mockDiffEngine.applyDiff.mockResolvedValue({
success: true,
workflow: updatedWorkflow,
operationsApplied: 1,
message: 'Success',
errors: [],
});
mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
await handleUpdatePartialWorkflow({
id: 'test-workflow-id',
operations: [{
type: 'removeNode',
id: 'node2', // LLMs may use "id" instead of "nodeId"
}],
}, mockRepository);
// Verify the diff engine received nodeId (normalized from id)
expect(mockDiffEngine.applyDiff).toHaveBeenCalled();
const diffArgs = mockDiffEngine.applyDiff.mock.calls[0][1];
expect(diffArgs.operations[0].nodeId).toBe('node2');
});
it('should NOT normalize "name" for updateName operations', async () => {
const testWorkflow = createTestWorkflow();
const updatedWorkflow = { ...testWorkflow };
mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
mockDiffEngine.applyDiff.mockResolvedValue({
success: true,
workflow: updatedWorkflow,
operationsApplied: 1,
message: 'Success',
errors: [],
});
mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
await handleUpdatePartialWorkflow({
id: 'test-workflow-id',
operations: [{
type: 'updateName',
name: 'New Workflow Name', // This is the correct field for updateName
}],
}, mockRepository);
// Verify "name" stays as "name" (not moved to nodeName) for updateName
expect(mockDiffEngine.applyDiff).toHaveBeenCalled();
const diffArgs = mockDiffEngine.applyDiff.mock.calls[0][1];
expect(diffArgs.operations[0].name).toBe('New Workflow Name');
expect(diffArgs.operations[0].nodeName).toBeUndefined();
});
it('should prefer explicit "nodeName" over "name" alias', async () => {
const testWorkflow = createTestWorkflow();
const updatedWorkflow = { ...testWorkflow };
mockApiClient.getWorkflow.mockResolvedValue(testWorkflow);
mockDiffEngine.applyDiff.mockResolvedValue({
success: true,
workflow: updatedWorkflow,
operationsApplied: 1,
message: 'Success',
errors: [],
});
mockApiClient.updateWorkflow.mockResolvedValue(updatedWorkflow);
await handleUpdatePartialWorkflow({
id: 'test-workflow-id',
operations: [{
type: 'updateNode',
nodeName: 'HTTP Request', // Explicit nodeName provided
name: 'Should Be Ignored', // Should NOT override nodeName
updates: { 'parameters.url': 'https://new-url.com' },
}],
}, mockRepository);
expect(mockDiffEngine.applyDiff).toHaveBeenCalled();
const diffArgs = mockDiffEngine.applyDiff.mock.calls[0][1];
expect(diffArgs.operations[0].nodeName).toBe('HTTP Request');
});
});
});
});


@@ -542,9 +542,6 @@ describe('Parameter Validation', () => {
await expect(server.testExecuteTool('n8n_test_workflow', {}))
.rejects.toThrow('Missing required parameters for n8n_test_workflow: workflowId');
await expect(server.testExecuteTool('n8n_manage_datatable', {}))
.rejects.toThrow('n8n_manage_datatable: Validation failed:\n • action: action is required');
for (const tool of n8nToolsWithRequiredParams) {
await expect(server.testExecuteTool(tool.name, tool.args))
.rejects.toThrow(tool.expected);


@@ -297,7 +297,7 @@ describe('N8nApiClient', () => {
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nNotFoundError);
expect((err as N8nNotFoundError).message.toLowerCase()).toContain('not found');
expect((err as N8nNotFoundError).statusCode).toBe(404);
}
});
@@ -380,7 +380,7 @@ describe('N8nApiClient', () => {
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nNotFoundError);
expect((err as N8nNotFoundError).message.toLowerCase()).toContain('not found');
expect((err as N8nNotFoundError).statusCode).toBe(404);
}
});
@@ -398,7 +398,7 @@ describe('N8nApiClient', () => {
const result = await client.activateWorkflow('123');
expect(mockAxiosInstance.post).toHaveBeenCalledWith('/workflows/123/activate', {});
expect(result).toEqual(activatedWorkflow);
expect(result.active).toBe(true);
});
@@ -432,7 +432,7 @@ describe('N8nApiClient', () => {
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nNotFoundError);
expect((err as N8nNotFoundError).message.toLowerCase()).toContain('not found');
expect((err as N8nNotFoundError).statusCode).toBe(404);
}
});
@@ -484,7 +484,7 @@ describe('N8nApiClient', () => {
const result = await client.deactivateWorkflow('123');
expect(mockAxiosInstance.post).toHaveBeenCalledWith('/workflows/123/deactivate', {});
expect(result).toEqual(deactivatedWorkflow);
expect(result.active).toBe(false);
});
@@ -501,7 +501,7 @@ describe('N8nApiClient', () => {
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nNotFoundError);
expect((err as N8nNotFoundError).message.toLowerCase()).toContain('not found');
expect((err as N8nNotFoundError).statusCode).toBe(404);
}
});
@@ -1250,413 +1250,6 @@ describe('N8nApiClient', () => {
});
});
describe('transferWorkflow', () => {
beforeEach(() => {
client = new N8nApiClient(defaultConfig);
});
it('should transfer workflow successfully via PUT', async () => {
mockAxiosInstance.put.mockResolvedValue({ data: undefined });
await client.transferWorkflow('123', 'project-456');
expect(mockAxiosInstance.put).toHaveBeenCalledWith(
'/workflows/123/transfer',
{ destinationProjectId: 'project-456' }
);
});
it('should throw N8nNotFoundError on 404', async () => {
const error = {
message: 'Request failed',
response: { status: 404, data: { message: 'Workflow not found' } }
};
await mockAxiosInstance.simulateError('put', error);
try {
await client.transferWorkflow('123', 'project-456');
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nNotFoundError);
expect((err as N8nNotFoundError).message.toLowerCase()).toContain('not found');
expect((err as N8nNotFoundError).statusCode).toBe(404);
}
});
it('should throw appropriate error on 403 forbidden', async () => {
const error = {
message: 'Request failed',
response: { status: 403, data: { message: 'Forbidden' } }
};
await mockAxiosInstance.simulateError('put', error);
try {
await client.transferWorkflow('123', 'project-456');
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nApiError);
expect((err as N8nApiError).statusCode).toBe(403);
}
});
});
describe('createDataTable', () => {
beforeEach(() => {
client = new N8nApiClient(defaultConfig);
});
it('should create data table with name and columns', async () => {
const params = {
name: 'My Table',
columns: [
{ name: 'email', type: 'string' as const },
{ name: 'count', type: 'number' as const },
],
};
const createdTable = { id: 'dt-1', name: 'My Table', columns: [] };
mockAxiosInstance.post.mockResolvedValue({ data: createdTable });
const result = await client.createDataTable(params);
expect(mockAxiosInstance.post).toHaveBeenCalledWith('/data-tables', params);
expect(result).toEqual(createdTable);
});
it('should create data table without columns', async () => {
const params = { name: 'Empty Table' };
const createdTable = { id: 'dt-2', name: 'Empty Table' };
mockAxiosInstance.post.mockResolvedValue({ data: createdTable });
const result = await client.createDataTable(params);
expect(mockAxiosInstance.post).toHaveBeenCalledWith('/data-tables', params);
expect(result).toEqual(createdTable);
});
it('should handle 400 error', async () => {
const error = {
message: 'Request failed',
response: { status: 400, data: { message: 'Invalid table name' } },
};
await mockAxiosInstance.simulateError('post', error);
try {
await client.createDataTable({ name: '' });
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nValidationError);
expect((err as N8nValidationError).message).toBe('Invalid table name');
expect((err as N8nValidationError).statusCode).toBe(400);
}
});
});
describe('listDataTables', () => {
beforeEach(() => {
client = new N8nApiClient(defaultConfig);
});
it('should list data tables successfully', async () => {
const response = { data: [{ id: 'dt-1', name: 'Table One' }], nextCursor: 'abc' };
mockAxiosInstance.get.mockResolvedValue({ data: response });
const result = await client.listDataTables({ limit: 10, cursor: 'xyz' });
expect(mockAxiosInstance.get).toHaveBeenCalledWith('/data-tables', { params: { limit: 10, cursor: 'xyz' } });
expect(result).toEqual(response);
});
it('should handle error', async () => {
const error = {
message: 'Request failed',
response: { status: 500, data: { message: 'Internal server error' } },
};
await mockAxiosInstance.simulateError('get', error);
try {
await client.listDataTables();
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nServerError);
expect((err as N8nServerError).statusCode).toBe(500);
}
});
});
describe('getDataTable', () => {
beforeEach(() => {
client = new N8nApiClient(defaultConfig);
});
it('should get data table successfully', async () => {
const table = { id: 'dt-1', name: 'My Table', columns: [] };
mockAxiosInstance.get.mockResolvedValue({ data: table });
const result = await client.getDataTable('dt-1');
expect(mockAxiosInstance.get).toHaveBeenCalledWith('/data-tables/dt-1');
expect(result).toEqual(table);
});
it('should handle 404 error', async () => {
const error = {
message: 'Request failed',
response: { status: 404, data: { message: 'Data table not found' } },
};
await mockAxiosInstance.simulateError('get', error);
try {
await client.getDataTable('dt-nonexistent');
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nNotFoundError);
expect((err as N8nNotFoundError).statusCode).toBe(404);
}
});
});
describe('updateDataTable', () => {
beforeEach(() => {
client = new N8nApiClient(defaultConfig);
});
it('should update data table successfully', async () => {
const updated = { id: 'dt-1', name: 'Renamed' };
mockAxiosInstance.patch.mockResolvedValue({ data: updated });
const result = await client.updateDataTable('dt-1', { name: 'Renamed' });
expect(mockAxiosInstance.patch).toHaveBeenCalledWith('/data-tables/dt-1', { name: 'Renamed' });
expect(result).toEqual(updated);
});
it('should handle error', async () => {
const error = {
message: 'Request failed',
response: { status: 400, data: { message: 'Invalid name' } },
};
await mockAxiosInstance.simulateError('patch', error);
try {
await client.updateDataTable('dt-1', { name: '' });
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nValidationError);
expect((err as N8nValidationError).statusCode).toBe(400);
}
});
});
describe('deleteDataTable', () => {
beforeEach(() => {
client = new N8nApiClient(defaultConfig);
});
it('should delete data table successfully', async () => {
mockAxiosInstance.delete.mockResolvedValue({ data: {} });
await client.deleteDataTable('dt-1');
expect(mockAxiosInstance.delete).toHaveBeenCalledWith('/data-tables/dt-1');
});
it('should handle 404 error', async () => {
const error = {
message: 'Request failed',
response: { status: 404, data: { message: 'Data table not found' } },
};
await mockAxiosInstance.simulateError('delete', error);
try {
await client.deleteDataTable('dt-nonexistent');
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nNotFoundError);
expect((err as N8nNotFoundError).statusCode).toBe(404);
}
});
});
describe('getDataTableRows', () => {
beforeEach(() => {
client = new N8nApiClient(defaultConfig);
});
it('should get data table rows with params', async () => {
const response = { data: [{ id: 1, email: 'a@b.com' }], nextCursor: null };
mockAxiosInstance.get.mockResolvedValue({ data: response });
const params = { limit: 50, sortBy: 'email:asc', search: 'john' };
const result = await client.getDataTableRows('dt-1', params);
expect(mockAxiosInstance.get).toHaveBeenCalledWith('/data-tables/dt-1/rows', { params });
expect(result).toEqual(response);
});
it('should handle error', async () => {
const error = {
message: 'Request failed',
response: { status: 500, data: { message: 'Internal server error' } },
};
await mockAxiosInstance.simulateError('get', error);
try {
await client.getDataTableRows('dt-1');
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nServerError);
expect((err as N8nServerError).statusCode).toBe(500);
}
});
});
describe('insertDataTableRows', () => {
beforeEach(() => {
client = new N8nApiClient(defaultConfig);
});
it('should insert data table rows successfully', async () => {
const insertResult = { insertedCount: 2 };
mockAxiosInstance.post.mockResolvedValue({ data: insertResult });
const params = { data: [{ email: 'a@b.com' }, { email: 'c@d.com' }], returnType: 'count' as const };
const result = await client.insertDataTableRows('dt-1', params);
expect(mockAxiosInstance.post).toHaveBeenCalledWith('/data-tables/dt-1/rows', params);
expect(result).toEqual(insertResult);
});
it('should handle 400 error', async () => {
const error = {
message: 'Request failed',
response: { status: 400, data: { message: 'Invalid row data' } },
};
await mockAxiosInstance.simulateError('post', error);
try {
await client.insertDataTableRows('dt-1', { data: [{}] });
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nValidationError);
expect((err as N8nValidationError).message).toBe('Invalid row data');
expect((err as N8nValidationError).statusCode).toBe(400);
}
});
});
describe('updateDataTableRows', () => {
beforeEach(() => {
client = new N8nApiClient(defaultConfig);
});
it('should update data table rows successfully', async () => {
const updateResult = { updatedCount: 3 };
mockAxiosInstance.patch.mockResolvedValue({ data: updateResult });
const params = {
filter: { type: 'and' as const, filters: [{ columnName: 'status', condition: 'eq' as const, value: 'old' }] },
data: { status: 'new' },
};
const result = await client.updateDataTableRows('dt-1', params);
expect(mockAxiosInstance.patch).toHaveBeenCalledWith('/data-tables/dt-1/rows/update', params);
expect(result).toEqual(updateResult);
});
it('should handle error', async () => {
const error = {
message: 'Request failed',
response: { status: 500, data: { message: 'Internal server error' } },
};
await mockAxiosInstance.simulateError('patch', error);
try {
await client.updateDataTableRows('dt-1', {
filter: { type: 'and', filters: [{ columnName: 'id', condition: 'eq', value: 1 }] },
data: { name: 'test' },
});
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nServerError);
expect((err as N8nServerError).statusCode).toBe(500);
}
});
});
describe('upsertDataTableRow', () => {
beforeEach(() => {
client = new N8nApiClient(defaultConfig);
});
it('should upsert data table row successfully', async () => {
const upsertResult = { action: 'updated', row: { id: 1, email: 'a@b.com' } };
mockAxiosInstance.post.mockResolvedValue({ data: upsertResult });
const params = {
filter: { type: 'and' as const, filters: [{ columnName: 'email', condition: 'eq' as const, value: 'a@b.com' }] },
data: { score: 15 },
};
const result = await client.upsertDataTableRow('dt-1', params);
expect(mockAxiosInstance.post).toHaveBeenCalledWith('/data-tables/dt-1/rows/upsert', params);
expect(result).toEqual(upsertResult);
});
it('should handle error', async () => {
const error = {
message: 'Request failed',
response: { status: 400, data: { message: 'Invalid upsert params' } },
};
await mockAxiosInstance.simulateError('post', error);
try {
await client.upsertDataTableRow('dt-1', {
filter: { type: 'and', filters: [{ columnName: 'id', condition: 'eq', value: 1 }] },
data: { name: 'test' },
});
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nValidationError);
expect((err as N8nValidationError).statusCode).toBe(400);
}
});
});
describe('deleteDataTableRows', () => {
beforeEach(() => {
client = new N8nApiClient(defaultConfig);
});
it('should delete data table rows successfully', async () => {
const deleteResult = { deletedCount: 2 };
mockAxiosInstance.delete.mockResolvedValue({ data: deleteResult });
const params = { filter: '{"type":"and","filters":[]}', dryRun: false };
const result = await client.deleteDataTableRows('dt-1', params);
expect(mockAxiosInstance.delete).toHaveBeenCalledWith('/data-tables/dt-1/rows/delete', { params });
expect(result).toEqual(deleteResult);
});
it('should handle error', async () => {
const error = {
message: 'Request failed',
response: { status: 500, data: { message: 'Internal server error' } },
};
await mockAxiosInstance.simulateError('delete', error);
try {
await client.deleteDataTableRows('dt-1', { filter: '{}' });
expect.fail('Should have thrown an error');
} catch (err) {
expect(err).toBeInstanceOf(N8nServerError);
expect((err as N8nServerError).statusCode).toBe(500);
}
});
});
describe('interceptors', () => {
let requestInterceptor: any;
let responseInterceptor: any;
@@ -1724,4 +1317,4 @@ describe('N8nApiClient', () => {
expect(result.message).toBe('Bad request');
});
});
});


@@ -1096,83 +1096,6 @@ describe('n8n-validation', () => {
expect(disconnectedErrors).toHaveLength(0);
});
it('should NOT flag nodes as disconnected when connected via ai_outputParser', () => {
const workflow = {
name: 'AI Output Parser Workflow',
nodes: [
{
id: 'agent-1',
name: 'AI Agent',
type: '@n8n/n8n-nodes-langchain.agent',
typeVersion: 1.6,
position: [500, 300] as [number, number],
parameters: {},
},
{
id: 'parser-1',
name: 'Structured Output Parser',
type: '@n8n/n8n-nodes-langchain.outputParserStructured',
typeVersion: 1,
position: [300, 400] as [number, number],
parameters: {},
},
],
connections: {
'Structured Output Parser': {
ai_outputParser: [[{ node: 'AI Agent', type: 'ai_outputParser', index: 0 }]],
},
},
};
const errors = validateWorkflowStructure(workflow);
const disconnectedErrors = errors.filter(e => e.includes('Disconnected'));
expect(disconnectedErrors).toHaveLength(0);
});
it('should NOT flag nodes as disconnected when connected via ai_document or ai_textSplitter', () => {
const workflow = {
name: 'Document Processing Workflow',
nodes: [
{
id: 'vs-1',
name: 'Pinecone Vector Store',
type: '@n8n/n8n-nodes-langchain.vectorStorePinecone',
typeVersion: 1,
position: [500, 300] as [number, number],
parameters: {},
},
{
id: 'doc-1',
name: 'Default Data Loader',
type: '@n8n/n8n-nodes-langchain.documentDefaultDataLoader',
typeVersion: 1,
position: [300, 400] as [number, number],
parameters: {},
},
{
id: 'splitter-1',
name: 'Text Splitter',
type: '@n8n/n8n-nodes-langchain.textSplitterRecursiveCharacterTextSplitter',
typeVersion: 1,
position: [100, 400] as [number, number],
parameters: {},
},
],
connections: {
'Default Data Loader': {
ai_document: [[{ node: 'Pinecone Vector Store', type: 'ai_document', index: 0 }]],
},
'Text Splitter': {
ai_textSplitter: [[{ node: 'Default Data Loader', type: 'ai_textSplitter', index: 0 }]],
},
},
};
const errors = validateWorkflowStructure(workflow);
const disconnectedErrors = errors.filter(e => e.includes('Disconnected'));
expect(disconnectedErrors).toHaveLength(0);
});
it('should still flag truly disconnected nodes in AI workflows', () => {
const workflow = {
name: 'AI Workflow with Disconnected Node',


@@ -1912,55 +1912,6 @@ return [{"json": {"result": result}}]
});
});
it('should not error on primitive return inside helper functions', () => {
context.config = {
language: 'javaScript',
jsCode: 'function isValid(item) { return false; }\nconst items = $input.all();\nreturn items.filter(isValid).map(i => ({json: i.json}));'
};
NodeSpecificValidators.validateCode(context);
const primitiveErrors = context.errors.filter(e => e.message === 'Cannot return primitive values directly');
expect(primitiveErrors).toHaveLength(0);
});
it('should not error on primitive return inside arrow function helpers', () => {
context.config = {
language: 'javaScript',
jsCode: 'const isValid = (item) => { return false; };\nreturn $input.all().filter(isValid).map(i => ({json: i.json}));'
};
NodeSpecificValidators.validateCode(context);
const primitiveErrors = context.errors.filter(e => e.message === 'Cannot return primitive values directly');
expect(primitiveErrors).toHaveLength(0);
});
it('should not error on primitive return inside async function helpers', () => {
context.config = {
language: 'javaScript',
jsCode: 'async function fetchData(url) { return null; }\nconst data = await fetchData("https://api.example.com");\nreturn [{json: {data}}];'
};
NodeSpecificValidators.validateCode(context);
const primitiveErrors = context.errors.filter(e => e.message === 'Cannot return primitive values directly');
expect(primitiveErrors).toHaveLength(0);
});
it('should still error on primitive return without helper functions', () => {
context.config = {
language: 'javaScript',
jsCode: 'return "success";'
};
NodeSpecificValidators.validateCode(context);
expect(context.errors).toContainEqual(expect.objectContaining({
message: 'Cannot return primitive values directly'
}));
});
it('should error on Python primitive return', () => {
context.config = {
language: 'python',
@@ -2087,30 +2038,6 @@ return [{"json": {"result": result}}]
 });
 });
-it('should not warn about $() node reference syntax', () => {
-context.config = {
-language: 'javaScript',
-jsCode: 'const data = $("Previous Node").first().json;\nreturn [{json: data}];'
-};
-NodeSpecificValidators.validateCode(context);
-const dollarWarnings = context.warnings.filter(w => w.message === 'Invalid $ usage detected');
-expect(dollarWarnings).toHaveLength(0);
-});
-it('should not warn about $_ variables', () => {
-context.config = {
-language: 'javaScript',
-jsCode: 'const $_temp = 1;\nreturn [{json: {value: $_temp}}];'
-};
-NodeSpecificValidators.validateCode(context);
-const dollarWarnings = context.warnings.filter(w => w.message === 'Invalid $ usage detected');
-expect(dollarWarnings).toHaveLength(0);
-});
 it('should correct helpers usage', () => {
 context.config = {
 language: 'javaScript',
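The removed Code-node validator tests above all probe one rule: a primitive value (`return false`, `return "success"`) should only be flagged when it is returned at the top level of the snippet, not from inside a helper function. A rough brace-depth sketch of that idea — hypothetical, not the project's actual `validateCode` implementation:

```javascript
// Hypothetical sketch of the rule the removed tests exercise: flag a primitive
// return only at brace depth 0, so returns inside helper functions are ignored.
// Deliberately crude — braces inside strings or template literals would confuse it.
function hasTopLevelPrimitiveReturn(code) {
  let depth = 0;
  let found = false;
  for (const line of code.split('\n')) {
    // A string/number/boolean/null literal returned at the top level
    if (depth === 0 &&
        /^\s*return\s+(["'`].*["'`]|-?\d+(\.\d+)?|true|false|null)\s*;?\s*$/.test(line)) {
      found = true;
    }
    for (const ch of line) {
      if (ch === '{') depth++;
      else if (ch === '}') depth--;
    }
  }
  return found;
}
```

With this sketch, `'return "success";'` is flagged, while snippets whose only primitive returns sit inside `function`, arrow, or `async function` helpers pass clean — matching the expectations in the deleted tests.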

View File

@@ -17,8 +17,7 @@ import {
 AddTagOperation,
 RemoveTagOperation,
 CleanStaleConnectionsOperation,
-ReplaceConnectionsOperation,
-TransferWorkflowOperation
+ReplaceConnectionsOperation
 } from '@/types/workflow-diff';
 import { Workflow } from '@/types/n8n-api';
@@ -425,7 +424,7 @@ describe('WorkflowDiffEngine', () => {
 expect(result.success).toBe(false);
 expect(result.errors![0].message).toContain('Missing required parameter \'updates\'');
-expect(result.errors![0].message).toContain('Correct structure:');
+expect(result.errors![0].message).toContain('Example:');
 });
 });
@@ -1899,15 +1898,16 @@ describe('WorkflowDiffEngine', () => {
 };
 const result = await diffEngine.applyDiff(baseWorkflow, request);
 expect(result.success).toBe(true);
-expect(result.tagsToAdd).toContain('production');
+expect(result.workflow!.tags).toContain('production');
+expect(result.workflow!.tags).toHaveLength(3);
 });
 it('should not add duplicate tags', async () => {
 const operation: AddTagOperation = {
 type: 'addTag',
-tag: 'test' // Already exists in workflow but tagsToAdd tracks it for API
+tag: 'test' // Already exists
 };
 const request: WorkflowDiffRequest = {
@@ -1916,10 +1916,9 @@ describe('WorkflowDiffEngine', () => {
 };
 const result = await diffEngine.applyDiff(baseWorkflow, request);
 expect(result.success).toBe(true);
-// Tags are now tracked for dedicated API call, not modified on workflow
-expect(result.tagsToAdd).toEqual(['test']);
+expect(result.workflow!.tags).toHaveLength(2); // No change
 });
 it('should create tags array if not exists', async () => {
@@ -1936,9 +1935,10 @@ describe('WorkflowDiffEngine', () => {
 };
 const result = await diffEngine.applyDiff(baseWorkflow, request);
 expect(result.success).toBe(true);
-expect(result.tagsToAdd).toEqual(['new-tag']);
+expect(result.workflow!.tags).toBeDefined();
+expect(result.workflow!.tags).toEqual(['new-tag']);
 });
 it('should remove an existing tag', async () => {
@@ -1953,9 +1953,10 @@ describe('WorkflowDiffEngine', () => {
 };
 const result = await diffEngine.applyDiff(baseWorkflow, request);
 expect(result.success).toBe(true);
-expect(result.tagsToRemove).toContain('test');
+expect(result.workflow!.tags).not.toContain('test');
+expect(result.workflow!.tags).toHaveLength(1);
 });
 it('should handle removing non-existent tag gracefully', async () => {
@@ -1970,11 +1971,9 @@ describe('WorkflowDiffEngine', () => {
 };
 const result = await diffEngine.applyDiff(baseWorkflow, request);
 expect(result.success).toBe(true);
-expect(result.tagsToRemove).toEqual(['non-existent']);
-// workflow.tags unchanged since tags are now handled via dedicated API
-expect(result.workflow!.tags).toHaveLength(2);
+expect(result.workflow!.tags).toHaveLength(2); // No change
 });
 });
@@ -2510,7 +2509,7 @@ describe('WorkflowDiffEngine', () => {
 expect(result.failed).toEqual([1]); // Operation 1 failed
 expect(result.errors).toHaveLength(1);
 expect(result.workflow.name).toBe('New Workflow Name');
-expect(result.tagsToAdd).toContain('production');
+expect(result.workflow.tags).toContain('production');
 });
 it('should return success false if all operations fail in continueOnError mode', async () => {
@@ -3357,7 +3356,7 @@ describe('WorkflowDiffEngine', () => {
 expect(result.failed).toContain(1); // replaceConnections with invalid node
 expect(result.applied).toContain(2); // removeConnection with ignoreErrors
 expect(result.applied).toContain(3); // addTag
-expect(result.tagsToAdd).toContain('final-tag');
+expect(result.workflow.tags).toContain('final-tag');
 });
 });
@@ -4611,7 +4610,7 @@ describe('WorkflowDiffEngine', () => {
 expect(result.success).toBe(true);
 expect(result.operationsApplied).toBe(3);
 expect(result.workflow!.name).toBe('Updated Workflow Name');
-expect(result.tagsToAdd).toContain('production');
+expect(result.workflow!.tags).toContain('production');
 expect(result.shouldActivate).toBe(true);
 });
@@ -4883,258 +4882,4 @@ describe('WorkflowDiffEngine', () => {
 expect(result.workflow.connections['Source Node']['main'][0][0].type).toBe('main');
 });
 });
-});
-describe('null value property deletion', () => {
-it('should delete a property when value is null', async () => {
-const node = baseWorkflow.nodes.find((n: any) => n.name === 'HTTP Request')!;
-(node as any).continueOnFail = true;
-const operation: UpdateNodeOperation = {
-type: 'updateNode',
-nodeName: 'HTTP Request',
-updates: { continueOnFail: null }
-};
-const request: WorkflowDiffRequest = {
-id: 'test-workflow',
-operations: [operation]
-};
-const result = await diffEngine.applyDiff(baseWorkflow, request);
-expect(result.success).toBe(true);
-const updatedNode = result.workflow.nodes.find((n: any) => n.name === 'HTTP Request')!;
-expect('continueOnFail' in updatedNode).toBe(false);
-});
-it('should delete a nested property when value is null', async () => {
-const node = baseWorkflow.nodes.find((n: any) => n.name === 'HTTP Request')!;
-(node as any).parameters = { url: 'https://example.com', authentication: 'basic' };
-const operation: UpdateNodeOperation = {
-type: 'updateNode',
-nodeName: 'HTTP Request',
-updates: { 'parameters.authentication': null }
-};
-const request: WorkflowDiffRequest = {
-id: 'test-workflow',
-operations: [operation]
-};
-const result = await diffEngine.applyDiff(baseWorkflow, request);
-expect(result.success).toBe(true);
-const updatedNode = result.workflow.nodes.find((n: any) => n.name === 'HTTP Request')!;
-expect((updatedNode as any).parameters.url).toBe('https://example.com');
-expect('authentication' in (updatedNode as any).parameters).toBe(false);
-});
-it('should set property normally when value is not null', async () => {
-const operation: UpdateNodeOperation = {
-type: 'updateNode',
-nodeName: 'HTTP Request',
-updates: { continueOnFail: true }
-};
-const request: WorkflowDiffRequest = {
-id: 'test-workflow',
-operations: [operation]
-};
-const result = await diffEngine.applyDiff(baseWorkflow, request);
-expect(result.success).toBe(true);
-const updatedNode = result.workflow.nodes.find((n: any) => n.name === 'HTTP Request')!;
-expect((updatedNode as any).continueOnFail).toBe(true);
-});
-it('should be a no-op when deleting a non-existent property', async () => {
-const node = baseWorkflow.nodes.find((n: any) => n.name === 'HTTP Request')!;
-const originalKeys = Object.keys(node).sort();
-const operation: UpdateNodeOperation = {
-type: 'updateNode',
-nodeName: 'HTTP Request',
-updates: { nonExistentProp: null }
-};
-const request: WorkflowDiffRequest = {
-id: 'test-workflow',
-operations: [operation]
-};
-const result = await diffEngine.applyDiff(baseWorkflow, request);
-expect(result.success).toBe(true);
-const updatedNode = result.workflow.nodes.find((n: any) => n.name === 'HTTP Request')!;
-expect('nonExistentProp' in updatedNode).toBe(false);
-});
-it('should skip intermediate object creation when deleting from non-existent parent path', async () => {
-const operation: UpdateNodeOperation = {
-type: 'updateNode',
-nodeName: 'HTTP Request',
-updates: { 'nonExistent.deeply.nested.prop': null }
-};
-const request: WorkflowDiffRequest = {
-id: 'test-workflow',
-operations: [operation]
-};
-const result = await diffEngine.applyDiff(baseWorkflow, request);
-expect(result.success).toBe(true);
-const updatedNode = result.workflow.nodes.find((n: any) => n.name === 'HTTP Request')!;
-expect('nonExistent' in updatedNode).toBe(false);
-});
-});
-describe('transferWorkflow operation', () => {
-it('should set transferToProjectId in result for valid transferWorkflow', async () => {
-const operation: TransferWorkflowOperation = {
-type: 'transferWorkflow',
-destinationProjectId: 'project-abc-123'
-};
-const request: WorkflowDiffRequest = {
-id: 'test-workflow',
-operations: [operation]
-};
-const result = await diffEngine.applyDiff(baseWorkflow, request);
-expect(result.success).toBe(true);
-expect(result.transferToProjectId).toBe('project-abc-123');
-});
-it('should fail validation when destinationProjectId is empty', async () => {
-const operation: TransferWorkflowOperation = {
-type: 'transferWorkflow',
-destinationProjectId: ''
-};
-const request: WorkflowDiffRequest = {
-id: 'test-workflow',
-operations: [operation]
-};
-const result = await diffEngine.applyDiff(baseWorkflow, request);
-expect(result.success).toBe(false);
-expect(result.errors).toBeDefined();
-expect(result.errors![0].message).toContain('destinationProjectId');
-});
-it('should fail validation when destinationProjectId is undefined', async () => {
-const operation = {
-type: 'transferWorkflow',
-destinationProjectId: undefined
-} as any as TransferWorkflowOperation;
-const request: WorkflowDiffRequest = {
-id: 'test-workflow',
-operations: [operation]
-};
-const result = await diffEngine.applyDiff(baseWorkflow, request);
-expect(result.success).toBe(false);
-expect(result.errors).toBeDefined();
-expect(result.errors![0].message).toContain('destinationProjectId');
-});
-it('should not include transferToProjectId when no transferWorkflow operation is present', async () => {
-const operation: UpdateNameOperation = {
-type: 'updateName',
-name: 'Renamed Workflow'
-};
-const request: WorkflowDiffRequest = {
-id: 'test-workflow',
-operations: [operation]
-};
-const result = await diffEngine.applyDiff(baseWorkflow, request);
-expect(result.success).toBe(true);
-expect(result.transferToProjectId).toBeUndefined();
-});
-it('should combine updateName and transferWorkflow operations', async () => {
-const operations: WorkflowDiffOperation[] = [
-{
-type: 'updateName',
-name: 'Transferred Workflow'
-} as UpdateNameOperation,
-{
-type: 'transferWorkflow',
-destinationProjectId: 'project-xyz-789'
-} as TransferWorkflowOperation
-];
-const request: WorkflowDiffRequest = {
-id: 'test-workflow',
-operations
-};
-const result = await diffEngine.applyDiff(baseWorkflow, request);
-expect(result.success).toBe(true);
-expect(result.operationsApplied).toBe(2);
-expect(result.workflow!.name).toBe('Transferred Workflow');
-expect(result.transferToProjectId).toBe('project-xyz-789');
-});
-it('should combine removeTag and transferWorkflow in continueOnError mode', async () => {
-const operations: WorkflowDiffOperation[] = [
-{
-type: 'removeTag',
-tag: 'non-existent-tag'
-} as RemoveTagOperation,
-{
-type: 'transferWorkflow',
-destinationProjectId: 'project-target-456'
-} as TransferWorkflowOperation
-];
-const request: WorkflowDiffRequest = {
-id: 'test-workflow',
-operations,
-continueOnError: true
-};
-const result = await diffEngine.applyDiff(baseWorkflow, request);
-expect(result.success).toBe(true);
-expect(result.transferToProjectId).toBe('project-target-456');
-});
-it('should fail entire batch in atomic mode when transferWorkflow has empty destinationProjectId alongside updateName', async () => {
-const operations: WorkflowDiffOperation[] = [
-{
-type: 'updateName',
-name: 'Should Not Apply'
-} as UpdateNameOperation,
-{
-type: 'transferWorkflow',
-destinationProjectId: ''
-} as TransferWorkflowOperation
-];
-const request: WorkflowDiffRequest = {
-id: 'test-workflow',
-operations
-};
-const result = await diffEngine.applyDiff(baseWorkflow, request);
-expect(result.success).toBe(false);
-expect(result.errors).toBeDefined();
-expect(result.errors![0].message).toContain('destinationProjectId');
-// In atomic mode, the workflow should not be returned since the batch failed
-expect(result.workflow).toBeUndefined();
-});
-});
 });
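Several hunks in the test file above swap tag assertions between `result.tagsToAdd` bookkeeping and direct `workflow.tags` mutation. A minimal sketch of the direct-mutation variant those assertions describe — an assumption for illustration, not the WorkflowDiffEngine's actual code:

```javascript
// Minimal sketch (assumption — not the engine's real implementation) of
// addTag/removeTag applied directly to workflow.tags: create the array when
// missing, skip duplicates, and tolerate removal of a tag that isn't there.
function applyAddTag(workflow, tag) {
  if (!Array.isArray(workflow.tags)) workflow.tags = [];
  if (!workflow.tags.includes(tag)) workflow.tags.push(tag);
  return workflow;
}

function applyRemoveTag(workflow, tag) {
  // Removing a non-existent tag is a graceful no-op, as the tests expect
  if (Array.isArray(workflow.tags)) {
    workflow.tags = workflow.tags.filter((t) => t !== tag);
  }
  return workflow;
}
```

Under these semantics, re-adding an existing tag leaves the length unchanged, and `addTag` on a workflow without a `tags` array creates it — the two edge cases the test hunks assert on.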

View File

@@ -80,9 +80,8 @@ describe('AuthManager.timingSafeCompare', () => {
 // For constant-time comparison, variance should be minimal
 // If maxMedian is 0, check absolute difference is small (< 1000ns)
-// Otherwise, check relative variance is < 50% (relaxed for CI runner noise;
-// the underlying crypto.timingSafeEqual is guaranteed constant-time)
-expect(variance).toBeLessThan(maxMedian === 0 ? 1000 : 0.50);
+// Otherwise, check relative variance is < 10%
+expect(variance).toBeLessThan(maxMedian === 0 ? 1000 : 0.10);
 });
 it('should handle special characters safely', () => {