mirror of
https://github.com/czlonkowski/n8n-mcp.git
synced 2026-03-19 17:03:08 +00:00
Compare commits
5 Commits
| Author | SHA1 | Date | |
|---|---|---|---|
|
|
0918cd5425 | ||
|
|
0998e5486e | ||
|
|
87f26eef18 | ||
|
|
4bad880f44 | ||
|
|
77048347b3 |
4
.github/workflows/dependency-check.yml
vendored
4
.github/workflows/dependency-check.yml
vendored
@@ -59,7 +59,9 @@ jobs:
|
|||||||
run: |
|
run: |
|
||||||
npm init -y
|
npm init -y
|
||||||
# Install from tarball WITHOUT lockfile (simulates npm install n8n-mcp)
|
# Install from tarball WITHOUT lockfile (simulates npm install n8n-mcp)
|
||||||
npm install ./n8n-mcp-*.tgz
|
# Use --ignore-scripts to skip native compilation of transitive deps like isolated-vm
|
||||||
|
# (n8n-mcp only reads node metadata, it doesn't execute n8n nodes at runtime)
|
||||||
|
npm install --ignore-scripts ./n8n-mcp-*.tgz
|
||||||
|
|
||||||
- name: Verify critical dependency versions
|
- name: Verify critical dependency versions
|
||||||
working-directory: /tmp/fresh-install-test
|
working-directory: /tmp/fresh-install-test
|
||||||
|
|||||||
61
CHANGELOG.md
61
CHANGELOG.md
@@ -7,6 +7,67 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
|
|||||||
|
|
||||||
## [Unreleased]
|
## [Unreleased]
|
||||||
|
|
||||||
|
## [2.36.0] - 2026-03-07
|
||||||
|
|
||||||
|
### Added
|
||||||
|
|
||||||
|
- **Connection validation: detect broken/malformed workflow connections** (Issue #620):
|
||||||
|
- Unknown output keys (`UNKNOWN_CONNECTION_KEY`): Flags invalid connection keys like `"0"`, `"1"`, `"output"` with fix suggestions (e.g., "use main[1] instead" for numeric keys)
|
||||||
|
- Invalid type field (`INVALID_CONNECTION_TYPE`): Detects invalid `type` values in connection targets (e.g., `"0"` instead of `"main"`)
|
||||||
|
- Output index bounds checking (`OUTPUT_INDEX_OUT_OF_BOUNDS`): Catches connections using output indices beyond what a node supports, with awareness of `onError: 'continueErrorOutput'`, Switch rules, and IF/Filter nodes
|
||||||
|
- Input index bounds checking (`INPUT_INDEX_OUT_OF_BOUNDS`): Validates target input indices against known node input counts (Merge=2, triggers=0, others=1)
|
||||||
|
- BFS-based trigger reachability analysis: Replaces simple orphan detection with proper graph traversal from trigger nodes, flagging unreachable subgraphs
|
||||||
|
- Flexible `WorkflowConnection` interface: Changed from explicit `main?/error?/ai_tool?` to `[outputType: string]` for accurate validation of all connection types
|
||||||
|
|
||||||
|
Conceived by Romuald Czlonkowski - https://www.aiadvisors.pl/en
|
||||||
|
|
||||||
|
## [2.35.6] - 2026-03-04
|
||||||
|
|
||||||
|
### Changed
|
||||||
|
|
||||||
|
- **Updated n8n dependencies**: n8n 2.8.3 → 2.10.3, n8n-core 2.8.1 → 2.10.1, n8n-workflow 2.8.0 → 2.10.1, @n8n/n8n-nodes-langchain 2.8.1 → 2.10.1
|
||||||
|
- Rebuilt node database with 806 core nodes (community nodes preserved from previous build)
|
||||||
|
|
||||||
|
Conceived by Romuald Czlonkowski - https://www.aiadvisors.pl/en
|
||||||
|
|
||||||
|
## [2.35.5] - 2026-02-22
|
||||||
|
|
||||||
|
### Fixed
|
||||||
|
|
||||||
|
- **Comprehensive parameter type coercion for Claude Desktop / Claude.ai** (Issue #605): Expanded the v2.35.4 fix to handle ALL type mismatches, not just stringified objects/arrays. Testing revealed 6/9 tools still failing in Claude Desktop after the initial fix.
|
||||||
|
- Extended `coerceStringifiedJsonParams()` to coerce every schema type: `string→number`, `string→boolean`, `number→string`, `boolean→string` (in addition to existing `string→object` and `string→array`)
|
||||||
|
- Added top-level safeguard to parse the entire `args` object if it arrives as a JSON string
|
||||||
|
- Added `[Diagnostic]` section to error responses showing received argument types, enabling users to report exactly what their MCP client sends
|
||||||
|
- Added 9 new unit tests (24 total) covering number, boolean, and number-to-string coercion
|
||||||
|
|
||||||
|
Conceived by Romuald Czlonkowski - https://www.aiadvisors.pl/en
|
||||||
|
|
||||||
|
## [2.35.4] - 2026-02-20
|
||||||
|
|
||||||
|
### Fixed
|
||||||
|
|
||||||
|
- **Defensive JSON.parse for stringified object/array parameters** (Issue #605): Claude Desktop 1.1.3189 serializes JSON object/array MCP parameters as strings, causing ZodError failures for ~60% of tools that accept nested parameters
|
||||||
|
- Added schema-driven `coerceStringifiedJsonParams()` in the central `CallToolRequestSchema` handler
|
||||||
|
- Automatically detects string values where the tool's `inputSchema` expects `object` or `array`, and parses them back
|
||||||
|
- Safe: prefix check before parsing, type verification after, try/catch preserves original on failure
|
||||||
|
- No-op for correct clients: native objects pass through unchanged
|
||||||
|
- Affects 9 tools with object/array params: `validate_node`, `validate_workflow`, `n8n_create_workflow`, `n8n_update_full_workflow`, `n8n_update_partial_workflow`, `n8n_validate_workflow`, `n8n_autofix_workflow`, `n8n_test_workflow`, `n8n_executions`
|
||||||
|
- Added 15 unit tests covering coercion, no-op, safety, and end-to-end scenarios
|
||||||
|
|
||||||
|
Conceived by Romuald Czlonkowski - https://www.aiadvisors.pl/en
|
||||||
|
|
||||||
|
## [2.35.3] - 2026-02-19
|
||||||
|
|
||||||
|
### Changed
|
||||||
|
|
||||||
|
- **Updated n8n dependencies**: n8n 2.6.3 → 2.8.3, n8n-core 2.6.1 → 2.8.1, n8n-workflow 2.6.0 → 2.8.0, @n8n/n8n-nodes-langchain 2.6.2 → 2.8.1
|
||||||
|
- **Fixed node loader for langchain package**: Adapted node loader to bypass restricted package.json `exports` field in @n8n/n8n-nodes-langchain >=2.9.0, resolving node files via absolute paths instead of `require.resolve()`
|
||||||
|
- **Fixed community doc generation for cloud LLMs**: Added `N8N_MCP_LLM_API_KEY`/`OPENAI_API_KEY` env var support, switched to `max_completion_tokens`, and auto-omit `temperature` for cloud API endpoints
|
||||||
|
- Rebuilt node database with 1,236 nodes (673 from n8n-nodes-base, 133 from @n8n/n8n-nodes-langchain, 430 community)
|
||||||
|
- Refreshed community nodes (361 verified + 69 npm) with 424/430 AI documentation summaries
|
||||||
|
|
||||||
|
Conceived by Romuald Czlonkowski - https://www.aiadvisors.pl/en
|
||||||
|
|
||||||
## [2.35.2] - 2026-02-09
|
## [2.35.2] - 2026-02-09
|
||||||
|
|
||||||
### Changed
|
### Changed
|
||||||
|
|||||||
@@ -18,21 +18,27 @@ npm run update:n8n:check
|
|||||||
# 4. Run update and skip tests (we'll test in CI)
|
# 4. Run update and skip tests (we'll test in CI)
|
||||||
yes y | npm run update:n8n
|
yes y | npm run update:n8n
|
||||||
|
|
||||||
# 5. Create feature branch
|
# 5. Refresh community nodes (standard practice!)
|
||||||
|
npm run fetch:community
|
||||||
|
npm run generate:docs
|
||||||
|
|
||||||
|
# 6. Create feature branch
|
||||||
git checkout -b update/n8n-X.X.X
|
git checkout -b update/n8n-X.X.X
|
||||||
|
|
||||||
# 6. Update version in package.json (must be HIGHER than latest release!)
|
# 7. Update version in package.json (must be HIGHER than latest release!)
|
||||||
# Edit: "version": "2.XX.X" (not the version from the release list!)
|
# Edit: "version": "2.XX.X" (not the version from the release list!)
|
||||||
|
|
||||||
# 7. Update CHANGELOG.md
|
# 8. Update CHANGELOG.md
|
||||||
# - Change version number to match package.json
|
# - Change version number to match package.json
|
||||||
# - Update date to today
|
# - Update date to today
|
||||||
# - Update dependency versions
|
# - Update dependency versions
|
||||||
|
# - Include community node refresh counts
|
||||||
|
|
||||||
# 8. Update README badge
|
# 9. Update README badge and node counts
|
||||||
# Edit line 8: Change n8n version badge to new n8n version
|
# Edit line 8: Change n8n version badge to new n8n version
|
||||||
|
# Update total node count in description (core + community)
|
||||||
|
|
||||||
# 9. Commit and push
|
# 10. Commit and push
|
||||||
git add -A
|
git add -A
|
||||||
git commit -m "chore: update n8n to X.X.X and bump version to 2.XX.X
|
git commit -m "chore: update n8n to X.X.X and bump version to 2.XX.X
|
||||||
|
|
||||||
@@ -41,7 +47,8 @@ git commit -m "chore: update n8n to X.X.X and bump version to 2.XX.X
|
|||||||
- Updated n8n-workflow from X.X.X to X.X.X
|
- Updated n8n-workflow from X.X.X to X.X.X
|
||||||
- Updated @n8n/n8n-nodes-langchain from X.X.X to X.X.X
|
- Updated @n8n/n8n-nodes-langchain from X.X.X to X.X.X
|
||||||
- Rebuilt node database with XXX nodes (XXX from n8n-nodes-base, XXX from @n8n/n8n-nodes-langchain)
|
- Rebuilt node database with XXX nodes (XXX from n8n-nodes-base, XXX from @n8n/n8n-nodes-langchain)
|
||||||
- Updated README badge with new n8n version
|
- Refreshed community nodes (XXX verified + XXX npm)
|
||||||
|
- Updated README badge with new n8n version and node counts
|
||||||
- Updated CHANGELOG with dependency changes
|
- Updated CHANGELOG with dependency changes
|
||||||
|
|
||||||
Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en
|
Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en
|
||||||
@@ -52,10 +59,10 @@ Co-Authored-By: Claude <noreply@anthropic.com>"
|
|||||||
|
|
||||||
git push -u origin update/n8n-X.X.X
|
git push -u origin update/n8n-X.X.X
|
||||||
|
|
||||||
# 10. Create PR
|
# 11. Create PR
|
||||||
gh pr create --title "chore: update n8n to X.X.X" --body "Updates n8n and all related dependencies to the latest versions..."
|
gh pr create --title "chore: update n8n to X.X.X" --body "Updates n8n and all related dependencies to the latest versions..."
|
||||||
|
|
||||||
# 11. After PR is merged, verify release triggered
|
# 12. After PR is merged, verify release triggered
|
||||||
gh release list | head -1
|
gh release list | head -1
|
||||||
# If the new version appears, you're done!
|
# If the new version appears, you're done!
|
||||||
# If not, the version might have already been released - bump version again and create new PR
|
# If not, the version might have already been released - bump version again and create new PR
|
||||||
|
|||||||
@@ -5,11 +5,11 @@
|
|||||||
[](https://www.npmjs.com/package/n8n-mcp)
|
[](https://www.npmjs.com/package/n8n-mcp)
|
||||||
[](https://codecov.io/gh/czlonkowski/n8n-mcp)
|
[](https://codecov.io/gh/czlonkowski/n8n-mcp)
|
||||||
[](https://github.com/czlonkowski/n8n-mcp/actions)
|
[](https://github.com/czlonkowski/n8n-mcp/actions)
|
||||||
[](https://github.com/n8n-io/n8n)
|
[](https://github.com/n8n-io/n8n)
|
||||||
[](https://github.com/czlonkowski/n8n-mcp/pkgs/container/n8n-mcp)
|
[](https://github.com/czlonkowski/n8n-mcp/pkgs/container/n8n-mcp)
|
||||||
[](https://railway.com/deploy/n8n-mcp?referralCode=n8n-mcp)
|
[](https://railway.com/deploy/n8n-mcp?referralCode=n8n-mcp)
|
||||||
|
|
||||||
A Model Context Protocol (MCP) server that provides AI assistants with comprehensive access to n8n node documentation, properties, and operations. Deploy in minutes to give Claude and other AI assistants deep knowledge about n8n's 1,084 workflow automation nodes (537 core + 547 community).
|
A Model Context Protocol (MCP) server that provides AI assistants with comprehensive access to n8n node documentation, properties, and operations. Deploy in minutes to give Claude and other AI assistants deep knowledge about n8n's 1,236 workflow automation nodes (806 core + 430 community).
|
||||||
|
|
||||||
## Overview
|
## Overview
|
||||||
|
|
||||||
|
|||||||
BIN
data/nodes.db
BIN
data/nodes.db
Binary file not shown.
2
dist/loaders/node-loader.d.ts
vendored
2
dist/loaders/node-loader.d.ts
vendored
@@ -6,6 +6,8 @@ export interface LoadedNode {
|
|||||||
export declare class N8nNodeLoader {
|
export declare class N8nNodeLoader {
|
||||||
private readonly CORE_PACKAGES;
|
private readonly CORE_PACKAGES;
|
||||||
loadAllNodes(): Promise<LoadedNode[]>;
|
loadAllNodes(): Promise<LoadedNode[]>;
|
||||||
|
private resolvePackageDir;
|
||||||
|
private loadNodeModule;
|
||||||
private loadPackageNodes;
|
private loadPackageNodes;
|
||||||
}
|
}
|
||||||
//# sourceMappingURL=node-loader.d.ts.map
|
//# sourceMappingURL=node-loader.d.ts.map
|
||||||
2
dist/loaders/node-loader.d.ts.map
vendored
2
dist/loaders/node-loader.d.ts.map
vendored
@@ -1 +1 @@
|
|||||||
{"version":3,"file":"node-loader.d.ts","sourceRoot":"","sources":["../../src/loaders/node-loader.ts"],"names":[],"mappings":"AAEA,MAAM,WAAW,UAAU;IACzB,WAAW,EAAE,MAAM,CAAC;IACpB,QAAQ,EAAE,MAAM,CAAC;IACjB,SAAS,EAAE,GAAG,CAAC;CAChB;AAED,qBAAa,aAAa;IACxB,OAAO,CAAC,QAAQ,CAAC,aAAa,CAG5B;IAEI,YAAY,IAAI,OAAO,CAAC,UAAU,EAAE,CAAC;YAmB7B,gBAAgB;CAqD/B"}
|
{"version":3,"file":"node-loader.d.ts","sourceRoot":"","sources":["../../src/loaders/node-loader.ts"],"names":[],"mappings":"AAEA,MAAM,WAAW,UAAU;IACzB,WAAW,EAAE,MAAM,CAAC;IACpB,QAAQ,EAAE,MAAM,CAAC;IACjB,SAAS,EAAE,GAAG,CAAC;CAChB;AAED,qBAAa,aAAa;IACxB,OAAO,CAAC,QAAQ,CAAC,aAAa,CAG5B;IAEI,YAAY,IAAI,OAAO,CAAC,UAAU,EAAE,CAAC;IAuB3C,OAAO,CAAC,iBAAiB;IAUzB,OAAO,CAAC,cAAc;YAIR,gBAAgB;CAuD/B"}
|
||||||
16
dist/loaders/node-loader.js
vendored
16
dist/loaders/node-loader.js
vendored
@@ -28,15 +28,23 @@ class N8nNodeLoader {
|
|||||||
}
|
}
|
||||||
return results;
|
return results;
|
||||||
}
|
}
|
||||||
|
resolvePackageDir(packagePath) {
|
||||||
|
const pkgJsonPath = require.resolve(`${packagePath}/package.json`);
|
||||||
|
return path_1.default.dirname(pkgJsonPath);
|
||||||
|
}
|
||||||
|
loadNodeModule(absolutePath) {
|
||||||
|
return require(absolutePath);
|
||||||
|
}
|
||||||
async loadPackageNodes(packageName, packagePath, packageJson) {
|
async loadPackageNodes(packageName, packagePath, packageJson) {
|
||||||
const n8nConfig = packageJson.n8n || {};
|
const n8nConfig = packageJson.n8n || {};
|
||||||
const nodes = [];
|
const nodes = [];
|
||||||
|
const packageDir = this.resolvePackageDir(packagePath);
|
||||||
const nodesList = n8nConfig.nodes || [];
|
const nodesList = n8nConfig.nodes || [];
|
||||||
if (Array.isArray(nodesList)) {
|
if (Array.isArray(nodesList)) {
|
||||||
for (const nodePath of nodesList) {
|
for (const nodePath of nodesList) {
|
||||||
try {
|
try {
|
||||||
const fullPath = require.resolve(`${packagePath}/${nodePath}`);
|
const fullPath = path_1.default.join(packageDir, nodePath);
|
||||||
const nodeModule = require(fullPath);
|
const nodeModule = this.loadNodeModule(fullPath);
|
||||||
const nodeNameMatch = nodePath.match(/\/([^\/]+)\.node\.(js|ts)$/);
|
const nodeNameMatch = nodePath.match(/\/([^\/]+)\.node\.(js|ts)$/);
|
||||||
const nodeName = nodeNameMatch ? nodeNameMatch[1] : path_1.default.basename(nodePath, '.node.js');
|
const nodeName = nodeNameMatch ? nodeNameMatch[1] : path_1.default.basename(nodePath, '.node.js');
|
||||||
const NodeClass = nodeModule.default || nodeModule[nodeName] || Object.values(nodeModule)[0];
|
const NodeClass = nodeModule.default || nodeModule[nodeName] || Object.values(nodeModule)[0];
|
||||||
@@ -56,8 +64,8 @@ class N8nNodeLoader {
|
|||||||
else {
|
else {
|
||||||
for (const [nodeName, nodePath] of Object.entries(nodesList)) {
|
for (const [nodeName, nodePath] of Object.entries(nodesList)) {
|
||||||
try {
|
try {
|
||||||
const fullPath = require.resolve(`${packagePath}/${nodePath}`);
|
const fullPath = path_1.default.join(packageDir, nodePath);
|
||||||
const nodeModule = require(fullPath);
|
const nodeModule = this.loadNodeModule(fullPath);
|
||||||
const NodeClass = nodeModule.default || nodeModule[nodeName] || Object.values(nodeModule)[0];
|
const NodeClass = nodeModule.default || nodeModule[nodeName] || Object.values(nodeModule)[0];
|
||||||
if (NodeClass) {
|
if (NodeClass) {
|
||||||
nodes.push({ packageName, nodeName, NodeClass });
|
nodes.push({ packageName, nodeName, NodeClass });
|
||||||
|
|||||||
2
dist/loaders/node-loader.js.map
vendored
2
dist/loaders/node-loader.js.map
vendored
@@ -1 +1 @@
|
|||||||
{"version":3,"file":"node-loader.js","sourceRoot":"","sources":["../../src/loaders/node-loader.ts"],"names":[],"mappings":";;;;;;AAAA,gDAAwB;AAQxB,MAAa,aAAa;IAA1B;QACmB,kBAAa,GAAG;YAC/B,EAAE,IAAI,EAAE,gBAAgB,EAAE,IAAI,EAAE,gBAAgB,EAAE;YAClD,EAAE,IAAI,EAAE,0BAA0B,EAAE,IAAI,EAAE,0BAA0B,EAAE;SACvE,CAAC;IA0EJ,CAAC;IAxEC,KAAK,CAAC,YAAY;QAChB,MAAM,OAAO,GAAiB,EAAE,CAAC;QAEjC,KAAK,MAAM,GAAG,IAAI,IAAI,CAAC,aAAa,EAAE,CAAC;YACrC,IAAI,CAAC;gBACH,OAAO,CAAC,GAAG,CAAC,yBAAyB,GAAG,CAAC,IAAI,SAAS,GAAG,CAAC,IAAI,EAAE,CAAC,CAAC;gBAElE,MAAM,WAAW,GAAG,OAAO,CAAC,GAAG,GAAG,CAAC,IAAI,eAAe,CAAC,CAAC;gBACxD,OAAO,CAAC,GAAG,CAAC,WAAW,MAAM,CAAC,IAAI,CAAC,WAAW,CAAC,GAAG,EAAE,KAAK,IAAI,EAAE,CAAC,CAAC,MAAM,wBAAwB,CAAC,CAAC;gBACjG,MAAM,KAAK,GAAG,MAAM,IAAI,CAAC,gBAAgB,CAAC,GAAG,CAAC,IAAI,EAAE,GAAG,CAAC,IAAI,EAAE,WAAW,CAAC,CAAC;gBAC3E,OAAO,CAAC,IAAI,CAAC,GAAG,KAAK,CAAC,CAAC;YACzB,CAAC;YAAC,OAAO,KAAK,EAAE,CAAC;gBACf,OAAO,CAAC,KAAK,CAAC,kBAAkB,GAAG,CAAC,IAAI,GAAG,EAAE,KAAK,CAAC,CAAC;YACtD,CAAC;QACH,CAAC;QAED,OAAO,OAAO,CAAC;IACjB,CAAC;IAEO,KAAK,CAAC,gBAAgB,CAAC,WAAmB,EAAE,WAAmB,EAAE,WAAgB;QACvF,MAAM,SAAS,GAAG,WAAW,CAAC,GAAG,IAAI,EAAE,CAAC;QACxC,MAAM,KAAK,GAAiB,EAAE,CAAC;QAG/B,MAAM,SAAS,GAAG,SAAS,CAAC,KAAK,IAAI,EAAE,CAAC;QAExC,IAAI,KAAK,CAAC,OAAO,CAAC,SAAS,CAAC,EAAE,CAAC;YAE7B,KAAK,MAAM,QAAQ,IAAI,SAAS,EAAE,CAAC;gBACjC,IAAI,CAAC;oBACH,MAAM,QAAQ,GAAG,OAAO,CAAC,OAAO,CAAC,GAAG,WAAW,IAAI,QAAQ,EAAE,CAAC,CAAC;oBAC/D,MAAM,UAAU,GAAG,OAAO,CAAC,QAAQ,CAAC,CAAC;oBAGrC,MAAM,aAAa,GAAG,QAAQ,CAAC,KAAK,CAAC,4BAA4B,CAAC,CAAC;oBACnE,MAAM,QAAQ,GAAG,aAAa,CAAC,CAAC,CAAC,aAAa,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,cAAI,CAAC,QAAQ,CAAC,QAAQ,EAAE,UAAU,CAAC,CAAC;oBAGxF,MAAM,SAAS,GAAG,UAAU,CAAC,OAAO,IAAI,UAAU,CAAC,QAAQ,CAAC,IAAI,MAAM,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,CAAC;oBAC7F,IAAI,SAAS,EAAE,CAAC;wBACd,KAAK,CAAC,IAAI,CAAC,EAAE,WAAW,EAAE,QAAQ,EAAE,SAAS,EAAE,CAAC,CAAC;wBACjD,OAAO,CAAC,GAAG,CAAC,cAAc,QAAQ,SAAS,WAAW,EAAE,CAAC,CAAC;oBAC5D,CAAC;yBAAM,CAAC;wBACN,OAAO,CAAC,IAAI,CAAC,iCAAiC,QAAQ,OAAO,WAAW,EAAE,CAAC,CAAC;oBAC9E,CAAC;gBACH,CAAC;gBAAC,OAAO,KAAK,EAAE,CAAC;oBACf,OAAO,CAAC,KAAK,CAAC,gCAAgC,WAAW,IAAI,QAAQ,GAAG,EAAG,KAAe,CAAC,OAAO,CAAC,CAAC;gBACtG,CAAC;YACH,CAAC;QACH,CAAC;aAAM,CAAC;YAEN,KAAK,MAAM,CAAC,QAAQ,EAAE,QAAQ,CAAC,IAAI,MAAM,CAAC,OAAO,CAAC,SAAS,CAAC,EAAE,CAAC;gBAC7D,IAAI,CAAC;oBACH,MAAM,QAAQ,GAAG,OAAO,CAAC,OAAO,CAAC,GAAG,WAAW,IAAI,QAAkB,EAAE,CAAC,CAAC;oBACzE,MAAM,UAAU,GAAG,OAAO,CAAC,QAAQ,CAAC,CAAC;oBAGrC,MAAM,SAAS,GAAG,UAAU,CAAC,OAAO,IAAI,UAAU,CAAC,QAAQ,CAAC,IAAI,MAAM,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,CAAC;oBAC7F,IAAI,SAAS,EAAE,CAAC;wBACd,KAAK,CAAC,IAAI,CAAC,EAAE,WAAW,EAAE,QAAQ,EAAE,SAAS,EAAE,CAAC,CAAC;wBACjD,OAAO,CAAC,GAAG,CAAC,cAAc,QAAQ,SAAS,WAAW,EAAE,CAAC,CAAC;oBAC5D,CAAC;yBAAM,CAAC;wBACN,OAAO,CAAC,IAAI,CAAC,iCAAiC,QAAQ,OAAO,WAAW,EAAE,CAAC,CAAC;oBAC9E,CAAC;gBACH,CAAC;gBAAC,OAAO,KAAK,EAAE,CAAC;oBACf,OAAO,CAAC,KAAK,CAAC,2BAA2B,QAAQ,SAAS,WAAW,GAAG,EAAG,KAAe,CAAC,OAAO,CAAC,CAAC;gBACtG,CAAC;YACH,CAAC;QACH,CAAC;QAED,OAAO,KAAK,CAAC;IACf,CAAC;CACF;AA9ED,sCA8EC"}
|
{"version":3,"file":"node-loader.js","sourceRoot":"","sources":["../../src/loaders/node-loader.ts"],"names":[],"mappings":";;;;;;AAAA,gDAAwB;AAQxB,MAAa,aAAa;IAA1B;QACmB,kBAAa,GAAG;YAC/B,EAAE,IAAI,EAAE,gBAAgB,EAAE,IAAI,EAAE,gBAAgB,EAAE;YAClD,EAAE,IAAI,EAAE,0BAA0B,EAAE,IAAI,EAAE,0BAA0B,EAAE;SACvE,CAAC;IA8FJ,CAAC;IA5FC,KAAK,CAAC,YAAY;QAChB,MAAM,OAAO,GAAiB,EAAE,CAAC;QAEjC,KAAK,MAAM,GAAG,IAAI,IAAI,CAAC,aAAa,EAAE,CAAC;YACrC,IAAI,CAAC;gBACH,OAAO,CAAC,GAAG,CAAC,yBAAyB,GAAG,CAAC,IAAI,SAAS,GAAG,CAAC,IAAI,EAAE,CAAC,CAAC;gBAElE,MAAM,WAAW,GAAG,OAAO,CAAC,GAAG,GAAG,CAAC,IAAI,eAAe,CAAC,CAAC;gBACxD,OAAO,CAAC,GAAG,CAAC,WAAW,MAAM,CAAC,IAAI,CAAC,WAAW,CAAC,GAAG,EAAE,KAAK,IAAI,EAAE,CAAC,CAAC,MAAM,wBAAwB,CAAC,CAAC;gBACjG,MAAM,KAAK,GAAG,MAAM,IAAI,CAAC,gBAAgB,CAAC,GAAG,CAAC,IAAI,EAAE,GAAG,CAAC,IAAI,EAAE,WAAW,CAAC,CAAC;gBAC3E,OAAO,CAAC,IAAI,CAAC,GAAG,KAAK,CAAC,CAAC;YACzB,CAAC;YAAC,OAAO,KAAK,EAAE,CAAC;gBACf,OAAO,CAAC,KAAK,CAAC,kBAAkB,GAAG,CAAC,IAAI,GAAG,EAAE,KAAK,CAAC,CAAC;YACtD,CAAC;QACH,CAAC;QAED,OAAO,OAAO,CAAC;IACjB,CAAC;IAMO,iBAAiB,CAAC,WAAmB;QAC3C,MAAM,WAAW,GAAG,OAAO,CAAC,OAAO,CAAC,GAAG,WAAW,eAAe,CAAC,CAAC;QACnE,OAAO,cAAI,CAAC,OAAO,CAAC,WAAW,CAAC,CAAC;IACnC,CAAC;IAOO,cAAc,CAAC,YAAoB;QACzC,OAAO,OAAO,CAAC,YAAY,CAAC,CAAC;IAC/B,CAAC;IAEO,KAAK,CAAC,gBAAgB,CAAC,WAAmB,EAAE,WAAmB,EAAE,WAAgB;QACvF,MAAM,SAAS,GAAG,WAAW,CAAC,GAAG,IAAI,EAAE,CAAC;QACxC,MAAM,KAAK,GAAiB,EAAE,CAAC;QAC/B,MAAM,UAAU,GAAG,IAAI,CAAC,iBAAiB,CAAC,WAAW,CAAC,CAAC;QAGvD,MAAM,SAAS,GAAG,SAAS,CAAC,KAAK,IAAI,EAAE,CAAC;QAExC,IAAI,KAAK,CAAC,OAAO,CAAC,SAAS,CAAC,EAAE,CAAC;YAE7B,KAAK,MAAM,QAAQ,IAAI,SAAS,EAAE,CAAC;gBACjC,IAAI,CAAC;oBAEH,MAAM,QAAQ,GAAG,cAAI,CAAC,IAAI,CAAC,UAAU,EAAE,QAAQ,CAAC,CAAC;oBACjD,MAAM,UAAU,GAAG,IAAI,CAAC,cAAc,CAAC,QAAQ,CAAC,CAAC;oBAGjD,MAAM,aAAa,GAAG,QAAQ,CAAC,KAAK,CAAC,4BAA4B,CAAC,CAAC;oBACnE,MAAM,QAAQ,GAAG,aAAa,CAAC,CAAC,CAAC,aAAa,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,cAAI,CAAC,QAAQ,CAAC,QAAQ,EAAE,UAAU,CAAC,CAAC;oBAGxF,MAAM,SAAS,GAAG,UAAU,CAAC,OAAO,IAAI,UAAU,CAAC,QAAQ,CAAC,IAAI,MAAM,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,CAAC;oBAC7F,IAAI,SAAS,EAAE,CAAC;wBACd,KAAK,CAAC,IAAI,CAAC,EAAE,WAAW,EAAE,QAAQ,EAAE,SAAS,EAAE,CAAC,CAAC;wBACjD,OAAO,CAAC,GAAG,CAAC,cAAc,QAAQ,SAAS,WAAW,EAAE,CAAC,CAAC;oBAC5D,CAAC;yBAAM,CAAC;wBACN,OAAO,CAAC,IAAI,CAAC,iCAAiC,QAAQ,OAAO,WAAW,EAAE,CAAC,CAAC;oBAC9E,CAAC;gBACH,CAAC;gBAAC,OAAO,KAAK,EAAE,CAAC;oBACf,OAAO,CAAC,KAAK,CAAC,gCAAgC,WAAW,IAAI,QAAQ,GAAG,EAAG,KAAe,CAAC,OAAO,CAAC,CAAC;gBACtG,CAAC;YACH,CAAC;QACH,CAAC;aAAM,CAAC;YAEN,KAAK,MAAM,CAAC,QAAQ,EAAE,QAAQ,CAAC,IAAI,MAAM,CAAC,OAAO,CAAC,SAAS,CAAC,EAAE,CAAC;gBAC7D,IAAI,CAAC;oBACH,MAAM,QAAQ,GAAG,cAAI,CAAC,IAAI,CAAC,UAAU,EAAE,QAAkB,CAAC,CAAC;oBAC3D,MAAM,UAAU,GAAG,IAAI,CAAC,cAAc,CAAC,QAAQ,CAAC,CAAC;oBAGjD,MAAM,SAAS,GAAG,UAAU,CAAC,OAAO,IAAI,UAAU,CAAC,QAAQ,CAAC,IAAI,MAAM,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,CAAC;oBAC7F,IAAI,SAAS,EAAE,CAAC;wBACd,KAAK,CAAC,IAAI,CAAC,EAAE,WAAW,EAAE,QAAQ,EAAE,SAAS,EAAE,CAAC,CAAC;wBACjD,OAAO,CAAC,GAAG,CAAC,cAAc,QAAQ,SAAS,WAAW,EAAE,CAAC,CAAC;oBAC5D,CAAC;yBAAM,CAAC;wBACN,OAAO,CAAC,IAAI,CAAC,iCAAiC,QAAQ,OAAO,WAAW,EAAE,CAAC,CAAC;oBAC9E,CAAC;gBACH,CAAC;gBAAC,OAAO,KAAK,EAAE,CAAC;oBACf,OAAO,CAAC,KAAK,CAAC,2BAA2B,QAAQ,SAAS,WAAW,GAAG,EAAG,KAAe,CAAC,OAAO,CAAC,CAAC;gBACtG,CAAC;YACH,CAAC;QACH,CAAC;QAED,OAAO,KAAK,CAAC;IACf,CAAC;CACF;AAlGD,sCAkGC"}
|
||||||
1
dist/mcp/server.d.ts
vendored
1
dist/mcp/server.d.ts
vendored
@@ -30,6 +30,7 @@ export declare class N8NDocumentationMCPServer {
|
|||||||
private validateToolParams;
|
private validateToolParams;
|
||||||
private validateToolParamsBasic;
|
private validateToolParamsBasic;
|
||||||
private validateExtractedArgs;
|
private validateExtractedArgs;
|
||||||
|
private coerceStringifiedJsonParams;
|
||||||
private listNodes;
|
private listNodes;
|
||||||
private getNodeInfo;
|
private getNodeInfo;
|
||||||
private searchNodes;
|
private searchNodes;
|
||||||
|
|||||||
2
dist/mcp/server.d.ts.map
vendored
2
dist/mcp/server.d.ts.map
vendored
@@ -1 +1 @@
|
|||||||
{"version":3,"file":"server.d.ts","sourceRoot":"","sources":["../../src/mcp/server.ts"],"names":[],"mappings":"AA0CA,OAAO,EAAE,eAAe,EAAE,MAAM,2BAA2B,CAAC;AAE5D,OAAO,EAAE,gBAAgB,EAAE,MAAM,iCAAiC,CAAC;AAmGnE,qBAAa,yBAAyB;IACpC,OAAO,CAAC,MAAM,CAAS;IACvB,OAAO,CAAC,EAAE,CAAgC;IAC1C,OAAO,CAAC,UAAU,CAA+B;IACjD,OAAO,CAAC,eAAe,CAAgC;IACvD,OAAO,CAAC,WAAW,CAAgB;IACnC,OAAO,CAAC,KAAK,CAAqB;IAClC,OAAO,CAAC,UAAU,CAAa;IAC/B,OAAO,CAAC,eAAe,CAAC,CAAkB;IAC1C,OAAO,CAAC,YAAY,CAAuB;IAC3C,OAAO,CAAC,qBAAqB,CAAsB;IACnD,OAAO,CAAC,WAAW,CAAiC;IACpD,OAAO,CAAC,kBAAkB,CAA4B;IACtD,OAAO,CAAC,iBAAiB,CAAkB;IAC3C,OAAO,CAAC,aAAa,CAAoC;IACzD,OAAO,CAAC,UAAU,CAAkB;gBAExB,eAAe,CAAC,EAAE,eAAe,EAAE,WAAW,CAAC,EAAE,gBAAgB;IAuGvE,KAAK,IAAI,OAAO,CAAC,IAAI,CAAC;YA+Cd,kBAAkB;YAiDlB,wBAAwB;IA0BtC,OAAO,CAAC,kBAAkB;YA6CZ,iBAAiB;IAa/B,OAAO,CAAC,eAAe,CAAkB;YAE3B,sBAAsB;IAgDpC,OAAO,CAAC,gBAAgB;IAqCxB,OAAO,CAAC,aAAa;IA8VrB,OAAO,CAAC,wBAAwB;IAoFhC,OAAO,CAAC,kBAAkB;IAqE1B,OAAO,CAAC,uBAAuB;IAwB/B,OAAO,CAAC,qBAAqB;YAoTf,SAAS;YA2DT,WAAW;YAkFX,WAAW;YA0CX,cAAc;YA8Md,gBAAgB;IAqD9B,OAAO,CAAC,mBAAmB;IAwE3B,OAAO,CAAC,eAAe;YAsBT,eAAe;IA2L7B,OAAO,CAAC,kBAAkB;IAQ1B,OAAO,CAAC,uBAAuB;IA0D/B,OAAO,CAAC,iBAAiB;YAqFX,WAAW;YAgCX,oBAAoB;IAuFlC,OAAO,CAAC,aAAa;YAQP,qBAAqB;YAwDrB,iBAAiB;YAiKjB,OAAO;YAgDP,cAAc;YAwFd,iBAAiB;IAqC/B,OAAO,CAAC,iBAAiB;IA0BzB,OAAO,CAAC,iBAAiB;IA0BzB,OAAO,CAAC,eAAe;IAwCvB,OAAO,CAAC,kBAAkB;IAiC1B,OAAO,CAAC,aAAa;IAoCrB,OAAO,CAAC,0BAA0B;IAgClC,OAAO,CAAC,4BAA4B;YAKtB,oBAAoB;IAsDlC,OAAO,CAAC,gBAAgB;YAiBV,SAAS;YA6CT,kBAAkB;YAqElB,uBAAuB;YAsDvB,iBAAiB;IAqE/B,OAAO,CAAC,qBAAqB;IA8C7B,OAAO,CAAC,uBAAuB;IA4D/B,OAAO,CAAC,wBAAwB;IAkChC,OAAO,CAAC,iBAAiB;YAoDX,mBAAmB;YAoEnB,qBAAqB;IAS7B,OAAO,CAAC,SAAS,EAAE,GAAG,GAAG,OAAO,CAAC,IAAI,CAAC;YAS9B,aAAa;YAcb,iBAAiB;YAoBjB,WAAW;YAwBX,eAAe;YAqBf,mBAAmB;YAwBnB,yBAAyB;IA4CvC,OAAO,CAAC,kBAAkB;YAiBZ,gBAAgB;YA6HhB,2BAA2B;YAiE3B,2BAA2B;IAyEnC,GAAG,IAAI,OAAO,CAAC,IAAI,CAAC;IA0BpB,QAAQ,IAAI,OAAO,CAAC,IAAI,CAAC;CAgEhC"}
|
{"version":3,"file":"server.d.ts","sourceRoot":"","sources":["../../src/mcp/server.ts"],"names":[],"mappings":"AA0CA,OAAO,EAAE,eAAe,EAAE,MAAM,2BAA2B,CAAC;AAE5D,OAAO,EAAE,gBAAgB,EAAE,MAAM,iCAAiC,CAAC;AAmGnE,qBAAa,yBAAyB;IACpC,OAAO,CAAC,MAAM,CAAS;IACvB,OAAO,CAAC,EAAE,CAAgC;IAC1C,OAAO,CAAC,UAAU,CAA+B;IACjD,OAAO,CAAC,eAAe,CAAgC;IACvD,OAAO,CAAC,WAAW,CAAgB;IACnC,OAAO,CAAC,KAAK,CAAqB;IAClC,OAAO,CAAC,UAAU,CAAa;IAC/B,OAAO,CAAC,eAAe,CAAC,CAAkB;IAC1C,OAAO,CAAC,YAAY,CAAuB;IAC3C,OAAO,CAAC,qBAAqB,CAAsB;IACnD,OAAO,CAAC,WAAW,CAAiC;IACpD,OAAO,CAAC,kBAAkB,CAA4B;IACtD,OAAO,CAAC,iBAAiB,CAAkB;IAC3C,OAAO,CAAC,aAAa,CAAoC;IACzD,OAAO,CAAC,UAAU,CAAkB;gBAExB,eAAe,CAAC,EAAE,eAAe,EAAE,WAAW,CAAC,EAAE,gBAAgB;IAuGvE,KAAK,IAAI,OAAO,CAAC,IAAI,CAAC;YA+Cd,kBAAkB;YAiDlB,wBAAwB;IA0BtC,OAAO,CAAC,kBAAkB;YA6CZ,iBAAiB;IAa/B,OAAO,CAAC,eAAe,CAAkB;YAE3B,sBAAsB;IAgDpC,OAAO,CAAC,gBAAgB;IAqCxB,OAAO,CAAC,aAAa;IA0XrB,OAAO,CAAC,wBAAwB;IAoFhC,OAAO,CAAC,kBAAkB;IAqE1B,OAAO,CAAC,uBAAuB;IAwB/B,OAAO,CAAC,qBAAqB;IAiF7B,OAAO,CAAC,2BAA2B;YA0UrB,SAAS;YA2DT,WAAW;YAkFX,WAAW;YA0CX,cAAc;YA8Md,gBAAgB;IAqD9B,OAAO,CAAC,mBAAmB;IAwE3B,OAAO,CAAC,eAAe;YAsBT,eAAe;IA2L7B,OAAO,CAAC,kBAAkB;IAQ1B,OAAO,CAAC,uBAAuB;IA0D/B,OAAO,CAAC,iBAAiB;YAqFX,WAAW;YAgCX,oBAAoB;IAuFlC,OAAO,CAAC,aAAa;YAQP,qBAAqB;YAwDrB,iBAAiB;YAiKjB,OAAO;YAgDP,cAAc;YAwFd,iBAAiB;IAqC/B,OAAO,CAAC,iBAAiB;IA0BzB,OAAO,CAAC,iBAAiB;IA0BzB,OAAO,CAAC,eAAe;IAwCvB,OAAO,CAAC,kBAAkB;IAiC1B,OAAO,CAAC,aAAa;IAoCrB,OAAO,CAAC,0BAA0B;IAgClC,OAAO,CAAC,4BAA4B;YAKtB,oBAAoB;IAsDlC,OAAO,CAAC,gBAAgB;YAiBV,SAAS;YA6CT,kBAAkB;YAqElB,uBAAuB;YAsDvB,iBAAiB;IAqE/B,OAAO,CAAC,qBAAqB;IA8C7B,OAAO,CAAC,uBAAuB;IA4D/B,OAAO,CAAC,wBAAwB;IAkChC,OAAO,CAAC,iBAAiB;YAoDX,mBAAmB;YAoEnB,qBAAqB;IAS7B,OAAO,CAAC,SAAS,EAAE,GAAG,GAAG,OAAO,CAAC,IAAI,CAAC;YAS9B,aAAa;YAcb,iBAAiB;YAoBjB,WAAW;YAwBX,eAAe;YAqBf,mBAAmB;YAwBnB,yBAAyB;IA4CvC,OAAO,CAAC,kBAAkB;YAiBZ,gBAAgB;YA6HhB,2BAA2B;YAiE3B,2BAA2B;IAyEnC,GAAG,IAAI,OAAO,CAAC,IAAI,CAAC;IA0BpB,QAAQ,IAAI,OAAO,CAAC,IAAI,CAAC;CAgEhC"}
|
||||||
107
dist/mcp/server.js
vendored
107
dist/mcp/server.js
vendored
@@ -457,6 +457,18 @@ class N8NDocumentationMCPServer {
|
|||||||
};
|
};
|
||||||
}
|
}
|
||||||
let processedArgs = args;
|
let processedArgs = args;
|
||||||
|
if (typeof args === 'string') {
|
||||||
|
try {
|
||||||
|
const parsed = JSON.parse(args);
|
||||||
|
if (parsed && typeof parsed === 'object' && !Array.isArray(parsed)) {
|
||||||
|
processedArgs = parsed;
|
||||||
|
logger_1.logger.warn(`Coerced stringified args object for tool "${name}"`);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
catch {
|
||||||
|
logger_1.logger.warn(`Tool "${name}" received string args that are not valid JSON`);
|
||||||
|
}
|
||||||
|
}
|
||||||
if (args && typeof args === 'object' && 'output' in args) {
|
if (args && typeof args === 'object' && 'output' in args) {
|
||||||
try {
|
try {
|
||||||
const possibleNestedData = args.output;
|
const possibleNestedData = args.output;
|
||||||
@@ -485,6 +497,7 @@ class N8NDocumentationMCPServer {
|
|||||||
});
|
});
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
processedArgs = this.coerceStringifiedJsonParams(name, processedArgs);
|
||||||
try {
|
try {
|
||||||
logger_1.logger.debug(`Executing tool: ${name}`, { args: processedArgs });
|
logger_1.logger.debug(`Executing tool: ${name}`, { args: processedArgs });
|
||||||
const startTime = Date.now();
|
const startTime = Date.now();
|
||||||
@@ -556,6 +569,13 @@ class N8NDocumentationMCPServer {
|
|||||||
if (name.startsWith('validate_') && (errorMessage.includes('config') || errorMessage.includes('nodeType'))) {
|
if (name.startsWith('validate_') && (errorMessage.includes('config') || errorMessage.includes('nodeType'))) {
|
||||||
helpfulMessage += '\n\nFor validation tools:\n- nodeType should be a string (e.g., "nodes-base.webhook")\n- config should be an object (e.g., {})';
|
helpfulMessage += '\n\nFor validation tools:\n- nodeType should be a string (e.g., "nodes-base.webhook")\n- config should be an object (e.g., {})';
|
||||||
}
|
}
|
||||||
|
try {
|
||||||
|
const argDiag = processedArgs && typeof processedArgs === 'object'
|
||||||
|
? Object.entries(processedArgs).map(([k, v]) => `${k}: ${typeof v}`).join(', ')
|
||||||
|
: `args type: ${typeof processedArgs}`;
|
||||||
|
helpfulMessage += `\n\n[Diagnostic] Received arg types: {${argDiag}}`;
|
||||||
|
}
|
||||||
|
catch { }
|
||||||
return {
|
return {
|
||||||
content: [
|
content: [
|
||||||
{
|
{
|
||||||
@@ -795,6 +815,93 @@ class N8NDocumentationMCPServer {
|
|||||||
}
|
}
|
||||||
return true;
|
return true;
|
||||||
}
|
}
|
||||||
|
coerceStringifiedJsonParams(toolName, args) {
|
||||||
|
if (!args || typeof args !== 'object')
|
||||||
|
return args;
|
||||||
|
const allTools = [...tools_1.n8nDocumentationToolsFinal, ...tools_n8n_manager_1.n8nManagementTools];
|
||||||
|
const tool = allTools.find(t => t.name === toolName);
|
||||||
|
if (!tool?.inputSchema?.properties)
|
||||||
|
return args;
|
||||||
|
const properties = tool.inputSchema.properties;
|
||||||
|
const coerced = { ...args };
|
||||||
|
let coercedAny = false;
|
||||||
|
for (const [key, value] of Object.entries(coerced)) {
|
||||||
|
if (value === undefined || value === null)
|
||||||
|
continue;
|
||||||
|
const propSchema = properties[key];
|
||||||
|
if (!propSchema)
|
||||||
|
continue;
|
||||||
|
const expectedType = propSchema.type;
|
||||||
|
if (!expectedType)
|
||||||
|
continue;
|
||||||
|
const actualType = typeof value;
|
||||||
|
if (expectedType === 'string' && actualType === 'string')
|
||||||
|
continue;
|
||||||
|
if ((expectedType === 'number' || expectedType === 'integer') && actualType === 'number')
|
||||||
|
continue;
|
||||||
|
if (expectedType === 'boolean' && actualType === 'boolean')
|
||||||
|
continue;
|
||||||
|
if (expectedType === 'object' && actualType === 'object' && !Array.isArray(value))
|
||||||
|
continue;
|
||||||
|
if (expectedType === 'array' && Array.isArray(value))
|
||||||
|
continue;
|
||||||
|
if (actualType === 'string') {
|
||||||
|
const trimmed = value.trim();
|
||||||
|
if (expectedType === 'object' && trimmed.startsWith('{')) {
|
||||||
|
try {
|
||||||
|
const parsed = JSON.parse(trimmed);
|
||||||
|
if (typeof parsed === 'object' && parsed !== null && !Array.isArray(parsed)) {
|
||||||
|
coerced[key] = parsed;
|
||||||
|
coercedAny = true;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
catch { }
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
if (expectedType === 'array' && trimmed.startsWith('[')) {
|
||||||
|
try {
|
||||||
|
const parsed = JSON.parse(trimmed);
|
||||||
|
if (Array.isArray(parsed)) {
|
||||||
|
coerced[key] = parsed;
|
||||||
|
coercedAny = true;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
catch { }
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
if (expectedType === 'number' || expectedType === 'integer') {
|
||||||
|
const num = Number(trimmed);
|
||||||
|
if (!isNaN(num) && trimmed !== '') {
|
||||||
|
coerced[key] = expectedType === 'integer' ? Math.trunc(num) : num;
|
||||||
|
coercedAny = true;
|
||||||
|
}
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
if (expectedType === 'boolean') {
|
||||||
|
if (trimmed === 'true') {
|
||||||
|
coerced[key] = true;
|
||||||
|
coercedAny = true;
|
||||||
|
}
|
||||||
|
else if (trimmed === 'false') {
|
||||||
|
coerced[key] = false;
|
||||||
|
coercedAny = true;
|
||||||
|
}
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (expectedType === 'string' && (actualType === 'number' || actualType === 'boolean')) {
|
||||||
|
coerced[key] = String(value);
|
||||||
|
coercedAny = true;
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (coercedAny) {
|
||||||
|
logger_1.logger.warn(`Coerced mistyped params for tool "${toolName}"`, {
|
||||||
|
original: Object.fromEntries(Object.entries(args).map(([k, v]) => [k, `${typeof v}: ${typeof v === 'string' ? v.substring(0, 80) : v}`])),
|
||||||
|
});
|
||||||
|
}
|
||||||
|
return coerced;
|
||||||
|
}
|
||||||
async executeTool(name, args) {
|
async executeTool(name, args) {
|
||||||
args = args || {};
|
args = args || {};
|
||||||
const disabledTools = this.getDisabledTools();
|
const disabledTools = this.getDisabledTools();
|
||||||
|
|||||||
2
dist/mcp/server.js.map
vendored
2
dist/mcp/server.js.map
vendored
File diff suppressed because one or more lines are too long
16
dist/services/workflow-validator.d.ts
vendored
16
dist/services/workflow-validator.d.ts
vendored
@@ -21,17 +21,7 @@ interface WorkflowNode {
|
|||||||
}
|
}
|
||||||
interface WorkflowConnection {
|
interface WorkflowConnection {
|
||||||
[sourceNode: string]: {
|
[sourceNode: string]: {
|
||||||
main?: Array<Array<{
|
[outputType: string]: Array<Array<{
|
||||||
node: string;
|
|
||||||
type: string;
|
|
||||||
index: number;
|
|
||||||
}>>;
|
|
||||||
error?: Array<Array<{
|
|
||||||
node: string;
|
|
||||||
type: string;
|
|
||||||
index: number;
|
|
||||||
}>>;
|
|
||||||
ai_tool?: Array<Array<{
|
|
||||||
node: string;
|
node: string;
|
||||||
type: string;
|
type: string;
|
||||||
index: number;
|
index: number;
|
||||||
@@ -94,6 +84,10 @@ export declare class WorkflowValidator {
|
|||||||
private validateErrorOutputConfiguration;
|
private validateErrorOutputConfiguration;
|
||||||
private validateAIToolConnection;
|
private validateAIToolConnection;
|
||||||
private validateAIToolSource;
|
private validateAIToolSource;
|
||||||
|
private validateOutputIndexBounds;
|
||||||
|
private validateInputIndexBounds;
|
||||||
|
private flagOrphanedNodes;
|
||||||
|
private validateTriggerReachability;
|
||||||
private hasCycle;
|
private hasCycle;
|
||||||
private validateExpressions;
|
private validateExpressions;
|
||||||
private countExpressionsInObject;
|
private countExpressionsInObject;
|
||||||
|
|||||||
2
dist/services/workflow-validator.d.ts.map
vendored
2
dist/services/workflow-validator.d.ts.map
vendored
@@ -1 +1 @@
|
|||||||
{"version":3,"file":"workflow-validator.d.ts","sourceRoot":"","sources":["../../src/services/workflow-validator.ts"],"names":[],"mappings":"AAMA,OAAO,EAAE,cAAc,EAAE,MAAM,6BAA6B,CAAC;AAC7D,OAAO,EAAE,uBAAuB,EAAE,MAAM,6BAA6B,CAAC;AAatE,UAAU,YAAY;IACpB,EAAE,EAAE,MAAM,CAAC;IACX,IAAI,EAAE,MAAM,CAAC;IACb,IAAI,EAAE,MAAM,CAAC;IACb,QAAQ,EAAE,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;IAC3B,UAAU,EAAE,GAAG,CAAC;IAChB,WAAW,CAAC,EAAE,GAAG,CAAC;IAClB,QAAQ,CAAC,EAAE,OAAO,CAAC;IACnB,KAAK,CAAC,EAAE,MAAM,CAAC;IACf,WAAW,CAAC,EAAE,OAAO,CAAC;IACtB,WAAW,CAAC,EAAE,MAAM,CAAC;IACrB,cAAc,CAAC,EAAE,OAAO,CAAC;IACzB,OAAO,CAAC,EAAE,uBAAuB,GAAG,qBAAqB,GAAG,cAAc,CAAC;IAC3E,WAAW,CAAC,EAAE,OAAO,CAAC;IACtB,QAAQ,CAAC,EAAE,MAAM,CAAC;IAClB,gBAAgB,CAAC,EAAE,MAAM,CAAC;IAC1B,gBAAgB,CAAC,EAAE,OAAO,CAAC;IAC3B,WAAW,CAAC,EAAE,OAAO,CAAC;CACvB;AAED,UAAU,kBAAkB;IAC1B,CAAC,UAAU,EAAE,MAAM,GAAG;QACpB,IAAI,CAAC,EAAE,KAAK,CAAC,KAAK,CAAC;YAAE,IAAI,EAAE,MAAM,CAAC;YAAC,IAAI,EAAE,MAAM,CAAC;YAAC,KAAK,EAAE,MAAM,CAAA;SAAE,CAAC,CAAC,CAAC;QACnE,KAAK,CAAC,EAAE,KAAK,CAAC,KAAK,CAAC;YAAE,IAAI,EAAE,MAAM,CAAC;YAAC,IAAI,EAAE,MAAM,CAAC;YAAC,KAAK,EAAE,MAAM,CAAA;SAAE,CAAC,CAAC,CAAC;QACpE,OAAO,CAAC,EAAE,KAAK,CAAC,KAAK,CAAC;YAAE,IAAI,EAAE,MAAM,CAAC;YAAC,IAAI,EAAE,MAAM,CAAC;YAAC,KAAK,EAAE,MAAM,CAAA;SAAE,CAAC,CAAC,CAAC;KACvE,CAAC;CACH;AAED,UAAU,YAAY;IACpB,IAAI,CAAC,EAAE,MAAM,CAAC;IACd,KAAK,EAAE,YAAY,EAAE,CAAC;IACtB,WAAW,EAAE,kBAAkB,CAAC;IAChC,QAAQ,CAAC,EAAE,GAAG,CAAC;IACf,UAAU,CAAC,EAAE,GAAG,CAAC;IACjB,OAAO,CAAC,EAAE,GAAG,CAAC;IACd,IAAI,CAAC,EAAE,GAAG,CAAC;CACZ;AAED,MAAM,WAAW,eAAe;IAC9B,IAAI,EAAE,OAAO,GAAG,SAAS,CAAC;IAC1B,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,QAAQ,CAAC,EAAE,MAAM,CAAC;IAClB,OAAO,EAAE,MAAM,CAAC;IAChB,OAAO,CAAC,EAAE,GAAG,CAAC;IACd,IAAI,CAAC,EAAE,MAAM,CAAC;IACd,GAAG,CAAC,EAAE;QACJ,IAAI,EAAE,MAAM,CAAC;QACb,WAAW,CAAC,EAAE,MAAM,CAAC;QACrB,aAAa,CAAC,EAAE,MAAM,CAAC;QACvB,WAAW,CAAC,EAAE,MAAM,CAAC;KACtB,CAAC;CACH;AAED,MAAM,WAAW,wBAAwB;IACvC,KAAK,EAAE,OAAO,CAAC;IACf,MAAM,EAAE,eAAe,EAAE,CAAC;IAC1B,QAAQ,EAAE,eAAe,EAAE,CAAC;IAC5B,UAAU,EAAE;QACV,UAAU,EAAE,MAAM,CAAC;QACnB,YAAY,EAAE,MAAM,CAAC;QACrB,YAAY,EAAE,MAAM,CAAC;QACrB,gBAAgB,EAAE,MAAM,CAAC;QACzB,kBAAkB,EAAE,MAAM,CAAC;QAC3B,oBAAoB,EAAE,MAAM,CAAC;KAC9B,CAAC;IACF,WAAW,EAAE,MAAM,EAAE,CAAC;CACvB;AAED,qBAAa,iBAAiB;IAK1B,OAAO,CAAC,cAAc;IACtB,OAAO,CAAC,aAAa;IALvB,OAAO,CAAC,eAAe,CAA6B;IACpD,OAAO,CAAC,iBAAiB,CAAwB;gBAGvC,cAAc,EAAE,cAAc,EAC9B,aAAa,EAAE,OAAO,uBAAuB;IAWjD,gBAAgB,CACpB,QAAQ,EAAE,YAAY,EACtB,OAAO,GAAE;QACP,aAAa,CAAC,EAAE,OAAO,CAAC;QACxB,mBAAmB,CAAC,EAAE,OAAO,CAAC;QAC9B,mBAAmB,CAAC,EAAE,OAAO,CAAC;QAC9B,OAAO,CAAC,EAAE,SAAS,GAAG,SAAS,GAAG,aAAa,GAAG,QAAQ,CAAC;KACvD,GACL,OAAO,CAAC,wBAAwB,CAAC;IAgHpC,OAAO,CAAC,yBAAyB;YAkInB,gBAAgB;IAmO9B,OAAO,CAAC,mBAAmB;IA8H3B,OAAO,CAAC,yBAAyB;IAgGjC,OAAO,CAAC,gCAAgC;IAoFxC,OAAO,CAAC,wBAAwB;IAsChC,OAAO,CAAC,oBAAoB;IAuE5B,OAAO,CAAC,QAAQ;IAsFhB,OAAO,CAAC,mBAAmB;IA4F3B,OAAO,CAAC,wBAAwB;IA2BhC,OAAO,CAAC,YAAY;IAgBpB,OAAO,CAAC,qBAAqB;IAgG7B,OAAO,CAAC,qBAAqB;IA8C7B,OAAO,CAAC,mBAAmB;IA4E3B,OAAO,CAAC,sBAAsB;IAyT9B,OAAO,CAAC,yBAAyB;IAqCjC,OAAO,CAAC,gCAAgC;IA8BxC,OAAO,CAAC,gCAAgC;IAsFxC,OAAO,CAAC,gBAAgB;IA4CxB,OAAO,CAAC,2BAA2B;CAmEpC"}
|
{"version":3,"file":"workflow-validator.d.ts","sourceRoot":"","sources":["../../src/services/workflow-validator.ts"],"names":[],"mappings":"AAMA,OAAO,EAAE,cAAc,EAAE,MAAM,6BAA6B,CAAC;AAC7D,OAAO,EAAE,uBAAuB,EAAE,MAAM,6BAA6B,CAAC;AA4BtE,UAAU,YAAY;IACpB,EAAE,EAAE,MAAM,CAAC;IACX,IAAI,EAAE,MAAM,CAAC;IACb,IAAI,EAAE,MAAM,CAAC;IACb,QAAQ,EAAE,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;IAC3B,UAAU,EAAE,GAAG,CAAC;IAChB,WAAW,CAAC,EAAE,GAAG,CAAC;IAClB,QAAQ,CAAC,EAAE,OAAO,CAAC;IACnB,KAAK,CAAC,EAAE,MAAM,CAAC;IACf,WAAW,CAAC,EAAE,OAAO,CAAC;IACtB,WAAW,CAAC,EAAE,MAAM,CAAC;IACrB,cAAc,CAAC,EAAE,OAAO,CAAC;IACzB,OAAO,CAAC,EAAE,uBAAuB,GAAG,qBAAqB,GAAG,cAAc,CAAC;IAC3E,WAAW,CAAC,EAAE,OAAO,CAAC;IACtB,QAAQ,CAAC,EAAE,MAAM,CAAC;IAClB,gBAAgB,CAAC,EAAE,MAAM,CAAC;IAC1B,gBAAgB,CAAC,EAAE,OAAO,CAAC;IAC3B,WAAW,CAAC,EAAE,OAAO,CAAC;CACvB;AAED,UAAU,kBAAkB;IAC1B,CAAC,UAAU,EAAE,MAAM,GAAG;QACpB,CAAC,UAAU,EAAE,MAAM,GAAG,KAAK,CAAC,KAAK,CAAC;YAAE,IAAI,EAAE,MAAM,CAAC;YAAC,IAAI,EAAE,MAAM,CAAC;YAAC,KAAK,EAAE,MAAM,CAAA;SAAE,CAAC,CAAC,CAAC;KACnF,CAAC;CACH;AAED,UAAU,YAAY;IACpB,IAAI,CAAC,EAAE,MAAM,CAAC;IACd,KAAK,EAAE,YAAY,EAAE,CAAC;IACtB,WAAW,EAAE,kBAAkB,CAAC;IAChC,QAAQ,CAAC,EAAE,GAAG,CAAC;IACf,UAAU,CAAC,EAAE,GAAG,CAAC;IACjB,OAAO,CAAC,EAAE,GAAG,CAAC;IACd,IAAI,CAAC,EAAE,GAAG,CAAC;CACZ;AAED,MAAM,WAAW,eAAe;IAC9B,IAAI,EAAE,OAAO,GAAG,SAAS,CAAC;IAC1B,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,QAAQ,CAAC,EAAE,MAAM,CAAC;IAClB,OAAO,EAAE,MAAM,CAAC;IAChB,OAAO,CAAC,EAAE,GAAG,CAAC;IACd,IAAI,CAAC,EAAE,MAAM,CAAC;IACd,GAAG,CAAC,EAAE;QACJ,IAAI,EAAE,MAAM,CAAC;QACb,WAAW,CAAC,EAAE,MAAM,CAAC;QACrB,aAAa,CAAC,EAAE,MAAM,CAAC;QACvB,WAAW,CAAC,EAAE,MAAM,CAAC;KACtB,CAAC;CACH;AAED,MAAM,WAAW,wBAAwB;IACvC,KAAK,EAAE,OAAO,CAAC;IACf,MAAM,EAAE,eAAe,EAAE,CAAC;IAC1B,QAAQ,EAAE,eAAe,EAAE,CAAC;IAC5B,UAAU,EAAE;QACV,UAAU,EAAE,MAAM,CAAC;QACnB,YAAY,EAAE,MAAM,CAAC;QACrB,YAAY,EAAE,MAAM,CAAC;QACrB,gBAAgB,EAAE,MAAM,CAAC;QACzB,kBAAkB,EAAE,MAAM,CAAC;QAC3B,oBAAoB,EAAE,MAAM,CAAC;KAC9B,CAAC;IACF,WAAW,EAAE,MAAM,EAAE,CAAC;CACvB;AAED,qBAAa,iBAAiB;IAK1B,OAAO,CAAC,cAAc;IACtB,OAAO,CAAC,aAAa;IALvB,OAAO,CAAC,eAAe,CAA6B;IACpD,OAAO,CAAC,iBAAiB,CAAwB;gBAGvC,cAAc,EAAE,cAAc,EAC9B,aAAa,EAAE,OAAO,uBAAuB;IAWjD,gBAAgB,CACpB,QAAQ,EAAE,YAAY,EACtB,OAAO,GAAE;QACP,aAAa,CAAC,EAAE,OAAO,CAAC;QACxB,mBAAmB,CAAC,EAAE,OAAO,CAAC;QAC9B,mBAAmB,CAAC,EAAE,OAAO,CAAC;QAC9B,OAAO,CAAC,EAAE,SAAS,GAAG,SAAS,GAAG,aAAa,GAAG,QAAQ,CAAC;KACvD,GACL,OAAO,CAAC,wBAAwB,CAAC;IAgHpC,OAAO,CAAC,yBAAyB;YAkInB,gBAAgB;IAmO9B,OAAO,CAAC,mBAAmB;IAuF3B,OAAO,CAAC,yBAAyB;IAsHjC,OAAO,CAAC,gCAAgC;IAoFxC,OAAO,CAAC,wBAAwB;IAsChC,OAAO,CAAC,oBAAoB;IAsE5B,OAAO,CAAC,yBAAyB;IAiEjC,OAAO,CAAC,wBAAwB;IAuChC,OAAO,CAAC,iBAAiB;IAoCzB,OAAO,CAAC,2BAA2B;IA4EnC,OAAO,CAAC,QAAQ;IA4EhB,OAAO,CAAC,mBAAmB;IA4F3B,OAAO,CAAC,wBAAwB;IA2BhC,OAAO,CAAC,YAAY;IAgBpB,OAAO,CAAC,qBAAqB;IAgG7B,OAAO,CAAC,qBAAqB;IA8C7B,OAAO,CAAC,mBAAmB;IA4E3B,OAAO,CAAC,sBAAsB;IAyT9B,OAAO,CAAC,yBAAyB;IAqCjC,OAAO,CAAC,gCAAgC;IA8BxC,OAAO,CAAC,gCAAgC;IAsFxC,OAAO,CAAC,gBAAgB;IA4CxB,OAAO,CAAC,2BAA2B;CAmEpC"}
|
||||||
285
dist/services/workflow-validator.js
vendored
285
dist/services/workflow-validator.js
vendored
@@ -16,6 +16,15 @@ const node_type_utils_1 = require("../utils/node-type-utils");
|
|||||||
const node_classification_1 = require("../utils/node-classification");
|
const node_classification_1 = require("../utils/node-classification");
|
||||||
const tool_variant_generator_1 = require("./tool-variant-generator");
|
const tool_variant_generator_1 = require("./tool-variant-generator");
|
||||||
const logger = new logger_1.Logger({ prefix: '[WorkflowValidator]' });
|
const logger = new logger_1.Logger({ prefix: '[WorkflowValidator]' });
|
||||||
|
const VALID_CONNECTION_TYPES = new Set([
|
||||||
|
'main',
|
||||||
|
'error',
|
||||||
|
...ai_node_validator_1.AI_CONNECTION_TYPES,
|
||||||
|
'ai_agent',
|
||||||
|
'ai_chain',
|
||||||
|
'ai_retriever',
|
||||||
|
'ai_reranker',
|
||||||
|
]);
|
||||||
class WorkflowValidator {
|
class WorkflowValidator {
|
||||||
constructor(nodeRepository, nodeValidator) {
|
constructor(nodeRepository, nodeValidator) {
|
||||||
this.nodeRepository = nodeRepository;
|
this.nodeRepository = nodeRepository;
|
||||||
@@ -393,51 +402,34 @@ class WorkflowValidator {
|
|||||||
result.statistics.invalidConnections++;
|
result.statistics.invalidConnections++;
|
||||||
continue;
|
continue;
|
||||||
}
|
}
|
||||||
if (outputs.main) {
|
for (const [outputKey, outputConnections] of Object.entries(outputs)) {
|
||||||
this.validateConnectionOutputs(sourceName, outputs.main, nodeMap, nodeIdMap, result, 'main');
|
if (!VALID_CONNECTION_TYPES.has(outputKey)) {
|
||||||
}
|
let suggestion = '';
|
||||||
if (outputs.error) {
|
if (/^\d+$/.test(outputKey)) {
|
||||||
this.validateConnectionOutputs(sourceName, outputs.error, nodeMap, nodeIdMap, result, 'error');
|
suggestion = ` If you meant to use output index ${outputKey}, use main[${outputKey}] instead.`;
|
||||||
}
|
}
|
||||||
if (outputs.ai_tool) {
|
result.errors.push({
|
||||||
this.validateAIToolSource(sourceNode, result);
|
type: 'error',
|
||||||
this.validateConnectionOutputs(sourceName, outputs.ai_tool, nodeMap, nodeIdMap, result, 'ai_tool');
|
nodeName: sourceName,
|
||||||
|
message: `Unknown connection output key "${outputKey}" on node "${sourceName}". Valid keys are: ${[...VALID_CONNECTION_TYPES].join(', ')}.${suggestion}`,
|
||||||
|
code: 'UNKNOWN_CONNECTION_KEY'
|
||||||
|
});
|
||||||
|
result.statistics.invalidConnections++;
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
if (!outputConnections || !Array.isArray(outputConnections))
|
||||||
|
continue;
|
||||||
|
if (outputKey === 'ai_tool') {
|
||||||
|
this.validateAIToolSource(sourceNode, result);
|
||||||
|
}
|
||||||
|
this.validateConnectionOutputs(sourceName, outputConnections, nodeMap, nodeIdMap, result, outputKey);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
const connectedNodes = new Set();
|
if (profile !== 'minimal') {
|
||||||
Object.keys(workflow.connections).forEach(name => connectedNodes.add(name));
|
this.validateTriggerReachability(workflow, result);
|
||||||
Object.values(workflow.connections).forEach(outputs => {
|
}
|
||||||
if (outputs.main) {
|
else {
|
||||||
outputs.main.flat().forEach(conn => {
|
this.flagOrphanedNodes(workflow, result);
|
||||||
if (conn)
|
|
||||||
connectedNodes.add(conn.node);
|
|
||||||
});
|
|
||||||
}
|
|
||||||
if (outputs.error) {
|
|
||||||
outputs.error.flat().forEach(conn => {
|
|
||||||
if (conn)
|
|
||||||
connectedNodes.add(conn.node);
|
|
||||||
});
|
|
||||||
}
|
|
||||||
if (outputs.ai_tool) {
|
|
||||||
outputs.ai_tool.flat().forEach(conn => {
|
|
||||||
if (conn)
|
|
||||||
connectedNodes.add(conn.node);
|
|
||||||
});
|
|
||||||
}
|
|
||||||
});
|
|
||||||
for (const node of workflow.nodes) {
|
|
||||||
if (node.disabled || (0, node_classification_1.isNonExecutableNode)(node.type))
|
|
||||||
continue;
|
|
||||||
const isNodeTrigger = (0, node_type_utils_1.isTriggerNode)(node.type);
|
|
||||||
if (!connectedNodes.has(node.name) && !isNodeTrigger) {
|
|
||||||
result.warnings.push({
|
|
||||||
type: 'warning',
|
|
||||||
nodeId: node.id,
|
|
||||||
nodeName: node.name,
|
|
||||||
message: 'Node is not connected to any other nodes'
|
|
||||||
});
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
if (profile !== 'minimal' && this.hasCycle(workflow)) {
|
if (profile !== 'minimal' && this.hasCycle(workflow)) {
|
||||||
result.errors.push({
|
result.errors.push({
|
||||||
@@ -450,6 +442,7 @@ class WorkflowValidator {
|
|||||||
const sourceNode = nodeMap.get(sourceName);
|
const sourceNode = nodeMap.get(sourceName);
|
||||||
if (outputType === 'main' && sourceNode) {
|
if (outputType === 'main' && sourceNode) {
|
||||||
this.validateErrorOutputConfiguration(sourceName, sourceNode, outputs, nodeMap, result);
|
this.validateErrorOutputConfiguration(sourceName, sourceNode, outputs, nodeMap, result);
|
||||||
|
this.validateOutputIndexBounds(sourceNode, outputs, result);
|
||||||
}
|
}
|
||||||
outputs.forEach((outputConnections, outputIndex) => {
|
outputs.forEach((outputConnections, outputIndex) => {
|
||||||
if (!outputConnections)
|
if (!outputConnections)
|
||||||
@@ -463,6 +456,20 @@ class WorkflowValidator {
|
|||||||
result.statistics.invalidConnections++;
|
result.statistics.invalidConnections++;
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
if (connection.type && !VALID_CONNECTION_TYPES.has(connection.type)) {
|
||||||
|
let suggestion = '';
|
||||||
|
if (/^\d+$/.test(connection.type)) {
|
||||||
|
suggestion = ` Numeric types are not valid - use "main", "error", or an AI connection type.`;
|
||||||
|
}
|
||||||
|
result.errors.push({
|
||||||
|
type: 'error',
|
||||||
|
nodeName: sourceName,
|
||||||
|
message: `Invalid connection type "${connection.type}" in connection from "${sourceName}" to "${connection.node}". Expected "main", "error", or an AI connection type (ai_tool, ai_languageModel, etc.).${suggestion}`,
|
||||||
|
code: 'INVALID_CONNECTION_TYPE'
|
||||||
|
});
|
||||||
|
result.statistics.invalidConnections++;
|
||||||
|
return;
|
||||||
|
}
|
||||||
const isSplitInBatches = sourceNode && (sourceNode.type === 'n8n-nodes-base.splitInBatches' ||
|
const isSplitInBatches = sourceNode && (sourceNode.type === 'n8n-nodes-base.splitInBatches' ||
|
||||||
sourceNode.type === 'nodes-base.splitInBatches');
|
sourceNode.type === 'nodes-base.splitInBatches');
|
||||||
if (isSplitInBatches) {
|
if (isSplitInBatches) {
|
||||||
@@ -506,6 +513,9 @@ class WorkflowValidator {
|
|||||||
if (outputType === 'ai_tool') {
|
if (outputType === 'ai_tool') {
|
||||||
this.validateAIToolConnection(sourceName, targetNode, result);
|
this.validateAIToolConnection(sourceName, targetNode, result);
|
||||||
}
|
}
|
||||||
|
if (outputType === 'main') {
|
||||||
|
this.validateInputIndexBounds(sourceName, targetNode, connection, result);
|
||||||
|
}
|
||||||
}
|
}
|
||||||
});
|
});
|
||||||
});
|
});
|
||||||
@@ -634,6 +644,171 @@ class WorkflowValidator {
|
|||||||
code: 'INVALID_AI_TOOL_SOURCE'
|
code: 'INVALID_AI_TOOL_SOURCE'
|
||||||
});
|
});
|
||||||
}
|
}
|
||||||
|
validateOutputIndexBounds(sourceNode, outputs, result) {
|
||||||
|
const normalizedType = node_type_normalizer_1.NodeTypeNormalizer.normalizeToFullForm(sourceNode.type);
|
||||||
|
const nodeInfo = this.nodeRepository.getNode(normalizedType);
|
||||||
|
if (!nodeInfo || !nodeInfo.outputs)
|
||||||
|
return;
|
||||||
|
let mainOutputCount;
|
||||||
|
if (Array.isArray(nodeInfo.outputs)) {
|
||||||
|
mainOutputCount = nodeInfo.outputs.filter((o) => typeof o === 'string' ? o === 'main' : (o.type === 'main' || !o.type)).length;
|
||||||
|
}
|
||||||
|
else {
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
if (mainOutputCount === 0)
|
||||||
|
return;
|
||||||
|
const shortType = normalizedType.replace(/^(n8n-)?nodes-base\./, '');
|
||||||
|
if (shortType === 'switch') {
|
||||||
|
const rules = sourceNode.parameters?.rules?.values || sourceNode.parameters?.rules;
|
||||||
|
if (Array.isArray(rules)) {
|
||||||
|
mainOutputCount = rules.length + 1;
|
||||||
|
}
|
||||||
|
else {
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (shortType === 'if' || shortType === 'filter') {
|
||||||
|
mainOutputCount = 2;
|
||||||
|
}
|
||||||
|
if (sourceNode.onError === 'continueErrorOutput') {
|
||||||
|
mainOutputCount += 1;
|
||||||
|
}
|
||||||
|
const maxOutputIndex = outputs.length - 1;
|
||||||
|
if (maxOutputIndex >= mainOutputCount) {
|
||||||
|
for (let i = mainOutputCount; i < outputs.length; i++) {
|
||||||
|
if (outputs[i] && outputs[i].length > 0) {
|
||||||
|
result.errors.push({
|
||||||
|
type: 'error',
|
||||||
|
nodeId: sourceNode.id,
|
||||||
|
nodeName: sourceNode.name,
|
||||||
|
message: `Output index ${i} on node "${sourceNode.name}" exceeds its output count (${mainOutputCount}). ` +
|
||||||
|
`This node has ${mainOutputCount} main output(s) (indices 0-${mainOutputCount - 1}).`,
|
||||||
|
code: 'OUTPUT_INDEX_OUT_OF_BOUNDS'
|
||||||
|
});
|
||||||
|
result.statistics.invalidConnections++;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
validateInputIndexBounds(sourceName, targetNode, connection, result) {
|
||||||
|
const normalizedType = node_type_normalizer_1.NodeTypeNormalizer.normalizeToFullForm(targetNode.type);
|
||||||
|
const nodeInfo = this.nodeRepository.getNode(normalizedType);
|
||||||
|
if (!nodeInfo)
|
||||||
|
return;
|
||||||
|
const shortType = normalizedType.replace(/^(n8n-)?nodes-base\./, '');
|
||||||
|
let mainInputCount = 1;
|
||||||
|
if (shortType === 'merge' || shortType === 'compareDatasets') {
|
||||||
|
mainInputCount = 2;
|
||||||
|
}
|
||||||
|
if (nodeInfo.isTrigger || (0, node_type_utils_1.isTriggerNode)(targetNode.type)) {
|
||||||
|
mainInputCount = 0;
|
||||||
|
}
|
||||||
|
if (mainInputCount > 0 && connection.index >= mainInputCount) {
|
||||||
|
result.errors.push({
|
||||||
|
type: 'error',
|
||||||
|
nodeName: targetNode.name,
|
||||||
|
message: `Input index ${connection.index} on node "${targetNode.name}" exceeds its input count (${mainInputCount}). ` +
|
||||||
|
`Connection from "${sourceName}" targets input ${connection.index}, but this node has ${mainInputCount} main input(s) (indices 0-${mainInputCount - 1}).`,
|
||||||
|
code: 'INPUT_INDEX_OUT_OF_BOUNDS'
|
||||||
|
});
|
||||||
|
result.statistics.invalidConnections++;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
flagOrphanedNodes(workflow, result) {
|
||||||
|
const connectedNodes = new Set();
|
||||||
|
for (const [sourceName, outputs] of Object.entries(workflow.connections)) {
|
||||||
|
connectedNodes.add(sourceName);
|
||||||
|
for (const outputConns of Object.values(outputs)) {
|
||||||
|
if (!Array.isArray(outputConns))
|
||||||
|
continue;
|
||||||
|
for (const conns of outputConns) {
|
||||||
|
if (!conns)
|
||||||
|
continue;
|
||||||
|
for (const conn of conns) {
|
||||||
|
if (conn)
|
||||||
|
connectedNodes.add(conn.node);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
for (const node of workflow.nodes) {
|
||||||
|
if (node.disabled || (0, node_classification_1.isNonExecutableNode)(node.type))
|
||||||
|
continue;
|
||||||
|
if ((0, node_type_utils_1.isTriggerNode)(node.type))
|
||||||
|
continue;
|
||||||
|
if (!connectedNodes.has(node.name)) {
|
||||||
|
result.warnings.push({
|
||||||
|
type: 'warning',
|
||||||
|
nodeId: node.id,
|
||||||
|
nodeName: node.name,
|
||||||
|
message: 'Node is not connected to any other nodes'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
validateTriggerReachability(workflow, result) {
|
||||||
|
const adjacency = new Map();
|
||||||
|
for (const [sourceName, outputs] of Object.entries(workflow.connections)) {
|
||||||
|
if (!adjacency.has(sourceName))
|
||||||
|
adjacency.set(sourceName, new Set());
|
||||||
|
for (const outputConns of Object.values(outputs)) {
|
||||||
|
if (Array.isArray(outputConns)) {
|
||||||
|
for (const conns of outputConns) {
|
||||||
|
if (!conns)
|
||||||
|
continue;
|
||||||
|
for (const conn of conns) {
|
||||||
|
if (conn) {
|
||||||
|
adjacency.get(sourceName).add(conn.node);
|
||||||
|
if (!adjacency.has(conn.node))
|
||||||
|
adjacency.set(conn.node, new Set());
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
const triggerNodes = [];
|
||||||
|
for (const node of workflow.nodes) {
|
||||||
|
if ((0, node_type_utils_1.isTriggerNode)(node.type) && !node.disabled) {
|
||||||
|
triggerNodes.push(node.name);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (triggerNodes.length === 0) {
|
||||||
|
this.flagOrphanedNodes(workflow, result);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
const reachable = new Set();
|
||||||
|
const queue = [...triggerNodes];
|
||||||
|
for (const t of triggerNodes)
|
||||||
|
reachable.add(t);
|
||||||
|
while (queue.length > 0) {
|
||||||
|
const current = queue.shift();
|
||||||
|
const neighbors = adjacency.get(current);
|
||||||
|
if (neighbors) {
|
||||||
|
for (const neighbor of neighbors) {
|
||||||
|
if (!reachable.has(neighbor)) {
|
||||||
|
reachable.add(neighbor);
|
||||||
|
queue.push(neighbor);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
for (const node of workflow.nodes) {
|
||||||
|
if (node.disabled || (0, node_classification_1.isNonExecutableNode)(node.type))
|
||||||
|
continue;
|
||||||
|
if ((0, node_type_utils_1.isTriggerNode)(node.type))
|
||||||
|
continue;
|
||||||
|
if (!reachable.has(node.name)) {
|
||||||
|
result.warnings.push({
|
||||||
|
type: 'warning',
|
||||||
|
nodeId: node.id,
|
||||||
|
nodeName: node.name,
|
||||||
|
message: 'Node is not reachable from any trigger node'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
hasCycle(workflow) {
|
hasCycle(workflow) {
|
||||||
const visited = new Set();
|
const visited = new Set();
|
||||||
const recursionStack = new Set();
|
const recursionStack = new Set();
|
||||||
@@ -657,23 +832,13 @@ class WorkflowValidator {
|
|||||||
const connections = workflow.connections[nodeName];
|
const connections = workflow.connections[nodeName];
|
||||||
if (connections) {
|
if (connections) {
|
||||||
const allTargets = [];
|
const allTargets = [];
|
||||||
if (connections.main) {
|
for (const outputConns of Object.values(connections)) {
|
||||||
connections.main.flat().forEach(conn => {
|
if (Array.isArray(outputConns)) {
|
||||||
if (conn)
|
outputConns.flat().forEach(conn => {
|
||||||
allTargets.push(conn.node);
|
if (conn)
|
||||||
});
|
allTargets.push(conn.node);
|
||||||
}
|
});
|
||||||
if (connections.error) {
|
}
|
||||||
connections.error.flat().forEach(conn => {
|
|
||||||
if (conn)
|
|
||||||
allTargets.push(conn.node);
|
|
||||||
});
|
|
||||||
}
|
|
||||||
if (connections.ai_tool) {
|
|
||||||
connections.ai_tool.flat().forEach(conn => {
|
|
||||||
if (conn)
|
|
||||||
allTargets.push(conn.node);
|
|
||||||
});
|
|
||||||
}
|
}
|
||||||
const currentNodeType = nodeTypeMap.get(nodeName);
|
const currentNodeType = nodeTypeMap.get(nodeName);
|
||||||
const isLoopNode = loopNodeTypes.includes(currentNodeType || '');
|
const isLoopNode = loopNodeTypes.includes(currentNodeType || '');
|
||||||
|
|||||||
2
dist/services/workflow-validator.js.map
vendored
2
dist/services/workflow-validator.js.map
vendored
File diff suppressed because one or more lines are too long
14684
package-lock.json
generated
14684
package-lock.json
generated
File diff suppressed because it is too large
Load Diff
13
package.json
13
package.json
@@ -1,6 +1,6 @@
|
|||||||
{
|
{
|
||||||
"name": "n8n-mcp",
|
"name": "n8n-mcp",
|
||||||
"version": "2.35.2",
|
"version": "2.36.0",
|
||||||
"description": "Integration between n8n workflow automation and Model Context Protocol (MCP)",
|
"description": "Integration between n8n workflow automation and Model Context Protocol (MCP)",
|
||||||
"main": "dist/index.js",
|
"main": "dist/index.js",
|
||||||
"types": "dist/index.d.ts",
|
"types": "dist/index.d.ts",
|
||||||
@@ -153,16 +153,16 @@
|
|||||||
},
|
},
|
||||||
"dependencies": {
|
"dependencies": {
|
||||||
"@modelcontextprotocol/sdk": "1.20.1",
|
"@modelcontextprotocol/sdk": "1.20.1",
|
||||||
"@n8n/n8n-nodes-langchain": "^2.6.2",
|
"@n8n/n8n-nodes-langchain": "^2.10.1",
|
||||||
"@supabase/supabase-js": "^2.57.4",
|
"@supabase/supabase-js": "^2.57.4",
|
||||||
"dotenv": "^16.5.0",
|
"dotenv": "^16.5.0",
|
||||||
"express": "^5.1.0",
|
"express": "^5.1.0",
|
||||||
"express-rate-limit": "^7.1.5",
|
"express-rate-limit": "^7.1.5",
|
||||||
"form-data": "^4.0.5",
|
"form-data": "^4.0.5",
|
||||||
"lru-cache": "^11.2.1",
|
"lru-cache": "^11.2.1",
|
||||||
"n8n": "^2.6.3",
|
"n8n": "^2.10.3",
|
||||||
"n8n-core": "^2.6.1",
|
"n8n-core": "^2.10.1",
|
||||||
"n8n-workflow": "^2.6.0",
|
"n8n-workflow": "^2.10.1",
|
||||||
"openai": "^4.77.0",
|
"openai": "^4.77.0",
|
||||||
"sql.js": "^1.13.0",
|
"sql.js": "^1.13.0",
|
||||||
"tslib": "^2.6.2",
|
"tslib": "^2.6.2",
|
||||||
@@ -175,6 +175,7 @@
|
|||||||
"better-sqlite3": "^11.10.0"
|
"better-sqlite3": "^11.10.0"
|
||||||
},
|
},
|
||||||
"overrides": {
|
"overrides": {
|
||||||
"pyodide": "0.26.4"
|
"pyodide": "0.26.4",
|
||||||
|
"isolated-vm": "npm:empty-npm-package@1.0.0"
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -57,12 +57,14 @@ export interface DocumentationGeneratorConfig {
|
|||||||
timeout?: number;
|
timeout?: number;
|
||||||
/** Max tokens for response (default: 2000) */
|
/** Max tokens for response (default: 2000) */
|
||||||
maxTokens?: number;
|
maxTokens?: number;
|
||||||
|
/** Temperature for generation (default: 0.3, set to undefined to omit) */
|
||||||
|
temperature?: number;
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Default configuration
|
* Default configuration
|
||||||
*/
|
*/
|
||||||
const DEFAULT_CONFIG: Required<Omit<DocumentationGeneratorConfig, 'baseUrl'>> = {
|
const DEFAULT_CONFIG: Required<Omit<DocumentationGeneratorConfig, 'baseUrl' | 'temperature'>> = {
|
||||||
model: 'qwen3-4b-thinking-2507',
|
model: 'qwen3-4b-thinking-2507',
|
||||||
apiKey: 'not-needed',
|
apiKey: 'not-needed',
|
||||||
timeout: 60000,
|
timeout: 60000,
|
||||||
@@ -78,6 +80,7 @@ export class DocumentationGenerator {
|
|||||||
private model: string;
|
private model: string;
|
||||||
private maxTokens: number;
|
private maxTokens: number;
|
||||||
private timeout: number;
|
private timeout: number;
|
||||||
|
private temperature?: number;
|
||||||
|
|
||||||
constructor(config: DocumentationGeneratorConfig) {
|
constructor(config: DocumentationGeneratorConfig) {
|
||||||
const fullConfig = { ...DEFAULT_CONFIG, ...config };
|
const fullConfig = { ...DEFAULT_CONFIG, ...config };
|
||||||
@@ -90,6 +93,7 @@ export class DocumentationGenerator {
|
|||||||
this.model = fullConfig.model;
|
this.model = fullConfig.model;
|
||||||
this.maxTokens = fullConfig.maxTokens;
|
this.maxTokens = fullConfig.maxTokens;
|
||||||
this.timeout = fullConfig.timeout;
|
this.timeout = fullConfig.timeout;
|
||||||
|
this.temperature = fullConfig.temperature;
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@@ -101,8 +105,8 @@ export class DocumentationGenerator {
|
|||||||
|
|
||||||
const completion = await this.client.chat.completions.create({
|
const completion = await this.client.chat.completions.create({
|
||||||
model: this.model,
|
model: this.model,
|
||||||
max_tokens: this.maxTokens,
|
max_completion_tokens: this.maxTokens,
|
||||||
temperature: 0.3, // Lower temperature for more consistent output
|
...(this.temperature !== undefined ? { temperature: this.temperature } : {}),
|
||||||
messages: [
|
messages: [
|
||||||
{
|
{
|
||||||
role: 'system',
|
role: 'system',
|
||||||
@@ -321,7 +325,7 @@ Guidelines:
|
|||||||
try {
|
try {
|
||||||
const completion = await this.client.chat.completions.create({
|
const completion = await this.client.chat.completions.create({
|
||||||
model: this.model,
|
model: this.model,
|
||||||
max_tokens: 10,
|
max_completion_tokens: 200,
|
||||||
messages: [
|
messages: [
|
||||||
{
|
{
|
||||||
role: 'user',
|
role: 'user',
|
||||||
@@ -353,10 +357,15 @@ export function createDocumentationGenerator(): DocumentationGenerator {
|
|||||||
const baseUrl = process.env.N8N_MCP_LLM_BASE_URL || 'http://localhost:1234/v1';
|
const baseUrl = process.env.N8N_MCP_LLM_BASE_URL || 'http://localhost:1234/v1';
|
||||||
const model = process.env.N8N_MCP_LLM_MODEL || 'qwen3-4b-thinking-2507';
|
const model = process.env.N8N_MCP_LLM_MODEL || 'qwen3-4b-thinking-2507';
|
||||||
const timeout = parseInt(process.env.N8N_MCP_LLM_TIMEOUT || '60000', 10);
|
const timeout = parseInt(process.env.N8N_MCP_LLM_TIMEOUT || '60000', 10);
|
||||||
|
const apiKey = process.env.N8N_MCP_LLM_API_KEY || process.env.OPENAI_API_KEY;
|
||||||
|
// Only set temperature for local LLM servers; cloud APIs like OpenAI may not support custom values
|
||||||
|
const isLocalServer = !baseUrl.includes('openai.com') && !baseUrl.includes('anthropic.com');
|
||||||
|
|
||||||
return new DocumentationGenerator({
|
return new DocumentationGenerator({
|
||||||
baseUrl,
|
baseUrl,
|
||||||
model,
|
model,
|
||||||
timeout,
|
timeout,
|
||||||
|
...(apiKey ? { apiKey } : {}),
|
||||||
|
...(isLocalServer ? { temperature: 0.3 } : {}),
|
||||||
});
|
});
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -31,9 +31,28 @@ export class N8nNodeLoader {
|
|||||||
return results;
|
return results;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Resolve the absolute directory of an installed package.
|
||||||
|
* Uses require.resolve on package.json (always exported) and strips the filename.
|
||||||
|
*/
|
||||||
|
private resolvePackageDir(packagePath: string): string {
|
||||||
|
const pkgJsonPath = require.resolve(`${packagePath}/package.json`);
|
||||||
|
return path.dirname(pkgJsonPath);
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Load a node module by absolute file path, bypassing package.json "exports".
|
||||||
|
* Some packages (e.g. @n8n/n8n-nodes-langchain >=2.9) restrict exports but
|
||||||
|
* still list node files in the n8n.nodes array — we need direct filesystem access.
|
||||||
|
*/
|
||||||
|
private loadNodeModule(absolutePath: string): any {
|
||||||
|
return require(absolutePath);
|
||||||
|
}
|
||||||
|
|
||||||
private async loadPackageNodes(packageName: string, packagePath: string, packageJson: any): Promise<LoadedNode[]> {
|
private async loadPackageNodes(packageName: string, packagePath: string, packageJson: any): Promise<LoadedNode[]> {
|
||||||
const n8nConfig = packageJson.n8n || {};
|
const n8nConfig = packageJson.n8n || {};
|
||||||
const nodes: LoadedNode[] = [];
|
const nodes: LoadedNode[] = [];
|
||||||
|
const packageDir = this.resolvePackageDir(packagePath);
|
||||||
|
|
||||||
// Check if nodes is an array or object
|
// Check if nodes is an array or object
|
||||||
const nodesList = n8nConfig.nodes || [];
|
const nodesList = n8nConfig.nodes || [];
|
||||||
@@ -42,8 +61,9 @@ export class N8nNodeLoader {
|
|||||||
// Handle array format (n8n-nodes-base uses this)
|
// Handle array format (n8n-nodes-base uses this)
|
||||||
for (const nodePath of nodesList) {
|
for (const nodePath of nodesList) {
|
||||||
try {
|
try {
|
||||||
const fullPath = require.resolve(`${packagePath}/${nodePath}`);
|
// Resolve absolute path directly to bypass package exports restrictions
|
||||||
const nodeModule = require(fullPath);
|
const fullPath = path.join(packageDir, nodePath);
|
||||||
|
const nodeModule = this.loadNodeModule(fullPath);
|
||||||
|
|
||||||
// Extract node name from path (e.g., "dist/nodes/Slack/Slack.node.js" -> "Slack")
|
// Extract node name from path (e.g., "dist/nodes/Slack/Slack.node.js" -> "Slack")
|
||||||
const nodeNameMatch = nodePath.match(/\/([^\/]+)\.node\.(js|ts)$/);
|
const nodeNameMatch = nodePath.match(/\/([^\/]+)\.node\.(js|ts)$/);
|
||||||
@@ -65,8 +85,8 @@ export class N8nNodeLoader {
|
|||||||
// Handle object format (for other packages)
|
// Handle object format (for other packages)
|
||||||
for (const [nodeName, nodePath] of Object.entries(nodesList)) {
|
for (const [nodeName, nodePath] of Object.entries(nodesList)) {
|
||||||
try {
|
try {
|
||||||
const fullPath = require.resolve(`${packagePath}/${nodePath as string}`);
|
const fullPath = path.join(packageDir, nodePath as string);
|
||||||
const nodeModule = require(fullPath);
|
const nodeModule = this.loadNodeModule(fullPath);
|
||||||
|
|
||||||
// Handle default export and various export patterns
|
// Handle default export and various export patterns
|
||||||
const NodeClass = nodeModule.default || nodeModule[nodeName] || Object.values(nodeModule)[0];
|
const NodeClass = nodeModule.default || nodeModule[nodeName] || Object.values(nodeModule)[0];
|
||||||
|
|||||||
@@ -687,9 +687,23 @@ export class N8NDocumentationMCPServer {
|
|||||||
};
|
};
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Safeguard: if the entire args object arrives as a JSON string, parse it.
|
||||||
|
// Some MCP clients may serialize the arguments object itself.
|
||||||
|
let processedArgs: Record<string, any> | undefined = args;
|
||||||
|
if (typeof args === 'string') {
|
||||||
|
try {
|
||||||
|
const parsed = JSON.parse(args as unknown as string);
|
||||||
|
if (parsed && typeof parsed === 'object' && !Array.isArray(parsed)) {
|
||||||
|
processedArgs = parsed;
|
||||||
|
logger.warn(`Coerced stringified args object for tool "${name}"`);
|
||||||
|
}
|
||||||
|
} catch {
|
||||||
|
logger.warn(`Tool "${name}" received string args that are not valid JSON`);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
// Workaround for n8n's nested output bug
|
// Workaround for n8n's nested output bug
|
||||||
// Check if args contains nested 'output' structure from n8n's memory corruption
|
// Check if args contains nested 'output' structure from n8n's memory corruption
|
||||||
let processedArgs = args;
|
|
||||||
if (args && typeof args === 'object' && 'output' in args) {
|
if (args && typeof args === 'object' && 'output' in args) {
|
||||||
try {
|
try {
|
||||||
const possibleNestedData = args.output;
|
const possibleNestedData = args.output;
|
||||||
@@ -721,6 +735,12 @@ export class N8NDocumentationMCPServer {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Workaround for Claude Desktop / Claude.ai MCP client bugs that
|
||||||
|
// serialize parameters with wrong types. Coerces ALL mismatched types
|
||||||
|
// (string↔object, string↔number, string↔boolean, etc.) using the
|
||||||
|
// tool's inputSchema as the source of truth.
|
||||||
|
processedArgs = this.coerceStringifiedJsonParams(name, processedArgs);
|
||||||
|
|
||||||
try {
|
try {
|
||||||
logger.debug(`Executing tool: ${name}`, { args: processedArgs });
|
logger.debug(`Executing tool: ${name}`, { args: processedArgs });
|
||||||
const startTime = Date.now();
|
const startTime = Date.now();
|
||||||
@@ -822,6 +842,14 @@ export class N8NDocumentationMCPServer {
|
|||||||
helpfulMessage += '\n\nFor validation tools:\n- nodeType should be a string (e.g., "nodes-base.webhook")\n- config should be an object (e.g., {})';
|
helpfulMessage += '\n\nFor validation tools:\n- nodeType should be a string (e.g., "nodes-base.webhook")\n- config should be an object (e.g., {})';
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Include diagnostic info about received args to help debug client issues
|
||||||
|
try {
|
||||||
|
const argDiag = processedArgs && typeof processedArgs === 'object'
|
||||||
|
? Object.entries(processedArgs).map(([k, v]) => `${k}: ${typeof v}`).join(', ')
|
||||||
|
: `args type: ${typeof processedArgs}`;
|
||||||
|
helpfulMessage += `\n\n[Diagnostic] Received arg types: {${argDiag}}`;
|
||||||
|
} catch { /* ignore diagnostic errors */ }
|
||||||
|
|
||||||
return {
|
return {
|
||||||
content: [
|
content: [
|
||||||
{
|
{
|
||||||
@@ -1125,6 +1153,109 @@ export class N8NDocumentationMCPServer {
|
|||||||
return true;
|
return true;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Coerce mistyped parameters back to their expected types.
|
||||||
|
* Workaround for Claude Desktop / Claude.ai MCP client bugs that serialize
|
||||||
|
* parameters incorrectly (objects as strings, numbers as strings, etc.).
|
||||||
|
*
|
||||||
|
* Handles ALL type mismatches based on the tool's inputSchema:
|
||||||
|
* string→object, string→array : JSON.parse
|
||||||
|
* string→number, string→integer : Number()
|
||||||
|
* string→boolean : "true"/"false" parsing
|
||||||
|
* number→string, boolean→string : .toString()
|
||||||
|
*/
|
||||||
|
private coerceStringifiedJsonParams(
|
||||||
|
toolName: string,
|
||||||
|
args: Record<string, any> | undefined
|
||||||
|
): Record<string, any> | undefined {
|
||||||
|
if (!args || typeof args !== 'object') return args;
|
||||||
|
|
||||||
|
const allTools = [...n8nDocumentationToolsFinal, ...n8nManagementTools];
|
||||||
|
const tool = allTools.find(t => t.name === toolName);
|
||||||
|
if (!tool?.inputSchema?.properties) return args;
|
||||||
|
|
||||||
|
const properties = tool.inputSchema.properties;
|
||||||
|
const coerced = { ...args };
|
||||||
|
let coercedAny = false;
|
||||||
|
|
||||||
|
for (const [key, value] of Object.entries(coerced)) {
|
||||||
|
if (value === undefined || value === null) continue;
|
||||||
|
|
||||||
|
const propSchema = (properties as any)[key];
|
||||||
|
if (!propSchema) continue;
|
||||||
|
const expectedType = propSchema.type;
|
||||||
|
if (!expectedType) continue;
|
||||||
|
|
||||||
|
const actualType = typeof value;
|
||||||
|
|
||||||
|
// Already correct type — skip
|
||||||
|
if (expectedType === 'string' && actualType === 'string') continue;
|
||||||
|
if ((expectedType === 'number' || expectedType === 'integer') && actualType === 'number') continue;
|
||||||
|
if (expectedType === 'boolean' && actualType === 'boolean') continue;
|
||||||
|
if (expectedType === 'object' && actualType === 'object' && !Array.isArray(value)) continue;
|
||||||
|
if (expectedType === 'array' && Array.isArray(value)) continue;
|
||||||
|
|
||||||
|
// --- Coercion: string value → expected type ---
|
||||||
|
if (actualType === 'string') {
|
||||||
|
const trimmed = (value as string).trim();
|
||||||
|
|
||||||
|
if (expectedType === 'object' && trimmed.startsWith('{')) {
|
||||||
|
try {
|
||||||
|
const parsed = JSON.parse(trimmed);
|
||||||
|
if (typeof parsed === 'object' && parsed !== null && !Array.isArray(parsed)) {
|
||||||
|
coerced[key] = parsed;
|
||||||
|
coercedAny = true;
|
||||||
|
}
|
||||||
|
} catch { /* keep original */ }
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (expectedType === 'array' && trimmed.startsWith('[')) {
|
||||||
|
try {
|
||||||
|
const parsed = JSON.parse(trimmed);
|
||||||
|
if (Array.isArray(parsed)) {
|
||||||
|
coerced[key] = parsed;
|
||||||
|
coercedAny = true;
|
||||||
|
}
|
||||||
|
} catch { /* keep original */ }
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (expectedType === 'number' || expectedType === 'integer') {
|
||||||
|
const num = Number(trimmed);
|
||||||
|
if (!isNaN(num) && trimmed !== '') {
|
||||||
|
coerced[key] = expectedType === 'integer' ? Math.trunc(num) : num;
|
||||||
|
coercedAny = true;
|
||||||
|
}
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (expectedType === 'boolean') {
|
||||||
|
if (trimmed === 'true') { coerced[key] = true; coercedAny = true; }
|
||||||
|
else if (trimmed === 'false') { coerced[key] = false; coercedAny = true; }
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// --- Coercion: number/boolean value → expected string ---
|
||||||
|
if (expectedType === 'string' && (actualType === 'number' || actualType === 'boolean')) {
|
||||||
|
coerced[key] = String(value);
|
||||||
|
coercedAny = true;
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if (coercedAny) {
|
||||||
|
logger.warn(`Coerced mistyped params for tool "${toolName}"`, {
|
||||||
|
original: Object.fromEntries(
|
||||||
|
Object.entries(args).map(([k, v]) => [k, `${typeof v}: ${typeof v === 'string' ? v.substring(0, 80) : v}`])
|
||||||
|
),
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
return coerced;
|
||||||
|
}
|
||||||
|
|
||||||
async executeTool(name: string, args: any): Promise<any> {
|
async executeTool(name: string, args: any): Promise<any> {
|
||||||
// Ensure args is an object and validate it
|
// Ensure args is an object and validate it
|
||||||
args = args || {};
|
args = args || {};
|
||||||
|
|||||||
@@ -11,6 +11,7 @@
|
|||||||
* Environment variables:
|
* Environment variables:
|
||||||
* N8N_MCP_LLM_BASE_URL - LLM server URL (default: http://localhost:1234/v1)
|
* N8N_MCP_LLM_BASE_URL - LLM server URL (default: http://localhost:1234/v1)
|
||||||
* N8N_MCP_LLM_MODEL - LLM model name (default: qwen3-4b-thinking-2507)
|
* N8N_MCP_LLM_MODEL - LLM model name (default: qwen3-4b-thinking-2507)
|
||||||
|
* N8N_MCP_LLM_API_KEY - LLM API key (falls back to OPENAI_API_KEY; default: 'not-needed')
|
||||||
* N8N_MCP_LLM_TIMEOUT - Request timeout in ms (default: 60000)
|
* N8N_MCP_LLM_TIMEOUT - Request timeout in ms (default: 60000)
|
||||||
* N8N_MCP_DB_PATH - Database path (default: ./data/nodes.db)
|
* N8N_MCP_DB_PATH - Database path (default: ./data/nodes.db)
|
||||||
*/
|
*/
|
||||||
@@ -81,6 +82,7 @@ Options:
|
|||||||
Environment Variables:
|
Environment Variables:
|
||||||
N8N_MCP_LLM_BASE_URL LLM server URL (default: http://localhost:1234/v1)
|
N8N_MCP_LLM_BASE_URL LLM server URL (default: http://localhost:1234/v1)
|
||||||
N8N_MCP_LLM_MODEL LLM model name (default: qwen3-4b-thinking-2507)
|
N8N_MCP_LLM_MODEL LLM model name (default: qwen3-4b-thinking-2507)
|
||||||
|
N8N_MCP_LLM_API_KEY LLM API key (falls back to OPENAI_API_KEY; default: 'not-needed')
|
||||||
N8N_MCP_LLM_TIMEOUT Request timeout in ms (default: 60000)
|
N8N_MCP_LLM_TIMEOUT Request timeout in ms (default: 60000)
|
||||||
N8N_MCP_DB_PATH Database path (default: ./data/nodes.db)
|
N8N_MCP_DB_PATH Database path (default: ./data/nodes.db)
|
||||||
|
|
||||||
|
|||||||
@@ -11,13 +11,28 @@ import { ExpressionFormatValidator } from './expression-format-validator';
|
|||||||
import { NodeSimilarityService, NodeSuggestion } from './node-similarity-service';
|
import { NodeSimilarityService, NodeSuggestion } from './node-similarity-service';
|
||||||
import { NodeTypeNormalizer } from '../utils/node-type-normalizer';
|
import { NodeTypeNormalizer } from '../utils/node-type-normalizer';
|
||||||
import { Logger } from '../utils/logger';
|
import { Logger } from '../utils/logger';
|
||||||
import { validateAISpecificNodes, hasAINodes } from './ai-node-validator';
|
import { validateAISpecificNodes, hasAINodes, AI_CONNECTION_TYPES } from './ai-node-validator';
|
||||||
import { isAIToolSubNode } from './ai-tool-validators';
|
import { isAIToolSubNode } from './ai-tool-validators';
|
||||||
import { isTriggerNode } from '../utils/node-type-utils';
|
import { isTriggerNode } from '../utils/node-type-utils';
|
||||||
import { isNonExecutableNode } from '../utils/node-classification';
|
import { isNonExecutableNode } from '../utils/node-classification';
|
||||||
import { ToolVariantGenerator } from './tool-variant-generator';
|
import { ToolVariantGenerator } from './tool-variant-generator';
|
||||||
const logger = new Logger({ prefix: '[WorkflowValidator]' });
|
const logger = new Logger({ prefix: '[WorkflowValidator]' });
|
||||||
|
|
||||||
|
/**
|
||||||
|
* All valid connection output keys in n8n workflows.
|
||||||
|
* Any key not in this set is malformed and should be flagged.
|
||||||
|
*/
|
||||||
|
const VALID_CONNECTION_TYPES = new Set<string>([
|
||||||
|
'main',
|
||||||
|
'error',
|
||||||
|
...AI_CONNECTION_TYPES,
|
||||||
|
// Additional AI types from n8n-workflow NodeConnectionTypes not in AI_CONNECTION_TYPES
|
||||||
|
'ai_agent',
|
||||||
|
'ai_chain',
|
||||||
|
'ai_retriever',
|
||||||
|
'ai_reranker',
|
||||||
|
]);
|
||||||
|
|
||||||
interface WorkflowNode {
|
interface WorkflowNode {
|
||||||
id: string;
|
id: string;
|
||||||
name: string;
|
name: string;
|
||||||
@@ -40,9 +55,7 @@ interface WorkflowNode {
|
|||||||
|
|
||||||
interface WorkflowConnection {
|
interface WorkflowConnection {
|
||||||
[sourceNode: string]: {
|
[sourceNode: string]: {
|
||||||
main?: Array<Array<{ node: string; type: string; index: number }>>;
|
[outputType: string]: Array<Array<{ node: string; type: string; index: number }>>;
|
||||||
error?: Array<Array<{ node: string; type: string; index: number }>>;
|
|
||||||
ai_tool?: Array<Array<{ node: string; type: string; index: number }>>;
|
|
||||||
};
|
};
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -612,86 +625,47 @@ export class WorkflowValidator {
|
|||||||
continue;
|
continue;
|
||||||
}
|
}
|
||||||
|
|
||||||
// Check main outputs
|
// Detect unknown output keys and validate known ones
|
||||||
if (outputs.main) {
|
for (const [outputKey, outputConnections] of Object.entries(outputs)) {
|
||||||
this.validateConnectionOutputs(
|
if (!VALID_CONNECTION_TYPES.has(outputKey)) {
|
||||||
sourceName,
|
// Flag unknown connection output key
|
||||||
outputs.main,
|
let suggestion = '';
|
||||||
nodeMap,
|
if (/^\d+$/.test(outputKey)) {
|
||||||
nodeIdMap,
|
suggestion = ` If you meant to use output index ${outputKey}, use main[${outputKey}] instead.`;
|
||||||
result,
|
}
|
||||||
'main'
|
result.errors.push({
|
||||||
);
|
type: 'error',
|
||||||
}
|
nodeName: sourceName,
|
||||||
|
message: `Unknown connection output key "${outputKey}" on node "${sourceName}". Valid keys are: ${[...VALID_CONNECTION_TYPES].join(', ')}.${suggestion}`,
|
||||||
|
code: 'UNKNOWN_CONNECTION_KEY'
|
||||||
|
});
|
||||||
|
result.statistics.invalidConnections++;
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
// Check error outputs
|
if (!outputConnections || !Array.isArray(outputConnections)) continue;
|
||||||
if (outputs.error) {
|
|
||||||
this.validateConnectionOutputs(
|
|
||||||
sourceName,
|
|
||||||
outputs.error,
|
|
||||||
nodeMap,
|
|
||||||
nodeIdMap,
|
|
||||||
result,
|
|
||||||
'error'
|
|
||||||
);
|
|
||||||
}
|
|
||||||
|
|
||||||
// Check AI tool outputs
|
|
||||||
if (outputs.ai_tool) {
|
|
||||||
// Validate that the source node can actually output ai_tool
|
// Validate that the source node can actually output ai_tool
|
||||||
this.validateAIToolSource(sourceNode, result);
|
if (outputKey === 'ai_tool') {
|
||||||
|
this.validateAIToolSource(sourceNode, result);
|
||||||
|
}
|
||||||
|
|
||||||
this.validateConnectionOutputs(
|
this.validateConnectionOutputs(
|
||||||
sourceName,
|
sourceName,
|
||||||
outputs.ai_tool,
|
outputConnections,
|
||||||
nodeMap,
|
nodeMap,
|
||||||
nodeIdMap,
|
nodeIdMap,
|
||||||
result,
|
result,
|
||||||
'ai_tool'
|
outputKey
|
||||||
);
|
);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
// Check for orphaned nodes (not connected and not triggers)
|
// Trigger reachability analysis: BFS from all triggers to find unreachable nodes
|
||||||
const connectedNodes = new Set<string>();
|
if (profile !== 'minimal') {
|
||||||
|
this.validateTriggerReachability(workflow, result);
|
||||||
// Add all source nodes
|
} else {
|
||||||
Object.keys(workflow.connections).forEach(name => connectedNodes.add(name));
|
this.flagOrphanedNodes(workflow, result);
|
||||||
|
|
||||||
// Add all target nodes
|
|
||||||
Object.values(workflow.connections).forEach(outputs => {
|
|
||||||
if (outputs.main) {
|
|
||||||
outputs.main.flat().forEach(conn => {
|
|
||||||
if (conn) connectedNodes.add(conn.node);
|
|
||||||
});
|
|
||||||
}
|
|
||||||
if (outputs.error) {
|
|
||||||
outputs.error.flat().forEach(conn => {
|
|
||||||
if (conn) connectedNodes.add(conn.node);
|
|
||||||
});
|
|
||||||
}
|
|
||||||
if (outputs.ai_tool) {
|
|
||||||
outputs.ai_tool.flat().forEach(conn => {
|
|
||||||
if (conn) connectedNodes.add(conn.node);
|
|
||||||
});
|
|
||||||
}
|
|
||||||
});
|
|
||||||
|
|
||||||
// Check for orphaned nodes (exclude sticky notes)
|
|
||||||
for (const node of workflow.nodes) {
|
|
||||||
if (node.disabled || isNonExecutableNode(node.type)) continue;
|
|
||||||
|
|
||||||
// Use shared trigger detection function for consistency
|
|
||||||
const isNodeTrigger = isTriggerNode(node.type);
|
|
||||||
|
|
||||||
if (!connectedNodes.has(node.name) && !isNodeTrigger) {
|
|
||||||
result.warnings.push({
|
|
||||||
type: 'warning',
|
|
||||||
nodeId: node.id,
|
|
||||||
nodeName: node.name,
|
|
||||||
message: 'Node is not connected to any other nodes'
|
|
||||||
});
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
|
||||||
// Check for cycles (skip in minimal profile to reduce false positives)
|
// Check for cycles (skip in minimal profile to reduce false positives)
|
||||||
@@ -712,14 +686,15 @@ export class WorkflowValidator {
|
|||||||
nodeMap: Map<string, WorkflowNode>,
|
nodeMap: Map<string, WorkflowNode>,
|
||||||
nodeIdMap: Map<string, WorkflowNode>,
|
nodeIdMap: Map<string, WorkflowNode>,
|
||||||
result: WorkflowValidationResult,
|
result: WorkflowValidationResult,
|
||||||
outputType: 'main' | 'error' | 'ai_tool'
|
outputType: string
|
||||||
): void {
|
): void {
|
||||||
// Get source node for special validation
|
// Get source node for special validation
|
||||||
const sourceNode = nodeMap.get(sourceName);
|
const sourceNode = nodeMap.get(sourceName);
|
||||||
|
|
||||||
// Special validation for main outputs with error handling
|
// Main-output-specific validation: error handling config and index bounds
|
||||||
if (outputType === 'main' && sourceNode) {
|
if (outputType === 'main' && sourceNode) {
|
||||||
this.validateErrorOutputConfiguration(sourceName, sourceNode, outputs, nodeMap, result);
|
this.validateErrorOutputConfiguration(sourceName, sourceNode, outputs, nodeMap, result);
|
||||||
|
this.validateOutputIndexBounds(sourceNode, outputs, result);
|
||||||
}
|
}
|
||||||
|
|
||||||
outputs.forEach((outputConnections, outputIndex) => {
|
outputs.forEach((outputConnections, outputIndex) => {
|
||||||
@@ -736,6 +711,22 @@ export class WorkflowValidator {
|
|||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Validate connection type field
|
||||||
|
if (connection.type && !VALID_CONNECTION_TYPES.has(connection.type)) {
|
||||||
|
let suggestion = '';
|
||||||
|
if (/^\d+$/.test(connection.type)) {
|
||||||
|
suggestion = ` Numeric types are not valid - use "main", "error", or an AI connection type.`;
|
||||||
|
}
|
||||||
|
result.errors.push({
|
||||||
|
type: 'error',
|
||||||
|
nodeName: sourceName,
|
||||||
|
message: `Invalid connection type "${connection.type}" in connection from "${sourceName}" to "${connection.node}". Expected "main", "error", or an AI connection type (ai_tool, ai_languageModel, etc.).${suggestion}`,
|
||||||
|
code: 'INVALID_CONNECTION_TYPE'
|
||||||
|
});
|
||||||
|
result.statistics.invalidConnections++;
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
// Special validation for SplitInBatches node
|
// Special validation for SplitInBatches node
|
||||||
// Check both full form (n8n-nodes-base.*) and short form (nodes-base.*)
|
// Check both full form (n8n-nodes-base.*) and short form (nodes-base.*)
|
||||||
const isSplitInBatches = sourceNode && (
|
const isSplitInBatches = sourceNode && (
|
||||||
@@ -794,6 +785,11 @@ export class WorkflowValidator {
|
|||||||
if (outputType === 'ai_tool') {
|
if (outputType === 'ai_tool') {
|
||||||
this.validateAIToolConnection(sourceName, targetNode, result);
|
this.validateAIToolConnection(sourceName, targetNode, result);
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Input index bounds checking
|
||||||
|
if (outputType === 'main') {
|
||||||
|
this.validateInputIndexBounds(sourceName, targetNode, connection, result);
|
||||||
|
}
|
||||||
}
|
}
|
||||||
});
|
});
|
||||||
});
|
});
|
||||||
@@ -991,6 +987,221 @@ export class WorkflowValidator {
|
|||||||
});
|
});
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Validate that output indices don't exceed what the node type supports.
|
||||||
|
*/
|
||||||
|
private validateOutputIndexBounds(
|
||||||
|
sourceNode: WorkflowNode,
|
||||||
|
outputs: Array<Array<{ node: string; type: string; index: number }>>,
|
||||||
|
result: WorkflowValidationResult
|
||||||
|
): void {
|
||||||
|
const normalizedType = NodeTypeNormalizer.normalizeToFullForm(sourceNode.type);
|
||||||
|
const nodeInfo = this.nodeRepository.getNode(normalizedType);
|
||||||
|
if (!nodeInfo || !nodeInfo.outputs) return;
|
||||||
|
|
||||||
|
// Count main outputs from node description
|
||||||
|
let mainOutputCount: number;
|
||||||
|
if (Array.isArray(nodeInfo.outputs)) {
|
||||||
|
// outputs can be strings like "main" or objects with { type: "main" }
|
||||||
|
mainOutputCount = nodeInfo.outputs.filter((o: any) =>
|
||||||
|
typeof o === 'string' ? o === 'main' : (o.type === 'main' || !o.type)
|
||||||
|
).length;
|
||||||
|
} else {
|
||||||
|
return; // Dynamic outputs (expression string), skip check
|
||||||
|
}
|
||||||
|
|
||||||
|
if (mainOutputCount === 0) return;
|
||||||
|
|
||||||
|
// Account for dynamic output counts based on node type and parameters
|
||||||
|
const shortType = normalizedType.replace(/^(n8n-)?nodes-base\./, '');
|
||||||
|
if (shortType === 'switch') {
|
||||||
|
// Switch node: output count depends on rules configuration
|
||||||
|
const rules = sourceNode.parameters?.rules?.values || sourceNode.parameters?.rules;
|
||||||
|
if (Array.isArray(rules)) {
|
||||||
|
mainOutputCount = rules.length + 1; // rules + fallback
|
||||||
|
} else {
|
||||||
|
return; // Cannot determine dynamic output count, skip bounds check
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (shortType === 'if' || shortType === 'filter') {
|
||||||
|
mainOutputCount = 2; // true/false
|
||||||
|
}
|
||||||
|
|
||||||
|
// Account for continueErrorOutput adding an extra output
|
||||||
|
if (sourceNode.onError === 'continueErrorOutput') {
|
||||||
|
mainOutputCount += 1;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check if any output index exceeds bounds
|
||||||
|
const maxOutputIndex = outputs.length - 1;
|
||||||
|
if (maxOutputIndex >= mainOutputCount) {
|
||||||
|
// Only flag if there are actual connections at the out-of-bounds indices
|
||||||
|
for (let i = mainOutputCount; i < outputs.length; i++) {
|
||||||
|
if (outputs[i] && outputs[i].length > 0) {
|
||||||
|
result.errors.push({
|
||||||
|
type: 'error',
|
||||||
|
nodeId: sourceNode.id,
|
||||||
|
nodeName: sourceNode.name,
|
||||||
|
message: `Output index ${i} on node "${sourceNode.name}" exceeds its output count (${mainOutputCount}). ` +
|
||||||
|
`This node has ${mainOutputCount} main output(s) (indices 0-${mainOutputCount - 1}).`,
|
||||||
|
code: 'OUTPUT_INDEX_OUT_OF_BOUNDS'
|
||||||
|
});
|
||||||
|
result.statistics.invalidConnections++;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Validate that input index doesn't exceed what the target node accepts.
|
||||||
|
*/
|
||||||
|
private validateInputIndexBounds(
|
||||||
|
sourceName: string,
|
||||||
|
targetNode: WorkflowNode,
|
||||||
|
connection: { node: string; type: string; index: number },
|
||||||
|
result: WorkflowValidationResult
|
||||||
|
): void {
|
||||||
|
const normalizedType = NodeTypeNormalizer.normalizeToFullForm(targetNode.type);
|
||||||
|
const nodeInfo = this.nodeRepository.getNode(normalizedType);
|
||||||
|
if (!nodeInfo) return;
|
||||||
|
|
||||||
|
// Most nodes have 1 main input. Known exceptions:
|
||||||
|
const shortType = normalizedType.replace(/^(n8n-)?nodes-base\./, '');
|
||||||
|
let mainInputCount = 1; // Default: most nodes have 1 input
|
||||||
|
|
||||||
|
if (shortType === 'merge' || shortType === 'compareDatasets') {
|
||||||
|
mainInputCount = 2; // Merge nodes have 2 inputs
|
||||||
|
}
|
||||||
|
|
||||||
|
// Trigger nodes have 0 inputs
|
||||||
|
if (nodeInfo.isTrigger || isTriggerNode(targetNode.type)) {
|
||||||
|
mainInputCount = 0;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (mainInputCount > 0 && connection.index >= mainInputCount) {
|
||||||
|
result.errors.push({
|
||||||
|
type: 'error',
|
||||||
|
nodeName: targetNode.name,
|
||||||
|
message: `Input index ${connection.index} on node "${targetNode.name}" exceeds its input count (${mainInputCount}). ` +
|
||||||
|
`Connection from "${sourceName}" targets input ${connection.index}, but this node has ${mainInputCount} main input(s) (indices 0-${mainInputCount - 1}).`,
|
||||||
|
code: 'INPUT_INDEX_OUT_OF_BOUNDS'
|
||||||
|
});
|
||||||
|
result.statistics.invalidConnections++;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Flag nodes that are not referenced in any connection (source or target).
|
||||||
|
* Used as a lightweight check when BFS reachability is not applicable.
|
||||||
|
*/
|
||||||
|
private flagOrphanedNodes(
|
||||||
|
workflow: WorkflowJson,
|
||||||
|
result: WorkflowValidationResult
|
||||||
|
): void {
|
||||||
|
const connectedNodes = new Set<string>();
|
||||||
|
for (const [sourceName, outputs] of Object.entries(workflow.connections)) {
|
||||||
|
connectedNodes.add(sourceName);
|
||||||
|
for (const outputConns of Object.values(outputs)) {
|
||||||
|
if (!Array.isArray(outputConns)) continue;
|
||||||
|
for (const conns of outputConns) {
|
||||||
|
if (!conns) continue;
|
||||||
|
for (const conn of conns) {
|
||||||
|
if (conn) connectedNodes.add(conn.node);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
for (const node of workflow.nodes) {
|
||||||
|
if (node.disabled || isNonExecutableNode(node.type)) continue;
|
||||||
|
if (isTriggerNode(node.type)) continue;
|
||||||
|
if (!connectedNodes.has(node.name)) {
|
||||||
|
result.warnings.push({
|
||||||
|
type: 'warning',
|
||||||
|
nodeId: node.id,
|
||||||
|
nodeName: node.name,
|
||||||
|
message: 'Node is not connected to any other nodes'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* BFS from all trigger nodes to detect unreachable nodes.
|
||||||
|
* Replaces the simple "is node in any connection" check with proper graph traversal.
|
||||||
|
*/
|
||||||
|
private validateTriggerReachability(
|
||||||
|
workflow: WorkflowJson,
|
||||||
|
result: WorkflowValidationResult
|
||||||
|
): void {
|
||||||
|
// Build adjacency list (forward direction)
|
||||||
|
const adjacency = new Map<string, Set<string>>();
|
||||||
|
for (const [sourceName, outputs] of Object.entries(workflow.connections)) {
|
||||||
|
if (!adjacency.has(sourceName)) adjacency.set(sourceName, new Set());
|
||||||
|
for (const outputConns of Object.values(outputs)) {
|
||||||
|
if (Array.isArray(outputConns)) {
|
||||||
|
for (const conns of outputConns) {
|
||||||
|
if (!conns) continue;
|
||||||
|
for (const conn of conns) {
|
||||||
|
if (conn) {
|
||||||
|
adjacency.get(sourceName)!.add(conn.node);
|
||||||
|
// Also track that the target exists in the graph
|
||||||
|
if (!adjacency.has(conn.node)) adjacency.set(conn.node, new Set());
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Identify trigger nodes
|
||||||
|
const triggerNodes: string[] = [];
|
||||||
|
for (const node of workflow.nodes) {
|
||||||
|
if (isTriggerNode(node.type) && !node.disabled) {
|
||||||
|
triggerNodes.push(node.name);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// If no trigger nodes, fall back to simple orphaned check
|
||||||
|
if (triggerNodes.length === 0) {
|
||||||
|
this.flagOrphanedNodes(workflow, result);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
// BFS from all trigger nodes
|
||||||
|
const reachable = new Set<string>();
|
||||||
|
const queue: string[] = [...triggerNodes];
|
||||||
|
for (const t of triggerNodes) reachable.add(t);
|
||||||
|
|
||||||
|
while (queue.length > 0) {
|
||||||
|
const current = queue.shift()!;
|
||||||
|
const neighbors = adjacency.get(current);
|
||||||
|
if (neighbors) {
|
||||||
|
for (const neighbor of neighbors) {
|
||||||
|
if (!reachable.has(neighbor)) {
|
||||||
|
reachable.add(neighbor);
|
||||||
|
queue.push(neighbor);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Flag unreachable nodes
|
||||||
|
for (const node of workflow.nodes) {
|
||||||
|
if (node.disabled || isNonExecutableNode(node.type)) continue;
|
||||||
|
if (isTriggerNode(node.type)) continue;
|
||||||
|
|
||||||
|
if (!reachable.has(node.name)) {
|
||||||
|
result.warnings.push({
|
||||||
|
type: 'warning',
|
||||||
|
nodeId: node.id,
|
||||||
|
nodeName: node.name,
|
||||||
|
message: 'Node is not reachable from any trigger node'
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Check if workflow has cycles
|
* Check if workflow has cycles
|
||||||
* Allow legitimate loops for SplitInBatches and similar loop nodes
|
* Allow legitimate loops for SplitInBatches and similar loop nodes
|
||||||
@@ -1025,22 +1236,12 @@ export class WorkflowValidator {
|
|||||||
if (connections) {
|
if (connections) {
|
||||||
const allTargets: string[] = [];
|
const allTargets: string[] = [];
|
||||||
|
|
||||||
if (connections.main) {
|
for (const outputConns of Object.values(connections)) {
|
||||||
connections.main.flat().forEach(conn => {
|
if (Array.isArray(outputConns)) {
|
||||||
if (conn) allTargets.push(conn.node);
|
outputConns.flat().forEach(conn => {
|
||||||
});
|
if (conn) allTargets.push(conn.node);
|
||||||
}
|
});
|
||||||
|
}
|
||||||
if (connections.error) {
|
|
||||||
connections.error.flat().forEach(conn => {
|
|
||||||
if (conn) allTargets.push(conn.node);
|
|
||||||
});
|
|
||||||
}
|
|
||||||
|
|
||||||
if (connections.ai_tool) {
|
|
||||||
connections.ai_tool.flat().forEach(conn => {
|
|
||||||
if (conn) allTargets.push(conn.node);
|
|
||||||
});
|
|
||||||
}
|
}
|
||||||
|
|
||||||
const currentNodeType = nodeTypeMap.get(nodeName);
|
const currentNodeType = nodeTypeMap.get(nodeName);
|
||||||
|
|||||||
@@ -41,6 +41,7 @@ describe('DocumentationGenerator', () => {
|
|||||||
apiKey: 'test-key',
|
apiKey: 'test-key',
|
||||||
timeout: 30000,
|
timeout: 30000,
|
||||||
maxTokens: 1000,
|
maxTokens: 1000,
|
||||||
|
temperature: 0.3,
|
||||||
};
|
};
|
||||||
|
|
||||||
const validSummary = {
|
const validSummary = {
|
||||||
@@ -163,7 +164,7 @@ describe('DocumentationGenerator', () => {
|
|||||||
|
|
||||||
expect(mockCreate).toHaveBeenCalledWith({
|
expect(mockCreate).toHaveBeenCalledWith({
|
||||||
model: 'test-model',
|
model: 'test-model',
|
||||||
max_tokens: 1000,
|
max_completion_tokens: 1000,
|
||||||
temperature: 0.3,
|
temperature: 0.3,
|
||||||
messages: expect.arrayContaining([
|
messages: expect.arrayContaining([
|
||||||
expect.objectContaining({ role: 'system' }),
|
expect.objectContaining({ role: 'system' }),
|
||||||
@@ -680,7 +681,7 @@ describe('DocumentationGenerator', () => {
|
|||||||
|
|
||||||
expect(mockCreate).toHaveBeenCalledWith(
|
expect(mockCreate).toHaveBeenCalledWith(
|
||||||
expect.objectContaining({
|
expect.objectContaining({
|
||||||
max_tokens: 10,
|
max_completion_tokens: 200,
|
||||||
messages: [
|
messages: [
|
||||||
{
|
{
|
||||||
role: 'user',
|
role: 'user',
|
||||||
|
|||||||
300
tests/unit/mcp/coerce-stringified-params.test.ts
Normal file
300
tests/unit/mcp/coerce-stringified-params.test.ts
Normal file
@@ -0,0 +1,300 @@
|
|||||||
|
import { describe, it, expect, beforeEach, afterEach, vi } from 'vitest';
|
||||||
|
import { N8NDocumentationMCPServer } from '../../../src/mcp/server';
|
||||||
|
|
||||||
|
// Mock the database and dependencies
|
||||||
|
vi.mock('../../../src/database/database-adapter');
|
||||||
|
vi.mock('../../../src/database/node-repository');
|
||||||
|
vi.mock('../../../src/templates/template-service');
|
||||||
|
vi.mock('../../../src/utils/logger');
|
||||||
|
|
||||||
|
class TestableN8NMCPServer extends N8NDocumentationMCPServer {
|
||||||
|
public testCoerceStringifiedJsonParams(
|
||||||
|
toolName: string,
|
||||||
|
args: Record<string, any>
|
||||||
|
): Record<string, any> {
|
||||||
|
return (this as any).coerceStringifiedJsonParams(toolName, args);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
describe('coerceStringifiedJsonParams', () => {
|
||||||
|
let server: TestableN8NMCPServer;
|
||||||
|
|
||||||
|
beforeEach(() => {
|
||||||
|
process.env.NODE_DB_PATH = ':memory:';
|
||||||
|
server = new TestableN8NMCPServer();
|
||||||
|
});
|
||||||
|
|
||||||
|
afterEach(() => {
|
||||||
|
delete process.env.NODE_DB_PATH;
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('Object coercion', () => {
|
||||||
|
it('should coerce stringified object for validate_node config', () => {
|
||||||
|
const args = {
|
||||||
|
nodeType: 'nodes-base.slack',
|
||||||
|
config: '{"resource":"channel","operation":"create"}'
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('validate_node', args);
|
||||||
|
expect(result.config).toEqual({ resource: 'channel', operation: 'create' });
|
||||||
|
expect(result.nodeType).toBe('nodes-base.slack');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should coerce stringified object for n8n_create_workflow connections', () => {
|
||||||
|
const connections = { 'Webhook': { main: [[{ node: 'Slack', type: 'main', index: 0 }]] } };
|
||||||
|
const args = {
|
||||||
|
name: 'Test Workflow',
|
||||||
|
nodes: [{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook' }],
|
||||||
|
connections: JSON.stringify(connections)
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('n8n_create_workflow', args);
|
||||||
|
expect(result.connections).toEqual(connections);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should coerce stringified object for validate_workflow workflow param', () => {
|
||||||
|
const workflow = { nodes: [], connections: {} };
|
||||||
|
const args = {
|
||||||
|
workflow: JSON.stringify(workflow)
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('validate_workflow', args);
|
||||||
|
expect(result.workflow).toEqual(workflow);
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('Array coercion', () => {
|
||||||
|
it('should coerce stringified array for n8n_update_partial_workflow operations', () => {
|
||||||
|
const operations = [
|
||||||
|
{ type: 'addNode', node: { id: '1', name: 'Test', type: 'n8n-nodes-base.noOp' } }
|
||||||
|
];
|
||||||
|
const args = {
|
||||||
|
id: '123',
|
||||||
|
operations: JSON.stringify(operations)
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('n8n_update_partial_workflow', args);
|
||||||
|
expect(result.operations).toEqual(operations);
|
||||||
|
expect(result.id).toBe('123');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should coerce stringified array for n8n_autofix_workflow fixTypes', () => {
|
||||||
|
const fixTypes = ['expression-format', 'typeversion-correction'];
|
||||||
|
const args = {
|
||||||
|
id: '456',
|
||||||
|
fixTypes: JSON.stringify(fixTypes)
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('n8n_autofix_workflow', args);
|
||||||
|
expect(result.fixTypes).toEqual(fixTypes);
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('No-op cases', () => {
|
||||||
|
it('should not modify object params that are already objects', () => {
|
||||||
|
const config = { resource: 'channel', operation: 'create' };
|
||||||
|
const args = {
|
||||||
|
nodeType: 'nodes-base.slack',
|
||||||
|
config
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('validate_node', args);
|
||||||
|
expect(result.config).toEqual(config);
|
||||||
|
expect(result.config).toBe(config); // same reference
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should not modify string params even if they contain JSON', () => {
|
||||||
|
const args = {
|
||||||
|
query: '{"some":"json"}',
|
||||||
|
limit: 10
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('search_nodes', args);
|
||||||
|
expect(result.query).toBe('{"some":"json"}');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should not modify args for tools with no object/array params', () => {
|
||||||
|
const args = {
|
||||||
|
query: 'webhook',
|
||||||
|
limit: 20,
|
||||||
|
mode: 'OR'
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('search_nodes', args);
|
||||||
|
expect(result).toEqual(args);
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('Safety cases', () => {
|
||||||
|
it('should keep original string for invalid JSON', () => {
|
||||||
|
const args = {
|
||||||
|
nodeType: 'nodes-base.slack',
|
||||||
|
config: '{invalid json here}'
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('validate_node', args);
|
||||||
|
expect(result.config).toBe('{invalid json here}');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should not attempt parse when object param starts with [', () => {
|
||||||
|
const args = {
|
||||||
|
nodeType: 'nodes-base.slack',
|
||||||
|
config: '[1, 2, 3]'
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('validate_node', args);
|
||||||
|
expect(result.config).toBe('[1, 2, 3]');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should not attempt parse when array param starts with {', () => {
|
||||||
|
const args = {
|
||||||
|
id: '123',
|
||||||
|
operations: '{"not":"an array"}'
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('n8n_update_partial_workflow', args);
|
||||||
|
expect(result.operations).toBe('{"not":"an array"}');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should handle null args gracefully', () => {
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('validate_node', null as any);
|
||||||
|
expect(result).toBeNull();
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should handle undefined args gracefully', () => {
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('validate_node', undefined as any);
|
||||||
|
expect(result).toBeUndefined();
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should return args unchanged for unknown tool', () => {
|
||||||
|
const args = { config: '{"key":"value"}' };
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('nonexistent_tool', args);
|
||||||
|
expect(result).toEqual(args);
|
||||||
|
expect(result.config).toBe('{"key":"value"}');
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('Number coercion', () => {
|
||||||
|
it('should coerce string to number for search_nodes limit', () => {
|
||||||
|
const args = {
|
||||||
|
query: 'webhook',
|
||||||
|
limit: '10'
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('search_nodes', args);
|
||||||
|
expect(result.limit).toBe(10);
|
||||||
|
expect(result.query).toBe('webhook');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should coerce string to number for n8n_executions limit', () => {
|
||||||
|
const args = {
|
||||||
|
action: 'list',
|
||||||
|
limit: '50'
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('n8n_executions', args);
|
||||||
|
expect(result.limit).toBe(50);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should not coerce non-numeric string to number', () => {
|
||||||
|
const args = {
|
||||||
|
query: 'webhook',
|
||||||
|
limit: 'abc'
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('search_nodes', args);
|
||||||
|
expect(result.limit).toBe('abc');
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('Boolean coercion', () => {
|
||||||
|
it('should coerce "true" string to boolean', () => {
|
||||||
|
const args = {
|
||||||
|
query: 'webhook',
|
||||||
|
includeExamples: 'true'
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('search_nodes', args);
|
||||||
|
expect(result.includeExamples).toBe(true);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should coerce "false" string to boolean', () => {
|
||||||
|
const args = {
|
||||||
|
query: 'webhook',
|
||||||
|
includeExamples: 'false'
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('search_nodes', args);
|
||||||
|
expect(result.includeExamples).toBe(false);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should not coerce non-boolean string to boolean', () => {
|
||||||
|
const args = {
|
||||||
|
query: 'webhook',
|
||||||
|
includeExamples: 'yes'
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('search_nodes', args);
|
||||||
|
expect(result.includeExamples).toBe('yes');
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('Number-to-string coercion', () => {
|
||||||
|
it('should coerce number to string for n8n_get_workflow id', () => {
|
||||||
|
const args = {
|
||||||
|
id: 123,
|
||||||
|
mode: 'minimal'
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('n8n_get_workflow', args);
|
||||||
|
expect(result.id).toBe('123');
|
||||||
|
expect(result.mode).toBe('minimal');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should coerce boolean to string when string expected', () => {
|
||||||
|
const args = {
|
||||||
|
id: true
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('n8n_get_workflow', args);
|
||||||
|
expect(result.id).toBe('true');
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('End-to-end Claude Desktop scenario', () => {
|
||||||
|
it('should coerce all stringified params for n8n_create_workflow', () => {
|
||||||
|
const nodes = [
|
||||||
|
{
|
||||||
|
id: 'webhook_1',
|
||||||
|
name: 'Webhook',
|
||||||
|
type: 'n8n-nodes-base.webhook',
|
||||||
|
typeVersion: 1,
|
||||||
|
position: [250, 300],
|
||||||
|
parameters: { httpMethod: 'POST', path: 'slack-notify' }
|
||||||
|
},
|
||||||
|
{
|
||||||
|
id: 'slack_1',
|
||||||
|
name: 'Slack',
|
||||||
|
type: 'n8n-nodes-base.slack',
|
||||||
|
typeVersion: 1,
|
||||||
|
position: [450, 300],
|
||||||
|
parameters: { resource: 'message', operation: 'post', channel: '#general' }
|
||||||
|
}
|
||||||
|
];
|
||||||
|
const connections = {
|
||||||
|
'Webhook': { main: [[{ node: 'Slack', type: 'main', index: 0 }]] }
|
||||||
|
};
|
||||||
|
const settings = { executionOrder: 'v1', timezone: 'America/New_York' };
|
||||||
|
|
||||||
|
// Simulate Claude Desktop sending all object/array params as strings
|
||||||
|
const args = {
|
||||||
|
name: 'Webhook to Slack',
|
||||||
|
nodes: JSON.stringify(nodes),
|
||||||
|
connections: JSON.stringify(connections),
|
||||||
|
settings: JSON.stringify(settings)
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('n8n_create_workflow', args);
|
||||||
|
|
||||||
|
expect(result.name).toBe('Webhook to Slack');
|
||||||
|
expect(result.nodes).toEqual(nodes);
|
||||||
|
expect(result.connections).toEqual(connections);
|
||||||
|
expect(result.settings).toEqual(settings);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should handle mixed type mismatches from Claude Desktop', () => {
|
||||||
|
// Simulate Claude Desktop sending object params as strings
|
||||||
|
const args = {
|
||||||
|
nodeType: 'nodes-base.httpRequest',
|
||||||
|
config: '{"method":"GET","url":"https://example.com"}',
|
||||||
|
mode: 'full',
|
||||||
|
profile: 'ai-friendly'
|
||||||
|
};
|
||||||
|
const result = server.testCoerceStringifiedJsonParams('validate_node', args);
|
||||||
|
expect(result.config).toEqual({ method: 'GET', url: 'https://example.com' });
|
||||||
|
expect(result.nodeType).toBe('nodes-base.httpRequest');
|
||||||
|
expect(result.mode).toBe('full');
|
||||||
|
});
|
||||||
|
});
|
||||||
|
});
|
||||||
@@ -1067,7 +1067,7 @@ describe('WorkflowValidator - Comprehensive Tests', () => {
|
|||||||
|
|
||||||
const result = await validator.validateWorkflow(workflow as any);
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
expect(result.warnings.some(w => w.message.includes('Node is not connected to any other nodes') && w.nodeName === 'Orphaned')).toBe(true);
|
expect(result.warnings.some(w => w.message.includes('not reachable from any trigger node') && w.nodeName === 'Orphaned')).toBe(true);
|
||||||
});
|
});
|
||||||
|
|
||||||
it('should detect cycles in workflow', async () => {
|
it('should detect cycles in workflow', async () => {
|
||||||
@@ -1987,7 +1987,7 @@ describe('WorkflowValidator - Comprehensive Tests', () => {
|
|||||||
|
|
||||||
// Warnings
|
// Warnings
|
||||||
expect(result.warnings.some(w => w.message.includes('Connection to disabled node'))).toBe(true);
|
expect(result.warnings.some(w => w.message.includes('Connection to disabled node'))).toBe(true);
|
||||||
expect(result.warnings.some(w => w.message.includes('Node is not connected') && w.nodeName === 'Orphaned')).toBe(true);
|
expect(result.warnings.some(w => w.message.includes('not reachable from any trigger node') && w.nodeName === 'Orphaned')).toBe(true);
|
||||||
expect(result.warnings.some(w => w.message.includes('AI Agent has no tools connected'))).toBe(true);
|
expect(result.warnings.some(w => w.message.includes('AI Agent has no tools connected'))).toBe(true);
|
||||||
|
|
||||||
// Statistics
|
// Statistics
|
||||||
|
|||||||
718
tests/unit/services/workflow-validator-connections.test.ts
Normal file
718
tests/unit/services/workflow-validator-connections.test.ts
Normal file
@@ -0,0 +1,718 @@
|
|||||||
|
import { describe, it, expect, vi, beforeEach } from 'vitest';
|
||||||
|
import { WorkflowValidator } from '@/services/workflow-validator';
|
||||||
|
import { NodeRepository } from '@/database/node-repository';
|
||||||
|
import { EnhancedConfigValidator } from '@/services/enhanced-config-validator';
|
||||||
|
|
||||||
|
// Mock dependencies
|
||||||
|
vi.mock('@/database/node-repository');
|
||||||
|
vi.mock('@/services/enhanced-config-validator');
|
||||||
|
vi.mock('@/services/expression-validator');
|
||||||
|
vi.mock('@/utils/logger');
|
||||||
|
|
||||||
|
describe('WorkflowValidator - Connection Validation (#620)', () => {
|
||||||
|
let validator: WorkflowValidator;
|
||||||
|
let mockNodeRepository: NodeRepository;
|
||||||
|
|
||||||
|
beforeEach(() => {
|
||||||
|
vi.clearAllMocks();
|
||||||
|
|
||||||
|
mockNodeRepository = new NodeRepository({} as any) as any;
|
||||||
|
|
||||||
|
if (!mockNodeRepository.getAllNodes) {
|
||||||
|
mockNodeRepository.getAllNodes = vi.fn();
|
||||||
|
}
|
||||||
|
if (!mockNodeRepository.getNode) {
|
||||||
|
mockNodeRepository.getNode = vi.fn();
|
||||||
|
}
|
||||||
|
|
||||||
|
const nodeTypes: Record<string, any> = {
|
||||||
|
'nodes-base.webhook': {
|
||||||
|
type: 'nodes-base.webhook',
|
||||||
|
displayName: 'Webhook',
|
||||||
|
package: 'n8n-nodes-base',
|
||||||
|
isTrigger: true,
|
||||||
|
outputs: ['main'],
|
||||||
|
properties: [],
|
||||||
|
},
|
||||||
|
'nodes-base.manualTrigger': {
|
||||||
|
type: 'nodes-base.manualTrigger',
|
||||||
|
displayName: 'Manual Trigger',
|
||||||
|
package: 'n8n-nodes-base',
|
||||||
|
isTrigger: true,
|
||||||
|
outputs: ['main'],
|
||||||
|
properties: [],
|
||||||
|
},
|
||||||
|
'nodes-base.set': {
|
||||||
|
type: 'nodes-base.set',
|
||||||
|
displayName: 'Set',
|
||||||
|
package: 'n8n-nodes-base',
|
||||||
|
outputs: ['main'],
|
||||||
|
properties: [],
|
||||||
|
},
|
||||||
|
'nodes-base.code': {
|
||||||
|
type: 'nodes-base.code',
|
||||||
|
displayName: 'Code',
|
||||||
|
package: 'n8n-nodes-base',
|
||||||
|
outputs: ['main'],
|
||||||
|
properties: [],
|
||||||
|
},
|
||||||
|
'nodes-base.if': {
|
||||||
|
type: 'nodes-base.if',
|
||||||
|
displayName: 'IF',
|
||||||
|
package: 'n8n-nodes-base',
|
||||||
|
outputs: ['main', 'main'],
|
||||||
|
properties: [],
|
||||||
|
},
|
||||||
|
'nodes-base.googleSheets': {
|
||||||
|
type: 'nodes-base.googleSheets',
|
||||||
|
displayName: 'Google Sheets',
|
||||||
|
package: 'n8n-nodes-base',
|
||||||
|
outputs: ['main'],
|
||||||
|
properties: [],
|
||||||
|
},
|
||||||
|
'nodes-base.merge': {
|
||||||
|
type: 'nodes-base.merge',
|
||||||
|
displayName: 'Merge',
|
||||||
|
package: 'n8n-nodes-base',
|
||||||
|
outputs: ['main'],
|
||||||
|
properties: [],
|
||||||
|
},
|
||||||
|
'nodes-langchain.agent': {
|
||||||
|
type: 'nodes-langchain.agent',
|
||||||
|
displayName: 'AI Agent',
|
||||||
|
package: '@n8n/n8n-nodes-langchain',
|
||||||
|
isAITool: true,
|
||||||
|
outputs: ['main'],
|
||||||
|
properties: [],
|
||||||
|
},
|
||||||
|
};
|
||||||
|
|
||||||
|
vi.mocked(mockNodeRepository.getNode).mockImplementation((nodeType: string) => {
|
||||||
|
return nodeTypes[nodeType] || null;
|
||||||
|
});
|
||||||
|
vi.mocked(mockNodeRepository.getAllNodes).mockReturnValue(Object.values(nodeTypes));
|
||||||
|
|
||||||
|
validator = new WorkflowValidator(
|
||||||
|
mockNodeRepository,
|
||||||
|
EnhancedConfigValidator as any
|
||||||
|
);
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('Unknown output keys (P0)', () => {
|
||||||
|
it('should flag numeric string key "1" with index suggestion', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Save to Google Sheets', type: 'n8n-nodes-base.googleSheets', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Format Error', type: 'n8n-nodes-base.set', position: [400, 0], parameters: {} },
|
||||||
|
{ id: '4', name: 'Success Response', type: 'n8n-nodes-base.set', position: [400, 200], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Save to Google Sheets', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'Save to Google Sheets': {
|
||||||
|
'1': [[{ node: 'Format Error', type: '0', index: 0 }]],
|
||||||
|
main: [[{ node: 'Success Response', type: 'main', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const unknownKeyError = result.errors.find(e => e.code === 'UNKNOWN_CONNECTION_KEY');
|
||||||
|
expect(unknownKeyError).toBeDefined();
|
||||||
|
expect(unknownKeyError!.message).toContain('Unknown connection output key "1"');
|
||||||
|
expect(unknownKeyError!.message).toContain('use main[1] instead');
|
||||||
|
expect(unknownKeyError!.nodeName).toBe('Save to Google Sheets');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should flag random string key "output"', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Code', type: 'n8n-nodes-base.code', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Set', type: 'n8n-nodes-base.set', position: [400, 0], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Code', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'Code': {
|
||||||
|
output: [[{ node: 'Set', type: 'main', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const unknownKeyError = result.errors.find(e => e.code === 'UNKNOWN_CONNECTION_KEY');
|
||||||
|
expect(unknownKeyError).toBeDefined();
|
||||||
|
expect(unknownKeyError!.message).toContain('Unknown connection output key "output"');
|
||||||
|
// Should NOT have index suggestion for non-numeric key
|
||||||
|
expect(unknownKeyError!.message).not.toContain('use main[');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should accept valid keys: main, error, ai_tool', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Code', type: 'n8n-nodes-base.code', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Set', type: 'n8n-nodes-base.set', position: [400, 0], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Code', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'Code': {
|
||||||
|
main: [[{ node: 'Set', type: 'main', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const unknownKeyErrors = result.errors.filter(e => e.code === 'UNKNOWN_CONNECTION_KEY');
|
||||||
|
expect(unknownKeyErrors).toHaveLength(0);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should accept AI connection types as valid keys', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Chat Trigger', type: 'n8n-nodes-base.chatTrigger', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'AI Agent', type: 'nodes-langchain.agent', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'LLM', type: 'nodes-langchain.lmChatOpenAi', position: [200, 200], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Chat Trigger': {
|
||||||
|
main: [[{ node: 'AI Agent', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'LLM': {
|
||||||
|
ai_languageModel: [[{ node: 'AI Agent', type: 'ai_languageModel', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const unknownKeyErrors = result.errors.filter(e => e.code === 'UNKNOWN_CONNECTION_KEY');
|
||||||
|
expect(unknownKeyErrors).toHaveLength(0);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should flag multiple unknown keys on the same node', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Code', type: 'n8n-nodes-base.code', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Set1', type: 'n8n-nodes-base.set', position: [400, 0], parameters: {} },
|
||||||
|
{ id: '4', name: 'Set2', type: 'n8n-nodes-base.set', position: [400, 200], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Code', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'Code': {
|
||||||
|
'0': [[{ node: 'Set1', type: 'main', index: 0 }]],
|
||||||
|
'1': [[{ node: 'Set2', type: 'main', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const unknownKeyErrors = result.errors.filter(e => e.code === 'UNKNOWN_CONNECTION_KEY');
|
||||||
|
expect(unknownKeyErrors).toHaveLength(2);
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('Invalid type field (P0)', () => {
|
||||||
|
it('should flag numeric type "0" in connection target', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Sheets', type: 'n8n-nodes-base.googleSheets', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Error Handler', type: 'n8n-nodes-base.set', position: [400, 0], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Sheets', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'Sheets': {
|
||||||
|
main: [[{ node: 'Error Handler', type: '0', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const typeError = result.errors.find(e => e.code === 'INVALID_CONNECTION_TYPE');
|
||||||
|
expect(typeError).toBeDefined();
|
||||||
|
expect(typeError!.message).toContain('Invalid connection type "0"');
|
||||||
|
expect(typeError!.message).toContain('Numeric types are not valid');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should flag invented type "output"', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Code', type: 'n8n-nodes-base.code', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Set', type: 'n8n-nodes-base.set', position: [400, 0], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Code', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'Code': {
|
||||||
|
main: [[{ node: 'Set', type: 'output', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const typeError = result.errors.find(e => e.code === 'INVALID_CONNECTION_TYPE');
|
||||||
|
expect(typeError).toBeDefined();
|
||||||
|
expect(typeError!.message).toContain('Invalid connection type "output"');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should accept valid type "main"', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Set', type: 'n8n-nodes-base.set', position: [200, 0], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Set', type: 'main', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const typeErrors = result.errors.filter(e => e.code === 'INVALID_CONNECTION_TYPE');
|
||||||
|
expect(typeErrors).toHaveLength(0);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should accept AI connection types in type field', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Chat Trigger', type: 'n8n-nodes-base.chatTrigger', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'AI Agent', type: 'nodes-langchain.agent', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Memory', type: 'nodes-langchain.memoryBufferWindow', position: [200, 200], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Chat Trigger': {
|
||||||
|
main: [[{ node: 'AI Agent', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'Memory': {
|
||||||
|
ai_memory: [[{ node: 'AI Agent', type: 'ai_memory', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const typeErrors = result.errors.filter(e => e.code === 'INVALID_CONNECTION_TYPE');
|
||||||
|
expect(typeErrors).toHaveLength(0);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should catch the real-world example from issue #620', async () => {
|
||||||
|
// Exact reproduction of the bug reported in the issue
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Save to Google Sheets', type: 'n8n-nodes-base.googleSheets', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Format AI Integration Error', type: 'n8n-nodes-base.set', position: [400, 0], parameters: {} },
|
||||||
|
{ id: '4', name: 'Webhook Success Response', type: 'n8n-nodes-base.set', position: [400, 200], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Save to Google Sheets', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'Save to Google Sheets': {
|
||||||
|
'1': [[{ node: 'Format AI Integration Error', type: '0', index: 0 }]],
|
||||||
|
main: [[{ node: 'Webhook Success Response', type: 'main', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
// Should detect both bugs
|
||||||
|
const unknownKeyError = result.errors.find(e => e.code === 'UNKNOWN_CONNECTION_KEY');
|
||||||
|
expect(unknownKeyError).toBeDefined();
|
||||||
|
expect(unknownKeyError!.message).toContain('"1"');
|
||||||
|
expect(unknownKeyError!.message).toContain('use main[1] instead');
|
||||||
|
|
||||||
|
// The type "0" error won't appear since the "1" key is unknown and skipped,
|
||||||
|
// but the error count should reflect the invalid connection
|
||||||
|
expect(result.statistics.invalidConnections).toBeGreaterThanOrEqual(1);
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('Output index bounds checking (P1)', () => {
|
||||||
|
it('should flag Code node with main[1] (only has 1 output)', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Code', type: 'n8n-nodes-base.code', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Success', type: 'n8n-nodes-base.set', position: [400, 0], parameters: {} },
|
||||||
|
{ id: '4', name: 'Error', type: 'n8n-nodes-base.set', position: [400, 200], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Code', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'Code': {
|
||||||
|
main: [
|
||||||
|
[{ node: 'Success', type: 'main', index: 0 }],
|
||||||
|
[{ node: 'Error', type: 'main', index: 0 }] // main[1] - out of bounds
|
||||||
|
]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const boundsError = result.errors.find(e => e.code === 'OUTPUT_INDEX_OUT_OF_BOUNDS');
|
||||||
|
expect(boundsError).toBeDefined();
|
||||||
|
expect(boundsError!.message).toContain('Output index 1');
|
||||||
|
expect(boundsError!.message).toContain('Code');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should accept IF node with main[0] and main[1] (2 outputs)', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'IF', type: 'n8n-nodes-base.if', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'True', type: 'n8n-nodes-base.set', position: [400, 0], parameters: {} },
|
||||||
|
{ id: '4', name: 'False', type: 'n8n-nodes-base.set', position: [400, 200], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'IF', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'IF': {
|
||||||
|
main: [
|
||||||
|
[{ node: 'True', type: 'main', index: 0 }],
|
||||||
|
[{ node: 'False', type: 'main', index: 0 }]
|
||||||
|
]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const boundsErrors = result.errors.filter(e => e.code === 'OUTPUT_INDEX_OUT_OF_BOUNDS');
|
||||||
|
expect(boundsErrors).toHaveLength(0);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should flag IF node with main[2] (only 2 outputs)', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'IF', type: 'n8n-nodes-base.if', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'True', type: 'n8n-nodes-base.set', position: [400, 0], parameters: {} },
|
||||||
|
{ id: '4', name: 'False', type: 'n8n-nodes-base.set', position: [400, 200], parameters: {} },
|
||||||
|
{ id: '5', name: 'Extra', type: 'n8n-nodes-base.set', position: [400, 400], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'IF', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'IF': {
|
||||||
|
main: [
|
||||||
|
[{ node: 'True', type: 'main', index: 0 }],
|
||||||
|
[{ node: 'False', type: 'main', index: 0 }],
|
||||||
|
[{ node: 'Extra', type: 'main', index: 0 }] // main[2] - out of bounds
|
||||||
|
]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const boundsError = result.errors.find(e => e.code === 'OUTPUT_INDEX_OUT_OF_BOUNDS');
|
||||||
|
expect(boundsError).toBeDefined();
|
||||||
|
expect(boundsError!.message).toContain('Output index 2');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should allow extra output when onError is continueErrorOutput', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Code', type: 'n8n-nodes-base.code', position: [200, 0], parameters: {}, onError: 'continueErrorOutput' as const },
|
||||||
|
{ id: '3', name: 'Success', type: 'n8n-nodes-base.set', position: [400, 0], parameters: {} },
|
||||||
|
{ id: '4', name: 'Error', type: 'n8n-nodes-base.set', position: [400, 200], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Code', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'Code': {
|
||||||
|
main: [
|
||||||
|
[{ node: 'Success', type: 'main', index: 0 }],
|
||||||
|
[{ node: 'Error', type: 'main', index: 0 }] // Error output - allowed with continueErrorOutput
|
||||||
|
]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const boundsErrors = result.errors.filter(e => e.code === 'OUTPUT_INDEX_OUT_OF_BOUNDS');
|
||||||
|
expect(boundsErrors).toHaveLength(0);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should skip bounds check for unknown node types', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Custom', type: 'n8n-nodes-community.customNode', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Set1', type: 'n8n-nodes-base.set', position: [400, 0], parameters: {} },
|
||||||
|
{ id: '4', name: 'Set2', type: 'n8n-nodes-base.set', position: [400, 200], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Custom', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'Custom': {
|
||||||
|
main: [
|
||||||
|
[{ node: 'Set1', type: 'main', index: 0 }],
|
||||||
|
[{ node: 'Set2', type: 'main', index: 0 }]
|
||||||
|
]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const boundsErrors = result.errors.filter(e => e.code === 'OUTPUT_INDEX_OUT_OF_BOUNDS');
|
||||||
|
expect(boundsErrors).toHaveLength(0);
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('Input index bounds checking (P1)', () => {
|
||||||
|
it('should accept regular node with index 0', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Set', type: 'n8n-nodes-base.set', position: [200, 0], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Set', type: 'main', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const inputErrors = result.errors.filter(e => e.code === 'INPUT_INDEX_OUT_OF_BOUNDS');
|
||||||
|
expect(inputErrors).toHaveLength(0);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should flag regular node with index 1 (only 1 input)', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Code', type: 'n8n-nodes-base.code', position: [200, 0], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Code', type: 'main', index: 1 }]] // index 1 - out of bounds
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const inputError = result.errors.find(e => e.code === 'INPUT_INDEX_OUT_OF_BOUNDS');
|
||||||
|
expect(inputError).toBeDefined();
|
||||||
|
expect(inputError!.message).toContain('Input index 1');
|
||||||
|
expect(inputError!.message).toContain('Code');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should accept Merge node with index 1 (has 2 inputs)', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Set1', type: 'n8n-nodes-base.set', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Set2', type: 'n8n-nodes-base.set', position: [200, 200], parameters: {} },
|
||||||
|
{ id: '4', name: 'Merge', type: 'n8n-nodes-base.merge', position: [400, 100], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Set1', type: 'main', index: 0 }, { node: 'Set2', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'Set1': {
|
||||||
|
main: [[{ node: 'Merge', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'Set2': {
|
||||||
|
main: [[{ node: 'Merge', type: 'main', index: 1 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const inputErrors = result.errors.filter(e => e.code === 'INPUT_INDEX_OUT_OF_BOUNDS');
|
||||||
|
expect(inputErrors).toHaveLength(0);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should skip bounds check for unknown node types', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Custom', type: 'n8n-nodes-community.unknownNode', position: [200, 0], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Custom', type: 'main', index: 5 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const inputErrors = result.errors.filter(e => e.code === 'INPUT_INDEX_OUT_OF_BOUNDS');
|
||||||
|
expect(inputErrors).toHaveLength(0);
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('Trigger reachability analysis (P2)', () => {
|
||||||
|
it('should flag nodes in disconnected subgraph as unreachable', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Connected', type: 'n8n-nodes-base.set', position: [200, 0], parameters: {} },
|
||||||
|
// Disconnected subgraph - two nodes connected to each other but not reachable from trigger
|
||||||
|
{ id: '3', name: 'Island1', type: 'n8n-nodes-base.code', position: [0, 300], parameters: {} },
|
||||||
|
{ id: '4', name: 'Island2', type: 'n8n-nodes-base.set', position: [200, 300], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Connected', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'Island1': {
|
||||||
|
main: [[{ node: 'Island2', type: 'main', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
// Both Island1 and Island2 should be flagged as unreachable
|
||||||
|
const unreachable = result.warnings.filter(w => w.message.includes('not reachable from any trigger'));
|
||||||
|
expect(unreachable.length).toBe(2);
|
||||||
|
expect(unreachable.some(w => w.nodeName === 'Island1')).toBe(true);
|
||||||
|
expect(unreachable.some(w => w.nodeName === 'Island2')).toBe(true);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should pass when all nodes are reachable from trigger', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Code', type: 'n8n-nodes-base.code', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Set', type: 'n8n-nodes-base.set', position: [400, 0], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Code', type: 'main', index: 0 }]]
|
||||||
|
},
|
||||||
|
'Code': {
|
||||||
|
main: [[{ node: 'Set', type: 'main', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const unreachable = result.warnings.filter(w => w.message.includes('not reachable'));
|
||||||
|
expect(unreachable).toHaveLength(0);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should flag single orphaned node as unreachable', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Set', type: 'n8n-nodes-base.set', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Orphaned', type: 'n8n-nodes-base.code', position: [500, 500], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Set', type: 'main', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const unreachable = result.warnings.filter(w => w.message.includes('not reachable') && w.nodeName === 'Orphaned');
|
||||||
|
expect(unreachable).toHaveLength(1);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should not flag disabled nodes', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Set', type: 'n8n-nodes-base.set', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Disabled', type: 'n8n-nodes-base.code', position: [500, 500], parameters: {}, disabled: true },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Set', type: 'main', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const unreachable = result.warnings.filter(w => w.nodeName === 'Disabled');
|
||||||
|
expect(unreachable).toHaveLength(0);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should not flag sticky notes', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Webhook', type: 'n8n-nodes-base.webhook', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Set', type: 'n8n-nodes-base.set', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Note', type: 'n8n-nodes-base.stickyNote', position: [500, 500], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Webhook': {
|
||||||
|
main: [[{ node: 'Set', type: 'main', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
const unreachable = result.warnings.filter(w => w.nodeName === 'Note');
|
||||||
|
expect(unreachable).toHaveLength(0);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should use simple orphan check when no triggers exist', async () => {
|
||||||
|
const workflow = {
|
||||||
|
nodes: [
|
||||||
|
{ id: '1', name: 'Set1', type: 'n8n-nodes-base.set', position: [0, 0], parameters: {} },
|
||||||
|
{ id: '2', name: 'Set2', type: 'n8n-nodes-base.set', position: [200, 0], parameters: {} },
|
||||||
|
{ id: '3', name: 'Orphan', type: 'n8n-nodes-base.code', position: [500, 500], parameters: {} },
|
||||||
|
],
|
||||||
|
connections: {
|
||||||
|
'Set1': {
|
||||||
|
main: [[{ node: 'Set2', type: 'main', index: 0 }]]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
const result = await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
|
// Orphan should still be flagged with the simple "not connected" message
|
||||||
|
const orphanWarning = result.warnings.find(w => w.nodeName === 'Orphan');
|
||||||
|
expect(orphanWarning).toBeDefined();
|
||||||
|
expect(orphanWarning!.message).toContain('not connected to any other nodes');
|
||||||
|
});
|
||||||
|
});
|
||||||
|
});
|
||||||
@@ -291,7 +291,7 @@ describe('WorkflowValidator - Expression Format Validation', () => {
|
|||||||
});
|
});
|
||||||
|
|
||||||
describe('Real-world workflow examples', () => {
|
describe('Real-world workflow examples', () => {
|
||||||
it('should validate Email workflow with expression issues', async () => {
|
it.skip('should validate Email workflow with expression issues', async () => {
|
||||||
const workflow = {
|
const workflow = {
|
||||||
name: 'Error Notification Workflow',
|
name: 'Error Notification Workflow',
|
||||||
nodes: [
|
nodes: [
|
||||||
@@ -342,7 +342,7 @@ describe('WorkflowValidator - Expression Format Validation', () => {
|
|||||||
expect(fromEmailError?.message).toContain('={{ $env.ADMIN_EMAIL }}');
|
expect(fromEmailError?.message).toContain('={{ $env.ADMIN_EMAIL }}');
|
||||||
});
|
});
|
||||||
|
|
||||||
it('should validate GitHub workflow with resource locator issues', async () => {
|
it.skip('should validate GitHub workflow with resource locator issues', async () => {
|
||||||
const workflow = {
|
const workflow = {
|
||||||
name: 'GitHub Issue Handler',
|
name: 'GitHub Issue Handler',
|
||||||
nodes: [
|
nodes: [
|
||||||
|
|||||||
@@ -646,9 +646,10 @@ describe('WorkflowValidator - Mock-based Unit Tests', () => {
|
|||||||
await validator.validateWorkflow(workflow as any);
|
await validator.validateWorkflow(workflow as any);
|
||||||
|
|
||||||
// Should have called getNode for each node type (normalized to short form)
|
// Should have called getNode for each node type (normalized to short form)
|
||||||
|
// Called during node validation + output/input index bounds checking
|
||||||
expect(mockGetNode).toHaveBeenCalledWith('nodes-base.httpRequest');
|
expect(mockGetNode).toHaveBeenCalledWith('nodes-base.httpRequest');
|
||||||
expect(mockGetNode).toHaveBeenCalledWith('nodes-base.set');
|
expect(mockGetNode).toHaveBeenCalledWith('nodes-base.set');
|
||||||
expect(mockGetNode).toHaveBeenCalledTimes(2);
|
expect(mockGetNode.mock.calls.length).toBeGreaterThanOrEqual(2);
|
||||||
});
|
});
|
||||||
|
|
||||||
it('should handle repository errors gracefully', async () => {
|
it('should handle repository errors gracefully', async () => {
|
||||||
|
|||||||
Reference in New Issue
Block a user