mirror of
https://github.com/czlonkowski/n8n-mcp.git
synced 2026-03-20 09:23:07 +00:00
* feat: add comprehensive telemetry for partial workflow updates

  Implement telemetry infrastructure to track workflow mutations from partial update operations. This enables data-driven improvements to partial update tooling by capturing:
  - Workflow state before and after mutations
  - User intent and operation patterns
  - Validation results and improvements
  - Change metrics (nodes/connections modified)
  - Success/failure rates and error patterns

  New components:
  - Intent classifier: categorizes mutation patterns
  - Intent sanitizer: removes PII from user instructions
  - Mutation validator: ensures data quality before tracking
  - Mutation tracker: coordinates validation and metric calculation

  Extended components:
  - TelemetryManager: new trackWorkflowMutation() method
  - EventTracker: mutation queue management
  - BatchProcessor: mutation data flushing to Supabase

  MCP tool enhancements:
  - n8n_update_partial_workflow: added optional 'intent' parameter
  - n8n_update_full_workflow: added optional 'intent' parameter
  - Both tools now track mutations asynchronously

  Database schema:
  - New workflow_mutations table with 20+ fields
  - Comprehensive indexes for efficient querying
  - Supports deduplication and data analysis

  This telemetry system is:
  - Privacy-focused (PII sanitization, anonymized users)
  - Non-blocking (async tracking, silent failures)
  - Production-ready (batching, retries, circuit breaker)
  - Backward compatible (all parameters optional)

  Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en

* fix: correct SQL syntax for expression index in workflow_mutations schema

  The expression index for significant changes needs double parentheses around the arithmetic expression to be valid PostgreSQL syntax.
* fix: enable RLS policies for workflow_mutations table

  Enable Row-Level Security and add policies:
  - Allow anonymous (anon) inserts for telemetry data collection
  - Allow authenticated reads for data analysis and querying

  These policies are required for the telemetry system to function correctly with Supabase, as the MCP server uses the anon key to insert mutation data.

* fix: reduce mutation auto-flush threshold from 5 to 2

  Lower the auto-flush threshold for workflow mutations from 5 to 2 to ensure more timely data persistence. Since mutations are less frequent than regular telemetry events, a lower threshold provides:
  - Faster data persistence (no waiting for 5 mutations)
  - Better testing experience (easier to verify with fewer operations)
  - Reduced risk of data loss if the process exits before the threshold is reached
  - More responsive telemetry for low-volume mutation scenarios

  This complements the existing 5-second periodic flush and process exit handlers, ensuring mutations are persisted promptly.

* fix: improve mutation telemetry error logging and diagnostics

  Changes:
  - Upgrade error logging from debug to warn level for better visibility
  - Add diagnostic logging to track mutation processing
  - Log the telemetry disabled state explicitly
  - Add context info (sessionId, intent, operationCount) to error logs
  - Remove 'await' from telemetry calls to make them truly non-blocking

  This will help identify why mutations aren't being persisted to the workflow_mutations table despite successful workflow operations.

* feat: enhance workflow mutation telemetry for better AI responses

  Improve workflow mutation tracking to capture comprehensive data that helps provide better responses when users update workflows.
  This enhancement collects workflow state, user intent, and operation details to enable more context-aware assistance.

  Key improvements:
  - Reduce the auto-flush threshold from 5 to 2 for more reliable mutation tracking
  - Add comprehensive workflow and credential sanitization to the mutation tracker
  - Document the intent parameter in workflow update tools for better UX
  - Fix mutation queue handling in the telemetry manager (flush now handles 3 queues)
  - Add extensive unit tests for mutation tracking and validation (35 new tests)

  Technical changes:
  - mutation-tracker.ts: multi-layer sanitization (workflow, node, and parameter levels)
  - batch-processor.ts: support mutation data flushing to Supabase
  - telemetry-manager.ts: auto-flush mutations at threshold 2; track the mutations queue
  - handlers-workflow-diff.ts: track workflow mutations with sanitized data
  - Tests: 13 tests for mutation-tracker, 22 tests for mutation-validator

  The intent parameter messaging emphasizes user benefit ("helps to return better response") rather than technical implementation details.

  🤖 Generated with [Claude Code](https://claude.com/claude-code)
  Co-Authored-By: Claude <noreply@anthropic.com>

* chore: bump version to 2.22.16 with telemetry changelog

  Updated package.json and package.runtime.json to version 2.22.16. Added a comprehensive CHANGELOG entry documenting workflow mutation telemetry enhancements for better AI-powered workflow assistance.
* fix: resolve TypeScript lint errors in telemetry tests

  Fixed type issues in mutation-tracker and mutation-validator tests:
  - Import and use the MutationToolName enum instead of string literals
  - Fix ValidationResult.errors to use the proper object structure
  - Add an UpdateNodeOperation type assertion for the operation with nodeName

  All TypeScript errors resolved; lint now passes.

---------

Co-authored-by: Claude <noreply@anthropic.com>
238 lines
6.2 KiB
TypeScript
/**
 * Data quality validator for workflow mutations.
 * Ensures mutation data meets quality standards before tracking.
 */

import { createHash } from 'crypto';
import {
  WorkflowMutationData,
  MutationDataQualityResult,
  MutationTrackingOptions,
} from './mutation-types.js';

/**
 * Default options for mutation tracking
 */
export const DEFAULT_MUTATION_TRACKING_OPTIONS: Required<MutationTrackingOptions> = {
  enabled: true,
  maxWorkflowSizeKb: 500,
  validateQuality: true,
  sanitize: true,
};

/**
 * Validates workflow mutation data quality
 */
export class MutationValidator {
  private options: Required<MutationTrackingOptions>;

  constructor(options: MutationTrackingOptions = {}) {
    this.options = { ...DEFAULT_MUTATION_TRACKING_OPTIONS, ...options };
  }
  /**
   * Validate mutation data quality
   */
  validate(data: WorkflowMutationData): MutationDataQualityResult {
    const errors: string[] = [];
    const warnings: string[] = [];

    // Check workflow structure
    if (!this.isValidWorkflow(data.workflowBefore)) {
      errors.push('Invalid workflow_before structure');
    }
    if (!this.isValidWorkflow(data.workflowAfter)) {
      errors.push('Invalid workflow_after structure');
    }

    // Check workflow size
    const beforeSizeKb = this.getWorkflowSizeKb(data.workflowBefore);
    const afterSizeKb = this.getWorkflowSizeKb(data.workflowAfter);

    if (beforeSizeKb > this.options.maxWorkflowSizeKb) {
      errors.push(
        `workflow_before size (${beforeSizeKb}KB) exceeds maximum (${this.options.maxWorkflowSizeKb}KB)`
      );
    }
    if (afterSizeKb > this.options.maxWorkflowSizeKb) {
      errors.push(
        `workflow_after size (${afterSizeKb}KB) exceeds maximum (${this.options.maxWorkflowSizeKb}KB)`
      );
    }

    // Check for meaningful change
    if (!this.hasMeaningfulChange(data.workflowBefore, data.workflowAfter)) {
      warnings.push('No meaningful change detected between before and after workflows');
    }

    // Check intent quality
    if (!data.userIntent || data.userIntent.trim().length === 0) {
      warnings.push('User intent is empty');
    } else if (data.userIntent.trim().length < 5) {
      warnings.push('User intent is too short (less than 5 characters)');
    } else if (data.userIntent.length > 1000) {
      warnings.push('User intent is very long (over 1000 characters)');
    }

    // Check operations
    if (!data.operations || data.operations.length === 0) {
      errors.push('No operations provided');
    }

    // Check validation data consistency
    if (data.validationBefore && data.validationAfter) {
      if (typeof data.validationBefore.valid !== 'boolean') {
        warnings.push('Invalid validation_before structure');
      }
      if (typeof data.validationAfter.valid !== 'boolean') {
        warnings.push('Invalid validation_after structure');
      }
    }

    // Check duration sanity
    if (data.durationMs !== undefined) {
      if (data.durationMs < 0) {
        errors.push('Duration cannot be negative');
      }
      if (data.durationMs > 300000) {
        // 5 minutes
        warnings.push('Duration is very long (over 5 minutes)');
      }
    }

    return {
      valid: errors.length === 0,
      errors,
      warnings,
    };
  }
  /**
   * Check if a workflow has a valid structure
   */
  private isValidWorkflow(workflow: any): boolean {
    if (!workflow || typeof workflow !== 'object') {
      return false;
    }

    // Must have a nodes array
    if (!Array.isArray(workflow.nodes)) {
      return false;
    }

    // Must have a connections object
    if (!workflow.connections || typeof workflow.connections !== 'object') {
      return false;
    }

    return true;
  }

  /**
   * Get workflow size in KB
   */
  private getWorkflowSizeKb(workflow: any): number {
    try {
      const json = JSON.stringify(workflow);
      return json.length / 1024;
    } catch {
      return 0;
    }
  }

  /**
   * Check if there's a meaningful change between workflows
   */
  private hasMeaningfulChange(workflowBefore: any, workflowAfter: any): boolean {
    try {
      // Compare hashes
      const hashBefore = this.hashWorkflow(workflowBefore);
      const hashAfter = this.hashWorkflow(workflowAfter);

      return hashBefore !== hashAfter;
    } catch {
      return false;
    }
  }

  /**
   * Hash a workflow for comparison
   */
  hashWorkflow(workflow: any): string {
    try {
      const json = JSON.stringify(workflow);
      return createHash('sha256').update(json).digest('hex').substring(0, 16);
    } catch {
      return '';
    }
  }
  /**
   * Check if a mutation should be excluded from tracking
   */
  shouldExclude(data: WorkflowMutationData): boolean {
    // Exclude if not successful and no error message
    if (!data.mutationSuccess && !data.mutationError) {
      return true;
    }

    // Exclude if workflows are identical
    if (!this.hasMeaningfulChange(data.workflowBefore, data.workflowAfter)) {
      return true;
    }

    // Exclude if workflow size exceeds limits
    const beforeSizeKb = this.getWorkflowSizeKb(data.workflowBefore);
    const afterSizeKb = this.getWorkflowSizeKb(data.workflowAfter);

    if (
      beforeSizeKb > this.options.maxWorkflowSizeKb ||
      afterSizeKb > this.options.maxWorkflowSizeKb
    ) {
      return true;
    }

    return false;
  }

  /**
   * Check for a duplicate mutation (same hashes + operations)
   */
  isDuplicate(
    workflowBefore: any,
    workflowAfter: any,
    operations: any[],
    recentMutations: Array<{ hashBefore: string; hashAfter: string; operations: any[] }>
  ): boolean {
    const hashBefore = this.hashWorkflow(workflowBefore);
    const hashAfter = this.hashWorkflow(workflowAfter);
    const operationsHash = this.hashOperations(operations);

    return recentMutations.some(
      (m) =>
        m.hashBefore === hashBefore &&
        m.hashAfter === hashAfter &&
        this.hashOperations(m.operations) === operationsHash
    );
  }

  /**
   * Hash operations for deduplication
   */
  private hashOperations(operations: any[]): string {
    try {
      const json = JSON.stringify(operations);
      return createHash('sha256').update(json).digest('hex').substring(0, 16);
    } catch {
      return '';
    }
  }
}

/**
 * Singleton instance for easy access
 */
export const mutationValidator = new MutationValidator();
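As a rough usage sketch, the hash-comparison idea behind `hasMeaningfulChange()` and `hashWorkflow()` above can be exercised standalone; the sample workflow objects below are made-up minimal fixtures, not real n8n workflows:

```typescript
import { createHash } from 'crypto';

// Standalone version of the validator's hashing: two workflows count as
// meaningfully changed when their truncated SHA-256 JSON hashes differ.
function hashWorkflow(workflow: unknown): string {
  return createHash('sha256').update(JSON.stringify(workflow)).digest('hex').substring(0, 16);
}

const before = { nodes: [{ id: 'a', type: 'n8n-nodes-base.set' }], connections: {} };
const after = {
  nodes: [
    { id: 'a', type: 'n8n-nodes-base.set' },
    { id: 'b', type: 'n8n-nodes-base.httpRequest' },
  ],
  connections: {},
};

const changed = hashWorkflow(before) !== hashWorkflow(after);    // a node was added
const unchanged = hashWorkflow(before) !== hashWorkflow(before); // identical input
console.log(changed, unchanged);
```

Truncating to 16 hex characters keeps the stored hashes compact while remaining more than adequate for deduplicating a small window of recent mutations.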