n8n-mcp/TELEMETRY_ANALYSIS_README.md
Romuald Członkowski 99c5907b71 feat: enhance workflow mutation telemetry for better AI responses (#419)
* feat: add comprehensive telemetry for partial workflow updates

Implement telemetry infrastructure to track workflow mutations from
partial update operations. This enables data-driven improvements to
partial update tooling by capturing:

- Workflow state before and after mutations
- User intent and operation patterns
- Validation results and improvements
- Change metrics (nodes/connections modified)
- Success/failure rates and error patterns

New Components:
- Intent classifier: Categorizes mutation patterns
- Intent sanitizer: Removes PII from user instructions
- Mutation validator: Ensures data quality before tracking
- Mutation tracker: Coordinates validation and metric calculation

Extended Components:
- TelemetryManager: New trackWorkflowMutation() method
- EventTracker: Mutation queue management
- BatchProcessor: Mutation data flushing to Supabase

MCP Tool Enhancements:
- n8n_update_partial_workflow: Added optional 'intent' parameter
- n8n_update_full_workflow: Added optional 'intent' parameter
- Both tools now track mutations asynchronously

Database Schema:
- New workflow_mutations table with 20+ fields
- Comprehensive indexes for efficient querying
- Supports deduplication and data analysis

This telemetry system is:
- Privacy-focused (PII sanitization, anonymized users)
- Non-blocking (async tracking, silent failures)
- Production-ready (batching, retries, circuit breaker)
- Backward compatible (all parameters optional)

Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en

* fix: correct SQL syntax for expression index in workflow_mutations schema

The expression index for significant changes needs double parentheses
around the arithmetic expression to be valid PostgreSQL syntax.

Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en

* fix: enable RLS policies for workflow_mutations table

Enable Row-Level Security and add policies:
- Allow anonymous (anon) inserts for telemetry data collection
- Allow authenticated reads for data analysis and querying

These policies are required for the telemetry system to function
correctly with Supabase, as the MCP server uses the anon key to
insert mutation data.

Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en

* fix: reduce mutation auto-flush threshold from 5 to 2

Lower the auto-flush threshold for workflow mutations from 5 to 2 to ensure
more timely data persistence. Since mutations are less frequent than regular
telemetry events, a lower threshold provides:

- Faster data persistence (don't wait for 5 mutations)
- Better testing experience (easier to verify with fewer operations)
- Reduced risk of data loss if process exits before threshold
- More responsive telemetry for low-volume mutation scenarios

This complements the existing 5-second periodic flush and process exit
handlers, ensuring mutations are persisted promptly.

Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en

* fix: improve mutation telemetry error logging and diagnostics

Changes:
- Upgrade error logging from debug to warn level for better visibility
- Add diagnostic logging to track mutation processing
- Log telemetry disabled state explicitly
- Add context info (sessionId, intent, operationCount) to error logs
- Remove 'await' from telemetry calls to make them truly non-blocking

This will help identify why mutations aren't being persisted to the
workflow_mutations table despite successful workflow operations.

Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en

* feat: enhance workflow mutation telemetry for better AI responses

Improve workflow mutation tracking to capture comprehensive data that helps provide better responses when users update workflows. This enhancement collects workflow state, user intent, and operation details to enable more context-aware assistance.

Key improvements:
- Reduce auto-flush threshold from 5 to 2 for more reliable mutation tracking
- Add comprehensive workflow and credential sanitization to mutation tracker
- Document intent parameter in workflow update tools for better UX
- Fix mutation queue handling in telemetry manager (flush now handles 3 queues)
- Add extensive unit tests for mutation tracking and validation (35 new tests)

Technical changes:
- mutation-tracker.ts: Multi-layer sanitization (workflow, node, parameter levels)
- batch-processor.ts: Support mutation data flushing to Supabase
- telemetry-manager.ts: Auto-flush mutations at threshold 2, track mutations queue
- handlers-workflow-diff.ts: Track workflow mutations with sanitized data
- Tests: 13 tests for mutation-tracker, 22 tests for mutation-validator

The intent parameter messaging emphasizes user benefit ("helps to return better response") rather than technical implementation details.

Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>

* chore: bump version to 2.22.16 with telemetry changelog

Updated package.json and package.runtime.json to version 2.22.16.
Added comprehensive CHANGELOG entry documenting workflow mutation
telemetry enhancements for better AI-powered workflow assistance.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en

Co-Authored-By: Claude <noreply@anthropic.com>

* fix: resolve TypeScript lint errors in telemetry tests

Fixed type issues in mutation-tracker and mutation-validator tests:
- Import and use MutationToolName enum instead of string literals
- Fix ValidationResult.errors to use proper object structure
- Add UpdateNodeOperation type assertion for operation with nodeName

All TypeScript errors resolved, lint now passes.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Conceived by Romuald Członkowski - https://www.aiadvisors.pl/en

Co-Authored-By: Claude <noreply@anthropic.com>

---------

Co-authored-by: Claude <noreply@anthropic.com>

Telemetry Analysis Documentation Index

Comprehensive Analysis of N8N-MCP Telemetry Infrastructure
Analysis Date: November 12, 2025
Status: Complete and Ready for Implementation


Quick Start

If you only have 5 minutes:

  • Read the summary section below

If you have 30 minutes:

  • Read TELEMETRY_N8N_FIXER_DATASET.md (master summary)

If you have 2+ hours:

  • Start with TELEMETRY_ANALYSIS.md (main reference)
  • Follow with TELEMETRY_MUTATION_SPEC.md (implementation guide)
  • Use TELEMETRY_QUICK_REFERENCE.md for queries/patterns

One-Sentence Summary

The n8n-mcp telemetry system already tracks 276K+ user interactions against a production Supabase backend, but it lacks the workflow mutation capture needed to build an n8n-fixer dataset; closing that gap requires one new table plus 3-4 weeks of integration work.


Document Guide

PRIMARY DOCUMENTS (Created November 12, 2025)

1. TELEMETRY_ANALYSIS.md (23 KB, 720 lines)

Your main reference for understanding current state

Contains:

  • Complete table schemas (telemetry_events, telemetry_workflows)
  • All 12 event types with JSON examples
  • Current workflow tracking capabilities
  • Data samples from production
  • Gap analysis for n8n-fixer requirements
  • Proposed schema additions
  • Privacy & security analysis
  • Data capture pipeline architecture

When to read: You need the complete picture of what exists and what's missing

Read time: 20-30 minutes


2. TELEMETRY_MUTATION_SPEC.md (26 KB, 918 lines)

Your implementation blueprint

Contains:

  • Complete SQL schema for workflow_mutations table with 20 indexes
  • TypeScript interfaces and type definitions (a rough record sketch follows this entry)
  • Integration point specifications
  • Mutation analyzer service code structure
  • Batch processor extensions
  • Code examples for tools to instrument
  • Validation rules and data quality checks
  • Query patterns for dataset analysis
  • 4-phase implementation roadmap

When to read: You're ready to start building the mutation tracking system

Read time: 30-40 minutes
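
To make the "TypeScript interfaces" bullet above concrete, here is a minimal sketch of what a single mutation record might look like. The field names are illustrative assumptions based on the data described in the commit history (before/after snapshots, intent, change metrics, validation results); the authoritative definitions live in TELEMETRY_MUTATION_SPEC.md.

```typescript
// Hypothetical shape of one workflow_mutations row; field names are
// illustrative -- the authoritative schema is in TELEMETRY_MUTATION_SPEC.md.
interface WorkflowMutationRecord {
  userId: string;                 // anonymized user identifier
  sessionId: string;
  toolName: 'n8n_update_partial_workflow' | 'n8n_update_full_workflow';
  intent?: string;                // sanitized user instruction (optional parameter)
  workflowBefore: object;         // sanitized workflow snapshot before the change
  workflowAfter: object;          // sanitized workflow snapshot after the change
  operations: object[];           // diff operations that were applied
  nodesModified: number;          // change metrics
  connectionsModified: number;
  validationImproved: boolean;    // did validation results get better?
  mutationSuccess: boolean;
  createdAt: string;              // ISO timestamp
}
```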


3. TELEMETRY_QUICK_REFERENCE.md (11 KB, 503 lines)

Your developer quick lookup guide

Contains:

  • Supabase connection details
  • Event type quick reference
  • Common SQL query patterns
  • Performance optimization tips
  • User journey analysis examples
  • Platform distribution queries
  • File references and code locations
  • Helpful constants and values

When to read: You need to query existing data or reference specific details

Read time: 10-15 minutes


4. TELEMETRY_N8N_FIXER_DATASET.md (13 KB, 340 lines)

Your executive summary and master planning document

Contains:

  • Overview of analysis findings
  • Documentation map (what to read in what order)
  • Current state summary
  • Recommended 4-phase implementation path
  • Key metrics you'll collect
  • Storage requirements and cost estimates
  • Risk assessment
  • Success criteria for each phase
  • Questions to answer before starting

When to read: Planning implementation or presenting to stakeholders

Read time: 15-20 minutes


SUPPORTING DOCUMENTS (Created November 8, 2025)

TELEMETRY_ANALYSIS_REPORT.md (26 KB)

  • Executive summary with visualizations
  • Event distribution statistics
  • Usage patterns and trends
  • Performance metrics
  • User activity analysis

TELEMETRY_EXECUTIVE_SUMMARY.md (10 KB)

  • High-level overview for executives
  • Key statistics and metrics
  • Business impact assessment
  • Recommendation summary

TELEMETRY_TECHNICAL_DEEP_DIVE.md (18 KB)

  • Architecture and design patterns
  • Component interactions
  • Data flow diagrams
  • Implementation details
  • Performance considerations

TELEMETRY_DATA_FOR_VISUALIZATION.md (18 KB)

  • Sample datasets for dashboards
  • Query results and aggregations
  • Visualization recommendations
  • Chart and graph specifications

TELEMETRY_ANALYSIS_INDEX.md (15 KB)

  • Index of all analyses
  • Cross-references
  • Topic mappings
  • Search guide

For Implementation Teams

  1. TELEMETRY_N8N_FIXER_DATASET.md (15 min) - Understand the plan
  2. TELEMETRY_ANALYSIS.md (30 min) - Understand current state
  3. TELEMETRY_MUTATION_SPEC.md (40 min) - Get implementation details
  4. TELEMETRY_QUICK_REFERENCE.md (10 min) - Reference during coding

Total Time: 95 minutes

For Product Managers

  1. TELEMETRY_EXECUTIVE_SUMMARY.md (10 min)
  2. TELEMETRY_N8N_FIXER_DATASET.md (15 min)
  3. TELEMETRY_ANALYSIS_REPORT.md (20 min)

Total Time: 45 minutes

For Data Analysts

  1. TELEMETRY_ANALYSIS.md (30 min)
  2. TELEMETRY_QUICK_REFERENCE.md (10 min)
  3. TELEMETRY_ANALYSIS_REPORT.md (20 min)

Total Time: 60 minutes

For Architects

  1. TELEMETRY_TECHNICAL_DEEP_DIVE.md (20 min)
  2. TELEMETRY_MUTATION_SPEC.md (40 min)
  3. TELEMETRY_N8N_FIXER_DATASET.md (15 min)

Total Time: 75 minutes


Key Findings Summary

What Exists Today

  • 276K+ telemetry events tracked in Supabase
  • 6.5K+ unique workflows analyzed
  • 12 event types covering tool usage, errors, validation, workflow creation
  • Production-grade infrastructure with batching, retry logic, rate limiting
  • Privacy-focused design with sanitization, anonymization, encryption

Critical Gaps for N8N-Fixer

  • No workflow mutation/modification tracking
  • No before/after workflow snapshots
  • No instruction/transformation capture
  • No mutation success metrics
  • No validation improvement tracking

Proposed Solution

  • New workflow_mutations table (with 20 indexes)
  • Extended telemetry system to capture mutations
  • Instrumentation of 3-4 key tools
  • 4-phase implementation (3-4 weeks)

Data Volume Estimates

  • Per mutation: ~25 KB (with compression); the arithmetic is worked through below
  • Monthly: 250 MB - 1.2 GB
  • Annual: 3-14 GB
  • Cost: $10-200/month (depending on volume)
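
The monthly and annual ranges above follow directly from the ~25 KB per-mutation estimate. A quick back-of-the-envelope check (plain arithmetic, not project code):

```typescript
// Back-of-the-envelope storage check using the ~25 KB/mutation estimate.
const KB_PER_MUTATION = 25;
const monthlyMB = (mutationsPerMonth: number) =>
  (mutationsPerMonth * KB_PER_MUTATION) / 1024;

console.log(monthlyMB(10_000)); // ≈ 244 MB/month  (conservative, ~250 MB)
console.log(monthlyMB(50_000)); // ≈ 1221 MB/month (high estimate, ~1.2 GB)
// Over 12 months that is roughly 3 GB to 14 GB, matching the annual range above.
```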

Implementation Effort

  • Phase 1 (Infrastructure): 40-60 hours
  • Phase 2 (Core Integration): 40-60 hours
  • Phase 3 (Tool Integration): 20-30 hours
  • Phase 4 (Validation): 20-30 hours
  • Total: 120-180 hours (3-4 weeks)

Critical Data

Supabase Connection

URL: https://ydyufsohxdfpopqbubwk.supabase.co
Database: PostgreSQL
Auth: Anon key (in telemetry-types.ts)
Tables: telemetry_events, telemetry_workflows
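
For orientation, the MCP server reaches this backend through the standard @supabase/supabase-js client using the anon key. A minimal sketch of an insert follows; the row fields shown are placeholders, not the actual column list, and the key is read from an environment variable here rather than from telemetry-types.ts.

```typescript
import { createClient } from '@supabase/supabase-js';

// In the real code the anon key lives in telemetry-types.ts; an env var is used
// here only as a placeholder for the sketch.
const supabase = createClient(
  'https://ydyufsohxdfpopqbubwk.supabase.co',
  process.env.SUPABASE_ANON_KEY!
);

// Column names below are illustrative; check the actual table definitions before use.
const { error } = await supabase.from('telemetry_events').insert([
  {
    user_id: 'anon-1234',
    event: 'tool_used',
    properties: { tool: 'n8n_update_partial_workflow' },
  },
]);
if (error) console.warn('telemetry insert failed:', error.message);
```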

Event Types (by volume)

  1. tool_used (40-50%)
  2. tool_sequence (20-30%)
  3. error_occurred (10-15%)
  4. validation_details (5-10%)
  5. Others (workflow, session, performance) (5-10%)

Key Source Files

  • Source types: /Users/romualdczlonkowski/Pliki/n8n-mcp/n8n-mcp/src/telemetry/telemetry-types.ts
  • Main manager: /Users/romualdczlonkowski/Pliki/n8n-mcp/n8n-mcp/src/telemetry/telemetry-manager.ts
  • Event tracker: /Users/romualdczlonkowski/Pliki/n8n-mcp/n8n-mcp/src/telemetry/event-tracker.ts
  • Batch processor: /Users/romualdczlonkowski/Pliki/n8n-mcp/n8n-mcp/src/telemetry/batch-processor.ts

Implementation Checklist

Before Starting

  • Read TELEMETRY_N8N_FIXER_DATASET.md
  • Read TELEMETRY_ANALYSIS.md
  • Answer 6 questions (see TELEMETRY_N8N_FIXER_DATASET.md)
  • Get stakeholder approval for 4-phase plan
  • Assign implementation team

Phase 1: Infrastructure (Weeks 1-2)

  • Create workflow_mutations table in Supabase
  • Add 20+ indexes per specification
  • Define TypeScript types
  • Build mutation validator
  • Write unit tests

Phase 2: Core Integration (Weeks 2-3)

  • Add trackWorkflowMutation() to TelemetryManager (sketched after this list)
  • Extend EventTracker with mutation queue
  • Extend BatchProcessor for mutations
  • Write integration tests
  • Code review and merge
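
As orientation for this phase, here is a highly simplified sketch of how the manager-side pieces fit together. The method name and the flush threshold of 2 come from the commit history above; everything else (the helper, the record type, the class shape) is schematic rather than the actual implementation in src/telemetry/.

```typescript
// Schematic only: the real implementations live in src/telemetry/telemetry-manager.ts,
// event-tracker.ts, and batch-processor.ts.
type WorkflowMutationRecord = Record<string, unknown>; // see the record sketch earlier in this README
declare function sendMutationsToSupabase(batch: WorkflowMutationRecord[]): Promise<void>; // hypothetical helper

const MUTATION_FLUSH_THRESHOLD = 2; // lowered from 5 per the changelog

class TelemetryManagerSketch {
  private mutationQueue: WorkflowMutationRecord[] = [];

  trackWorkflowMutation(record: WorkflowMutationRecord): void {
    // Non-blocking by design: callers do not await this.
    this.mutationQueue.push(record);
    if (this.mutationQueue.length >= MUTATION_FLUSH_THRESHOLD) {
      void this.flushMutations(); // fire-and-forget; failures stay silent
    }
  }

  private async flushMutations(): Promise<void> {
    const batch = this.mutationQueue.splice(0, this.mutationQueue.length);
    // The real BatchProcessor adds retries and the circuit breaker here.
    await sendMutationsToSupabase(batch);
  }
}
```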

Phase 3: Tool Integration (Week 4)

  • Instrument n8n_autofix_workflow
  • Instrument n8n_update_partial_workflow
  • Instrument validation engine (if applicable)
  • Manual end-to-end testing
  • Code review and merge

Phase 4: Validation (Week 5)

  • Collect 100+ sample mutations
  • Verify data quality
  • Run analysis queries (an example query is sketched after this list)
  • Assess dataset readiness
  • Begin production collection
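
The "run analysis queries" step could look roughly like this with supabase-js. The workflow_mutations column names are assumptions carried over from the record sketch earlier in this README, and note that per the RLS policies reads require an authenticated key, not the anon key used for inserts.

```typescript
import { createClient } from '@supabase/supabase-js';

// Placeholder key: reads on workflow_mutations require an authenticated role per the RLS policies.
const supabase = createClient(
  'https://ydyufsohxdfpopqbubwk.supabase.co',
  process.env.SUPABASE_READ_KEY!
);

// Hypothetical analysis query: how often did validation improve in the last 7 days?
// Column names are assumptions; adjust to the real workflow_mutations schema.
const since = new Date(Date.now() - 7 * 24 * 60 * 60 * 1000).toISOString();

const { data, error } = await supabase
  .from('workflow_mutations')
  .select('tool_name, mutation_success, validation_improved')
  .gte('created_at', since);

if (!error && data) {
  const improved = data.filter((r) => r.validation_improved).length;
  console.log(`${improved}/${data.length} mutations improved validation`);
}
```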

Storage & Cost Planning

Conservative Estimate (10K mutations/month)

  • Storage: 250 MB/month
  • Cost: $10-20/month
  • Dataset: 1K mutations in 3-4 days

Moderate Estimate (30K mutations/month)

  • Storage: 750 MB/month
  • Cost: $50-100/month
  • Dataset: 10K mutations in 10 days

High Estimate (50K mutations/month)

  • Storage: 1.2 GB/month
  • Cost: $100-200/month
  • Dataset: 100K mutations in 2 months

With a 90-day retention policy, costs stay at the lower end of these ranges.


Questions Before Implementation

  1. Data Retention: Keep mutations for 90 days? 1 year? Indefinite?
  2. Storage Budget: Monthly budget for telemetry storage?
  3. Workflow Size: Max workflow size to store? Compression required?
  4. Dataset Timeline: When do you need the first dataset, and how large should it be (1K? 10K? 100K mutations)?
  5. Privacy: Additional PII to sanitize beyond current approach?
  6. User Consent: Separate opt-in for mutation tracking vs. general telemetry?

Risk Assessment

Low Risk

  • No breaking changes to existing system
  • Fully backward compatible
  • Optional feature (can disable if needed)
  • No version bump required

Medium Risk

  • Storage growth if >1.2 GB/month
  • Performance impact if workflows >10 MB
  • Mitigation: Compression + retention policy

High Risk

  • None identified

Success Criteria

When you can answer "yes" to all:

  • 100+ workflow mutations collected
  • Data hash verification passes 100% (one possible hashing approach is sketched after this list)
  • Sample queries execute in <100 ms
  • Deduplication working correctly
  • Before/after states properly stored
  • Validation improvements tracked accurately
  • No performance regression in tools
  • Team ready for large-scale collection
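
One simple way to support the hash-verification and deduplication criteria is to store a content hash of each sanitized snapshot. A minimal sketch with Node's built-in crypto module follows; the project's actual hashing scheme is not specified here, so treat this as one possible approach.

```typescript
import { createHash } from 'node:crypto';

// Hash a sanitized workflow snapshot so identical mutations can be deduplicated
// and stored payloads can be verified later. JSON.stringify is key-order sensitive,
// so a real implementation would canonicalize keys before hashing.
function workflowHash(workflow: object): string {
  return createHash('sha256').update(JSON.stringify(workflow)).digest('hex');
}

const before = { nodes: [], connections: {} };
console.log(workflowHash(before)); // 64-character hex digest usable as a dedup key
```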

Next Steps

Immediate (This Week)

  1. Review this README
  2. Read TELEMETRY_N8N_FIXER_DATASET.md
  3. Read TELEMETRY_ANALYSIS.md
  4. Schedule team review meeting

Short-term (Next 1-2 Weeks)

  1. Answer the 6 questions
  2. Get stakeholder approval
  3. Assign implementation lead
  4. Create Jira tickets for Phase 1

Medium-term (Weeks 3-6)

  1. Execute Phase 1 (Infrastructure)
  2. Execute Phase 2 (Core Integration)
  3. Execute Phase 3 (Tool Integration)
  4. Execute Phase 4 (Validation)

Long-term (Week 7+)

  1. Begin production dataset collection
  2. Monitor storage and costs
  3. Run analysis queries
  4. Iterate based on findings

Contact & Questions

Analysis Completed By: Telemetry Data Analyst
Date: November 12, 2025
Status: Ready for team review and implementation

For questions or clarifications:

  1. Review the specific document for your question
  2. Check TELEMETRY_QUICK_REFERENCE.md for common lookups
  3. Refer to source files in src/telemetry/

Document Statistics

| Document | Size | Lines | Read Time | Purpose |
| --- | --- | --- | --- | --- |
| TELEMETRY_ANALYSIS.md | 23 KB | 720 | 20-30 min | Main reference |
| TELEMETRY_MUTATION_SPEC.md | 26 KB | 918 | 30-40 min | Implementation guide |
| TELEMETRY_QUICK_REFERENCE.md | 11 KB | 503 | 10-15 min | Developer lookup |
| TELEMETRY_N8N_FIXER_DATASET.md | 13 KB | 340 | 15-20 min | Executive summary |
| TELEMETRY_ANALYSIS_REPORT.md | 26 KB | 732 | 20-30 min | Statistics & trends |
| TELEMETRY_EXECUTIVE_SUMMARY.md | 10 KB | 345 | 10-15 min | Executive brief |
| TELEMETRY_TECHNICAL_DEEP_DIVE.md | 18 KB | 654 | 20-25 min | Architecture |
| TELEMETRY_DATA_FOR_VISUALIZATION.md | 18 KB | 468 | 15-20 min | Dashboard data |
| TELEMETRY_ANALYSIS_INDEX.md | 15 KB | 447 | 10-15 min | Topic index |
| TOTAL | 160 KB | 5,237 | 150-180 min | Full analysis |

Version History

| Date | Version | Changes |
| --- | --- | --- |
| Nov 8, 2025 | 1.0 | Initial analysis and reports |
| Nov 12, 2025 | 2.0 | Core documentation + mutation spec + this README |

License & Attribution

These analysis documents are part of the n8n-mcp project. Conceived by Romuald Członkowski - www.aiadvisors.pl/en


END OF README

For additional information, start with one of the primary documents above based on your role and available time.