# Codecov Setup Guide
This guide explains how to set up and configure Codecov for the n8n-MCP project.
## Prerequisites
- A Codecov account (sign up at https://codecov.io)
- Repository admin access to add the `CODECOV_TOKEN` secret
## Setup Steps
### 1. Get Your Codecov Token
- Sign in to Codecov
- Add your repository: `czlonkowski/n8n-mcp`
- Copy the upload token from the repository settings
### 2. Add Token to GitHub Secrets
- Go to your GitHub repository settings
- Navigate to Settings → Secrets and variables → Actions
- Click "New repository secret"
- Name: `CODECOV_TOKEN`
- Value: Paste your Codecov token
- Click "Add secret"
### 3. Update the Badge Token
Edit the `README.md` file and replace `YOUR_TOKEN` in the Codecov badge with your actual token:
[![codecov](https://codecov.io/gh/czlonkowski/n8n-mcp/graph/badge.svg?token=YOUR_TOKEN)](https://codecov.io/gh/czlonkowski/n8n-mcp)
Note: The token in the badge URL is a read-only token and safe to commit.
## Configuration Details
### `codecov.yml`
The configuration file sets the following (sketched below):
- Target coverage: 80% for both project and patch
- Coverage precision: 2 decimal places
- Comment behavior: Comments on all PRs with coverage changes
- Ignored files: Test files, scripts, node_modules, and build outputs
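For reference, a `codecov.yml` along these lines might look roughly like the following sketch; the ignore paths here are assumptions for illustration and may not match the actual file:

```yaml
coverage:
  precision: 2            # report coverage to two decimal places
  status:
    project:
      default:
        target: 80%       # overall project coverage target
    patch:
      default:
        target: 80%       # coverage target for lines changed in a PR

comment:
  behavior: default       # comment on PRs when coverage changes

ignore:
  - "tests/**"            # assumed paths; check against the real codecov.yml
  - "scripts/**"
  - "node_modules/**"
  - "dist/**"
```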
### GitHub Actions
The workflow (sketched below):
- Runs tests with coverage using `npm run test:coverage`
- Generates an LCOV-format coverage report
- Uploads to Codecov using the official action
- Fails the build if upload fails
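The coverage-related steps of such a workflow might look roughly like this sketch (step names and the action version are assumptions, not copied from the repository's workflow):

```yaml
- name: Run tests with coverage
  run: npm run test:coverage

- name: Upload coverage to Codecov
  uses: codecov/codecov-action@v4
  with:
    token: ${{ secrets.CODECOV_TOKEN }}
    files: ./coverage/lcov.info
    fail_ci_if_error: true   # fail the build if the upload fails
```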
### Vitest Configuration
Coverage settings in `vitest.config.ts` (sketched below):
- Provider: V8 (fast and accurate)
- Reporters: text, json, html, and lcov
- Thresholds: 80% lines, 80% functions, 75% branches, 80% statements
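A coverage block matching these settings might look roughly like the following sketch of `vitest.config.ts` (the exclude list is an assumption; the real file may differ):

```typescript
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    coverage: {
      provider: 'v8',                              // fast, accurate V8 coverage
      reporter: ['text', 'json', 'html', 'lcov'],  // lcov is what Codecov consumes
      thresholds: {
        lines: 80,
        functions: 80,
        branches: 75,
        statements: 80,
      },
      exclude: ['tests/**', 'scripts/**', 'dist/**'], // assumed exclusions
    },
  },
});
```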
## Viewing Coverage
### Local Coverage
```bash
# Generate coverage report
npm run test:coverage

# View HTML report
open coverage/index.html
```
### Online Coverage
- Visit https://codecov.io/gh/czlonkowski/n8n-mcp
- View detailed reports, graphs, and file-by-file coverage
- Check PR comments for coverage changes
## Troubleshooting
### Coverage Not Uploading
- Verify `CODECOV_TOKEN` is set in GitHub secrets
- Check the GitHub Actions logs for errors
- Ensure `coverage/lcov.info` is generated (see the quick check below)
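A quick local check that the LCOV report is actually produced might be:

```bash
# Run the suite with coverage, then confirm the LCOV file exists
npm run test:coverage && ls -lh coverage/lcov.info
```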
### Badge Not Showing
- Wait a few minutes after the first upload
- Verify the token in the badge URL is correct
- Check that the repository's visibility (public or private) matches its settings on Codecov
## Low Coverage Areas
Current areas with lower coverage that could be improved:
- HTTP server implementations
- MCP index files
- Some edge cases in validators
## Best Practices
- Write tests first: Aim for TDD when adding features
- Focus on critical paths: Prioritize testing core functionality
- Mock external dependencies: Use MSW for HTTP and mocks for databases (see the sketch after this list)
- Keep coverage realistic: 80% is good, 100% isn't always practical
- Monitor trends: Watch coverage over time, not just absolute numbers
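As an illustration of the MSW point above, a minimal handler setup for a Vitest suite might look like this sketch (the mocked endpoint and payload are hypothetical, using the MSW v2 API):

```typescript
import { setupServer } from 'msw/node';
import { http, HttpResponse } from 'msw';
import { afterAll, afterEach, beforeAll } from 'vitest';

// Hypothetical endpoint, used only to illustrate the pattern
const server = setupServer(
  http.get('https://example.com/api/v1/workflows', () =>
    HttpResponse.json({ data: [] })
  )
);

beforeAll(() => server.listen());        // start intercepting HTTP requests
afterEach(() => server.resetHandlers()); // drop per-test handler overrides
afterAll(() => server.close());          // clean up after the suite
```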