feat: add comprehensive performance benchmark tracking system

- Create benchmark test suites for critical operations:
  - Node loading performance
  - Database query performance
  - Search operations performance
  - Validation performance
  - MCP tool execution performance

- Add GitHub Actions workflow for benchmark tracking:
  - Runs on push to main and PRs
  - Uses github-action-benchmark for historical tracking
  - Comments on PRs with performance results
  - Alerts on >10% performance regressions
  - Stores results in GitHub Pages

- Create benchmark infrastructure:
  - Custom Vitest benchmark configuration
  - JSON reporter for CI results
  - Result formatter for github-action-benchmark
  - Performance threshold documentation

- Add supporting utilities:
  - SQLiteStorageService for benchmark database setup
  - MCPEngine wrapper for testing MCP tools
  - Test factories for generating benchmark data
  - Enhanced NodeRepository with benchmark methods

- Document benchmark system:
  - Comprehensive benchmark guide in docs/BENCHMARKS.md
  - Performance thresholds in .github/BENCHMARK_THRESHOLDS.md
  - README for benchmarks directory
  - Integration with existing test suite

The benchmark system will help monitor performance over time and catch regressions before they reach production.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
Author: czlonkowski
Date: 2025-07-28 22:45:09 +02:00
Parent: 0252788dd6
Commit: b5210e5963
52 changed files with 6843 additions and 16 deletions


@@ -0,0 +1,241 @@
# Test Environment Configuration Documentation
This document describes the test environment configuration system for the n8n-mcp project.
## Overview
The test environment configuration system provides:
- Centralized environment variable management for tests
- Type-safe access to configuration values
- Automatic loading of test-specific settings
- Support for local overrides via `.env.test.local`
- Performance monitoring and feature flags
## Configuration Files
### `.env.test`
The main test environment configuration file. Contains all test-specific environment variables with sensible defaults. This file is committed to the repository.
### `.env.test.local` (optional)
Local overrides for sensitive values or developer-specific settings. This file should be added to `.gitignore` and never committed.
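For illustration, a `.env.test.local` might look like the following; every value shown is a hypothetical placeholder, not a real credential:

```bash
# .env.test.local (never commit this file)
# Developer-specific overrides; placeholders only.
N8N_API_KEY=replace-with-your-key
N8N_API_URL=http://localhost:5678/api/v1
DEBUG=true
TEST_LOG_VERBOSE=true
```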
## Usage
### In Test Files
```typescript
import { getTestConfig, getTestTimeout, isFeatureEnabled } from '@tests/setup/test-env';

describe('My Test Suite', () => {
  const config = getTestConfig();

  it('should run with proper timeout', () => {
    // Test code here
  }, { timeout: getTestTimeout('integration') });

  it.skipIf(!isFeatureEnabled('mockExternalApis'))('should mock external APIs', () => {
    // This test only runs if FEATURE_MOCK_EXTERNAL_APIS=true
  });
});
```
### In Setup Files
```typescript
import { loadTestEnvironment } from './test-env';
// Load test environment at the start of your setup
loadTestEnvironment();
```
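Vitest only executes a setup file if it is registered in the project's Vitest configuration. The exact path is project-specific; a minimal sketch, assuming the setup file lives at `tests/setup/global-setup.ts`:

```typescript
// vitest.config.ts (sketch; adjust the path to match this repository's layout)
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    // Runs before each test file, loading .env.test and .env.test.local
    setupFiles: ['./tests/setup/global-setup.ts'],
  },
});
```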
## Environment Variables
### Core Configuration
| Variable | Type | Default | Description |
|----------|------|---------|-------------|
| `NODE_ENV` | string | `test` | Must be 'test' for test execution |
| `MCP_MODE` | string | `test` | MCP operation mode |
| `TEST_ENVIRONMENT` | boolean | `true` | Indicates test environment |
### Database Configuration
| Variable | Type | Default | Description |
|----------|------|---------|-------------|
| `NODE_DB_PATH` | string | `:memory:` | SQLite database path (use :memory: for in-memory) |
| `REBUILD_ON_START` | boolean | `false` | Rebuild database on startup |
| `TEST_SEED_DATABASE` | boolean | `true` | Seed database with test data |
| `TEST_SEED_TEMPLATES` | boolean | `true` | Seed templates in database |
### API Configuration
| Variable | Type | Default | Description |
|----------|------|---------|-------------|
| `N8N_API_URL` | string | `http://localhost:3001/mock-api` | Mock API endpoint |
| `N8N_API_KEY` | string | `test-api-key` | API key for testing |
| `N8N_WEBHOOK_BASE_URL` | string | `http://localhost:3001/webhook` | Webhook base URL |
| `N8N_WEBHOOK_TEST_URL` | string | `http://localhost:3001/webhook-test` | Webhook test URL |
### Test Execution
| Variable | Type | Default | Description |
|----------|------|---------|-------------|
| `TEST_TIMEOUT_UNIT` | number | `5000` | Unit test timeout (ms) |
| `TEST_TIMEOUT_INTEGRATION` | number | `15000` | Integration test timeout (ms) |
| `TEST_TIMEOUT_E2E` | number | `30000` | E2E test timeout (ms) |
| `TEST_TIMEOUT_GLOBAL` | number | `60000` | Global test timeout (ms) |
| `TEST_RETRY_ATTEMPTS` | number | `2` | Number of retry attempts |
| `TEST_RETRY_DELAY` | number | `1000` | Delay between retries (ms) |
| `TEST_PARALLEL` | boolean | `true` | Run tests in parallel |
| `TEST_MAX_WORKERS` | number | `4` | Maximum parallel workers |
### Feature Flags
| Variable | Type | Default | Description |
|----------|------|---------|-------------|
| `FEATURE_TEST_COVERAGE` | boolean | `true` | Enable code coverage |
| `FEATURE_TEST_SCREENSHOTS` | boolean | `false` | Capture screenshots on failure |
| `FEATURE_TEST_VIDEOS` | boolean | `false` | Record test videos |
| `FEATURE_TEST_TRACE` | boolean | `false` | Enable trace recording |
| `FEATURE_MOCK_EXTERNAL_APIS` | boolean | `true` | Mock external API calls |
| `FEATURE_USE_TEST_CONTAINERS` | boolean | `false` | Use test containers for services |
### Logging
| Variable | Type | Default | Description |
|----------|------|---------|-------------|
| `LOG_LEVEL` | string | `error` | Log level (debug, info, warn, error) |
| `DEBUG` | boolean | `false` | Enable debug logging |
| `TEST_LOG_VERBOSE` | boolean | `false` | Verbose test logging |
| `ERROR_SHOW_STACK` | boolean | `true` | Show error stack traces |
| `ERROR_SHOW_DETAILS` | boolean | `true` | Show detailed error info |
### Performance Thresholds
| Variable | Type | Default | Description |
|----------|------|---------|-------------|
| `PERF_THRESHOLD_API_RESPONSE` | number | `100` | API response time threshold (ms) |
| `PERF_THRESHOLD_DB_QUERY` | number | `50` | Database query threshold (ms) |
| `PERF_THRESHOLD_NODE_PARSE` | number | `200` | Node parsing threshold (ms) |
### Mock Services
| Variable | Type | Default | Description |
|----------|------|---------|-------------|
| `MSW_ENABLED` | boolean | `true` | Enable Mock Service Worker |
| `MSW_API_DELAY` | number | `0` | API response delay (ms) |
| `REDIS_MOCK_ENABLED` | boolean | `true` | Enable Redis mock |
| `REDIS_MOCK_PORT` | number | `6380` | Redis mock port |
| `ELASTICSEARCH_MOCK_ENABLED` | boolean | `false` | Enable Elasticsearch mock |
| `ELASTICSEARCH_MOCK_PORT` | number | `9201` | Elasticsearch mock port |
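As an illustration of how these settings might be consumed, the sketch below wires `MSW_ENABLED` and `MSW_API_DELAY` into a Mock Service Worker server (assuming MSW v2; the `/workflows` handler is hypothetical):

```typescript
import { setupServer } from 'msw/node';
import { http, HttpResponse, delay } from 'msw';
import { getTestConfig } from '@tests/setup/test-env';

const config = getTestConfig();

// Hypothetical handler: answer the mock n8n API with a configurable delay
const server = setupServer(
  http.get(`${config.api.url}/workflows`, async () => {
    await delay(config.mocking.msw.apiDelay);
    return HttpResponse.json({ data: [] });
  })
);

if (config.mocking.msw.enabled) {
  server.listen({ onUnhandledRequest: 'bypass' });
}
```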
### Paths
| Variable | Type | Default | Description |
|----------|------|---------|-------------|
| `TEST_FIXTURES_PATH` | string | `./tests/fixtures` | Test fixtures directory |
| `TEST_DATA_PATH` | string | `./tests/data` | Test data directory |
| `TEST_SNAPSHOTS_PATH` | string | `./tests/__snapshots__` | Snapshots directory |
### Other Settings
| Variable | Type | Default | Description |
|----------|------|---------|-------------|
| `CACHE_TTL` | number | `0` | Cache TTL (0 = disabled) |
| `CACHE_ENABLED` | boolean | `false` | Enable caching |
| `RATE_LIMIT_MAX` | number | `0` | Rate limit max requests (0 = disabled) |
| `RATE_LIMIT_WINDOW` | number | `0` | Rate limit window (ms) |
| `TEST_CLEANUP_ENABLED` | boolean | `true` | Auto cleanup after tests |
| `TEST_CLEANUP_ON_FAILURE` | boolean | `false` | Cleanup on test failure |
| `NETWORK_TIMEOUT` | number | `5000` | Network request timeout (ms) |
| `NETWORK_RETRY_COUNT` | number | `0` | Network retry attempts |
| `TEST_MEMORY_LIMIT` | number | `512` | Memory limit (MB) |
## Best Practices
1. **Never commit sensitive values**: Use `.env.test.local` for API keys, tokens, etc.
2. **Use type-safe config access**: Always use `getTestConfig()` instead of accessing `process.env` directly.
3. **Set appropriate timeouts**: Use `getTestTimeout()` with the correct test type.
4. **Check feature flags**: Use `isFeatureEnabled()` to conditionally run tests.
5. **Reset environment when needed**: Use `resetTestEnvironment()` for test isolation, as shown in the sketch below.
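A minimal sketch of practice 5, restoring the `.env.test` defaults after a suite that mutates `process.env` (the suite name and mutated variable are illustrative):

```typescript
import { describe, it, afterAll, expect } from 'vitest';
import { resetTestEnvironment, getTestConfig } from '@tests/setup/test-env';

describe('suite that mutates the environment', () => {
  afterAll(() => {
    // Clear TEST_*/FEATURE_*/MSW_*/PERF_* overrides and reload defaults
    resetTestEnvironment();
  });

  it('temporarily disables parallel execution', () => {
    process.env.TEST_PARALLEL = 'false';
    expect(getTestConfig().execution.parallel).toBe(false);
  });
});
```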
## Examples
### Running Tests with Custom Configuration
```bash
# Run with verbose logging
DEBUG=true npm test

# Run with longer timeouts
TEST_TIMEOUT_UNIT=10000 npm test

# Run without mocks
FEATURE_MOCK_EXTERNAL_APIS=false npm test

# Run with test containers
FEATURE_USE_TEST_CONTAINERS=true npm test
```
### Creating Test-Specific Configuration
```typescript
// tests/unit/my-test.spec.ts
import { describe, it, expect, beforeAll } from 'vitest';
import { getTestConfig } from '@tests/setup/test-env';

describe('My Feature', () => {
  const config = getTestConfig();

  beforeAll(() => {
    // Use test configuration
    if (config.features.mockExternalApis) {
      // Set up mocks
    }
  });

  it('should respect performance thresholds', async () => {
    const start = performance.now();
    // Your test code
    const duration = performance.now() - start;
    expect(duration).toBeLessThan(config.performance.thresholds.apiResponse);
  });
});
```
## Troubleshooting
### Tests failing with "Missing required test environment variables"
Ensure `.env.test` exists and contains all required variables. Run:
```bash
cp .env.test.example .env.test
```
### Environment variables not loading
1. Check that `loadTestEnvironment()` is called in your setup
2. Verify file paths are correct
3. Ensure `.env.test` is in the project root
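A quick way to confirm which files the loader can actually see, using only Node built-ins (mirrors the resolution logic in `loadTestEnvironment()`):

```typescript
import { existsSync } from 'fs';
import * as path from 'path';

for (const file of ['.env.test', '.env.test.local']) {
  const full = path.resolve(process.cwd(), file);
  console.log(`${file}: ${existsSync(full) ? 'found' : 'missing'} (${full})`);
}
```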
### Type errors with process.env
Make sure to include the type definitions:
```typescript
/// <reference types="../types/test-env" />
```
Or add to your `tsconfig.json`:
```json
{
  "compilerOptions": {
    "types": ["./types/test-env"]
  }
}
```


@@ -1,4 +1,11 @@
 import { beforeEach, afterEach, vi } from 'vitest';
+import { loadTestEnvironment, getTestConfig, getTestTimeout } from './test-env';
+
+// Load test environment configuration
+loadTestEnvironment();
+
+// Get test configuration
+const testConfig = getTestConfig();
 
 // Reset mocks between tests
 beforeEach(() => {
@@ -8,19 +15,40 @@ beforeEach(() => {
 // Clean up after each test
 afterEach(() => {
   vi.restoreAllMocks();
+
+  // Perform cleanup if enabled
+  if (testConfig.cleanup.enabled) {
+    // Add cleanup logic here if needed
+  }
 });
 
-// Global test timeout
-vi.setConfig({ testTimeout: 10000 });
+// Global test timeout from configuration
+vi.setConfig({ testTimeout: getTestTimeout('global') });
 
-// Silence console during tests unless DEBUG=true
-if (process.env.DEBUG !== 'true') {
+// Configure console output based on test configuration
+if (!testConfig.logging.debug) {
   global.console = {
     ...console,
     log: vi.fn(),
     debug: vi.fn(),
     info: vi.fn(),
-    warn: vi.fn(),
-    error: vi.fn(),
+    warn: testConfig.logging.level === 'error' ? vi.fn() : console.warn,
+    error: console.error, // Always show errors
   };
 }
+
+// Set up performance monitoring if enabled
+if (testConfig.performance) {
+  global.performance = global.performance || {
+    now: () => Date.now(),
+    mark: vi.fn(),
+    measure: vi.fn(),
+    getEntriesByName: vi.fn(() => []),
+    getEntriesByType: vi.fn(() => []),
+    clearMarks: vi.fn(),
+    clearMeasures: vi.fn(),
+  } as any;
+}
+
+// Export test configuration for use in tests
+export { testConfig, getTestTimeout, getTestConfig };

tests/setup/test-env.ts (new file, 342 lines)

@@ -0,0 +1,342 @@
/**
* Test Environment Configuration Loader
*
* This module handles loading and validating test environment variables
* with type safety and default values.
*/
import * as dotenv from 'dotenv';
import * as path from 'path';
import { existsSync } from 'fs';
// Load test environment variables
export function loadTestEnvironment(): void {
  // Load base test environment
  const testEnvPath = path.resolve(process.cwd(), '.env.test');
  if (existsSync(testEnvPath)) {
    dotenv.config({ path: testEnvPath });
  }

  // Load local test overrides (for sensitive values)
  const localEnvPath = path.resolve(process.cwd(), '.env.test.local');
  if (existsSync(localEnvPath)) {
    dotenv.config({ path: localEnvPath, override: true });
  }

  // Set test-specific defaults
  setTestDefaults();

  // Validate required environment variables
  validateTestEnvironment();
}
/**
* Set default values for test environment variables
*/
function setTestDefaults(): void {
  // Ensure we're in test mode
  process.env.NODE_ENV = 'test';
  process.env.TEST_ENVIRONMENT = 'true';

  // Set defaults if not already set
  const defaults: Record<string, string> = {
    // Database
    NODE_DB_PATH: ':memory:',
    REBUILD_ON_START: 'false',

    // API
    N8N_API_URL: 'http://localhost:3001/mock-api',
    N8N_API_KEY: 'test-api-key',

    // Server
    PORT: '3001',
    HOST: '127.0.0.1',

    // Logging
    LOG_LEVEL: 'error',
    DEBUG: 'false',
    TEST_LOG_VERBOSE: 'false',

    // Timeouts
    TEST_TIMEOUT_UNIT: '5000',
    TEST_TIMEOUT_INTEGRATION: '15000',
    TEST_TIMEOUT_E2E: '30000',
    TEST_TIMEOUT_GLOBAL: '60000',

    // Test execution
    TEST_RETRY_ATTEMPTS: '2',
    TEST_RETRY_DELAY: '1000',
    TEST_PARALLEL: 'true',
    TEST_MAX_WORKERS: '4',

    // Features
    FEATURE_MOCK_EXTERNAL_APIS: 'true',
    FEATURE_USE_TEST_CONTAINERS: 'false',
    MSW_ENABLED: 'true',
    MSW_API_DELAY: '0',

    // Paths
    TEST_FIXTURES_PATH: './tests/fixtures',
    TEST_DATA_PATH: './tests/data',
    TEST_SNAPSHOTS_PATH: './tests/__snapshots__',

    // Performance
    PERF_THRESHOLD_API_RESPONSE: '100',
    PERF_THRESHOLD_DB_QUERY: '50',
    PERF_THRESHOLD_NODE_PARSE: '200',

    // Caching
    CACHE_TTL: '0',
    CACHE_ENABLED: 'false',

    // Rate limiting
    RATE_LIMIT_MAX: '0',
    RATE_LIMIT_WINDOW: '0',

    // Error handling
    ERROR_SHOW_STACK: 'true',
    ERROR_SHOW_DETAILS: 'true',

    // Cleanup
    TEST_CLEANUP_ENABLED: 'true',
    TEST_CLEANUP_ON_FAILURE: 'false',

    // Database seeding
    TEST_SEED_DATABASE: 'true',
    TEST_SEED_TEMPLATES: 'true',

    // Network
    NETWORK_TIMEOUT: '5000',
    NETWORK_RETRY_COUNT: '0',

    // Memory
    TEST_MEMORY_LIMIT: '512',

    // Coverage
    COVERAGE_DIR: './coverage',
    COVERAGE_REPORTER: 'lcov,html,text-summary'
  };

  for (const [key, value] of Object.entries(defaults)) {
    if (!process.env[key]) {
      process.env[key] = value;
    }
  }
}
/**
* Validate that required environment variables are set
*/
function validateTestEnvironment(): void {
  const required = [
    'NODE_ENV',
    'NODE_DB_PATH',
    'N8N_API_URL',
    'N8N_API_KEY'
  ];

  const missing = required.filter(key => !process.env[key]);
  if (missing.length > 0) {
    throw new Error(
      `Missing required test environment variables: ${missing.join(', ')}\n` +
      'Please ensure .env.test is properly configured.'
    );
  }

  // Validate NODE_ENV is set to test
  if (process.env.NODE_ENV !== 'test') {
    throw new Error(
      'NODE_ENV must be set to "test" when running tests.\n' +
      'This prevents accidental execution against production systems.'
    );
  }
}
/**
* Get typed test environment configuration
*/
export function getTestConfig() {
  return {
    // Environment
    nodeEnv: process.env.NODE_ENV!,
    isTest: process.env.TEST_ENVIRONMENT === 'true',

    // Database
    database: {
      path: process.env.NODE_DB_PATH!,
      rebuildOnStart: process.env.REBUILD_ON_START === 'true',
      seedData: process.env.TEST_SEED_DATABASE === 'true',
      seedTemplates: process.env.TEST_SEED_TEMPLATES === 'true'
    },

    // API
    api: {
      url: process.env.N8N_API_URL!,
      key: process.env.N8N_API_KEY!,
      webhookBaseUrl: process.env.N8N_WEBHOOK_BASE_URL,
      webhookTestUrl: process.env.N8N_WEBHOOK_TEST_URL
    },

    // Server
    server: {
      port: parseInt(process.env.PORT || '3001', 10),
      host: process.env.HOST || '127.0.0.1',
      corsOrigin: process.env.CORS_ORIGIN?.split(',') || []
    },

    // Authentication
    auth: {
      token: process.env.AUTH_TOKEN,
      mcpToken: process.env.MCP_AUTH_TOKEN
    },

    // Logging
    logging: {
      level: process.env.LOG_LEVEL || 'error',
      debug: process.env.DEBUG === 'true',
      verbose: process.env.TEST_LOG_VERBOSE === 'true',
      showStack: process.env.ERROR_SHOW_STACK === 'true',
      showDetails: process.env.ERROR_SHOW_DETAILS === 'true'
    },

    // Test execution
    execution: {
      timeouts: {
        unit: parseInt(process.env.TEST_TIMEOUT_UNIT || '5000', 10),
        integration: parseInt(process.env.TEST_TIMEOUT_INTEGRATION || '15000', 10),
        e2e: parseInt(process.env.TEST_TIMEOUT_E2E || '30000', 10),
        global: parseInt(process.env.TEST_TIMEOUT_GLOBAL || '60000', 10)
      },
      retry: {
        attempts: parseInt(process.env.TEST_RETRY_ATTEMPTS || '2', 10),
        delay: parseInt(process.env.TEST_RETRY_DELAY || '1000', 10)
      },
      parallel: process.env.TEST_PARALLEL === 'true',
      maxWorkers: parseInt(process.env.TEST_MAX_WORKERS || '4', 10)
    },

    // Features
    features: {
      coverage: process.env.FEATURE_TEST_COVERAGE === 'true',
      screenshots: process.env.FEATURE_TEST_SCREENSHOTS === 'true',
      videos: process.env.FEATURE_TEST_VIDEOS === 'true',
      trace: process.env.FEATURE_TEST_TRACE === 'true',
      mockExternalApis: process.env.FEATURE_MOCK_EXTERNAL_APIS === 'true',
      useTestContainers: process.env.FEATURE_USE_TEST_CONTAINERS === 'true'
    },

    // Mocking
    mocking: {
      msw: {
        enabled: process.env.MSW_ENABLED === 'true',
        apiDelay: parseInt(process.env.MSW_API_DELAY || '0', 10)
      },
      redis: {
        enabled: process.env.REDIS_MOCK_ENABLED === 'true',
        port: parseInt(process.env.REDIS_MOCK_PORT || '6380', 10)
      },
      elasticsearch: {
        enabled: process.env.ELASTICSEARCH_MOCK_ENABLED === 'true',
        port: parseInt(process.env.ELASTICSEARCH_MOCK_PORT || '9201', 10)
      }
    },

    // Paths
    paths: {
      fixtures: process.env.TEST_FIXTURES_PATH || './tests/fixtures',
      data: process.env.TEST_DATA_PATH || './tests/data',
      snapshots: process.env.TEST_SNAPSHOTS_PATH || './tests/__snapshots__'
    },

    // Performance
    performance: {
      thresholds: {
        apiResponse: parseInt(process.env.PERF_THRESHOLD_API_RESPONSE || '100', 10),
        dbQuery: parseInt(process.env.PERF_THRESHOLD_DB_QUERY || '50', 10),
        nodeParse: parseInt(process.env.PERF_THRESHOLD_NODE_PARSE || '200', 10)
      }
    },

    // Rate limiting
    rateLimiting: {
      max: parseInt(process.env.RATE_LIMIT_MAX || '0', 10),
      window: parseInt(process.env.RATE_LIMIT_WINDOW || '0', 10)
    },

    // Caching
    cache: {
      enabled: process.env.CACHE_ENABLED === 'true',
      ttl: parseInt(process.env.CACHE_TTL || '0', 10)
    },

    // Cleanup
    cleanup: {
      enabled: process.env.TEST_CLEANUP_ENABLED === 'true',
      onFailure: process.env.TEST_CLEANUP_ON_FAILURE === 'true'
    },

    // Network
    network: {
      timeout: parseInt(process.env.NETWORK_TIMEOUT || '5000', 10),
      retryCount: parseInt(process.env.NETWORK_RETRY_COUNT || '0', 10)
    },

    // Memory
    memory: {
      limit: parseInt(process.env.TEST_MEMORY_LIMIT || '512', 10)
    },

    // Coverage
    coverage: {
      dir: process.env.COVERAGE_DIR || './coverage',
      reporters: (process.env.COVERAGE_REPORTER || 'lcov,html,text-summary').split(',')
    }
  };
}
// Export type for the test configuration
export type TestConfig = ReturnType<typeof getTestConfig>;
/**
* Helper to check if we're in test mode
*/
export function isTestMode(): boolean {
  return process.env.NODE_ENV === 'test' || process.env.TEST_ENVIRONMENT === 'true';
}
/**
* Helper to get timeout for specific test type
*/
export function getTestTimeout(type: 'unit' | 'integration' | 'e2e' | 'global' = 'unit'): number {
  const config = getTestConfig();
  return config.execution.timeouts[type];
}
/**
* Helper to check if a feature is enabled
*/
export function isFeatureEnabled(feature: keyof TestConfig['features']): boolean {
  const config = getTestConfig();
  return config.features[feature];
}
/**
* Reset environment to defaults (useful for test isolation)
*/
export function resetTestEnvironment(): void {
  // Clear all test-specific environment variables
  const testKeys = Object.keys(process.env).filter(key =>
    key.startsWith('TEST_') ||
    key.startsWith('FEATURE_') ||
    key.startsWith('MSW_') ||
    key.startsWith('PERF_')
  );

  testKeys.forEach(key => {
    delete process.env[key];
  });

  // Reload defaults
  loadTestEnvironment();
}