mirror of
https://github.com/AutoMaker-Org/automaker.git
synced 2026-03-17 10:03:08 +00:00
364 lines
12 KiB
TypeScript
/**
 * Atomic file writing utilities for JSON data
 *
 * Provides atomic write operations using the temp-file + rename pattern,
 * ensuring data integrity even during crashes or power failures.
 */

import { secureFs } from '@automaker/platform';
import path from 'path';
import crypto from 'crypto';
import { createLogger } from './logger.js';
import { mkdirSafe } from './fs-utils.js';

const logger = createLogger('AtomicWriter');

/** Default maximum number of backup files to keep for crash recovery */
export const DEFAULT_BACKUP_COUNT = 3;

/**
 * Options for atomic write operations
 */
export interface AtomicWriteOptions {
  /** Number of spaces for JSON indentation (default: 2) */
  indent?: number;
  /** Create parent directories if they don't exist (default: false) */
  createDirs?: boolean;
  /** Number of backup files to keep (0 = no backups, default: 0). When > 0, rotates .bak1, .bak2, etc. */
  backupCount?: number;
}

/**
 * Rotate backup files (.bak1 -> .bak2 -> .bak3, oldest is deleted)
 * and create a new backup from the current file.
 *
 * @param filePath - Absolute path to the file being backed up
 * @param maxBackups - Maximum number of backup files to keep
 */
export async function rotateBackups(
  filePath: string,
  maxBackups: number = DEFAULT_BACKUP_COUNT
): Promise<void> {
  // Check if the source file exists before attempting backup
  try {
    await secureFs.access(filePath);
  } catch {
    // No existing file to back up
    return;
  }

  // Rotate existing backups: .bak3 is deleted, .bak2 -> .bak3, .bak1 -> .bak2
  for (let i = maxBackups; i >= 1; i--) {
    const currentBackup = `${filePath}.bak${i}`;
    const nextBackup = `${filePath}.bak${i + 1}`;

    try {
      if (i === maxBackups) {
        // Delete the oldest backup
        await secureFs.unlink(currentBackup);
      } else {
        // Rename the current backup to the next slot
        await secureFs.rename(currentBackup, nextBackup);
      }
    } catch {
      // Ignore errors - the backup file may not exist
    }
  }

  // Copy the current file to .bak1
  try {
    await secureFs.copyFile(filePath, `${filePath}.bak1`);
  } catch (error) {
    logger.warn(`Failed to create backup of ${filePath}:`, error);
    // Continue with the write even if the backup fails
  }
}

/**
 * Atomically write JSON data to a file.
 *
 * Uses the temp-file + rename pattern for atomicity:
 * 1. Writes data to a temporary file
 * 2. Atomically renames the temp file to the target path
 * 3. Cleans up the temp file on error
 *
 * @param filePath - Absolute path to the target file
 * @param data - Data to serialize as JSON
 * @param options - Optional write options
 * @throws Error if the write fails (the temp file is cleaned up)
 *
 * @example
 * ```typescript
 * await atomicWriteJson('/path/to/config.json', { key: 'value' });
 * await atomicWriteJson('/path/to/data.json', data, { indent: 4, createDirs: true });
 * ```
 */
export async function atomicWriteJson<T>(
  filePath: string,
  data: T,
  options: AtomicWriteOptions = {}
): Promise<void> {
  const { indent = 2, backupCount = 0 } = options;
  const resolvedPath = path.resolve(filePath);
  // Use a timestamp + random suffix to ensure uniqueness even for concurrent writes
  const uniqueSuffix = `${Date.now()}.${crypto.randomBytes(4).toString('hex')}`;
  const tempPath = `${resolvedPath}.tmp.${uniqueSuffix}`;

  // Always ensure parent directories exist before writing the temp file
  const dirPath = path.dirname(resolvedPath);
  await mkdirSafe(dirPath);

  const content = JSON.stringify(data, null, indent);

  try {
    // Rotate backups before writing (if backups are enabled)
    if (backupCount > 0) {
      await rotateBackups(resolvedPath, backupCount);
    }

    await secureFs.writeFile(tempPath, content, 'utf-8');
    await secureFs.rename(tempPath, resolvedPath);
  } catch (error) {
    // Clean up the temp file if it exists
    try {
      await secureFs.unlink(tempPath);
    } catch {
      // Ignore cleanup errors - best effort
    }
    logger.error(`Failed to atomically write to ${resolvedPath}:`, error);
    throw error;
  }
}

/**
 * Safely read JSON from a file with fallback to a default value.
 *
 * Returns the default value if:
 * - File doesn't exist (ENOENT)
 * - File content is invalid JSON
 *
 * @param filePath - Absolute path to the file
 * @param defaultValue - Value to return if the file doesn't exist or is invalid
 * @returns Parsed JSON data or the default value
 *
 * @example
 * ```typescript
 * const config = await readJsonFile('/path/to/config.json', { version: 1 });
 * ```
 */
export async function readJsonFile<T>(filePath: string, defaultValue: T): Promise<T> {
  const resolvedPath = path.resolve(filePath);

  try {
    const content = (await secureFs.readFile(resolvedPath, 'utf-8')) as string;
    return JSON.parse(content) as T;
  } catch (error) {
    const nodeError = error as NodeJS.ErrnoException;
    if (nodeError.code === 'ENOENT') {
      return defaultValue;
    }
    logger.error(`Error reading JSON from ${resolvedPath}:`, error);
    return defaultValue;
  }
}

/**
 * Atomically update a JSON file by reading, transforming, and writing.
 *
 * Provides a safe read-modify-write pattern:
 * 1. Reads the existing file (or uses the default)
 * 2. Applies the updater function
 * 3. Atomically writes the result
 *
 * @param filePath - Absolute path to the file
 * @param defaultValue - Default value if the file doesn't exist
 * @param updater - Function that transforms the data
 * @param options - Optional write options
 *
 * @example
 * ```typescript
 * await updateJsonAtomically(
 *   '/path/to/counter.json',
 *   { count: 0 },
 *   (data) => ({ ...data, count: data.count + 1 })
 * );
 * ```
 */
export async function updateJsonAtomically<T>(
  filePath: string,
  defaultValue: T,
  updater: (current: T) => T | Promise<T>,
  options: AtomicWriteOptions = {}
): Promise<void> {
  const current = await readJsonFile(filePath, defaultValue);
  const updated = await updater(current);
  await atomicWriteJson(filePath, updated, options);
}

/**
 * Result of a JSON read operation with recovery information
 */
export interface ReadJsonRecoveryResult<T> {
  /** The data that was successfully read */
  data: T;
  /** Whether recovery was needed (the main file was corrupted or missing) */
  recovered: boolean;
  /** Source of the data: 'main', 'backup', 'temp', or 'default' */
  source: 'main' | 'backup' | 'temp' | 'default';
  /** Error message if the main file had an issue */
  error?: string;
}

/**
 * Options for readJsonWithRecovery
 */
export interface ReadJsonRecoveryOptions {
  /** Maximum number of backup files to check (.bak1, .bak2, etc.). Default: 3 */
  maxBackups?: number;
  /** Whether to automatically restore the main file from a backup when corrupted. Default: true */
  autoRestore?: boolean;
}

/**
 * Log a warning if recovery was needed (from a backup or temp file).
 *
 * Use this helper to reduce duplicate logging code when using readJsonWithRecovery.
 *
 * @param result - The result from readJsonWithRecovery
 * @param identifier - A human-readable identifier for the file being recovered (e.g., "Feature abc123")
 * @param loggerInstance - Optional logger instance to use (defaults to the AtomicWriter logger)
 *
 * @example
 * ```typescript
 * const result = await readJsonWithRecovery(featurePath, null);
 * logRecoveryWarning(result, `Feature ${featureId}`);
 * ```
 */
export function logRecoveryWarning<T>(
  result: ReadJsonRecoveryResult<T>,
  identifier: string,
  loggerInstance: { warn: (msg: string, ...args: unknown[]) => void } = logger
): void {
  if (result.recovered && result.source !== 'default') {
    loggerInstance.warn(`${identifier} was recovered from ${result.source}: ${result.error}`);
  }
}

/**
 * Read a JSON file with automatic recovery from backups.
 *
 * This function attempts to read a JSON file with fallback to backups:
 * 1. Try to read the main file
 * 2. If corrupted, check for temp files (.tmp.*) that might have valid data
 * 3. If no valid temp file, try backup files (.bak1, .bak2, .bak3)
 * 4. If all fail, return the default value
 *
 * Optionally restores the main file from a valid backup (autoRestore: true).
 *
 * @param filePath - Absolute path to the file
 * @param defaultValue - Value to return if no valid data is found
 * @param options - Recovery options
 * @returns Result containing the data and recovery information
 *
 * @example
 * ```typescript
 * const result = await readJsonWithRecovery('/path/to/config.json', { version: 1 });
 * if (result.recovered) {
 *   console.log(`Recovered from ${result.source}: ${result.error}`);
 * }
 * const config = result.data;
 * ```
 */
export async function readJsonWithRecovery<T>(
  filePath: string,
  defaultValue: T,
  options: ReadJsonRecoveryOptions = {}
): Promise<ReadJsonRecoveryResult<T>> {
  const { maxBackups = 3, autoRestore = true } = options;
  const resolvedPath = path.resolve(filePath);
  const dirPath = path.dirname(resolvedPath);
  const fileName = path.basename(resolvedPath);

  // Try to read the main file first
  try {
    const content = (await secureFs.readFile(resolvedPath, 'utf-8')) as string;
    const data = JSON.parse(content) as T;
    return { data, recovered: false, source: 'main' };
  } catch (mainError) {
    const nodeError = mainError as NodeJS.ErrnoException;
    const errorMessage =
      nodeError.code === 'ENOENT'
        ? 'File does not exist'
        : `Failed to parse: ${mainError instanceof Error ? mainError.message : String(mainError)}`;

    // The main file is missing or corrupted; check temp files and backups
    logger.warn(`Main file ${resolvedPath} unavailable: ${errorMessage}`);

    // Try to find and recover from temp files first (in case of an interrupted write)
    try {
      const files = (await secureFs.readdir(dirPath)) as string[];
      const tempFiles = files
        .filter((f: string) => f.startsWith(`${fileName}.tmp.`))
        .sort()
        .reverse(); // Most recent first

      for (const tempFile of tempFiles) {
        const tempPath = path.join(dirPath, tempFile);
        try {
          const content = (await secureFs.readFile(tempPath, 'utf-8')) as string;
          const data = JSON.parse(content) as T;

          logger.info(`Recovered data from temp file: ${tempPath}`);

          // Optionally restore the main file from the temp file
          if (autoRestore) {
            try {
              await secureFs.rename(tempPath, resolvedPath);
              logger.info(`Restored main file from temp: ${tempPath}`);
            } catch (restoreError) {
              logger.warn(`Failed to restore main file from temp: ${restoreError}`);
            }
          }

          return { data, recovered: true, source: 'temp', error: errorMessage };
        } catch {
          // This temp file is also corrupted, try the next one
          continue;
        }
      }
    } catch {
      // Could not read the directory, skip the temp file check
    }

    // Try backup files (.bak1, .bak2, .bak3)
    for (let i = 1; i <= maxBackups; i++) {
      const backupPath = `${resolvedPath}.bak${i}`;
      try {
        const content = (await secureFs.readFile(backupPath, 'utf-8')) as string;
        const data = JSON.parse(content) as T;

        logger.info(`Recovered data from backup: ${backupPath}`);

        // Optionally restore the main file from the backup
        if (autoRestore) {
          try {
            await secureFs.copyFile(backupPath, resolvedPath);
            logger.info(`Restored main file from backup: ${backupPath}`);
          } catch (restoreError) {
            logger.warn(`Failed to restore main file from backup: ${restoreError}`);
          }
        }

        return { data, recovered: true, source: 'backup', error: errorMessage };
      } catch {
        // This backup doesn't exist or is corrupted, try the next one
        continue;
      }
    }

    // All recovery attempts failed, return the default
    logger.warn(`All recovery attempts failed for ${resolvedPath}, using default value`);
    return { data: defaultValue, recovered: true, source: 'default', error: errorMessage };
  }
}