automaker/libs/utils/tests/atomic-writer.test.ts
gsxdsm 0196911d59 Bug fixes and stability improvements (#815)
* fix(copilot): correct tool.execution_complete event handling

The CopilotProvider was using incorrect event type and data structure
for tool execution completion events from the @github/copilot-sdk,
causing tool call outputs to be empty.

Changes:
- Update event type from 'tool.execution_end' to 'tool.execution_complete'
- Fix data structure to use nested result.content instead of flat result
- Fix error structure to use error.message instead of flat error
- Add success field to match SDK event structure
- Add tests for empty and missing result handling

This aligns with the official @github/copilot-sdk v0.1.16 types
defined in session-events.d.ts.
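The corrected handler shape can be sketched as follows. This is a minimal sketch only: the event and field names are inferred from the bullet points above and are assumptions, not the actual @github/copilot-sdk types.

```typescript
// Hypothetical event shape inferred from the commit description above;
// not the actual @github/copilot-sdk session-events.d.ts definitions.
interface ToolExecutionCompleteEvent {
  type: 'tool.execution_complete';
  data: {
    toolCallId: string;
    success: boolean;
    result?: { content?: string }; // nested result.content, not a flat result
    error?: { message: string; code?: string }; // nested error.message, not a flat error
  };
}

// Illustrative handler: maps a completion event to the tool-call output string.
function toToolOutput(event: ToolExecutionCompleteEvent): string {
  const { success, result, error } = event.data;
  if (!success) {
    // Read the nested error.message rather than treating error as a string.
    return `[ERROR] ${error?.message ?? 'Unknown error'}`;
  }
  // Read the nested result.content; an empty or missing result yields ''.
  return result?.content ?? '';
}
```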

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>

* test(copilot): add edge case test for error with code field

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>

* refactor(copilot): improve error handling and code quality

Code review improvements:
- Extract magic string '[ERROR]' to TOOL_ERROR_PREFIX constant
- Add null-safe error handling with direct error variable assignment
- Include error codes in error messages for better debugging
- Add JSDoc documentation for tool.execution_complete handler
- Update tests to verify error codes are displayed
- Add missing tool_use_id assertion in error test

These changes improve:
- Code maintainability (no magic strings)
- Debugging experience (error codes now visible)
- Type safety (explicit null checks)
- Test coverage (verify error code formatting)
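A minimal sketch of the error-formatting refactor described above. The `TOOL_ERROR_PREFIX` constant comes from the bullet points; the helper name and message layout are assumptions, not the actual source.

```typescript
// Extracted constant replacing the '[ERROR]' magic string (per the refactor above).
const TOOL_ERROR_PREFIX = '[ERROR]';

// Hypothetical helper: null-safe error formatting that surfaces the error
// code when present, for easier debugging.
function formatToolError(error: { message: string; code?: string } | null | undefined): string {
  if (!error) {
    // Explicit null check rather than assuming the error object exists.
    return `${TOOL_ERROR_PREFIX} Unknown error`;
  }
  return error.code
    ? `${TOOL_ERROR_PREFIX} ${error.message} (${error.code})`
    : `${TOOL_ERROR_PREFIX} ${error.message}`;
}
```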

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>

* Changes from fix/bug-fixes-1-0

* fix: Handle detached HEAD state in worktree discovery and recovery

* fix: Remove unused isDevServerStarting prop and md: breakpoint classes

* fix: Add missing dependency and sanitize persisted cache data

* feat: Ensure NODE_ENV is set to test in vitest configs

* feat: Configure Playwright to run only E2E tests

* fix: Improve PR tracking and dev server lifecycle management

* feat: Add settings-based defaults for planning mode, model config, and custom providers. Fixes #816

* feat: Add worktree and branch selector to graph view

* fix: Add timeout and error handling for worktree HEAD ref resolution

* fix: use absolute icon path and place icon outside asar on Linux

The hicolor icon theme index only lists sizes up to 512x512, so an icon
installed only at 1024x1024 is invisible to GNOME/KDE's theme resolver,
causing both the app launcher and taskbar to show a generic icon.
Additionally, BrowserWindow.icon cannot be read by the window manager
when the file is inside app.asar.

- extraResources: copy logo_larger.png to resources/ (outside asar) so
  it lands at /opt/Automaker/resources/logo_larger.png on install
- linux.desktop.Icon: set to the absolute resources path, bypassing the
  hicolor theme lookup and its size constraints entirely
- icon-manager.ts: on Linux production use process.resourcesPath so
  BrowserWindow receives a real filesystem path the WM can read directly
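The icon-manager change can be sketched as a small path-resolution helper. This is an illustration under assumptions: the filename `logo_larger.png` and the resources layout are taken from the commit message, but the helper name and dev-mode path are hypothetical.

```typescript
import path from 'path';

// Sketch of the Linux icon resolution described above. In production the icon
// is copied outside app.asar via extraResources, so the window manager can
// read it from a real filesystem path (e.g. /opt/Automaker/resources/).
function resolveLinuxIcon(isPackaged: boolean, resourcesPath: string, devRoot: string): string {
  if (isPackaged) {
    // process.resourcesPath in a packaged build points outside app.asar.
    return path.join(resourcesPath, 'logo_larger.png');
  }
  // Hypothetical dev-mode location; the real project layout may differ.
  return path.join(devRoot, 'assets', 'logo_larger.png');
}
```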

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: use linux.desktop.entry for custom desktop Icon field

electron-builder v26 rejects arbitrary keys in linux.desktop — the
correct schema wraps custom .desktop overrides inside desktop.entry.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: set desktop name on Linux so taskbar uses the correct app icon

Without app.setDesktopName(), the window manager cannot associate the
running Electron process with automaker.desktop. GNOME/KDE fall back to
_NET_WM_ICON which defaults to Electron's own bundled icon.

Calling app.setDesktopName('automaker.desktop') before any window is
created sets the _GTK_APPLICATION_ID hint and XDG app_id so the WM
picks up the desktop entry's Icon for the taskbar.
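The fix above can be sketched as a testable helper. In the real main process this would call Electron's `app.setDesktopName` directly before any `BrowserWindow` is created; the helper shape here is an assumption for illustration.

```typescript
// Stand-in for the slice of Electron's app API used by the fix.
interface DesktopApp {
  setDesktopName(name: string): void;
}

// Hypothetical helper: on Linux, associate the running process with
// automaker.desktop (via the _GTK_APPLICATION_ID hint / XDG app_id) so the
// window manager uses the desktop entry's Icon instead of _NET_WM_ICON.
function configureLinuxDesktopName(platform: string, app: DesktopApp): boolean {
  if (platform !== 'linux') return false;
  app.setDesktopName('automaker.desktop');
  return true;
}
```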

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* Fix: memory and context views mobile friendly (#818)

* Changes from fix/memory-and-context-mobile-friendly

* fix: Improve file extension detection and add path traversal protection

* refactor: Extract file extension utilities and add path traversal guards

Code review improvements:
- Extract isMarkdownFilename and isImageFilename to shared image-utils.ts
- Remove duplicated code from context-view.tsx and memory-view.tsx
- Add path traversal guard for context fixture utilities (matching memory)
- Add 7 new tests for context fixture path traversal protection
- Total 61 tests pass
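The extracted helpers can be sketched as below. These are hypothetical versions inferred from the bullet points; the real image-utils.ts implementations may differ.

```typescript
import path from 'path';

// Hypothetical shared helper (per the refactor above): case-insensitive
// markdown extension check.
function isMarkdownFilename(name: string): boolean {
  return /\.(md|markdown)$/i.test(name);
}

// Hypothetical path traversal guard for fixture utilities: resolve the
// candidate against the base directory and require it to stay inside.
function isWithinDir(baseDir: string, candidate: string): boolean {
  const resolvedBase = path.resolve(baseDir);
  const resolved = path.resolve(resolvedBase, candidate);
  return resolved === resolvedBase || resolved.startsWith(resolvedBase + path.sep);
}
```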

Addresses code review feedback from PR #813

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* test: Add e2e tests for profiles crud and board background persistence

* Update apps/ui/playwright.config.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* fix: Add robust test navigation handling and file filtering

* fix: Format NODE_OPTIONS configuration on single line

* test: Update profiles and board background persistence tests

* test: Replace iPhone 13 Pro with Pixel 5 for mobile test consistency

* Update apps/ui/src/components/views/context-view.tsx

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* chore: Remove test project directory

* feat: Filter context files by type and improve mobile menu visibility

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* fix: Improve test reliability and localhost handling

* chore: Use explicit TEST_USE_EXTERNAL_BACKEND env var for server cleanup

* feat: Add E2E/CI mock mode for provider factory and auth verification

* feat: Add remoteBranch parameter to pull and rebase operations

* chore: Enhance E2E testing setup with worker isolation and auth state management

- Updated .gitignore to include worker-specific test fixtures.
- Modified e2e-tests.yml to implement test sharding for improved CI performance.
- Refactored global setup to authenticate once and save session state for reuse across tests.
- Introduced worker-isolated fixture paths to prevent conflicts during parallel test execution.
- Improved test navigation and loading handling for better reliability.
- Updated various test files to utilize new auth state management and fixture paths.

* fix: Update Playwright configuration and improve test reliability

- Increased the number of workers in Playwright configuration for better parallelism in CI environments.
- Enhanced the board background persistence test to ensure dropdown stability by waiting for the list to populate before interaction, improving test reliability.

* chore: Simplify E2E test configuration and enhance mock implementations

- Updated e2e-tests.yml to run tests in a single shard for streamlined CI execution.
- Enhanced unit tests for worktree list handling by introducing a mock for execGitCommand, improving test reliability and coverage.
- Refactored setup functions to better manage command mocks for git operations in tests.
- Improved error handling in mkdirSafe function to account for undefined stats in certain environments.

* refactor: Improve test configurations and enhance error handling

- Updated Playwright configuration to clear VITE_SERVER_URL, ensuring the frontend uses the Vite proxy and preventing cookie domain mismatches.
- Enhanced MergeRebaseDialog logic to normalize selectedBranch for better handling of various ref formats.
- Improved global setup with a more robust backend health check, throwing an error if the backend is not healthy after retries.
- Refactored project creation tests to handle file existence checks more reliably.
- Added error handling for missing E2E source fixtures to guide setup process.
- Enhanced memory navigation to handle sandbox dialog visibility more effectively.

* refactor: Enhance Git command execution and improve test configurations

- Updated Git command execution to merge environment paths correctly, ensuring proper command execution context.
- Refactored the Git initialization process to handle errors more gracefully and ensure user configuration is set before creating the initial commit.
- Improved test configurations by updating Playwright test identifiers for better clarity and consistency across different project states.
- Enhanced cleanup functions in tests to handle directory removal more robustly, preventing errors during test execution.

* fix: Resolve React hooks errors from duplicate instances in dependency tree

* style: Format alias configuration for improved readability

---------

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Co-authored-by: DhanushSantosh <dhanushsantoshs05@gmail.com>
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2026-02-27 17:03:29 -08:00

721 lines
27 KiB
TypeScript

import { describe, it, expect, beforeEach, afterEach, vi, type MockInstance } from 'vitest';
import fs from 'fs/promises';
import path from 'path';
import os from 'os';
import { secureFs } from '@automaker/platform';
import {
  atomicWriteJson,
  readJsonFile,
  updateJsonAtomically,
  readJsonWithRecovery,
} from '../src/atomic-writer';

// Mock secureFs
vi.mock('@automaker/platform', () => ({
  secureFs: {
    writeFile: vi.fn(),
    readFile: vi.fn(),
    rename: vi.fn(),
    unlink: vi.fn(),
    readdir: vi.fn(),
    copyFile: vi.fn(),
    access: vi.fn(),
    lstat: vi.fn(),
    mkdir: vi.fn(),
  },
}));

// Mock logger to suppress output during tests
vi.mock('../src/logger.js', () => ({
  createLogger: () => ({
    info: vi.fn(),
    warn: vi.fn(),
    error: vi.fn(),
    debug: vi.fn(),
  }),
}));

describe('atomic-writer.ts', () => {
  let tempDir: string;

  beforeEach(async () => {
    // Create a temporary directory for integration tests
    tempDir = await fs.mkdtemp(path.join(os.tmpdir(), 'atomic-writer-test-'));
    vi.clearAllMocks();
    // Default: parent directory exists (atomicWriteJson always ensures parent dir)
    (secureFs.lstat as unknown as MockInstance).mockResolvedValue({
      isDirectory: () => true,
      isSymbolicLink: () => false,
    });
  });

  afterEach(async () => {
    // Clean up temporary directory
    try {
      await fs.rm(tempDir, { recursive: true, force: true });
    } catch {
      // Ignore cleanup errors
    }
  });

  describe('atomicWriteJson', () => {
    it('should write JSON data atomically', async () => {
      const filePath = path.join(tempDir, 'test.json');
      const data = { key: 'value', number: 42 };
      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      await atomicWriteJson(filePath, data);
      // Verify writeFile was called with temp file path and JSON content
      // Format: .tmp.{timestamp}.{random-hex}
      expect(secureFs.writeFile).toHaveBeenCalledTimes(1);
      const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0];
      expect(writeCall[0]).toMatch(/\.tmp\.\d+\.[a-f0-9]+$/);
      expect(writeCall[1]).toBe(JSON.stringify(data, null, 2));
      expect(writeCall[2]).toBe('utf-8');
      // Verify rename was called with temp -> target
      expect(secureFs.rename).toHaveBeenCalledTimes(1);
      const renameCall = (secureFs.rename as unknown as MockInstance).mock.calls[0];
      expect(renameCall[0]).toMatch(/\.tmp\.\d+\.[a-f0-9]+$/);
      expect(renameCall[1]).toBe(path.resolve(filePath));
    });

    it('should use custom indentation', async () => {
      const filePath = path.join(tempDir, 'test.json');
      const data = { key: 'value' };
      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      await atomicWriteJson(filePath, data, { indent: 4 });
      const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0];
      expect(writeCall[1]).toBe(JSON.stringify(data, null, 4));
    });

    it('should clean up temp file on write failure', async () => {
      const filePath = path.join(tempDir, 'test.json');
      const data = { key: 'value' };
      const writeError = new Error('Write failed');
      (secureFs.writeFile as unknown as MockInstance).mockRejectedValue(writeError);
      (secureFs.unlink as unknown as MockInstance).mockResolvedValue(undefined);
      await expect(atomicWriteJson(filePath, data)).rejects.toThrow('Write failed');
      expect(secureFs.unlink).toHaveBeenCalledTimes(1);
    });

    it('should clean up temp file on rename failure', async () => {
      const filePath = path.join(tempDir, 'test.json');
      const data = { key: 'value' };
      const renameError = new Error('Rename failed');
      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.rename as unknown as MockInstance).mockRejectedValue(renameError);
      (secureFs.unlink as unknown as MockInstance).mockResolvedValue(undefined);
      await expect(atomicWriteJson(filePath, data)).rejects.toThrow('Rename failed');
      expect(secureFs.unlink).toHaveBeenCalledTimes(1);
    });

    it('should ignore cleanup errors', async () => {
      const filePath = path.join(tempDir, 'test.json');
      const data = { key: 'value' };
      const writeError = new Error('Write failed');
      const unlinkError = new Error('Unlink failed');
      (secureFs.writeFile as unknown as MockInstance).mockRejectedValue(writeError);
      (secureFs.unlink as unknown as MockInstance).mockRejectedValue(unlinkError);
      // Should still throw the original error, not the cleanup error
      await expect(atomicWriteJson(filePath, data)).rejects.toThrow('Write failed');
    });

    it('should resolve relative paths', async () => {
      const relativePath = 'test.json';
      const data = { key: 'value' };
      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      await atomicWriteJson(relativePath, data);
      const renameCall = (secureFs.rename as unknown as MockInstance).mock.calls[0];
      expect(renameCall[1]).toBe(path.resolve(relativePath));
    });

    it('should handle arrays as data', async () => {
      const filePath = path.join(tempDir, 'array.json');
      const data = [1, 2, 3, { nested: 'value' }];
      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      await atomicWriteJson(filePath, data);
      const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0];
      expect(writeCall[1]).toBe(JSON.stringify(data, null, 2));
    });

    it('should handle null and primitive values', async () => {
      const filePath = path.join(tempDir, 'primitive.json');
      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      await atomicWriteJson(filePath, null);
      expect((secureFs.writeFile as unknown as MockInstance).mock.calls[0][1]).toBe('null');
      await atomicWriteJson(filePath, 'string');
      expect((secureFs.writeFile as unknown as MockInstance).mock.calls[1][1]).toBe('"string"');
      await atomicWriteJson(filePath, 123);
      expect((secureFs.writeFile as unknown as MockInstance).mock.calls[2][1]).toBe('123');
    });

    it('should always create parent directories before writing', async () => {
      const filePath = path.join(tempDir, 'nested', 'deep', 'test.json');
      const data = { key: 'value' };
      // Mock lstat to throw ENOENT (directory doesn't exist)
      const enoentError = new Error('Not found') as NodeJS.ErrnoException;
      enoentError.code = 'ENOENT';
      (secureFs.lstat as unknown as MockInstance).mockRejectedValue(enoentError);
      (secureFs.mkdir as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      await atomicWriteJson(filePath, data);
      // Should have called mkdir to create parent directories
      expect(secureFs.mkdir).toHaveBeenCalledWith(
        path.resolve(path.join(tempDir, 'nested', 'deep')),
        { recursive: true }
      );
      expect(secureFs.writeFile).toHaveBeenCalled();
    });
  });

  describe('readJsonFile', () => {
    it('should read and parse JSON file', async () => {
      const filePath = path.join(tempDir, 'read.json');
      const data = { key: 'value', count: 5 };
      (secureFs.readFile as unknown as MockInstance).mockResolvedValue(JSON.stringify(data));
      const result = await readJsonFile(filePath, {});
      expect(result).toEqual(data);
      expect(secureFs.readFile).toHaveBeenCalledWith(path.resolve(filePath), 'utf-8');
    });

    it('should return default value when file does not exist', async () => {
      const filePath = path.join(tempDir, 'nonexistent.json');
      const defaultValue = { default: true };
      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
      enoentError.code = 'ENOENT';
      (secureFs.readFile as unknown as MockInstance).mockRejectedValue(enoentError);
      const result = await readJsonFile(filePath, defaultValue);
      expect(result).toEqual(defaultValue);
    });

    it('should return default value when JSON is invalid', async () => {
      const filePath = path.join(tempDir, 'invalid.json');
      const defaultValue = { default: true };
      (secureFs.readFile as unknown as MockInstance).mockResolvedValue('not valid json');
      const result = await readJsonFile(filePath, defaultValue);
      expect(result).toEqual(defaultValue);
    });

    it('should return default value for other read errors', async () => {
      const filePath = path.join(tempDir, 'error.json');
      const defaultValue = { default: true };
      const accessError = new Error('Access denied') as NodeJS.ErrnoException;
      accessError.code = 'EACCES';
      (secureFs.readFile as unknown as MockInstance).mockRejectedValue(accessError);
      const result = await readJsonFile(filePath, defaultValue);
      expect(result).toEqual(defaultValue);
    });

    it('should handle empty object as default', async () => {
      const filePath = path.join(tempDir, 'nonexistent.json');
      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
      enoentError.code = 'ENOENT';
      (secureFs.readFile as unknown as MockInstance).mockRejectedValue(enoentError);
      const result = await readJsonFile<Record<string, unknown>>(filePath, {});
      expect(result).toEqual({});
    });

    it('should handle array as default', async () => {
      const filePath = path.join(tempDir, 'nonexistent.json');
      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
      enoentError.code = 'ENOENT';
      (secureFs.readFile as unknown as MockInstance).mockRejectedValue(enoentError);
      const result = await readJsonFile<string[]>(filePath, []);
      expect(result).toEqual([]);
    });

    it('should parse nested objects correctly', async () => {
      const filePath = path.join(tempDir, 'nested.json');
      const data = {
        level1: {
          level2: {
            value: 'deep',
            array: [1, 2, { nested: true }],
          },
        },
      };
      (secureFs.readFile as unknown as MockInstance).mockResolvedValue(JSON.stringify(data));
      const result = await readJsonFile(filePath, {});
      expect(result).toEqual(data);
    });
  });

  describe('updateJsonAtomically', () => {
    it('should read, update, and write file atomically', async () => {
      const filePath = path.join(tempDir, 'update.json');
      const initialData = { count: 5 };
      const defaultValue = { count: 0 };
      (secureFs.readFile as unknown as MockInstance).mockResolvedValue(JSON.stringify(initialData));
      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      await updateJsonAtomically(filePath, defaultValue, (data) => ({
        ...data,
        count: data.count + 1,
      }));
      // Verify the write was called with updated data
      const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0];
      const writtenData = JSON.parse(writeCall[1]);
      expect(writtenData.count).toBe(6);
    });

    it('should use default value when file does not exist', async () => {
      const filePath = path.join(tempDir, 'new.json');
      const defaultValue = { count: 0 };
      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
      enoentError.code = 'ENOENT';
      (secureFs.readFile as unknown as MockInstance).mockRejectedValue(enoentError);
      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      await updateJsonAtomically(filePath, defaultValue, (data) => ({
        ...data,
        count: data.count + 1,
      }));
      const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0];
      const writtenData = JSON.parse(writeCall[1]);
      expect(writtenData.count).toBe(1);
    });

    it('should support async updater function', async () => {
      const filePath = path.join(tempDir, 'async.json');
      const initialData = { value: 'initial' };
      (secureFs.readFile as unknown as MockInstance).mockResolvedValue(JSON.stringify(initialData));
      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      await updateJsonAtomically(filePath, {}, async (data) => {
        await new Promise((resolve) => setTimeout(resolve, 10));
        return { ...data, value: 'updated' };
      });
      const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0];
      const writtenData = JSON.parse(writeCall[1]);
      expect(writtenData.value).toBe('updated');
    });

    it('should pass through options to atomicWriteJson', async () => {
      const filePath = path.join(tempDir, 'options.json');
      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
      enoentError.code = 'ENOENT';
      (secureFs.readFile as unknown as MockInstance).mockRejectedValue(enoentError);
      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      await updateJsonAtomically(filePath, { key: 'value' }, (d) => d, { indent: 4 });
      const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0];
      expect(writeCall[1]).toBe(JSON.stringify({ key: 'value' }, null, 4));
    });
  });

  describe('readJsonWithRecovery', () => {
    it('should return main file data when available', async () => {
      const filePath = path.join(tempDir, 'main.json');
      const data = { main: true };
      (secureFs.readFile as unknown as MockInstance).mockResolvedValue(JSON.stringify(data));
      const result = await readJsonWithRecovery(filePath, {});
      expect(result.data).toEqual(data);
      expect(result.recovered).toBe(false);
      expect(result.source).toBe('main');
      expect(result.error).toBeUndefined();
    });

    it('should recover from temp file when main file is missing', async () => {
      const filePath = path.join(tempDir, 'data.json');
      const tempData = { fromTemp: true };
      const fileName = path.basename(filePath);
      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
      enoentError.code = 'ENOENT';
      (secureFs.readFile as unknown as MockInstance)
        .mockRejectedValueOnce(enoentError) // Main file
        .mockResolvedValueOnce(JSON.stringify(tempData)); // Temp file
      (secureFs.readdir as unknown as MockInstance).mockResolvedValue([
        `${fileName}.tmp.1234567890`,
        'other-file.json',
      ]);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      const result = await readJsonWithRecovery(filePath, {});
      expect(result.data).toEqual(tempData);
      expect(result.recovered).toBe(true);
      expect(result.source).toBe('temp');
      expect(result.error).toBe('File does not exist');
    });

    it('should recover from backup file when main and temp are unavailable', async () => {
      const filePath = path.join(tempDir, 'data.json');
      const backupData = { fromBackup: true };
      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
      enoentError.code = 'ENOENT';
      (secureFs.readFile as unknown as MockInstance)
        .mockRejectedValueOnce(enoentError) // Main file
        .mockRejectedValueOnce(enoentError) // backup1
        .mockResolvedValueOnce(JSON.stringify(backupData)); // backup2
      (secureFs.readdir as unknown as MockInstance).mockResolvedValue([]); // No temp files
      (secureFs.copyFile as unknown as MockInstance).mockResolvedValue(undefined);
      const result = await readJsonWithRecovery(filePath, {});
      expect(result.data).toEqual(backupData);
      expect(result.recovered).toBe(true);
      expect(result.source).toBe('backup');
    });

    it('should return default when all recovery attempts fail', async () => {
      const filePath = path.join(tempDir, 'data.json');
      const defaultValue = { default: true };
      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
      enoentError.code = 'ENOENT';
      (secureFs.readFile as unknown as MockInstance).mockRejectedValue(enoentError);
      (secureFs.readdir as unknown as MockInstance).mockResolvedValue([]);
      const result = await readJsonWithRecovery(filePath, defaultValue);
      expect(result.data).toEqual(defaultValue);
      expect(result.recovered).toBe(true);
      expect(result.source).toBe('default');
      expect(result.error).toBe('File does not exist');
    });

    it('should try multiple temp files in order', async () => {
      const filePath = path.join(tempDir, 'data.json');
      const fileName = path.basename(filePath);
      const validTempData = { valid: true };
      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
      enoentError.code = 'ENOENT';
      (secureFs.readFile as unknown as MockInstance)
        .mockRejectedValueOnce(enoentError) // Main file
        .mockResolvedValueOnce('invalid json') // First temp file (invalid)
        .mockResolvedValueOnce(JSON.stringify(validTempData)); // Second temp file
      (secureFs.readdir as unknown as MockInstance).mockResolvedValue([
        `${fileName}.tmp.9999999999`, // Most recent
        `${fileName}.tmp.1111111111`, // Older
      ]);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      const result = await readJsonWithRecovery(filePath, {});
      expect(result.data).toEqual(validTempData);
      expect(result.source).toBe('temp');
    });

    it('should try multiple backup files in order', async () => {
      const filePath = path.join(tempDir, 'data.json');
      const backupData = { backup2: true };
      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
      enoentError.code = 'ENOENT';
      (secureFs.readFile as unknown as MockInstance)
        .mockRejectedValueOnce(enoentError) // Main file
        .mockRejectedValueOnce(enoentError) // .bak1
        .mockResolvedValueOnce(JSON.stringify(backupData)); // .bak2
      (secureFs.readdir as unknown as MockInstance).mockResolvedValue([]);
      (secureFs.copyFile as unknown as MockInstance).mockResolvedValue(undefined);
      const result = await readJsonWithRecovery(filePath, {});
      expect(result.data).toEqual(backupData);
      expect(result.source).toBe('backup');
      // Verify it tried .bak1 first
      expect(secureFs.readFile).toHaveBeenNthCalledWith(
        2,
        `${path.resolve(filePath)}.bak1`,
        'utf-8'
      );
    });

    it('should respect maxBackups option', async () => {
      const filePath = path.join(tempDir, 'data.json');
      const defaultValue = { default: true };
      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
      enoentError.code = 'ENOENT';
      (secureFs.readFile as unknown as MockInstance).mockRejectedValue(enoentError);
      (secureFs.readdir as unknown as MockInstance).mockResolvedValue([]);
      const result = await readJsonWithRecovery(filePath, defaultValue, { maxBackups: 1 });
      expect(result.source).toBe('default');
      // Should only have tried main + 1 backup
      expect(secureFs.readFile).toHaveBeenCalledTimes(2);
    });

    it('should not auto-restore when autoRestore is false', async () => {
      const filePath = path.join(tempDir, 'data.json');
      const fileName = path.basename(filePath);
      const tempData = { fromTemp: true };
      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
      enoentError.code = 'ENOENT';
      (secureFs.readFile as unknown as MockInstance)
        .mockRejectedValueOnce(enoentError)
        .mockResolvedValueOnce(JSON.stringify(tempData));
      (secureFs.readdir as unknown as MockInstance).mockResolvedValue([`${fileName}.tmp.123`]);
      const result = await readJsonWithRecovery(filePath, {}, { autoRestore: false });
      expect(result.data).toEqual(tempData);
      expect(secureFs.rename).not.toHaveBeenCalled();
      expect(secureFs.copyFile).not.toHaveBeenCalled();
    });

    it('should handle directory read errors gracefully', async () => {
      const filePath = path.join(tempDir, 'data.json');
      const backupData = { backup: true };
      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
      enoentError.code = 'ENOENT';
      (secureFs.readFile as unknown as MockInstance)
        .mockRejectedValueOnce(enoentError) // Main file
        .mockResolvedValueOnce(JSON.stringify(backupData)); // backup1
      (secureFs.readdir as unknown as MockInstance).mockRejectedValue(new Error('Dir read failed'));
      (secureFs.copyFile as unknown as MockInstance).mockResolvedValue(undefined);
      const result = await readJsonWithRecovery(filePath, {});
      // Should skip temp files and go to backups
      expect(result.data).toEqual(backupData);
      expect(result.source).toBe('backup');
    });

    it('should handle corrupted main file with valid error message', async () => {
      const filePath = path.join(tempDir, 'corrupted.json');
      const defaultValue = { default: true };
      // Main file resolves but contains invalid JSON, forcing a parse failure
      (secureFs.readFile as unknown as MockInstance).mockResolvedValueOnce('{{invalid json');
      (secureFs.readdir as unknown as MockInstance).mockResolvedValue([]);
      const result = await readJsonWithRecovery(filePath, defaultValue);
      expect(result.recovered).toBe(true);
      expect(result.error).toContain('Failed to parse');
    });

    it('should handle restore failures gracefully', async () => {
      const filePath = path.join(tempDir, 'data.json');
      const fileName = path.basename(filePath);
      const tempData = { fromTemp: true };
      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
      enoentError.code = 'ENOENT';
      (secureFs.readFile as unknown as MockInstance)
        .mockRejectedValueOnce(enoentError)
        .mockResolvedValueOnce(JSON.stringify(tempData));
      (secureFs.readdir as unknown as MockInstance).mockResolvedValue([`${fileName}.tmp.123`]);
      (secureFs.rename as unknown as MockInstance).mockRejectedValue(new Error('Restore failed'));
      const result = await readJsonWithRecovery(filePath, {});
      // Should still return data even if restore failed
      expect(result.data).toEqual(tempData);
      expect(result.source).toBe('temp');
    });
  });

  describe('Edge cases', () => {
    it('should handle empty file path gracefully', async () => {
      (secureFs.readFile as unknown as MockInstance).mockRejectedValue(new Error('Invalid path'));
      const result = await readJsonFile('', { default: true });
      expect(result).toEqual({ default: true });
    });

    it('should handle special characters in file path', async () => {
      const filePath = path.join(tempDir, 'file with spaces & special!.json');
      const data = { special: 'chars' };
      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      await atomicWriteJson(filePath, data);
      expect(secureFs.writeFile).toHaveBeenCalled();
    });

    it('should handle very large objects', async () => {
      const filePath = path.join(tempDir, 'large.json');
      const largeArray = Array.from({ length: 10000 }, (_, i) => ({
        id: i,
        data: `item-${i}`,
      }));
      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      await atomicWriteJson(filePath, largeArray);
      const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0];
      expect(JSON.parse(writeCall[1])).toEqual(largeArray);
    });

    it('should handle unicode content', async () => {
      const filePath = path.join(tempDir, 'unicode.json');
      const data = { emoji: '🎉', japanese: 'こんにちは', chinese: '你好' };
      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      await atomicWriteJson(filePath, data);
      const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0];
      expect(JSON.parse(writeCall[1])).toEqual(data);
    });

    it('should handle circular reference error in JSON', async () => {
      const filePath = path.join(tempDir, 'circular.json');
      const circular: Record<string, unknown> = { key: 'value' };
      circular.self = circular;
      await expect(atomicWriteJson(filePath, circular)).rejects.toThrow();
    });
  });

  describe('Type safety', () => {
    interface TestConfig {
      version: number;
      settings: {
        enabled: boolean;
        name: string;
      };
    }

    it('should preserve types in readJsonFile', async () => {
      const filePath = path.join(tempDir, 'config.json');
      const expected: TestConfig = {
        version: 1,
        settings: { enabled: true, name: 'test' },
      };
      (secureFs.readFile as unknown as MockInstance).mockResolvedValue(JSON.stringify(expected));
      const result = await readJsonFile<TestConfig>(filePath, {
        version: 0,
        settings: { enabled: false, name: '' },
      });
      expect(result.version).toBe(1);
      expect(result.settings.enabled).toBe(true);
      expect(result.settings.name).toBe('test');
    });

    it('should preserve types in updateJsonAtomically', async () => {
      const filePath = path.join(tempDir, 'counter.json');
      interface Counter {
        count: number;
      }
      (secureFs.readFile as unknown as MockInstance).mockResolvedValue(
        JSON.stringify({ count: 5 })
      );
      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
      await updateJsonAtomically<Counter>(filePath, { count: 0 }, (data) => ({
        count: data.count + 1,
      }));
      const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0];
      const writtenData: Counter = JSON.parse(writeCall[1]);
      expect(writtenData.count).toBe(6);
    });
  });
});