mirror of
https://github.com/AutoMaker-Org/automaker.git
synced 2026-03-16 21:53:07 +00:00
Bug fixes and stability improvements (#815)

* fix(copilot): correct tool.execution_complete event handling

  The CopilotProvider was using an incorrect event type and data structure
  for tool execution completion events from the @github/copilot-sdk,
  causing tool call outputs to be empty. Changes:
  - Update event type from 'tool.execution_end' to 'tool.execution_complete'
  - Fix data structure to use nested result.content instead of flat result
  - Fix error structure to use error.message instead of flat error
  - Add success field to match SDK event structure
  - Add tests for empty and missing result handling
  This aligns with the official @github/copilot-sdk v0.1.16 types defined
  in session-events.d.ts.

* test(copilot): add edge case test for error with code field

* refactor(copilot): improve error handling and code quality

  Code review improvements:
  - Extract magic string '[ERROR]' to a TOOL_ERROR_PREFIX constant
  - Add null-safe error handling with direct error variable assignment
  - Include error codes in error messages for better debugging
  - Add JSDoc documentation for the tool.execution_complete handler
  - Update tests to verify error codes are displayed
  - Add missing tool_use_id assertion in the error test

  These changes improve:
  - Code maintainability (no magic strings)
  - Debugging experience (error codes now visible)
  - Type safety (explicit null checks)
  - Test coverage (verify error code formatting)

* Changes from fix/bug-fixes-1-0
* fix: Handle detached HEAD state in worktree discovery and recovery
* fix: Remove unused isDevServerStarting prop and md: breakpoint classes
* fix: Add missing dependency and sanitize persisted cache data
* feat: Ensure NODE_ENV is set to test in vitest configs
* feat: Configure Playwright to run only E2E tests
* fix: Improve PR tracking and dev server lifecycle management
* feat: Add settings-based defaults for planning mode, model config, and custom providers (fixes #816)
* feat: Add worktree and branch selector to graph view
* fix: Add timeout and error handling for worktree HEAD ref resolution

* fix: use absolute icon path and place icon outside asar on Linux

  The hicolor icon theme index only lists sizes up to 512x512, so an icon
  installed only at 1024x1024 is invisible to GNOME/KDE's theme resolver,
  causing both the app launcher and taskbar to show a generic icon.
  Additionally, BrowserWindow.icon cannot be read by the window manager
  when the file is inside app.asar.
  - extraResources: copy logo_larger.png to resources/ (outside asar) so it
    lands at /opt/Automaker/resources/logo_larger.png on install
  - linux.desktop.Icon: set to the absolute resources path, bypassing the
    hicolor theme lookup and its size constraints entirely
  - icon-manager.ts: in Linux production builds, use process.resourcesPath
    so BrowserWindow receives a real filesystem path the WM can read directly

* fix: use linux.desktop.entry for custom desktop Icon field

  electron-builder v26 rejects arbitrary keys in linux.desktop; the correct
  schema wraps custom .desktop overrides inside desktop.entry.

* fix: set desktop name on Linux so taskbar uses the correct app icon

  Without app.setDesktopName(), the window manager cannot associate the
  running Electron process with automaker.desktop; GNOME/KDE fall back to
  _NET_WM_ICON, which defaults to Electron's own bundled icon. Calling
  app.setDesktopName('automaker.desktop') before any window is created sets
  the _GTK_APPLICATION_ID hint and XDG app_id so the WM picks up the
  desktop entry's Icon for the taskbar.

* Fix: memory and context views mobile friendly (#818)
* Changes from fix/memory-and-context-mobile-friendly
* fix: Improve file extension detection and add path traversal protection

* refactor: Extract file extension utilities and add path traversal guards

  Code review improvements:
  - Extract isMarkdownFilename and isImageFilename to shared image-utils.ts
  - Remove duplicated code from context-view.tsx and memory-view.tsx
  - Add a path traversal guard for context fixture utilities (matching memory)
  - Add 7 new tests for context fixture path traversal protection
  - All 61 tests pass
  Addresses code review feedback from PR #813

* test: Add e2e tests for profiles CRUD and board background persistence
* Update apps/ui/playwright.config.ts
* fix: Add robust test navigation handling and file filtering
* fix: Format NODE_OPTIONS configuration on single line
* test: Update profiles and board background persistence tests
* test: Replace iPhone 13 Pro with Pixel 5 for mobile test consistency
* Update apps/ui/src/components/views/context-view.tsx
* chore: Remove test project directory
* feat: Filter context files by type and improve mobile menu visibility
* fix: Improve test reliability and localhost handling
* chore: Use explicit TEST_USE_EXTERNAL_BACKEND env var for server cleanup
* feat: Add E2E/CI mock mode for provider factory and auth verification
* feat: Add remoteBranch parameter to pull and rebase operations

* chore: Enhance E2E testing setup with worker isolation and auth state management
  - Updated .gitignore to include worker-specific test fixtures.
  - Modified e2e-tests.yml to implement test sharding for improved CI performance.
  - Refactored global setup to authenticate once and save session state for reuse across tests.
  - Introduced worker-isolated fixture paths to prevent conflicts during parallel test execution.
  - Improved test navigation and loading handling for better reliability.
  - Updated various test files to use the new auth state management and fixture paths.

* fix: Update Playwright configuration and improve test reliability
  - Increased the number of workers in the Playwright configuration for better parallelism in CI.
  - Made the board background persistence test wait for the dropdown list to populate before interaction, improving reliability.

* chore: Simplify E2E test configuration and enhance mock implementations
  - Updated e2e-tests.yml to run tests in a single shard for streamlined CI execution.
  - Added a mock for execGitCommand in the worktree list unit tests, improving reliability and coverage.
  - Refactored setup functions to better manage command mocks for git operations in tests.
  - Improved error handling in mkdirSafe to account for undefined stats in certain environments.

* refactor: Improve test configurations and enhance error handling
  - Updated the Playwright configuration to clear VITE_SERVER_URL, ensuring the frontend uses the Vite proxy and preventing cookie domain mismatches.
  - Normalized selectedBranch in MergeRebaseDialog to handle various ref formats.
  - Made the global-setup backend health check more robust, throwing an error if the backend is not healthy after retries.
  - Refactored project creation tests to handle file existence checks more reliably.
  - Added error handling for missing E2E source fixtures to guide the setup process.
  - Made memory navigation handle sandbox dialog visibility more reliably.

* refactor: Enhance Git command execution and improve test configurations
  - Merged environment paths correctly in Git command execution, ensuring the proper execution context.
  - Refactored Git initialization to handle errors more gracefully and set user configuration before creating the initial commit.
  - Updated Playwright test identifiers for clarity and consistency across project states.
  - Made cleanup functions in tests handle directory removal more robustly.

* fix: Resolve React hooks errors from duplicate instances in the dependency tree
* style: Format alias configuration for improved readability

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Co-authored-by: DhanushSantosh <dhanushsantoshs05@gmail.com>
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
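The copilot error-formatting change described above (an error prefix constant plus an optional error code suffix, with a nested result.content field) can be sketched as a standalone function. The names below mirror the commit's description but are re-declared here for illustration; they are not imports from the repository.

```typescript
// Sketch of the tool.execution_complete content formatting, per the
// commit message above: errors get a prefix and an optional "(code)"
// suffix, successes read the nested result.content, missing result -> ''.
const TOOL_ERROR_PREFIX = '[ERROR]' as const;

interface ToolError {
  message: string;
  code?: string;
}

function formatToolContent(result?: { content: string }, error?: ToolError): string {
  if (error) {
    // e.g. "[ERROR] permission denied (EACCES)"
    return `${TOOL_ERROR_PREFIX} ${error.message}${error.code ? ` (${error.code})` : ''}`;
  }
  // Nested result.content replaces the old flat result string field.
  return result?.content ?? '';
}
```

This keeps the error-code formatting in one place, matching the commit's goal of removing the '[ERROR]' magic string from the handler body.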
.github/workflows/e2e-tests.yml (vendored, 38 changes)
@@ -13,6 +13,13 @@ jobs:
   e2e:
     runs-on: ubuntu-latest
     timeout-minutes: 15
+    strategy:
+      fail-fast: false
+      matrix:
+        # shardIndex: [1, 2, 3]
+        # shardTotal: [3]
+        shardIndex: [1]
+        shardTotal: [1]
 
     steps:
       - name: Checkout code
@@ -91,7 +98,7 @@ jobs:
           curl -s http://localhost:3108/api/health | jq . 2>/dev/null || echo "Health check: $(curl -s http://localhost:3108/api/health 2>/dev/null || echo 'No response')"
           exit 0
         fi
 
         # Check if server process is still running
         if ! kill -0 $SERVER_PID 2>/dev/null; then
           echo "ERROR: Server process died during wait!"
@@ -99,7 +106,7 @@ jobs:
           cat backend.log
           exit 1
         fi
 
         echo "Waiting... ($i/60)"
         sleep 1
       done
@@ -127,17 +134,23 @@ jobs:
 
           exit 1
 
-      - name: Run E2E tests
+      - name: Run E2E tests (shard ${{ matrix.shardIndex }}/${{ matrix.shardTotal }})
         # Playwright automatically starts the Vite frontend via webServer config
         # (see apps/ui/playwright.config.ts) - no need to start it manually
-        run: npm run test --workspace=apps/ui
+        run: npx playwright test --shard=${{ matrix.shardIndex }}/${{ matrix.shardTotal }}
+        working-directory: apps/ui
         env:
           CI: true
-          VITE_SERVER_URL: http://localhost:3108
-          SERVER_URL: http://localhost:3108
           VITE_SKIP_SETUP: 'true'
           # Keep UI-side login/defaults consistent
           AUTOMAKER_API_KEY: test-api-key-for-e2e-tests
+          # Backend is already started above - Playwright config sets
+          # AUTOMAKER_SERVER_PORT so the Vite proxy forwards /api/* to the backend.
+          # Do NOT set VITE_SERVER_URL here: it bypasses the Vite proxy and causes
+          # a cookie domain mismatch (cookies are bound to 127.0.0.1, but
+          # VITE_SERVER_URL=http://localhost:3108 makes the frontend call localhost).
+          TEST_USE_EXTERNAL_BACKEND: 'true'
+          TEST_SERVER_PORT: 3108
 
       - name: Print backend logs on failure
         if: failure()
@@ -155,7 +168,7 @@ jobs:
         uses: actions/upload-artifact@v4
         if: always()
         with:
-          name: playwright-report
+          name: playwright-report-shard-${{ matrix.shardIndex }}-of-${{ matrix.shardTotal }}
           path: apps/ui/playwright-report/
           retention-days: 7
 
@@ -163,12 +176,21 @@ jobs:
         uses: actions/upload-artifact@v4
         if: always()
         with:
-          name: test-results
+          name: test-results-shard-${{ matrix.shardIndex }}-of-${{ matrix.shardTotal }}
          path: |
            apps/ui/test-results/
          retention-days: 7
          if-no-files-found: ignore
 
+      - name: Upload blob report for merging
+        uses: actions/upload-artifact@v4
+        if: always()
+        with:
+          name: blob-report-shard-${{ matrix.shardIndex }}-of-${{ matrix.shardTotal }}
+          path: apps/ui/blob-report/
+          retention-days: 1
+          if-no-files-found: ignore
 
       - name: Cleanup - Kill backend server
         if: always()
         run: |
.gitignore (vendored, 5 changes)
@@ -70,6 +70,11 @@ test/opus-thinking-*/
 test/agent-session-test-*/
 test/feature-backlog-test-*/
 test/running-task-display-test-*/
+test/agent-output-modal-responsive-*/
+test/fixtures/.worker-*/
+test/board-bg-test-*/
+test/edit-feature-test-*/
+test/open-project-test-*/
 
 # Environment files (keep .example)
 .env
@@ -349,7 +349,9 @@ const ideationService = new IdeationService(events, settingsService, featureLoad
 
 // Initialize DevServerService with event emitter for real-time log streaming
 const devServerService = getDevServerService();
-devServerService.setEventEmitter(events);
+devServerService.initialize(DATA_DIR, events).catch((err) => {
+  logger.error('Failed to initialize DevServerService:', err);
+});
 
 // Initialize Notification Service with event emitter for real-time updates
 const notificationService = getNotificationService();
@@ -13,6 +13,27 @@ import { createLogger } from '@automaker/utils';
 
 const logger = createLogger('GitLib');
 
+// Extended PATH so git is found when the process does not inherit a full shell PATH
+// (e.g. Electron, some CI, or IDE-launched processes).
+const pathSeparator = process.platform === 'win32' ? ';' : ':';
+const extraPaths: string[] =
+  process.platform === 'win32'
+    ? ([
+        process.env.LOCALAPPDATA && `${process.env.LOCALAPPDATA}\\Programs\\Git\\cmd`,
+        process.env.PROGRAMFILES && `${process.env.PROGRAMFILES}\\Git\\cmd`,
+        process.env['ProgramFiles(x86)'] && `${process.env['ProgramFiles(x86)']}\\Git\\cmd`,
+      ].filter(Boolean) as string[])
+    : [
+        '/opt/homebrew/bin',
+        '/usr/local/bin',
+        '/usr/bin',
+        '/home/linuxbrew/.linuxbrew/bin',
+        process.env.HOME ? `${process.env.HOME}/.local/bin` : '',
+      ].filter(Boolean);
+
+const extendedPath = [process.env.PATH, ...extraPaths].filter(Boolean).join(pathSeparator);
+const gitEnv = { ...process.env, PATH: extendedPath };
+
 // ============================================================================
 // Secure Command Execution
 // ============================================================================
@@ -65,7 +86,14 @@ export async function execGitCommand(
     command: 'git',
     args,
     cwd,
-    ...(env !== undefined ? { env } : {}),
+    env:
+      env !== undefined
+        ? {
+            ...gitEnv,
+            ...env,
+            PATH: [gitEnv.PATH, env.PATH].filter(Boolean).join(pathSeparator),
+          }
+        : gitEnv,
     ...(abortController !== undefined ? { abortController } : {}),
   });
 
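The env-merging rule in the execGitCommand change above (a caller-supplied env overrides the base git env, but the PATH entries of both are joined so git stays discoverable) can be sketched as a pure function. The function name and the parameterized separator are illustrative; the real code derives the separator from process.platform.

```typescript
// Sketch of the PATH-merging behavior from the diff above: overrides win
// per-key, except PATH, which is the concatenation of both values.
type Env = Record<string, string | undefined>;

function mergeGitEnv(gitEnv: Env, override: Env | undefined, sep = ':'): Env {
  if (override === undefined) return gitEnv; // no caller env: use base as-is
  return {
    ...gitEnv,
    ...override,
    // Keep both PATHs so git remains findable even when the caller passes
    // a minimal environment (e.g. from a test harness).
    PATH: [gitEnv.PATH, override.PATH].filter(Boolean).join(sep),
  };
}
```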
@@ -689,6 +689,145 @@ export interface ProviderByModelIdResult {
   resolvedModel: string | undefined;
 }
 
+/** Result from resolveProviderContext */
+export interface ProviderContextResult {
+  /** The provider configuration */
+  provider: ClaudeCompatibleProvider | undefined;
+  /** Credentials for API key resolution */
+  credentials: Credentials | undefined;
+  /** The resolved Claude model ID for SDK configuration */
+  resolvedModel: string | undefined;
+  /** The original model config from the provider if found */
+  modelConfig: import('@automaker/types').ProviderModel | undefined;
+}
+
+/**
+ * Checks if a provider is enabled.
+ * Providers with enabled: undefined are treated as enabled (default state).
+ * Only explicitly set enabled: false means the provider is disabled.
+ */
+function isProviderEnabled(provider: ClaudeCompatibleProvider): boolean {
+  return provider.enabled !== false;
+}
+
+/**
+ * Finds a model config in a provider's models array by ID (case-insensitive).
+ */
+function findModelInProvider(
+  provider: ClaudeCompatibleProvider,
+  modelId: string
+): import('@automaker/types').ProviderModel | undefined {
+  return provider.models?.find(
+    (m) => m.id === modelId || m.id.toLowerCase() === modelId.toLowerCase()
+  );
+}
+
+/**
+ * Resolves the provider and Claude-compatible model configuration.
+ *
+ * This is the central logic for resolving provider context, supporting:
+ * 1. Explicit lookup by providerId (most reliable for persistence)
+ * 2. Fallback lookup by modelId across all enabled providers
+ * 3. Resolution of mapsToClaudeModel for SDK configuration
+ *
+ * @param settingsService - Settings service instance
+ * @param modelId - The model ID to resolve
+ * @param providerId - Optional explicit provider ID
+ * @param logPrefix - Prefix for log messages
+ * @returns Promise resolving to the provider context
+ */
+export async function resolveProviderContext(
+  settingsService: SettingsService,
+  modelId: string,
+  providerId?: string,
+  logPrefix = '[SettingsHelper]'
+): Promise<ProviderContextResult> {
+  try {
+    const globalSettings = await settingsService.getGlobalSettings();
+    const credentials = await settingsService.getCredentials();
+    const providers = globalSettings.claudeCompatibleProviders || [];
+
+    logger.debug(
+      `${logPrefix} Resolving provider context: modelId="${modelId}", providerId="${providerId ?? 'none'}", providers count=${providers.length}`
+    );
+
+    let provider: ClaudeCompatibleProvider | undefined;
+    let modelConfig: import('@automaker/types').ProviderModel | undefined;
+
+    // 1. Try resolving by explicit providerId first (most reliable)
+    if (providerId) {
+      provider = providers.find((p) => p.id === providerId);
+      if (provider) {
+        if (!isProviderEnabled(provider)) {
+          logger.warn(
+            `${logPrefix} Explicitly requested provider "${provider.name}" (${providerId}) is disabled (enabled=${provider.enabled})`
+          );
+        } else {
+          logger.debug(
+            `${logPrefix} Found provider "${provider.name}" (${providerId}), enabled=${provider.enabled ?? 'undefined (treated as enabled)'}`
+          );
+          // Find the model config within this provider to check for mappings
+          modelConfig = findModelInProvider(provider, modelId);
+          if (!modelConfig && provider.models && provider.models.length > 0) {
+            logger.debug(
+              `${logPrefix} Model "${modelId}" not found in provider "${provider.name}". Available models: ${provider.models.map((m) => m.id).join(', ')}`
+            );
+          }
+        }
+      } else {
+        logger.warn(
+          `${logPrefix} Explicitly requested provider "${providerId}" not found. Available providers: ${providers.map((p) => p.id).join(', ')}`
+        );
+      }
+    }
+
+    // 2. Fallback to model-based lookup across all providers if modelConfig not found
+    // Note: We still search even if provider was found, to get the modelConfig for mapping
+    if (!modelConfig) {
+      for (const p of providers) {
+        if (!isProviderEnabled(p) || p.id === providerId) continue; // Skip disabled or already checked
+
+        const config = findModelInProvider(p, modelId);
+
+        if (config) {
+          // Only override provider if we didn't find one by explicit ID
+          if (!provider) {
+            provider = p;
+          }
+          modelConfig = config;
+          logger.debug(`${logPrefix} Found model "${modelId}" in provider "${p.name}" (fallback)`);
+          break;
+        }
+      }
+    }
+
+    // 3. Resolve the mapped Claude model if specified
+    let resolvedModel: string | undefined;
+    if (modelConfig?.mapsToClaudeModel) {
+      const { resolveModelString } = await import('@automaker/model-resolver');
+      resolvedModel = resolveModelString(modelConfig.mapsToClaudeModel);
+      logger.debug(
+        `${logPrefix} Model "${modelId}" maps to Claude model "${modelConfig.mapsToClaudeModel}" -> "${resolvedModel}"`
+      );
+    }
+
+    // Log final result for debugging
+    logger.debug(
+      `${logPrefix} Provider context resolved: provider=${provider?.name ?? 'none'}, modelConfig=${modelConfig ? 'found' : 'not found'}, resolvedModel=${resolvedModel ?? modelId}`
+    );
+
+    return { provider, credentials, resolvedModel, modelConfig };
+  } catch (error) {
+    logger.error(`${logPrefix} Failed to resolve provider context:`, error);
+    return {
+      provider: undefined,
+      credentials: undefined,
+      resolvedModel: undefined,
+      modelConfig: undefined,
+    };
+  }
+}
+
 /**
  * Find a ClaudeCompatibleProvider by one of its model IDs.
  * Searches through all enabled providers to find one that contains the specified model.
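Two small rules in the provider-resolution diff above are worth isolating: a provider with enabled: undefined counts as enabled (only an explicit enabled: false disables it), and model lookup is case-insensitive. A minimal sketch, with the types trimmed to only what the sketch needs (the real types live in @automaker/types):

```typescript
// Trimmed stand-ins for the repo's provider/model types.
interface ProviderModel { id: string; }
interface Provider { id: string; enabled?: boolean; models?: ProviderModel[]; }

// Only an explicit enabled: false disables; undefined is treated as enabled.
function isProviderEnabled(provider: Provider): boolean {
  return provider.enabled !== false;
}

// Case-insensitive model lookup, matching the diff's findModelInProvider.
function findModelInProvider(provider: Provider, modelId: string): ProviderModel | undefined {
  return provider.models?.find(
    (m) => m.id === modelId || m.id.toLowerCase() === modelId.toLowerCase()
  );
}
```

The enabled-by-default choice means existing persisted settings that predate the enabled flag keep working without a migration.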
@@ -188,6 +188,7 @@ export class ClaudeProvider extends BaseProvider {
   async *executeQuery(options: ExecuteOptions): AsyncGenerator<ProviderMessage> {
     // Validate that model doesn't have a provider prefix
     // AgentService should strip prefixes before passing to providers
+    // Claude doesn't use a provider prefix, so we don't need to specify an expected provider
     validateBareModelId(options.model, 'ClaudeProvider');
 
     const {
@@ -739,9 +739,9 @@ export class CodexProvider extends BaseProvider {
   }
 
   async *executeQuery(options: ExecuteOptions): AsyncGenerator<ProviderMessage> {
-    // Validate that model doesn't have a provider prefix
+    // Validate that model doesn't have a provider prefix (except codex- which should already be stripped)
     // AgentService should strip prefixes before passing to providers
-    validateBareModelId(options.model, 'CodexProvider');
+    validateBareModelId(options.model, 'CodexProvider', 'codex');
 
     try {
       const mcpServers = options.mcpServers ?? {};
@@ -76,13 +76,18 @@ interface SdkToolExecutionStartEvent extends SdkEvent {
   };
 }
 
-interface SdkToolExecutionEndEvent extends SdkEvent {
-  type: 'tool.execution_end';
+interface SdkToolExecutionCompleteEvent extends SdkEvent {
+  type: 'tool.execution_complete';
   data: {
-    toolName: string;
     toolCallId: string;
-    result?: string;
-    error?: string;
+    success: boolean;
+    result?: {
+      content: string;
+    };
+    error?: {
+      message: string;
+      code?: string;
+    };
   };
 }
 
@@ -94,6 +99,16 @@ interface SdkSessionErrorEvent extends SdkEvent {
   };
 }
 
+// =============================================================================
+// Constants
+// =============================================================================
+
+/**
+ * Prefix for error messages in tool results
+ * Consistent with GeminiProvider's error formatting
+ */
+const TOOL_ERROR_PREFIX = '[ERROR]' as const;
+
 // =============================================================================
 // Error Codes
 // =============================================================================
@@ -357,12 +372,19 @@ export class CopilotProvider extends CliProvider {
         };
       }
 
-      case 'tool.execution_end': {
-        const toolResultEvent = sdkEvent as SdkToolExecutionEndEvent;
-        const isError = !!toolResultEvent.data.error;
-        const content = isError
-          ? `[ERROR] ${toolResultEvent.data.error}`
-          : toolResultEvent.data.result || '';
+      /**
+       * Tool execution completed event
+       * Handles both successful results and errors from tool executions
+       * Error messages optionally include error codes for better debugging
+       */
+      case 'tool.execution_complete': {
+        const toolResultEvent = sdkEvent as SdkToolExecutionCompleteEvent;
+        const error = toolResultEvent.data.error;
+
+        // Format error message with optional code for better debugging
+        const content = error
+          ? `${TOOL_ERROR_PREFIX} ${error.message}${error.code ? ` (${error.code})` : ''}`
+          : toolResultEvent.data.result?.content || '';
 
         return {
           type: 'assistant',
@@ -628,7 +650,7 @@ export class CopilotProvider extends CliProvider {
         sessionComplete = true;
         pushEvent(event);
       } else {
-        // Push all other events (tool.execution_start, tool.execution_end, assistant.message, etc.)
+        // Push all other events (tool.execution_start, tool.execution_complete, assistant.message, etc.)
        pushEvent(event);
      }
    });
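The handler above casts the SDK event with `as` after switching on its type string. The same narrowing can be expressed with a type guard, which keeps the result/error access type-checked. The event shape below mirrors the interface in the diff but is re-declared locally for illustration; it is not an import from @github/copilot-sdk.

```typescript
// Local re-declaration of the tool.execution_complete event shape for
// illustration (the real interface is defined in the provider file above).
interface SdkToolExecutionCompleteEvent {
  type: 'tool.execution_complete';
  data: {
    toolCallId: string;
    success: boolean;
    result?: { content: string };
    error?: { message: string; code?: string };
  };
}

// Type guard: narrows an unknown SDK event to the completion shape.
function isToolExecutionComplete(ev: { type: string }): ev is SdkToolExecutionCompleteEvent {
  return ev.type === 'tool.execution_complete';
}

// Returns the tool output for completion events, undefined otherwise.
function extractContent(ev: { type: string }): string | undefined {
  if (!isToolExecutionComplete(ev)) return undefined;
  return ev.data.error ? undefined : ev.data.result?.content ?? '';
}
```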
@@ -843,9 +843,10 @@ export class CursorProvider extends CliProvider {
   async *executeQuery(options: ExecuteOptions): AsyncGenerator<ProviderMessage> {
     this.ensureCliDetected();
 
-    // Validate that model doesn't have a provider prefix
+    // Validate that model doesn't have a provider prefix (except cursor- which should already be stripped)
     // AgentService should strip prefixes before passing to providers
-    validateBareModelId(options.model, 'CursorProvider');
+    // Note: Cursor's Gemini models (e.g., "gemini-3-pro") legitimately start with "gemini-"
+    validateBareModelId(options.model, 'CursorProvider', 'cursor');
 
     if (!this.cliPath) {
       throw this.createError(
@@ -546,8 +546,8 @@ export class GeminiProvider extends CliProvider {
   async *executeQuery(options: ExecuteOptions): AsyncGenerator<ProviderMessage> {
     this.ensureCliDetected();
 
-    // Validate that model doesn't have a provider prefix
+    // Validate that model doesn't have a provider prefix (except gemini- which should already be stripped)
-    validateBareModelId(options.model, 'GeminiProvider');
+    validateBareModelId(options.model, 'GeminiProvider', 'gemini');
 
     if (!this.cliPath) {
       throw this.createError(
apps/server/src/providers/mock-provider.ts (new file, 53 lines)
@@ -0,0 +1,53 @@
|
|||||||
|
/**
|
||||||
|
* Mock Provider - No-op AI provider for E2E and CI testing
|
||||||
|
*
|
||||||
|
* When AUTOMAKER_MOCK_AGENT=true, the server uses this provider instead of
|
||||||
|
* real backends (Claude, Codex, etc.) so tests never call external APIs.
|
||||||
|
*/
|
||||||
|
|
||||||
|
import type { ExecuteOptions } from '@automaker/types';
|
||||||
|
import { BaseProvider } from './base-provider.js';
|
||||||
|
import type { ProviderMessage, InstallationStatus, ModelDefinition } from './types.js';
|
||||||
|
|
||||||
|
const MOCK_TEXT = 'Mock agent output for testing.';
|
||||||
|
|
||||||
|
export class MockProvider extends BaseProvider {
|
||||||
|
getName(): string {
|
||||||
|
return 'mock';
|
||||||
|
}
|
||||||
|
|
||||||
|
async *executeQuery(_options: ExecuteOptions): AsyncGenerator<ProviderMessage> {
|
||||||
|
yield {
|
||||||
|
type: 'assistant',
|
||||||
|
message: {
|
||||||
|
role: 'assistant',
|
||||||
|
content: [{ type: 'text', text: MOCK_TEXT }],
|
||||||
|
},
|
||||||
|
};
|
||||||
|
yield {
|
||||||
|
type: 'result',
|
||||||
|
subtype: 'success',
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
async detectInstallation(): Promise<InstallationStatus> {
|
||||||
|
return {
|
||||||
|
installed: true,
|
||||||
|
method: 'sdk',
|
||||||
|
hasApiKey: true,
|
||||||
|
authenticated: true,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
getAvailableModels(): ModelDefinition[] {
|
||||||
|
return [
|
||||||
|
{
|
||||||
|
id: 'mock-model',
|
||||||
|
name: 'Mock Model',
|
||||||
|
modelString: 'mock-model',
|
||||||
|
provider: 'mock',
|
||||||
|
description: 'Mock model for testing',
|
||||||
|
},
|
||||||
|
];
|
||||||
|
}
|
||||||
|
}
|
||||||
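The mock provider streams its output through an async generator, the same protocol the real providers use. A minimal, self-contained sketch of how a consumer drains that stream (the `Msg` type here is a simplified stand-in for the real `ProviderMessage`, not the actual SDK type):

```typescript
// Simplified stand-in for ProviderMessage (illustrative only).
type Msg =
  | { type: 'assistant'; text: string }
  | { type: 'result'; subtype: 'success' | 'error' };

// Mirrors MockProvider.executeQuery: one assistant chunk, then a terminal result.
async function* mockQuery(): AsyncGenerator<Msg> {
  yield { type: 'assistant', text: 'Mock agent output for testing.' };
  yield { type: 'result', subtype: 'success' };
}

// Consumers iterate with for-await and collect the assistant text chunks.
async function collectText(gen: AsyncGenerator<Msg>): Promise<string[]> {
  const chunks: string[] = [];
  for await (const msg of gen) {
    if (msg.type === 'assistant') chunks.push(msg.text);
  }
  return chunks;
}
```

Because the generator never touches the network, a test that drains it is deterministic and fast, which is the whole point of `AUTOMAKER_MOCK_AGENT`.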
@@ -67,6 +67,16 @@ export function registerProvider(name: string, registration: ProviderRegistratio
   providerRegistry.set(name.toLowerCase(), registration);
 }
 
+/** Cached mock provider instance when AUTOMAKER_MOCK_AGENT is set (E2E/CI). */
+let mockProviderInstance: BaseProvider | null = null;
+
+function getMockProvider(): BaseProvider {
+  if (!mockProviderInstance) {
+    mockProviderInstance = new MockProvider();
+  }
+  return mockProviderInstance;
+}
+
 export class ProviderFactory {
   /**
    * Determine which provider to use for a given model
@@ -75,6 +85,9 @@ export class ProviderFactory {
    * @returns Provider name (ModelProvider type)
    */
   static getProviderNameForModel(model: string): ModelProvider {
+    if (process.env.AUTOMAKER_MOCK_AGENT === 'true') {
+      return 'claude' as ModelProvider; // Name only; getProviderForModel returns MockProvider
+    }
     const lowerModel = model.toLowerCase();
 
     // Get all registered providers sorted by priority (descending)
@@ -113,6 +126,9 @@ export class ProviderFactory {
     modelId: string,
     options: { throwOnDisconnected?: boolean } = {}
   ): BaseProvider {
+    if (process.env.AUTOMAKER_MOCK_AGENT === 'true') {
+      return getMockProvider();
+    }
     const { throwOnDisconnected = true } = options;
     const providerName = this.getProviderForModelName(modelId);
 
@@ -142,6 +158,9 @@ export class ProviderFactory {
    * Get the provider name for a given model ID (without creating provider instance)
    */
   static getProviderForModelName(modelId: string): string {
+    if (process.env.AUTOMAKER_MOCK_AGENT === 'true') {
+      return 'claude';
+    }
     const lowerModel = modelId.toLowerCase();
 
     // Get all registered providers sorted by priority (descending)
@@ -272,6 +291,7 @@ export class ProviderFactory {
 // =============================================================================
 
 // Import providers for registration side-effects
+import { MockProvider } from './mock-provider.js';
 import { ClaudeProvider } from './claude-provider.js';
 import { CursorProvider } from './cursor-provider.js';
 import { CodexProvider } from './codex-provider.js';
@@ -70,6 +70,8 @@ export async function parseAndCreateFeatures(
       priority: feature.priority || 2,
       complexity: feature.complexity || 'moderate',
       dependencies: feature.dependencies || [],
+      planningMode: 'skip',
+      requirePlanApproval: false,
       createdAt: new Date().toISOString(),
       updatedAt: new Date().toISOString(),
     };
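The defaults above mix two operators: `feature.priority || 2` falls back on any falsy value, while the `??` used elsewhere in this PR only falls back on `null`/`undefined`. A small sketch of the difference (hypothetical helper names, not from the codebase), relevant because a legitimate priority of `0` would be overwritten by `||` but preserved by `??`:

```typescript
// `||` replaces every falsy value (0, '', false) with the default.
function priorityWithOr(p?: number | null): number {
  return p || 2;
}

// `??` replaces only null/undefined, keeping falsy-but-valid values like 0.
function priorityWithNullish(p?: number | null): number {
  return p ?? 2;
}
```

If priority `0` is never a valid value here, `||` is fine; otherwise `??` is the safer default operator.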
@@ -25,7 +25,7 @@ export function createBacklogPlanRoutes(
   );
   router.post('/stop', createStopHandler());
   router.get('/status', validatePathParams('projectPath'), createStatusHandler());
-  router.post('/apply', validatePathParams('projectPath'), createApplyHandler());
+  router.post('/apply', validatePathParams('projectPath'), createApplyHandler(settingsService));
   router.post('/clear', validatePathParams('projectPath'), createClearHandler());
 
   return router;
@@ -3,13 +3,23 @@
  */
 
 import type { Request, Response } from 'express';
-import type { BacklogPlanResult } from '@automaker/types';
+import { resolvePhaseModel } from '@automaker/model-resolver';
+import type { BacklogPlanResult, PhaseModelEntry, PlanningMode } from '@automaker/types';
 import { FeatureLoader } from '../../../services/feature-loader.js';
+import type { SettingsService } from '../../../services/settings-service.js';
 import { clearBacklogPlan, getErrorMessage, logError, logger } from '../common.js';
 
 const featureLoader = new FeatureLoader();
 
-export function createApplyHandler() {
+function normalizePhaseModelEntry(
+  entry: PhaseModelEntry | string | undefined | null
+): PhaseModelEntry | undefined {
+  if (!entry) return undefined;
+  if (typeof entry === 'string') return { model: entry };
+  return entry;
+}
+
+export function createApplyHandler(settingsService?: SettingsService) {
   return async (req: Request, res: Response): Promise<void> => {
     try {
       const {
@@ -38,6 +48,23 @@ export function createApplyHandler() {
         return;
       }
 
+      let defaultPlanningMode: PlanningMode = 'skip';
+      let defaultRequirePlanApproval = false;
+      let defaultModelEntry: PhaseModelEntry | undefined;
+
+      if (settingsService) {
+        const globalSettings = await settingsService.getGlobalSettings();
+        const projectSettings = await settingsService.getProjectSettings(projectPath);
+
+        defaultPlanningMode = globalSettings.defaultPlanningMode ?? 'skip';
+        defaultRequirePlanApproval = globalSettings.defaultRequirePlanApproval ?? false;
+        defaultModelEntry = normalizePhaseModelEntry(
+          projectSettings.defaultFeatureModel ?? globalSettings.defaultFeatureModel
+        );
+      }
+
+      const resolvedDefaultModel = resolvePhaseModel(defaultModelEntry);
+
       const appliedChanges: string[] = [];
 
       // Load current features for dependency validation
@@ -88,6 +115,12 @@ export function createApplyHandler() {
         if (!change.feature) continue;
 
         try {
+          const effectivePlanningMode = change.feature.planningMode ?? defaultPlanningMode;
+          const effectiveRequirePlanApproval =
+            effectivePlanningMode === 'skip' || effectivePlanningMode === 'lite'
+              ? false
+              : (change.feature.requirePlanApproval ?? defaultRequirePlanApproval);
+
           // Create the new feature - use the AI-generated ID if provided
           const newFeature = await featureLoader.create(projectPath, {
             id: change.feature.id, // Use descriptive ID from AI if provided
@@ -97,6 +130,12 @@ export function createApplyHandler() {
             dependencies: change.feature.dependencies,
             priority: change.feature.priority,
             status: 'backlog',
+            model: change.feature.model ?? resolvedDefaultModel.model,
+            thinkingLevel: change.feature.thinkingLevel ?? resolvedDefaultModel.thinkingLevel,
+            reasoningEffort: change.feature.reasoningEffort ?? resolvedDefaultModel.reasoningEffort,
+            providerId: change.feature.providerId ?? resolvedDefaultModel.providerId,
+            planningMode: effectivePlanningMode,
+            requirePlanApproval: effectiveRequirePlanApproval,
             branchName,
           });
 
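`normalizePhaseModelEntry` above accepts either a bare model string or a structured entry, smoothing over settings persisted in the older string format. A self-contained sketch of that normalization (with `PhaseModelEntry` reduced to its `model` field for illustration; the real type also carries thinking level, reasoning effort, and provider ID):

```typescript
// Reduced stand-in for @automaker/types PhaseModelEntry (illustrative only).
interface PhaseModelEntry {
  model: string;
}

// Same shape-normalization as the handler: null/undefined pass through as
// undefined, bare strings are wrapped, structured entries are returned as-is.
function normalizePhaseModelEntry(
  entry: PhaseModelEntry | string | undefined | null
): PhaseModelEntry | undefined {
  if (!entry) return undefined;
  if (typeof entry === 'string') return { model: entry };
  return entry;
}
```

Downstream code can then rely on a single shape regardless of which settings format produced the value.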
@@ -43,7 +43,11 @@ export function createUpdateHandler(featureLoader: FeatureLoader, events?: Event
 
       // Get the current feature to detect status changes
       const currentFeature = await featureLoader.get(projectPath, featureId);
-      const previousStatus = currentFeature?.status as FeatureStatus | undefined;
+      if (!currentFeature) {
+        res.status(404).json({ success: false, error: `Feature ${featureId} not found` });
+        return;
+      }
+      const previousStatus = currentFeature.status as FeatureStatus;
       const newStatus = updates.status as FeatureStatus | undefined;
 
       const updated = await featureLoader.update(
@@ -3,16 +3,29 @@
  */
 
 import type { Request, Response } from 'express';
+import path from 'path';
 import * as secureFs from '../../../lib/secure-fs.js';
 import { PathNotAllowedError } from '@automaker/platform';
 import { getErrorMessage, logError } from '../common.js';
 
 // Optional files that are expected to not exist in new projects
 // Don't log ENOENT errors for these to reduce noise
-const OPTIONAL_FILES = ['categories.json', 'app_spec.txt'];
+const OPTIONAL_FILES = ['categories.json', 'app_spec.txt', 'context-metadata.json'];
 
 function isOptionalFile(filePath: string): boolean {
-  return OPTIONAL_FILES.some((optionalFile) => filePath.endsWith(optionalFile));
+  const basename = path.basename(filePath);
+  if (OPTIONAL_FILES.some((optionalFile) => basename === optionalFile)) {
+    return true;
+  }
+  // Context and memory files may not exist yet during create/delete or test races
+  if (filePath.includes('.automaker/context/') || filePath.includes('.automaker/memory/')) {
+    const name = path.basename(filePath);
+    const lower = name.toLowerCase();
+    if (lower.endsWith('.md') || lower.endsWith('.txt') || lower.endsWith('.markdown')) {
+      return true;
+    }
+  }
+  return false;
 }
 
 function isENOENT(error: unknown): boolean {
@@ -39,12 +52,14 @@ export function createReadHandler() {
         return;
       }
 
-      // Don't log ENOENT errors for optional files (expected to be missing in new projects)
-      const shouldLog = !(isENOENT(error) && isOptionalFile(req.body?.filePath || ''));
-      if (shouldLog) {
+      const filePath = req.body?.filePath || '';
+      const optionalMissing = isENOENT(error) && isOptionalFile(filePath);
+      if (!optionalMissing) {
         logError(error, 'Read file failed');
       }
-      res.status(500).json({ success: false, error: getErrorMessage(error) });
+      // Return 404 for missing optional files so clients can handle "not found"
+      const status = optionalMissing ? 404 : 500;
+      res.status(status).json({ success: false, error: getErrorMessage(error) });
     }
   };
 }
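The `isOptionalFile` change above swaps a suffix match for an exact basename comparison: `endsWith('categories.json')` would also match a file named `my-categories.json`, while comparing `path.basename(filePath)` for equality does not. A minimal sketch of just that tightened check:

```typescript
import path from 'node:path';

// Optional files matched by exact basename, mirroring the fixed check.
const OPTIONAL_FILES = ['categories.json', 'app_spec.txt', 'context-metadata.json'];

function isOptionalFile(filePath: string): boolean {
  // Exact basename equality: 'my-categories.json' no longer counts as optional.
  const basename = path.basename(filePath);
  return OPTIONAL_FILES.some((optionalFile) => basename === optionalFile);
}
```

The real handler additionally whitelists markdown/text files under `.automaker/context/` and `.automaker/memory/`, which is omitted here for brevity.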
@@ -35,6 +35,16 @@ export function createStatHandler() {
         return;
       }
 
+      // File or directory does not exist - return 404 so UI can handle missing paths
+      const code =
+        error && typeof error === 'object' && 'code' in error
+          ? (error as { code: string }).code
+          : '';
+      if (code === 'ENOENT') {
+        res.status(404).json({ success: false, error: 'File or directory not found' });
+        return;
+      }
+
       logError(error, 'Get file stats failed');
       res.status(500).json({ success: false, error: getErrorMessage(error) });
     }
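The stat handler reads the error code defensively because caught values are typed `unknown` and fs errors carry `code` as an untyped property. A self-contained sketch of that probe (hypothetical helper name, same logic as the hunk above):

```typescript
// Safely extract a Node-style error code from an unknown caught value.
// Returns '' when the value is not an object or has no `code` property.
function errorCode(error: unknown): string {
  return error && typeof error === 'object' && 'code' in error
    ? String((error as { code: unknown }).code)
    : '';
}
```

With this shape, `errorCode(err) === 'ENOENT'` cleanly distinguishes "path does not exist" (404) from genuine failures (500).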
@@ -38,7 +38,7 @@ import {
 import {
   getPromptCustomization,
   getAutoLoadClaudeMdSetting,
-  getProviderByModelId,
+  resolveProviderContext,
 } from '../../../lib/settings-helpers.js';
 import {
   trySetValidationRunning,
@@ -64,6 +64,8 @@ interface ValidateIssueRequestBody {
   thinkingLevel?: ThinkingLevel;
   /** Reasoning effort for Codex models (ignored for non-Codex models) */
   reasoningEffort?: ReasoningEffort;
+  /** Optional Claude-compatible provider ID for custom providers (e.g., GLM, MiniMax) */
+  providerId?: string;
   /** Comments to include in validation analysis */
   comments?: GitHubComment[];
   /** Linked pull requests for this issue */
@@ -87,6 +89,7 @@ async function runValidation(
   events: EventEmitter,
   abortController: AbortController,
   settingsService?: SettingsService,
+  providerId?: string,
   comments?: ValidationComment[],
   linkedPRs?: ValidationLinkedPR[],
   thinkingLevel?: ThinkingLevel,
@@ -176,7 +179,12 @@ ${basePrompt}`;
   let credentials = await settingsService?.getCredentials();
 
   if (settingsService) {
-    const providerResult = await getProviderByModelId(model, settingsService, '[ValidateIssue]');
+    const providerResult = await resolveProviderContext(
+      settingsService,
+      model,
+      providerId,
+      '[ValidateIssue]'
+    );
     if (providerResult.provider) {
       claudeCompatibleProvider = providerResult.provider;
       providerResolvedModel = providerResult.resolvedModel;
@@ -312,10 +320,16 @@ export function createValidateIssueHandler(
       model = 'opus',
       thinkingLevel,
       reasoningEffort,
+      providerId,
       comments: rawComments,
       linkedPRs: rawLinkedPRs,
     } = req.body as ValidateIssueRequestBody;
 
+    const normalizedProviderId =
+      typeof providerId === 'string' && providerId.trim().length > 0
+        ? providerId.trim()
+        : undefined;
+
     // Transform GitHubComment[] to ValidationComment[] if provided
     const validationComments: ValidationComment[] | undefined = rawComments?.map((c) => ({
       author: c.author?.login || 'ghost',
@@ -364,12 +378,14 @@ export function createValidateIssueHandler(
       isClaudeModel(model) ||
       isCursorModel(model) ||
       isCodexModel(model) ||
-      isOpencodeModel(model);
+      isOpencodeModel(model) ||
+      !!normalizedProviderId;
 
     if (!isValidModel) {
       res.status(400).json({
         success: false,
-        error: 'Invalid model. Must be a Claude, Cursor, Codex, or OpenCode model ID (or alias).',
+        error:
+          'Invalid model. Must be a Claude, Cursor, Codex, or OpenCode model ID (or alias), or provide a valid providerId for custom Claude-compatible models.',
       });
       return;
     }
@@ -398,6 +414,7 @@ export function createValidateIssueHandler(
       events,
       abortController,
       settingsService,
+      normalizedProviderId,
       validationComments,
       validationLinkedPRs,
       thinkingLevel,
@@ -80,6 +80,12 @@ function containsAuthError(text: string): boolean {
 export function createVerifyClaudeAuthHandler() {
   return async (req: Request, res: Response): Promise<void> => {
     try {
+      // In E2E/CI mock mode, skip real API calls
+      if (process.env.AUTOMAKER_MOCK_AGENT === 'true') {
+        res.json({ success: true, authenticated: true });
+        return;
+      }
+
       // Get the auth method and optional API key from the request body
       const { authMethod, apiKey } = req.body as {
         authMethod?: 'cli' | 'api_key';
@@ -82,6 +82,12 @@ function isRateLimitError(text: string): boolean {
 
 export function createVerifyCodexAuthHandler() {
   return async (req: Request, res: Response): Promise<void> => {
+    // In E2E/CI mock mode, skip real API calls
+    if (process.env.AUTOMAKER_MOCK_AGENT === 'true') {
+      res.json({ success: true, authenticated: true });
+      return;
+    }
+
     const { authMethod, apiKey } = req.body as {
       authMethod?: 'cli' | 'api_key';
       apiKey?: string;
@@ -44,13 +44,79 @@ export function createInitGitHandler() {
       }
 
       // Initialize git with 'main' as the default branch (matching GitHub's standard since 2020)
-      // and create an initial empty commit
-      await execAsync(
-        `git init --initial-branch=main && git commit --allow-empty -m "Initial commit"`,
-        {
-          cwd: projectPath,
+      // Run commands sequentially so failures can be handled and partial state cleaned up.
+      let gitDirCreated = false;
+      try {
+        // Step 1: initialize the repository
+        try {
+          await execAsync(`git init --initial-branch=main`, { cwd: projectPath });
+        } catch (initError: unknown) {
+          const stderr =
+            initError && typeof initError === 'object' && 'stderr' in initError
+              ? String((initError as { stderr?: string }).stderr)
+              : '';
+          // Idempotent: if .git was created by a concurrent request or a stale lock exists,
+          // treat as "repo already exists" instead of failing
+          if (
+            /could not lock config file.*File exists|fatal: could not set 'core\.repositoryformatversion'/.test(
+              stderr
+            )
+          ) {
+            try {
+              await secureFs.access(gitDirPath);
+              res.json({
+                success: true,
+                result: {
+                  initialized: false,
+                  message: 'Git repository already exists',
+                },
+              });
+              return;
+            } catch {
+              // .git still missing, rethrow original error
+            }
+          }
+          throw initError;
         }
-      );
+        gitDirCreated = true;
+
+        // Step 2: ensure user.name and user.email are set so the commit can succeed.
+        // Check the global/system config first; only set locally if missing.
+        let userName = '';
+        let userEmail = '';
+        try {
+          ({ stdout: userName } = await execAsync(`git config user.name`, { cwd: projectPath }));
+        } catch {
+          // not set globally – will configure locally below
+        }
+        try {
+          ({ stdout: userEmail } = await execAsync(`git config user.email`, {
+            cwd: projectPath,
+          }));
+        } catch {
+          // not set globally – will configure locally below
+        }
+
+        if (!userName.trim()) {
+          await execAsync(`git config user.name "Automaker"`, { cwd: projectPath });
+        }
+        if (!userEmail.trim()) {
+          await execAsync(`git config user.email "automaker@localhost"`, { cwd: projectPath });
+        }
+
+        // Step 3: create the initial empty commit
+        await execAsync(`git commit --allow-empty -m "Initial commit"`, { cwd: projectPath });
+      } catch (error: unknown) {
+        // Clean up the partial .git directory so subsequent runs behave deterministically
+        if (gitDirCreated) {
+          try {
+            await secureFs.rm(gitDirPath, { recursive: true, force: true });
+          } catch {
+            // best-effort cleanup; ignore errors
+          }
+        }
+        throw error;
+      }
 
       res.json({
         success: true,
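The idempotency branch above keys off two stderr shapes that `git init` produces when the repository (or its config lock) already exists. Isolating that classifier as a pure function makes it easy to test without spawning git; this sketch uses the same regex as the hunk, with a hypothetical helper name:

```typescript
// stderr patterns from `git init` that indicate a concurrent init or a stale
// config lock, i.e. the .git directory is (or is becoming) an existing repo.
const ALREADY_EXISTS_RE =
  /could not lock config file.*File exists|fatal: could not set 'core\.repositoryformatversion'/;

function looksLikeExistingRepo(stderr: string): boolean {
  return ALREADY_EXISTS_RE.test(stderr);
}
```

Any other stderr (e.g. permission errors) falls through to the rethrow-and-cleanup path.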
@@ -6,12 +6,11 @@
  */
 
 import type { Request, Response } from 'express';
-import { exec, execFile } from 'child_process';
+import { execFile } from 'child_process';
 import { promisify } from 'util';
-import { getErrorMessage, logWorktreeError } from '../common.js';
+import { getErrorMessage, logWorktreeError, execGitCommand } from '../common.js';
 import { getRemotesWithBranch } from '../../../services/worktree-service.js';
 
-const execAsync = promisify(exec);
 const execFileAsync = promisify(execFile);
 
 interface BranchInfo {
@@ -36,18 +35,18 @@ export function createListBranchesHandler() {
         return;
       }
 
-      // Get current branch
-      const { stdout: currentBranchOutput } = await execAsync('git rev-parse --abbrev-ref HEAD', {
-        cwd: worktreePath,
-      });
+      // Get current branch (execGitCommand avoids spawning /bin/sh; works in sandboxed CI)
+      const currentBranchOutput = await execGitCommand(
+        ['rev-parse', '--abbrev-ref', 'HEAD'],
+        worktreePath
+      );
       const currentBranch = currentBranchOutput.trim();
 
       // List all local branches
-      // Use double quotes around the format string for cross-platform compatibility
-      // Single quotes are preserved literally on Windows; double quotes work on both
-      const { stdout: branchesOutput } = await execAsync('git branch --format="%(refname:short)"', {
-        cwd: worktreePath,
-      });
+      const branchesOutput = await execGitCommand(
+        ['branch', '--format=%(refname:short)'],
+        worktreePath
+      );
 
       const branches: BranchInfo[] = branchesOutput
         .trim()
@@ -68,18 +67,15 @@ export function createListBranchesHandler() {
         try {
           // Fetch latest remote refs (silently, don't fail if offline)
           try {
-            await execAsync('git fetch --all --quiet', {
-              cwd: worktreePath,
-              timeout: 10000, // 10 second timeout
-            });
+            await execGitCommand(['fetch', '--all', '--quiet'], worktreePath);
           } catch {
             // Ignore fetch errors - we'll use cached remote refs
           }
 
           // List remote branches
-          const { stdout: remoteBranchesOutput } = await execAsync(
-            'git branch -r --format="%(refname:short)"',
-            { cwd: worktreePath }
+          const remoteBranchesOutput = await execGitCommand(
+            ['branch', '-r', '--format=%(refname:short)'],
+            worktreePath
           );
 
           const localBranchNames = new Set(branches.map((b) => b.name));
@@ -118,9 +114,7 @@ export function createListBranchesHandler() {
           // Check if any remotes are configured for this repository
           let hasAnyRemotes = false;
           try {
-            const { stdout: remotesOutput } = await execAsync('git remote', {
-              cwd: worktreePath,
-            });
+            const remotesOutput = await execGitCommand(['remote'], worktreePath);
             hasAnyRemotes = remotesOutput.trim().length > 0;
           } catch {
             // If git remote fails, assume no remotes
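After the migration to argument-array `execGitCommand`, the handler still splits `git branch --format=%(refname:short)` output into branch names itself. A self-contained sketch of that parsing step (hypothetical helper name; blank lines and stray whitespace are discarded):

```typescript
// Parse `git branch --format=%(refname:short)` output into branch names.
// Handles trailing newlines and blank lines in the command output.
function parseBranchNames(output: string): string[] {
  return output
    .trim()
    .split('\n')
    .map((line) => line.trim())
    .filter((line) => line.length > 0);
}
```

Because the format string is passed as a single argv element, no shell quoting is involved, which is exactly why the old double-vs-single-quote comment could be deleted.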
@@ -13,7 +13,14 @@ import { promisify } from 'util';
|
|||||||
import path from 'path';
|
import path from 'path';
|
||||||
import * as secureFs from '../../../lib/secure-fs.js';
|
import * as secureFs from '../../../lib/secure-fs.js';
|
||||||
import { isGitRepo } from '@automaker/git-utils';
|
import { isGitRepo } from '@automaker/git-utils';
|
||||||
import { getErrorMessage, logError, normalizePath, execEnv, isGhCliAvailable } from '../common.js';
|
import {
|
||||||
|
getErrorMessage,
|
||||||
|
logError,
|
||||||
|
normalizePath,
|
||||||
|
execEnv,
|
||||||
|
isGhCliAvailable,
|
||||||
|
execGitCommand,
|
||||||
|
} from '../common.js';
|
||||||
import {
|
import {
|
||||||
readAllWorktreeMetadata,
|
readAllWorktreeMetadata,
|
||||||
updateWorktreePRInfo,
|
updateWorktreePRInfo,
|
||||||
@@ -29,6 +36,22 @@ import {
|
|||||||
const execAsync = promisify(exec);
|
const execAsync = promisify(exec);
|
||||||
const logger = createLogger('Worktree');
|
const logger = createLogger('Worktree');
|
||||||
|
|
||||||
|
/** True when git (or shell) could not be spawned (e.g. ENOENT in sandboxed CI). */
|
||||||
|
function isSpawnENOENT(error: unknown): boolean {
|
||||||
|
if (!error || typeof error !== 'object') return false;
|
||||||
|
const e = error as { code?: string; errno?: number; syscall?: string };
|
||||||
|
// Accept ENOENT with or without syscall so wrapped/reexported errors are handled.
|
||||||
|
// Node may set syscall to 'spawn' or 'spawn git' (or other command name).
|
||||||
|
if (e.code === 'ENOENT' || e.errno === -2) {
|
||||||
|
return (
|
||||||
|
e.syscall === 'spawn' ||
|
||||||
|
(typeof e.syscall === 'string' && e.syscall.startsWith('spawn')) ||
|
||||||
|
e.syscall === undefined
|
||||||
|
);
|
||||||
|
}
|
||||||
|
return false;
|
||||||
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Cache for GitHub remote status per project path.
|
* Cache for GitHub remote status per project path.
|
||||||
* This prevents repeated "no git remotes found" warnings when polling
|
* This prevents repeated "no git remotes found" warnings when polling
|
||||||
@@ -77,11 +100,8 @@ async function detectConflictState(worktreePath: string): Promise<{
   conflictFiles?: string[];
 }> {
   try {
-    // Find the canonical .git directory for this worktree
-    const { stdout: gitDirRaw } = await execAsync('git rev-parse --git-dir', {
-      cwd: worktreePath,
-      timeout: 15000,
-    });
+    // Find the canonical .git directory for this worktree (execGitCommand avoids /bin/sh in CI)
+    const gitDirRaw = await execGitCommand(['rev-parse', '--git-dir'], worktreePath);
     const gitDir = path.resolve(worktreePath, gitDirRaw.trim());
 
     // Check for merge, rebase, and cherry-pick state files/directories
@@ -121,10 +141,10 @@ async function detectConflictState(worktreePath: string): Promise<{
     // Get list of conflicted files using machine-readable git status
     let conflictFiles: string[] = [];
     try {
-      const { stdout: statusOutput } = await execAsync('git diff --name-only --diff-filter=U', {
-        cwd: worktreePath,
-        timeout: 15000,
-      });
+      const statusOutput = await execGitCommand(
+        ['diff', '--name-only', '--diff-filter=U'],
+        worktreePath
+      );
       conflictFiles = statusOutput
         .trim()
         .split('\n')
@@ -146,13 +166,69 @@ async function detectConflictState(worktreePath: string): Promise<{
 
 async function getCurrentBranch(cwd: string): Promise<string> {
   try {
-    const { stdout } = await execAsync('git branch --show-current', { cwd });
+    const stdout = await execGitCommand(['branch', '--show-current'], cwd);
     return stdout.trim();
   } catch {
     return '';
   }
 }
 
+function normalizeBranchFromHeadRef(headRef: string): string | null {
+  let normalized = headRef.trim();
+  const prefixes = ['refs/heads/', 'refs/remotes/origin/', 'refs/remotes/', 'refs/'];
+
+  for (const prefix of prefixes) {
+    if (normalized.startsWith(prefix)) {
+      normalized = normalized.slice(prefix.length);
+      break;
+    }
+  }
+
+  // Return the full branch name, including any slashes (e.g., "feature/my-branch")
+  return normalized || null;
+}
+
+/**
+ * Attempt to recover the branch name for a worktree in detached HEAD state.
+ * This happens during rebase operations where git detaches HEAD from the branch.
+ * We look at git state files (rebase-merge/head-name, rebase-apply/head-name)
+ * to determine which branch the operation is targeting.
+ *
+ * Note: merge conflicts do NOT detach HEAD, so `git worktree list --porcelain`
+ * still includes the `branch` line for merge conflicts. This recovery is
+ * specifically for rebase and cherry-pick operations.
+ */
+async function recoverBranchForDetachedWorktree(worktreePath: string): Promise<string | null> {
+  try {
+    const gitDirRaw = await execGitCommand(['rev-parse', '--git-dir'], worktreePath);
+    const gitDir = path.resolve(worktreePath, gitDirRaw.trim());
+
+    // During a rebase, the original branch is stored in rebase-merge/head-name
+    try {
+      const headNamePath = path.join(gitDir, 'rebase-merge', 'head-name');
+      const headName = (await secureFs.readFile(headNamePath, 'utf-8')) as string;
+      const branch = normalizeBranchFromHeadRef(headName);
+      if (branch) return branch;
+    } catch {
+      // Not a rebase-merge
+    }
+
+    // rebase-apply also stores the original branch in head-name
+    try {
+      const headNamePath = path.join(gitDir, 'rebase-apply', 'head-name');
+      const headName = (await secureFs.readFile(headNamePath, 'utf-8')) as string;
+      const branch = normalizeBranchFromHeadRef(headName);
+      if (branch) return branch;
+    } catch {
+      // Not a rebase-apply
+    }
+
+    return null;
+  } catch {
+    return null;
+  }
+}
+
 /**
  * Scan the .worktrees directory to discover worktrees that may exist on disk
  * but are not registered with git (e.g., created externally or corrupted state).
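The prefix stripping in the added `normalizeBranchFromHeadRef` can be verified in isolation; this standalone mirror of the function (the input values are illustrative, modeled on a typical `rebase-merge/head-name` file) shows that slashes inside the branch name survive:

```typescript
// Standalone mirror of normalizeBranchFromHeadRef from the hunk above:
// strips one known ref prefix and keeps any remaining slashes intact.
function normalizeBranchFromHeadRef(headRef: string): string | null {
  let normalized = headRef.trim();
  const prefixes = ['refs/heads/', 'refs/remotes/origin/', 'refs/remotes/', 'refs/'];

  for (const prefix of prefixes) {
    if (normalized.startsWith(prefix)) {
      normalized = normalized.slice(prefix.length);
      break; // only one prefix is ever stripped
    }
  }

  return normalized || null;
}

// head-name files end with a newline; trim() handles it before matching.
const fromRebase = normalizeBranchFromHeadRef('refs/heads/feature/my-branch\n');
// Whitespace-only input normalizes to the empty string and becomes null.
const blank = normalizeBranchFromHeadRef('   ');
```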
@@ -204,22 +280,36 @@ async function scanWorktreesDirectory(
         });
       } else {
         // Try to get branch from HEAD if branch --show-current fails (detached HEAD)
+        let headBranch: string | null = null;
         try {
-          const { stdout: headRef } = await execAsync('git rev-parse --abbrev-ref HEAD', {
-            cwd: worktreePath,
-          });
-          const headBranch = headRef.trim();
-          if (headBranch && headBranch !== 'HEAD') {
-            logger.info(
-              `Discovered worktree in .worktrees/ not in git worktree list: ${entry.name} (branch: ${headBranch})`
-            );
-            discovered.push({
-              path: normalizedPath,
-              branch: headBranch,
-            });
+          const headRef = await execGitCommand(
+            ['rev-parse', '--abbrev-ref', 'HEAD'],
+            worktreePath
+          );
+          const ref = headRef.trim();
+          if (ref && ref !== 'HEAD') {
+            headBranch = ref;
           }
-        } catch {
-          // Can't determine branch, skip this directory
+        } catch (error) {
+          // Can't determine branch from HEAD ref (including timeout) - fall back to detached HEAD recovery
+          logger.debug(
+            `Failed to resolve HEAD ref for ${worktreePath}: ${getErrorMessage(error)}`
+          );
         }
+
+        // If HEAD is detached (rebase/merge in progress), try recovery from git state files
+        if (!headBranch) {
+          headBranch = await recoverBranchForDetachedWorktree(worktreePath);
+        }
+
+        if (headBranch) {
+          logger.info(
+            `Discovered worktree in .worktrees/ not in git worktree list: ${entry.name} (branch: ${headBranch})`
+          );
+          discovered.push({
+            path: normalizedPath,
+            branch: headBranch,
+          });
        }
      }
    }
  }
@@ -378,15 +468,14 @@ export function createListHandler() {
       // Get current branch in main directory
       const currentBranch = await getCurrentBranch(projectPath);
 
-      // Get actual worktrees from git
-      const { stdout } = await execAsync('git worktree list --porcelain', {
-        cwd: projectPath,
-      });
+      // Get actual worktrees from git (execGitCommand avoids /bin/sh in sandboxed CI)
+      const stdout = await execGitCommand(['worktree', 'list', '--porcelain'], projectPath);
 
       const worktrees: WorktreeInfo[] = [];
       const removedWorktrees: Array<{ path: string; branch: string }> = [];
+      let hasMissingWorktree = false;
       const lines = stdout.split('\n');
-      let current: { path?: string; branch?: string } = {};
+      let current: { path?: string; branch?: string; isDetached?: boolean } = {};
       let isFirst = true;
 
       // First pass: detect removed worktrees
@@ -395,8 +484,11 @@ export function createListHandler() {
           current.path = normalizePath(line.slice(9));
         } else if (line.startsWith('branch ')) {
           current.branch = line.slice(7).replace('refs/heads/', '');
+        } else if (line.startsWith('detached')) {
+          // Worktree is in detached HEAD state (e.g., during rebase)
+          current.isDetached = true;
         } else if (line === '') {
-          if (current.path && current.branch) {
+          if (current.path) {
             const isMainWorktree = isFirst;
             // Check if the worktree directory actually exists
             // Skip checking/pruning the main worktree (projectPath itself)
@@ -407,14 +499,19 @@ export function createListHandler() {
             } catch {
               worktreeExists = false;
             }
 
             if (!isMainWorktree && !worktreeExists) {
+              hasMissingWorktree = true;
               // Worktree directory doesn't exist - it was manually deleted
-              removedWorktrees.push({
-                path: current.path,
-                branch: current.branch,
-              });
-            } else {
-              // Worktree exists (or is main worktree), add it to the list
+              // Only add to removed list if we know the branch name
+              if (current.branch) {
+                removedWorktrees.push({
+                  path: current.path,
+                  branch: current.branch,
+                });
+              }
+            } else if (current.branch) {
+              // Normal case: worktree with a known branch
               worktrees.push({
                 path: current.path,
                 branch: current.branch,
@@ -423,16 +520,29 @@ export function createListHandler() {
                 hasWorktree: true,
               });
               isFirst = false;
+            } else if (current.isDetached && worktreeExists) {
+              // Detached HEAD (e.g., rebase in progress) - try to recover branch name.
+              // This is critical: without this, worktrees undergoing rebase/merge
+              // operations would silently disappear from the UI.
+              const recoveredBranch = await recoverBranchForDetachedWorktree(current.path);
+              worktrees.push({
+                path: current.path,
+                branch: recoveredBranch || `(detached)`,
+                isMain: isMainWorktree,
+                isCurrent: false,
+                hasWorktree: true,
+              });
+              isFirst = false;
             }
           }
           current = {};
         }
       }
 
-      // Prune removed worktrees from git (only if any were detected)
-      if (removedWorktrees.length > 0) {
+      // Prune removed worktrees from git (only if any missing worktrees were detected)
+      if (hasMissingWorktree) {
         try {
-          await execAsync('git worktree prune', { cwd: projectPath });
+          await execGitCommand(['worktree', 'prune'], projectPath);
         } catch {
           // Prune failed, but we'll still report the removed worktrees
         }
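The first pass above keys off three porcelain record lines (`worktree `, `branch `, and the new `detached` marker). A minimal standalone sketch of that state machine (field names match the diff; the porcelain sample is illustrative, not real repository output):

```typescript
// Minimal sketch of the `git worktree list --porcelain` first-pass parse
// from the hunk above: records are blank-line separated, and a record with
// a `detached` line instead of a `branch` line marks a detached HEAD.
interface ParsedWorktree {
  path?: string;
  branch?: string;
  isDetached?: boolean;
}

function parsePorcelain(stdout: string): ParsedWorktree[] {
  const records: ParsedWorktree[] = [];
  let current: ParsedWorktree = {};
  for (const line of stdout.split('\n')) {
    if (line.startsWith('worktree ')) {
      current.path = line.slice(9); // length of 'worktree '
    } else if (line.startsWith('branch ')) {
      current.branch = line.slice(7).replace('refs/heads/', '');
    } else if (line.startsWith('detached')) {
      current.isDetached = true;
    } else if (line === '') {
      // Blank line terminates a record; keep it if we saw a path.
      if (current.path) records.push(current);
      current = {};
    }
  }
  return records;
}

// Illustrative porcelain output: a main worktree plus one mid-rebase worktree.
const sample =
  'worktree /repo\nHEAD abc123\nbranch refs/heads/main\n\n' +
  'worktree /repo/.worktrees/feat\nHEAD def456\ndetached\n\n';
const parsed = parsePorcelain(sample);
```

Before this change, the `detached` record had no `branch` and was dropped entirely, which is why rebasing worktrees vanished from the list.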
@@ -461,9 +571,7 @@ export function createListHandler() {
       if (includeDetails) {
         for (const worktree of worktrees) {
           try {
-            const { stdout: statusOutput } = await execAsync('git status --porcelain', {
-              cwd: worktree.path,
-            });
+            const statusOutput = await execGitCommand(['status', '--porcelain'], worktree.path);
             const changedFiles = statusOutput
               .trim()
               .split('\n')
@@ -492,7 +600,7 @@ export function createListHandler() {
         }
       }
 
-      // Assign PR info to each worktree, preferring fresh GitHub data over cached metadata.
+      // Assign PR info to each worktree.
       // Only fetch GitHub PRs if includeDetails is requested (performance optimization).
       // Uses --state all to detect merged/closed PRs, limited to 1000 recent PRs.
       const githubPRs = includeDetails
@@ -510,14 +618,27 @@ export function createListHandler() {
         const metadata = allMetadata.get(worktree.branch);
         const githubPR = githubPRs.get(worktree.branch);
 
-        if (githubPR) {
-          // Prefer fresh GitHub data (it has the current state)
+        const metadataPR = metadata?.pr;
+        // Preserve explicit user-selected PR tracking from metadata when it differs
+        // from branch-derived GitHub PR lookup. This allows "Change PR Number" to
+        // persist instead of being overwritten by gh pr list for the branch.
+        const hasManualOverride =
+          !!metadataPR && !!githubPR && metadataPR.number !== githubPR.number;
+
+        if (hasManualOverride) {
+          worktree.pr = metadataPR;
+        } else if (githubPR) {
+          // Use fresh GitHub data when there is no explicit override.
           worktree.pr = githubPR;
 
-          // Sync metadata with GitHub state when:
-          // 1. No metadata exists for this PR (PR created externally)
-          // 2. State has changed (e.g., merged/closed on GitHub)
-          const needsSync = !metadata?.pr || metadata.pr.state !== githubPR.state;
+          // Sync metadata when missing or stale so fallback data stays current.
+          const needsSync =
+            !metadataPR ||
+            metadataPR.number !== githubPR.number ||
+            metadataPR.state !== githubPR.state ||
+            metadataPR.title !== githubPR.title ||
+            metadataPR.url !== githubPR.url ||
+            metadataPR.createdAt !== githubPR.createdAt;
           if (needsSync) {
             // Fire and forget - don't block the response
             updateWorktreePRInfo(projectPath, worktree.branch, githubPR).catch((err) => {
@@ -526,9 +647,9 @@ export function createListHandler() {
             );
           });
         }
-        } else if (metadata?.pr && metadata.pr.state === 'OPEN') {
+        } else if (metadataPR && metadataPR.state === 'OPEN') {
           // Fall back to stored metadata only if the PR is still OPEN
-          worktree.pr = metadata.pr;
+          worktree.pr = metadataPR;
         }
       }
 
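The PR-assignment change above has two decision points worth checking independently: a metadata PR with a different number is treated as a manual override, and metadata is resynced whenever any displayed field drifts from the GitHub copy. A standalone sketch of both predicates (the `PRInfo` shape here is reduced to the fields the diff compares; sample values are illustrative):

```typescript
// Standalone sketch of the PR-sync decisions from the hunk above.
interface PRInfo {
  number: number;
  state: string;
  title: string;
  url: string;
  createdAt: string;
}

// A differing PR number means the user pinned a PR via "Change PR Number".
function hasManualOverride(metadataPR?: PRInfo, githubPR?: PRInfo): boolean {
  return !!metadataPR && !!githubPR && metadataPR.number !== githubPR.number;
}

// Resync stored metadata when it is missing or any displayed field drifted.
function needsSync(metadataPR: PRInfo | undefined, githubPR: PRInfo): boolean {
  return (
    !metadataPR ||
    metadataPR.number !== githubPR.number ||
    metadataPR.state !== githubPR.state ||
    metadataPR.title !== githubPR.title ||
    metadataPR.url !== githubPR.url ||
    metadataPR.createdAt !== githubPR.createdAt
  );
}

const fresh: PRInfo = { number: 7, state: 'MERGED', title: 'Fix', url: 'u', createdAt: 't' };
const stale: PRInfo = { ...fresh, state: 'OPEN' }; // same PR, out-of-date state
const pinned: PRInfo = { ...fresh, number: 9 }; // user-selected different PR
```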
@@ -538,6 +659,26 @@ export function createListHandler() {
         removedWorktrees: removedWorktrees.length > 0 ? removedWorktrees : undefined,
       });
     } catch (error) {
+      // When git is unavailable (e.g. sandboxed E2E, PATH without git), return minimal list so UI still loads
+      if (isSpawnENOENT(error)) {
+        const projectPathFromBody = (req.body as { projectPath?: string })?.projectPath;
+        const mainPath = projectPathFromBody ? normalizePath(projectPathFromBody) : undefined;
+        if (mainPath) {
+          res.json({
+            success: true,
+            worktrees: [
+              {
+                path: mainPath,
+                branch: 'main',
+                isMain: true,
+                isCurrent: true,
+                hasWorktree: true,
+              },
+            ],
+          });
+          return;
+        }
+      }
       logError(error, 'List worktrees failed');
       res.status(500).json({ success: false, error: getErrorMessage(error) });
     }
@@ -23,9 +23,11 @@ import type { PullResult } from '../../../services/pull-service.js';
 export function createPullHandler() {
   return async (req: Request, res: Response): Promise<void> => {
     try {
-      const { worktreePath, remote, stashIfNeeded } = req.body as {
+      const { worktreePath, remote, remoteBranch, stashIfNeeded } = req.body as {
         worktreePath: string;
         remote?: string;
+        /** Specific remote branch to pull (e.g. 'main'). When provided, pulls this branch from the remote regardless of tracking config. */
+        remoteBranch?: string;
         /** When true, automatically stash local changes before pulling and reapply after */
         stashIfNeeded?: boolean;
       };
@@ -39,7 +41,7 @@ export function createPullHandler() {
       }
 
       // Execute the pull via the service
-      const result = await performPull(worktreePath, { remote, stashIfNeeded });
+      const result = await performPull(worktreePath, { remote, remoteBranch, stashIfNeeded });
 
       // Map service result to HTTP response
       mapResultToResponse(res, result);
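The diff adds the `remoteBranch` option but does not show how `performPull` consumes it. As a sketch only, under the assumption (stated in the option's doc comment) that an explicit branch is pulled from the remote regardless of tracking config, the mapping to git arguments might look like this; `buildPullArgs` is a hypothetical helper, not the actual service implementation:

```typescript
// Hypothetical sketch: how an explicit remoteBranch could map to git pull
// arguments. This mirrors the documented intent of the option above, not
// the real performPull internals (which this diff does not show).
function buildPullArgs(opts: { remote?: string; remoteBranch?: string }): string[] {
  const args = ['pull'];
  const remote = opts.remote ?? 'origin'; // assumed default remote
  if (opts.remoteBranch) {
    // Explicit branch: pull it regardless of the branch's tracking config.
    args.push(remote, opts.remoteBranch);
  } else if (opts.remote) {
    args.push(remote);
  }
  return args;
}

const explicit = buildPullArgs({ remoteBranch: 'main' });
const tracked = buildPullArgs({}); // plain `git pull` uses tracking config
```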
@@ -28,7 +28,7 @@ import * as secureFs from '../../lib/secure-fs.js';
 import { validateWorkingDirectory, createAutoModeOptions } from '../../lib/sdk-options.js';
 import {
   getPromptCustomization,
-  getProviderByModelId,
+  resolveProviderContext,
   getMCPServersFromSettings,
   getDefaultMaxTurnsSetting,
 } from '../../lib/settings-helpers.js';
@@ -226,8 +226,7 @@ export class AutoModeServiceFacade {
   /**
    * Shared agent-run helper used by both PipelineOrchestrator and ExecutionService.
    *
-   * Resolves the model string, looks up the custom provider/credentials via
-   * getProviderByModelId, then delegates to agentExecutor.execute with the
+   * Resolves provider/model context, then delegates to agentExecutor.execute with the
    * full payload. The opts parameter uses an index-signature union so it
    * accepts both the typed ExecutionService opts object and the looser
    * Record<string, unknown> used by PipelineOrchestrator without requiring
@@ -266,16 +265,19 @@ export class AutoModeServiceFacade {
       | import('@automaker/types').ClaudeCompatibleProvider
       | undefined;
     let credentials: import('@automaker/types').Credentials | undefined;
+    let providerResolvedModel: string | undefined;
+
     if (settingsService) {
-      const providerResult = await getProviderByModelId(
-        resolvedModel,
+      const providerId = opts?.providerId as string | undefined;
+      const result = await resolveProviderContext(
         settingsService,
+        resolvedModel,
+        providerId,
         '[AutoModeFacade]'
       );
-      if (providerResult.provider) {
-        claudeCompatibleProvider = providerResult.provider;
-        credentials = providerResult.credentials;
-      }
+      claudeCompatibleProvider = result.provider;
+      credentials = result.credentials;
+      providerResolvedModel = result.resolvedModel;
     }
 
     // Build sdkOptions with proper maxTurns and allowedTools for auto-mode.
@@ -301,7 +303,7 @@ export class AutoModeServiceFacade {
 
     const sdkOpts = createAutoModeOptions({
       cwd: workDir,
-      model: resolvedModel,
+      model: providerResolvedModel || resolvedModel,
       systemPrompt: opts?.systemPrompt,
       abortController,
       autoLoadClaudeMd,
@@ -313,8 +315,14 @@ export class AutoModeServiceFacade {
         | undefined,
     });
+
+    if (!sdkOpts) {
+      logger.error(
+        `[createRunAgentFn] sdkOpts is UNDEFINED! createAutoModeOptions type: ${typeof createAutoModeOptions}`
+      );
+    }
+
     logger.info(
-      `[createRunAgentFn] Feature ${featureId}: model=${resolvedModel}, ` +
+      `[createRunAgentFn] Feature ${featureId}: model=${resolvedModel} (resolved=${providerResolvedModel || resolvedModel}), ` +
       `maxTurns=${sdkOpts.maxTurns}, allowedTools=${(sdkOpts.allowedTools as string[])?.length ?? 'default'}, ` +
       `provider=${provider.getName()}`
     );
@@ -13,6 +13,8 @@ import path from 'path';
 import net from 'net';
 import { createLogger } from '@automaker/utils';
 import type { EventEmitter } from '../lib/events.js';
+import fs from 'fs/promises';
+import { constants } from 'fs';
 
 const logger = createLogger('DevServerService');
 
@@ -110,6 +112,21 @@ export interface DevServerInfo {
   urlDetected: boolean;
   // Timer for URL detection timeout fallback
   urlDetectionTimeout: NodeJS.Timeout | null;
+  // Custom command used to start the server
+  customCommand?: string;
+}
+
+/**
+ * Persistable subset of DevServerInfo for survival across server restarts
+ */
+interface PersistedDevServerInfo {
+  worktreePath: string;
+  allocatedPort: number;
+  port: number;
+  url: string;
+  startedAt: string;
+  urlDetected: boolean;
+  customCommand?: string;
 }
 
 // Port allocation starts at 3001 to avoid conflicts with common dev ports
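Because `dev-servers.json` is read back on startup, the commit validates each persisted entry field by field before trusting it. A standalone sketch of that validation as a type guard (mirroring the filter used in `loadState`; sample values are illustrative):

```typescript
// Standalone sketch of the field-by-field validation applied to persisted
// dev-server entries before they are re-adopted on startup.
interface PersistedDevServerInfo {
  worktreePath: string;
  allocatedPort: number;
  port: number;
  url: string;
  startedAt: string;
  urlDetected: boolean;
  customCommand?: string;
}

function isValidPort(v: unknown): v is number {
  return typeof v === 'number' && Number.isInteger(v) && v >= 1 && v <= 65535;
}

function isPersistedEntry(entry: unknown): entry is PersistedDevServerInfo {
  if (entry === null || typeof entry !== 'object') return false;
  const e = entry as Record<string, unknown>;
  return (
    typeof e.worktreePath === 'string' &&
    e.worktreePath.length > 0 &&
    isValidPort(e.allocatedPort) &&
    isValidPort(e.port) &&
    typeof e.url === 'string' &&
    typeof e.startedAt === 'string' &&
    typeof e.urlDetected === 'boolean' &&
    (e.customCommand === undefined || typeof e.customCommand === 'string')
  );
}

const good = {
  worktreePath: '/repo/.worktrees/feat',
  allocatedPort: 3001,
  port: 3001,
  url: 'http://localhost:3001',
  startedAt: '2024-01-01T00:00:00.000Z',
  urlDetected: true,
};
const badPort = { ...good, port: 99999 }; // out of the valid 1-65535 range
```

Dropping malformed entries instead of throwing means one corrupted record cannot prevent the rest of the state file from loading.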
@@ -121,8 +138,20 @@ const LIVERELOAD_PORTS = [35729, 35730, 35731] as const;
 
 class DevServerService {
   private runningServers: Map<string, DevServerInfo> = new Map();
+  private startingServers: Set<string> = new Set();
   private allocatedPorts: Set<number> = new Set();
   private emitter: EventEmitter | null = null;
+  private dataDir: string | null = null;
+  private saveQueue: Promise<void> = Promise.resolve();
+
+  /**
+   * Initialize the service with data directory for persistence
+   */
+  async initialize(dataDir: string, emitter: EventEmitter): Promise<void> {
+    this.dataDir = dataDir;
+    this.emitter = emitter;
+    await this.loadState();
+  }
+
   /**
    * Set the event emitter for streaming log events
@@ -132,6 +161,131 @@ class DevServerService {
     this.emitter = emitter;
   }
+
+  /**
+   * Save the current state of running servers to disk
+   */
+  private async saveState(): Promise<void> {
+    if (!this.dataDir) return;
+
+    // Queue the save operation to prevent concurrent writes
+    this.saveQueue = this.saveQueue
+      .then(async () => {
+        if (!this.dataDir) return;
+        try {
+          const statePath = path.join(this.dataDir, 'dev-servers.json');
+          const persistedInfo: PersistedDevServerInfo[] = Array.from(
+            this.runningServers.values()
+          ).map((s) => ({
+            worktreePath: s.worktreePath,
+            allocatedPort: s.allocatedPort,
+            port: s.port,
+            url: s.url,
+            startedAt: s.startedAt.toISOString(),
+            urlDetected: s.urlDetected,
+            customCommand: s.customCommand,
+          }));
+
+          await fs.writeFile(statePath, JSON.stringify(persistedInfo, null, 2));
+          logger.debug(`Saved dev server state to ${statePath}`);
+        } catch (error) {
+          logger.error('Failed to save dev server state:', error);
+        }
+      })
+      .catch((error) => {
+        logger.error('Error in save queue:', error);
+      });
+
+    return this.saveQueue;
+  }
+
+  /**
+   * Load the state of running servers from disk
+   */
+  private async loadState(): Promise<void> {
+    if (!this.dataDir) return;
+
+    try {
+      const statePath = path.join(this.dataDir, 'dev-servers.json');
+      try {
+        await fs.access(statePath, constants.F_OK);
+      } catch {
+        // File doesn't exist, which is fine
+        return;
+      }
+
+      const content = await fs.readFile(statePath, 'utf-8');
+      const rawParsed: unknown = JSON.parse(content);
+
+      if (!Array.isArray(rawParsed)) {
+        logger.warn('Dev server state file is not an array, skipping load');
+        return;
+      }
+
+      const persistedInfo: PersistedDevServerInfo[] = rawParsed.filter((entry: unknown) => {
+        if (entry === null || typeof entry !== 'object') {
+          logger.warn('Dropping invalid dev server entry (not an object):', entry);
+          return false;
+        }
+        const e = entry as Record<string, unknown>;
+        const valid =
+          typeof e.worktreePath === 'string' &&
+          e.worktreePath.length > 0 &&
+          typeof e.allocatedPort === 'number' &&
+          Number.isInteger(e.allocatedPort) &&
+          e.allocatedPort >= 1 &&
+          e.allocatedPort <= 65535 &&
+          typeof e.port === 'number' &&
+          Number.isInteger(e.port) &&
+          e.port >= 1 &&
+          e.port <= 65535 &&
+          typeof e.url === 'string' &&
+          typeof e.startedAt === 'string' &&
+          typeof e.urlDetected === 'boolean' &&
+          (e.customCommand === undefined || typeof e.customCommand === 'string');
+        if (!valid) {
+          logger.warn('Dropping malformed dev server entry:', e);
+        }
+        return valid;
+      }) as PersistedDevServerInfo[];
+
+      logger.info(`Loading ${persistedInfo.length} dev servers from state`);
+
+      for (const info of persistedInfo) {
+        // Check if the process is still running on the port
+        // Since we can't reliably re-attach to the process for output,
+        // we'll just check if the port is in use.
+        const portInUse = !(await this.isPortAvailable(info.port));
+
+        if (portInUse) {
+          logger.info(`Re-attached to dev server on port ${info.port} for ${info.worktreePath}`);
+          const serverInfo: DevServerInfo = {
+            ...info,
+            startedAt: new Date(info.startedAt),
+            process: null, // Process object is lost, but we know it's running
+            scrollbackBuffer: '',
+            outputBuffer: '',
+            flushTimeout: null,
+            stopping: false,
+            urlDetectionTimeout: null,
+          };
+          this.runningServers.set(info.worktreePath, serverInfo);
+          this.allocatedPorts.add(info.allocatedPort);
+        } else {
+          logger.info(
+            `Dev server on port ${info.port} for ${info.worktreePath} is no longer running`
+          );
+        }
+      }
+
+      // Cleanup stale entries from the file if any
+      if (this.runningServers.size !== persistedInfo.length) {
+        await this.saveState();
+      }
+    } catch (error) {
+      logger.error('Failed to load dev server state:', error);
+    }
+  }
+
   /**
    * Prune a stale server entry whose process has exited without cleanup.
    * Clears any pending timers, removes the port from allocatedPorts, deletes
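The `saveQueue` pattern in the added `saveState` deserves a closer look: chaining every write onto a single promise serializes writes, so concurrent callers can never interleave output to `dev-servers.json`. A self-contained sketch of that queueing technique (the file write is replaced by a log so ordering is observable):

```typescript
// Standalone sketch of the promise-chain write queue used by saveState
// above: chaining each write onto `queue` serializes writes even when
// callers fire concurrently.
class WriteQueue {
  private queue: Promise<void> = Promise.resolve();
  public log: string[] = [];

  save(label: string): Promise<void> {
    this.queue = this.queue
      .then(async () => {
        this.log.push(`start ${label}`);
        await Promise.resolve(); // stand-in for the async file write
        this.log.push(`end ${label}`);
      })
      .catch(() => {
        // A failed save must not wedge the queue for later saves.
      });
    return this.queue;
  }
}

async function demo(): Promise<string[]> {
  const q = new WriteQueue();
  // Fire two saves concurrently; the queue still runs them back to back.
  await Promise.all([q.save('a'), q.save('b')]);
  return q.log;
}
```

Without the queue, two overlapping `writeFile` calls to the same path could leave a truncated or interleaved JSON file on disk.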
@@ -148,6 +302,10 @@ class DevServerService {
|
|||||||
// been mutated by detectUrlFromOutput to reflect the actual detected port.
|
// been mutated by detectUrlFromOutput to reflect the actual detected port.
|
||||||
this.allocatedPorts.delete(server.allocatedPort);
|
this.allocatedPorts.delete(server.allocatedPort);
|
||||||
this.runningServers.delete(worktreePath);
|
this.runningServers.delete(worktreePath);
|
||||||
|
|
||||||
|
// Persist state change
|
||||||
|
this.saveState().catch((err) => logger.error('Failed to save state in pruneStaleServer:', err));
|
||||||
|
|
||||||
if (this.emitter) {
|
if (this.emitter) {
|
||||||
this.emitter.emit('dev-server:stopped', {
|
this.emitter.emit('dev-server:stopped', {
|
||||||
worktreePath,
|
worktreePath,
|
||||||
@@ -249,7 +407,7 @@ class DevServerService {
|
|||||||
* - PHP: "Development Server (http://localhost:8000) started"
|
* - PHP: "Development Server (http://localhost:8000) started"
|
||||||
* - Generic: Any localhost URL with a port
|
* - Generic: Any localhost URL with a port
|
||||||
*/
|
*/
|
||||||
private detectUrlFromOutput(server: DevServerInfo, content: string): void {
|
private async detectUrlFromOutput(server: DevServerInfo, content: string): Promise<void> {
|
||||||
// Skip if URL already detected
|
// Skip if URL already detected
|
||||||
if (server.urlDetected) {
|
if (server.urlDetected) {
|
||||||
return;
|
return;
|
||||||
@@ -304,6 +462,11 @@ class DevServerService {
|
|||||||
|
|
||||||
logger.info(`Detected server URL via ${description}: ${detectedUrl}`);
|
logger.info(`Detected server URL via ${description}: ${detectedUrl}`);
|
||||||
|
|
||||||
|
// Persist state change
|
||||||
|
await this.saveState().catch((err) =>
|
||||||
|
logger.error('Failed to save state in detectUrlFromOutput:', err)
|
||||||
|
);
|
||||||
|
|
||||||
// Emit URL update event
|
// Emit URL update event
|
||||||
if (this.emitter) {
|
if (this.emitter) {
|
||||||
this.emitter.emit('dev-server:url-detected', {
|
this.emitter.emit('dev-server:url-detected', {
|
||||||
@@ -346,6 +509,11 @@ class DevServerService {
 
     logger.info(`Detected server port via ${description}: ${detectedPort} → ${detectedUrl}`);
 
+    // Persist state change
+    await this.saveState().catch((err) =>
+      logger.error('Failed to save state in detectUrlFromOutput Phase 2:', err)
+    );
+
     // Emit URL update event
     if (this.emitter) {
       this.emitter.emit('dev-server:url-detected', {
@@ -365,7 +533,7 @@ class DevServerService {
    * Handle incoming stdout/stderr data from dev server process
    * Buffers data for scrollback replay and schedules throttled emission
    */
-  private handleProcessOutput(server: DevServerInfo, data: Buffer): void {
+  private async handleProcessOutput(server: DevServerInfo, data: Buffer): Promise<void> {
     // Skip output if server is stopping
     if (server.stopping) {
       return;
@@ -374,7 +542,7 @@ class DevServerService {
     const content = data.toString();
 
     // Try to detect actual server URL from output
-    this.detectUrlFromOutput(server, content);
+    await this.detectUrlFromOutput(server, content);
 
     // Append to scrollback buffer for replay on reconnect
     this.appendToScrollback(server, content);
@@ -594,261 +762,305 @@ class DevServerService {
     };
     error?: string;
   }> {
-    // Check if already running
-    if (this.runningServers.has(worktreePath)) {
-      const existing = this.runningServers.get(worktreePath)!;
-      return {
-        success: true,
-        result: {
-          worktreePath: existing.worktreePath,
-          port: existing.port,
-          url: existing.url,
-          message: `Dev server already running on port ${existing.port}`,
-        },
-      };
-    }
-
-    // Verify the worktree exists
-    if (!(await this.fileExists(worktreePath))) {
-      return {
-        success: false,
-        error: `Worktree path does not exist: ${worktreePath}`,
-      };
-    }
-
-    // Determine the dev command to use
-    let devCommand: { cmd: string; args: string[] };
-
-    // Normalize custom command: trim whitespace and treat empty strings as undefined
-    const normalizedCustomCommand = customCommand?.trim();
-
-    if (normalizedCustomCommand) {
-      // Use the provided custom command
-      devCommand = this.parseCustomCommand(normalizedCustomCommand);
-      if (!devCommand.cmd) {
-        return {
-          success: false,
-          error: 'Invalid custom command: command cannot be empty',
-        };
-      }
-      logger.debug(`Using custom command: ${normalizedCustomCommand}`);
-    } else {
-      // Check for package.json when auto-detecting
-      const packageJsonPath = path.join(worktreePath, 'package.json');
-      if (!(await this.fileExists(packageJsonPath))) {
-        return {
-          success: false,
-          error: `No package.json found in: ${worktreePath}`,
-        };
-      }
-
-      // Get dev command from package manager detection
-      const detectedCommand = await this.getDevCommand(worktreePath);
-      if (!detectedCommand) {
-        return {
-          success: false,
-          error: `Could not determine dev command for: ${worktreePath}`,
-        };
-      }
-      devCommand = detectedCommand;
-    }
-
-    // Find available port
-    let port: number;
-    try {
-      port = await this.findAvailablePort();
-    } catch (error) {
-      return {
-        success: false,
-        error: error instanceof Error ? error.message : 'Port allocation failed',
-      };
-    }
-
-    // Reserve the port (port was already force-killed in findAvailablePort)
-    this.allocatedPorts.add(port);
-
-    // Also kill common related ports (livereload, etc.)
-    // Some dev servers use fixed ports for HMR/livereload regardless of main port
-    for (const relatedPort of LIVERELOAD_PORTS) {
-      this.killProcessOnPort(relatedPort);
-    }
-
-    // Small delay to ensure related ports are freed
-    await new Promise((resolve) => setTimeout(resolve, 100));
-
-    logger.info(`Starting dev server on port ${port}`);
-    logger.debug(`Working directory (cwd): ${worktreePath}`);
-    logger.debug(`Command: ${devCommand.cmd} ${devCommand.args.join(' ')} with PORT=${port}`);
-
-    // Spawn the dev process with PORT environment variable
-    // FORCE_COLOR enables colored output even when not running in a TTY
-    const env = {
-      ...process.env,
-      PORT: String(port),
-      FORCE_COLOR: '1',
-      // Some tools use these additional env vars for color detection
-      COLORTERM: 'truecolor',
-      TERM: 'xterm-256color',
-    };
-
-    const devProcess = spawn(devCommand.cmd, devCommand.args, {
-      cwd: worktreePath,
-      env,
-      stdio: ['ignore', 'pipe', 'pipe'],
-      detached: false,
-    });
-
-    // Track if process failed early using object to work around TypeScript narrowing
-    const status = { error: null as string | null, exited: false };
-
-    // Create server info early so we can reference it in handlers
-    // We'll add it to runningServers after verifying the process started successfully
-    const hostname = process.env.HOSTNAME || 'localhost';
-    const serverInfo: DevServerInfo = {
-      worktreePath,
-      allocatedPort: port, // Immutable: records which port we reserved; never changed after this point
-      port,
-      url: `http://${hostname}:${port}`, // Initial URL, may be updated by detectUrlFromOutput
-      process: devProcess,
-      startedAt: new Date(),
-      scrollbackBuffer: '',
-      outputBuffer: '',
-      flushTimeout: null,
-      stopping: false,
-      urlDetected: false, // Will be set to true when actual URL is detected from output
-      urlDetectionTimeout: null, // Will be set after server starts successfully
-    };
-
-    // Capture stdout with buffer management and event emission
-    if (devProcess.stdout) {
-      devProcess.stdout.on('data', (data: Buffer) => {
-        this.handleProcessOutput(serverInfo, data);
-      });
-    }
-
-    // Capture stderr with buffer management and event emission
-    if (devProcess.stderr) {
-      devProcess.stderr.on('data', (data: Buffer) => {
-        this.handleProcessOutput(serverInfo, data);
-      });
-    }
-
-    // Helper to clean up resources and emit stop event
-    const cleanupAndEmitStop = (exitCode: number | null, errorMessage?: string) => {
-      if (serverInfo.flushTimeout) {
-        clearTimeout(serverInfo.flushTimeout);
-        serverInfo.flushTimeout = null;
-      }
-
-      // Clear URL detection timeout to prevent stale fallback emission
-      if (serverInfo.urlDetectionTimeout) {
-        clearTimeout(serverInfo.urlDetectionTimeout);
-        serverInfo.urlDetectionTimeout = null;
-      }
-
-      // Emit stopped event (only if not already stopping - prevents duplicate events)
-      if (this.emitter && !serverInfo.stopping) {
-        this.emitter.emit('dev-server:stopped', {
-          worktreePath,
-          port: serverInfo.port, // Use the detected port (may differ from allocated port if detectUrlFromOutput updated it)
-          exitCode,
-          error: errorMessage,
-          timestamp: new Date().toISOString(),
-        });
-      }
-
-      this.allocatedPorts.delete(serverInfo.allocatedPort);
-      this.runningServers.delete(worktreePath);
-    };
-
-    devProcess.on('error', (error) => {
-      logger.error(`Process error:`, error);
-      status.error = error.message;
-      cleanupAndEmitStop(null, error.message);
-    });
-
-    devProcess.on('exit', (code) => {
-      logger.info(`Process for ${worktreePath} exited with code ${code}`);
-      status.exited = true;
-      cleanupAndEmitStop(code);
-    });
-
-    // Wait a moment to see if the process fails immediately
-    await new Promise((resolve) => setTimeout(resolve, 500));
-
-    if (status.error) {
-      return {
-        success: false,
-        error: `Failed to start dev server: ${status.error}`,
-      };
-    }
-
-    if (status.exited) {
-      return {
-        success: false,
-        error: `Dev server process exited immediately. Check server logs for details.`,
-      };
-    }
-
-    // Server started successfully - add to running servers map
-    this.runningServers.set(worktreePath, serverInfo);
-
-    // Emit started event for WebSocket subscribers
-    if (this.emitter) {
-      this.emitter.emit('dev-server:started', {
-        worktreePath,
-        port,
-        url: serverInfo.url,
-        timestamp: new Date().toISOString(),
-      });
-    }
-
-    // Set up URL detection timeout fallback.
-    // If URL detection hasn't succeeded after URL_DETECTION_TIMEOUT_MS, check if
-    // the allocated port is actually in use (server probably started successfully)
-    // and emit a url-detected event with the allocated port as fallback.
-    // Also re-scan the scrollback buffer in case the URL was printed before
-    // our patterns could match (e.g., it was split across multiple data chunks).
-    serverInfo.urlDetectionTimeout = setTimeout(() => {
-      serverInfo.urlDetectionTimeout = null;
-
-      // Only run fallback if server is still running and URL wasn't detected
-      if (serverInfo.stopping || serverInfo.urlDetected || !this.runningServers.has(worktreePath)) {
-        return;
-      }
-
-      // Re-scan the entire scrollback buffer for URL patterns
-      // This catches cases where the URL was split across multiple output chunks
-      logger.info(`URL detection timeout for ${worktreePath}, re-scanning scrollback buffer`);
-      this.detectUrlFromOutput(serverInfo, serverInfo.scrollbackBuffer);
-
-      // If still not detected after full rescan, use the allocated port as fallback
-      if (!serverInfo.urlDetected) {
-        logger.info(`URL detection fallback: using allocated port ${port} for ${worktreePath}`);
-        const fallbackUrl = `http://${hostname}:${port}`;
-        serverInfo.url = fallbackUrl;
-        serverInfo.urlDetected = true;
-
-        if (this.emitter) {
-          this.emitter.emit('dev-server:url-detected', {
-            worktreePath,
-            url: fallbackUrl,
-            port,
-            timestamp: new Date().toISOString(),
-          });
-        }
-      }
-    }, URL_DETECTION_TIMEOUT_MS);
-
-    return {
-      success: true,
-      result: {
-        worktreePath,
-        port,
-        url: `http://${hostname}:${port}`,
-        message: `Dev server started on port ${port}`,
-      },
-    };
+    // Check if already running or starting
+    if (this.runningServers.has(worktreePath) || this.startingServers.has(worktreePath)) {
+      const existing = this.runningServers.get(worktreePath);
+      if (existing) {
+        return {
+          success: true,
+          result: {
+            worktreePath: existing.worktreePath,
+            port: existing.port,
+            url: existing.url,
+            message: `Dev server already running on port ${existing.port}`,
+          },
+        };
+      }
+
+      return {
+        success: false,
+        error: 'Dev server is already starting',
+      };
+    }
+
+    this.startingServers.add(worktreePath);
+
+    try {
+      // Verify the worktree exists
+      if (!(await this.fileExists(worktreePath))) {
+        return {
+          success: false,
+          error: `Worktree path does not exist: ${worktreePath}`,
+        };
+      }
+
+      // Determine the dev command to use
+      let devCommand: { cmd: string; args: string[] };
+
+      // Normalize custom command: trim whitespace and treat empty strings as undefined
+      const normalizedCustomCommand = customCommand?.trim();
+
+      if (normalizedCustomCommand) {
+        // Use the provided custom command
+        devCommand = this.parseCustomCommand(normalizedCustomCommand);
+        if (!devCommand.cmd) {
+          return {
+            success: false,
+            error: 'Invalid custom command: command cannot be empty',
+          };
+        }
+        logger.debug(`Using custom command: ${normalizedCustomCommand}`);
+      } else {
+        // Check for package.json when auto-detecting
+        const packageJsonPath = path.join(worktreePath, 'package.json');
+        if (!(await this.fileExists(packageJsonPath))) {
+          return {
+            success: false,
+            error: `No package.json found in: ${worktreePath}`,
+          };
+        }
+
+        // Get dev command from package manager detection
+        const detectedCommand = await this.getDevCommand(worktreePath);
+        if (!detectedCommand) {
+          return {
+            success: false,
+            error: `Could not determine dev command for: ${worktreePath}`,
+          };
+        }
+        devCommand = detectedCommand;
+      }
+
+      // Find available port
+      let port: number;
+      try {
+        port = await this.findAvailablePort();
+      } catch (error) {
+        return {
+          success: false,
+          error: error instanceof Error ? error.message : 'Port allocation failed',
+        };
+      }
+
+      // Reserve the port (port was already force-killed in findAvailablePort)
+      this.allocatedPorts.add(port);
+
+      // Also kill common related ports (livereload, etc.)
+      // Some dev servers use fixed ports for HMR/livereload regardless of main port
+      for (const relatedPort of LIVERELOAD_PORTS) {
+        this.killProcessOnPort(relatedPort);
+      }
+
+      // Small delay to ensure related ports are freed
+      await new Promise((resolve) => setTimeout(resolve, 100));
+
+      logger.info(`Starting dev server on port ${port}`);
+      logger.debug(`Working directory (cwd): ${worktreePath}`);
+      logger.debug(`Command: ${devCommand.cmd} ${devCommand.args.join(' ')} with PORT=${port}`);
+
+      // Emit starting only after preflight checks pass to avoid dangling starting state.
+      if (this.emitter) {
+        this.emitter.emit('dev-server:starting', {
+          worktreePath,
+          timestamp: new Date().toISOString(),
+        });
+      }
+
+      // Spawn the dev process with PORT environment variable
+      // FORCE_COLOR enables colored output even when not running in a TTY
+      const env = {
+        ...process.env,
+        PORT: String(port),
+        FORCE_COLOR: '1',
+        // Some tools use these additional env vars for color detection
+        COLORTERM: 'truecolor',
+        TERM: 'xterm-256color',
+      };
+
+      const devProcess = spawn(devCommand.cmd, devCommand.args, {
+        cwd: worktreePath,
+        env,
+        stdio: ['ignore', 'pipe', 'pipe'],
+        detached: false,
+      });
+
+      // Track if process failed early using object to work around TypeScript narrowing
+      const status = { error: null as string | null, exited: false };
+
+      // Create server info early so we can reference it in handlers
+      // We'll add it to runningServers after verifying the process started successfully
+      const fallbackHost = 'localhost';
+      const serverInfo: DevServerInfo = {
+        worktreePath,
+        allocatedPort: port, // Immutable: records which port we reserved; never changed after this point
+        port,
+        url: `http://${fallbackHost}:${port}`, // Initial URL, may be updated by detectUrlFromOutput
+        process: devProcess,
+        startedAt: new Date(),
+        scrollbackBuffer: '',
+        outputBuffer: '',
+        flushTimeout: null,
+        stopping: false,
+        urlDetected: false, // Will be set to true when actual URL is detected from output
+        urlDetectionTimeout: null, // Will be set after server starts successfully
+        customCommand: normalizedCustomCommand,
+      };
+
+      // Capture stdout with buffer management and event emission
+      if (devProcess.stdout) {
+        devProcess.stdout.on('data', (data: Buffer) => {
+          this.handleProcessOutput(serverInfo, data).catch((error: unknown) => {
+            logger.error('Failed to handle dev server stdout output:', error);
+          });
+        });
+      }
+
+      // Capture stderr with buffer management and event emission
+      if (devProcess.stderr) {
+        devProcess.stderr.on('data', (data: Buffer) => {
+          this.handleProcessOutput(serverInfo, data).catch((error: unknown) => {
+            logger.error('Failed to handle dev server stderr output:', error);
+          });
+        });
+      }
+
+      // Helper to clean up resources and emit stop event
+      const cleanupAndEmitStop = (exitCode: number | null, errorMessage?: string) => {
+        if (serverInfo.flushTimeout) {
+          clearTimeout(serverInfo.flushTimeout);
+          serverInfo.flushTimeout = null;
+        }
+
+        // Clear URL detection timeout to prevent stale fallback emission
+        if (serverInfo.urlDetectionTimeout) {
+          clearTimeout(serverInfo.urlDetectionTimeout);
+          serverInfo.urlDetectionTimeout = null;
+        }
+
+        // Emit stopped event (only if not already stopping - prevents duplicate events)
+        if (this.emitter && !serverInfo.stopping) {
+          this.emitter.emit('dev-server:stopped', {
+            worktreePath,
+            port: serverInfo.port, // Use the detected port (may differ from allocated port if detectUrlFromOutput updated it)
+            exitCode,
+            error: errorMessage,
+            timestamp: new Date().toISOString(),
+          });
+        }
+
+        this.allocatedPorts.delete(serverInfo.allocatedPort);
+        this.runningServers.delete(worktreePath);
+
+        // Persist state change
+        this.saveState().catch((err) => logger.error('Failed to save state in cleanup:', err));
+      };
+
+      devProcess.on('error', (error) => {
+        logger.error(`Process error:`, error);
+        status.error = error.message;
+        cleanupAndEmitStop(null, error.message);
+      });
+
+      devProcess.on('exit', (code) => {
+        logger.info(`Process for ${worktreePath} exited with code ${code}`);
+        status.exited = true;
+        cleanupAndEmitStop(code);
+      });
+
+      // Wait a moment to see if the process fails immediately
+      await new Promise((resolve) => setTimeout(resolve, 500));
+
+      if (status.error) {
+        return {
+          success: false,
+          error: `Failed to start dev server: ${status.error}`,
+        };
+      }
+
+      if (status.exited) {
+        return {
+          success: false,
+          error: `Dev server process exited immediately. Check server logs for details.`,
+        };
+      }
+
+      // Server started successfully - add to running servers map
+      this.runningServers.set(worktreePath, serverInfo);
+
+      // Persist state change
+      await this.saveState().catch((err) =>
+        logger.error('Failed to save state in startDevServer:', err)
+      );
+
+      // Emit started event for WebSocket subscribers
+      if (this.emitter) {
+        this.emitter.emit('dev-server:started', {
+          worktreePath,
+          port,
+          url: serverInfo.url,
+          timestamp: new Date().toISOString(),
+        });
+      }
+
+      // Set up URL detection timeout fallback.
+      // If URL detection hasn't succeeded after URL_DETECTION_TIMEOUT_MS, check if
+      // the allocated port is actually in use (server probably started successfully)
+      // and emit a url-detected event with the allocated port as fallback.
+      // Also re-scan the scrollback buffer in case the URL was printed before
+      // our patterns could match (e.g., it was split across multiple data chunks).
+      serverInfo.urlDetectionTimeout = setTimeout(async () => {
+        serverInfo.urlDetectionTimeout = null;
+
+        // Only run fallback if server is still running and URL wasn't detected
+        if (
+          serverInfo.stopping ||
+          serverInfo.urlDetected ||
+          !this.runningServers.has(worktreePath)
+        ) {
+          return;
+        }
+
+        // Re-scan the entire scrollback buffer for URL patterns
+        // This catches cases where the URL was split across multiple output chunks
+        logger.info(`URL detection timeout for ${worktreePath}, re-scanning scrollback buffer`);
+        await this.detectUrlFromOutput(serverInfo, serverInfo.scrollbackBuffer).catch((err) =>
+          logger.error('Failed to re-scan scrollback buffer:', err)
+        );
+
+        // If still not detected after full rescan, use the allocated port as fallback
+        if (!serverInfo.urlDetected) {
+          logger.info(`URL detection fallback: using allocated port ${port} for ${worktreePath}`);
+          const fallbackUrl = `http://${fallbackHost}:${port}`;
+          serverInfo.url = fallbackUrl;
+          serverInfo.urlDetected = true;
+
+          // Persist state change
+          await this.saveState().catch((err) =>
+            logger.error('Failed to save state in URL detection fallback:', err)
+          );
+
+          if (this.emitter) {
+            this.emitter.emit('dev-server:url-detected', {
+              worktreePath: serverInfo.worktreePath,
+              url: fallbackUrl,
+              port,
+              timestamp: new Date().toISOString(),
+            });
+          }
+        }
+      }, URL_DETECTION_TIMEOUT_MS);
+
+      return {
+        success: true,
+        result: {
+          worktreePath: serverInfo.worktreePath,
+          port: serverInfo.port,
+          url: serverInfo.url,
+          message: `Dev server started on port ${port}`,
+        },
+      };
+    } finally {
+      this.startingServers.delete(worktreePath);
+    }
   }
 
   /**
@@ -904,9 +1116,11 @@ class DevServerService {
       });
     }
 
-    // Kill the process
+    // Kill the process; persisted/re-attached entries may not have a process handle.
     if (server.process && !server.process.killed) {
      server.process.kill('SIGTERM');
+    } else {
+      this.killProcessOnPort(server.port);
     }
 
     // Free the originally-reserved port slot (allocatedPort is immutable and always
@@ -915,6 +1129,11 @@ class DevServerService {
     this.allocatedPorts.delete(server.allocatedPort);
     this.runningServers.delete(worktreePath);
 
+    // Persist state change
+    await this.saveState().catch((err) =>
+      logger.error('Failed to save state in stopDevServer:', err)
+    );
+
     return {
       success: true,
       result: {
@@ -214,7 +214,12 @@ ${feature.spec}
     const branchName = feature.branchName;
     if (!worktreePath && useWorktrees && branchName) {
       worktreePath = await this.worktreeResolver.findWorktreeForBranch(projectPath, branchName);
-      if (worktreePath) logger.info(`Using worktree for branch "${branchName}": ${worktreePath}`);
+      if (!worktreePath) {
+        throw new Error(
+          `Worktree enabled but no worktree found for feature branch "${branchName}".`
+        );
+      }
+      logger.info(`Using worktree for branch "${branchName}": ${worktreePath}`);
     }
     const workDir = worktreePath ? path.resolve(worktreePath) : path.resolve(projectPath);
     validateWorkingDirectory(workDir);
@@ -304,6 +309,7 @@ ${feature.spec}
         useClaudeCodeSystemPrompt,
         thinkingLevel: feature.thinkingLevel,
         reasoningEffort: feature.reasoningEffort,
+        providerId: feature.providerId,
         branchName: feature.branchName ?? null,
       }
     );
@@ -370,6 +376,7 @@ Please continue from where you left off and complete all remaining tasks. Use th
         useClaudeCodeSystemPrompt,
         thinkingLevel: feature.thinkingLevel,
         reasoningEffort: feature.reasoningEffort,
+        providerId: feature.providerId,
         branchName: feature.branchName ?? null,
       }
     );
@@ -34,6 +34,7 @@ export type RunAgentFn = (
     useClaudeCodeSystemPrompt?: boolean;
     thinkingLevel?: ThinkingLevel;
     reasoningEffort?: ReasoningEffort;
+    providerId?: string;
     branchName?: string | null;
   }
 ) => Promise<void>;
@@ -135,6 +135,7 @@ export class PipelineOrchestrator {
         thinkingLevel: feature.thinkingLevel,
         reasoningEffort: feature.reasoningEffort,
         status: currentStatus,
+        providerId: feature.providerId,
       }
     );
     try {
@@ -503,8 +504,10 @@ export class PipelineOrchestrator {
         requirePlanApproval: false,
         useClaudeCodeSystemPrompt: context.useClaudeCodeSystemPrompt,
         autoLoadClaudeMd: context.autoLoadClaudeMd,
+        thinkingLevel: context.feature.thinkingLevel,
         reasoningEffort: context.feature.reasoningEffort,
         status: context.feature.status,
+        providerId: context.feature.providerId,
       }
     );
   }
@@ -28,6 +28,8 @@ const logger = createLogger('PullService');
|
|||||||
export interface PullOptions {
|
export interface PullOptions {
|
||||||
/** Remote name to pull from (defaults to 'origin') */
|
/** Remote name to pull from (defaults to 'origin') */
|
||||||
remote?: string;
|
remote?: string;
|
||||||
|
/** Specific remote branch to pull (e.g. 'main'). When provided, overrides the tracking branch and fetches this branch from the remote. */
|
||||||
|
remoteBranch?: string;
|
||||||
/** When true, automatically stash local changes before pulling and reapply after */
|
/** When true, automatically stash local changes before pulling and reapply after */
|
||||||
stashIfNeeded?: boolean;
|
stashIfNeeded?: boolean;
|
||||||
}
|
}
|
||||||
@@ -243,6 +245,7 @@ export async function performPull(
|
|||||||
): Promise<PullResult> {
|
): Promise<PullResult> {
|
||||||
const targetRemote = options?.remote || 'origin';
|
const targetRemote = options?.remote || 'origin';
|
||||||
const stashIfNeeded = options?.stashIfNeeded ?? false;
|
const stashIfNeeded = options?.stashIfNeeded ?? false;
|
||||||
|
const targetRemoteBranch = options?.remoteBranch;
|
||||||
|
|
||||||
// 1. Get current branch name
|
// 1. Get current branch name
|
||||||
let branchName: string;
|
let branchName: string;
|
||||||
@@ -313,24 +316,34 @@ export async function performPull(
|
|||||||
}
|
}
|
||||||
|
|
||||||
// 7. Verify upstream tracking or remote branch exists
|
// 7. Verify upstream tracking or remote branch exists
|
||||||
const upstreamStatus = await hasUpstreamOrRemoteBranch(worktreePath, branchName, targetRemote);
|
// Skip this check when a specific remote branch is provided - we always use
|
||||||
if (upstreamStatus === 'none') {
|
// explicit 'git pull <remote> <branch>' args in that case.
|
||||||
let stashRecoveryFailed = false;
|
let upstreamStatus: UpstreamStatus = 'tracking';
|
||||||
if (didStash) {
|
if (!targetRemoteBranch) {
|
||||||
const stashPopped = await tryPopStash(worktreePath);
|
upstreamStatus = await hasUpstreamOrRemoteBranch(worktreePath, branchName, targetRemote);
|
||||||
stashRecoveryFailed = !stashPopped;
|
if (upstreamStatus === 'none') {
|
||||||
|
let stashRecoveryFailed = false;
|
||||||
|
if (didStash) {
|
||||||
|
const stashPopped = await tryPopStash(worktreePath);
|
||||||
|
stashRecoveryFailed = !stashPopped;
|
||||||
|
}
|
||||||
|
return {
|
||||||
|
success: false,
|
||||||
|
error: `Branch '${branchName}' has no upstream branch on remote '${targetRemote}'. Push it first or set upstream with: git branch --set-upstream-to=${targetRemote}/${branchName}${stashRecoveryFailed ? ' Local changes remain stashed and need manual recovery (run: git stash pop).' : ''}`,
|
||||||
|
stashRecoveryFailed: stashRecoveryFailed ? stashRecoveryFailed : undefined,
|
||||||
|
};
|
||||||
}
|
}
|
||||||
return {
|
|
||||||
success: false,
|
|
||||||
error: `Branch '${branchName}' has no upstream branch on remote '${targetRemote}'. Push it first or set upstream with: git branch --set-upstream-to=${targetRemote}/${branchName}${stashRecoveryFailed ? ' Local changes remain stashed and need manual recovery (run: git stash pop).' : ''}`,
|
|
||||||
stashRecoveryFailed: stashRecoveryFailed ? stashRecoveryFailed : undefined,
|
|
||||||
};
|
|
||||||
}
|
}
|
// 8. Pull latest changes
+// When a specific remote branch is requested, always use explicit remote + branch args.
// When the branch has a configured upstream tracking ref, let Git use it automatically.
// When only the remote branch exists (no tracking ref), explicitly specify remote and branch.
-const pullArgs = upstreamStatus === 'tracking' ? ['pull'] : ['pull', targetRemote, branchName];
+const pullArgs = targetRemoteBranch
+  ? ['pull', targetRemote, targetRemoteBranch]
+  : upstreamStatus === 'tracking'
+    ? ['pull']
+    : ['pull', targetRemote, branchName];
let pullConflict = false;
let pullConflictFiles: string[] = [];
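The pull-argument selection above reduces to three cases. A standalone sketch of the same branching; the `buildPullArgs` helper name and the `upstreamStatus` union type are illustrative, not part of this diff:

```typescript
// Illustrative helper mirroring the pullArgs ternary above.
// buildPullArgs is our name for this sketch; it is not in the codebase.
function buildPullArgs(
  targetRemote: string,
  branchName: string,
  upstreamStatus: 'tracking' | 'remote-only' | 'none',
  targetRemoteBranch?: string
): string[] {
  if (targetRemoteBranch) {
    // A specific remote branch was requested: always be explicit.
    return ['pull', targetRemote, targetRemoteBranch];
  }
  if (upstreamStatus === 'tracking') {
    // A tracking ref is configured: let git resolve it.
    return ['pull'];
  }
  // Remote branch exists but no tracking ref: pass remote and branch.
  return ['pull', targetRemote, branchName];
}

console.log(buildPullArgs('origin', 'main', 'tracking'));            // ['pull']
console.log(buildPullArgs('origin', 'main', 'remote-only'));         // ['pull', 'origin', 'main']
console.log(buildPullArgs('origin', 'main', 'tracking', 'feature')); // ['pull', 'origin', 'feature']
```

The explicit-remote-branch case wins over the tracking check, matching the order of the ternary in the diff.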
@@ -39,6 +39,18 @@ export interface WorktreeInfo {
 * 3. Listing all worktrees with normalized paths
 */
export class WorktreeResolver {
+  private normalizeBranchName(branchName: string | null | undefined): string | null {
+    if (!branchName) return null;
+    let normalized = branchName.trim();
+    if (!normalized) return null;
+
+    normalized = normalized.replace(/^refs\/heads\//, '');
+    normalized = normalized.replace(/^refs\/remotes\/[^/]+\//, '');
+    normalized = normalized.replace(/^(origin|upstream)\//, '');
+
+    return normalized || null;
+  }
+
  /**
   * Get the current branch name for a git repository
   *
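The normalization added above can be exercised on its own. A standalone copy of the same rules (as a free function, outside the class, for illustration):

```typescript
// Copy of WorktreeResolver.normalizeBranchName from the hunk above,
// lifted out of the class so it can be run standalone.
function normalizeBranchName(branchName: string | null | undefined): string | null {
  if (!branchName) return null;
  let normalized = branchName.trim();
  if (!normalized) return null;

  // Strip full ref prefixes first, then remote shorthand.
  normalized = normalized.replace(/^refs\/heads\//, '');
  normalized = normalized.replace(/^refs\/remotes\/[^/]+\//, '');
  normalized = normalized.replace(/^(origin|upstream)\//, '');

  return normalized || null;
}

console.log(normalizeBranchName('refs/heads/feature/x'));     // 'feature/x'
console.log(normalizeBranchName('refs/remotes/origin/main')); // 'main'
console.log(normalizeBranchName('origin/main'));              // 'main'
console.log(normalizeBranchName('   '));                      // null
```

Because both sides of the comparison in `findWorktreeForBranch` go through this function, `refs/heads/main`, `origin/main`, and `main` all resolve to the same worktree.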
@@ -64,6 +76,9 @@ export class WorktreeResolver {
   */
  async findWorktreeForBranch(projectPath: string, branchName: string): Promise<string | null> {
    try {
+      const normalizedTargetBranch = this.normalizeBranchName(branchName);
+      if (!normalizedTargetBranch) return null;
+
      const { stdout } = await execAsync('git worktree list --porcelain', {
        cwd: projectPath,
      });
@@ -76,10 +91,10 @@
        if (line.startsWith('worktree ')) {
          currentPath = line.slice(9);
        } else if (line.startsWith('branch ')) {
-          currentBranch = line.slice(7).replace('refs/heads/', '');
+          currentBranch = this.normalizeBranchName(line.slice(7));
        } else if (line === '' && currentPath && currentBranch) {
          // End of a worktree entry
-          if (currentBranch === branchName) {
+          if (currentBranch === normalizedTargetBranch) {
            // Resolve to absolute path - git may return relative paths
            // On Windows, this is critical for cwd to work correctly
            // On all platforms, absolute paths ensure consistent behavior
@@ -91,7 +106,7 @@
      }

      // Check the last entry (if file doesn't end with newline)
-      if (currentPath && currentBranch && currentBranch === branchName) {
+      if (currentPath && currentBranch && currentBranch === normalizedTargetBranch) {
        return this.resolvePath(projectPath, currentPath);
      }
@@ -123,7 +138,7 @@
        if (line.startsWith('worktree ')) {
          currentPath = line.slice(9);
        } else if (line.startsWith('branch ')) {
-          currentBranch = line.slice(7).replace('refs/heads/', '');
+          currentBranch = this.normalizeBranchName(line.slice(7));
        } else if (line.startsWith('detached')) {
          // Detached HEAD - branch is null
          currentBranch = null;
apps/server/tests/unit/lib/file-editor-store-logic.test.ts (new file, 331 lines)
@@ -0,0 +1,331 @@
import { describe, it, expect } from 'vitest';
import {
  computeIsDirty,
  updateTabWithContent as updateTabContent,
  markTabAsSaved as markTabSaved,
} from '../../../../ui/src/components/views/file-editor-view/file-editor-dirty-utils.ts';

/**
 * Unit tests for the file editor store logic, focusing on the unsaved indicator fix.
 *
 * The bug was: File unsaved indicators weren't working reliably - editing a file
 * and saving it would sometimes leave the dirty indicator (dot) visible.
 *
 * Root causes:
 * 1. Stale closure in handleSave - captured activeTab could have old content
 * 2. Editor buffer not synced - CodeMirror might have buffered changes not yet in store
 *
 * Fix:
 * - handleSave now gets fresh state from store using getState()
 * - handleSave gets current content from editor via getValue()
 * - Content is synced to store before saving if it differs
 *
 * Since we can't easily test the React/zustand store in node environment,
 * we test the pure logic that the store uses for dirty state tracking.
 */

describe('File editor dirty state logic', () => {
  describe('updateTabContent', () => {
    it('should set isDirty to true when content differs from originalContent', () => {
      const tab = {
        content: 'original content',
        originalContent: 'original content',
        isDirty: false,
      };

      const updated = updateTabContent(tab, 'modified content');

      expect(updated.isDirty).toBe(true);
      expect(updated.content).toBe('modified content');
      expect(updated.originalContent).toBe('original content');
    });

    it('should set isDirty to false when content matches originalContent', () => {
      const tab = {
        content: 'original content',
        originalContent: 'original content',
        isDirty: false,
      };

      // First modify it
      let updated = updateTabContent(tab, 'modified content');
      expect(updated.isDirty).toBe(true);

      // Now update back to original
      updated = updateTabContent(updated, 'original content');
      expect(updated.isDirty).toBe(false);
    });

    it('should handle empty content correctly', () => {
      const tab = {
        content: '',
        originalContent: '',
        isDirty: false,
      };

      const updated = updateTabContent(tab, 'new content');

      expect(updated.isDirty).toBe(true);
    });
  });

  describe('markTabSaved', () => {
    it('should set isDirty to false and update both content and originalContent', () => {
      const tab = {
        content: 'original content',
        originalContent: 'original content',
        isDirty: false,
      };

      // First modify
      let updated = updateTabContent(tab, 'modified content');
      expect(updated.isDirty).toBe(true);

      // Then save
      updated = markTabSaved(updated, 'modified content');

      expect(updated.isDirty).toBe(false);
      expect(updated.content).toBe('modified content');
      expect(updated.originalContent).toBe('modified content');
    });

    it('should correctly clear dirty state when save is triggered after edit', () => {
      // This test simulates the bug scenario:
      // 1. User edits file -> isDirty = true
      // 2. User saves -> markTabSaved should set isDirty = false
      let tab = {
        content: 'initial',
        originalContent: 'initial',
        isDirty: false,
      };

      // Simulate user editing
      tab = updateTabContent(tab, 'initial\nnew line');

      // Should be dirty
      expect(tab.isDirty).toBe(true);

      // Simulate save (with the content that was saved)
      tab = markTabSaved(tab, 'initial\nnew line');

      // Should NOT be dirty anymore
      expect(tab.isDirty).toBe(false);
    });
  });

  describe('race condition handling', () => {
    it('should correctly handle updateTabContent after markTabSaved with same content', () => {
      // This tests the scenario where:
      // 1. CodeMirror has a pending onChange with content "B"
      // 2. User presses save when editor shows "B"
      // 3. markTabSaved is called with "B"
      // 4. CodeMirror's pending onChange fires with "B" (same content)
      // Result: isDirty should remain false
      let tab = {
        content: 'A',
        originalContent: 'A',
        isDirty: false,
      };

      // User edits to "B"
      tab = updateTabContent(tab, 'B');

      // Save with "B"
      tab = markTabSaved(tab, 'B');

      // Late onChange with same content "B"
      tab = updateTabContent(tab, 'B');

      expect(tab.isDirty).toBe(false);
      expect(tab.content).toBe('B');
    });

    it('should correctly handle updateTabContent after markTabSaved with different content', () => {
      // This tests the scenario where:
      // 1. CodeMirror has a pending onChange with content "C"
      // 2. User presses save when store has "B"
      // 3. markTabSaved is called with "B"
      // 4. CodeMirror's pending onChange fires with "C" (different content)
      // Result: isDirty should be true (file changed after save)
      let tab = {
        content: 'A',
        originalContent: 'A',
        isDirty: false,
      };

      // User edits to "B"
      tab = updateTabContent(tab, 'B');

      // Save with "B"
      tab = markTabSaved(tab, 'B');

      // Late onChange with different content "C"
      tab = updateTabContent(tab, 'C');

      // File changed after save, so it should be dirty
      expect(tab.isDirty).toBe(true);
      expect(tab.content).toBe('C');
      expect(tab.originalContent).toBe('B');
    });

    it('should handle rapid edit-save-edit cycle correctly', () => {
      // Simulate rapid user actions
      let tab = {
        content: 'v1',
        originalContent: 'v1',
        isDirty: false,
      };

      // Edit 1
      tab = updateTabContent(tab, 'v2');
      expect(tab.isDirty).toBe(true);

      // Save 1
      tab = markTabSaved(tab, 'v2');
      expect(tab.isDirty).toBe(false);

      // Edit 2
      tab = updateTabContent(tab, 'v3');
      expect(tab.isDirty).toBe(true);

      // Save 2
      tab = markTabSaved(tab, 'v3');
      expect(tab.isDirty).toBe(false);

      // Edit 3 (back to v2)
      tab = updateTabContent(tab, 'v2');
      expect(tab.isDirty).toBe(true);

      // Save 3
      tab = markTabSaved(tab, 'v2');
      expect(tab.isDirty).toBe(false);
    });
  });

  describe('handleSave stale closure fix simulation', () => {
    it('demonstrates the fix: using fresh content instead of closure content', () => {
      // This test demonstrates why the fix was necessary.
      // The old handleSave captured activeTab in closure, which could be stale.
      // The fix gets fresh state from getState() and uses editor.getValue().

      // Simulate store state
      let storeState = {
        tabs: [
          {
            id: 'tab-1',
            content: 'A',
            originalContent: 'A',
            isDirty: false,
          },
        ],
        activeTabId: 'tab-1',
      };

      // Simulate a "stale closure" capturing the tab state
      const staleClosureTab = storeState.tabs[0];

      // User edits - store state updates
      storeState = {
        ...storeState,
        tabs: [
          {
            id: 'tab-1',
            content: 'B',
            originalContent: 'A',
            isDirty: true,
          },
        ],
      };

      // OLD BUG: Using stale closure tab would save "A" (old content)
      const oldBugSavedContent = staleClosureTab!.content;
      expect(oldBugSavedContent).toBe('A'); // Wrong! Should be "B"

      // FIX: Using fresh state from getState() gets correct content
      const freshTab = storeState.tabs[0];
      const fixedSavedContent = freshTab!.content;
      expect(fixedSavedContent).toBe('B'); // Correct!
    });

    it('demonstrates syncing editor content before save', () => {
      // This test demonstrates why we need to get content from editor directly.
      // The store might have stale content if onChange hasn't fired yet.

      // Simulate store state (has old content because onChange hasn't fired)
      let storeContent = 'A';

      // Editor has newer content (not yet synced to store)
      const editorContent = 'B';

      // FIX: Use editor content if available, fall back to store content
      const contentToSave = editorContent ?? storeContent;

      expect(contentToSave).toBe('B'); // Correctly saves editor content

      // Simulate syncing to store before save
      if (editorContent !== null && editorContent !== storeContent) {
        storeContent = editorContent;
      }

      // Now store is synced
      expect(storeContent).toBe('B');

      // After save, markTabSaved would set originalContent = savedContent
      // and isDirty = false (if no more changes come in)
    });
  });

  describe('edge cases', () => {
    it('should handle whitespace-only changes as dirty', () => {
      let tab = {
        content: 'hello',
        originalContent: 'hello',
        isDirty: false,
      };

      tab = updateTabContent(tab, 'hello ');
      expect(tab.isDirty).toBe(true);
    });

    it('should handle line ending differences as dirty', () => {
      let tab = {
        content: 'line1\nline2',
        originalContent: 'line1\nline2',
        isDirty: false,
      };

      tab = updateTabContent(tab, 'line1\r\nline2');
      expect(tab.isDirty).toBe(true);
    });

    it('should handle unicode content correctly', () => {
      let tab = {
        content: '你好世界',
        originalContent: '你好世界',
        isDirty: false,
      };

      tab = updateTabContent(tab, '你好宇宙');
      expect(tab.isDirty).toBe(true);

      tab = markTabSaved(tab, '你好宇宙');
      expect(tab.isDirty).toBe(false);
    });

    it('should handle very large content efficiently', () => {
      // Generate a large string (1MB)
      const largeOriginal = 'x'.repeat(1024 * 1024);
      const largeModified = largeOriginal + 'y';

      let tab = {
        content: largeOriginal,
        originalContent: largeOriginal,
        isDirty: false,
      };

      tab = updateTabContent(tab, largeModified);

      expect(tab.isDirty).toBe(true);
    });
  });
});
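The helpers exercised above live in `file-editor-dirty-utils.ts`, which is not part of this diff. A minimal sketch consistent with the assertions in the test file; the field names come from the tests, the implementation is ours:

```typescript
interface TabState {
  content: string;
  originalContent: string;
  isDirty: boolean;
}

// A tab is dirty iff its content differs from the last-saved baseline.
function computeIsDirty(content: string, originalContent: string): boolean {
  return content !== originalContent;
}

// Editor onChange path: update content, recompute dirty against the baseline.
function updateTabContent(tab: TabState, content: string): TabState {
  return { ...tab, content, isDirty: computeIsDirty(content, tab.originalContent) };
}

// Save path: the saved content becomes the new baseline, clearing the flag.
function markTabSaved(tab: TabState, savedContent: string): TabState {
  return { ...tab, content: savedContent, originalContent: savedContent, isDirty: false };
}

let tab: TabState = { content: 'A', originalContent: 'A', isDirty: false };
tab = updateTabContent(tab, 'B'); // isDirty: true
tab = markTabSaved(tab, 'B');     // isDirty: false
tab = updateTabContent(tab, 'C'); // isDirty: true - edit after save
```

Note how a late onChange with the already-saved content is harmless: `computeIsDirty('B', 'B')` stays false, which is exactly the race the tests pin down.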
@@ -1,5 +1,11 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';
-import { getMCPServersFromSettings } from '@/lib/settings-helpers.js';
+import {
+  getMCPServersFromSettings,
+  getProviderById,
+  getProviderByModelId,
+  resolveProviderContext,
+  getAllProviderModels,
+} from '@/lib/settings-helpers.js';
import type { SettingsService } from '@/services/settings-service.js';

// Mock the logger
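The tests added in the next hunk pin down two lookup rules: providers with `enabled: false` are skipped, while `enabled: undefined` (as happens on session restore) counts as enabled, and model IDs match case-insensitively. A standalone sketch of that behavior; `findProviderForModel` is our illustrative name, not the actual helper in `settings-helpers.ts`:

```typescript
interface ProviderModel { id: string; name: string; mapsToClaudeModel?: string }
interface Provider { id: string; name: string; enabled?: boolean; models?: ProviderModel[] }

// Illustrative lookup: only an explicit enabled === false is skipped,
// so a restored provider with enabled: undefined still participates.
function findProviderForModel(providers: Provider[], modelId: string) {
  for (const provider of providers) {
    if (provider.enabled === false) continue; // undefined => treated as enabled
    const model = provider.models?.find(
      (m) => m.id.toLowerCase() === modelId.toLowerCase() // case-insensitive match
    );
    if (model) return { provider, modelConfig: model };
  }
  return { provider: undefined, modelConfig: undefined };
}

const restored: Provider = { id: 'p1', name: 'P1', models: [{ id: 'Model-X', name: 'X' }] };
console.log(findProviderForModel([restored], 'model-x').provider?.id); // 'p1'
```

The key detail is the strict `=== false` check: a truthiness test (`if (!provider.enabled)`) would wrongly drop restored providers, which is the bug the session-restore tests guard against.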
@@ -286,4 +292,691 @@ describe('settings-helpers.ts', () => {
    });
  });
});

describe('getProviderById', () => {
  beforeEach(() => {
    vi.clearAllMocks();
  });

  it('should return provider when found by ID', async () => {
    const mockProvider = { id: 'zai-1', name: 'Zai', enabled: true };
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        claudeCompatibleProviders: [mockProvider],
      }),
      getCredentials: vi.fn().mockResolvedValue({}),
    } as unknown as SettingsService;

    const result = await getProviderById('zai-1', mockSettingsService);
    expect(result.provider).toEqual(mockProvider);
  });

  it('should return undefined when provider not found', async () => {
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        claudeCompatibleProviders: [],
      }),
      getCredentials: vi.fn().mockResolvedValue({}),
    } as unknown as SettingsService;

    const result = await getProviderById('unknown', mockSettingsService);
    expect(result.provider).toBeUndefined();
  });

  it('should return provider even if disabled (caller handles enabled state)', async () => {
    const mockProvider = { id: 'disabled-1', name: 'Disabled', enabled: false };
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        claudeCompatibleProviders: [mockProvider],
      }),
      getCredentials: vi.fn().mockResolvedValue({}),
    } as unknown as SettingsService;

    const result = await getProviderById('disabled-1', mockSettingsService);
    expect(result.provider).toEqual(mockProvider);
  });
});

describe('getProviderByModelId', () => {
  beforeEach(() => {
    vi.clearAllMocks();
  });

  it('should return provider and modelConfig when found by model ID', async () => {
    const mockModel = { id: 'custom-model-1', name: 'Custom Model' };
    const mockProvider = {
      id: 'provider-1',
      name: 'Provider 1',
      enabled: true,
      models: [mockModel],
    };
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        claudeCompatibleProviders: [mockProvider],
      }),
      getCredentials: vi.fn().mockResolvedValue({}),
    } as unknown as SettingsService;

    const result = await getProviderByModelId('custom-model-1', mockSettingsService);
    expect(result.provider).toEqual(mockProvider);
    expect(result.modelConfig).toEqual(mockModel);
  });

  it('should resolve mapped Claude model when mapsToClaudeModel is present', async () => {
    const mockModel = {
      id: 'custom-model-1',
      name: 'Custom Model',
      mapsToClaudeModel: 'sonnet-3-5',
    };
    const mockProvider = {
      id: 'provider-1',
      name: 'Provider 1',
      enabled: true,
      models: [mockModel],
    };
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        claudeCompatibleProviders: [mockProvider],
      }),
      getCredentials: vi.fn().mockResolvedValue({}),
    } as unknown as SettingsService;

    const result = await getProviderByModelId('custom-model-1', mockSettingsService);
    expect(result.resolvedModel).toBeDefined();
    // resolveModelString('sonnet-3-5') usually returns 'claude-3-5-sonnet-20240620' or similar
  });

  it('should ignore disabled providers', async () => {
    const mockModel = { id: 'custom-model-1', name: 'Custom Model' };
    const mockProvider = {
      id: 'disabled-1',
      name: 'Disabled Provider',
      enabled: false,
      models: [mockModel],
    };
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        claudeCompatibleProviders: [mockProvider],
      }),
      getCredentials: vi.fn().mockResolvedValue({}),
    } as unknown as SettingsService;

    const result = await getProviderByModelId('custom-model-1', mockSettingsService);
    expect(result.provider).toBeUndefined();
  });
});

describe('resolveProviderContext', () => {
  beforeEach(() => {
    vi.clearAllMocks();
  });

  it('should resolve provider by explicit providerId', async () => {
    const mockProvider = {
      id: 'provider-1',
      name: 'Provider 1',
      enabled: true,
      models: [{ id: 'custom-model-1', name: 'Custom Model' }],
    };
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        claudeCompatibleProviders: [mockProvider],
      }),
      getCredentials: vi.fn().mockResolvedValue({ anthropicApiKey: 'test-key' }),
    } as unknown as SettingsService;

    const result = await resolveProviderContext(
      mockSettingsService,
      'custom-model-1',
      'provider-1'
    );

    expect(result.provider).toEqual(mockProvider);
    expect(result.credentials).toEqual({ anthropicApiKey: 'test-key' });
  });

  it('should return undefined provider when explicit providerId not found', async () => {
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        claudeCompatibleProviders: [],
      }),
      getCredentials: vi.fn().mockResolvedValue({}),
    } as unknown as SettingsService;

    const result = await resolveProviderContext(
      mockSettingsService,
      'some-model',
      'unknown-provider'
    );

    expect(result.provider).toBeUndefined();
  });

  it('should fallback to model-based lookup when providerId not provided', async () => {
    const mockProvider = {
      id: 'provider-1',
      name: 'Provider 1',
      enabled: true,
      models: [{ id: 'custom-model-1', name: 'Custom Model' }],
    };
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        claudeCompatibleProviders: [mockProvider],
      }),
      getCredentials: vi.fn().mockResolvedValue({}),
    } as unknown as SettingsService;

    const result = await resolveProviderContext(mockSettingsService, 'custom-model-1');

    expect(result.provider).toEqual(mockProvider);
    expect(result.modelConfig?.id).toBe('custom-model-1');
  });

  it('should resolve mapsToClaudeModel to actual Claude model', async () => {
    const mockProvider = {
      id: 'provider-1',
      name: 'Provider 1',
      enabled: true,
      models: [
        {
          id: 'custom-model-1',
          name: 'Custom Model',
          mapsToClaudeModel: 'sonnet',
        },
      ],
    };
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        claudeCompatibleProviders: [mockProvider],
      }),
      getCredentials: vi.fn().mockResolvedValue({}),
    } as unknown as SettingsService;

    const result = await resolveProviderContext(mockSettingsService, 'custom-model-1');

    // resolveModelString('sonnet') should return a valid Claude model ID
    expect(result.resolvedModel).toBeDefined();
    expect(result.resolvedModel).toContain('claude');
  });

  it('should handle empty providers list', async () => {
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        claudeCompatibleProviders: [],
      }),
      getCredentials: vi.fn().mockResolvedValue({}),
    } as unknown as SettingsService;

    const result = await resolveProviderContext(mockSettingsService, 'some-model');

    expect(result.provider).toBeUndefined();
    expect(result.resolvedModel).toBeUndefined();
    expect(result.modelConfig).toBeUndefined();
  });

  it('should handle missing claudeCompatibleProviders field', async () => {
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({}),
      getCredentials: vi.fn().mockResolvedValue({}),
    } as unknown as SettingsService;

    const result = await resolveProviderContext(mockSettingsService, 'some-model');

    expect(result.provider).toBeUndefined();
  });

  it('should skip disabled providers during fallback lookup', async () => {
    const disabledProvider = {
      id: 'disabled-1',
      name: 'Disabled Provider',
      enabled: false,
      models: [{ id: 'model-in-disabled', name: 'Model' }],
    };
    const enabledProvider = {
      id: 'enabled-1',
      name: 'Enabled Provider',
      enabled: true,
      models: [{ id: 'model-in-enabled', name: 'Model' }],
    };
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        claudeCompatibleProviders: [disabledProvider, enabledProvider],
      }),
      getCredentials: vi.fn().mockResolvedValue({}),
    } as unknown as SettingsService;

    // Should skip the disabled provider and find the model in the enabled one
    const result = await resolveProviderContext(mockSettingsService, 'model-in-enabled');
    expect(result.provider?.id).toBe('enabled-1');

    // Should not find model that only exists in disabled provider
    const result2 = await resolveProviderContext(mockSettingsService, 'model-in-disabled');
    expect(result2.provider).toBeUndefined();
  });

  it('should perform case-insensitive model ID matching', async () => {
    const mockProvider = {
      id: 'provider-1',
      name: 'Provider 1',
      enabled: true,
      models: [{ id: 'Custom-Model-1', name: 'Custom Model' }],
    };
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        claudeCompatibleProviders: [mockProvider],
      }),
      getCredentials: vi.fn().mockResolvedValue({}),
    } as unknown as SettingsService;

    const result = await resolveProviderContext(mockSettingsService, 'custom-model-1');

    expect(result.provider).toEqual(mockProvider);
    expect(result.modelConfig?.id).toBe('Custom-Model-1');
  });

  it('should return error result on exception', async () => {
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockRejectedValue(new Error('Settings error')),
      getCredentials: vi.fn().mockResolvedValue({}),
    } as unknown as SettingsService;

    const result = await resolveProviderContext(mockSettingsService, 'some-model');

    expect(result.provider).toBeUndefined();
    expect(result.credentials).toBeUndefined();
    expect(result.resolvedModel).toBeUndefined();
    expect(result.modelConfig).toBeUndefined();
  });

  it('should persist and load provider config from server settings', async () => {
    // This test verifies the main bug fix: providers are loaded from server settings
    const savedProvider = {
      id: 'saved-provider-1',
      name: 'Saved Provider',
      enabled: true,
      apiKeySource: 'credentials' as const,
      models: [
        {
          id: 'saved-model-1',
          name: 'Saved Model',
          mapsToClaudeModel: 'sonnet',
        },
      ],
    };

    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        claudeCompatibleProviders: [savedProvider],
      }),
      getCredentials: vi.fn().mockResolvedValue({
        anthropicApiKey: 'saved-api-key',
      }),
    } as unknown as SettingsService;

    // Simulate loading saved provider config
    const result = await resolveProviderContext(
      mockSettingsService,
      'saved-model-1',
      'saved-provider-1'
    );

    // Verify the provider is loaded from server settings
    expect(result.provider).toEqual(savedProvider);
    expect(result.provider?.id).toBe('saved-provider-1');
    expect(result.provider?.models).toHaveLength(1);
    expect(result.credentials?.anthropicApiKey).toBe('saved-api-key');
    // Verify model mapping is resolved
    expect(result.resolvedModel).toContain('claude');
  });

  it('should accept custom logPrefix parameter', async () => {
    // Verify that the logPrefix parameter is accepted (used by facade.ts)
    const mockProvider = {
      id: 'provider-1',
      name: 'Provider 1',
      enabled: true,
      models: [{ id: 'model-1', name: 'Model' }],
    };
    const mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        claudeCompatibleProviders: [mockProvider],
      }),
      getCredentials: vi.fn().mockResolvedValue({}),
    } as unknown as SettingsService;

    // Call with custom logPrefix (as facade.ts does)
    const result = await resolveProviderContext(
      mockSettingsService,
      'model-1',
      undefined,
      '[CustomPrefix]'
    );

    // Function should work the same with custom prefix
    expect(result.provider).toEqual(mockProvider);
  });

  // Session restore scenarios - provider.enabled: undefined should be treated as enabled
  describe('session restore scenarios (enabled: undefined)', () => {
    it('should treat provider with enabled: undefined as enabled', async () => {
      // This is the main bug fix: when providers are loaded from settings on session restore,
      // enabled might be undefined (not explicitly set) and should be treated as enabled
      const mockProvider = {
        id: 'provider-1',
        name: 'Provider 1',
        enabled: undefined, // Not explicitly set - should be treated as enabled
        models: [{ id: 'model-1', name: 'Model' }],
      };
      const mockSettingsService = {
        getGlobalSettings: vi.fn().mockResolvedValue({
          claudeCompatibleProviders: [mockProvider],
        }),
        getCredentials: vi.fn().mockResolvedValue({}),
      } as unknown as SettingsService;

      const result = await resolveProviderContext(mockSettingsService, 'model-1');

      // Provider should be found and used even though enabled is undefined
      expect(result.provider).toEqual(mockProvider);
      expect(result.modelConfig?.id).toBe('model-1');
    });

    it('should use provider by ID when enabled is undefined', async () => {
      // This tests the explicit providerId lookup with undefined enabled
      const mockProvider = {
        id: 'provider-1',
        name: 'Provider 1',
        enabled: undefined, // Not explicitly set - should be treated as enabled
        models: [{ id: 'custom-model', name: 'Custom Model', mapsToClaudeModel: 'sonnet' }],
      };
      const mockSettingsService = {
        getGlobalSettings: vi.fn().mockResolvedValue({
          claudeCompatibleProviders: [mockProvider],
||||||
|
}),
|
||||||
|
getCredentials: vi.fn().mockResolvedValue({ anthropicApiKey: 'test-key' }),
|
||||||
|
} as unknown as SettingsService;
|
||||||
|
|
||||||
|
const result = await resolveProviderContext(
|
||||||
|
mockSettingsService,
|
||||||
|
'custom-model',
|
||||||
|
'provider-1'
|
||||||
|
);
|
||||||
|
|
||||||
|
// Provider should be found and used even though enabled is undefined
|
||||||
|
expect(result.provider).toEqual(mockProvider);
|
||||||
|
expect(result.credentials?.anthropicApiKey).toBe('test-key');
|
||||||
|
expect(result.resolvedModel).toContain('claude');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should find model via fallback in provider with enabled: undefined', async () => {
|
||||||
|
// Test fallback model lookup when provider has undefined enabled
|
||||||
|
const providerWithUndefinedEnabled = {
|
||||||
|
id: 'provider-1',
|
||||||
|
name: 'Provider 1',
|
||||||
|
// enabled is not set (undefined)
|
||||||
|
models: [{ id: 'model-1', name: 'Model' }],
|
||||||
|
};
|
||||||
|
const mockSettingsService = {
|
||||||
|
getGlobalSettings: vi.fn().mockResolvedValue({
|
||||||
|
claudeCompatibleProviders: [providerWithUndefinedEnabled],
|
||||||
|
}),
|
||||||
|
getCredentials: vi.fn().mockResolvedValue({}),
|
||||||
|
} as unknown as SettingsService;
|
||||||
|
|
||||||
|
const result = await resolveProviderContext(mockSettingsService, 'model-1');
|
||||||
|
|
||||||
|
expect(result.provider).toEqual(providerWithUndefinedEnabled);
|
||||||
|
expect(result.modelConfig?.id).toBe('model-1');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should still use provider for connection when model not found in its models array', async () => {
|
||||||
|
// This tests the fix: when providerId is explicitly set and provider is found,
|
||||||
|
// but the model isn't in that provider's models array, we still use that provider
|
||||||
|
// for connection settings (baseUrl, credentials)
|
||||||
|
const mockProvider = {
|
||||||
|
id: 'provider-1',
|
||||||
|
name: 'Provider 1',
|
||||||
|
enabled: true,
|
||||||
|
baseUrl: 'https://custom-api.example.com',
|
||||||
|
models: [{ id: 'other-model', name: 'Other Model' }],
|
||||||
|
};
|
||||||
|
const mockSettingsService = {
|
||||||
|
getGlobalSettings: vi.fn().mockResolvedValue({
|
||||||
|
claudeCompatibleProviders: [mockProvider],
|
||||||
|
}),
|
||||||
|
getCredentials: vi.fn().mockResolvedValue({ anthropicApiKey: 'test-key' }),
|
||||||
|
} as unknown as SettingsService;
|
||||||
|
|
||||||
|
const result = await resolveProviderContext(
|
||||||
|
mockSettingsService,
|
||||||
|
'unknown-model', // Model not in provider's models array
|
||||||
|
'provider-1'
|
||||||
|
);
|
||||||
|
|
||||||
|
// Provider should still be returned for connection settings
|
||||||
|
expect(result.provider).toEqual(mockProvider);
|
||||||
|
// modelConfig should be undefined since the model wasn't found
|
||||||
|
expect(result.modelConfig).toBeUndefined();
|
||||||
|
// resolvedModel should be undefined since no mapping was found
|
||||||
|
expect(result.resolvedModel).toBeUndefined();
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should fallback to find modelConfig in other providers when not in explicit providerId provider', async () => {
|
||||||
|
// When providerId is set and provider is found, but model isn't there,
|
||||||
|
// we should still search for modelConfig in other providers
|
||||||
|
const provider1 = {
|
||||||
|
id: 'provider-1',
|
||||||
|
name: 'Provider 1',
|
||||||
|
enabled: true,
|
||||||
|
baseUrl: 'https://provider1.example.com',
|
||||||
|
models: [{ id: 'provider1-model', name: 'Provider 1 Model' }],
|
||||||
|
};
|
||||||
|
const provider2 = {
|
||||||
|
id: 'provider-2',
|
||||||
|
name: 'Provider 2',
|
||||||
|
enabled: true,
|
||||||
|
baseUrl: 'https://provider2.example.com',
|
||||||
|
models: [
|
||||||
|
{
|
||||||
|
id: 'shared-model',
|
||||||
|
name: 'Shared Model',
|
||||||
|
mapsToClaudeModel: 'sonnet',
|
||||||
|
},
|
||||||
|
],
|
||||||
|
};
|
||||||
|
const mockSettingsService = {
|
||||||
|
getGlobalSettings: vi.fn().mockResolvedValue({
|
||||||
|
claudeCompatibleProviders: [provider1, provider2],
|
||||||
|
}),
|
||||||
|
getCredentials: vi.fn().mockResolvedValue({ anthropicApiKey: 'test-key' }),
|
||||||
|
} as unknown as SettingsService;
|
||||||
|
|
||||||
|
const result = await resolveProviderContext(
|
||||||
|
mockSettingsService,
|
||||||
|
'shared-model', // This model is in provider-2, not provider-1
|
||||||
|
'provider-1' // But we explicitly want to use provider-1
|
||||||
|
);
|
||||||
|
|
||||||
|
// Provider should still be provider-1 (for connection settings)
|
||||||
|
expect(result.provider).toEqual(provider1);
|
||||||
|
// But modelConfig should be found from provider-2
|
||||||
|
expect(result.modelConfig?.id).toBe('shared-model');
|
||||||
|
// And the model mapping should be resolved
|
||||||
|
expect(result.resolvedModel).toContain('claude');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should handle multiple providers with mixed enabled states', async () => {
|
||||||
|
// Test the full session restore scenario with multiple providers
|
||||||
|
const providers = [
|
||||||
|
{
|
||||||
|
id: 'provider-1',
|
||||||
|
name: 'First Provider',
|
||||||
|
enabled: undefined, // Undefined after restore
|
||||||
|
models: [{ id: 'model-a', name: 'Model A' }],
|
||||||
|
},
|
||||||
|
{
|
||||||
|
id: 'provider-2',
|
||||||
|
name: 'Second Provider',
|
||||||
|
// enabled field missing entirely
|
||||||
|
models: [{ id: 'model-b', name: 'Model B', mapsToClaudeModel: 'opus' }],
|
||||||
|
},
|
||||||
|
{
|
||||||
|
id: 'provider-3',
|
||||||
|
name: 'Disabled Provider',
|
||||||
|
enabled: false, // Explicitly disabled
|
||||||
|
models: [{ id: 'model-c', name: 'Model C' }],
|
||||||
|
},
|
||||||
|
];
|
||||||
|
|
||||||
|
const mockSettingsService = {
|
||||||
|
getGlobalSettings: vi.fn().mockResolvedValue({
|
||||||
|
claudeCompatibleProviders: providers,
|
||||||
|
}),
|
||||||
|
getCredentials: vi.fn().mockResolvedValue({ anthropicApiKey: 'test-key' }),
|
||||||
|
} as unknown as SettingsService;
|
||||||
|
|
||||||
|
// Provider 1 should work (enabled: undefined)
|
||||||
|
const result1 = await resolveProviderContext(mockSettingsService, 'model-a', 'provider-1');
|
||||||
|
expect(result1.provider?.id).toBe('provider-1');
|
||||||
|
expect(result1.modelConfig?.id).toBe('model-a');
|
||||||
|
|
||||||
|
// Provider 2 should work (enabled field missing)
|
||||||
|
const result2 = await resolveProviderContext(mockSettingsService, 'model-b', 'provider-2');
|
||||||
|
expect(result2.provider?.id).toBe('provider-2');
|
||||||
|
expect(result2.modelConfig?.id).toBe('model-b');
|
||||||
|
expect(result2.resolvedModel).toContain('claude');
|
||||||
|
|
||||||
|
// Provider 3 with explicit providerId IS returned even if disabled
|
||||||
|
// (caller handles enabled state check)
|
||||||
|
const result3 = await resolveProviderContext(mockSettingsService, 'model-c', 'provider-3');
|
||||||
|
// Provider is found but modelConfig won't be found since disabled providers
|
||||||
|
// skip model lookup in their models array
|
||||||
|
expect(result3.provider).toEqual(providers[2]);
|
||||||
|
expect(result3.modelConfig).toBeUndefined();
|
||||||
|
});
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('getAllProviderModels', () => {
|
||||||
|
beforeEach(() => {
|
||||||
|
vi.clearAllMocks();
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should return all models from enabled providers', async () => {
|
||||||
|
const mockProviders = [
|
||||||
|
{
|
||||||
|
id: 'provider-1',
|
||||||
|
name: 'Provider 1',
|
||||||
|
enabled: true,
|
||||||
|
models: [
|
||||||
|
{ id: 'model-1', name: 'Model 1' },
|
||||||
|
{ id: 'model-2', name: 'Model 2' },
|
||||||
|
],
|
||||||
|
},
|
||||||
|
{
|
||||||
|
id: 'provider-2',
|
||||||
|
name: 'Provider 2',
|
||||||
|
enabled: true,
|
||||||
|
models: [{ id: 'model-3', name: 'Model 3' }],
|
||||||
|
},
|
||||||
|
];
|
||||||
|
const mockSettingsService = {
|
||||||
|
getGlobalSettings: vi.fn().mockResolvedValue({
|
||||||
|
claudeCompatibleProviders: mockProviders,
|
||||||
|
}),
|
||||||
|
} as unknown as SettingsService;
|
||||||
|
|
||||||
|
const result = await getAllProviderModels(mockSettingsService);
|
||||||
|
|
||||||
|
expect(result).toHaveLength(3);
|
||||||
|
expect(result[0].providerId).toBe('provider-1');
|
||||||
|
expect(result[0].model.id).toBe('model-1');
|
||||||
|
expect(result[2].providerId).toBe('provider-2');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should filter out disabled providers', async () => {
|
||||||
|
const mockProviders = [
|
||||||
|
{
|
||||||
|
id: 'enabled-1',
|
||||||
|
name: 'Enabled Provider',
|
||||||
|
enabled: true,
|
||||||
|
models: [{ id: 'model-1', name: 'Model 1' }],
|
||||||
|
},
|
||||||
|
{
|
||||||
|
id: 'disabled-1',
|
||||||
|
name: 'Disabled Provider',
|
||||||
|
enabled: false,
|
||||||
|
models: [{ id: 'model-2', name: 'Model 2' }],
|
||||||
|
},
|
||||||
|
];
|
||||||
|
const mockSettingsService = {
|
||||||
|
getGlobalSettings: vi.fn().mockResolvedValue({
|
||||||
|
claudeCompatibleProviders: mockProviders,
|
||||||
|
}),
|
||||||
|
} as unknown as SettingsService;
|
||||||
|
|
||||||
|
const result = await getAllProviderModels(mockSettingsService);
|
||||||
|
|
||||||
|
expect(result).toHaveLength(1);
|
||||||
|
expect(result[0].providerId).toBe('enabled-1');
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should return empty array when no providers configured', async () => {
|
||||||
|
const mockSettingsService = {
|
||||||
|
getGlobalSettings: vi.fn().mockResolvedValue({
|
||||||
|
claudeCompatibleProviders: [],
|
||||||
|
}),
|
||||||
|
} as unknown as SettingsService;
|
||||||
|
|
||||||
|
const result = await getAllProviderModels(mockSettingsService);
|
||||||
|
|
||||||
|
expect(result).toEqual([]);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should handle missing claudeCompatibleProviders field', async () => {
|
||||||
|
const mockSettingsService = {
|
||||||
|
getGlobalSettings: vi.fn().mockResolvedValue({}),
|
||||||
|
} as unknown as SettingsService;
|
||||||
|
|
||||||
|
const result = await getAllProviderModels(mockSettingsService);
|
||||||
|
|
||||||
|
expect(result).toEqual([]);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should handle provider with no models', async () => {
|
||||||
|
const mockProviders = [
|
||||||
|
{
|
||||||
|
id: 'provider-1',
|
||||||
|
name: 'Provider 1',
|
||||||
|
enabled: true,
|
||||||
|
models: [],
|
||||||
|
},
|
||||||
|
{
|
||||||
|
id: 'provider-2',
|
||||||
|
name: 'Provider 2',
|
||||||
|
enabled: true,
|
||||||
|
// no models field
|
||||||
|
},
|
||||||
|
];
|
||||||
|
const mockSettingsService = {
|
||||||
|
getGlobalSettings: vi.fn().mockResolvedValue({
|
||||||
|
claudeCompatibleProviders: mockProviders,
|
||||||
|
}),
|
||||||
|
} as unknown as SettingsService;
|
||||||
|
|
||||||
|
const result = await getAllProviderModels(mockSettingsService);
|
||||||
|
|
||||||
|
expect(result).toEqual([]);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should return empty array on exception', async () => {
|
||||||
|
const mockSettingsService = {
|
||||||
|
getGlobalSettings: vi.fn().mockRejectedValue(new Error('Settings error')),
|
||||||
|
} as unknown as SettingsService;
|
||||||
|
|
||||||
|
const result = await getAllProviderModels(mockSettingsService);
|
||||||
|
|
||||||
|
expect(result).toEqual([]);
|
||||||
|
});
|
||||||
|
});
|
||||||
});
|
});
|
||||||
|
@@ -15,6 +15,7 @@ import {
   calculateReasoningTimeout,
   REASONING_TIMEOUT_MULTIPLIERS,
   DEFAULT_TIMEOUT_MS,
+  validateBareModelId,
 } from '@automaker/types';
 
 const OPENAI_API_KEY_ENV = 'OPENAI_API_KEY';
@@ -455,4 +456,19 @@ describe('codex-provider.ts', () => {
       expect(calculateReasoningTimeout('xhigh')).toBe(120000);
     });
   });
+
+  describe('validateBareModelId integration', () => {
+    it('should allow codex- prefixed models for Codex provider with expectedProvider="codex"', () => {
+      expect(() => validateBareModelId('codex-gpt-4', 'CodexProvider', 'codex')).not.toThrow();
+      expect(() =>
+        validateBareModelId('codex-gpt-5.1-codex-max', 'CodexProvider', 'codex')
+      ).not.toThrow();
+    });
+
+    it('should reject other provider prefixes for Codex provider', () => {
+      expect(() => validateBareModelId('cursor-gpt-4', 'CodexProvider', 'codex')).toThrow();
+      expect(() => validateBareModelId('gemini-2.5-flash', 'CodexProvider', 'codex')).toThrow();
+      expect(() => validateBareModelId('copilot-gpt-4', 'CodexProvider', 'codex')).toThrow();
+    });
+  });
 });
@@ -331,13 +331,15 @@ describe('copilot-provider.ts', () => {
     });
   });
 
-  it('should normalize tool.execution_end event', () => {
+  it('should normalize tool.execution_complete event', () => {
     const event = {
-      type: 'tool.execution_end',
+      type: 'tool.execution_complete',
       data: {
-        toolName: 'read_file',
         toolCallId: 'call-123',
-        result: 'file content',
+        success: true,
+        result: {
+          content: 'file content',
+        },
       },
     };
 
@@ -357,23 +359,85 @@
     });
   });
 
-  it('should handle tool.execution_end with error', () => {
+  it('should handle tool.execution_complete with error', () => {
     const event = {
-      type: 'tool.execution_end',
+      type: 'tool.execution_complete',
       data: {
-        toolName: 'bash',
         toolCallId: 'call-456',
-        error: 'Command failed',
+        success: false,
+        error: {
+          message: 'Command failed',
+        },
       },
     };
 
     const result = provider.normalizeEvent(event);
     expect(result?.message?.content?.[0]).toMatchObject({
       type: 'tool_result',
+      tool_use_id: 'call-456',
       content: '[ERROR] Command failed',
     });
   });
+
+  it('should handle tool.execution_complete with empty result', () => {
+    const event = {
+      type: 'tool.execution_complete',
+      data: {
+        toolCallId: 'call-789',
+        success: true,
+        result: {
+          content: '',
+        },
+      },
+    };
+
+    const result = provider.normalizeEvent(event);
+    expect(result?.message?.content?.[0]).toMatchObject({
+      type: 'tool_result',
+      tool_use_id: 'call-789',
+      content: '',
+    });
+  });
+
+  it('should handle tool.execution_complete with missing result', () => {
+    const event = {
+      type: 'tool.execution_complete',
+      data: {
+        toolCallId: 'call-999',
+        success: true,
+        // No result field
+      },
+    };
+
+    const result = provider.normalizeEvent(event);
+    expect(result?.message?.content?.[0]).toMatchObject({
+      type: 'tool_result',
+      tool_use_id: 'call-999',
+      content: '',
+    });
+  });
+
+  it('should handle tool.execution_complete with error code', () => {
+    const event = {
+      type: 'tool.execution_complete',
+      data: {
+        toolCallId: 'call-567',
+        success: false,
+        error: {
+          message: 'Permission denied',
+          code: 'EACCES',
+        },
+      },
+    };
+
+    const result = provider.normalizeEvent(event);
+    expect(result?.message?.content?.[0]).toMatchObject({
+      type: 'tool_result',
+      tool_use_id: 'call-567',
+      content: '[ERROR] Permission denied (EACCES)',
+    });
+  });
 
   it('should normalize session.idle to success result', () => {
     const event = { type: 'session.idle' };
 
@@ -1,5 +1,6 @@
-import { describe, it, expect, beforeEach } from 'vitest';
+import { describe, it, expect, beforeEach, vi } from 'vitest';
 import { CursorProvider } from '@/providers/cursor-provider.js';
+import { validateBareModelId } from '@automaker/types';
 
 describe('cursor-provider.ts', () => {
   describe('buildCliArgs', () => {
@@ -154,4 +155,81 @@
       expect(msg!.subtype).toBe('success');
     });
   });
+
+  describe('Cursor Gemini models support', () => {
+    let provider: CursorProvider;
+
+    beforeEach(() => {
+      provider = Object.create(CursorProvider.prototype) as CursorProvider & {
+        cliPath?: string;
+      };
+      provider.cliPath = '/usr/local/bin/cursor-agent';
+    });
+
+    describe('buildCliArgs with Cursor Gemini models', () => {
+      it('should handle cursor-gemini-3-pro model', () => {
+        const args = provider.buildCliArgs({
+          prompt: 'Write a function',
+          model: 'gemini-3-pro', // Bare model ID after stripping cursor- prefix
+          cwd: '/tmp/project',
+        });
+
+        const modelIndex = args.indexOf('--model');
+        expect(modelIndex).toBeGreaterThan(-1);
+        expect(args[modelIndex + 1]).toBe('gemini-3-pro');
+      });
+
+      it('should handle cursor-gemini-3-flash model', () => {
+        const args = provider.buildCliArgs({
+          prompt: 'Quick task',
+          model: 'gemini-3-flash', // Bare model ID after stripping cursor- prefix
+          cwd: '/tmp/project',
+        });
+
+        const modelIndex = args.indexOf('--model');
+        expect(modelIndex).toBeGreaterThan(-1);
+        expect(args[modelIndex + 1]).toBe('gemini-3-flash');
+      });
+
+      it('should include --resume with Cursor Gemini models when sdkSessionId is provided', () => {
+        const args = provider.buildCliArgs({
+          prompt: 'Continue task',
+          model: 'gemini-3-pro',
+          cwd: '/tmp/project',
+          sdkSessionId: 'cursor-gemini-session-123',
+        });
+
+        const resumeIndex = args.indexOf('--resume');
+        expect(resumeIndex).toBeGreaterThan(-1);
+        expect(args[resumeIndex + 1]).toBe('cursor-gemini-session-123');
+      });
+    });
+
+    describe('validateBareModelId with Cursor Gemini models', () => {
+      it('should allow gemini- prefixed models for Cursor provider with expectedProvider="cursor"', () => {
+        // This is the key fix - Cursor Gemini models have bare IDs like "gemini-3-pro"
+        expect(() => validateBareModelId('gemini-3-pro', 'CursorProvider', 'cursor')).not.toThrow();
+        expect(() =>
+          validateBareModelId('gemini-3-flash', 'CursorProvider', 'cursor')
+        ).not.toThrow();
+      });
+
+      it('should still reject other provider prefixes for Cursor provider', () => {
+        expect(() => validateBareModelId('codex-gpt-4', 'CursorProvider', 'cursor')).toThrow();
+        expect(() => validateBareModelId('copilot-gpt-4', 'CursorProvider', 'cursor')).toThrow();
+        expect(() => validateBareModelId('opencode-gpt-4', 'CursorProvider', 'cursor')).toThrow();
+      });
+
+      it('should accept cursor- prefixed models when expectedProvider is "cursor" (for double-prefix validation)', () => {
+        // Note: When expectedProvider="cursor", we skip the cursor- prefix check
+        // This is intentional because the validation happens AFTER prefix stripping
+        // So if cursor-gemini-3-pro reaches validateBareModelId with expectedProvider="cursor",
+        // it means the prefix was NOT properly stripped, but we skip it anyway
+        // since we're checking if the Cursor provider itself can receive cursor- prefixed models
+        expect(() =>
+          validateBareModelId('cursor-gemini-3-pro', 'CursorProvider', 'cursor')
+        ).not.toThrow();
+      });
+    });
+  });
 });
@@ -1,6 +1,7 @@
 import { describe, it, expect, beforeEach } from 'vitest';
 import { GeminiProvider } from '@/providers/gemini-provider.js';
 import type { ProviderMessage } from '@automaker/types';
+import { validateBareModelId } from '@automaker/types';
 
 describe('gemini-provider.ts', () => {
   let provider: GeminiProvider;
@@ -253,4 +254,19 @@
       expect(msg.subtype).toBe('success');
     });
   });
+
+  describe('validateBareModelId integration', () => {
+    it('should allow gemini- prefixed models for Gemini provider with expectedProvider="gemini"', () => {
+      expect(() =>
+        validateBareModelId('gemini-2.5-flash', 'GeminiProvider', 'gemini')
+      ).not.toThrow();
+      expect(() => validateBareModelId('gemini-2.5-pro', 'GeminiProvider', 'gemini')).not.toThrow();
+    });
+
+    it('should reject other provider prefixes for Gemini provider', () => {
+      expect(() => validateBareModelId('cursor-gpt-4', 'GeminiProvider', 'gemini')).toThrow();
+      expect(() => validateBareModelId('codex-gpt-4', 'GeminiProvider', 'gemini')).toThrow();
+      expect(() => validateBareModelId('copilot-gpt-4', 'GeminiProvider', 'gemini')).toThrow();
+    });
+  });
 });
@@ -0,0 +1,270 @@
+/**
+ * Tests for default fields applied to features created by parseAndCreateFeatures
+ *
+ * Verifies that auto-created features include planningMode: 'skip',
+ * requirePlanApproval: false, and dependencies: [].
+ */
+
+import { describe, it, expect, vi, beforeEach } from 'vitest';
+import path from 'path';
+
+// Use vi.hoisted to create mock functions that can be referenced in vi.mock factories
+const { mockMkdir, mockAtomicWriteJson, mockExtractJsonWithArray, mockCreateNotification } =
+  vi.hoisted(() => ({
+    mockMkdir: vi.fn().mockResolvedValue(undefined),
+    mockAtomicWriteJson: vi.fn().mockResolvedValue(undefined),
+    mockExtractJsonWithArray: vi.fn(),
+    mockCreateNotification: vi.fn().mockResolvedValue(undefined),
+  }));
+
+vi.mock('@/lib/secure-fs.js', () => ({
+  mkdir: mockMkdir,
+}));
+
+vi.mock('@automaker/utils', () => ({
+  createLogger: vi.fn().mockReturnValue({
+    info: vi.fn(),
+    warn: vi.fn(),
+    error: vi.fn(),
+    debug: vi.fn(),
+  }),
+  atomicWriteJson: mockAtomicWriteJson,
+  DEFAULT_BACKUP_COUNT: 3,
+}));
+
+vi.mock('@automaker/platform', () => ({
+  getFeaturesDir: vi.fn((projectPath: string) => path.join(projectPath, '.automaker', 'features')),
+}));
+
+vi.mock('@/lib/json-extractor.js', () => ({
+  extractJsonWithArray: mockExtractJsonWithArray,
+}));
+
+vi.mock('@/services/notification-service.js', () => ({
+  getNotificationService: vi.fn(() => ({
+    createNotification: mockCreateNotification,
+  })),
+}));
+
+// Import after mocks are set up
+import { parseAndCreateFeatures } from '../../../../src/routes/app-spec/parse-and-create-features.js';
+
+describe('parseAndCreateFeatures - default fields', () => {
+  const mockEvents = {
+    emit: vi.fn(),
+  } as any;
+
+  const projectPath = '/test/project';
+
+  beforeEach(() => {
+    vi.clearAllMocks();
+  });
+
+  it('should set planningMode to "skip" on created features', async () => {
+    mockExtractJsonWithArray.mockReturnValue({
+      features: [
+        {
+          id: 'feature-1',
+          title: 'Test Feature',
+          description: 'A test feature',
+          priority: 1,
+          complexity: 'simple',
+        },
+      ],
+    });
+
+    await parseAndCreateFeatures(projectPath, 'content', mockEvents);
+
+    expect(mockAtomicWriteJson).toHaveBeenCalledTimes(1);
+    const writtenData = mockAtomicWriteJson.mock.calls[0][1];
+    expect(writtenData.planningMode).toBe('skip');
+  });
+
+  it('should set requirePlanApproval to false on created features', async () => {
+    mockExtractJsonWithArray.mockReturnValue({
+      features: [
+        {
+          id: 'feature-1',
+          title: 'Test Feature',
+          description: 'A test feature',
+        },
+      ],
+    });
+
+    await parseAndCreateFeatures(projectPath, 'content', mockEvents);
+
+    const writtenData = mockAtomicWriteJson.mock.calls[0][1];
+    expect(writtenData.requirePlanApproval).toBe(false);
+  });
+
+  it('should set dependencies to empty array when not provided', async () => {
+    mockExtractJsonWithArray.mockReturnValue({
+      features: [
+        {
+          id: 'feature-1',
+          title: 'Test Feature',
+          description: 'A test feature',
+        },
+      ],
+    });
+
+    await parseAndCreateFeatures(projectPath, 'content', mockEvents);
+
+    const writtenData = mockAtomicWriteJson.mock.calls[0][1];
+    expect(writtenData.dependencies).toEqual([]);
+  });
+
+  it('should preserve dependencies when provided by the parser', async () => {
+    mockExtractJsonWithArray.mockReturnValue({
+      features: [
+        {
+          id: 'feature-1',
+          title: 'Test Feature',
+          description: 'A test feature',
+          dependencies: ['feature-0'],
+        },
+      ],
+    });
+
+    await parseAndCreateFeatures(projectPath, 'content', mockEvents);
+
+    const writtenData = mockAtomicWriteJson.mock.calls[0][1];
+    expect(writtenData.dependencies).toEqual(['feature-0']);
+  });
+
+  it('should apply all default fields consistently across multiple features', async () => {
+    mockExtractJsonWithArray.mockReturnValue({
+      features: [
+        {
+          id: 'feature-1',
+          title: 'Feature 1',
+          description: 'First feature',
+        },
+        {
+          id: 'feature-2',
+          title: 'Feature 2',
+          description: 'Second feature',
+          dependencies: ['feature-1'],
+        },
+        {
+          id: 'feature-3',
+          title: 'Feature 3',
+          description: 'Third feature',
+        },
+      ],
+    });
+
+    await parseAndCreateFeatures(projectPath, 'content', mockEvents);
+
+    expect(mockAtomicWriteJson).toHaveBeenCalledTimes(3);
+
+    for (let i = 0; i < 3; i++) {
+      const writtenData = mockAtomicWriteJson.mock.calls[i][1];
+      expect(writtenData.planningMode, `feature ${i + 1} planningMode`).toBe('skip');
+      expect(writtenData.requirePlanApproval, `feature ${i + 1} requirePlanApproval`).toBe(false);
+      expect(Array.isArray(writtenData.dependencies), `feature ${i + 1} dependencies`).toBe(true);
+    }
+
+    // Feature 2 should have its explicit dependency preserved
+    expect(mockAtomicWriteJson.mock.calls[1][1].dependencies).toEqual(['feature-1']);
+    // Features 1 and 3 should have empty arrays
+    expect(mockAtomicWriteJson.mock.calls[0][1].dependencies).toEqual([]);
+    expect(mockAtomicWriteJson.mock.calls[2][1].dependencies).toEqual([]);
+  });
+
+  it('should set status to "backlog" on all created features', async () => {
+    mockExtractJsonWithArray.mockReturnValue({
+      features: [
+        {
+          id: 'feature-1',
+          title: 'Test Feature',
+          description: 'A test feature',
+        },
+      ],
+    });
+
+    await parseAndCreateFeatures(projectPath, 'content', mockEvents);
+
+    const writtenData = mockAtomicWriteJson.mock.calls[0][1];
+    expect(writtenData.status).toBe('backlog');
+  });
+
+  it('should include createdAt and updatedAt timestamps', async () => {
+    mockExtractJsonWithArray.mockReturnValue({
+      features: [
+        {
+          id: 'feature-1',
+          title: 'Test Feature',
+          description: 'A test feature',
+        },
+      ],
+    });
+
+    await parseAndCreateFeatures(projectPath, 'content', mockEvents);
+
+    const writtenData = mockAtomicWriteJson.mock.calls[0][1];
+    expect(writtenData.createdAt).toBeDefined();
+    expect(writtenData.updatedAt).toBeDefined();
+    // Should be valid ISO date strings
+    expect(new Date(writtenData.createdAt).toISOString()).toBe(writtenData.createdAt);
+    expect(new Date(writtenData.updatedAt).toISOString()).toBe(writtenData.updatedAt);
+  });
+
+  it('should use default values for optional fields not provided', async () => {
+    mockExtractJsonWithArray.mockReturnValue({
+      features: [
+        {
+          id: 'feature-minimal',
+          title: 'Minimal Feature',
+          description: 'Only required fields',
+        },
+      ],
+    });
+
+    await parseAndCreateFeatures(projectPath, 'content', mockEvents);
+
+    const writtenData = mockAtomicWriteJson.mock.calls[0][1];
+    expect(writtenData.category).toBe('Uncategorized');
+    expect(writtenData.priority).toBe(2);
+    expect(writtenData.complexity).toBe('moderate');
+    expect(writtenData.dependencies).toEqual([]);
+    expect(writtenData.planningMode).toBe('skip');
+    expect(writtenData.requirePlanApproval).toBe(false);
+  });
+
+  it('should emit success event after creating features', async () => {
+    mockExtractJsonWithArray.mockReturnValue({
|
||||||
|
features: [
|
||||||
|
{
|
||||||
|
id: 'feature-1',
|
||||||
|
title: 'Feature 1',
|
||||||
|
description: 'First',
|
||||||
|
},
|
||||||
|
],
|
||||||
|
});
|
||||||
|
|
||||||
|
await parseAndCreateFeatures(projectPath, 'content', mockEvents);
|
||||||
|
|
||||||
|
expect(mockEvents.emit).toHaveBeenCalledWith(
|
||||||
|
'spec-regeneration:event',
|
||||||
|
expect.objectContaining({
|
||||||
|
type: 'spec_regeneration_complete',
|
||||||
|
projectPath,
|
||||||
|
})
|
||||||
|
);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should emit error event when no valid JSON is found', async () => {
|
||||||
|
mockExtractJsonWithArray.mockReturnValue(null);
|
||||||
|
|
||||||
|
await parseAndCreateFeatures(projectPath, 'invalid content', mockEvents);
|
||||||
|
|
||||||
|
expect(mockEvents.emit).toHaveBeenCalledWith(
|
||||||
|
'spec-regeneration:event',
|
||||||
|
expect.objectContaining({
|
||||||
|
type: 'spec_regeneration_error',
|
||||||
|
projectPath,
|
||||||
|
})
|
||||||
|
);
|
||||||
|
});
|
||||||
|
});
|
||||||
149	apps/server/tests/unit/routes/backlog-plan/apply.test.ts	Normal file
@@ -0,0 +1,149 @@
import { beforeEach, describe, expect, it, vi } from 'vitest';

const { mockGetAll, mockCreate, mockUpdate, mockDelete, mockClearBacklogPlan } = vi.hoisted(() => ({
  mockGetAll: vi.fn(),
  mockCreate: vi.fn(),
  mockUpdate: vi.fn(),
  mockDelete: vi.fn(),
  mockClearBacklogPlan: vi.fn(),
}));

vi.mock('@/services/feature-loader.js', () => ({
  FeatureLoader: class {
    getAll = mockGetAll;
    create = mockCreate;
    update = mockUpdate;
    delete = mockDelete;
  },
}));

vi.mock('@/routes/backlog-plan/common.js', () => ({
  logger: {
    info: vi.fn(),
    warn: vi.fn(),
    error: vi.fn(),
  },
  clearBacklogPlan: mockClearBacklogPlan,
  getErrorMessage: (error: unknown) => (error instanceof Error ? error.message : String(error)),
  logError: vi.fn(),
}));

import { createApplyHandler } from '@/routes/backlog-plan/routes/apply.js';

function createMockRes() {
  const res: {
    status: ReturnType<typeof vi.fn>;
    json: ReturnType<typeof vi.fn>;
  } = {
    status: vi.fn(),
    json: vi.fn(),
  };
  res.status.mockReturnValue(res);
  return res;
}

describe('createApplyHandler', () => {
  beforeEach(() => {
    vi.clearAllMocks();
    mockGetAll.mockResolvedValue([]);
    mockCreate.mockResolvedValue({ id: 'feature-created' });
    mockUpdate.mockResolvedValue({});
    mockDelete.mockResolvedValue(true);
    mockClearBacklogPlan.mockResolvedValue(undefined);
  });

  it('applies default feature model and planning settings when backlog plan additions omit them', async () => {
    const settingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        defaultFeatureModel: { model: 'codex-gpt-5.2-codex', reasoningEffort: 'high' },
        defaultPlanningMode: 'spec',
        defaultRequirePlanApproval: true,
      }),
      getProjectSettings: vi.fn().mockResolvedValue({}),
    } as any;

    const req = {
      body: {
        projectPath: '/tmp/project',
        plan: {
          changes: [
            {
              type: 'add',
              feature: {
                id: 'feature-from-plan',
                title: 'Created from plan',
                description: 'desc',
              },
            },
          ],
        },
      },
    } as any;
    const res = createMockRes();

    await createApplyHandler(settingsService)(req, res as any);

    expect(mockCreate).toHaveBeenCalledWith(
      '/tmp/project',
      expect.objectContaining({
        model: 'codex-gpt-5.2-codex',
        reasoningEffort: 'high',
        planningMode: 'spec',
        requirePlanApproval: true,
      })
    );
    expect(res.json).toHaveBeenCalledWith(
      expect.objectContaining({
        success: true,
      })
    );
  });

  it('uses project default feature model override and enforces no approval for skip mode', async () => {
    const settingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({
        defaultFeatureModel: { model: 'claude-opus' },
        defaultPlanningMode: 'skip',
        defaultRequirePlanApproval: true,
      }),
      getProjectSettings: vi.fn().mockResolvedValue({
        defaultFeatureModel: {
          model: 'GLM-4.7',
          providerId: 'provider-glm',
          thinkingLevel: 'adaptive',
        },
      }),
    } as any;

    const req = {
      body: {
        projectPath: '/tmp/project',
        plan: {
          changes: [
            {
              type: 'add',
              feature: {
                id: 'feature-from-plan',
                title: 'Created from plan',
              },
            },
          ],
        },
      },
    } as any;
    const res = createMockRes();

    await createApplyHandler(settingsService)(req, res as any);

    expect(mockCreate).toHaveBeenCalledWith(
      '/tmp/project',
      expect.objectContaining({
        model: 'GLM-4.7',
        providerId: 'provider-glm',
        thinkingLevel: 'adaptive',
        planningMode: 'skip',
        requirePlanApproval: false,
      })
    );
  });
});
@@ -0,0 +1,930 @@
/**
 * Tests for worktree list endpoint handling of detached HEAD state.
 *
 * When a worktree is in detached HEAD state (e.g., during a rebase),
 * `git worktree list --porcelain` outputs "detached" instead of
 * "branch refs/heads/...". Previously, these worktrees were silently
 * dropped from the response because the parser required both path AND branch.
 */
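/**
 * Illustrative sketch (NOT the parser under test, which lives in
 * @/routes/worktree/routes/list.js): how one porcelain block maps to a
 * { path, branch } entry. The function name and return shape here are
 * hypothetical, shown only to make the "detached" vs "branch" cases concrete.
 */
function parsePorcelainSketch(output: string): Array<{ path: string; branch: string | null }> {
  return output
    .split('\n\n')
    .filter((block) => block.trim().length > 0)
    .map((block) => {
      const lines = block.split('\n').filter((l) => l.length > 0);
      // First line of each block is "worktree <path>"
      const path = lines[0].replace(/^worktree /, '');
      const branchLine = lines.find((l) => l.startsWith('branch '));
      // Detached worktrees emit a bare "detached" line and no "branch" line;
      // return null so callers can attempt branch recovery instead of dropping the entry.
      return { path, branch: branchLine ? branchLine.replace('branch refs/heads/', '') : null };
    });
}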

import { describe, it, expect, vi, beforeEach, type Mock } from 'vitest';
import type { Request, Response } from 'express';
import { exec } from 'child_process';
import { createMockExpressContext } from '../../../utils/mocks.js';

// Mock all external dependencies before importing the module under test
vi.mock('child_process', () => ({
  exec: vi.fn(),
}));

vi.mock('@/lib/git.js', () => ({
  execGitCommand: vi.fn(),
}));

vi.mock('@automaker/git-utils', () => ({
  isGitRepo: vi.fn(async () => true),
}));

vi.mock('@automaker/utils', () => ({
  createLogger: () => ({
    info: vi.fn(),
    warn: vi.fn(),
    error: vi.fn(),
    debug: vi.fn(),
  }),
}));

vi.mock('@automaker/types', () => ({
  validatePRState: vi.fn((state: string) => state),
}));

vi.mock('@/lib/secure-fs.js', () => ({
  access: vi.fn().mockResolvedValue(undefined),
  readFile: vi.fn(),
  readdir: vi.fn().mockResolvedValue([]),
  stat: vi.fn(),
}));

vi.mock('@/lib/worktree-metadata.js', () => ({
  readAllWorktreeMetadata: vi.fn(async () => new Map()),
  updateWorktreePRInfo: vi.fn(async () => undefined),
}));

vi.mock('@/routes/worktree/common.js', async (importOriginal) => {
  const actual = (await importOriginal()) as Record<string, unknown>;
  return {
    ...actual,
    getErrorMessage: vi.fn((e: Error) => e?.message || 'Unknown error'),
    logError: vi.fn(),
    normalizePath: vi.fn((p: string) => p),
    execEnv: {},
    isGhCliAvailable: vi.fn().mockResolvedValue(false),
  };
});

vi.mock('@/routes/github/routes/check-github-remote.js', () => ({
  checkGitHubRemote: vi.fn().mockResolvedValue({ hasGitHubRemote: false }),
}));

import { createListHandler } from '@/routes/worktree/routes/list.js';
import * as secureFs from '@/lib/secure-fs.js';
import { execGitCommand } from '@/lib/git.js';
import { readAllWorktreeMetadata, updateWorktreePRInfo } from '@/lib/worktree-metadata.js';
import { isGitRepo } from '@automaker/git-utils';
import { isGhCliAvailable, normalizePath, getErrorMessage } from '@/routes/worktree/common.js';
import { checkGitHubRemote } from '@/routes/github/routes/check-github-remote.js';

/**
 * Set up execGitCommand mock (list handler uses this via lib/git.js, not child_process.exec).
 */
function setupExecGitCommandMock(options: {
  porcelainOutput: string;
  projectBranch?: string;
  gitDirs?: Record<string, string>;
  worktreeBranches?: Record<string, string>;
}) {
  const { porcelainOutput, projectBranch = 'main', gitDirs = {}, worktreeBranches = {} } = options;

  vi.mocked(execGitCommand).mockImplementation(async (args: string[], cwd: string) => {
    if (args[0] === 'worktree' && args[1] === 'list' && args[2] === '--porcelain') {
      return porcelainOutput;
    }
    if (args[0] === 'branch' && args[1] === '--show-current') {
      if (worktreeBranches[cwd] !== undefined) {
        return worktreeBranches[cwd] + '\n';
      }
      return projectBranch + '\n';
    }
    if (args[0] === 'rev-parse' && args[1] === '--git-dir') {
      if (cwd && gitDirs[cwd]) {
        return gitDirs[cwd] + '\n';
      }
      throw new Error('not a git directory');
    }
    if (args[0] === 'rev-parse' && args[1] === '--abbrev-ref' && args[2] === 'HEAD') {
      return 'HEAD\n';
    }
    if (args[0] === 'worktree' && args[1] === 'prune') {
      return '';
    }
    if (args[0] === 'status' && args[1] === '--porcelain') {
      return '';
    }
    if (args[0] === 'diff' && args[1] === '--name-only' && args[2] === '--diff-filter=U') {
      return '';
    }
    return '';
  });
}
describe('worktree list - detached HEAD handling', () => {
  let req: Request;
  let res: Response;

  beforeEach(() => {
    vi.clearAllMocks();
    const context = createMockExpressContext();
    req = context.req;
    res = context.res;

    // Re-establish mock implementations cleared by mockReset/clearAllMocks
    vi.mocked(isGitRepo).mockResolvedValue(true);
    vi.mocked(readAllWorktreeMetadata).mockResolvedValue(new Map());
    vi.mocked(isGhCliAvailable).mockResolvedValue(false);
    vi.mocked(checkGitHubRemote).mockResolvedValue({ hasGitHubRemote: false });
    vi.mocked(normalizePath).mockImplementation((p: string) => p);
    vi.mocked(getErrorMessage).mockImplementation(
      (e: unknown) => (e as Error)?.message || 'Unknown error'
    );

    // Default: all paths exist
    vi.mocked(secureFs.access).mockResolvedValue(undefined);
    // Default: .worktrees directory doesn't exist (no scan via readdir)
    vi.mocked(secureFs.readdir).mockRejectedValue(new Error('ENOENT'));
    // Default: readFile fails
    vi.mocked(secureFs.readFile).mockRejectedValue(new Error('ENOENT'));

    // Default execGitCommand so list handler gets valid porcelain/branch output (vitest clearMocks resets implementations)
    setupExecGitCommandMock({
      porcelainOutput: 'worktree /project\nbranch refs/heads/main\n\n',
      projectBranch: 'main',
    });
  });

  /**
   * Helper: set up execGitCommand mock for the list handler.
   * Worktree-specific behavior can be customized via the options parameter.
   */
  function setupStandardExec(options: {
    porcelainOutput: string;
    projectBranch?: string;
    /** Map of worktree path -> git-dir path */
    gitDirs?: Record<string, string>;
    /** Map of worktree cwd -> branch for `git branch --show-current` */
    worktreeBranches?: Record<string, string>;
  }) {
    setupExecGitCommandMock(options);
  }

  /** Suppress .worktrees dir scan by making access throw for the .worktrees dir. */
  function disableWorktreesScan() {
    vi.mocked(secureFs.access).mockImplementation(async (p) => {
      const pathStr = String(p);
      // Block only the .worktrees dir access check in scanWorktreesDirectory
      if (pathStr.endsWith('.worktrees') || pathStr.endsWith('.worktrees/')) {
        throw new Error('ENOENT');
      }
      // All other paths exist
      return undefined;
    });
  }
  describe('porcelain parser', () => {
    it('should include normal worktrees with branch lines', async () => {
      req.body = { projectPath: '/project' };

      setupStandardExec({
        porcelainOutput: [
          'worktree /project',
          'branch refs/heads/main',
          '',
          'worktree /project/.worktrees/feature-a',
          'branch refs/heads/feature-a',
          '',
        ].join('\n'),
      });
      disableWorktreesScan();

      const handler = createListHandler();
      await handler(req, res);

      const response = vi.mocked(res.json).mock.calls[0][0] as {
        success: boolean;
        worktrees: Array<{ branch: string; path: string; isMain: boolean; hasWorktree: boolean }>;
      };

      expect(response.success).toBe(true);
      expect(response.worktrees).toHaveLength(2);
      expect(response.worktrees[0]).toEqual(
        expect.objectContaining({
          path: '/project',
          branch: 'main',
          isMain: true,
          hasWorktree: true,
        })
      );
      expect(response.worktrees[1]).toEqual(
        expect.objectContaining({
          path: '/project/.worktrees/feature-a',
          branch: 'feature-a',
          isMain: false,
          hasWorktree: true,
        })
      );
    });

    it('should include worktrees with detached HEAD and recover branch from rebase-merge state', async () => {
      req.body = { projectPath: '/project' };

      setupStandardExec({
        porcelainOutput: [
          'worktree /project',
          'branch refs/heads/main',
          '',
          'worktree /project/.worktrees/rebasing-wt',
          'detached',
          '',
        ].join('\n'),
        gitDirs: {
          '/project/.worktrees/rebasing-wt': '/project/.worktrees/rebasing-wt/.git',
        },
      });
      disableWorktreesScan();

      // rebase-merge/head-name returns the branch being rebased
      vi.mocked(secureFs.readFile).mockImplementation(async (filePath) => {
        const pathStr = String(filePath);
        if (pathStr.includes('rebase-merge/head-name')) {
          return 'refs/heads/feature/my-rebasing-branch\n' as any;
        }
        throw new Error('ENOENT');
      });

      const handler = createListHandler();
      await handler(req, res);

      const response = vi.mocked(res.json).mock.calls[0][0] as {
        worktrees: Array<{ branch: string; path: string; isCurrent: boolean }>;
      };
      expect(response.worktrees).toHaveLength(2);
      expect(response.worktrees[1]).toEqual(
        expect.objectContaining({
          path: '/project/.worktrees/rebasing-wt',
          branch: 'feature/my-rebasing-branch',
          isMain: false,
          isCurrent: false,
          hasWorktree: true,
        })
      );
    });
    it('should include worktrees with detached HEAD and recover branch from rebase-apply state', async () => {
      req.body = { projectPath: '/project' };

      setupStandardExec({
        porcelainOutput: [
          'worktree /project',
          'branch refs/heads/main',
          '',
          'worktree /project/.worktrees/apply-wt',
          'detached',
          '',
        ].join('\n'),
        gitDirs: {
          '/project/.worktrees/apply-wt': '/project/.worktrees/apply-wt/.git',
        },
      });
      disableWorktreesScan();

      // rebase-merge doesn't exist, but rebase-apply does
      vi.mocked(secureFs.readFile).mockImplementation(async (filePath) => {
        const pathStr = String(filePath);
        if (pathStr.includes('rebase-apply/head-name')) {
          return 'refs/heads/feature/apply-branch\n' as any;
        }
        throw new Error('ENOENT');
      });

      const handler = createListHandler();
      await handler(req, res);

      const response = vi.mocked(res.json).mock.calls[0][0] as {
        worktrees: Array<{ branch: string; path: string }>;
      };
      const detachedWt = response.worktrees.find((w) => w.path === '/project/.worktrees/apply-wt');
      expect(detachedWt).toBeDefined();
      expect(detachedWt!.branch).toBe('feature/apply-branch');
    });

    it('should show merge conflict worktrees normally since merge does not detach HEAD', async () => {
      // During a merge conflict, HEAD stays on the branch, so `git worktree list --porcelain`
      // still outputs `branch refs/heads/...`. This test verifies merge conflicts don't
      // trigger the detached HEAD recovery path.
      req.body = { projectPath: '/project' };

      setupStandardExec({
        porcelainOutput: [
          'worktree /project',
          'branch refs/heads/main',
          '',
          'worktree /project/.worktrees/merge-wt',
          'branch refs/heads/feature/merge-branch',
          '',
        ].join('\n'),
      });
      disableWorktreesScan();

      const handler = createListHandler();
      await handler(req, res);

      const response = vi.mocked(res.json).mock.calls[0][0] as {
        worktrees: Array<{ branch: string; path: string }>;
      };
      const mergeWt = response.worktrees.find((w) => w.path === '/project/.worktrees/merge-wt');
      expect(mergeWt).toBeDefined();
      expect(mergeWt!.branch).toBe('feature/merge-branch');
    });
    it('should fall back to (detached) when all branch recovery methods fail', async () => {
      req.body = { projectPath: '/project' };

      setupStandardExec({
        porcelainOutput: [
          'worktree /project',
          'branch refs/heads/main',
          '',
          'worktree /project/.worktrees/unknown-wt',
          'detached',
          '',
        ].join('\n'),
        worktreeBranches: {
          '/project/.worktrees/unknown-wt': '', // empty = no branch
        },
      });
      disableWorktreesScan();

      // All readFile calls fail (no gitDirs so rev-parse --git-dir will throw)
      vi.mocked(secureFs.readFile).mockRejectedValue(new Error('ENOENT'));

      const handler = createListHandler();
      await handler(req, res);

      const response = vi.mocked(res.json).mock.calls[0][0] as {
        worktrees: Array<{ branch: string; path: string }>;
      };
      const detachedWt = response.worktrees.find(
        (w) => w.path === '/project/.worktrees/unknown-wt'
      );
      expect(detachedWt).toBeDefined();
      expect(detachedWt!.branch).toBe('(detached)');
    });

    it('should not include detached worktree when directory does not exist on disk', async () => {
      req.body = { projectPath: '/project' };

      setupStandardExec({
        porcelainOutput: [
          'worktree /project',
          'branch refs/heads/main',
          '',
          'worktree /project/.worktrees/deleted-wt',
          'detached',
          '',
        ].join('\n'),
      });

      // The deleted worktree doesn't exist on disk
      vi.mocked(secureFs.access).mockImplementation(async (p) => {
        const pathStr = String(p);
        if (pathStr.includes('deleted-wt')) {
          throw new Error('ENOENT');
        }
        if (pathStr.endsWith('.worktrees') || pathStr.endsWith('.worktrees/')) {
          throw new Error('ENOENT');
        }
        return undefined;
      });

      const handler = createListHandler();
      await handler(req, res);

      const response = vi.mocked(res.json).mock.calls[0][0] as {
        worktrees: Array<{ branch: string; path: string }>;
      };
      // Only the main worktree should be present
      expect(response.worktrees).toHaveLength(1);
      expect(response.worktrees[0].path).toBe('/project');
    });
    it('should set isCurrent to false for detached worktrees even if recovered branch matches current branch', async () => {
      req.body = { projectPath: '/project' };

      setupStandardExec({
        porcelainOutput: [
          'worktree /project',
          'branch refs/heads/main',
          '',
          'worktree /project/.worktrees/rebasing-wt',
          'detached',
          '',
        ].join('\n'),
        // currentBranch for project is 'feature/my-branch'
        projectBranch: 'feature/my-branch',
        gitDirs: {
          '/project/.worktrees/rebasing-wt': '/project/.worktrees/rebasing-wt/.git',
        },
      });
      disableWorktreesScan();

      // Recovery returns the same branch as currentBranch
      vi.mocked(secureFs.readFile).mockImplementation(async (filePath) => {
        const pathStr = String(filePath);
        if (pathStr.includes('rebase-merge/head-name')) {
          return 'refs/heads/feature/my-branch\n' as any;
        }
        throw new Error('ENOENT');
      });

      const handler = createListHandler();
      await handler(req, res);

      const response = vi.mocked(res.json).mock.calls[0][0] as {
        worktrees: Array<{ branch: string; isCurrent: boolean; path: string }>;
      };
      const detachedWt = response.worktrees.find(
        (w) => w.path === '/project/.worktrees/rebasing-wt'
      );
      expect(detachedWt).toBeDefined();
      // Detached worktrees should always have isCurrent=false
      expect(detachedWt!.isCurrent).toBe(false);
    });

    it('should handle mixed normal and detached worktrees', async () => {
      req.body = { projectPath: '/project' };

      setupStandardExec({
        porcelainOutput: [
          'worktree /project',
          'branch refs/heads/main',
          '',
          'worktree /project/.worktrees/normal-wt',
          'branch refs/heads/feature-normal',
          '',
          'worktree /project/.worktrees/rebasing-wt',
          'detached',
          '',
          'worktree /project/.worktrees/another-normal',
          'branch refs/heads/feature-other',
          '',
        ].join('\n'),
        gitDirs: {
          '/project/.worktrees/rebasing-wt': '/project/.worktrees/rebasing-wt/.git',
        },
      });
      disableWorktreesScan();

      vi.mocked(secureFs.readFile).mockImplementation(async (filePath) => {
        const pathStr = String(filePath);
        if (pathStr.includes('rebase-merge/head-name')) {
          return 'refs/heads/feature/rebasing\n' as any;
        }
        throw new Error('ENOENT');
      });

      const handler = createListHandler();
      await handler(req, res);

      const response = vi.mocked(res.json).mock.calls[0][0] as {
        worktrees: Array<{ branch: string; path: string; isMain: boolean }>;
      };
      expect(response.worktrees).toHaveLength(4);
      expect(response.worktrees[0]).toEqual(
        expect.objectContaining({ path: '/project', branch: 'main', isMain: true })
      );
      expect(response.worktrees[1]).toEqual(
        expect.objectContaining({
          path: '/project/.worktrees/normal-wt',
          branch: 'feature-normal',
          isMain: false,
        })
      );
      expect(response.worktrees[2]).toEqual(
        expect.objectContaining({
          path: '/project/.worktrees/rebasing-wt',
          branch: 'feature/rebasing',
          isMain: false,
        })
      );
      expect(response.worktrees[3]).toEqual(
        expect.objectContaining({
          path: '/project/.worktrees/another-normal',
          branch: 'feature-other',
          isMain: false,
        })
      );
    });
    it('should correctly advance isFirst flag past detached worktrees', async () => {
      req.body = { projectPath: '/project' };

      setupStandardExec({
        porcelainOutput: [
          'worktree /project',
          'branch refs/heads/main',
          '',
          'worktree /project/.worktrees/detached-wt',
          'detached',
          '',
          'worktree /project/.worktrees/normal-wt',
          'branch refs/heads/feature-x',
          '',
        ].join('\n'),
      });
      disableWorktreesScan();
      vi.mocked(secureFs.readFile).mockRejectedValue(new Error('ENOENT'));

      const handler = createListHandler();
      await handler(req, res);

      const response = vi.mocked(res.json).mock.calls[0][0] as {
        worktrees: Array<{ branch: string; isMain: boolean }>;
      };
      expect(response.worktrees).toHaveLength(3);
      expect(response.worktrees[0].isMain).toBe(true); // main
      expect(response.worktrees[1].isMain).toBe(false); // detached
      expect(response.worktrees[2].isMain).toBe(false); // normal
    });

    it('should not add removed detached worktrees to removedWorktrees list', async () => {
      req.body = { projectPath: '/project' };

      setupStandardExec({
        porcelainOutput: [
          'worktree /project',
          'branch refs/heads/main',
          '',
          'worktree /project/.worktrees/gone-wt',
          'detached',
          '',
        ].join('\n'),
      });

      // The detached worktree doesn't exist on disk
      vi.mocked(secureFs.access).mockImplementation(async (p) => {
        const pathStr = String(p);
        if (pathStr.includes('gone-wt')) {
          throw new Error('ENOENT');
        }
        if (pathStr.endsWith('.worktrees') || pathStr.endsWith('.worktrees/')) {
          throw new Error('ENOENT');
        }
        return undefined;
      });

      const handler = createListHandler();
      await handler(req, res);

      const response = vi.mocked(res.json).mock.calls[0][0] as {
        worktrees: Array<{ branch: string }>;
        removedWorktrees?: Array<{ path: string; branch: string }>;
      };
      // Should not be in removed list since we don't know the branch
      expect(response.removedWorktrees).toBeUndefined();
    });
    it('should strip refs/heads/ prefix from recovered branch name', async () => {
      req.body = { projectPath: '/project' };

      setupStandardExec({
        porcelainOutput: [
          'worktree /project',
          'branch refs/heads/main',
          '',
          'worktree /project/.worktrees/wt1',
          'detached',
          '',
        ].join('\n'),
        gitDirs: {
          '/project/.worktrees/wt1': '/project/.worktrees/wt1/.git',
        },
      });
      disableWorktreesScan();

      vi.mocked(secureFs.readFile).mockImplementation(async (filePath) => {
        const pathStr = String(filePath);
        if (pathStr.includes('rebase-merge/head-name')) {
          return 'refs/heads/my-branch\n' as any;
        }
        throw new Error('ENOENT');
      });

      const handler = createListHandler();
      await handler(req, res);

      const response = vi.mocked(res.json).mock.calls[0][0] as {
        worktrees: Array<{ branch: string; path: string }>;
      };
      const wt = response.worktrees.find((w) => w.path === '/project/.worktrees/wt1');
      expect(wt).toBeDefined();
      // Should be 'my-branch', not 'refs/heads/my-branch'
      expect(wt!.branch).toBe('my-branch');
    });
  });

  describe('scanWorktreesDirectory with detached HEAD recovery', () => {
    it('should recover branch for discovered worktrees with detached HEAD', async () => {
      req.body = { projectPath: '/project' };

      vi.mocked(execGitCommand).mockImplementation(async (args: string[], cwd: string) => {
        if (args[0] === 'worktree' && args[1] === 'list') {
          return 'worktree /project\nbranch refs/heads/main\n\n';
        }
        if (args[0] === 'branch' && args[1] === '--show-current') {
          return cwd === '/project' ? 'main\n' : '\n';
        }
        if (args[0] === 'rev-parse' && args[1] === '--abbrev-ref') {
          return 'HEAD\n';
        }
        if (args[0] === 'rev-parse' && args[1] === '--git-dir') {
          return '/project/.worktrees/orphan-wt/.git\n';
        }
        return '';
      });

      // .worktrees directory exists and has an orphan worktree
      vi.mocked(secureFs.access).mockResolvedValue(undefined);
      vi.mocked(secureFs.readdir).mockResolvedValue([
        { name: 'orphan-wt', isDirectory: () => true, isFile: () => false } as any,
      ]);
      vi.mocked(secureFs.stat).mockResolvedValue({
        isFile: () => true,
        isDirectory: () => false,
      } as any);

      // readFile returns branch from rebase-merge/head-name
      vi.mocked(secureFs.readFile).mockImplementation(async (filePath) => {
        const pathStr = String(filePath);
        if (pathStr.includes('rebase-merge/head-name')) {
          return 'refs/heads/feature/orphan-branch\n' as any;
        }
        throw new Error('ENOENT');
      });

      const handler = createListHandler();
      await handler(req, res);

      const response = vi.mocked(res.json).mock.calls[0][0] as {
        worktrees: Array<{ branch: string; path: string }>;
      };

      const orphanWt = response.worktrees.find((w) => w.path === '/project/.worktrees/orphan-wt');
      expect(orphanWt).toBeDefined();
      expect(orphanWt!.branch).toBe('feature/orphan-branch');
    });

    it('should skip discovered worktrees when all branch detection fails', async () => {
      req.body = { projectPath: '/project' };

      vi.mocked(execGitCommand).mockImplementation(async (args: string[], cwd: string) => {
        if (args[0] === 'worktree' && args[1] === 'list') {
          return 'worktree /project\nbranch refs/heads/main\n\n';
        }
        if (args[0] === 'branch' && args[1] === '--show-current') {
          return cwd === '/project' ? 'main\n' : '\n';
        }
        if (args[0] === 'rev-parse' && args[1] === '--abbrev-ref') {
          return 'HEAD\n';
        }
        if (args[0] === 'rev-parse' && args[1] === '--git-dir') {
          throw new Error('not a git dir');
        }
        return '';
      });

      vi.mocked(secureFs.access).mockResolvedValue(undefined);
      vi.mocked(secureFs.readdir).mockResolvedValue([
        { name: 'broken-wt', isDirectory: () => true, isFile: () => false } as any,
      ]);
      vi.mocked(secureFs.stat).mockResolvedValue({
        isFile: () => true,
        isDirectory: () => false,
      } as any);
      vi.mocked(secureFs.readFile).mockRejectedValue(new Error('ENOENT'));

      const handler = createListHandler();
      await handler(req, res);

      const response = vi.mocked(res.json).mock.calls[0][0] as {
        worktrees: Array<{ branch: string; path: string }>;
      };

      // Only main worktree should be present
      expect(response.worktrees).toHaveLength(1);
      expect(response.worktrees[0].branch).toBe('main');
    });
  });

  describe('PR tracking precedence', () => {
    it('should keep manually tracked PR from metadata when branch PR differs', async () => {
      req.body = { projectPath: '/project', includeDetails: true };

      vi.mocked(readAllWorktreeMetadata).mockResolvedValue(
        new Map([
          [
            'feature-a',
            {
              branch: 'feature-a',
              createdAt: '2026-01-01T00:00:00.000Z',
              pr: {
                number: 99,
                url: 'https://github.com/org/repo/pull/99',
                title: 'Manual override PR',
                state: 'OPEN',
                createdAt: '2026-01-01T00:00:00.000Z',
              },
            },
          ],
        ])
      );
      vi.mocked(isGhCliAvailable).mockResolvedValue(true);
      vi.mocked(checkGitHubRemote).mockResolvedValue({
        hasGitHubRemote: true,
        owner: 'org',
        repo: 'repo',
      });
      vi.mocked(secureFs.access).mockImplementation(async (p) => {
        const pathStr = String(p);
        if (
          pathStr.includes('MERGE_HEAD') ||
          pathStr.includes('rebase-merge') ||
          pathStr.includes('rebase-apply') ||
          pathStr.includes('CHERRY_PICK_HEAD')
        ) {
          throw new Error('ENOENT');
        }
        return undefined;
      });

      vi.mocked(execGitCommand).mockImplementation(async (args: string[], cwd: string) => {
        if (args[0] === 'rev-parse' && args[1] === '--git-dir') {
          throw new Error('no git dir');
        }
        if (args[0] === 'worktree' && args[1] === 'list') {
          return [
            'worktree /project',
            'branch refs/heads/main',
            '',
            'worktree /project/.worktrees/feature-a',
            'branch refs/heads/feature-a',
            '',
          ].join('\n');
        }
        if (args[0] === 'branch' && args[1] === '--show-current') {
          return cwd === '/project' ? 'main\n' : 'feature-a\n';
        }
        if (args[0] === 'status' && args[1] === '--porcelain') {
          return '';
        }
        return '';
      });
      (exec as unknown as Mock).mockImplementation(
        (
          cmd: string,
          _opts: unknown,
          callback?: (err: Error | null, out: { stdout: string; stderr: string }) => void
        ) => {
          const cb = typeof _opts === 'function' ? _opts : callback!;
          if (cmd.includes('gh pr list')) {
            cb(null, {
              stdout: JSON.stringify([
                {
                  number: 42,
                  title: 'Branch PR',
                  url: 'https://github.com/org/repo/pull/42',
                  state: 'OPEN',
                  headRefName: 'feature-a',
                  createdAt: '2026-01-02T00:00:00.000Z',
                },
              ]),
              stderr: '',
            });
          } else {
            cb(null, { stdout: '', stderr: '' });
          }
        }
      );
      disableWorktreesScan();

      const handler = createListHandler();
      await handler(req, res);

      const response = vi.mocked(res.json).mock.calls[0][0] as {
        worktrees: Array<{ branch: string; pr?: { number: number; title: string } }>;
      };
      const featureWorktree = response.worktrees.find((w) => w.branch === 'feature-a');
      expect(featureWorktree?.pr?.number).toBe(99);
      expect(featureWorktree?.pr?.title).toBe('Manual override PR');
    });

    it('should prefer GitHub PR when it matches metadata number and sync updated fields', async () => {
      req.body = { projectPath: '/project-2', includeDetails: true };

      vi.mocked(readAllWorktreeMetadata).mockResolvedValue(
        new Map([
          [
            'feature-a',
            {
              branch: 'feature-a',
              createdAt: '2026-01-01T00:00:00.000Z',
              pr: {
                number: 42,
                url: 'https://github.com/org/repo/pull/42',
                title: 'Old title',
                state: 'OPEN',
                createdAt: '2026-01-01T00:00:00.000Z',
              },
            },
          ],
        ])
      );
      vi.mocked(isGhCliAvailable).mockResolvedValue(true);
      vi.mocked(checkGitHubRemote).mockResolvedValue({
        hasGitHubRemote: true,
        owner: 'org',
        repo: 'repo',
      });
      vi.mocked(secureFs.access).mockImplementation(async (p) => {
        const pathStr = String(p);
        if (
          pathStr.includes('MERGE_HEAD') ||
          pathStr.includes('rebase-merge') ||
          pathStr.includes('rebase-apply') ||
          pathStr.includes('CHERRY_PICK_HEAD')
        ) {
          throw new Error('ENOENT');
        }
        return undefined;
      });

      vi.mocked(execGitCommand).mockImplementation(async (args: string[], cwd: string) => {
        if (args[0] === 'rev-parse' && args[1] === '--git-dir') {
          throw new Error('no git dir');
        }
        if (args[0] === 'worktree' && args[1] === 'list') {
          return [
            'worktree /project-2',
            'branch refs/heads/main',
            '',
            'worktree /project-2/.worktrees/feature-a',
            'branch refs/heads/feature-a',
            '',
          ].join('\n');
        }
        if (args[0] === 'branch' && args[1] === '--show-current') {
          return cwd === '/project-2' ? 'main\n' : 'feature-a\n';
        }
        if (args[0] === 'status' && args[1] === '--porcelain') {
          return '';
        }
        return '';
      });
      (exec as unknown as Mock).mockImplementation(
        (
          cmd: string,
          _opts: unknown,
          callback?: (err: Error | null, out: { stdout: string; stderr: string }) => void
        ) => {
          const cb = typeof _opts === 'function' ? _opts : callback!;
          if (cmd.includes('gh pr list')) {
            cb(null, {
              stdout: JSON.stringify([
                {
                  number: 42,
                  title: 'New title from GitHub',
                  url: 'https://github.com/org/repo/pull/42',
                  state: 'MERGED',
                  headRefName: 'feature-a',
                  createdAt: '2026-01-02T00:00:00.000Z',
                },
              ]),
              stderr: '',
            });
          } else {
            cb(null, { stdout: '', stderr: '' });
          }
        }
      );
      disableWorktreesScan();

      const handler = createListHandler();
      await handler(req, res);

      const response = vi.mocked(res.json).mock.calls[0][0] as {
        worktrees: Array<{ branch: string; pr?: { number: number; title: string; state: string } }>;
      };
      const featureWorktree = response.worktrees.find((w) => w.branch === 'feature-a');
      expect(featureWorktree?.pr?.number).toBe(42);
      expect(featureWorktree?.pr?.title).toBe('New title from GitHub');
      expect(featureWorktree?.pr?.state).toBe('MERGED');
      expect(vi.mocked(updateWorktreePRInfo)).toHaveBeenCalledWith(
        '/project-2',
        'feature-a',
        expect.objectContaining({
          number: 42,
          title: 'New title from GitHub',
          state: 'MERGED',
        })
      );
    });
  });
});

@@ -1181,6 +1181,50 @@ describe('AgentExecutor', () => {
      );
    });

    it('should pass claudeCompatibleProvider to executeQuery options', async () => {
      const executor = new AgentExecutor(
        mockEventBus,
        mockFeatureStateManager,
        mockPlanApprovalService,
        mockSettingsService
      );

      const mockProvider = {
        getName: () => 'mock',
        executeQuery: vi.fn().mockImplementation(function* () {
          yield { type: 'result', subtype: 'success' };
        }),
      } as unknown as BaseProvider;

      const mockClaudeProvider = { id: 'zai-1', name: 'Zai' } as any;

      const options: AgentExecutionOptions = {
        workDir: '/test',
        featureId: 'test-feature',
        prompt: 'Test prompt',
        projectPath: '/project',
        abortController: new AbortController(),
        provider: mockProvider,
        effectiveBareModel: 'claude-sonnet-4-6',
        claudeCompatibleProvider: mockClaudeProvider,
      };

      const callbacks = {
        waitForApproval: vi.fn().mockResolvedValue({ approved: true }),
        saveFeatureSummary: vi.fn(),
        updateFeatureSummary: vi.fn(),
        buildTaskPrompt: vi.fn().mockReturnValue('task prompt'),
      };

      await executor.execute(options, callbacks);

      expect(mockProvider.executeQuery).toHaveBeenCalledWith(
        expect.objectContaining({
          claudeCompatibleProvider: mockClaudeProvider,
        })
      );
    });

    it('should return correct result structure', async () => {
      const executor = new AgentExecutor(
        mockEventBus,
@@ -0,0 +1,207 @@
import { describe, it, expect, vi, beforeEach } from 'vitest';

// Mock dependencies (hoisted)
vi.mock('../../../../src/services/agent-executor.js');
vi.mock('../../../../src/lib/settings-helpers.js');
vi.mock('../../../../src/providers/provider-factory.js');
vi.mock('../../../../src/lib/sdk-options.js');
vi.mock('@automaker/model-resolver', () => ({
  resolveModelString: vi.fn((model, fallback) => model || fallback),
  DEFAULT_MODELS: { claude: 'claude-3-5-sonnet' },
}));

import { AutoModeServiceFacade } from '../../../../src/services/auto-mode/facade.js';
import { AgentExecutor } from '../../../../src/services/agent-executor.js';
import * as settingsHelpers from '../../../../src/lib/settings-helpers.js';
import { ProviderFactory } from '../../../../src/providers/provider-factory.js';
import * as sdkOptions from '../../../../src/lib/sdk-options.js';

describe('AutoModeServiceFacade Agent Runner', () => {
  let mockAgentExecutor: MockAgentExecutor;
  let mockSettingsService: MockSettingsService;
  let facade: AutoModeServiceFacade;

  // Type definitions for mocks
  interface MockAgentExecutor {
    execute: ReturnType<typeof vi.fn>;
  }
  interface MockSettingsService {
    getGlobalSettings: ReturnType<typeof vi.fn>;
    getCredentials: ReturnType<typeof vi.fn>;
    getProjectSettings: ReturnType<typeof vi.fn>;
  }

  beforeEach(() => {
    vi.clearAllMocks();

    // Set up the mock for createAutoModeOptions
    // Note: Using 'as any' because Options type from SDK is complex and we only need
    // the specific fields that are verified in tests (maxTurns, allowedTools, etc.)
    vi.mocked(sdkOptions.createAutoModeOptions).mockReturnValue({
      maxTurns: 123,
      allowedTools: ['tool1'],
      systemPrompt: 'system-prompt',
    } as any);

    mockAgentExecutor = {
      execute: vi.fn().mockResolvedValue(undefined),
    };
    (AgentExecutor as any).mockImplementation(function (this: MockAgentExecutor) {
      return mockAgentExecutor;
    });

    mockSettingsService = {
      getGlobalSettings: vi.fn().mockResolvedValue({}),
      getCredentials: vi.fn().mockResolvedValue({}),
      getProjectSettings: vi.fn().mockResolvedValue({}),
    };

    // Helper to access the private createRunAgentFn via factory creation
    facade = AutoModeServiceFacade.create('/project', {
      events: { on: vi.fn(), emit: vi.fn() } as any,
      settingsService: mockSettingsService,
      sharedServices: {
        eventBus: { emitAutoModeEvent: vi.fn() } as any,
        worktreeResolver: { getCurrentBranch: vi.fn().mockResolvedValue('main') } as any,
        concurrencyManager: {
          isRunning: vi.fn().mockReturnValue(false),
          getRunningFeature: vi.fn().mockReturnValue(null),
        } as any,
      } as any,
    });
  });

  it('should resolve provider by providerId and pass to AgentExecutor', async () => {
    // 1. Setup mocks
    const mockProvider = { getName: () => 'mock-provider' };
    (ProviderFactory.getProviderForModel as any).mockReturnValue(mockProvider);

    const mockClaudeProvider = { id: 'zai-1', name: 'Zai' };
    const mockCredentials = { apiKey: 'test-key' };
    (settingsHelpers.resolveProviderContext as any).mockResolvedValue({
      provider: mockClaudeProvider,
      credentials: mockCredentials,
      resolvedModel: undefined,
    });

    const runAgentFn = (facade as any).executionService.runAgentFn;

    // 2. Execute
    await runAgentFn(
      '/workdir',
      'feature-1',
      'prompt',
      new AbortController(),
      '/project',
      [],
      'model-1',
      {
        providerId: 'zai-1',
      }
    );

    // 3. Verify
    expect(settingsHelpers.resolveProviderContext).toHaveBeenCalledWith(
      mockSettingsService,
      'model-1',
      'zai-1',
      '[AutoModeFacade]'
    );

    expect(mockAgentExecutor.execute).toHaveBeenCalledWith(
      expect.objectContaining({
        claudeCompatibleProvider: mockClaudeProvider,
        credentials: mockCredentials,
        model: 'model-1', // Original model ID
      }),
      expect.any(Object)
    );
  });

  it('should fallback to model-based lookup if providerId is not provided', async () => {
    const mockProvider = { getName: () => 'mock-provider' };
    (ProviderFactory.getProviderForModel as any).mockReturnValue(mockProvider);

    const mockClaudeProvider = { id: 'zai-model', name: 'Zai Model' };
    (settingsHelpers.resolveProviderContext as any).mockResolvedValue({
      provider: mockClaudeProvider,
      credentials: { apiKey: 'model-key' },
      resolvedModel: 'resolved-model-1',
    });

    const runAgentFn = (facade as any).executionService.runAgentFn;

    await runAgentFn(
      '/workdir',
      'feature-1',
      'prompt',
      new AbortController(),
      '/project',
      [],
      'model-1',
      {
        // no providerId
      }
    );

    expect(settingsHelpers.resolveProviderContext).toHaveBeenCalledWith(
      mockSettingsService,
      'model-1',
      undefined,
      '[AutoModeFacade]'
    );

    expect(mockAgentExecutor.execute).toHaveBeenCalledWith(
      expect.objectContaining({
        claudeCompatibleProvider: mockClaudeProvider,
      }),
      expect.any(Object)
    );
  });

  it('should use resolvedModel from provider config for createAutoModeOptions if it maps to a Claude model', async () => {
    const mockProvider = { getName: () => 'mock-provider' };
    (ProviderFactory.getProviderForModel as any).mockReturnValue(mockProvider);

    const mockClaudeProvider = {
      id: 'zai-1',
      name: 'Zai',
      models: [{ id: 'custom-model-1', mapsToClaudeModel: 'claude-3-opus' }],
    };
    (settingsHelpers.resolveProviderContext as any).mockResolvedValue({
      provider: mockClaudeProvider,
      credentials: { apiKey: 'test-key' },
      resolvedModel: 'claude-3-5-opus',
    });

    const runAgentFn = (facade as any).executionService.runAgentFn;

    await runAgentFn(
      '/workdir',
      'feature-1',
      'prompt',
      new AbortController(),
      '/project',
      [],
      'custom-model-1',
      {
        providerId: 'zai-1',
      }
    );

    // Verify createAutoModeOptions was called with the mapped model
    expect(sdkOptions.createAutoModeOptions).toHaveBeenCalledWith(
      expect.objectContaining({
        model: 'claude-3-5-opus',
      })
    );

    // Verify AgentExecutor.execute still gets the original custom model ID
    expect(mockAgentExecutor.execute).toHaveBeenCalledWith(
      expect.objectContaining({
        model: 'custom-model-1',
      }),
      expect.any(Object)
    );
  });
});

apps/server/tests/unit/services/dev-server-event-types.test.ts (new file, 115 lines)
@@ -0,0 +1,115 @@
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
import { EventEmitter } from 'events';
import path from 'path';
import os from 'os';
import fs from 'fs/promises';
import { spawn } from 'child_process';

// Mock child_process
vi.mock('child_process', () => ({
  spawn: vi.fn(),
  execSync: vi.fn(),
  execFile: vi.fn(),
}));

// Mock secure-fs
vi.mock('@/lib/secure-fs.js', () => ({
  access: vi.fn(),
}));

// Mock net
vi.mock('net', () => ({
  default: {
    createServer: vi.fn(),
  },
  createServer: vi.fn(),
}));

import * as secureFs from '@/lib/secure-fs.js';
import net from 'net';

describe('DevServerService Event Types', () => {
  let testDataDir: string;
  let worktreeDir: string;
  let mockEmitter: EventEmitter;

  beforeEach(async () => {
    vi.clearAllMocks();
    vi.resetModules();

    testDataDir = path.join(os.tmpdir(), `dev-server-events-test-${Date.now()}`);
    worktreeDir = path.join(os.tmpdir(), `dev-server-worktree-events-test-${Date.now()}`);
    await fs.mkdir(testDataDir, { recursive: true });
    await fs.mkdir(worktreeDir, { recursive: true });

    mockEmitter = new EventEmitter();

    vi.mocked(secureFs.access).mockResolvedValue(undefined);

    const mockServer = new EventEmitter() as any;
    mockServer.listen = vi.fn().mockImplementation((port: number, host: string) => {
      process.nextTick(() => mockServer.emit('listening'));
    });
    mockServer.close = vi.fn();
    vi.mocked(net.createServer).mockReturnValue(mockServer);
  });

  afterEach(async () => {
    try {
      await fs.rm(testDataDir, { recursive: true, force: true });
      await fs.rm(worktreeDir, { recursive: true, force: true });
    } catch {
      // Ignore cleanup errors
    }
  });

  it('should emit all required event types during dev server lifecycle', async () => {
    const { getDevServerService } = await import('@/services/dev-server-service.js');
    const service = getDevServerService();
    await service.initialize(testDataDir, mockEmitter as any);

    const mockProcess = createMockProcess();
    vi.mocked(spawn).mockReturnValue(mockProcess as any);

    const emittedEvents: Record<string, any[]> = {
      'dev-server:starting': [],
      'dev-server:started': [],
      'dev-server:url-detected': [],
      'dev-server:output': [],
      'dev-server:stopped': [],
    };

    Object.keys(emittedEvents).forEach((type) => {
      mockEmitter.on(type, (payload) => emittedEvents[type].push(payload));
    });

    // 1. Starting & Started
    await service.startDevServer(worktreeDir, worktreeDir);
    expect(emittedEvents['dev-server:starting'].length).toBe(1);
    expect(emittedEvents['dev-server:started'].length).toBe(1);

    // 2. Output & URL Detected
    mockProcess.stdout.emit('data', Buffer.from('Local: http://localhost:5173/\n'));
    // Throttled output needs a bit of time
    await new Promise((resolve) => setTimeout(resolve, 100));
    expect(emittedEvents['dev-server:output'].length).toBeGreaterThanOrEqual(1);
    expect(emittedEvents['dev-server:url-detected'].length).toBe(1);
    expect(emittedEvents['dev-server:url-detected'][0].url).toBe('http://localhost:5173/');

    // 3. Stopped
    await service.stopDevServer(worktreeDir);
    expect(emittedEvents['dev-server:stopped'].length).toBe(1);
  });
});

// Helper to create a mock child process
function createMockProcess() {
  const mockProcess = new EventEmitter() as any;
  mockProcess.stdout = new EventEmitter();
  mockProcess.stderr = new EventEmitter();
  mockProcess.kill = vi.fn();
  mockProcess.killed = false;
  mockProcess.pid = 12345;
  mockProcess.unref = vi.fn();
  return mockProcess;
}

apps/server/tests/unit/services/dev-server-persistence.test.ts (new file, 240 lines)
@@ -0,0 +1,240 @@
import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
import { EventEmitter } from 'events';
import path from 'path';
import os from 'os';
import fs from 'fs/promises';
import { spawn, execSync } from 'child_process';

// Mock child_process
vi.mock('child_process', () => ({
  spawn: vi.fn(),
  execSync: vi.fn(),
  execFile: vi.fn(),
}));

// Mock secure-fs
vi.mock('@/lib/secure-fs.js', () => ({
  access: vi.fn(),
}));

// Mock net
vi.mock('net', () => ({
  default: {
    createServer: vi.fn(),
  },
  createServer: vi.fn(),
}));

import * as secureFs from '@/lib/secure-fs.js';
import net from 'net';

describe('DevServerService Persistence & Sync', () => {
  let testDataDir: string;
  let worktreeDir: string;
  let mockEmitter: EventEmitter;

  beforeEach(async () => {
    vi.clearAllMocks();
    vi.resetModules();

    testDataDir = path.join(os.tmpdir(), `dev-server-persistence-test-${Date.now()}`);
    worktreeDir = path.join(os.tmpdir(), `dev-server-worktree-test-${Date.now()}`);
    await fs.mkdir(testDataDir, { recursive: true });
    await fs.mkdir(worktreeDir, { recursive: true });

    mockEmitter = new EventEmitter();

    // Default mock for secureFs.access - return resolved (file exists)
    vi.mocked(secureFs.access).mockResolvedValue(undefined);

    // Default mock for net.createServer - port available
    const mockServer = new EventEmitter() as any;
    mockServer.listen = vi.fn().mockImplementation((port: number, host: string) => {
      process.nextTick(() => mockServer.emit('listening'));
    });
    mockServer.close = vi.fn();
    vi.mocked(net.createServer).mockReturnValue(mockServer);

    // Default mock for execSync - no process on port
    vi.mocked(execSync).mockImplementation(() => {
      throw new Error('No process found');
    });
  });

  afterEach(async () => {
    try {
      await fs.rm(testDataDir, { recursive: true, force: true });
      await fs.rm(worktreeDir, { recursive: true, force: true });
    } catch {
      // Ignore cleanup errors
    }
  });

  it('should emit dev-server:starting when startDevServer is called', async () => {
    const { getDevServerService } = await import('@/services/dev-server-service.js');
    const service = getDevServerService();
    await service.initialize(testDataDir, mockEmitter as any);

    const mockProcess = createMockProcess();
    vi.mocked(spawn).mockReturnValue(mockProcess as any);

    const events: any[] = [];
    mockEmitter.on('dev-server:starting', (payload) => events.push(payload));

    await service.startDevServer(worktreeDir, worktreeDir);

    expect(events.length).toBe(1);
    expect(events[0].worktreePath).toBe(worktreeDir);
  });

  it('should prevent concurrent starts for the same worktree', async () => {
    const { getDevServerService } = await import('@/services/dev-server-service.js');
    const service = getDevServerService();
    await service.initialize(testDataDir, mockEmitter as any);

    // Delay spawn to simulate long starting time
    vi.mocked(spawn).mockImplementation(() => {
      const p = createMockProcess();
      // Don't return immediately, simulate some work
      return p as any;
    });

    // Start first one (don't await yet if we want to test concurrency)
    const promise1 = service.startDevServer(worktreeDir, worktreeDir);

    // Try to start second one immediately
    const result2 = await service.startDevServer(worktreeDir, worktreeDir);

    expect(result2.success).toBe(false);
    expect(result2.error).toContain('already starting');

    await promise1;
  });

  it('should persist state to dev-servers.json when started', async () => {
    const { getDevServerService } = await import('@/services/dev-server-service.js');
    const service = getDevServerService();
    await service.initialize(testDataDir, mockEmitter as any);

    const mockProcess = createMockProcess();
    vi.mocked(spawn).mockReturnValue(mockProcess as any);

    await service.startDevServer(worktreeDir, worktreeDir);

    const statePath = path.join(testDataDir, 'dev-servers.json');
    const exists = await fs
      .access(statePath)
      .then(() => true)
      .catch(() => false);
    expect(exists).toBe(true);

    const content = await fs.readFile(statePath, 'utf-8');
    const state = JSON.parse(content);
    expect(state.length).toBe(1);
    expect(state[0].worktreePath).toBe(worktreeDir);
  });

it('should load state from dev-servers.json on initialize', async () => {
|
||||||
|
// 1. Create a fake state file
|
||||||
|
const persistedInfo = [
|
||||||
|
{
|
||||||
|
worktreePath: worktreeDir,
|
||||||
|
allocatedPort: 3005,
|
||||||
|
port: 3005,
|
||||||
|
url: 'http://localhost:3005',
|
||||||
|
startedAt: new Date().toISOString(),
|
||||||
|
urlDetected: true,
|
||||||
|
customCommand: 'npm run dev',
|
||||||
|
},
|
||||||
|
];
|
||||||
|
await fs.writeFile(path.join(testDataDir, 'dev-servers.json'), JSON.stringify(persistedInfo));
|
||||||
|
|
||||||
|
// 2. Mock port as IN USE (so it re-attaches)
|
||||||
|
const mockServer = new EventEmitter() as any;
|
||||||
|
mockServer.listen = vi.fn().mockImplementation((port: number, host: string) => {
|
||||||
|
// Fail to listen = port in use
|
||||||
|
process.nextTick(() => mockServer.emit('error', new Error('EADDRINUSE')));
|
||||||
|
});
|
||||||
|
vi.mocked(net.createServer).mockReturnValue(mockServer);
|
||||||
|
|
||||||
|
const { getDevServerService } = await import('@/services/dev-server-service.js');
|
||||||
|
const service = getDevServerService();
|
||||||
|
await service.initialize(testDataDir, mockEmitter as any);
|
||||||
|
|
||||||
|
expect(service.isRunning(worktreeDir)).toBe(true);
|
||||||
|
const info = service.getServerInfo(worktreeDir);
|
||||||
|
expect(info?.port).toBe(3005);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should prune stale servers from state on initialize if port is available', async () => {
|
||||||
|
// 1. Create a fake state file
|
||||||
|
const persistedInfo = [
|
||||||
|
{
|
||||||
|
worktreePath: worktreeDir,
|
||||||
|
allocatedPort: 3005,
|
||||||
|
port: 3005,
|
||||||
|
url: 'http://localhost:3005',
|
||||||
|
startedAt: new Date().toISOString(),
|
||||||
|
urlDetected: true,
|
||||||
|
},
|
||||||
|
];
|
||||||
|
await fs.writeFile(path.join(testDataDir, 'dev-servers.json'), JSON.stringify(persistedInfo));
|
||||||
|
|
||||||
|
// 2. Mock port as AVAILABLE (so it prunes)
|
||||||
|
const mockServer = new EventEmitter() as any;
|
||||||
|
mockServer.listen = vi.fn().mockImplementation((port: number, host: string) => {
|
||||||
|
process.nextTick(() => mockServer.emit('listening'));
|
||||||
|
});
|
||||||
|
mockServer.close = vi.fn();
|
||||||
|
vi.mocked(net.createServer).mockReturnValue(mockServer);
|
||||||
|
|
||||||
|
const { getDevServerService } = await import('@/services/dev-server-service.js');
|
||||||
|
const service = getDevServerService();
|
||||||
|
await service.initialize(testDataDir, mockEmitter as any);
|
||||||
|
|
||||||
|
expect(service.isRunning(worktreeDir)).toBe(false);
|
||||||
|
|
||||||
|
// Give it a moment to complete the pruning saveState
|
||||||
|
await new Promise((resolve) => setTimeout(resolve, 100));
|
||||||
|
|
||||||
|
// Check if file was updated
|
||||||
|
const content = await fs.readFile(path.join(testDataDir, 'dev-servers.json'), 'utf-8');
|
||||||
|
const state = JSON.parse(content);
|
||||||
|
expect(state.length).toBe(0);
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should update persisted state when URL is detected', async () => {
|
||||||
|
const { getDevServerService } = await import('@/services/dev-server-service.js');
|
||||||
|
const service = getDevServerService();
|
||||||
|
await service.initialize(testDataDir, mockEmitter as any);
|
||||||
|
|
||||||
|
const mockProcess = createMockProcess();
|
||||||
|
vi.mocked(spawn).mockReturnValue(mockProcess as any);
|
||||||
|
|
||||||
|
await service.startDevServer(worktreeDir, worktreeDir);
|
||||||
|
|
||||||
|
// Simulate output with URL
|
||||||
|
mockProcess.stdout.emit('data', Buffer.from('Local: http://localhost:5555/\n'));
|
||||||
|
|
||||||
|
// Give it a moment to process and save (needs to wait for saveQueue)
|
||||||
|
await new Promise((resolve) => setTimeout(resolve, 300));
|
||||||
|
|
||||||
|
const content = await fs.readFile(path.join(testDataDir, 'dev-servers.json'), 'utf-8');
|
||||||
|
const state = JSON.parse(content);
|
||||||
|
expect(state[0].url).toBe('http://localhost:5555/');
|
||||||
|
expect(state[0].port).toBe(5555);
|
||||||
|
expect(state[0].urlDetected).toBe(true);
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
// Helper to create a mock child process
|
||||||
|
function createMockProcess() {
|
||||||
|
const mockProcess = new EventEmitter() as any;
|
||||||
|
mockProcess.stdout = new EventEmitter();
|
||||||
|
mockProcess.stderr = new EventEmitter();
|
||||||
|
mockProcess.kill = vi.fn();
|
||||||
|
mockProcess.killed = false;
|
||||||
|
mockProcess.pid = 12345;
|
||||||
|
mockProcess.unref = vi.fn();
|
||||||
|
return mockProcess;
|
||||||
|
}
|
||||||
@@ -458,6 +458,21 @@ describe('execution-service.ts', () => {
    expect(callArgs[6]).toBe('claude-sonnet-4');
  });

  it('passes providerId to runAgentFn when present on feature', async () => {
    const featureWithProvider: Feature = {
      ...testFeature,
      providerId: 'zai-provider-1',
    };
    vi.mocked(mockLoadFeatureFn).mockResolvedValue(featureWithProvider);

    await service.executeFeature('/test/project', 'feature-1');

    expect(mockRunAgentFn).toHaveBeenCalled();
    const callArgs = mockRunAgentFn.mock.calls[0];
    const options = callArgs[7];
    expect(options.providerId).toBe('zai-provider-1');
  });

  it('executes pipeline after agent completes', async () => {
    const pipelineSteps = [{ id: 'step-1', name: 'Step 1', order: 1, instructions: 'Do step 1' }];
    vi.mocked(pipelineService.getPipelineConfig).mockResolvedValue({

@@ -1316,16 +1331,19 @@ describe('execution-service.ts', () => {
    );
  });

  it('emits error and does not execute agent when worktree is not found in worktree mode', async () => {
    vi.mocked(mockWorktreeResolver.findWorktreeForBranch).mockResolvedValue(null);

    await service.executeFeature('/test/project', 'feature-1', true);

    expect(mockRunAgentFn).not.toHaveBeenCalled();
    expect(mockEventBus.emitAutoModeEvent).toHaveBeenCalledWith(
      'auto_mode_error',
      expect.objectContaining({
        featureId: 'feature-1',
        error: 'Worktree enabled but no worktree found for feature branch "feature/test-1".',
      })
    );
  });

  it('skips worktree resolution when useWorktrees is false', async () => {
@@ -0,0 +1,356 @@
/**
 * Tests for providerId passthrough in PipelineOrchestrator
 * Verifies that feature.providerId is forwarded to runAgentFn in both
 * executePipeline (step execution) and executeTestStep (test fix) contexts.
 */

import { describe, it, expect, vi, beforeEach, type Mock } from 'vitest';
import type { Feature, PipelineStep } from '@automaker/types';
import {
  PipelineOrchestrator,
  type PipelineContext,
  type UpdateFeatureStatusFn,
  type BuildFeaturePromptFn,
  type ExecuteFeatureFn,
  type RunAgentFn,
} from '../../../src/services/pipeline-orchestrator.js';
import type { TypedEventBus } from '../../../src/services/typed-event-bus.js';
import type { FeatureStateManager } from '../../../src/services/feature-state-manager.js';
import type { AgentExecutor } from '../../../src/services/agent-executor.js';
import type { WorktreeResolver } from '../../../src/services/worktree-resolver.js';
import type { SettingsService } from '../../../src/services/settings-service.js';
import type { ConcurrencyManager } from '../../../src/services/concurrency-manager.js';
import type { TestRunnerService } from '../../../src/services/test-runner-service.js';
import * as secureFs from '../../../src/lib/secure-fs.js';
import { getFeatureDir } from '@automaker/platform';
import {
  getPromptCustomization,
  getAutoLoadClaudeMdSetting,
  filterClaudeMdFromContext,
} from '../../../src/lib/settings-helpers.js';

// Mock pipelineService
vi.mock('../../../src/services/pipeline-service.js', () => ({
  pipelineService: {
    isPipelineStatus: vi.fn(),
    getStepIdFromStatus: vi.fn(),
    getPipelineConfig: vi.fn(),
    getNextStatus: vi.fn(),
  },
}));

// Mock merge-service
vi.mock('../../../src/services/merge-service.js', () => ({
  performMerge: vi.fn().mockResolvedValue({ success: true }),
}));

// Mock secureFs
vi.mock('../../../src/lib/secure-fs.js', () => ({
  readFile: vi.fn(),
  access: vi.fn(),
}));

// Mock settings helpers
vi.mock('../../../src/lib/settings-helpers.js', () => ({
  getPromptCustomization: vi.fn().mockResolvedValue({
    taskExecution: {
      implementationInstructions: 'test instructions',
      playwrightVerificationInstructions: 'test playwright',
    },
  }),
  getAutoLoadClaudeMdSetting: vi.fn().mockResolvedValue(true),
  getUseClaudeCodeSystemPromptSetting: vi.fn().mockResolvedValue(true),
  filterClaudeMdFromContext: vi.fn().mockReturnValue('context prompt'),
}));

// Mock validateWorkingDirectory
vi.mock('../../../src/lib/sdk-options.js', () => ({
  validateWorkingDirectory: vi.fn(),
}));

// Mock platform
vi.mock('@automaker/platform', () => ({
  getFeatureDir: vi
    .fn()
    .mockImplementation(
      (projectPath: string, featureId: string) => `${projectPath}/.automaker/features/${featureId}`
    ),
}));

// Mock model-resolver
vi.mock('@automaker/model-resolver', () => ({
  resolveModelString: vi.fn().mockReturnValue('claude-sonnet-4'),
  DEFAULT_MODELS: { claude: 'claude-sonnet-4' },
}));

describe('PipelineOrchestrator - providerId passthrough', () => {
  let mockEventBus: TypedEventBus;
  let mockFeatureStateManager: FeatureStateManager;
  let mockAgentExecutor: AgentExecutor;
  let mockTestRunnerService: TestRunnerService;
  let mockWorktreeResolver: WorktreeResolver;
  let mockConcurrencyManager: ConcurrencyManager;
  let mockUpdateFeatureStatusFn: UpdateFeatureStatusFn;
  let mockLoadContextFilesFn: Mock;
  let mockBuildFeaturePromptFn: BuildFeaturePromptFn;
  let mockExecuteFeatureFn: ExecuteFeatureFn;
  let mockRunAgentFn: RunAgentFn;
  let orchestrator: PipelineOrchestrator;

  const testSteps: PipelineStep[] = [
    {
      id: 'step-1',
      name: 'Step 1',
      order: 1,
      instructions: 'Do step 1',
      colorClass: 'blue',
      createdAt: '',
      updatedAt: '',
    },
  ];

  const createFeatureWithProvider = (providerId?: string): Feature => ({
    id: 'feature-1',
    title: 'Test Feature',
    category: 'test',
    description: 'Test description',
    status: 'pipeline_step-1',
    branchName: 'feature/test-1',
    providerId,
  });

  beforeEach(() => {
    vi.clearAllMocks();

    mockEventBus = {
      emitAutoModeEvent: vi.fn(),
      getUnderlyingEmitter: vi.fn().mockReturnValue({}),
    } as unknown as TypedEventBus;

    mockFeatureStateManager = {
      updateFeatureStatus: vi.fn().mockResolvedValue(undefined),
      loadFeature: vi.fn().mockResolvedValue(createFeatureWithProvider()),
    } as unknown as FeatureStateManager;

    mockAgentExecutor = {
      execute: vi.fn().mockResolvedValue({ success: true }),
    } as unknown as AgentExecutor;

    mockTestRunnerService = {
      startTests: vi
        .fn()
        .mockResolvedValue({ success: true, result: { sessionId: 'test-session-1' } }),
      getSession: vi.fn().mockReturnValue({
        status: 'passed',
        exitCode: 0,
        startedAt: new Date(),
        finishedAt: new Date(),
      }),
      getSessionOutput: vi
        .fn()
        .mockReturnValue({ success: true, result: { output: 'All tests passed' } }),
    } as unknown as TestRunnerService;

    mockWorktreeResolver = {
      findWorktreeForBranch: vi.fn().mockResolvedValue('/test/worktree'),
      getCurrentBranch: vi.fn().mockResolvedValue('main'),
    } as unknown as WorktreeResolver;

    mockConcurrencyManager = {
      acquire: vi.fn().mockImplementation(({ featureId, isAutoMode }) => ({
        featureId,
        projectPath: '/test/project',
        abortController: new AbortController(),
        branchName: null,
        worktreePath: null,
        isAutoMode: isAutoMode ?? false,
      })),
      release: vi.fn(),
      getRunningFeature: vi.fn().mockReturnValue(undefined),
    } as unknown as ConcurrencyManager;

    mockUpdateFeatureStatusFn = vi.fn().mockResolvedValue(undefined);
    mockLoadContextFilesFn = vi.fn().mockResolvedValue({ contextPrompt: 'test context' });
    mockBuildFeaturePromptFn = vi.fn().mockReturnValue('Feature prompt content');
    mockExecuteFeatureFn = vi.fn().mockResolvedValue(undefined);
    mockRunAgentFn = vi.fn().mockResolvedValue(undefined);

    vi.mocked(secureFs.readFile).mockResolvedValue('Previous context');
    vi.mocked(secureFs.access).mockResolvedValue(undefined);
    vi.mocked(getFeatureDir).mockImplementation(
      (projectPath: string, featureId: string) => `${projectPath}/.automaker/features/${featureId}`
    );
    vi.mocked(getPromptCustomization).mockResolvedValue({
      taskExecution: {
        implementationInstructions: 'test instructions',
        playwrightVerificationInstructions: 'test playwright',
      },
    } as any);
    vi.mocked(getAutoLoadClaudeMdSetting).mockResolvedValue(true);
    vi.mocked(filterClaudeMdFromContext).mockReturnValue('context prompt');

    orchestrator = new PipelineOrchestrator(
      mockEventBus,
      mockFeatureStateManager,
      mockAgentExecutor,
      mockTestRunnerService,
      mockWorktreeResolver,
      mockConcurrencyManager,
      null,
      mockUpdateFeatureStatusFn,
      mockLoadContextFilesFn,
      mockBuildFeaturePromptFn,
      mockExecuteFeatureFn,
      mockRunAgentFn
    );
  });

  describe('executePipeline', () => {
    it('should pass providerId to runAgentFn options when feature has providerId', async () => {
      const feature = createFeatureWithProvider('moonshot-ai');
      const context: PipelineContext = {
        projectPath: '/test/project',
        featureId: 'feature-1',
        feature,
        steps: testSteps,
        workDir: '/test/project',
        worktreePath: null,
        branchName: 'feature/test-1',
        abortController: new AbortController(),
        autoLoadClaudeMd: true,
        testAttempts: 0,
        maxTestAttempts: 5,
      };

      await orchestrator.executePipeline(context);

      expect(mockRunAgentFn).toHaveBeenCalledTimes(1);
      const options = mockRunAgentFn.mock.calls[0][7];
      expect(options).toHaveProperty('providerId', 'moonshot-ai');
    });

    it('should pass undefined providerId when feature has no providerId', async () => {
      const feature = createFeatureWithProvider(undefined);
      const context: PipelineContext = {
        projectPath: '/test/project',
        featureId: 'feature-1',
        feature,
        steps: testSteps,
        workDir: '/test/project',
        worktreePath: null,
        branchName: 'feature/test-1',
        abortController: new AbortController(),
        autoLoadClaudeMd: true,
        testAttempts: 0,
        maxTestAttempts: 5,
      };

      await orchestrator.executePipeline(context);

      expect(mockRunAgentFn).toHaveBeenCalledTimes(1);
      const options = mockRunAgentFn.mock.calls[0][7];
      expect(options).toHaveProperty('providerId', undefined);
    });

    it('should pass status alongside providerId in options', async () => {
      const feature = createFeatureWithProvider('zhipu');
      const context: PipelineContext = {
        projectPath: '/test/project',
        featureId: 'feature-1',
        feature,
        steps: testSteps,
        workDir: '/test/project',
        worktreePath: null,
        branchName: 'feature/test-1',
        abortController: new AbortController(),
        autoLoadClaudeMd: true,
        testAttempts: 0,
        maxTestAttempts: 5,
      };

      await orchestrator.executePipeline(context);

      const options = mockRunAgentFn.mock.calls[0][7];
      expect(options).toHaveProperty('providerId', 'zhipu');
      expect(options).toHaveProperty('status');
    });
  });

  describe('executeTestStep', () => {
    it('should pass providerId in test fix agent options when tests fail', async () => {
      vi.mocked(mockTestRunnerService.getSession)
        .mockReturnValueOnce({
          status: 'failed',
          exitCode: 1,
          startedAt: new Date(),
          finishedAt: new Date(),
        } as never)
        .mockReturnValueOnce({
          status: 'passed',
          exitCode: 0,
          startedAt: new Date(),
          finishedAt: new Date(),
        } as never);

      const feature = createFeatureWithProvider('custom-provider');
      const context: PipelineContext = {
        projectPath: '/test/project',
        featureId: 'feature-1',
        feature,
        steps: testSteps,
        workDir: '/test/project',
        worktreePath: null,
        branchName: 'feature/test-1',
        abortController: new AbortController(),
        autoLoadClaudeMd: true,
        testAttempts: 0,
        maxTestAttempts: 5,
      };

      await orchestrator.executeTestStep(context, 'npm test');

      // The fix agent should receive providerId
      expect(mockRunAgentFn).toHaveBeenCalledTimes(1);
      const options = mockRunAgentFn.mock.calls[0][7];
      expect(options).toHaveProperty('providerId', 'custom-provider');
    }, 15000);

    it('should pass thinkingLevel in test fix agent options', async () => {
      vi.mocked(mockTestRunnerService.getSession)
        .mockReturnValueOnce({
          status: 'failed',
          exitCode: 1,
          startedAt: new Date(),
          finishedAt: new Date(),
        } as never)
        .mockReturnValueOnce({
          status: 'passed',
          exitCode: 0,
          startedAt: new Date(),
          finishedAt: new Date(),
        } as never);

      const feature = createFeatureWithProvider('moonshot-ai');
      feature.thinkingLevel = 'high';
      const context: PipelineContext = {
        projectPath: '/test/project',
        featureId: 'feature-1',
        feature,
        steps: testSteps,
        workDir: '/test/project',
        worktreePath: null,
        branchName: 'feature/test-1',
        abortController: new AbortController(),
        autoLoadClaudeMd: true,
        testAttempts: 0,
        maxTestAttempts: 5,
      };

      await orchestrator.executeTestStep(context, 'npm test');

      const options = mockRunAgentFn.mock.calls[0][7];
      expect(options).toHaveProperty('thinkingLevel', 'high');
      expect(options).toHaveProperty('providerId', 'moonshot-ai');
    }, 15000);
  });
});
@@ -0,0 +1,302 @@
/**
 * Tests for status + providerId coexistence in PipelineOrchestrator options.
 *
 * During rebase onto upstream/v1.0.0rc, a merge conflict arose where
 * upstream added `status: currentStatus` and the incoming branch added
 * `providerId: feature.providerId`. The conflict resolution kept BOTH fields.
 *
 * This test validates that both fields coexist correctly in the options
 * object passed to runAgentFn in both executePipeline and executeTestStep.
 */

import { describe, it, expect, vi, beforeEach } from 'vitest';
import type { Feature, PipelineStep } from '@automaker/types';
import {
  PipelineOrchestrator,
  type PipelineContext,
  type UpdateFeatureStatusFn,
  type BuildFeaturePromptFn,
  type ExecuteFeatureFn,
  type RunAgentFn,
} from '../../../src/services/pipeline-orchestrator.js';
import type { TypedEventBus } from '../../../src/services/typed-event-bus.js';
import type { FeatureStateManager } from '../../../src/services/feature-state-manager.js';
import type { AgentExecutor } from '../../../src/services/agent-executor.js';
import type { WorktreeResolver } from '../../../src/services/worktree-resolver.js';
import type { ConcurrencyManager } from '../../../src/services/concurrency-manager.js';
import type { TestRunnerService } from '../../../src/services/test-runner-service.js';
import * as secureFs from '../../../src/lib/secure-fs.js';
import { getFeatureDir } from '@automaker/platform';
import {
  getPromptCustomization,
  getAutoLoadClaudeMdSetting,
  filterClaudeMdFromContext,
} from '../../../src/lib/settings-helpers.js';

vi.mock('../../../src/services/pipeline-service.js', () => ({
  pipelineService: {
    isPipelineStatus: vi.fn(),
    getStepIdFromStatus: vi.fn(),
    getPipelineConfig: vi.fn(),
    getNextStatus: vi.fn(),
  },
}));

vi.mock('../../../src/services/merge-service.js', () => ({
  performMerge: vi.fn().mockResolvedValue({ success: true }),
}));

vi.mock('../../../src/lib/secure-fs.js', () => ({
  readFile: vi.fn(),
  access: vi.fn(),
}));

vi.mock('../../../src/lib/settings-helpers.js', () => ({
  getPromptCustomization: vi.fn().mockResolvedValue({
    taskExecution: {
      implementationInstructions: 'test instructions',
      playwrightVerificationInstructions: 'test playwright',
    },
  }),
  getAutoLoadClaudeMdSetting: vi.fn().mockResolvedValue(true),
  getUseClaudeCodeSystemPromptSetting: vi.fn().mockResolvedValue(true),
  filterClaudeMdFromContext: vi.fn().mockReturnValue('context prompt'),
}));

vi.mock('../../../src/lib/sdk-options.js', () => ({
  validateWorkingDirectory: vi.fn(),
}));

vi.mock('@automaker/platform', () => ({
  getFeatureDir: vi
    .fn()
    .mockImplementation(
      (projectPath: string, featureId: string) => `${projectPath}/.automaker/features/${featureId}`
    ),
}));

vi.mock('@automaker/model-resolver', () => ({
  resolveModelString: vi.fn().mockReturnValue('claude-sonnet-4'),
  DEFAULT_MODELS: { claude: 'claude-sonnet-4' },
}));

describe('PipelineOrchestrator - status and providerId coexistence', () => {
  let mockRunAgentFn: RunAgentFn;
  let orchestrator: PipelineOrchestrator;

  const testSteps: PipelineStep[] = [
    {
      id: 'implement',
      name: 'Implement Feature',
      order: 1,
      instructions: 'Implement the feature',
      colorClass: 'blue',
      createdAt: '',
      updatedAt: '',
    },
  ];

  const createFeature = (overrides: Partial<Feature> = {}): Feature => ({
    id: 'feature-1',
    title: 'Test Feature',
    category: 'test',
    description: 'Test description',
    status: 'pipeline_implement',
    branchName: 'feature/test-1',
    providerId: 'moonshot-ai',
    thinkingLevel: 'medium',
    reasoningEffort: 'high',
    ...overrides,
  });

  const createContext = (feature: Feature): PipelineContext => ({
    projectPath: '/test/project',
    featureId: feature.id,
    feature,
    steps: testSteps,
    workDir: '/test/project',
    worktreePath: null,
    branchName: feature.branchName ?? 'main',
    abortController: new AbortController(),
    autoLoadClaudeMd: true,
    testAttempts: 0,
    maxTestAttempts: 5,
  });

  beforeEach(() => {
    vi.clearAllMocks();
    mockRunAgentFn = vi.fn().mockResolvedValue(undefined);

    vi.mocked(secureFs.readFile).mockResolvedValue('Previous context');
    vi.mocked(secureFs.access).mockResolvedValue(undefined);
    vi.mocked(getFeatureDir).mockImplementation(
      (projectPath: string, featureId: string) => `${projectPath}/.automaker/features/${featureId}`
    );
    vi.mocked(getPromptCustomization).mockResolvedValue({
      taskExecution: {
        implementationInstructions: 'test instructions',
        playwrightVerificationInstructions: 'test playwright',
      },
    } as any);
    vi.mocked(getAutoLoadClaudeMdSetting).mockResolvedValue(true);
    vi.mocked(filterClaudeMdFromContext).mockReturnValue('context prompt');

    const mockEventBus = {
      emitAutoModeEvent: vi.fn(),
      getUnderlyingEmitter: vi.fn().mockReturnValue({}),
    } as unknown as TypedEventBus;

    const mockFeatureStateManager = {
      updateFeatureStatus: vi.fn().mockResolvedValue(undefined),
      loadFeature: vi.fn().mockResolvedValue(createFeature()),
    } as unknown as FeatureStateManager;

    const mockTestRunnerService = {
      startTests: vi
        .fn()
        .mockResolvedValue({ success: true, result: { sessionId: 'test-session-1' } }),
      getSession: vi.fn().mockReturnValue({
        status: 'passed',
        exitCode: 0,
        startedAt: new Date(),
        finishedAt: new Date(),
      }),
      getSessionOutput: vi
        .fn()
        .mockReturnValue({ success: true, result: { output: 'All tests passed' } }),
    } as unknown as TestRunnerService;

    orchestrator = new PipelineOrchestrator(
      mockEventBus,
      mockFeatureStateManager,
      {} as AgentExecutor,
      mockTestRunnerService,
      {
        findWorktreeForBranch: vi.fn().mockResolvedValue('/test/worktree'),
        getCurrentBranch: vi.fn().mockResolvedValue('main'),
      } as unknown as WorktreeResolver,
      {
        acquire: vi.fn().mockImplementation(({ featureId }) => ({
          featureId,
          projectPath: '/test/project',
          abortController: new AbortController(),
          branchName: null,
          worktreePath: null,
          isAutoMode: false,
        })),
        release: vi.fn(),
        getRunningFeature: vi.fn().mockReturnValue(undefined),
      } as unknown as ConcurrencyManager,
      null,
      vi.fn().mockResolvedValue(undefined),
      vi.fn().mockResolvedValue({ contextPrompt: 'test context' }),
      vi.fn().mockReturnValue('Feature prompt content'),
      vi.fn().mockResolvedValue(undefined),
      mockRunAgentFn
    );
  });

  describe('executePipeline - options object', () => {
    it('should pass both status and providerId in options', async () => {
      const feature = createFeature({ providerId: 'moonshot-ai' });
      const context = createContext(feature);

      await orchestrator.executePipeline(context);

      expect(mockRunAgentFn).toHaveBeenCalledTimes(1);
      const options = mockRunAgentFn.mock.calls[0][7];
      expect(options).toHaveProperty('status', 'pipeline_implement');
      expect(options).toHaveProperty('providerId', 'moonshot-ai');
    });

    it('should pass status even when providerId is undefined', async () => {
      const feature = createFeature({ providerId: undefined });
      const context = createContext(feature);

      await orchestrator.executePipeline(context);

      const options = mockRunAgentFn.mock.calls[0][7];
      expect(options).toHaveProperty('status', 'pipeline_implement');
      expect(options).toHaveProperty('providerId', undefined);
    });

    it('should pass thinkingLevel and reasoningEffort alongside status and providerId', async () => {
      const feature = createFeature({
        providerId: 'zhipu',
        thinkingLevel: 'high',
        reasoningEffort: 'medium',
      });
      const context = createContext(feature);

      await orchestrator.executePipeline(context);

      const options = mockRunAgentFn.mock.calls[0][7];
      expect(options).toHaveProperty('status', 'pipeline_implement');
      expect(options).toHaveProperty('providerId', 'zhipu');
      expect(options).toHaveProperty('thinkingLevel', 'high');
      expect(options).toHaveProperty('reasoningEffort', 'medium');
    });
  });

  describe('executeTestStep - options object', () => {
    it('should pass both status and providerId in test fix agent options', async () => {
      const feature = createFeature({
        status: 'running',
        providerId: 'custom-provider',
      });
      const context = createContext(feature);

      const mockTestRunner = orchestrator['testRunnerService'] as any;
      vi.mocked(mockTestRunner.getSession)
        .mockReturnValueOnce({
          status: 'failed',
          exitCode: 1,
          startedAt: new Date(),
          finishedAt: new Date(),
        })
        .mockReturnValueOnce({
          status: 'passed',
          exitCode: 0,
          startedAt: new Date(),
          finishedAt: new Date(),
        });

      await orchestrator.executeTestStep(context, 'npm test');

      expect(mockRunAgentFn).toHaveBeenCalledTimes(1);
      const options = mockRunAgentFn.mock.calls[0][7];
      expect(options).toHaveProperty('status', 'running');
      expect(options).toHaveProperty('providerId', 'custom-provider');
    }, 15000);

    it('should pass feature.status (not currentStatus) in test fix context', async () => {
      const feature = createFeature({
        status: 'pipeline_test',
        providerId: 'moonshot-ai',
      });
      const context = createContext(feature);

      const mockTestRunner = orchestrator['testRunnerService'] as any;
      vi.mocked(mockTestRunner.getSession)
|
||||||
|
.mockReturnValueOnce({
|
||||||
|
status: 'failed',
|
||||||
|
exitCode: 1,
|
||||||
|
startedAt: new Date(),
|
||||||
|
finishedAt: new Date(),
|
||||||
|
})
|
||||||
|
.mockReturnValueOnce({
|
||||||
|
status: 'passed',
|
||||||
|
exitCode: 0,
|
||||||
|
startedAt: new Date(),
|
||||||
|
finishedAt: new Date(),
|
||||||
|
});
|
||||||
|
|
||||||
|
await orchestrator.executeTestStep(context, 'npm test');
|
||||||
|
|
||||||
|
const options = mockRunAgentFn.mock.calls[0][7];
|
||||||
|
// In test fix context, status should come from context.feature.status
|
||||||
|
expect(options).toHaveProperty('status', 'pipeline_test');
|
||||||
|
expect(options).toHaveProperty('providerId', 'moonshot-ai');
|
||||||
|
}, 15000);
|
||||||
|
});
|
||||||
|
});
|
||||||
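The assertions above reach into `mock.calls[0][7]` to grab the options argument of the first invocation. A minimal sketch of how such a recording mock addresses positional arguments — `recordingFn` is a hypothetical stand-in written for illustration, not the vitest API:

```typescript
// Hypothetical stand-in for vi.fn(): records every call's argument list.
type AnyFn = (...args: unknown[]) => unknown;

function recordingFn(): AnyFn & { calls: unknown[][] } {
  const calls: unknown[][] = [];
  const fn = ((...args: unknown[]) => {
    calls.push(args); // remember the positional arguments of this invocation
  }) as AnyFn & { calls: unknown[][] };
  fn.calls = calls;
  return fn;
}

const runAgent = recordingFn();
// If the orchestrator passes the options object as the 8th positional argument...
runAgent('feature', 'ctx', null, null, null, null, null, {
  status: 'pipeline_implement',
  providerId: 'moonshot-ai',
});

// ...then calls[0][7] is that options object, mirroring mock.calls[0][7] above.
const options = runAgent.calls[0][7] as { status: string; providerId: string };
console.log(options.status); // "pipeline_implement"
```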
@@ -107,6 +107,25 @@ branch refs/heads/feature-y
     expect(result).toBe(normalizePath('/Users/dev/project/.worktrees/feature-x'));
   });
+
+  it('should normalize refs/heads and trim when resolving target branch', async () => {
+    mockExecAsync(async () => ({ stdout: porcelainOutput, stderr: '' }));
+
+    const result = await resolver.findWorktreeForBranch(
+      '/Users/dev/project',
+      ' refs/heads/feature-x '
+    );
+
+    expect(result).toBe(normalizePath('/Users/dev/project/.worktrees/feature-x'));
+  });
+
+  it('should normalize remote-style target branch names', async () => {
+    mockExecAsync(async () => ({ stdout: porcelainOutput, stderr: '' }));
+
+    const result = await resolver.findWorktreeForBranch('/Users/dev/project', 'origin/feature-x');
+
+    expect(result).toBe(normalizePath('/Users/dev/project/.worktrees/feature-x'));
+  });
+
   it('should return null when branch not found', async () => {
     mockExecAsync(async () => ({ stdout: porcelainOutput, stderr: '' }));
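The two new tests exercise trimming and prefix stripping when resolving a target branch. A sketch of the normalization behavior they imply — `normalizeBranchRef` is an illustrative helper, not the resolver's actual implementation:

```typescript
// Illustrative helper: reduce various branch spellings to a bare branch name.
function normalizeBranchRef(ref: string): string {
  return ref
    .trim() // ' refs/heads/feature-x ' -> 'refs/heads/feature-x'
    .replace(/^refs\/heads\//, '') // strip full-ref prefix
    .replace(/^origin\//, ''); // strip remote-style prefix
}

console.log(normalizeBranchRef(' refs/heads/feature-x ')); // "feature-x"
console.log(normalizeBranchRef('origin/feature-x')); // "feature-x"
console.log(normalizeBranchRef('feature-x')); // "feature-x"
```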
apps/ui/.gitignore
@@ -38,6 +38,7 @@ yarn-error.log*
 /playwright-report/
 /blob-report/
 /playwright/.cache/
+/tests/.auth/

 # Electron
 /release/
@@ -144,6 +144,9 @@
     "@playwright/test": "1.57.0",
     "@tailwindcss/vite": "4.1.18",
     "@tanstack/router-plugin": "1.141.7",
+    "@testing-library/jest-dom": "^6.9.1",
+    "@testing-library/react": "^16.3.2",
+    "@testing-library/user-event": "^14.6.1",
     "@types/dagre": "0.7.53",
     "@types/node": "22.19.3",
     "@types/react": "19.2.7",
@@ -156,6 +159,7 @@
     "electron-builder": "26.0.12",
     "eslint": "9.39.2",
     "eslint-plugin-react-hooks": "^7.0.1",
+    "jsdom": "^28.1.0",
     "tailwindcss": "4.1.18",
     "tw-animate-css": "1.4.0",
     "typescript": "5.9.3",
@@ -1,28 +1,60 @@
 import { defineConfig, devices } from '@playwright/test';
+import path from 'path';

 const port = process.env.TEST_PORT || 3107;
+
+// PATH that includes common git locations so the E2E server can run git (worktree list, etc.)
+const pathSeparator = process.platform === 'win32' ? ';' : ':';
+const extraPath =
+  process.platform === 'win32'
+    ? [
+        process.env.LOCALAPPDATA && `${process.env.LOCALAPPDATA}\\Programs\\Git\\cmd`,
+        process.env.PROGRAMFILES && `${process.env.PROGRAMFILES}\\Git\\cmd`,
+      ].filter(Boolean)
+    : [
+        '/opt/homebrew/bin',
+        '/usr/local/bin',
+        '/usr/bin',
+        '/home/linuxbrew/.linuxbrew/bin',
+        process.env.HOME && `${process.env.HOME}/.local/bin`,
+      ].filter(Boolean);
+const e2eServerPath = [process.env.PATH, ...extraPath].filter(Boolean).join(pathSeparator);
 const serverPort = process.env.TEST_SERVER_PORT || 3108;
+// When true, no webServer is started; you must run UI (port 3107) and server (3108) yourself.
 const reuseServer = process.env.TEST_REUSE_SERVER === 'true';
-const useExternalBackend = !!process.env.VITE_SERVER_URL;
+// Only skip backend startup when explicitly requested for E2E runs.
+// VITE_SERVER_URL may be set in user shells for local dev and should not affect tests.
+const useExternalBackend = process.env.TEST_USE_EXTERNAL_BACKEND === 'true';
 // Always use mock agent for tests (disables rate limiting, uses mock Claude responses)
 const mockAgent = true;
+
+// Auth state file written by global setup, reused by all tests to skip per-test login
+const AUTH_STATE_PATH = path.join(__dirname, 'tests/.auth/storage-state.json');
+
 export default defineConfig({
   testDir: './tests',
+  // Keep Playwright scoped to E2E specs so Vitest unit files are not executed here.
+  testMatch: '**/*.spec.ts',
+  testIgnore: ['**/unit/**'],
   fullyParallel: true,
   forbidOnly: !!process.env.CI,
-  retries: 0,
-  workers: 1, // Run sequentially to avoid auth conflicts with shared server
-  reporter: 'html',
+  retries: process.env.CI ? 2 : 0,
+  // Use multiple workers for parallelism. CI gets 2 workers (constrained resources),
+  // local runs use 8 workers for faster test execution.
+  workers: process.env.CI ? 2 : 8,
+  reporter: process.env.CI ? 'github' : 'html',
   timeout: 30000,
   use: {
-    baseURL: `http://localhost:${port}`,
+    baseURL: `http://127.0.0.1:${port}`,
     trace: 'on-failure',
     screenshot: 'only-on-failure',
     serviceWorkers: 'block',
+    // Reuse auth state from global setup - avoids per-test login overhead
+    storageState: AUTH_STATE_PATH,
   },
-  // Global setup - authenticate before each test
+  // Global setup - authenticate once and save state for all workers
   globalSetup: require.resolve('./tests/global-setup.ts'),
+  globalTeardown: require.resolve('./tests/global-teardown.ts'),
   projects: [
     {
       name: 'chromium',
@@ -40,13 +72,15 @@ export default defineConfig({
       : [
           {
             command: `cd ../server && npm run dev:test`,
-            url: `http://localhost:${serverPort}/api/health`,
+            url: `http://127.0.0.1:${serverPort}/api/health`,
             // Don't reuse existing server to ensure we use the test API key
             reuseExistingServer: false,
             timeout: 60000,
             env: {
               ...process.env,
               PORT: String(serverPort),
+              // Ensure server can find git in CI/minimal env (worktree list, etc.)
+              PATH: e2eServerPath,
               // Enable mock agent in CI to avoid real API calls
               AUTOMAKER_MOCK_AGENT: mockAgent ? 'true' : 'false',
               // Set a test API key for web mode authentication
@@ -69,7 +103,7 @@ export default defineConfig({
           // Frontend Vite dev server
           {
             command: `npm run dev`,
-            url: `http://localhost:${port}`,
+            url: `http://127.0.0.1:${port}`,
             reuseExistingServer: false,
             timeout: 120000,
             env: {
@@ -81,6 +115,11 @@ export default defineConfig({
               VITE_SKIP_SETUP: 'true',
               // Always skip electron plugin during tests - prevents duplicate server spawning
               VITE_SKIP_ELECTRON: 'true',
+              // Clear VITE_SERVER_URL to force the frontend to use the Vite proxy (/api)
+              // instead of calling the backend directly. Direct calls bypass the proxy and
+              // cause cookie domain mismatches (cookies are bound to 127.0.0.1 but
+              // VITE_SERVER_URL typically uses localhost).
+              VITE_SERVER_URL: '',
             },
           },
         ],
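The config above builds an augmented PATH by filtering out falsy entries and joining with the platform separator. The core of that pattern, isolated into a standalone function — `buildPath` is just for illustration, not a name used in the repo:

```typescript
// Join a base PATH with extra directories, dropping unset entries.
function buildPath(
  base: string | undefined,
  extras: Array<string | false | undefined>,
  separator: string
): string {
  // filter(Boolean) removes undefined/false entries produced by
  // expressions like `process.env.HOME && `${process.env.HOME}/.local/bin``
  return [base, ...extras].filter(Boolean).join(separator);
}

console.log(buildPath('/usr/bin', ['/opt/homebrew/bin', undefined, '/usr/local/bin'], ':'));
// "/usr/bin:/opt/homebrew/bin:/usr/local/bin"
console.log(buildPath(undefined, ['/usr/bin'], ';')); // "/usr/bin"
```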
@@ -10,7 +10,9 @@ const execAsync = promisify(exec);

 const SERVER_PORT = process.env.TEST_SERVER_PORT || 3108;
 const UI_PORT = process.env.TEST_PORT || 3107;
-const USE_EXTERNAL_SERVER = !!process.env.VITE_SERVER_URL;
+// Match Playwright config semantics: only explicit opt-in should skip backend startup/cleanup.
+// VITE_SERVER_URL may exist in local shells and should not implicitly affect test behavior.
+const USE_EXTERNAL_SERVER = process.env.TEST_USE_EXTERNAL_BACKEND === 'true';
 console.log(`[KillTestServers] SERVER_PORT ${SERVER_PORT}`);
 console.log(`[KillTestServers] UI_PORT ${UI_PORT}`);
 async function killProcessOnPort(port) {
@@ -18,6 +18,8 @@ const __dirname = path.dirname(__filename);
 const WORKSPACE_ROOT = path.resolve(__dirname, '../../..');
 const FIXTURE_PATH = path.join(WORKSPACE_ROOT, 'test/fixtures/projectA');
 const SPEC_FILE_PATH = path.join(FIXTURE_PATH, '.automaker/app_spec.txt');
+const CONTEXT_DIR = path.join(FIXTURE_PATH, '.automaker/context');
+const CONTEXT_METADATA_PATH = path.join(CONTEXT_DIR, 'context-metadata.json');
 const SERVER_SETTINGS_PATH = path.join(WORKSPACE_ROOT, 'apps/server/data/settings.json');
 // Create a shared test workspace directory that will be used as default for project creation
 const TEST_WORKSPACE_DIR = path.join(os.tmpdir(), 'automaker-e2e-workspace');
@@ -145,6 +147,14 @@ function setupFixtures() {
   fs.writeFileSync(SPEC_FILE_PATH, SPEC_CONTENT);
   console.log(`Created fixture file: ${SPEC_FILE_PATH}`);
+
+  // Create .automaker/context and context-metadata.json (expected by context view / FS read)
+  if (!fs.existsSync(CONTEXT_DIR)) {
+    fs.mkdirSync(CONTEXT_DIR, { recursive: true });
+    console.log(`Created directory: ${CONTEXT_DIR}`);
+  }
+  fs.writeFileSync(CONTEXT_METADATA_PATH, JSON.stringify({ files: {} }, null, 2));
+  console.log(`Created fixture file: ${CONTEXT_METADATA_PATH}`);
+
   // Reset server settings.json to a clean state for E2E tests
   const settingsDir = path.dirname(SERVER_SETTINGS_PATH);
   if (!fs.existsSync(settingsDir)) {
@@ -177,7 +177,7 @@ export function FileBrowserDialog({
       onSelect(currentPath);
       onOpenChange(false);
     }
-  }, [currentPath, onSelect, onOpenChange]);
+  }, [currentPath, onSelect, onOpenChange, addRecentFolder]);

   // Handle Command/Ctrl+Enter keyboard shortcut to select current folder
   useEffect(() => {
@@ -37,7 +37,7 @@ import { Switch } from '@/components/ui/switch';
 import { Label } from '@/components/ui/label';
 import { Spinner } from '@/components/ui/spinner';
 import { Markdown } from '@/components/ui/markdown';
-import { cn, modelSupportsThinking, generateUUID } from '@/lib/utils';
+import { cn, generateUUID, normalizeModelEntry } from '@/lib/utils';
 import { useAppStore } from '@/store/app-store';
 import { useGitHubPRReviewComments } from '@/hooks/queries';
 import { useCreateFeature, useResolveReviewThread } from '@/hooks/mutations';
@@ -45,7 +45,7 @@ import { toast } from 'sonner';
 import type { PRReviewComment } from '@/lib/electron';
 import type { Feature } from '@/store/app-store';
 import type { PhaseModelEntry } from '@automaker/types';
-import { supportsReasoningEffort, normalizeThinkingLevelForModel } from '@automaker/types';
+import { normalizeThinkingLevelForModel } from '@automaker/types';
 import { resolveModelString } from '@automaker/model-resolver';
 import { PhaseModelSelector } from '@/components/views/settings-view/model-defaults';

@@ -62,6 +62,8 @@ export interface PRCommentResolutionPRInfo {
   title: string;
   /** The branch name (headRefName) associated with this PR, used to assign features to the correct worktree */
   headRefName?: string;
+  /** The URL of the PR, used to set prUrl on created features */
+  url?: string;
 }

 interface PRCommentResolutionDialogProps {
@@ -730,14 +732,9 @@ export function PRCommentResolutionDialog({

     const selectedComments = comments.filter((c) => selectedIds.has(c.id));

-    // Resolve model settings from the current model entry
-    const selectedModel = resolveModelString(modelEntry.model);
-    const normalizedThinking = modelSupportsThinking(selectedModel)
-      ? modelEntry.thinkingLevel || 'none'
-      : 'none';
-    const normalizedReasoning = supportsReasoningEffort(selectedModel)
-      ? modelEntry.reasoningEffort || 'none'
-      : 'none';
+    // Resolve and normalize model settings
+    const normalizedEntry = normalizeModelEntry(modelEntry);
+    const selectedModel = resolveModelString(normalizedEntry.model);

     setIsCreating(true);
     setCreationErrors([]);
@@ -753,8 +750,13 @@ export function PRCommentResolutionDialog({
         steps: [],
         status: 'backlog',
         model: selectedModel,
-        thinkingLevel: normalizedThinking,
-        reasoningEffort: normalizedReasoning,
+        thinkingLevel: normalizedEntry.thinkingLevel,
+        reasoningEffort: normalizedEntry.reasoningEffort,
+        providerId: normalizedEntry.providerId,
+        planningMode: 'skip',
+        requirePlanApproval: false,
+        dependencies: [],
+        ...(pr.url ? { prUrl: pr.url } : {}),
         // Associate feature with the PR's branch so it appears on the correct worktree
         ...(pr.headRefName ? { branchName: pr.headRefName } : {}),
       };
@@ -779,8 +781,13 @@ export function PRCommentResolutionDialog({
         steps: [],
         status: 'backlog',
         model: selectedModel,
-        thinkingLevel: normalizedThinking,
-        reasoningEffort: normalizedReasoning,
+        thinkingLevel: normalizedEntry.thinkingLevel,
+        reasoningEffort: normalizedEntry.reasoningEffort,
+        providerId: normalizedEntry.providerId,
+        planningMode: 'skip',
+        requirePlanApproval: false,
+        dependencies: [],
+        ...(pr.url ? { prUrl: pr.url } : {}),
         // Associate feature with the PR's branch so it appears on the correct worktree
         ...(pr.headRefName ? { branchName: pr.headRefName } : {}),
       };
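Both feature payloads above use the `...(cond ? { key } : {})` spread so optional keys are omitted entirely rather than set to `undefined`. The pattern in isolation — the `pr` object here is illustrative:

```typescript
// Conditionally include keys: the key is absent when the source value is falsy.
const pr: { url?: string; headRefName?: string } = { url: 'https://example.test/pr/1' };

const feature = {
  status: 'backlog',
  ...(pr.url ? { prUrl: pr.url } : {}),
  ...(pr.headRefName ? { branchName: pr.headRefName } : {}),
};

console.log('prUrl' in feature); // true
console.log('branchName' in feature); // false - key omitted, not set to undefined
```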
@@ -67,7 +67,7 @@ export function useNavigation({
   hideContext,
   hideTerminal,
   currentProject,
-  projects,
+  projects: _projects,
   projectHistory,
   navigate,
   toggleSidebar,
@@ -325,7 +325,6 @@ export function useNavigation({
     currentProject,
     navigate,
     toggleSidebar,
-    projects.length,
     handleOpenFolder,
     projectHistory.length,
     cyclePrevProject,
@@ -48,12 +48,13 @@ export function ModelOverrideTrigger({
   const globalDefault = phaseModels[phase];
   const normalizedGlobal = normalizeEntry(globalDefault);

-  // Compare models (and thinking levels if both have them)
+  // Compare models, thinking levels, and provider IDs
   const modelsMatch = entry.model === normalizedGlobal.model;
   const thinkingMatch =
     (entry.thinkingLevel || 'none') === (normalizedGlobal.thinkingLevel || 'none');
+  const providerMatch = entry.providerId === normalizedGlobal.providerId;

-  if (modelsMatch && thinkingMatch) {
+  if (modelsMatch && thinkingMatch && providerMatch) {
     onModelChange(null); // Clear override
   } else {
     onModelChange(entry); // Set override
@@ -244,6 +244,7 @@ export function DescriptionImageDropZone({
       onTextFilesChange,
       previewImages,
       saveImageToTemp,
+      setPreviewImages,
     ]
   );

@@ -309,7 +310,7 @@ export function DescriptionImageDropZone({
         return newMap;
       });
     },
-    [images, onImagesChange]
+    [images, onImagesChange, setPreviewImages]
   );

   const removeTextFile = useCallback(
@@ -384,7 +384,8 @@ export function GitDiffPanel({
   const queryError = useWorktrees ? worktreeError : gitError;

   // Extract files, diff content, and merge state from the data
-  const files: FileStatus[] = diffsData?.files ?? [];
+  // Use useMemo to stabilize the files array reference to prevent unnecessary re-renders
+  const files = useMemo(() => diffsData?.files ?? [], [diffsData?.files]);
   const diffContent = diffsData?.diff ?? '';
   const mergeState: MergeStateInfo | undefined = diffsData?.mergeState;
   const error = queryError
@@ -584,7 +585,7 @@ export function GitDiffPanel({
       () => setStagingInProgress(new Set(allPaths)),
       () => setStagingInProgress(new Set())
     );
-  }, [worktreePath, projectPath, useWorktrees, enableStaging, files, executeStagingAction]);
+  }, [worktreePath, useWorktrees, enableStaging, files, executeStagingAction]);

   const handleUnstageAll = useCallback(async () => {
     const stagedFiles = files.filter((f) => {
@@ -607,7 +608,7 @@ export function GitDiffPanel({
       () => setStagingInProgress(new Set(allPaths)),
       () => setStagingInProgress(new Set())
     );
-  }, [worktreePath, projectPath, useWorktrees, enableStaging, files, executeStagingAction]);
+  }, [worktreePath, useWorktrees, enableStaging, files, executeStagingAction]);

   // Compute merge summary
   const mergeSummary = useMemo(() => {
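The `?? []` fallback above creates a fresh array object on every render, which defeats reference-based dependency comparisons; memoizing by the source value keeps the reference stable. A framework-free sketch of the difference — the tiny `memoLast` cache loosely mimics what `useMemo` achieves and is not React's real implementation:

```typescript
const data: { files?: string[] } = {};

// Unstable: a new empty array object is created on each call.
function filesUnstable(): string[] {
  return data.files ?? [];
}

// Stable: cache the last computed value keyed by the input reference,
// loosely mimicking useMemo(() => data.files ?? [], [data.files]).
function memoLast<T, R>(compute: (input: T) => R) {
  let lastInput: T | undefined;
  let lastResult: R | undefined;
  let primed = false;
  return (input: T): R => {
    if (!primed || input !== lastInput) {
      lastInput = input;
      lastResult = compute(input);
      primed = true;
    }
    return lastResult as R;
  };
}

const filesStable = memoLast((files: string[] | undefined) => files ?? []);

console.log(filesUnstable() === filesUnstable()); // false - downstream hooks re-fire
console.log(filesStable(data.files) === filesStable(data.files)); // true - reference reused
```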
@@ -37,6 +37,7 @@ export function Spinner({ size = 'md', variant = 'primary', className }: Spinner
     <Loader2
       className={cn(sizeClasses[size], 'animate-spin', variantClasses[variant], className)}
       aria-hidden="true"
+      data-testid="spinner"
     />
   );
 }
@@ -178,7 +178,9 @@ export const XtermLogViewer = forwardRef<XtermLogViewerRef, XtermLogViewerProps>
       fitAddonRef.current = null;
       setIsReady(false);
     };
-  }, []); // Only run once on mount
+    // Only run once on mount - intentionally excluding deps to prevent re-initialization
+    // eslint-disable-next-line react-hooks/exhaustive-deps
+  }, []);

   // Update theme when it changes
   useEffect(() => {
@@ -34,6 +34,7 @@ import type {
   BacklogPlanResult,
   FeatureStatusWithPipeline,
   FeatureTemplate,
+  ReasoningEffort,
 } from '@automaker/types';
 import { pathsEqual } from '@/lib/utils';
 import { toast } from 'sonner';
@@ -977,23 +978,27 @@ export function BoardView() {

   // Helper that creates a feature and immediately starts it (used by conflict handlers and the Make button)
   const handleAddAndStartFeature = useCallback(
-    async (featureData: Parameters<typeof handleAddFeature>[0]) => {
-      // Capture existing feature IDs before adding
-      const featuresBeforeIds = new Set(useAppStore.getState().features.map((f) => f.id));
+    async (featureData: Parameters<typeof handleAddFeature>[0]): Promise<string | null> => {
+      let createdFeatureId: string | null = null;
+
       try {
         // Create feature directly with in_progress status to avoid brief backlog flash
-        await handleAddFeature({ ...featureData, initialStatus: 'in_progress' });
+        const createdFeature = await handleAddFeature({
+          ...featureData,
+          initialStatus: 'in_progress',
+        });
+        createdFeatureId = createdFeature?.id ?? null;
       } catch (error) {
         logger.error('Failed to create feature:', error);
         toast.error('Failed to create feature', {
           description: error instanceof Error ? error.message : 'An error occurred',
         });
-        return;
+        return null;
       }

-      // Find the newly created feature by looking for an ID that wasn't in the original set
       const latestFeatures = useAppStore.getState().features;
-      const newFeature = latestFeatures.find((f) => !featuresBeforeIds.has(f.id));
+      const newFeature = createdFeatureId
+        ? latestFeatures.find((f) => f.id === createdFeatureId)
+        : undefined;

       if (newFeature) {
         try {
@@ -1010,6 +1015,8 @@ export function BoardView() {
           description: 'The feature was created but could not be started automatically.',
         });
       }

+      return createdFeatureId;
     },
     [handleAddFeature, handleStartImplementation]
   );
@@ -1018,7 +1025,12 @@ export function BoardView() {
   const handleQuickAdd = useCallback(
     async (
       description: string,
-      modelEntry: { model: string; thinkingLevel?: string; reasoningEffort?: string }
+      modelEntry: {
+        model: string;
+        thinkingLevel?: string;
+        reasoningEffort?: string;
+        providerId?: string;
+      }
     ) => {
       // Generate a title from the first line of the description
       const title = description.split('\n')[0].substring(0, 100);
@@ -1032,7 +1044,8 @@ export function BoardView() {
       skipTests: defaultSkipTests,
       model: resolveModelString(modelEntry.model) as ModelAlias,
       thinkingLevel: (modelEntry.thinkingLevel as ThinkingLevel) || 'none',
-      reasoningEffort: modelEntry.reasoningEffort,
+      reasoningEffort: modelEntry.reasoningEffort as ReasoningEffort,
+      providerId: modelEntry.providerId,
       branchName: addFeatureUseSelectedWorktreeBranch ? selectedWorktreeBranch : undefined,
       priority: 2,
       planningMode: useAppStore.getState().defaultPlanningMode ?? 'skip',
@@ -1053,7 +1066,12 @@ export function BoardView() {
   const handleQuickAddAndStart = useCallback(
     async (
       description: string,
-      modelEntry: { model: string; thinkingLevel?: string; reasoningEffort?: string }
+      modelEntry: {
+        model: string;
+        thinkingLevel?: string;
+        reasoningEffort?: string;
+        providerId?: string;
+      }
     ) => {
       // Generate a title from the first line of the description
       const title = description.split('\n')[0].substring(0, 100);
@@ -1067,7 +1085,8 @@ export function BoardView() {
       skipTests: defaultSkipTests,
       model: resolveModelString(modelEntry.model) as ModelAlias,
       thinkingLevel: (modelEntry.thinkingLevel as ThinkingLevel) || 'none',
-      reasoningEffort: modelEntry.reasoningEffort,
+      reasoningEffort: modelEntry.reasoningEffort as ReasoningEffort,
+      providerId: modelEntry.providerId,
       branchName: addFeatureUseSelectedWorktreeBranch ? selectedWorktreeBranch : undefined,
       priority: 2,
       planningMode: useAppStore.getState().defaultPlanningMode ?? 'skip',
@@ -1104,6 +1123,8 @@ export function BoardView() {
       title: prInfo.title,
       // Pass the worktree's branch so features are created on the correct worktree
       headRefName: worktree.branch,
+      // Pass the PR URL so features are created with prUrl set
+      url: prInfo.url,
     });
     setShowPRCommentDialog(true);
setShowPRCommentDialog(true);
|
setShowPRCommentDialog(true);
|
||||||
}, []);
|
}, []);
|
||||||
@@ -1132,11 +1153,22 @@ export function BoardView() {
|
|||||||
priority: 1,
|
priority: 1,
|
||||||
planningMode: 'skip' as const,
|
planningMode: 'skip' as const,
|
||||||
requirePlanApproval: false,
|
requirePlanApproval: false,
|
||||||
|
dependencies: [],
|
||||||
};
|
};
|
||||||
|
|
||||||
await handleAddAndStartFeature(featureData);
|
const createdFeatureId = await handleAddAndStartFeature(featureData);
|
||||||
|
|
||||||
|
// Set prUrl on the created feature if the PR has a URL
|
||||||
|
if (prInfo.url && createdFeatureId) {
|
||||||
|
updateFeature(createdFeatureId, { prUrl: prInfo.url });
|
||||||
|
try {
|
||||||
|
await persistFeatureUpdate(createdFeatureId, { prUrl: prInfo.url });
|
||||||
|
} catch (error) {
|
||||||
|
logger.error('Failed to persist PR URL on created feature:', error);
|
||||||
|
}
|
||||||
|
}
|
||||||
},
|
},
|
||||||
[handleAddAndStartFeature, defaultSkipTests]
|
[handleAddAndStartFeature, defaultSkipTests, updateFeature, persistFeatureUpdate]
|
||||||
);
|
);
|
||||||
|
|
||||||
// Handler for resolving conflicts - opens dialog to select remote branch, then creates a feature
|
// Handler for resolving conflicts - opens dialog to select remote branch, then creates a feature
|
||||||
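The quick-add handlers above derive a card title from the first line of the description, capped at 100 characters. As a minimal standalone sketch (the `deriveTitle` name is hypothetical; the expression mirrors the diff):

```typescript
// Hypothetical helper mirroring the title derivation used by handleQuickAdd /
// handleQuickAddAndStart: first line of the description, capped at 100 chars.
function deriveTitle(description: string): string {
  return description.split('\n')[0].substring(0, 100);
}

// Only the first line survives; an over-long first line is truncated.
console.log(deriveTitle('Fix login bug\nSteps to reproduce:\n1. ...'));
// → "Fix login bug"
```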
diff --git a/… b/…
@@ -13,6 +13,7 @@ import { getProviderIconForModel } from '@/components/ui/provider-icon';
 import { useFeature, useAgentOutput } from '@/hooks/queries';
 import { queryKeys } from '@/lib/query-keys';
 import { getFirstNonEmptySummary } from '@/lib/summary-selection';
+import { useAppStore } from '@/store/app-store';
 
 /**
  * Formats thinking level for compact display
@@ -64,6 +65,21 @@ export const AgentInfoPanel = memo(function AgentInfoPanel({
   const queryClient = useQueryClient();
   const [isSummaryDialogOpen, setIsSummaryDialogOpen] = useState(false);
   const [isTodosExpanded, setIsTodosExpanded] = useState(false);
+
+  // Get providers from store for provider-aware model name display
+  // This allows formatModelName to show provider-specific model names (e.g., "GLM 4.7" instead of "Sonnet 4.5")
+  // when a feature was executed using a Claude-compatible provider
+  const claudeCompatibleProviders = useAppStore((state) => state.claudeCompatibleProviders);
+
+  // Memoize the format options to avoid recreating the object on every render
+  const modelFormatOptions = useMemo(
+    () => ({
+      providerId: feature.providerId,
+      claudeCompatibleProviders,
+    }),
+    [feature.providerId, claudeCompatibleProviders]
+  );
+
   // Track real-time task status updates from WebSocket events
   const [taskStatusMap, setTaskStatusMap] = useState<
     Map<string, 'pending' | 'in_progress' | 'completed'>
@@ -74,7 +90,7 @@ export const AgentInfoPanel = memo(function AgentInfoPanel({
   const [lastWsEventTimestamp, setLastWsEventTimestamp] = useState<number | null>(null);
 
   // Determine if we should poll for updates
-  const shouldFetchData = feature.status !== 'backlog';
+  const shouldFetchData = feature.status !== 'backlog' && feature.status !== 'merge_conflict';
 
   // Track whether we're receiving WebSocket events (within threshold)
   // Use a state to trigger re-renders when the WebSocket connection becomes stale
@@ -242,9 +258,7 @@ export const AgentInfoPanel = memo(function AgentInfoPanel({
     return agentInfo?.todos || [];
   }, [
     freshPlanSpec,
-    feature.planSpec?.tasks,
-    feature.planSpec?.tasksCompleted,
-    feature.planSpec?.currentTaskId,
+    feature.planSpec,
     agentInfo?.todos,
     taskStatusMap,
     taskSummaryMap,
@@ -314,7 +328,7 @@ export const AgentInfoPanel = memo(function AgentInfoPanel({
   }, [feature.id, shouldListenToEvents]);
 
   // Model/Preset Info for Backlog Cards
-  if (feature.status === 'backlog') {
+  if (feature.status === 'backlog' || feature.status === 'merge_conflict') {
     const provider = getProviderFromModel(feature.model);
     const isCodex = provider === 'codex';
     const isClaude = provider === 'claude';
@@ -327,7 +341,9 @@ export const AgentInfoPanel = memo(function AgentInfoPanel({
           const ProviderIcon = getProviderIconForModel(feature.model);
           return <ProviderIcon className="w-3 h-3" />;
         })()}
-        <span className="font-medium">{formatModelName(feature.model ?? DEFAULT_MODEL)}</span>
+        <span className="font-medium">
+          {formatModelName(feature.model ?? DEFAULT_MODEL, modelFormatOptions)}
+        </span>
       </div>
       {isClaude && feature.thinkingLevel && feature.thinkingLevel !== 'none' ? (
         <div className="flex items-center gap-1 text-purple-400">
@@ -373,7 +389,9 @@ export const AgentInfoPanel = memo(function AgentInfoPanel({
           const ProviderIcon = getProviderIconForModel(feature.model);
           return <ProviderIcon className="w-3 h-3" />;
         })()}
-        <span className="font-medium">{formatModelName(feature.model ?? DEFAULT_MODEL)}</span>
+        <span className="font-medium">
+          {formatModelName(feature.model ?? DEFAULT_MODEL, modelFormatOptions)}
+        </span>
       </div>
       {agentInfo?.currentPhase && (
         <div
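The widened `shouldFetchData` guard above stops the panel from polling for features that have no live agent output. A sketch of the guard as a pure predicate (the status union is assumed from the statuses appearing elsewhere in this diff):

```typescript
// Statuses assumed from those visible elsewhere in this diff; the real
// FeatureStatus type lives in the app store.
type FeatureStatus =
  | 'backlog'
  | 'merge_conflict'
  | 'ready'
  | 'in_progress'
  | 'interrupted'
  | 'waiting_approval';

// Sketch of the widened guard: neither backlog nor merge_conflict features
// have a running agent, so polling is skipped for both.
function shouldFetchData(status: FeatureStatus): boolean {
  return status !== 'backlog' && status !== 'merge_conflict';
}
```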
diff --git a/… b/…
@@ -39,7 +39,7 @@ export const CardActions = memo(function CardActions({
   feature,
   isCurrentAutoTask,
   isRunningTask = false,
-  hasContext: _hasContext,
+  hasContext = false,
   shortcutKey,
   isSelectionMode = false,
   onEdit,
@@ -54,6 +54,8 @@ export const CardActions = memo(function CardActions({
   onViewPlan,
   onApprovePlan,
 }: CardActionsProps) {
+  const showBacklogLogsButton = hasContext && !!onViewOutput;
+
   // Hide all actions when in selection mode
   if (isSelectionMode) {
     return null;
@@ -243,7 +245,7 @@ export const CardActions = memo(function CardActions({
             onViewOutput();
           }}
           onPointerDown={(e) => e.stopPropagation()}
-          data-testid={`view-output-inprogress-${feature.id}`}
+          data-testid={`view-output-${feature.id}`}
         >
           <FileText className="w-3 h-3" />
         </Button>
@@ -348,6 +350,7 @@ export const CardActions = memo(function CardActions({
       {!isCurrentAutoTask &&
         isRunningTask &&
         (feature.status === 'backlog' ||
+          feature.status === 'merge_conflict' ||
           feature.status === 'interrupted' ||
           feature.status === 'ready') && (
           <>
@@ -395,6 +398,7 @@ export const CardActions = memo(function CardActions({
       {!isCurrentAutoTask &&
         !isRunningTask &&
         (feature.status === 'backlog' ||
+          feature.status === 'merge_conflict' ||
           feature.status === 'interrupted' ||
           feature.status === 'ready') && (
           <>
@@ -412,6 +416,22 @@ export const CardActions = memo(function CardActions({
               <Edit className="w-3 h-3 mr-1" />
               Edit
             </Button>
+            {showBacklogLogsButton && (
+              <Button
+                variant="outline"
+                size="sm"
+                className="h-7 text-xs px-2"
+                onClick={(e) => {
+                  e.stopPropagation();
+                  onViewOutput();
+                }}
+                onPointerDown={(e) => e.stopPropagation()}
+                data-testid={`view-output-backlog-${feature.id}`}
+                title="View Logs"
+              >
+                <FileText className="w-3 h-3" />
+              </Button>
+            )}
             {feature.planSpec?.content && onViewPlan && (
               <Button
                 variant="outline"
@@ -440,8 +460,12 @@ export const CardActions = memo(function CardActions({
                 onPointerDown={(e) => e.stopPropagation()}
                 data-testid={`make-${feature.id}`}
               >
-                <PlayCircle className="w-3 h-3 mr-1" />
-                Make
+                {feature.status === 'merge_conflict' ? (
+                  <RotateCcw className="w-3 h-3 mr-1" />
+                ) : (
+                  <PlayCircle className="w-3 h-3 mr-1" />
+                )}
+                {feature.status === 'merge_conflict' ? 'Restart' : 'Make'}
               </Button>
             )}
           </>
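The Make/Restart branching introduced in the CardActions hunk reduces to a small pure mapping. A purely illustrative sketch, with the lucide-react icon components replaced by their names as strings:

```typescript
// Sketch of the primary-button branching: merge_conflict cards get a
// RotateCcw icon and a "Restart" label; everything else keeps
// PlayCircle / "Make". Icon names stand in for the actual components.
function makeButtonContent(status: string): { icon: string; label: string } {
  const isConflict = status === 'merge_conflict';
  return {
    icon: isConflict ? 'RotateCcw' : 'PlayCircle',
    label: isConflict ? 'Restart' : 'Make',
  };
}
```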
diff --git a/… b/…
@@ -3,7 +3,7 @@ import { memo, useEffect, useMemo, useState } from 'react';
 import { Feature, useAppStore } from '@/store/app-store';
 import { cn } from '@/lib/utils';
 import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip';
-import { AlertCircle, Lock, Hand, Sparkles, SkipForward } from 'lucide-react';
+import { AlertCircle, AlertTriangle, Lock, Hand, Sparkles, SkipForward } from 'lucide-react';
 import { getBlockingDependencies } from '@automaker/dependency-resolver';
 import { useShallow } from 'zustand/react/shallow';
 import { usePipelineConfig } from '@/hooks/queries/use-pipeline';
@@ -18,33 +18,61 @@ interface CardBadgesProps {
 
 /**
  * CardBadges - Shows error badges below the card header
- * Note: Blocked/Lock badges are now shown in PriorityBadges for visual consistency
+ * Note: Merge conflict badge is aligned with the top badge row for visual consistency
  */
 export const CardBadges = memo(function CardBadges({ feature }: CardBadgesProps) {
-  if (!feature.error) {
+  const showMergeConflict = feature.status === 'merge_conflict';
+  const mergeConflictOffsetClass = feature.priority ? 'left-9' : 'left-2';
+  if (!feature.error && !showMergeConflict) {
     return null;
   }
 
   return (
-    <div className="flex flex-wrap items-center gap-1.5 px-3 pt-1.5 min-h-[24px]">
+    <>
+      {/* Merge conflict badge */}
+      {showMergeConflict && (
+        <div className={cn('absolute top-2 z-10', mergeConflictOffsetClass)}>
+          <Tooltip>
+            <TooltipTrigger asChild>
+              <div
+                className={cn(
+                  uniformBadgeClass,
+                  'bg-[var(--status-warning-bg)] border-[var(--status-warning)]/40 text-[var(--status-warning)]'
+                )}
+                data-testid={`merge-conflict-badge-${feature.id}`}
+              >
+                <AlertTriangle className="w-3.5 h-3.5" />
+              </div>
+            </TooltipTrigger>
+            <TooltipContent side="bottom" className="text-xs max-w-[250px]">
+              <p>Merge Conflict: automatic merge failed and requires manual resolution</p>
+            </TooltipContent>
+          </Tooltip>
+        </div>
+      )}
+
       {/* Error badge */}
-      <Tooltip>
-        <TooltipTrigger asChild>
-          <div
-            className={cn(
-              uniformBadgeClass,
-              'bg-[var(--status-error-bg)] border-[var(--status-error)]/40 text-[var(--status-error)]'
-            )}
-            data-testid={`error-badge-${feature.id}`}
-          >
-            <AlertCircle className="w-3.5 h-3.5" />
-          </div>
-        </TooltipTrigger>
-        <TooltipContent side="bottom" className="text-xs max-w-[250px]">
-          <p>{feature.error}</p>
-        </TooltipContent>
-      </Tooltip>
-    </div>
+      {feature.error && (
+        <div className="flex flex-wrap items-center gap-1.5 px-3 pt-1.5 min-h-[24px]">
+          <Tooltip>
+            <TooltipTrigger asChild>
+              <div
+                className={cn(
+                  uniformBadgeClass,
+                  'bg-[var(--status-error-bg)] border-[var(--status-error)]/40 text-[var(--status-error)]'
+                )}
+                data-testid={`error-badge-${feature.id}`}
+              >
+                <AlertCircle className="w-3.5 h-3.5" />
+              </div>
+            </TooltipTrigger>
+            <TooltipContent side="bottom" className="text-xs max-w-[250px]">
+              <p>{feature.error}</p>
+            </TooltipContent>
+          </Tooltip>
+        </div>
+      )}
+    </>
   );
 });
 
diff --git a/… b/…
@@ -1,4 +1,4 @@
-import { memo, useState } from 'react';
+import { memo, useState, useMemo } from 'react';
 import type { DraggableAttributes, DraggableSyntheticListeners } from '@dnd-kit/core';
 import { Feature } from '@/store/app-store';
 import { cn } from '@/lib/utils';
@@ -30,6 +30,7 @@ import { CountUpTimer } from '@/components/ui/count-up-timer';
 import { formatModelName, DEFAULT_MODEL } from '@/lib/agent-context-parser';
 import { DeleteConfirmDialog } from '@/components/ui/delete-confirm-dialog';
 import { getProviderIconForModel } from '@/components/ui/provider-icon';
+import { useAppStore } from '@/store/app-store';
 
 function DuplicateMenuItems({
   onDuplicate,
@@ -107,6 +108,7 @@ interface CardHeaderProps {
   isDraggable: boolean;
   isCurrentAutoTask: boolean;
   isSelectionMode?: boolean;
+  hasContext?: boolean;
   onEdit: () => void;
   onDelete: () => void;
   onViewOutput?: () => void;
@@ -123,6 +125,7 @@ export const CardHeaderSection = memo(function CardHeaderSection({
   isDraggable,
   isCurrentAutoTask,
   isSelectionMode = false,
+  hasContext = false,
   onEdit,
   onDelete,
   onViewOutput,
@@ -135,6 +138,21 @@ export const CardHeaderSection = memo(function CardHeaderSection({
 }: CardHeaderProps) {
   const [isDescriptionExpanded, setIsDescriptionExpanded] = useState(false);
   const [isDeleteDialogOpen, setIsDeleteDialogOpen] = useState(false);
+  const showBacklogLogsButton = hasContext && !!onViewOutput;
+
+  // Get providers from store for provider-aware model name display
+  // This allows formatModelName to show provider-specific model names (e.g., "GLM 4.7" instead of "Sonnet 4.5")
+  // when a feature was executed using a Claude-compatible provider
+  const claudeCompatibleProviders = useAppStore((state) => state.claudeCompatibleProviders);
+
+  // Memoize the format options to avoid recreating the object on every render
+  const modelFormatOptions = useMemo(
+    () => ({
+      providerId: feature.providerId,
+      claudeCompatibleProviders,
+    }),
+    [feature.providerId, claudeCompatibleProviders]
+  );
 
   const handleDeleteClick = (e: React.MouseEvent) => {
     e.stopPropagation();
@@ -207,7 +225,9 @@ export const CardHeaderSection = memo(function CardHeaderSection({
       <div className="px-2 py-1.5 text-[10px] text-muted-foreground border-t mt-1 pt-1.5">
         <div className="flex items-center gap-1">
           <ProviderIcon className="w-3 h-3" />
-          <span>{formatModelName(feature.model ?? DEFAULT_MODEL)}</span>
+          <span>
+            {formatModelName(feature.model ?? DEFAULT_MODEL, modelFormatOptions)}
+          </span>
         </div>
       </div>
     );
@@ -221,6 +241,7 @@ export const CardHeaderSection = memo(function CardHeaderSection({
       {!isCurrentAutoTask &&
         !isSelectionMode &&
         (feature.status === 'backlog' ||
+          feature.status === 'merge_conflict' ||
           feature.status === 'interrupted' ||
           feature.status === 'ready') && (
           <div className="absolute top-2 right-2 flex items-center gap-1">
@@ -238,6 +259,22 @@ export const CardHeaderSection = memo(function CardHeaderSection({
             >
               <GitFork className="w-4 h-4" />
             </Button>
+            {showBacklogLogsButton && (
+              <Button
+                variant="ghost"
+                size="sm"
+                className="h-6 w-6 p-0 hover:bg-white/10 text-muted-foreground hover:text-foreground"
+                onClick={(e) => {
+                  e.stopPropagation();
+                  onViewOutput?.();
+                }}
+                onPointerDown={(e) => e.stopPropagation()}
+                data-testid={`logs-backlog-${feature.id}`}
+                title="Logs"
+              >
+                <FileText className="w-4 h-4" />
+              </Button>
+            )}
             <Button
               variant="ghost"
               size="sm"
@@ -444,7 +481,9 @@ export const CardHeaderSection = memo(function CardHeaderSection({
       <div className="px-2 py-1.5 text-[10px] text-muted-foreground border-t mt-1 pt-1.5">
         <div className="flex items-center gap-1">
           <ProviderIcon className="w-3 h-3" />
-          <span>{formatModelName(feature.model ?? DEFAULT_MODEL)}</span>
+          <span>
+            {formatModelName(feature.model ?? DEFAULT_MODEL, modelFormatOptions)}
+          </span>
         </div>
       </div>
     );
diff --git a/… b/…
@@ -131,6 +131,7 @@ export const KanbanCard = memo(function KanbanCard({
     !!isCurrentAutoTask &&
     !isInExecutionState &&
     (feature.status === 'backlog' ||
+      feature.status === 'merge_conflict' ||
       feature.status === 'ready' ||
       feature.status === 'interrupted');
   // Show running visual treatment for both fully confirmed and stale-status running tasks
@@ -149,6 +150,7 @@ export const KanbanCard = memo(function KanbanCard({
     !isSelectionMode &&
     !isRunningWithStaleStatus &&
     (feature.status === 'backlog' ||
+      feature.status === 'merge_conflict' ||
       feature.status === 'interrupted' ||
       feature.status === 'ready' ||
       feature.status === 'waiting_approval' ||
@@ -194,7 +196,10 @@ export const KanbanCard = memo(function KanbanCard({
   const cardStyle = getCardBorderStyle(cardBorderEnabled, cardBorderOpacity);
 
   // Only allow selection for features matching the selection target
-  const isSelectable = isSelectionMode && feature.status === selectionTarget;
+  const isSelectable =
+    isSelectionMode &&
+    (feature.status === selectionTarget ||
+      (selectionTarget === 'backlog' && feature.status === 'merge_conflict'));
 
   const wrapperClasses = cn(
     'relative select-none outline-none transition-transform duration-200 ease-out',
@@ -275,6 +280,7 @@ export const KanbanCard = memo(function KanbanCard({
       isDraggable={isDraggable}
       isCurrentAutoTask={isActivelyRunning}
       isSelectionMode={isSelectionMode}
+      hasContext={hasContext}
       onEdit={onEdit}
       onDelete={onDelete}
       onViewOutput={onViewOutput}
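The widened `isSelectable` expression in the KanbanCard hunk can be read as a small pure predicate. A sketch that transcribes the same boolean logic outside the component:

```typescript
// Sketch of the widened selection predicate: merge_conflict cards count as
// backlog-like when the selection target is the backlog column, so bulk
// selection in that column still picks them up.
function isSelectable(
  isSelectionMode: boolean,
  status: string,
  selectionTarget: string
): boolean {
  return (
    isSelectionMode &&
    (status === selectionTarget ||
      (selectionTarget === 'backlog' && status === 'merge_conflict'))
  );
}
```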
diff --git a/… b/…
@@ -144,6 +144,7 @@ function getPrimaryAction(
   if (
     isRunningTask &&
     (feature.status === 'backlog' ||
+      feature.status === 'merge_conflict' ||
       feature.status === 'ready' ||
       feature.status === 'interrupted') &&
     handlers.onForceStop
@@ -156,11 +157,14 @@ function getPrimaryAction(
     };
   }
 
-  // Backlog - implement is primary
-  if (feature.status === 'backlog' && handlers.onImplement) {
+  // Backlog-like statuses - implement/restart is primary
+  if (
+    (feature.status === 'backlog' || feature.status === 'merge_conflict') &&
+    handlers.onImplement
+  ) {
     return {
-      icon: PlayCircle,
-      label: 'Make',
+      icon: feature.status === 'merge_conflict' ? RotateCcw : PlayCircle,
+      label: feature.status === 'merge_conflict' ? 'Restart' : 'Make',
       onClick: handlers.onImplement,
       variant: 'primary',
     };
@@ -293,13 +297,16 @@ export const RowActions = memo(function RowActions({
 
   // Use controlled or uncontrolled state
   const open = isOpen ?? internalOpen;
-  const setOpen = (value: boolean) => {
-    if (onOpenChange) {
-      onOpenChange(value);
-    } else {
-      setInternalOpen(value);
-    }
-  };
+  const setOpen = useCallback(
+    (value: boolean) => {
+      if (onOpenChange) {
+        onOpenChange(value);
+      } else {
+        setInternalOpen(value);
+      }
+    },
+    [onOpenChange]
+  );
 
   const handleOpenChange = useCallback(
     (newOpen: boolean) => {
@@ -425,68 +432,77 @@ export const RowActions = memo(function RowActions({
       )}
 
       {/* Backlog actions */}
-      {!isCurrentAutoTask && !isRunningTask && feature.status === 'backlog' && (
-        <>
-          <MenuItem icon={Edit} label="Edit" onClick={withClose(handlers.onEdit)} />
-          {feature.planSpec?.content && handlers.onViewPlan && (
-            <MenuItem icon={Eye} label="View Plan" onClick={withClose(handlers.onViewPlan)} />
-          )}
-          {handlers.onImplement && (
-            <MenuItem
-              icon={PlayCircle}
-              label="Make"
-              onClick={withClose(handlers.onImplement)}
-              variant="primary"
-            />
-          )}
-          {handlers.onSpawnTask && (
-            <MenuItem
-              icon={GitFork}
-              label="Spawn Sub-Task"
-              onClick={withClose(handlers.onSpawnTask)}
-            />
-          )}
-          {handlers.onDuplicate && (
-            <DropdownMenuSub>
-              <div className="flex items-center">
-                <DropdownMenuItem
-                  onClick={withClose(handlers.onDuplicate)}
-                  className="flex-1 pr-0 rounded-r-none"
-                >
-                  <Copy className="w-4 h-4 mr-2" />
-                  Duplicate
-                </DropdownMenuItem>
-                {handlers.onDuplicateAsChild && (
-                  <DropdownMenuSubTrigger className="px-1 rounded-l-none border-l border-border/30 h-8" />
-                )}
-              </div>
-              {handlers.onDuplicateAsChild && (
-                <DropdownMenuSubContent>
-                  <MenuItem
-                    icon={GitFork}
-                    label="Duplicate as Child"
-                    onClick={withClose(handlers.onDuplicateAsChild)}
-                  />
-                  {handlers.onDuplicateAsChildMultiple && (
-                    <MenuItem
-                      icon={Repeat}
-                      label="Duplicate as Child ×N"
-                      onClick={withClose(handlers.onDuplicateAsChildMultiple)}
-                    />
+      {!isCurrentAutoTask &&
+        !isRunningTask &&
+        (feature.status === 'backlog' || feature.status === 'merge_conflict') && (
+          <>
+            <MenuItem icon={Edit} label="Edit" onClick={withClose(handlers.onEdit)} />
+            {handlers.onViewOutput && (
+              <MenuItem
+                icon={FileText}
+                label="View Logs"
+                onClick={withClose(handlers.onViewOutput)}
+              />
+            )}
+            {feature.planSpec?.content && handlers.onViewPlan && (
+              <MenuItem icon={Eye} label="View Plan" onClick={withClose(handlers.onViewPlan)} />
+            )}
+            {handlers.onImplement && (
+              <MenuItem
+                icon={feature.status === 'merge_conflict' ? RotateCcw : PlayCircle}
+                label={feature.status === 'merge_conflict' ? 'Restart' : 'Make'}
+                onClick={withClose(handlers.onImplement)}
+                variant="primary"
+              />
+            )}
+            {handlers.onSpawnTask && (
+              <MenuItem
+                icon={GitFork}
+                label="Spawn Sub-Task"
+                onClick={withClose(handlers.onSpawnTask)}
+              />
+            )}
+            {handlers.onDuplicate && (
+              <DropdownMenuSub>
+                <div className="flex items-center">
+                  <DropdownMenuItem
+                    onClick={withClose(handlers.onDuplicate)}
+                    className="flex-1 pr-0 rounded-r-none"
+                  >
+                    <Copy className="w-4 h-4 mr-2" />
+                    Duplicate
+                  </DropdownMenuItem>
+                  {handlers.onDuplicateAsChild && (
+                    <DropdownMenuSubTrigger className="px-1 rounded-l-none border-l border-border/30 h-8" />
                   )}
-                </DropdownMenuSubContent>
-              )}
-            </DropdownMenuSub>
-          )}
-          <DropdownMenuSeparator />
-          <MenuItem
-            icon={Trash2}
-            label="Delete"
-            onClick={withClose(handlers.onDelete)}
-            variant="destructive"
-          />
-        </>
-      )}
+                </div>
+                {handlers.onDuplicateAsChild && (
+                  <DropdownMenuSubContent>
+                    <MenuItem
+                      icon={GitFork}
+                      label="Duplicate as Child"
+                      onClick={withClose(handlers.onDuplicateAsChild)}
+                    />
+                    {handlers.onDuplicateAsChildMultiple && (
+                      <MenuItem
+                        icon={Repeat}
+                        label="Duplicate as Child ×N"
+                        onClick={withClose(handlers.onDuplicateAsChildMultiple)}
+                      />
+                    )}
+                  </DropdownMenuSubContent>
+                )}
+              </DropdownMenuSub>
+            )}
+            <DropdownMenuSeparator />
+            <MenuItem
+              icon={Trash2}
+              label="Delete"
+              onClick={withClose(handlers.onDelete)}
+              variant="destructive"
+            />
+          </>
+        )}
 
       {/* In Progress actions - starting/running (no error, force stop available) - mirrors running task actions */}
       {!isCurrentAutoTask &&
|
|||||||
@@ -23,6 +23,12 @@ const BASE_STATUS_DISPLAY: Record<string, StatusDisplay> = {
     bgClass: 'bg-[var(--status-backlog)]/15',
     borderClass: 'border-[var(--status-backlog)]/30',
   },
+  merge_conflict: {
+    label: 'Merge Conflict',
+    colorClass: 'text-[var(--status-warning)]',
+    bgClass: 'bg-[var(--status-warning)]/15',
+    borderClass: 'border-[var(--status-warning)]/30',
+  },
   in_progress: {
     label: 'In Progress',
     colorClass: 'text-[var(--status-in-progress)]',
@@ -204,6 +210,7 @@ export function getStatusLabel(
 export function getStatusOrder(status: FeatureStatusWithPipeline): number {
   const baseOrder: Record<string, number> = {
     backlog: 0,
+    merge_conflict: 0,
     in_progress: 1,
     waiting_approval: 2,
     verified: 3,
@@ -24,16 +24,11 @@ import {
 import { Play, Cpu, FolderKanban, Settings2 } from 'lucide-react';
 import { useNavigate } from '@tanstack/react-router';
 import { toast } from 'sonner';
-import { cn } from '@/lib/utils';
-import { modelSupportsThinking } from '@/lib/utils';
+import { cn, normalizeModelEntry } from '@/lib/utils';
 import { useAppStore } from '@/store/app-store';
 import type { ThinkingLevel, PlanningMode, Feature, FeatureImage } from '@/store/types';
 import type { ReasoningEffort, PhaseModelEntry, AgentModel } from '@automaker/types';
-import {
-  supportsReasoningEffort,
-  normalizeThinkingLevelForModel,
-  getThinkingLevelsForModel,
-} from '@automaker/types';
+import { normalizeThinkingLevelForModel, getThinkingLevelsForModel } from '@automaker/types';
 import {
   PrioritySelector,
   WorkModeSelector,
@@ -90,6 +85,7 @@ type FeatureData = {
   model: AgentModel;
   thinkingLevel: ThinkingLevel;
   reasoningEffort: ReasoningEffort;
+  providerId?: string;
   branchName: string;
   priority: number;
   planningMode: PlanningMode;
@@ -327,13 +323,7 @@ export function AddFeatureDialog({
     }

     const finalCategory = category || 'Uncategorized';
-    const selectedModel = modelEntry.model;
-    const normalizedThinking = modelSupportsThinking(selectedModel)
-      ? modelEntry.thinkingLevel || 'none'
-      : 'none';
-    const normalizedReasoning = supportsReasoningEffort(selectedModel)
-      ? modelEntry.reasoningEffort || 'none'
-      : 'none';
+    const normalizedEntry = normalizeModelEntry(modelEntry);

     // For 'current' mode, use empty string (work on current branch)
     // For 'auto' mode, use empty string (will be auto-generated in use-board-actions)
@@ -381,9 +371,10 @@ export function AddFeatureDialog({
       imagePaths,
       textFilePaths,
       skipTests,
-      model: selectedModel,
-      thinkingLevel: normalizedThinking,
-      reasoningEffort: normalizedReasoning,
+      model: normalizedEntry.model,
+      thinkingLevel: normalizedEntry.thinkingLevel,
+      reasoningEffort: normalizedEntry.reasoningEffort,
+      providerId: normalizedEntry.providerId,
       branchName: finalBranchName,
       priority,
       planningMode,
@@ -0,0 +1,41 @@
+/**
+ * Constants for AgentOutputModal component
+ * Centralizes magic numbers, timeouts, and configuration values
+ */
+
+export const MODAL_CONSTANTS = {
+  // Auto-scroll threshold for detecting when user is at bottom
+  AUTOSCROLL_THRESHOLD: 50,
+
+  // Delay for closing modal after successful completion
+  MODAL_CLOSE_DELAY_MS: 1500,
+
+  // Modal height constraints for different viewports
+  HEIGHT_CONSTRAINTS: {
+    MOBILE_MAX_DVH: '85dvh',
+    SMALL_MAX_VH: '80vh',
+    TABLET_MAX_VH: '85vh',
+  },
+
+  // Modal width constraints for different viewports
+  WIDTH_CONSTRAINTS: {
+    MOBILE_MAX_CALC: 'calc(100% - 2rem)',
+    SMALL_MAX_VW: '60vw',
+    TABLET_MAX_VW: '90vw',
+    TABLET_MAX_WIDTH: '1200px',
+  },
+
+  // View modes
+  VIEW_MODES: {
+    SUMMARY: 'summary',
+    PARSED: 'parsed',
+    RAW: 'raw',
+    CHANGES: 'changes',
+  } as const,
+
+  // Component heights (complete Tailwind class fragments for template interpolation)
+  COMPONENT_HEIGHTS: {
+    SMALL_MIN: 'sm:min-h-[200px]',
+    SMALL_MAX: 'sm:max-h-[60vh]',
+  },
+} as const;
@@ -26,6 +26,7 @@ import { useAgentOutput, useFeature } from '@/hooks/queries';
 import { cn } from '@/lib/utils';
 import type { AutoModeEvent } from '@/types/electron';
 import type { BacklogPlanEvent } from '@automaker/types';
+import { MODAL_CONSTANTS } from './agent-output-modal.constants';

 interface AgentOutputModalProps {
   open: boolean;
@@ -257,7 +258,8 @@ export function AgentOutputModal({
     if (!summaryScrollRef.current) return;

     const { scrollTop, scrollHeight, clientHeight } = summaryScrollRef.current;
-    const isAtBottom = scrollHeight - scrollTop - clientHeight < 50;
+    const isAtBottom =
+      scrollHeight - scrollTop - clientHeight < MODAL_CONSTANTS.AUTOSCROLL_THRESHOLD;
     setSummaryAutoScroll(isAtBottom);
   };
@@ -440,7 +442,7 @@ export function AgentOutputModal({
     return () => {
       unsubscribe();
     };
-  }, [open, featureId, isBacklogPlan]);
+  }, [open, featureId, isBacklogPlan, onClose]);

   // Listen to backlog plan events and update output
   useEffect(() => {
@@ -140,6 +140,7 @@ export function BacklogPlanDialog({
     setPrompt('');
     onClose();
   }, [
+    logger,
     projectPath,
     prompt,
     modelOverride,
@@ -24,10 +24,9 @@ import {
 import { GitBranch, Cpu, FolderKanban, Settings2 } from 'lucide-react';
 import { useNavigate } from '@tanstack/react-router';
 import { toast } from 'sonner';
-import { cn, modelSupportsThinking } from '@/lib/utils';
+import { cn, migrateModelId, normalizeModelEntry } from '@/lib/utils';
 import { Feature, ModelAlias, ThinkingLevel, PlanningMode } from '@/store/app-store';
 import type { ReasoningEffort, PhaseModelEntry, DescriptionHistoryEntry } from '@automaker/types';
-import { migrateModelId } from '@automaker/types';
 import {
   PrioritySelector,
   WorkModeSelector,
@@ -41,7 +40,6 @@ import type { WorkMode } from '../shared';
 import { PhaseModelSelector } from '@/components/views/settings-view/model-defaults/phase-model-selector';
 import { Tooltip, TooltipContent, TooltipTrigger } from '@/components/ui/tooltip';
 import { DependencyTreeDialog } from './dependency-tree-dialog';
-import { supportsReasoningEffort } from '@automaker/types';

 interface EditFeatureDialogProps {
   feature: Feature | null;
@@ -56,6 +54,7 @@ interface EditFeatureDialogProps {
   model: ModelAlias;
   thinkingLevel: ThinkingLevel;
   reasoningEffort: ReasoningEffort;
+  providerId?: string;
   imagePaths: DescriptionImagePath[];
   textFilePaths: DescriptionTextFilePath[];
   branchName: string; // Can be empty string to use current branch
@@ -109,11 +108,14 @@ export function EditFeatureDialog({
   );

   // Model selection state - migrate legacy model IDs to canonical format
-  const [modelEntry, setModelEntry] = useState<PhaseModelEntry>(() => ({
-    model: migrateModelId(feature?.model) || 'claude-opus',
-    thinkingLevel: feature?.thinkingLevel || 'none',
-    reasoningEffort: feature?.reasoningEffort || 'none',
-  }));
+  const [modelEntry, setModelEntry] = useState<PhaseModelEntry>(() =>
+    normalizeModelEntry({
+      model: migrateModelId(feature?.model) || 'claude-opus',
+      thinkingLevel: feature?.thinkingLevel || 'none',
+      reasoningEffort: feature?.reasoningEffort || 'none',
+      providerId: feature?.providerId,
+    })
+  );

   // Track the source of description changes for history
   const [descriptionChangeSource, setDescriptionChangeSource] = useState<
@@ -161,11 +163,14 @@ export function EditFeatureDialog({
     setPreEnhancementDescription(null);
     setLocalHistory(feature.descriptionHistory ?? []);
     // Reset model entry - migrate legacy model IDs
-    setModelEntry({
-      model: migrateModelId(feature.model) || 'claude-opus',
-      thinkingLevel: feature.thinkingLevel || 'none',
-      reasoningEffort: feature.reasoningEffort || 'none',
-    });
+    setModelEntry(
+      normalizeModelEntry({
+        model: migrateModelId(feature.model) || 'claude-opus',
+        thinkingLevel: feature.thinkingLevel || 'none',
+        reasoningEffort: feature.reasoningEffort || 'none',
+        providerId: feature.providerId,
+      })
+    );
     // Reset dependency state
     setParentDependencies(feature.dependencies ?? []);
     const childDeps = allFeatures
@@ -202,19 +207,14 @@ export function EditFeatureDialog({
     if (!editingFeature) return;

     // Validate branch selection for custom mode
-    const isBranchSelectorEnabled = editingFeature.status === 'backlog';
+    const isBranchSelectorEnabled =
+      editingFeature.status === 'backlog' || editingFeature.status === 'merge_conflict';
     if (isBranchSelectorEnabled && workMode === 'custom' && !editingFeature.branchName?.trim()) {
       toast.error('Please select a branch name');
       return;
     }

-    const selectedModel = modelEntry.model;
-    const normalizedThinking: ThinkingLevel = modelSupportsThinking(selectedModel)
-      ? (modelEntry.thinkingLevel ?? 'none')
-      : 'none';
-    const normalizedReasoning: ReasoningEffort = supportsReasoningEffort(selectedModel)
-      ? (modelEntry.reasoningEffort ?? 'none')
-      : 'none';
+    const normalizedEntry = normalizeModelEntry(modelEntry);

     // For 'current' mode, use empty string (work on current branch)
     // For 'auto' mode, use empty string (will be auto-generated in use-board-actions)
@@ -232,9 +232,10 @@ export function EditFeatureDialog({
       category: editingFeature.category,
       description: editingFeature.description,
       skipTests: editingFeature.skipTests ?? false,
-      model: selectedModel,
-      thinkingLevel: normalizedThinking,
-      reasoningEffort: normalizedReasoning,
+      model: normalizedEntry.model,
+      thinkingLevel: normalizedEntry.thinkingLevel,
+      reasoningEffort: normalizedEntry.reasoningEffort,
+      providerId: normalizedEntry.providerId,
       imagePaths: editingFeature.imagePaths ?? [],
       textFilePaths: editingFeature.textFilePaths ?? [],
       branchName: finalBranchName,
@@ -557,7 +558,9 @@ export function EditFeatureDialog({
                   branchSuggestions={branchSuggestions}
                   branchCardCounts={branchCardCounts}
                   currentBranch={currentBranch}
-                  disabled={editingFeature.status !== 'backlog'}
+                  disabled={
+                    editingFeature.status !== 'backlog' && editingFeature.status !== 'merge_conflict'
+                  }
                   testIdPrefix="edit-feature-work-mode"
                 />
               </div>
@@ -627,7 +630,8 @@ export function EditFeatureDialog({
                 hotkeyActive={!!editingFeature}
                 data-testid="confirm-edit-feature"
                 disabled={
-                  editingFeature.status === 'backlog' &&
+                  (editingFeature.status === 'backlog' ||
+                    editingFeature.status === 'merge_conflict') &&
                   workMode === 'custom' &&
                   !editingFeature.branchName?.trim()
                 }
@@ -0,0 +1,138 @@
+/**
+ * Event content formatting utilities for AgentOutputModal
+ * Extracts the complex switch statement logic from the main component
+ */
+
+import type { AutoModeEvent } from '@/types/electron';
+import type { BacklogPlanEvent } from '@automaker/types';
+
+/**
+ * Format auto mode event content for display
+ */
+export function formatAutoModeEventContent(event: AutoModeEvent): string {
+  switch (event.type) {
+    case 'auto_mode_progress':
+      return event.content || '';
+
+    case 'auto_mode_tool': {
+      const toolName = event.tool || 'Unknown Tool';
+      const toolInput = event.input ? JSON.stringify(event.input, null, 2) : '';
+      return `\n🔧 Tool: ${toolName}\n${toolInput ? `Input: ${toolInput}\n` : ''}`;
+    }
+
+    case 'auto_mode_phase': {
+      const phaseEmoji = event.phase === 'planning' ? '📋' : event.phase === 'action' ? '⚡' : '✅';
+      return `\n${phaseEmoji} ${event.message}\n`;
+    }
+
+    case 'auto_mode_error':
+      return `\n❌ Error: ${event.error}\n`;
+
+    case 'auto_mode_ultrathink_preparation':
+      return formatUltrathinkPreparation(event);
+
+    case 'planning_started': {
+      if ('mode' in event && 'message' in event) {
+        const modeLabel = event.mode === 'lite' ? 'Lite' : event.mode === 'spec' ? 'Spec' : 'Full';
+        return `\n📋 Planning Mode: ${modeLabel}\n${event.message}\n`;
+      }
+      return '';
+    }
+
+    case 'plan_approval_required':
+      return '\n⏸️ Plan generated - waiting for your approval...\n';
+
+    case 'plan_approved':
+      return event.hasEdits
+        ? '\n✅ Plan approved (with edits) - continuing to implementation...\n'
+        : '\n✅ Plan approved - continuing to implementation...\n';
+
+    case 'plan_auto_approved':
+      return '\n✅ Plan auto-approved - continuing to implementation...\n';
+
+    case 'plan_revision_requested': {
+      const revisionEvent = event as Extract<AutoModeEvent, { type: 'plan_revision_requested' }>;
+      return `\n🔄 Revising plan based on your feedback (v${revisionEvent.planVersion})...\n`;
+    }
+
+    case 'auto_mode_task_started': {
+      const taskEvent = event as Extract<AutoModeEvent, { type: 'auto_mode_task_started' }>;
+      return `\n▶ Starting ${taskEvent.taskId}: ${taskEvent.taskDescription}\n`;
+    }
+
+    case 'auto_mode_task_complete': {
+      const taskEvent = event as Extract<AutoModeEvent, { type: 'auto_mode_task_complete' }>;
+      return `\n✓ ${taskEvent.taskId} completed (${taskEvent.tasksCompleted}/${taskEvent.tasksTotal})\n`;
+    }
+
+    case 'auto_mode_phase_complete': {
+      const phaseEvent = event as Extract<AutoModeEvent, { type: 'auto_mode_phase_complete' }>;
+      return `\n🏁 Phase ${phaseEvent.phaseNumber} complete\n`;
+    }
+
+    case 'auto_mode_feature_complete': {
+      const emoji = event.passes ? '✅' : '⚠️';
+      return `\n${emoji} Task completed: ${event.message}\n`;
+    }
+
+    default:
+      return '';
+  }
+}
+
+/**
+ * Format backlog plan event content for display
+ */
+export function formatBacklogPlanEventContent(event: BacklogPlanEvent): string {
+  switch (event.type) {
+    case 'backlog_plan_progress':
+      return `\n🧭 ${event.content || 'Backlog plan progress update'}\n`;
+
+    case 'backlog_plan_error':
+      return `\n❌ Backlog plan error: ${event.error || 'Unknown error'}\n`;
+
+    case 'backlog_plan_complete':
+      return '\n✅ Backlog plan completed\n';
+
+    default:
+      return `\nℹ️ ${event.type}\n`;
+  }
+}
+
+/**
+ * Format ultrathink preparation details
+ */
+function formatUltrathinkPreparation(
+  event: AutoModeEvent & {
+    warnings?: string[];
+    recommendations?: string[];
+    estimatedCost?: number;
+    estimatedTime?: string;
+  }
+): string {
+  let prepContent = '\n🧠 Ultrathink Preparation\n';
+
+  if (event.warnings && event.warnings.length > 0) {
+    prepContent += '\n⚠️ Warnings:\n';
+    event.warnings.forEach((warning: string) => {
+      prepContent += `  • ${warning}\n`;
+    });
+  }
+
+  if (event.recommendations && event.recommendations.length > 0) {
+    prepContent += '\n💡 Recommendations:\n';
+    event.recommendations.forEach((rec: string) => {
+      prepContent += `  • ${rec}\n`;
+    });
+  }
+
+  if (event.estimatedCost !== undefined) {
+    prepContent += `\n💰 Estimated Cost: ~$${event.estimatedCost.toFixed(2)} per execution\n`;
+  }
+
+  if (event.estimatedTime) {
+    prepContent += `\n⏱️ Estimated Time: ${event.estimatedTime}\n`;
+  }
+
+  return prepContent;
+}
@@ -22,7 +22,7 @@ import {
 import type { WorkMode } from '../shared';
 import { PhaseModelSelector } from '@/components/views/settings-view/model-defaults/phase-model-selector';
 import type { PhaseModelEntry } from '@automaker/types';
-import { cn } from '@/lib/utils';
+import { cn, normalizeModelEntry } from '@/lib/utils';

 interface MassEditDialogProps {
   open: boolean;
@@ -181,7 +181,9 @@ export function MassEditDialog({
     });
     setModel(getInitialValue(selectedFeatures, 'model', 'claude-sonnet') as ModelAlias);
     setThinkingLevel(getInitialValue(selectedFeatures, 'thinkingLevel', 'none') as ThinkingLevel);
-    setProviderId(undefined); // Features don't store providerId, but we track it after selection
+    setProviderId(
+      getInitialValue(selectedFeatures, 'providerId', undefined) as string | undefined
+    );
     setPlanningMode(getInitialValue(selectedFeatures, 'planningMode', 'skip') as PlanningMode);
     setRequirePlanApproval(getInitialValue(selectedFeatures, 'requirePlanApproval', false));
     setPriority(getInitialValue(selectedFeatures, 'priority', 2));
@@ -207,8 +209,23 @@ export function MassEditDialog({
   const handleApply = async () => {
     const updates: Partial<Feature> = {};

-    if (applyState.model) updates.model = model;
-    if (applyState.thinkingLevel) updates.thinkingLevel = thinkingLevel;
+    if (applyState.model || applyState.thinkingLevel) {
+      const normalizedEntry = normalizeModelEntry({
+        model,
+        thinkingLevel,
+        providerId,
+      });
+
+      if (applyState.model) {
+        updates.model = normalizedEntry.model;
+        updates.providerId = normalizedEntry.providerId;
+      }
+
+      if (applyState.thinkingLevel) {
+        updates.thinkingLevel = normalizedEntry.thinkingLevel;
+      }
+    }
+
     if (applyState.planningMode) updates.planningMode = planningMode;
     if (applyState.requirePlanApproval) updates.requirePlanApproval = requirePlanApproval;
     if (applyState.priority) updates.priority = priority;
@@ -192,15 +192,9 @@ export function MergeRebaseDialog({
     const api = getHttpApiClient();

     if (selectedStrategy === 'rebase') {
-      // First fetch the remote to ensure we have latest refs
-      try {
-        await api.worktree.pull(worktree.path, selectedRemote);
-      } catch {
-        // Fetch may fail if no upstream - that's okay, we'll try rebase anyway
-      }
-
-      // Attempt the rebase operation
-      const result = await api.worktree.rebase(worktree.path, selectedBranch);
+      // Attempt the rebase operation - the rebase service fetches from the remote
+      // before rebasing to ensure we have up-to-date refs
+      const result = await api.worktree.rebase(worktree.path, selectedBranch, selectedRemote);

       if (result.success) {
         toast.success(`Rebased onto ${selectedBranch}`, {
@@ -223,9 +217,26 @@ export function MergeRebaseDialog({
           setStep('select');
         }
       } else {
-        // Merge strategy - attempt to merge the remote branch
-        // Use the pull endpoint for merging remote branches
-        const result = await api.worktree.pull(worktree.path, selectedRemote, true);
+        // Merge strategy - merge the selected remote branch into the current branch.
+        // selectedBranch may be a full ref (e.g. refs/remotes/origin/main); normalize to short name
+        // for 'git pull <remote> <branch>'.
+        let remoteBranchShortName = selectedBranch;
+        const remotePrefix = `refs/remotes/${selectedRemote}/`;
+        if (selectedBranch.startsWith(remotePrefix)) {
+          remoteBranchShortName = selectedBranch.slice(remotePrefix.length);
+        } else if (selectedBranch.startsWith(`${selectedRemote}/`)) {
+          remoteBranchShortName = selectedBranch.slice(selectedRemote.length + 1);
+        } else if (selectedBranch.startsWith('refs/heads/')) {
+          remoteBranchShortName = selectedBranch.slice('refs/heads/'.length);
+        } else if (selectedBranch.startsWith('refs/')) {
+          remoteBranchShortName = selectedBranch.slice('refs/'.length);
+        }
+        const result = await api.worktree.pull(
+          worktree.path,
+          selectedRemote,
+          true,
+          remoteBranchShortName
+        );

         if (result.success && result.result) {
           if (result.result.hasConflicts) {
@@ -510,9 +510,12 @@ export function StashChangesDialog({
                   A descriptive message helps identify this stash later. Press{' '}
                   <kbd className="px-1 py-0.5 text-[10px] bg-muted rounded border">
                     {typeof navigator !== 'undefined' &&
-                    ((navigator as any).userAgentData?.platform || navigator.platform || '').includes(
-                      'Mac'
-                    )
+                    (
+                      (navigator as Navigator & { userAgentData?: { platform?: string } }).userAgentData
+                        ?.platform ||
+                      navigator.platform ||
+                      ''
+                    ).includes('Mac')
                       ? '⌘'
                       : 'Ctrl'}
                     +Enter
@@ -25,6 +25,18 @@ const logger = createLogger('BoardActions');

 const MAX_DUPLICATES = 50;

+function normalizeFeatureBranchName(branchName?: string | null): string | undefined {
+  if (!branchName) return undefined;
+  let normalized = branchName.trim();
+  if (!normalized) return undefined;
+
+  normalized = normalized.replace(/^refs\/heads\//, '');
+  normalized = normalized.replace(/^refs\/remotes\/[^/]+\//, '');
+  normalized = normalized.replace(/^(origin|upstream)\//, '');
+
+  return normalized || undefined;
+}
+
 /**
  * Removes a running task from all worktrees for a given project.
  * Used when stopping features to ensure the task is removed from all worktree contexts,
@@ -137,6 +149,8 @@ export function useBoardActions({
       skipTests: boolean;
       model: ModelAlias;
       thinkingLevel: ThinkingLevel;
+      reasoningEffort?: ReasoningEffort;
+      providerId?: string;
       branchName: string;
       priority: number;
       planningMode: PlanningMode;
@@ -182,7 +196,7 @@ export function useBoardActions({
       if (workMode === 'current') {
         // Work directly on current branch - use the current worktree's branch if not on main
         // This ensures features created on a non-main worktree are associated with that worktree
-        finalBranchName = currentWorktreeBranch || undefined;
+        finalBranchName = normalizeFeatureBranchName(currentWorktreeBranch);
       } else if (workMode === 'auto') {
         // Auto-generate a branch name based on feature title and timestamp
         // Create a slug from the title: lowercase, replace non-alphanumeric with hyphens
@@ -196,7 +210,7 @@ export function useBoardActions({
|
|||||||
finalBranchName = `feature/${titleSlug}-${randomSuffix}`;
|
finalBranchName = `feature/${titleSlug}-${randomSuffix}`;
|
||||||
} else {
|
} else {
|
||||||
// Custom mode - use provided branch name
|
// Custom mode - use provided branch name
|
||||||
finalBranchName = featureData.branchName || undefined;
|
finalBranchName = normalizeFeatureBranchName(featureData.branchName);
|
||||||
}
|
}
|
||||||
|
|
||||||
// Create worktree for 'auto' or 'custom' modes when we have a branch name
|
// Create worktree for 'auto' or 'custom' modes when we have a branch name
|
||||||
@@ -388,11 +402,11 @@ export function useBoardActions({
|
|||||||
if (workMode === 'current') {
|
if (workMode === 'current') {
|
||||||
// Work directly on current branch - use the current worktree's branch if not on main
|
// Work directly on current branch - use the current worktree's branch if not on main
|
||||||
// This ensures features updated on a non-main worktree are associated with that worktree
|
// This ensures features updated on a non-main worktree are associated with that worktree
|
||||||
finalBranchName = currentWorktreeBranch || undefined;
|
finalBranchName = normalizeFeatureBranchName(currentWorktreeBranch);
|
||||||
} else if (workMode === 'auto') {
|
} else if (workMode === 'auto') {
|
||||||
// Preserve existing branch name if one exists (avoid orphaning worktrees on edit)
|
// Preserve existing branch name if one exists (avoid orphaning worktrees on edit)
|
||||||
if (updates.branchName?.trim()) {
|
if (updates.branchName?.trim()) {
|
||||||
finalBranchName = updates.branchName;
|
finalBranchName = normalizeFeatureBranchName(updates.branchName);
|
||||||
} else {
|
} else {
|
||||||
// Auto-generate a branch name based on feature title
|
// Auto-generate a branch name based on feature title
|
||||||
// Create a slug from the title: lowercase, replace non-alphanumeric with hyphens
|
// Create a slug from the title: lowercase, replace non-alphanumeric with hyphens
|
||||||
@@ -406,7 +420,7 @@ export function useBoardActions({
|
|||||||
finalBranchName = `feature/${titleSlug}-${randomSuffix}`;
|
finalBranchName = `feature/${titleSlug}-${randomSuffix}`;
|
||||||
}
|
}
|
||||||
} else {
|
} else {
|
||||||
finalBranchName = updates.branchName || undefined;
|
finalBranchName = normalizeFeatureBranchName(updates.branchName);
|
||||||
}
|
}
|
||||||
|
|
||||||
// Create worktree for 'auto' or 'custom' modes when we have a branch name
|
// Create worktree for 'auto' or 'custom' modes when we have a branch name
|
||||||
|
|||||||
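The normalization above can be sketched as a standalone function. The regex steps are taken verbatim from the diff; the function signature and the leading null guard are assumptions, since the helper's declaration line falls outside this hunk.

```typescript
// Sketch of the branch-name normalizer introduced in the diff. The three
// replace() calls mirror the added lines; the signature is assumed.
function normalizeFeatureBranchName(branchName: string | undefined): string | undefined {
  if (!branchName) return undefined;
  let normalized = branchName.trim();
  if (!normalized) return undefined;

  // Strip a full local ref prefix, then a full remote ref prefix,
  // then a bare remote prefix like 'origin/' or 'upstream/'.
  normalized = normalized.replace(/^refs\/heads\//, '');
  normalized = normalized.replace(/^refs\/remotes\/[^/]+\//, '');
  normalized = normalized.replace(/^(origin|upstream)\//, '');

  return normalized || undefined;
}

console.log(normalizeFeatureBranchName('refs/heads/feature/login')); // feature/login
console.log(normalizeFeatureBranchName('origin/main')); // main
```

Because only anchored prefixes are stripped, a branch that legitimately contains `origin` later in its name (for example `feature/origin-sync`) passes through unchanged.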
@@ -1,5 +1,5 @@
 // @ts-nocheck - column filtering logic with dependency resolution and status mapping
-import { useMemo, useCallback } from 'react';
+import { useMemo, useCallback, useEffect, useRef } from 'react';
 import { Feature, useAppStore } from '@/store/app-store';
 import {
   createFeatureMap,

@@ -28,6 +28,45 @@ export function useBoardColumnFeatures({
   currentWorktreeBranch,
   projectPath,
 }: UseBoardColumnFeaturesProps) {
+  // Get recently completed features from store for race condition protection
+  const recentlyCompletedFeatures = useAppStore((state) => state.recentlyCompletedFeatures);
+  const clearRecentlyCompletedFeatures = useAppStore(
+    (state) => state.clearRecentlyCompletedFeatures
+  );
+
+  // Track previous feature IDs to detect when features list has been refreshed
+  const prevFeatureIdsRef = useRef<Set<string>>(new Set());
+
+  // Clear recently completed features when the cache refreshes with updated statuses.
+  //
+  // RACE CONDITION SCENARIO THIS PREVENTS:
+  // 1. Feature completes on server -> status becomes 'verified'/'completed' on disk
+  // 2. Server emits auto_mode_feature_complete event
+  // 3. Frontend receives event -> removes feature from runningTasks, adds to recentlyCompletedFeatures
+  // 4. React Query invalidates features query, triggers async refetch
+  // 5. RACE: Before refetch completes, component may re-render with stale cache data
+  //    where status='backlog' and feature is no longer in runningTasks
+  // 6. This hook prevents the feature from appearing in backlog during that window
+  //
+  // When the refetch completes with fresh data (status='verified'/'completed'),
+  // this effect clears the recentlyCompletedFeatures set since it's no longer needed.
+  useEffect(() => {
+    const currentIds = new Set(features.map((f) => f.id));
+
+    // Check if any recently completed features now have terminal statuses in the new data
+    // If so, we can clear the tracking since the cache is now fresh
+    const hasUpdatedStatus = Array.from(recentlyCompletedFeatures).some((featureId) => {
+      const feature = features.find((f) => f.id === featureId);
+      return feature && (feature.status === 'verified' || feature.status === 'completed');
+    });
+
+    if (hasUpdatedStatus) {
+      clearRecentlyCompletedFeatures();
+    }
+
+    prevFeatureIdsRef.current = currentIds;
+  }, [features, recentlyCompletedFeatures, clearRecentlyCompletedFeatures]);
+
   // Memoize column features to prevent unnecessary re-renders
   const columnFeaturesMap = useMemo(() => {
     // Use a more flexible type to support dynamic pipeline statuses

@@ -44,6 +83,9 @@ export function useBoardColumnFeatures({
     // briefly appearing in backlog during the timing gap between when the server
     // starts executing a feature and when the UI receives the event/status update.
     const allRunningTaskIds = new Set(runningAutoTasksAllWorktrees);
+    // Get recently completed features for additional race condition protection
+    // These features should not appear in backlog even if cache has stale status
+    const recentlyCompleted = recentlyCompletedFeatures;

     // Filter features by search query (case-insensitive)
     const normalizedQuery = searchQuery.toLowerCase().trim();

@@ -148,12 +190,19 @@ export function useBoardColumnFeatures({
       // Filter all items by worktree, including backlog
       // This ensures backlog items with a branch assigned only show in that branch
       //
-      // 'ready' and 'interrupted' are transitional statuses that don't have dedicated columns:
+      // 'merge_conflict', 'ready', and 'interrupted' are backlog-lane statuses that don't
+      // have dedicated columns:
+      // - 'merge_conflict': Automatic merge failed; user must resolve conflicts before restart
       // - 'ready': Feature has an approved plan, waiting to be picked up for execution
       // - 'interrupted': Feature execution was aborted (e.g., user stopped it, server restart)
       // Both display in the backlog column and need the same allRunningTaskIds race-condition
       // protection as 'backlog' to prevent briefly flashing in backlog when already executing.
-      if (status === 'backlog' || status === 'ready' || status === 'interrupted') {
+      if (
+        status === 'backlog' ||
+        status === 'merge_conflict' ||
+        status === 'ready' ||
+        status === 'interrupted'
+      ) {
         // IMPORTANT: Check if this feature is running on ANY worktree before placing in backlog.
         // This prevents a race condition where the feature has started executing on the server
         // (and is tracked in a different worktree's running list) but the disk status hasn't

@@ -165,6 +214,14 @@ export function useBoardColumnFeatures({
         if (matchesWorktree) {
           map.in_progress.push(f);
         }
+      } else if (recentlyCompleted.has(f.id)) {
+        // Feature recently completed - skip placing in backlog to prevent race condition
+        // where stale cache has status='backlog' but feature actually completed.
+        // The feature will be placed correctly once the cache refreshes.
+        // Log for debugging (can remove after verification)
+        console.debug(
+          `Feature ${f.id} recently completed - skipping backlog placement during cache refresh`
+        );
       } else if (matchesWorktree) {
         map.backlog.push(f);
       }

@@ -231,6 +288,7 @@ export function useBoardColumnFeatures({
     currentWorktreePath,
     currentWorktreeBranch,
     projectPath,
+    recentlyCompletedFeatures,
   ]);

   const getColumnFeatures = useCallback(
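The backlog guard the hunks above add can be reduced to a small pure function. The ordering of the checks (running task first, recently-completed second) and the names mirror the diff; the hook plumbing around them is omitted, and `placeFeature` itself is an illustrative name, not part of the codebase.

```typescript
type Feature = { id: string; status: string };

// Decide where a backlog-lane feature is placed. recentlyCompleted mirrors the
// store-backed set from the diff: features in it are withheld from the backlog
// column while the refetched cache may still carry a stale 'backlog' status.
function placeFeature(
  feature: Feature,
  runningTaskIds: Set<string>,
  recentlyCompleted: Set<string>
): 'in_progress' | 'skipped' | 'backlog' {
  if (runningTaskIds.has(feature.id)) return 'in_progress'; // already executing on some worktree
  if (recentlyCompleted.has(feature.id)) return 'skipped'; // stale cache; wait for refetch
  return 'backlog';
}
```

The running-task check must come first: a feature can briefly appear in both sets during the hand-off, and the in-progress placement is the one the user should see.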
@@ -196,7 +196,7 @@ export function useBoardDragDrop({

     // Handle different drag scenarios
     // Note: Worktrees are created server-side at execution time based on feature.branchName
-    if (draggedFeature.status === 'backlog') {
+    if (draggedFeature.status === 'backlog' || draggedFeature.status === 'merge_conflict') {
       // From backlog
       if (targetStatus === 'in_progress') {
         // Use helper function to handle concurrency check and start implementation
@@ -73,6 +73,10 @@ export function useBoardEffects({
     const checkAllContexts = async () => {
       const featuresWithPotentialContext = features.filter(
         (f) =>
+          f.status === 'backlog' ||
+          f.status === 'merge_conflict' ||
+          f.status === 'ready' ||
+          f.status === 'interrupted' ||
           f.status === 'in_progress' ||
           f.status === 'waiting_approval' ||
           f.status === 'verified' ||
@@ -61,7 +61,7 @@ export function useBoardFeatures({ currentProject }: UseBoardFeaturesProps) {
     } catch {
       setPersistedCategories([]);
     }
-  }, [currentProject, loadFeatures]);
+  }, [currentProject]);

   // Save a new category to the persisted categories file
   const saveCategory = useCallback(

@@ -161,6 +161,7 @@ export function useBoardFeatures({ currentProject }: UseBoardFeaturesProps) {
     });

     return unsubscribe;
+    // eslint-disable-next-line react-hooks/exhaustive-deps -- loadFeatures is a stable ref from React Query
   }, [currentProject]);

   // Check for interrupted features on mount
@@ -32,6 +32,14 @@ export function useBoardPersistence({ currentProject }: UseBoardPersistenceProps
     ) => {
       if (!currentProject) return;

+      // Cancel any in-flight refetches to prevent them from overwriting our optimistic update.
+      // Without this, a slow background refetch (e.g., from a prior create/invalidate) can
+      // resolve after setQueryData and overwrite the cache with stale data.
+      await queryClient.cancelQueries({
+        queryKey: queryKeys.features.all(currentProject.path),
+        exact: true,
+      });
+
       // Capture previous cache snapshot for rollback on error
       const previousFeatures = queryClient.getQueryData<Feature[]>(
         queryKeys.features.all(currentProject.path)

@@ -123,6 +131,12 @@ export function useBoardPersistence({ currentProject }: UseBoardPersistenceProps
         throw new Error('Features API not available');
       }

+      // Cancel any in-flight refetches to prevent them from overwriting our optimistic update
+      await queryClient.cancelQueries({
+        queryKey: queryKeys.features.all(currentProject.path),
+        exact: true,
+      });
+
       // Capture previous cache snapshot for synchronous rollback on error
       const previousFeatures = queryClient.getQueryData<Feature[]>(
         queryKeys.features.all(currentProject.path)

@@ -175,6 +189,12 @@ export function useBoardPersistence({ currentProject }: UseBoardPersistenceProps
     async (featureId: string) => {
       if (!currentProject) return;

+      // Cancel any in-flight refetches to prevent them from overwriting our optimistic update
+      await queryClient.cancelQueries({
+        queryKey: queryKeys.features.all(currentProject.path),
+        exact: true,
+      });
+
       // Optimistically remove from React Query cache for immediate board refresh
       const previousFeatures = queryClient.getQueryData<Feature[]>(
         queryKeys.features.all(currentProject.path)
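The cancel-before-write ordering in these hunks can be demonstrated without React Query. The toy cache below stands in for the real `queryClient` (an assumption for illustration): an in-flight "refetch" that resolves after the optimistic `setQueryData` only clobbers the cache if it was not cancelled first.

```typescript
type Feature = { id: string; status: string };

// Minimal stand-in for the query cache; not the @tanstack/react-query API.
class MiniCache {
  private data: Feature[] = [];
  private cancelled = false;

  getQueryData(): Feature[] {
    return this.data;
  }
  setQueryData(next: Feature[]): void {
    this.data = next;
  }
  cancelQueries(): void {
    this.cancelled = true;
  }
  // Returns a callback that "resolves" the in-flight refetch later;
  // the result is discarded if the query was cancelled in the meantime.
  startRefetch(result: Feature[]): () => void {
    this.cancelled = false;
    return () => {
      if (!this.cancelled) this.data = result;
    };
  }
}

function run(cancelFirst: boolean): string {
  const cache = new MiniCache();
  cache.setQueryData([{ id: 'f1', status: 'backlog' }]);
  // A slow background refetch starts while the cache still holds stale data.
  const resolveRefetch = cache.startRefetch([{ id: 'f1', status: 'backlog' }]);
  if (cancelFirst) cache.cancelQueries();
  cache.setQueryData([{ id: 'f1', status: 'in_progress' }]); // optimistic update
  resolveRefetch(); // refetch lands after our write
  return cache.getQueryData()[0].status;
}
```

With `cancelFirst = true` the optimistic `'in_progress'` survives; without the cancel, the late refetch silently restores the stale `'backlog'` status, which is exactly the bug the diff's comment describes.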
@@ -161,6 +161,7 @@ function VirtualizedList<Item extends VirtualListItem>({
       const resolvedHeight = measured ?? estimatedItemHeight;
       return resolvedHeight + itemGap;
     });
+    // eslint-disable-next-line react-hooks/exhaustive-deps
   }, [items, estimatedItemHeight, itemGap, measureVersion]);

   const itemStarts = useMemo(() => {
@@ -70,7 +70,8 @@ interface WorktreeActionsDropdownProps {
   hasRemoteBranch: boolean;
   isPulling: boolean;
   isPushing: boolean;
-  isStartingDevServer: boolean;
+  isStartingAnyDevServer: boolean;
+  isDevServerStarting: boolean;
   isDevServerRunning: boolean;
   devServerInfo?: DevServerInfo;
   gitRepoStatus: GitRepoStatus;

@@ -244,7 +245,8 @@ export function WorktreeActionsDropdown({
   hasRemoteBranch,
   isPulling,
   isPushing,
-  isStartingDevServer,
+  isStartingAnyDevServer,
+  isDevServerStarting,
   isDevServerRunning,
   devServerInfo,
   gitRepoStatus,

@@ -550,20 +552,26 @@ export function WorktreeActionsDropdown({
           <div className="flex items-center">
             <DropdownMenuItem
               onClick={() => onStartDevServer(worktree)}
-              disabled={isStartingDevServer}
+              disabled={isStartingAnyDevServer || isDevServerStarting}
               className="text-xs flex-1 pr-0 rounded-r-none"
             >
               <Play
-                className={cn('w-3.5 h-3.5 mr-2', isStartingDevServer && 'animate-pulse')}
+                className={cn(
+                  'w-3.5 h-3.5 mr-2',
+                  (isStartingAnyDevServer || isDevServerStarting) && 'animate-pulse'
+                )}
               />
-              {isStartingDevServer ? 'Starting...' : 'Start Dev Server'}
+              {isStartingAnyDevServer || isDevServerStarting
+                ? 'Starting...'
+                : 'Start Dev Server'}
             </DropdownMenuItem>
             <DropdownMenuSubTrigger
               className={cn(
                 'text-xs px-1 rounded-l-none border-l border-border/30 h-8',
-                isStartingDevServer && 'opacity-50 cursor-not-allowed'
+                (isStartingAnyDevServer || isDevServerStarting) &&
+                  'opacity-50 cursor-not-allowed'
               )}
-              disabled={isStartingDevServer}
+              disabled={isStartingAnyDevServer || isDevServerStarting}
             />
           </div>
           <DropdownMenuSubContent>{viewDevServerLogsItem}</DropdownMenuSubContent>
@@ -31,6 +31,8 @@ export interface WorktreeDropdownItemProps {
   cardCount?: number;
   /** Whether the dev server is running for this worktree */
   devServerRunning?: boolean;
+  /** Whether the dev server is starting for this worktree */
+  devServerStarting?: boolean;
   /** Dev server information if running */
   devServerInfo?: DevServerInfo;
   /** Whether auto-mode is running for this worktree */

@@ -64,6 +66,7 @@ export function WorktreeDropdownItem({
   isRunning,
   cardCount,
   devServerRunning,
+  devServerStarting,
   devServerInfo,
   isAutoModeRunning = false,
   isTestRunning = false,

@@ -154,6 +157,16 @@ export function WorktreeDropdownItem({
         </span>
       )}

+      {/* Dev server starting indicator */}
+      {devServerStarting && (
+        <span
+          className="inline-flex items-center justify-center h-4 w-4 text-amber-500"
+          title="Dev server starting..."
+        >
+          <Spinner size="xs" variant="current" />
+        </span>
+      )}
+
       {/* Test running indicator */}
       {isTestRunning && (
         <span
@@ -53,6 +53,8 @@ export interface WorktreeDropdownProps {
   branchCardCounts?: Record<string, number>;
   /** Function to check if dev server is running for a worktree */
   isDevServerRunning: (worktree: WorktreeInfo) => boolean;
+  /** Function to check if dev server is starting for a worktree */
+  isDevServerStarting: (worktree: WorktreeInfo) => boolean;
   /** Function to get dev server info for a worktree */
   getDevServerInfo: (worktree: WorktreeInfo) => DevServerInfo | undefined;
   /** Function to check if auto-mode is running for a worktree */

@@ -78,7 +80,7 @@ export interface WorktreeDropdownProps {
   // Action dropdown props
   isPulling: boolean;
   isPushing: boolean;
-  isStartingDevServer: boolean;
+  isStartingAnyDevServer: boolean;
   aheadCount: number;
   behindCount: number;
   hasRemoteBranch: boolean;

@@ -178,6 +180,7 @@ export function WorktreeDropdown({
   isActivating,
   branchCardCounts,
   isDevServerRunning,
+  isDevServerStarting,
   getDevServerInfo,
   isAutoModeRunningForWorktree,
   isTestRunningForWorktree,

@@ -196,7 +199,7 @@ export function WorktreeDropdown({
   // Action dropdown props
   isPulling,
   isPushing,
-  isStartingDevServer,
+  isStartingAnyDevServer,
   aheadCount,
   behindCount,
   hasRemoteBranch,

@@ -270,6 +273,7 @@ export function WorktreeDropdown({
     if (!selectedWorktree) {
       return {
         devServerRunning: false,
+        devServerStarting: false,
         devServerInfo: undefined,
         autoModeRunning: false,
         isRunning: false,

@@ -279,6 +283,7 @@ export function WorktreeDropdown({
     }
     return {
       devServerRunning: isDevServerRunning(selectedWorktree),
+      devServerStarting: isDevServerStarting(selectedWorktree),
       devServerInfo: getDevServerInfo(selectedWorktree),
       autoModeRunning: isAutoModeRunningForWorktree(selectedWorktree),
       isRunning: hasRunningFeatures(selectedWorktree),

@@ -288,6 +293,7 @@ export function WorktreeDropdown({
   }, [
     selectedWorktree,
     isDevServerRunning,
+    isDevServerStarting,
     getDevServerInfo,
     isAutoModeRunningForWorktree,
     hasRunningFeatures,

@@ -360,6 +366,16 @@ export function WorktreeDropdown({
             </span>
           )}

+          {/* Dev server starting indicator */}
+          {selectedStatus.devServerStarting && (
+            <span
+              className="inline-flex items-center justify-center h-4 w-4 text-amber-500 shrink-0"
+              title="Dev server starting..."
+            >
+              <Spinner size="xs" variant="current" />
+            </span>
+          )}
+
           {/* Test running indicator */}
           {selectedStatus.testRunning && (
             <span

@@ -468,6 +484,7 @@ export function WorktreeDropdown({
               isRunning={hasRunningFeatures(mainWorktree)}
               cardCount={branchCardCounts?.[mainWorktree.branch]}
               devServerRunning={isDevServerRunning(mainWorktree)}
+              devServerStarting={isDevServerStarting(mainWorktree)}
               devServerInfo={getDevServerInfo(mainWorktree)}
               isAutoModeRunning={isAutoModeRunningForWorktree(mainWorktree)}
               isTestRunning={isTestRunningForWorktree(mainWorktree)}

@@ -493,6 +510,7 @@ export function WorktreeDropdown({
               isRunning={hasRunningFeatures(worktree)}
               cardCount={branchCardCounts?.[worktree.branch]}
               devServerRunning={isDevServerRunning(worktree)}
+              devServerStarting={isDevServerStarting(worktree)}
               devServerInfo={getDevServerInfo(worktree)}
               isAutoModeRunning={isAutoModeRunningForWorktree(worktree)}
               isTestRunning={isTestRunningForWorktree(worktree)}

@@ -543,7 +561,8 @@ export function WorktreeDropdown({
           }
           isPulling={isPulling}
           isPushing={isPushing}
-          isStartingDevServer={isStartingDevServer}
+          isStartingDevServer={isStartingAnyDevServer}
+          isDevServerStarting={isDevServerStarting(selectedWorktree)}
           isDevServerRunning={isDevServerRunning(selectedWorktree)}
           devServerInfo={getDevServerInfo(selectedWorktree)}
           gitRepoStatus={gitRepoStatus}
@@ -7,15 +7,18 @@ import {
   DropdownMenuSeparator,
   DropdownMenuTrigger,
 } from '@/components/ui/dropdown-menu';
-import { GitBranch, ChevronDown, CircleDot, Check } from 'lucide-react';
+import { GitBranch, ChevronDown, CircleDot, Check, Globe } from 'lucide-react';
 import { Spinner } from '@/components/ui/spinner';
 import { cn } from '@/lib/utils';
-import type { WorktreeInfo } from '../types';
+import type { WorktreeInfo, DevServerInfo } from '../types';

 interface WorktreeMobileDropdownProps {
   worktrees: WorktreeInfo[];
   isWorktreeSelected: (worktree: WorktreeInfo) => boolean;
   hasRunningFeatures: (worktree: WorktreeInfo) => boolean;
+  isDevServerRunning: (worktree: WorktreeInfo) => boolean;
+  isDevServerStarting: (worktree: WorktreeInfo) => boolean;
+  getDevServerInfo: (worktree: WorktreeInfo) => DevServerInfo | undefined;
   isActivating: boolean;
   branchCardCounts?: Record<string, number>;
   onSelectWorktree: (worktree: WorktreeInfo) => void;

@@ -25,6 +28,9 @@ export function WorktreeMobileDropdown({
   worktrees,
   isWorktreeSelected,
   hasRunningFeatures,
+  isDevServerRunning,
+  isDevServerStarting,
+  getDevServerInfo,
   isActivating,
   branchCardCounts,
   onSelectWorktree,

@@ -59,6 +65,9 @@ export function WorktreeMobileDropdown({
         {worktrees.map((worktree) => {
           const isSelected = isWorktreeSelected(worktree);
           const isRunning = hasRunningFeatures(worktree);
+          const devServerRunning = isDevServerRunning(worktree);
+          const devServerStarting = isDevServerStarting(worktree);
+          const devServerInfo = getDevServerInfo(worktree);
           const cardCount = branchCardCounts?.[worktree.branch];
           const hasChanges = worktree.hasChanges;
           const changedFilesCount = worktree.changedFilesCount;

@@ -103,6 +112,10 @@ export function WorktreeMobileDropdown({
                   {changedFilesCount ?? '!'}
                 </span>
               )}
+              {devServerRunning && devServerInfo?.urlDetected === true && (
+                <Globe className="w-3 h-3 text-green-500" />
+              )}
+              {devServerStarting && <Spinner size="xs" variant="muted" />}
             </div>
           </DropdownMenuItem>
         );
@@ -34,7 +34,8 @@ interface WorktreeTabProps {
   isSwitching: boolean;
   isPulling: boolean;
   isPushing: boolean;
-  isStartingDevServer: boolean;
+  isStartingAnyDevServer: boolean;
+  isDevServerStarting: boolean;
   aheadCount: number;
   behindCount: number;
   hasRemoteBranch: boolean;

@@ -146,7 +147,8 @@ export function WorktreeTab({
   isSwitching,
   isPulling,
   isPushing,
-  isStartingDevServer,
+  isStartingAnyDevServer,
+  isDevServerStarting,
   aheadCount,
   behindCount,
   hasRemoteBranch,

@@ -531,7 +533,8 @@ export function WorktreeTab({
           trackingRemote={trackingRemote}
           isPulling={isPulling}
           isPushing={isPushing}
-          isStartingDevServer={isStartingDevServer}
+          isStartingDevServer={isStartingAnyDevServer}
+          isDevServerStarting={isDevServerStarting}
           isDevServerRunning={isDevServerRunning}
           devServerInfo={devServerInfo}
           gitRepoStatus={gitRepoStatus}
@@ -56,7 +56,8 @@ function showUrlDetectedToast(url: string, port: number): void {
 }
 
 export function useDevServers({ projectPath }: UseDevServersOptions) {
-const [isStartingDevServer, setIsStartingDevServer] = useState(false);
+const [isStartingAnyDevServer, setIsStartingAnyDevServer] = useState(false);
+const [startingServers, setStartingServers] = useState<Set<string>>(new Set());
 const [runningDevServers, setRunningDevServers] = useState<Map<string, DevServerInfo>>(new Map());
 
 // Track which worktrees have had their url-detected toast shown to prevent re-triggering
@@ -327,7 +328,16 @@ export function useDevServers({ projectPath }: UseDevServersOptions) {
 if (!api?.worktree?.onDevServerLogEvent) return;
 
 const unsubscribe = api.worktree.onDevServerLogEvent((event) => {
-if (event.type === 'dev-server:url-detected') {
+if (event.type === 'dev-server:starting') {
+const { worktreePath } = event.payload;
+const key = normalizePath(worktreePath);
+setStartingServers((prev) => {
+const next = new Set(prev);
+next.add(key);
+return next;
+});
+logger.info(`Dev server starting for ${worktreePath} (reactive update)`);
+} else if (event.type === 'dev-server:url-detected') {
 const { worktreePath, url, port } = event.payload;
 const key = normalizePath(worktreePath);
 // Clear the port detection timeout since URL was successfully detected
@@ -387,6 +397,15 @@ export function useDevServers({ projectPath }: UseDevServersOptions) {
 // Reactively add/update the server when it starts
 const { worktreePath, port, url } = event.payload;
 const key = normalizePath(worktreePath);
+
+// Remove from starting set
+setStartingServers((prev) => {
+if (!prev.has(key)) return prev;
+const next = new Set(prev);
+next.delete(key);
+return next;
+});
+
 // Clear previous toast tracking for this key so a new detection triggers a fresh toast
 toastShownForRef.current.delete(key);
 setRunningDevServers((prev) => {
@@ -409,11 +428,12 @@ export function useDevServers({ projectPath }: UseDevServersOptions) {
 
 // Cleanup all port detection timers on unmount
 useEffect(() => {
+const timers = portDetectionTimers.current;
 return () => {
-for (const timer of portDetectionTimers.current.values()) {
+for (const timer of timers.values()) {
 clearTimeout(timer);
 }
-portDetectionTimers.current.clear();
+timers.clear();
 };
 }, []);
 
@@ -427,8 +447,8 @@ export function useDevServers({ projectPath }: UseDevServersOptions) {
 
 const handleStartDevServer = useCallback(
 async (worktree: WorktreeInfo) => {
-if (isStartingDevServer) return;
-setIsStartingDevServer(true);
+if (isStartingAnyDevServer) return;
+setIsStartingAnyDevServer(true);
 
 try {
 const api = getElectronAPI();
@@ -470,10 +490,10 @@ export function useDevServers({ projectPath }: UseDevServersOptions) {
 description: error instanceof Error ? error.message : undefined,
 });
 } finally {
-setIsStartingDevServer(false);
+setIsStartingAnyDevServer(false);
 }
 },
-[isStartingDevServer, projectPath, startPortDetectionTimer]
+[isStartingAnyDevServer, projectPath, startPortDetectionTimer]
 );
 
 const handleStopDevServer = useCallback(
@@ -543,6 +563,13 @@ export function useDevServers({ projectPath }: UseDevServersOptions) {
 [runningDevServers, getWorktreeKey]
 );
 
+const isDevServerStarting = useCallback(
+(worktree: WorktreeInfo) => {
+return startingServers.has(getWorktreeKey(worktree));
+},
+[startingServers, getWorktreeKey]
+);
+
 const getDevServerInfo = useCallback(
 (worktree: WorktreeInfo) => {
 return runningDevServers.get(getWorktreeKey(worktree));
@@ -551,10 +578,11 @@ export function useDevServers({ projectPath }: UseDevServersOptions) {
 );
 
 return {
-isStartingDevServer,
+isStartingAnyDevServer,
 runningDevServers,
 getWorktreeKey,
 isDevServerRunning,
+isDevServerStarting,
 getDevServerInfo,
 handleStartDevServer,
 handleStopDevServer,
@@ -1,4 +1,4 @@
-import { useEffect, useCallback, useRef, startTransition } from 'react';
+import { useEffect, useCallback, useRef, startTransition, useMemo } from 'react';
 import { useQueryClient } from '@tanstack/react-query';
 import { useAppStore } from '@/store/app-store';
 import { useWorktrees as useWorktreesQuery } from '@/hooks/queries';
@@ -26,7 +26,7 @@ export function useWorktrees({
 
 // Use the React Query hook
 const { data, isLoading, refetch } = useWorktreesQuery(projectPath);
-const worktrees = (data?.worktrees ?? []) as WorktreeInfo[];
+const worktrees = useMemo(() => (data?.worktrees ?? []) as WorktreeInfo[], [data?.worktrees]);
 
 // Sync worktrees to Zustand store when they change.
 // Use a ref to track the previous worktrees and skip the store update when the
@@ -87,8 +87,9 @@ export function WorktreePanel({
 } = useWorktrees({ projectPath, refreshTrigger, onRemovedWorktrees });
 
 const {
-isStartingDevServer,
+isStartingAnyDevServer,
 isDevServerRunning,
+isDevServerStarting,
 getDevServerInfo,
 handleStartDevServer,
 handleStopDevServer,
@@ -211,7 +212,6 @@ export function WorktreePanel({
 // Read store state directly inside the effect to avoid a dependency cycle
 // (the effect writes to the same state it would otherwise depend on)
 useEffect(() => {
-const mainWt = worktrees.find((w) => w.isMain);
 const otherWts = worktrees.filter((w) => !w.isMain);
 const otherSlotCount = Math.max(0, pinnedWorktreesCount);
 
@@ -487,7 +487,7 @@ export function WorktreePanel({
 description: `Stopped tests in ${worktree.branch}`,
 });
 } else {
-toast.error('Failed to stop tests', {
+toast.error(result.error || 'Failed to stop tests', {
 description: result.error || 'Unknown error',
 });
 }
@@ -982,6 +982,9 @@ export function WorktreePanel({
 worktrees={worktrees}
 isWorktreeSelected={isWorktreeSelected}
 hasRunningFeatures={hasRunningFeatures}
+isDevServerRunning={isDevServerRunning}
+isDevServerStarting={isDevServerStarting}
+getDevServerInfo={getDevServerInfo}
 isActivating={isActivating}
 branchCardCounts={branchCardCounts}
 onSelectWorktree={handleSelectWorktree}
@@ -1017,7 +1020,8 @@ export function WorktreePanel({
 trackingRemote={getTrackingRemote(selectedWorktree.path)}
 isPulling={isPulling}
 isPushing={isPushing}
-isStartingDevServer={isStartingDevServer}
+isStartingAnyDevServer={isStartingAnyDevServer}
+isDevServerStarting={isDevServerStarting(selectedWorktree)}
 isDevServerRunning={isDevServerRunning(selectedWorktree)}
 devServerInfo={getDevServerInfo(selectedWorktree)}
 gitRepoStatus={gitRepoStatus}
@@ -1245,6 +1249,7 @@ export function WorktreePanel({
 isActivating={isActivating}
 branchCardCounts={branchCardCounts}
 isDevServerRunning={isDevServerRunning}
+isDevServerStarting={isDevServerStarting}
 getDevServerInfo={getDevServerInfo}
 isAutoModeRunningForWorktree={isAutoModeRunningForWorktree}
 isTestRunningForWorktree={isTestRunningForWorktree}
@@ -1261,7 +1266,7 @@ export function WorktreePanel({
 onCreateBranch={onCreateBranch}
 isPulling={isPulling}
 isPushing={isPushing}
-isStartingDevServer={isStartingDevServer}
+isStartingAnyDevServer={isStartingAnyDevServer}
 aheadCount={aheadCount}
 behindCount={behindCount}
 hasRemoteBranch={hasRemoteBranch}
@@ -1322,6 +1327,7 @@ export function WorktreePanel({
 isRunning={hasRunningFeatures(mainWorktree)}
 isActivating={isActivating}
 isDevServerRunning={isDevServerRunning(mainWorktree)}
+isDevServerStarting={isDevServerStarting(mainWorktree)}
 devServerInfo={getDevServerInfo(mainWorktree)}
 branches={branches}
 filteredBranches={filteredBranches}
@@ -1330,7 +1336,7 @@ export function WorktreePanel({
 isSwitching={isSwitching}
 isPulling={isPulling}
 isPushing={isPushing}
-isStartingDevServer={isStartingDevServer}
+isStartingAnyDevServer={isStartingAnyDevServer}
 aheadCount={aheadCount}
 behindCount={behindCount}
 hasRemoteBranch={hasRemoteBranch}
@@ -1413,6 +1419,7 @@ export function WorktreePanel({
 isRunning={hasRunningFeatures(worktree)}
 isActivating={isActivating}
 isDevServerRunning={isDevServerRunning(worktree)}
+isDevServerStarting={isDevServerStarting(worktree)}
 devServerInfo={getDevServerInfo(worktree)}
 branches={branches}
 filteredBranches={filteredBranches}
@@ -1421,7 +1428,7 @@ export function WorktreePanel({
 isSwitching={isSwitching}
 isPulling={isPulling}
 isPushing={isPushing}
-isStartingDevServer={isStartingDevServer}
+isStartingAnyDevServer={isStartingAnyDevServer}
 aheadCount={aheadCount}
 behindCount={behindCount}
 hasRemoteBranch={hasRemoteBranch}
@@ -242,8 +242,13 @@ export function ContextView() {
 const handleSelectFile = (file: ContextFile) => {
 // Note: Unsaved changes warning could be added here in the future
 // For now, silently proceed to avoid disrupting mobile UX flow
-loadFileContent(file);
+// Set selected file immediately for responsive UI feedback,
+// then load content asynchronously
+setSelectedFile(file);
+setEditedContent(file.content || '');
+setHasChanges(false);
 setIsPreviewMode(isMarkdownFilename(file.name));
+loadFileContent(file);
 };
 
 // Save current file
@@ -527,11 +532,13 @@ export function ContextView() {
 delete metadata.files[selectedFile.name];
 await saveMetadata(metadata);
 
+// Refresh file list before closing dialog so UI is updated when dialog dismisses
+await loadContextFiles();
+
 setIsDeleteDialogOpen(false);
 setSelectedFile(null);
 setEditedContent('');
 setHasChanges(false);
-await loadContextFiles();
 } catch (error) {
 logger.error('Failed to delete file:', error);
 }
@@ -591,12 +591,16 @@ export function DashboardView() {
 <Button
 size="icon"
 className="bg-linear-to-r from-brand-500 to-brand-600 hover:from-brand-600 hover:to-brand-700 text-white"
+data-testid="create-new-project-mobile"
 >
 <Plus className="w-4 h-4" />
 </Button>
 </DropdownMenuTrigger>
 <DropdownMenuContent align="end" className="w-56">
-<DropdownMenuItem onClick={handleNewProject}>
+<DropdownMenuItem
+onClick={handleNewProject}
+data-testid="quick-setup-option-mobile"
+>
 <Plus className="w-4 h-4 mr-2" />
 Quick Setup
 </DropdownMenuItem>
@@ -662,7 +666,7 @@ export function DashboardView() {
 <DropdownMenuContent align="end" className="w-56">
 <DropdownMenuItem
 onClick={handleNewProject}
-data-testid="quick-setup-option"
+data-testid="quick-setup-option-no-projects"
 >
 <Plus className="w-4 h-4 mr-2" />
 Quick Setup
@@ -749,14 +753,20 @@ export function DashboardView() {
 </Button>
 <DropdownMenu>
 <DropdownMenuTrigger asChild>
-<Button className="hidden sm:flex bg-linear-to-r from-brand-500 to-brand-600 hover:from-brand-600 hover:to-brand-700 text-white">
+<Button
+className="hidden sm:flex bg-linear-to-r from-brand-500 to-brand-600 hover:from-brand-600 hover:to-brand-700 text-white"
+data-testid="create-new-project-header"
+>
 <Plus className="w-4 h-4 mr-2" />
 New Project
 <ChevronDown className="w-4 h-4 ml-2" />
 </Button>
 </DropdownMenuTrigger>
 <DropdownMenuContent align="end" className="w-56">
-<DropdownMenuItem onClick={handleNewProject}>
+<DropdownMenuItem
+onClick={handleNewProject}
+data-testid="quick-setup-option-has-projects"
+>
 <Plus className="w-4 h-4 mr-2" />
 Quick Setup
 </DropdownMenuItem>
@@ -36,6 +36,8 @@ export interface CodeEditorHandle {
 redo: () => void;
 /** Returns the current text selection with line range, or null if nothing is selected */
 getSelection: () => { text: string; fromLine: number; toLine: number } | null;
+/** Returns the current editor content (may differ from store if onChange hasn't fired yet) */
+getValue: () => string | null;
 }
 
 interface CodeEditorProps {
@@ -465,6 +467,11 @@ export const CodeEditor = forwardRef<CodeEditorHandle, CodeEditorProps>(function
 const toLine = view.state.doc.lineAt(to).number;
 return { text, fromLine, toLine };
 },
+getValue: () => {
+const view = editorRef.current?.view;
+if (!view) return null;
+return view.state.doc.toString();
+},
 }),
 []
 );
@@ -0,0 +1,15 @@
+export function computeIsDirty(content: string, originalContent: string): boolean {
+return content !== originalContent;
+}
+
+export function updateTabWithContent<
+T extends { originalContent: string; content: string; isDirty: boolean },
+>(tab: T, content: string): T {
+return { ...tab, content, isDirty: computeIsDirty(content, tab.originalContent) };
+}
+
+export function markTabAsSaved<
+T extends { originalContent: string; content: string; isDirty: boolean },
+>(tab: T, content: string): T {
+return { ...tab, content, originalContent: content, isDirty: false };
+}
@@ -489,18 +489,38 @@ export function FileEditorView({ initialPath }: FileEditorViewProps) {
 
 // ─── Handle Save ─────────────────────────────────────────────
 const handleSave = useCallback(async () => {
-if (!activeTab || !activeTab.isDirty) return;
+// Get fresh state from the store to avoid stale closure issues
+const {
+tabs: currentTabs,
+activeTabId: currentActiveTabId,
+updateTabContent,
+} = useFileEditorStore.getState();
+
+if (!currentActiveTabId) return;
+
+const tab = currentTabs.find((t) => t.id === currentActiveTabId);
+if (!tab || !tab.isDirty) return;
+
+// Get the current editor content directly from CodeMirror to ensure
+// we save the latest content even if onChange hasn't fired yet
+const editorContent = editorRef.current?.getValue();
+const contentToSave = editorContent ?? tab.content;
+
+// Sync the editor content to the store before saving
+if (editorContent != null && editorContent !== tab.content) {
+updateTabContent(tab.id, editorContent);
+}
 
 try {
 const api = getElectronAPI();
-const result = await api.writeFile(activeTab.filePath, activeTab.content);
+const result = await api.writeFile(tab.filePath, contentToSave);
 
 if (result.success) {
-markTabSaved(activeTab.id, activeTab.content);
+markTabSaved(tab.id, contentToSave);
 // Refresh git status and inline diff after save
 loadGitStatus();
 if (showInlineDiff) {
-loadFileDiff(activeTab.filePath);
+loadFileDiff(tab.filePath);
 }
 } else {
 logger.error('Failed to save file:', result.error);
@@ -508,7 +528,7 @@ export function FileEditorView({ initialPath }: FileEditorViewProps) {
 } catch (error) {
 logger.error('Failed to save file:', error);
 }
-}, [activeTab, markTabSaved, loadGitStatus, showInlineDiff, loadFileDiff]);
+}, [markTabSaved, loadGitStatus, showInlineDiff, loadFileDiff]);
 
 // ─── Auto Save: save a specific tab by ID ───────────────────
 const saveTabById = useCallback(
@@ -584,6 +604,7 @@ export function FileEditorView({ initialPath }: FileEditorViewProps) {
 autoSaveTimerRef.current = null;
 }
 };
+// eslint-disable-next-line react-hooks/exhaustive-deps -- activeTab is accessed for isDirty/content only
 }, [editorAutoSave, editorAutoSaveDelay, activeTab?.isDirty, activeTab?.content, handleSave]);
 
 // ─── Handle Search ──────────────────────────────────────────
@@ -710,6 +731,7 @@ export function FileEditorView({ initialPath }: FileEditorViewProps) {
 model: resolveModelString(featureData.model),
 thinkingLevel: featureData.thinkingLevel,
 reasoningEffort: featureData.reasoningEffort,
+providerId: featureData.providerId,
 skipTests: featureData.skipTests,
 branchName: featureData.workMode === 'current' ? currentBranch : featureData.branchName,
 planningMode: featureData.planningMode,
@@ -1069,6 +1091,7 @@ export function FileEditorView({ initialPath }: FileEditorViewProps) {
 } else {
 useFileEditorStore.getState().setActiveFileGitDetails(null);
 }
+// eslint-disable-next-line react-hooks/exhaustive-deps -- activeTab accessed for specific properties only
 }, [activeTab?.filePath, activeTab?.isBinary, loadFileGitDetails]);
 
 // Load file diff when inline diff is enabled and active tab changes
@@ -1078,6 +1101,7 @@ export function FileEditorView({ initialPath }: FileEditorViewProps) {
 } else {
 useFileEditorStore.getState().setActiveFileDiff(null);
 }
+// eslint-disable-next-line react-hooks/exhaustive-deps -- activeTab accessed for specific properties only
 }, [
 showInlineDiff,
 activeTab?.filePath,
@@ -1,5 +1,6 @@
 import { create } from 'zustand';
 import { persist, type StorageValue } from 'zustand/middleware';
+import { updateTabWithContent, markTabAsSaved } from './file-editor-dirty-utils';
 
 export interface FileTreeNode {
 name: string;
@@ -262,17 +263,13 @@ export const useFileEditorStore = create<FileEditorState>()(
 
 updateTabContent: (tabId, content) => {
 set({
-tabs: get().tabs.map((t) =>
-t.id === tabId ? { ...t, content, isDirty: content !== t.originalContent } : t
-),
+tabs: get().tabs.map((t) => (t.id === tabId ? updateTabWithContent(t, content) : t)),
 });
 },
 
 markTabSaved: (tabId, content) => {
 set({
-tabs: get().tabs.map((t) =>
-t.id === tabId ? { ...t, content, originalContent: content, isDirty: false } : t
-),
+tabs: get().tabs.map((t) => (t.id === tabId ? markTabAsSaved(t, content) : t)),
 });
 },
 
Some files were not shown because too many files have changed in this diff.
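The dirty-state helpers added in the new file-editor-dirty-utils module are small enough to exercise on their own. The sketch below copies them out of the diff and walks the intended edit/save lifecycle; the `DirtyTab` alias is introduced here only for illustration and does not appear in the repository.

```typescript
// Helpers copied from the file-editor-dirty-utils module added in this diff.
type DirtyTab = { originalContent: string; content: string; isDirty: boolean };

function computeIsDirty(content: string, originalContent: string): boolean {
  return content !== originalContent;
}

// Returns a new tab with updated content; isDirty reflects divergence from the saved baseline.
function updateTabWithContent<T extends DirtyTab>(tab: T, content: string): T {
  return { ...tab, content, isDirty: computeIsDirty(content, tab.originalContent) };
}

// Returns a new tab whose saved baseline is reset to the just-written content.
function markTabAsSaved<T extends DirtyTab>(tab: T, content: string): T {
  return { ...tab, content, originalContent: content, isDirty: false };
}

// Edit makes the tab dirty; saving resets the baseline so it reads clean again.
const tab: DirtyTab = { originalContent: 'hello', content: 'hello', isDirty: false };
const edited = updateTabWithContent(tab, 'hello world');
const saved = markTabAsSaved(edited, 'hello world');
console.log(edited.isDirty, saved.isDirty, saved.originalContent);
```

Keeping these pure functions separate from the Zustand store is what lets `updateTabContent` and `markTabSaved` in the store collapse to one-line `map` calls in the last hunk above.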