diff --git a/CLAUDE.md b/CLAUDE.md index 40664601..d46d1284 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -166,7 +166,10 @@ Use `resolveModelString()` from `@automaker/model-resolver` to convert model ali ## Environment Variables - `ANTHROPIC_API_KEY` - Anthropic API key (or use Claude Code CLI auth) +- `HOST` - Host to bind server to (default: 0.0.0.0) +- `HOSTNAME` - Hostname for user-facing URLs (default: localhost) - `PORT` - Server port (default: 3008) - `DATA_DIR` - Data storage directory (default: ./data) - `ALLOWED_ROOT_DIRECTORY` - Restrict file operations to specific directory - `AUTOMAKER_MOCK_AGENT=true` - Enable mock agent mode for CI testing +- `VITE_HOSTNAME` - Hostname for frontend API URLs (default: localhost) diff --git a/README.md b/README.md index ef8c7bc4..60dba6d4 100644 --- a/README.md +++ b/README.md @@ -28,6 +28,7 @@ - [Quick Start](#quick-start) - [How to Run](#how-to-run) - [Development Mode](#development-mode) + - [Interactive TUI Launcher](#interactive-tui-launcher-recommended-for-new-users) - [Building for Production](#building-for-production) - [Testing](#testing) - [Linting](#linting) @@ -101,11 +102,9 @@ In the Discord, you can: ### Prerequisites -- **Node.js 18+** (tested with Node.js 22) +- **Node.js 22+** (required: >=22.0.0 <23.0.0) - **npm** (comes with Node.js) -- **Authentication** (choose one): - - **[Claude Code CLI](https://code.claude.com/docs/en/overview)** (recommended) - Install and authenticate, credentials used automatically - - **Anthropic API Key** - Direct API key for Claude Agent SDK ([get one here](https://console.anthropic.com/)) +- **[Claude Code CLI](https://code.claude.com/docs/en/overview)** - Install and authenticate with your Anthropic subscription. Automaker integrates with your authenticated Claude Code CLI to access Claude models. ### Quick Start @@ -117,30 +116,14 @@ cd automaker # 2. Install dependencies npm install -# 3. Build shared packages (can be skipped - npm run dev does it automatically) -npm run build:packages - -# 4. Start Automaker +# 3. Start Automaker npm run dev # Choose between: # 1. Web Application (browser at localhost:3007) # 2. Desktop Application (Electron - recommended) ``` -**Authentication Setup:** On first run, Automaker will automatically show a setup wizard where you can configure authentication. You can choose to: - -- Use **Claude Code CLI** (recommended) - Automaker will detect your CLI credentials automatically -- Enter an **API key** directly in the wizard - -If you prefer to set up authentication before running (e.g., for headless deployments or CI/CD), you can set it manually: - -```bash -# Option A: Environment variable -export ANTHROPIC_API_KEY="sk-ant-..." - -# Option B: Create .env file in project root -echo "ANTHROPIC_API_KEY=sk-ant-..." > .env -``` +**Authentication:** Automaker integrates with your authenticated Claude Code CLI. Make sure you have [installed and authenticated](https://code.claude.com/docs/en/quickstart) the Claude Code CLI before running Automaker. Your CLI credentials will be detected automatically. **For Development:** `npm run dev` starts the development server with Vite live reload and hot module replacement for fast refresh and instant updates as you make changes. 
@@ -179,6 +162,40 @@ npm run dev:electron:wsl:gpu npm run dev:web ``` +### Interactive TUI Launcher (Recommended for New Users) + +For a user-friendly interactive menu, use the built-in TUI launcher script: + +```bash +# Show interactive menu with all launch options +./start-automaker.sh + +# Or launch directly without menu +./start-automaker.sh web # Web browser +./start-automaker.sh electron # Desktop app +./start-automaker.sh electron-debug # Desktop + DevTools + +# Additional options +./start-automaker.sh --help # Show all available options +./start-automaker.sh --version # Show version information +./start-automaker.sh --check-deps # Verify project dependencies +./start-automaker.sh --no-colors # Disable colored output +./start-automaker.sh --no-history # Don't remember last choice +``` + +**Features:** + +- 🎨 Beautiful terminal UI with gradient colors and ASCII art +- ⌨️ Interactive menu (press 1-3 to select, Q to exit) +- 💾 Remembers your last choice +- ✅ Pre-flight checks (validates Node.js, npm, dependencies) +- 📏 Responsive layout (adapts to terminal size) +- ⏱️ 30-second timeout for hands-free selection +- 🌐 Cross-shell compatible (bash/zsh) + +**History File:** +Your last selected mode is saved in `~/.automaker_launcher_history` for quick re-runs. + ### Building for Production #### Web Application @@ -239,16 +256,9 @@ docker-compose logs -f docker-compose down ``` -##### Configuration +##### Authentication -Create a `.env` file in the project root if using API key authentication: - -```bash -# Optional: Anthropic API key (not needed if using Claude CLI authentication) -ANTHROPIC_API_KEY=sk-ant-... -``` - -**Note:** Most users authenticate via Claude CLI instead of API keys. See [Claude CLI Authentication](#claude-cli-authentication-optional) below. +Automaker integrates with your authenticated Claude Code CLI. To use CLI authentication in Docker, mount your Claude CLI config directory (see [Claude CLI Authentication](#claude-cli-authentication) below). ##### Working with Projects (Host Directory Access) @@ -262,9 +272,9 @@ services: - /path/to/your/project:/projects/your-project ``` -##### Claude CLI Authentication (Optional) +##### Claude CLI Authentication -To use Claude Code CLI authentication instead of an API key, mount your Claude CLI config directory: +Mount your Claude CLI config directory to use your authenticated CLI credentials: ```yaml services: @@ -362,10 +372,6 @@ npm run lint ### Environment Configuration -#### Authentication (if not using Claude Code CLI) - -- `ANTHROPIC_API_KEY` - Your Anthropic API key for Claude Agent SDK (not needed if using Claude Code CLI) - #### Optional - Server - `PORT` - Server port (default: 3008) @@ -376,7 +382,7 @@ npm run lint - `AUTOMAKER_API_KEY` - Optional API authentication for the server - `ALLOWED_ROOT_DIRECTORY` - Restrict file operations to specific directory -- `CORS_ORIGIN` - CORS policy (default: \*) +- `CORS_ORIGIN` - CORS allowed origins (comma-separated list; defaults to localhost only) #### Optional - Development @@ -385,40 +391,12 @@ npm run lint ### Authentication Setup -#### Option 1: Claude Code CLI (Recommended) +Automaker integrates with your authenticated Claude Code CLI and uses your Anthropic subscription. Install and authenticate the Claude Code CLI following the [official quickstart guide](https://code.claude.com/docs/en/quickstart). Once authenticated, Automaker will automatically detect and use your CLI credentials. No additional configuration needed! 
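+
+To confirm the CLI is ready before launching (this assumes the `claude` binary is on your PATH):
+
+```bash
+# Should print a version number; if it fails, (re)install the Claude Code CLI
+claude --version
+```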
-#### Option 2: Direct API Key - -If you prefer not to use the CLI, you can provide an Anthropic API key directly using one of these methods: - -##### 2a. Shell Configuration - -Add to your `~/.bashrc` or `~/.zshrc`: - -```bash -export ANTHROPIC_API_KEY="sk-ant-..." -``` - -Then restart your terminal or run `source ~/.bashrc` (or `source ~/.zshrc`). - -##### 2b. .env File - -Create a `.env` file in the project root (gitignored): - -```bash -ANTHROPIC_API_KEY=sk-ant-... -PORT=3008 -DATA_DIR=./data -``` - -##### 2c. In-App Storage - -The application can store your API key securely in the settings UI. The key is persisted in the `DATA_DIR` directory. - ## Features ### Core Workflow @@ -527,20 +505,24 @@ Automaker provides several specialized views accessible via the sidebar or keybo | **Agent** | `A` | Interactive chat sessions with AI agents for exploratory work and questions | | **Spec** | `D` | Project specification editor with AI-powered generation and feature suggestions | | **Context** | `C` | Manage context files (markdown, images) that AI agents automatically reference | -| **Profiles** | `M` | Create and manage AI agent profiles with custom prompts and configurations | | **Settings** | `S` | Configure themes, shortcuts, defaults, authentication, and more | | **Terminal** | `T` | Integrated terminal with tabs, splits, and persistent sessions | -| **GitHub Issues** | - | Import and validate GitHub issues, convert to tasks | +| **Graph** | `H` | Visualize feature dependencies with interactive graph visualization | +| **Ideation** | `I` | Brainstorm and generate ideas with AI assistance | +| **Memory** | `Y` | View and manage agent memory and conversation history | +| **GitHub Issues** | `G` | Import and validate GitHub issues, convert to tasks | +| **GitHub PRs** | `R` | View and manage GitHub pull requests | | **Running Agents** | - | View all active agents across projects with status and progress | ### Keyboard Navigation All shortcuts are customizable in Settings. 
Default shortcuts: -- **Navigation:** `K` (Board), `A` (Agent), `D` (Spec), `C` (Context), `S` (Settings), `M` (Profiles), `T` (Terminal) +- **Navigation:** `K` (Board), `A` (Agent), `D` (Spec), `C` (Context), `S` (Settings), `T` (Terminal), `H` (Graph), `I` (Ideation), `Y` (Memory), `G` (GitHub Issues), `R` (GitHub PRs) - **UI:** `` ` `` (Toggle sidebar) -- **Actions:** `N` (New item in current view), `G` (Start next features), `O` (Open project), `P` (Project picker) +- **Actions:** `N` (New item in current view), `O` (Open project), `P` (Project picker) - **Projects:** `Q`/`E` (Cycle previous/next project) +- **Terminal:** `Alt+D` (Split right), `Alt+S` (Split down), `Alt+W` (Close), `Alt+T` (New tab) ## Architecture @@ -605,10 +587,16 @@ Stored in `{projectPath}/.automaker/`: │ ├── agent-output.md # AI agent output log │ └── images/ # Attached images ├── context/ # Context files for AI agents +├── worktrees/ # Git worktree metadata +├── validations/ # GitHub issue validation results +├── ideation/ # Brainstorming and analysis data +│ └── analysis.json # Project structure analysis +├── board/ # Board-related data +├── images/ # Project-level images ├── settings.json # Project-specific settings -├── spec.md # Project specification -├── analysis.json # Project structure analysis -└── feature-suggestions.json # AI-generated suggestions +├── app_spec.txt # Project specification (XML format) +├── active-branches.json # Active git branches tracking +└── execution-state.json # Auto-mode execution state ``` #### Global Data @@ -646,7 +634,6 @@ data/ - [Contributing Guide](./CONTRIBUTING.md) - How to contribute to Automaker - [Project Documentation](./docs/) - Architecture guides, patterns, and developer docs -- [Docker Isolation Guide](./docs/docker-isolation.md) - Security-focused Docker deployment - [Shared Packages Guide](./docs/llm-shared-packages.md) - Using monorepo packages ### Community diff --git a/apps/server/.env.example b/apps/server/.env.example index 68b28395..6ac27145 100644 --- a/apps/server/.env.example +++ b/apps/server/.env.example @@ -44,6 +44,11 @@ CORS_ORIGIN=http://localhost:3007 # OPTIONAL - Server # ============================================ +# Host to bind the server to (default: 0.0.0.0) +# Use 0.0.0.0 to listen on all interfaces (recommended for Docker/remote access) +# Use 127.0.0.1 or localhost to restrict to local connections only +HOST=0.0.0.0 + # Port to run the server on PORT=3008 diff --git a/apps/server/package.json b/apps/server/package.json index 3cc4fe18..e214eb02 100644 --- a/apps/server/package.json +++ b/apps/server/package.json @@ -1,6 +1,6 @@ { "name": "@automaker/server", - "version": "0.11.0", + "version": "0.12.0", "description": "Backend server for Automaker - provides API for both web and Electron modes", "author": "AutoMaker Team", "license": "SEE LICENSE IN LICENSE", diff --git a/apps/server/src/index.ts b/apps/server/src/index.ts index 3a59d4d3..d90c7a36 100644 --- a/apps/server/src/index.ts +++ b/apps/server/src/index.ts @@ -79,11 +79,17 @@ import { createIdeationRoutes } from './routes/ideation/index.js'; import { IdeationService } from './services/ideation-service.js'; import { getDevServerService } from './services/dev-server-service.js'; import { eventHookService } from './services/event-hook-service.js'; +import { createNotificationsRoutes } from './routes/notifications/index.js'; +import { getNotificationService } from './services/notification-service.js'; +import { createEventHistoryRoutes } from './routes/event-history/index.js'; 
+import { getEventHistoryService } from './services/event-history-service.js'; // Load environment variables dotenv.config(); const PORT = parseInt(process.env.PORT || '3008', 10); +const HOST = process.env.HOST || '0.0.0.0'; +const HOSTNAME = process.env.HOSTNAME || 'localhost'; const DATA_DIR = process.env.DATA_DIR || './data'; const ENABLE_REQUEST_LOGGING_DEFAULT = process.env.ENABLE_REQUEST_LOGGING !== 'false'; // Default to true @@ -208,8 +214,15 @@ const ideationService = new IdeationService(events, settingsService, featureLoad const devServerService = getDevServerService(); devServerService.setEventEmitter(events); -// Initialize Event Hook Service for custom event triggers -eventHookService.initialize(events, settingsService); +// Initialize Notification Service with event emitter for real-time updates +const notificationService = getNotificationService(); +notificationService.setEventEmitter(events); + +// Initialize Event History Service +const eventHistoryService = getEventHistoryService(); + +// Initialize Event Hook Service for custom event triggers (with history storage) +eventHookService.initialize(events, settingsService, eventHistoryService); // Initialize services (async () => { @@ -264,7 +277,7 @@ app.get('/api/health/detailed', createDetailedHandler()); app.use('/api/fs', createFsRoutes(events)); app.use('/api/agent', createAgentRoutes(agentService, events)); app.use('/api/sessions', createSessionsRoutes(agentService)); -app.use('/api/features', createFeaturesRoutes(featureLoader, settingsService)); +app.use('/api/features', createFeaturesRoutes(featureLoader, settingsService, events)); app.use('/api/auto-mode', createAutoModeRoutes(autoModeService)); app.use('/api/enhance-prompt', createEnhancePromptRoutes(settingsService)); app.use('/api/worktree', createWorktreeRoutes(events, settingsService)); @@ -285,6 +298,8 @@ app.use('/api/backlog-plan', createBacklogPlanRoutes(events, settingsService)); app.use('/api/mcp', createMCPRoutes(mcpTestService)); app.use('/api/pipeline', createPipelineRoutes(pipelineService)); app.use('/api/ideation', createIdeationRoutes(events, ideationService, featureLoader)); +app.use('/api/notifications', createNotificationsRoutes(notificationService)); +app.use('/api/event-history', createEventHistoryRoutes(eventHistoryService, settingsService)); // Create HTTP server const server = createServer(app); @@ -596,8 +611,8 @@ terminalWss.on('connection', (ws: WebSocket, req: import('http').IncomingMessage }); // Start server with error handling for port conflicts -const startServer = (port: number) => { - server.listen(port, () => { +const startServer = (port: number, host: string) => { + server.listen(port, host, () => { const terminalStatus = isTerminalEnabled() ? isTerminalPasswordRequired() ? 
'enabled (password protected)'
@@ -608,10 +623,11 @@
 ╔═══════════════════════════════════════════════════════╗
 ║            Automaker Backend Server                    ║
 ╠═══════════════════════════════════════════════════════╣
-║  HTTP API:  http://localhost:${portStr}                      ║
-║  WebSocket: ws://localhost:${portStr}/api/events             ║
-║  Terminal:  ws://localhost:${portStr}/api/terminal/ws        ║
-║  Health:    http://localhost:${portStr}/api/health           ║
+║  Listening: ${host}:${port}${' '.repeat(Math.max(0, 34 - host.length - port.toString().length))}║
+║  HTTP API:  http://${HOSTNAME}:${portStr}                      ║
+║  WebSocket: ws://${HOSTNAME}:${portStr}/api/events             ║
+║  Terminal:  ws://${HOSTNAME}:${portStr}/api/terminal/ws        ║
+║  Health:    http://${HOSTNAME}:${portStr}/api/health           ║
 ║  Terminal:  ${terminalStatus.padEnd(37)}║
 ╚═══════════════════════════════════════════════════════╝
 `);
@@ -645,7 +661,7 @@ const startServer = (port: number) => {
   });
 };
 
-startServer(PORT);
+startServer(PORT, HOST);
 
 // Global error handlers to prevent crashes from uncaught errors
 process.on('unhandledRejection', (reason: unknown, _promise: Promise<unknown>) => {
diff --git a/apps/server/src/lib/app-spec-format.ts b/apps/server/src/lib/app-spec-format.ts
index 2894bbc4..f8393cf1 100644
--- a/apps/server/src/lib/app-spec-format.ts
+++ b/apps/server/src/lib/app-spec-format.ts
@@ -11,8 +11,12 @@ export { specOutputSchema } from '@automaker/types';
 /**
  * Escape special XML characters
+ * Handles undefined/null values by converting them to empty strings
  */
-function escapeXml(str: string): string {
+export function escapeXml(str: string | undefined | null): string {
+  if (str == null) {
+    return '';
+  }
   return str
     .replace(/&/g, '&amp;')
     .replace(/</g, '&lt;')
     .replace(/>/g, '&gt;')
diff --git a/apps/server/src/lib/xml-extractor.ts b/apps/server/src/lib/xml-extractor.ts
new file mode 100644
--- /dev/null
+++ b/apps/server/src/lib/xml-extractor.ts
+/**
+ * XML extraction and manipulation utilities for app_spec.txt
+ */
+
+import type { SpecOutput } from '@automaker/types';
+import { createLogger } from '@automaker/utils';
+
+const logger = createLogger('XmlExtractor');
+
+/**
+ * Represents an implemented feature extracted from XML
+ */
+export interface ImplementedFeature {
+  name: string;
+  description: string;
+  file_locations?: string[];
+}
+
+/**
+ * Logger interface for XML extraction operations
+ */
+export interface XmlExtractorLogger {
+  debug: (message: string, ...args: unknown[]) => void;
+  warn?: (message: string, ...args: unknown[]) => void;
+}
+
+/**
+ * Options for XML extraction operations
+ */
+export interface ExtractXmlOptions {
+  /** Custom logger (defaults to internal logger) */
+  logger?: XmlExtractorLogger;
+}
+
+/**
+ * Escape special XML characters
+ * Handles undefined/null values by converting them to empty strings
+ */
+export function escapeXml(str: string | undefined | null): string {
+  if (str == null) {
+    return '';
+  }
+  return str
+    .replace(/&/g, '&amp;')
+    .replace(/</g, '&lt;')
+    .replace(/>/g, '&gt;')
+    .replace(/"/g, '&quot;')
+    .replace(/'/g, '&#39;');
+}
+
+/**
+ * Unescape XML entities back to regular characters
+ */
+export function unescapeXml(str: string): string {
+  return str
+    .replace(/&#39;/g, "'")
+    .replace(/&quot;/g, '"')
+    .replace(/&gt;/g, '>')
+    .replace(/&lt;/g, '<')
+    .replace(/&amp;/g, '&');
+}
+
+/**
+ * Extract the content of a specific XML section
+ *
+ * @param xmlContent - The full XML content
+ * @param tagName - The tag name to extract (e.g., 'implemented_features')
+ * @param options - Optional extraction options
+ * @returns The content between the tags, or null if not found
+ */
+export function extractXmlSection(
+  xmlContent: string,
+  tagName: string,
+  options: ExtractXmlOptions = {}
+): string | null {
+  const log = options.logger || logger;
+
+  const regex = new RegExp(`<${tagName}>([\\s\\S]*?)<\\/${tagName}>`, 'i');
+  const match = xmlContent.match(regex);
+
+  if (match) {
+    log.debug(`Extracted <${tagName}> section`);
+    return match[1];
+  }
+
+  log.debug(`Section <${tagName}> not found`);
+  return null;
+}
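+
+// Usage sketch (hypothetical spec snippet) showing the return values:
+//   extractXmlSection('<overview>My app</overview>', 'overview') // => 'My app'
+//   extractXmlSection('<overview>My app</overview>', 'missing')  // => null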
+
+/**
+ * Extract all values from repeated XML elements
+ *
+ * @param xmlContent - The XML content to search
+ * @param tagName - The tag name to extract values from
+ * @param options - Optional extraction options
+ * @returns Array of extracted values (unescaped)
+ */
+export function extractXmlElements(
+  xmlContent: string,
+  tagName: string,
+  options: ExtractXmlOptions = {}
+): string[] {
+  const log = options.logger || logger;
+  const values: string[] = [];
+
+  const regex = new RegExp(`<${tagName}>([\\s\\S]*?)<\\/${tagName}>`, 'g');
+  const matches = xmlContent.matchAll(regex);
+
+  for (const match of matches) {
+    values.push(unescapeXml(match[1].trim()));
+  }
+
+  log.debug(`Extracted ${values.length} <${tagName}> elements`);
+  return values;
+}
+
+/**
+ * Extract implemented features from app_spec.txt XML content
+ *
+ * @param specContent - The full XML content of app_spec.txt
+ * @param options - Optional extraction options
+ * @returns Array of implemented features with name, description, and optional file_locations
+ */
+export function extractImplementedFeatures(
+  specContent: string,
+  options: ExtractXmlOptions = {}
+): ImplementedFeature[] {
+  const log = options.logger || logger;
+  const features: ImplementedFeature[] = [];
+
+  // Match the <implemented_features>...</implemented_features> section
+  const implementedSection = extractXmlSection(specContent, 'implemented_features', options);
+
+  if (!implementedSection) {
+    log.debug('No implemented_features section found');
+    return features;
+  }
+
+  // Extract individual <feature> blocks
+  const featureRegex = /<feature>([\s\S]*?)<\/feature>/g;
+  const featureMatches = implementedSection.matchAll(featureRegex);
+
+  for (const featureMatch of featureMatches) {
+    const featureContent = featureMatch[1];
+
+    // Extract name
+    const nameMatch = featureContent.match(/<name>([\s\S]*?)<\/name>/);
+    const name = nameMatch ? unescapeXml(nameMatch[1].trim()) : '';
+
+    // Extract description
+    const descMatch = featureContent.match(/<description>([\s\S]*?)<\/description>/);
+    const description = descMatch ? unescapeXml(descMatch[1].trim()) : '';
+
+    // Extract file_locations if present
+    const locationsSection = extractXmlSection(featureContent, 'file_locations', options);
+    const file_locations = locationsSection
+      ? extractXmlElements(locationsSection, 'location', options)
+      : undefined;
+
+    if (name) {
+      features.push({
+        name,
+        description,
+        ...(file_locations && file_locations.length > 0 ? { file_locations } : {}),
+      });
+    }
+  }
+
+  log.debug(`Extracted ${features.length} implemented features`);
+  return features;
+}
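+
+// Example with a hypothetical app_spec.txt fragment:
+//   <implemented_features>
+//     <feature><name>Login</name><description>Email login</description></feature>
+//   </implemented_features>
+// extractImplementedFeatures(spec) // => [{ name: 'Login', description: 'Email login' }]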
+
+/**
+ * Extract only the feature names from implemented_features section
+ *
+ * @param specContent - The full XML content of app_spec.txt
+ * @param options - Optional extraction options
+ * @returns Array of feature names
+ */
+export function extractImplementedFeatureNames(
+  specContent: string,
+  options: ExtractXmlOptions = {}
+): string[] {
+  const features = extractImplementedFeatures(specContent, options);
+  return features.map((f) => f.name);
+}
+
+/**
+ * Generate XML for a single implemented feature
+ *
+ * @param feature - The feature to convert to XML
+ * @param indent - The base indentation level (default: 2 spaces)
+ * @returns XML string for the feature
+ */
+export function featureToXml(feature: ImplementedFeature, indent: string = '  '): string {
+  const i2 = indent.repeat(2);
+  const i3 = indent.repeat(3);
+  const i4 = indent.repeat(4);
+
+  let xml = `${i2}<feature>
+${i3}<name>${escapeXml(feature.name)}</name>
+${i3}<description>${escapeXml(feature.description)}</description>`;
+
+  if (feature.file_locations && feature.file_locations.length > 0) {
+    xml += `
+${i3}<file_locations>
+${feature.file_locations.map((loc) => `${i4}<location>${escapeXml(loc)}</location>`).join('\n')}
+${i3}</file_locations>`;
+  }
+
+  xml += `
+${i2}</feature>`;
+
+  return xml;
+}
+
+/**
+ * Generate XML for an array of implemented features
+ *
+ * @param features - Array of features to convert to XML
+ * @param indent - The base indentation level (default: 2 spaces)
+ * @returns XML string for the implemented_features section content
+ */
+export function featuresToXml(features: ImplementedFeature[], indent: string = '  '): string {
+  return features.map((f) => featureToXml(f, indent)).join('\n');
+}
+
+/**
+ * Update the implemented_features section in XML content
+ *
+ * @param specContent - The full XML content
+ * @param newFeatures - The new features to set
+ * @param options - Optional extraction options
+ * @returns Updated XML content with the new implemented_features section
+ */
+export function updateImplementedFeaturesSection(
+  specContent: string,
+  newFeatures: ImplementedFeature[],
+  options: ExtractXmlOptions = {}
+): string {
+  const log = options.logger || logger;
+  const indent = '  ';
+
+  // Generate new section content
+  const newSectionContent = featuresToXml(newFeatures, indent);
+
+  // Build the new section
+  const newSection = `<implemented_features>
+${newSectionContent}
+${indent}</implemented_features>`;
+
+  // Check if section exists
+  const sectionRegex = /<implemented_features>[\s\S]*?<\/implemented_features>/;
+
+  if (sectionRegex.test(specContent)) {
+    log.debug('Replacing existing implemented_features section');
+    return specContent.replace(sectionRegex, newSection);
+  }
+
+  // If section doesn't exist, try to insert after </core_capabilities>
+  const coreCapabilitiesEnd = '</core_capabilities>';
+  const insertIndex = specContent.indexOf(coreCapabilitiesEnd);
+
+  if (insertIndex !== -1) {
+    const insertPosition = insertIndex + coreCapabilitiesEnd.length;
+    log.debug('Inserting implemented_features after core_capabilities');
+    return (
+      specContent.slice(0, insertPosition) +
+      '\n\n' +
+      indent +
+      newSection +
+      specContent.slice(insertPosition)
+    );
+  }
+
+  // As a fallback, insert before </project_spec>
+  const projectSpecEnd = '</project_spec>';
+  const fallbackIndex = specContent.indexOf(projectSpecEnd);
+
+  if (fallbackIndex !== -1) {
+    log.debug('Inserting implemented_features before </project_spec>');
+    return (
+      specContent.slice(0, fallbackIndex) +
+      indent +
+      newSection +
+      '\n' +
+      specContent.slice(fallbackIndex)
+    );
+  }
+
+  log.warn?.('Could not find appropriate insertion point for implemented_features');
+  log.debug('Could not find appropriate insertion point for implemented_features');
+  return specContent;
+}
+
+/**
+ * Add a new feature to the implemented_features section
+ *
+ * @param specContent - The full XML content
+ * @param newFeature - The feature to add
+ * @param options - Optional extraction options
+ * @returns Updated XML content with the new feature added
+ */
+export function addImplementedFeature(
+  specContent: string,
+  newFeature: ImplementedFeature,
+  options: ExtractXmlOptions = {}
+): string {
+  const log = options.logger || logger;
+
+  // Extract existing features
+  const existingFeatures = extractImplementedFeatures(specContent, options);
+
+  // Check for duplicates by name
+  const isDuplicate = existingFeatures.some(
+    (f) => f.name.toLowerCase() === newFeature.name.toLowerCase()
+  );
+
+  if (isDuplicate) {
+    log.debug(`Feature "${newFeature.name}" already exists, skipping`);
+    return specContent;
+  }
+
+  // Add the new feature
+  const updatedFeatures = [...existingFeatures, newFeature];
+
+  log.debug(`Adding feature "${newFeature.name}"`);
+  return updateImplementedFeaturesSection(specContent, updatedFeatures, options);
+}
+
+/**
+ * Remove a feature from the implemented_features section by name
+ *
+ * @param specContent - The full XML content
+ * @param featureName - The name of the feature to remove
+ * @param options - Optional extraction options
+ * @returns Updated XML content with the feature removed
+ */
+export function removeImplementedFeature(
+  specContent: string,
+  featureName: string,
+  options: ExtractXmlOptions = {}
+): string {
+  const log = options.logger || logger;
+
+  // Extract existing features
+  const existingFeatures = extractImplementedFeatures(specContent, options);
+
+  // Filter out the feature to remove
+  const updatedFeatures = existingFeatures.filter(
+    (f) => f.name.toLowerCase() !== featureName.toLowerCase()
+  );
+
+  if (updatedFeatures.length === existingFeatures.length) {
+    log.debug(`Feature "${featureName}" not found, no changes made`);
+    return specContent;
+  }
+
+  log.debug(`Removing feature "${featureName}"`);
+  return updateImplementedFeaturesSection(specContent, updatedFeatures, options);
+}
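+
+// Round-trip sketch (names are illustrative):
+//   let spec = addImplementedFeature(spec, { name: 'Login', description: 'Email login' });
+//   spec = removeImplementedFeature(spec, 'Login'); // restores the previous feature list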
+
+/**
+ * Update an existing feature in the implemented_features section
+ *
+ * @param specContent - The full XML content
+ * @param featureName - The name of the feature to update
+ * @param updates - Partial updates to apply to the feature
+ * @param options - Optional extraction options
+ * @returns Updated XML content with the feature modified
+ */
+export function updateImplementedFeature(
+  specContent: string,
+  featureName: string,
+  updates: Partial<ImplementedFeature>,
+  options: ExtractXmlOptions = {}
+): string {
+  const log = options.logger || logger;
+
+  // Extract existing features
+  const existingFeatures = extractImplementedFeatures(specContent, options);
+
+  // Find and update the feature
+  let found = false;
+  const updatedFeatures = existingFeatures.map((f) => {
+    if (f.name.toLowerCase() === featureName.toLowerCase()) {
+      found = true;
+      return {
+        ...f,
+        ...updates,
+        // Preserve the original name if not explicitly updated
+        name: updates.name ?? f.name,
+      };
+    }
+    return f;
+  });
+
+  if (!found) {
+    log.debug(`Feature "${featureName}" not found, no changes made`);
+    return specContent;
+  }
+
+  log.debug(`Updating feature "${featureName}"`);
+  return updateImplementedFeaturesSection(specContent, updatedFeatures, options);
+}
+
+/**
+ * Check if a feature exists in the implemented_features section
+ *
+ * @param specContent - The full XML content
+ * @param featureName - The name of the feature to check
+ * @param options - Optional extraction options
+ * @returns True if the feature exists
+ */
+export function hasImplementedFeature(
+  specContent: string,
+  featureName: string,
+  options: ExtractXmlOptions = {}
+): boolean {
+  const features = extractImplementedFeatures(specContent, options);
+  return features.some((f) => f.name.toLowerCase() === featureName.toLowerCase());
+}
+
+/**
+ * Convert extracted features to SpecOutput.implemented_features format
+ *
+ * @param features - Array of extracted features
+ * @returns Features in SpecOutput format
+ */
+export function toSpecOutputFeatures(
+  features: ImplementedFeature[]
+): SpecOutput['implemented_features'] {
+  return features.map((f) => ({
+    name: f.name,
+    description: f.description,
+    ...(f.file_locations && f.file_locations.length > 0
+      ? { file_locations: f.file_locations }
+      : {}),
+  }));
+}
+
+/**
+ * Convert SpecOutput.implemented_features to ImplementedFeature format
+ *
+ * @param specFeatures - Features from SpecOutput
+ * @returns Features in ImplementedFeature format
+ */
+export function fromSpecOutputFeatures(
+  specFeatures: SpecOutput['implemented_features']
+): ImplementedFeature[] {
+  return specFeatures.map((f) => ({
+    name: f.name,
+    description: f.description,
+    ...(f.file_locations && f.file_locations.length > 0
+      ? { file_locations: f.file_locations }
+      : {}),
+  }));
+}
+
+/**
+ * Represents a roadmap phase extracted from XML
+ */
+export interface RoadmapPhase {
+  name: string;
+  status: string;
+  description?: string;
+}
+
+/**
+ * Extract the technology stack from app_spec.txt XML content
+ *
+ * @param specContent - The full XML content
+ * @param options - Optional extraction options
+ * @returns Array of technology names
+ */
+export function extractTechnologyStack(
+  specContent: string,
+  options: ExtractXmlOptions = {}
+): string[] {
+  const log = options.logger || logger;
+
+  const techSection = extractXmlSection(specContent, 'technology_stack', options);
+  if (!techSection) {
+    log.debug('No technology_stack section found');
+    return [];
+  }
+
+  const technologies = extractXmlElements(techSection, 'technology', options);
+  log.debug(`Extracted ${technologies.length} technologies`);
+  return technologies;
+}
+
+/**
+ * Update the technology_stack section in XML content
+ *
+ * @param specContent - The full XML content
+ * @param technologies - The new technology list
+ * @param options - Optional extraction options
+ * @returns Updated XML content
+ */
+export function updateTechnologyStack(
+  specContent: string,
+  technologies: string[],
+  options: ExtractXmlOptions = {}
+): string {
+  const log = options.logger || logger;
+  const indent = '  ';
+  const i2 = indent.repeat(2);
+
+  // Generate new section content
+  const techXml = technologies
+    .map((t) => `${i2}<technology>${escapeXml(t)}</technology>`)
+    .join('\n');
+  const newSection = `<technology_stack>\n${techXml}\n${indent}</technology_stack>`;
+
+  // Check if section exists
+  const sectionRegex = /<technology_stack>[\s\S]*?<\/technology_stack>/;
+
+  if (sectionRegex.test(specContent)) {
+    log.debug('Replacing existing technology_stack section');
+    return specContent.replace(sectionRegex, newSection);
+  }
+
+  log.debug('No technology_stack section found to update');
+  return specContent;
+}
+
+/**
+ * Extract roadmap phases from app_spec.txt XML content
+ *
+ * @param specContent - The full XML content
+ * @param options - Optional extraction options
+ * @returns Array of roadmap phases
+ */
+export function extractRoadmapPhases(
+  specContent: string,
+  options: ExtractXmlOptions = {}
+): RoadmapPhase[] {
+  const log = options.logger || logger;
+  const phases: RoadmapPhase[] = [];
+
+  const roadmapSection = extractXmlSection(specContent, 'implementation_roadmap', options);
+  if (!roadmapSection) {
+    log.debug('No implementation_roadmap section found');
+    return phases;
+  }
+
+  // Extract individual <phase> blocks
+  const phaseRegex = /<phase>([\s\S]*?)<\/phase>/g;
+  const phaseMatches = roadmapSection.matchAll(phaseRegex);
+
+  for (const phaseMatch of phaseMatches) {
+    const phaseContent = phaseMatch[1];
+
+    const nameMatch = phaseContent.match(/<name>([\s\S]*?)<\/name>/);
+    const name = nameMatch ? unescapeXml(nameMatch[1].trim()) : '';
+
+    const statusMatch = phaseContent.match(/<status>([\s\S]*?)<\/status>/);
+    const status = statusMatch ? unescapeXml(statusMatch[1].trim()) : 'pending';
+
+    const descMatch = phaseContent.match(/<description>([\s\S]*?)<\/description>/);
+    const description = descMatch ? unescapeXml(descMatch[1].trim()) : undefined;
+
+    if (name) {
+      phases.push({ name, status, description });
+    }
+  }
+
+  log.debug(`Extracted ${phases.length} roadmap phases`);
+  return phases;
+}
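+
+// Example with a hypothetical roadmap fragment:
+//   <implementation_roadmap>
+//     <phase><name>MVP</name><status>in_progress</status></phase>
+//   </implementation_roadmap>
+// extractRoadmapPhases(spec) // => [{ name: 'MVP', status: 'in_progress', description: undefined }]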
+
+/**
+ * Update a roadmap phase status in XML content
+ *
+ * @param specContent - The full XML content
+ * @param phaseName - The name of the phase to update
+ * @param newStatus - The new status value
+ * @param options - Optional extraction options
+ * @returns Updated XML content
+ */
+export function updateRoadmapPhaseStatus(
+  specContent: string,
+  phaseName: string,
+  newStatus: string,
+  options: ExtractXmlOptions = {}
+): string {
+  const log = options.logger || logger;
+
+  // Find the phase and update its status
+  // Match the phase block containing the specific name
+  const phaseRegex = new RegExp(
+    `(<phase>\\s*<name>\\s*${escapeXml(phaseName)}\\s*<\\/name>\\s*<status>)[\\s\\S]*?(<\\/status>)`,
+    'i'
+  );
+
+  if (phaseRegex.test(specContent)) {
+    log.debug(`Updating phase "${phaseName}" status to "${newStatus}"`);
+    return specContent.replace(phaseRegex, `$1${escapeXml(newStatus)}$2`);
+  }
+
+  log.debug(`Phase "${phaseName}" not found`);
+  return specContent;
+}
diff --git a/apps/server/src/providers/cli-provider.ts b/apps/server/src/providers/cli-provider.ts
index 8683f841..ea636cb8 100644
--- a/apps/server/src/providers/cli-provider.ts
+++ b/apps/server/src/providers/cli-provider.ts
@@ -35,6 +35,7 @@ import {
   type SubprocessOptions,
   type WslCliResult,
 } from '@automaker/platform';
+import { calculateReasoningTimeout } from '@automaker/types';
 import { createLogger, isAbortError } from '@automaker/utils';
 import { execSync } from 'child_process';
 import * as fs from 'fs';
@@ -107,6 +108,15 @@ export interface CliDetectionResult {
 // Create logger for CLI operations
 const cliLogger = createLogger('CliProvider');
 
+/**
+ * Base timeout for CLI operations in milliseconds.
+ * CLI tools have longer startup and processing times compared to direct API calls,
+ * so we use a higher base timeout (120s) than the default provider timeout (30s).
+ * This is multiplied by reasoning effort multipliers when applicable.
+ * @see calculateReasoningTimeout from @automaker/types
+ */
+const CLI_BASE_TIMEOUT_MS = 120000;
+
 /**
  * Abstract base class for CLI-based providers
  *
@@ -450,6 +460,10 @@ export abstract class CliProvider extends BaseProvider {
     }
   }
 
+    // Calculate dynamic timeout based on reasoning effort.
+    // This addresses GitHub issue #530 where reasoning models with 'xhigh' effort would timeout.
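+    // (Sketch: calculateReasoningTimeout('xhigh', CLI_BASE_TIMEOUT_MS) yields a proportionally
+    // larger value than the 120s base; the exact multipliers are defined in @automaker/types.)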
+ const timeout = calculateReasoningTimeout(options.reasoningEffort, CLI_BASE_TIMEOUT_MS); + // WSL strategy if (this.useWsl && this.wslCliPath) { const wslCwd = windowsToWslPath(cwd); @@ -473,7 +487,7 @@ export abstract class CliProvider extends BaseProvider { cwd, // Windows cwd for spawn env: filteredEnv, abortController: options.abortController, - timeout: 120000, // CLI operations may take longer + timeout, }; } @@ -488,7 +502,7 @@ export abstract class CliProvider extends BaseProvider { cwd, env: filteredEnv, abortController: options.abortController, - timeout: 120000, + timeout, }; } @@ -501,7 +515,7 @@ export abstract class CliProvider extends BaseProvider { cwd, env: filteredEnv, abortController: options.abortController, - timeout: 120000, + timeout, }; } diff --git a/apps/server/src/providers/codex-provider.ts b/apps/server/src/providers/codex-provider.ts index e0f38ee9..18838cb8 100644 --- a/apps/server/src/providers/codex-provider.ts +++ b/apps/server/src/providers/codex-provider.ts @@ -33,6 +33,8 @@ import { CODEX_MODEL_MAP, supportsReasoningEffort, validateBareModelId, + calculateReasoningTimeout, + DEFAULT_TIMEOUT_MS, type CodexApprovalPolicy, type CodexSandboxMode, type CodexAuthStatus, @@ -91,7 +93,14 @@ const CODEX_ITEM_TYPES = { const SYSTEM_PROMPT_LABEL = 'System instructions'; const HISTORY_HEADER = 'Current request:\n'; const TEXT_ENCODING = 'utf-8'; -const DEFAULT_TIMEOUT_MS = 30000; +/** + * Default timeout for Codex CLI operations in milliseconds. + * This is the "no output" timeout - if the CLI doesn't produce any JSONL output + * for this duration, the process is killed. For reasoning models with high + * reasoning effort, this timeout is dynamically extended via calculateReasoningTimeout(). + * @see calculateReasoningTimeout from @automaker/types + */ +const CODEX_CLI_TIMEOUT_MS = DEFAULT_TIMEOUT_MS; const CONTEXT_WINDOW_256K = 256000; const MAX_OUTPUT_32K = 32000; const MAX_OUTPUT_16K = 16000; @@ -814,13 +823,19 @@ export class CodexProvider extends BaseProvider { envOverrides[OPENAI_API_KEY_ENV] = executionPlan.openAiApiKey; } + // Calculate dynamic timeout based on reasoning effort. + // Higher reasoning effort (e.g., 'xhigh' for "xtra thinking" mode) requires more time + // for the model to generate reasoning tokens before producing output. + // This fixes GitHub issue #530 where features would get stuck with reasoning models. 
+    const timeout = calculateReasoningTimeout(options.reasoningEffort, CODEX_CLI_TIMEOUT_MS);
+
     const stream = spawnJSONLProcess({
       command: commandPath,
       args,
       cwd: options.cwd,
       env: envOverrides,
       abortController: options.abortController,
-      timeout: DEFAULT_TIMEOUT_MS,
+      timeout,
       stdinData: promptText, // Pass prompt via stdin
     });
diff --git a/apps/server/src/routes/app-spec/common.ts b/apps/server/src/routes/app-spec/common.ts
index 7ef1aabe..1a48fc6a 100644
--- a/apps/server/src/routes/app-spec/common.ts
+++ b/apps/server/src/routes/app-spec/common.ts
@@ -6,8 +6,17 @@ import { createLogger } from '@automaker/utils';
 
 const logger = createLogger('SpecRegeneration');
 
+// Types for running generation
+export type GenerationType = 'spec_regeneration' | 'feature_generation' | 'sync';
+
+interface RunningGeneration {
+  isRunning: boolean;
+  type: GenerationType;
+  startedAt: string;
+}
+
 // Shared state for tracking generation status - scoped by project path
-const runningProjects = new Map<string, boolean>();
+const runningProjects = new Map<string, RunningGeneration>();
 const abortControllers = new Map<string, AbortController>();
 
 /**
@@ -17,16 +26,21 @@ export function getSpecRegenerationStatus(projectPath?: string): {
   isRunning: boolean;
   currentAbortController: AbortController | null;
   projectPath?: string;
+  type?: GenerationType;
+  startedAt?: string;
 } {
   if (projectPath) {
+    const generation = runningProjects.get(projectPath);
     return {
-      isRunning: runningProjects.get(projectPath) || false,
+      isRunning: generation?.isRunning || false,
       currentAbortController: abortControllers.get(projectPath) || null,
       projectPath,
+      type: generation?.type,
+      startedAt: generation?.startedAt,
     };
   }
   // Fallback: check if any project is running (for backward compatibility)
-  const isAnyRunning = Array.from(runningProjects.values()).some((running) => running);
+  const isAnyRunning = Array.from(runningProjects.values()).some((g) => g.isRunning);
   return { isRunning: isAnyRunning, currentAbortController: null };
 }
 
@@ -46,10 +60,15 @@ export function setRunningState(
   projectPath: string,
   running: boolean,
-  controller: AbortController | null = null
+  controller: AbortController | null = null,
+  type: GenerationType = 'spec_regeneration'
 ): void {
   if (running) {
-    runningProjects.set(projectPath, true);
+    runningProjects.set(projectPath, {
+      isRunning: true,
+      type,
+      startedAt: new Date().toISOString(),
+    });
     if (controller) {
       abortControllers.set(projectPath, controller);
     }
@@ -59,6 +78,33 @@
   }
 }
 
+/**
+ * Get all running spec/feature generations for the running agents view
+ */
+export function getAllRunningGenerations(): Array<{
+  projectPath: string;
+  type: GenerationType;
+  startedAt: string;
+}> {
+  const results: Array<{
+    projectPath: string;
+    type: GenerationType;
+    startedAt: string;
+  }> = [];
+
+  for (const [projectPath, generation] of runningProjects.entries()) {
+    if (generation.isRunning) {
+      results.push({
+        projectPath,
+        type: generation.type,
+        startedAt: generation.startedAt,
+      });
+    }
+  }
+
+  return results;
+}
+
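+// Example return value (illustrative):
+//   getAllRunningGenerations()
+//   // => [{ projectPath: '/home/user/app', type: 'sync', startedAt: '2024-01-01T00:00:00.000Z' }]
+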
 /**
  * Helper to log authentication status
  */
diff --git a/apps/server/src/routes/app-spec/generate-features-from-spec.ts b/apps/server/src/routes/app-spec/generate-features-from-spec.ts
index c149d5f0..24c3f398 100644
--- a/apps/server/src/routes/app-spec/generate-features-from-spec.ts
+++ b/apps/server/src/routes/app-spec/generate-features-from-spec.ts
@@ -15,6 +15,7 @@ import { parseAndCreateFeatures } from './parse-and-create-features.js';
 import { getAppSpecPath } from '@automaker/platform';
 import type { SettingsService } from '../../services/settings-service.js';
 import { getAutoLoadClaudeMdSetting, getPromptCustomization } from '../../lib/settings-helpers.js';
+import { FeatureLoader } from '../../services/feature-loader.js';
 
 const logger = createLogger('SpecRegeneration');
 
@@ -56,13 +57,45 @@ export async function generateFeaturesFromSpec(
   // Get customized prompts from settings
   const prompts = await getPromptCustomization(settingsService, '[FeatureGeneration]');
 
+  // Load existing features to prevent duplicates
+  const featureLoader = new FeatureLoader();
+  const existingFeatures = await featureLoader.getAll(projectPath);
+
+  logger.info(`Found ${existingFeatures.length} existing features to exclude from generation`);
+
+  // Build existing features context for the prompt
+  let existingFeaturesContext = '';
+  if (existingFeatures.length > 0) {
+    const featuresList = existingFeatures
+      .map(
+        (f) =>
+          `- "${f.title}" (ID: ${f.id}): ${f.description?.substring(0, 100) || 'No description'}`
+      )
+      .join('\n');
+    existingFeaturesContext = `
+
+## EXISTING FEATURES (DO NOT REGENERATE THESE)
+
+The following ${existingFeatures.length} features already exist in the project. You MUST NOT generate features that duplicate or overlap with these:
+
+${featuresList}
+
+CRITICAL INSTRUCTIONS:
+- DO NOT generate any features with the same or similar titles as the existing features listed above
+- DO NOT generate features that cover the same functionality as existing features
+- ONLY generate NEW features that are not yet in the system
+- If a feature from the roadmap already exists, skip it entirely
+- Generate unique feature IDs that do not conflict with existing IDs: ${existingFeatures.map((f) => f.id).join(', ')}
+`;
+  }
+
   const prompt = `Based on this project specification:
 
 ${spec}
-
+${existingFeaturesContext}
 ${prompts.appSpec.generateFeaturesFromSpecPrompt}
 
-Generate ${featureCount} features that build on each other logically.`;
+Generate ${featureCount} NEW features that build on each other logically. Remember: ONLY generate features that DO NOT already exist.`;
 
   logger.info('========== PROMPT BEING SENT ==========');
   logger.info(`Prompt length: ${prompt.length} chars`);
diff --git a/apps/server/src/routes/app-spec/generate-spec.ts b/apps/server/src/routes/app-spec/generate-spec.ts
index d79ffc5f..4fa3d11a 100644
--- a/apps/server/src/routes/app-spec/generate-spec.ts
+++ b/apps/server/src/routes/app-spec/generate-spec.ts
@@ -201,19 +201,33 @@ Your entire response should be valid JSON starting with { and ending with }. No
     xmlContent = responseText.substring(xmlStart, xmlEnd + '</project_spec>'.length);
     logger.info(`Extracted XML content: ${xmlContent.length} chars (from position ${xmlStart})`);
   } else {
-    // No valid XML structure found in the response text
-    // This happens when structured output was expected but not received, and the agent
-    // output conversational text instead of XML (e.g., "The project directory appears to be empty...")
-    // We should NOT save this conversational text as it's not a valid spec
-    logger.error('❌ Response does not contain valid XML structure');
-    logger.error(
-      'This typically happens when structured output failed and the agent produced conversational text instead of XML'
-    );
-    throw new Error(
-      'Failed to generate spec: No valid XML structure found in response. ' +
-        'The response contained conversational text but no <project_spec> tags. ' +
-        'Please try again.'
-    );
+    // No XML found, try JSON extraction
+    logger.warn('⚠️ No XML tags found, attempting JSON extraction...');
+    const extractedJson = extractJson(responseText, { logger });
+
+    if (
+      extractedJson &&
+      typeof extractedJson.project_name === 'string' &&
+      typeof extractedJson.overview === 'string' &&
+      Array.isArray(extractedJson.technology_stack) &&
+      Array.isArray(extractedJson.core_capabilities) &&
+      Array.isArray(extractedJson.implemented_features)
+    ) {
+      logger.info('✅ Successfully extracted JSON from response text');
+      xmlContent = specToXml(extractedJson);
+      logger.info(`✅ Converted extracted JSON to XML: ${xmlContent.length} chars`);
+    } else {
+      // Neither XML nor valid JSON found
+      logger.error('❌ Response does not contain valid XML or JSON structure');
+      logger.error(
+        'This typically happens when structured output failed and the agent produced conversational text instead of structured output'
+      );
+      throw new Error(
+        'Failed to generate spec: No valid XML or JSON structure found in response. ' +
+          'The response contained conversational text but no <project_spec> tags or valid JSON. ' +
+          'Please try again.'
+      );
+    }
   }
 }
diff --git a/apps/server/src/routes/app-spec/index.ts b/apps/server/src/routes/app-spec/index.ts
index 342aecd7..79b8e44d 100644
--- a/apps/server/src/routes/app-spec/index.ts
+++ b/apps/server/src/routes/app-spec/index.ts
@@ -7,6 +7,7 @@ import type { EventEmitter } from '../../lib/events.js';
 import { createCreateHandler } from './routes/create.js';
 import { createGenerateHandler } from './routes/generate.js';
 import { createGenerateFeaturesHandler } from './routes/generate-features.js';
+import { createSyncHandler } from './routes/sync.js';
 import { createStopHandler } from './routes/stop.js';
 import { createStatusHandler } from './routes/status.js';
 import type { SettingsService } from '../../services/settings-service.js';
@@ -20,6 +21,7 @@
   router.post('/create', createCreateHandler(events));
   router.post('/generate', createGenerateHandler(events, settingsService));
   router.post('/generate-features', createGenerateFeaturesHandler(events, settingsService));
+  router.post('/sync', createSyncHandler(events, settingsService));
   router.post('/stop', createStopHandler());
   router.get('/status', createStatusHandler());
diff --git a/apps/server/src/routes/app-spec/parse-and-create-features.ts b/apps/server/src/routes/app-spec/parse-and-create-features.ts
index 78137a73..47846fcc 100644
--- a/apps/server/src/routes/app-spec/parse-and-create-features.ts
+++ b/apps/server/src/routes/app-spec/parse-and-create-features.ts
@@ -5,9 +5,10 @@
 import path from 'path';
 import * as secureFs from '../../lib/secure-fs.js';
 import type { EventEmitter } from '../../lib/events.js';
-import { createLogger } from '@automaker/utils';
+import { createLogger, atomicWriteJson, DEFAULT_BACKUP_COUNT } from '@automaker/utils';
 import { getFeaturesDir } from '@automaker/platform';
 import { extractJsonWithArray } from '../../lib/json-extractor.js';
+import { getNotificationService } from '../../services/notification-service.js';
 
 const logger = createLogger('SpecRegeneration');
 
@@ -73,10 +74,10 @@
       updatedAt: new Date().toISOString(),
     };
 
-    await secureFs.writeFile(
-      path.join(featureDir, 'feature.json'),
-      JSON.stringify(featureData, null, 2)
-    );
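+    // (Assumption: atomicWriteJson writes to a temp file and renames it into place,
+    // keeping up to DEFAULT_BACKUP_COUNT rotated backups of the previous contents.)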
+    // Use atomic write with backup support for crash protection
+    await atomicWriteJson(path.join(featureDir, 'feature.json'), featureData, {
+      backupCount: DEFAULT_BACKUP_COUNT,
+    });
 
     createdFeatures.push({ id: feature.id, title: feature.title });
   }
@@ -88,6 +89,15 @@
       message: `Spec regeneration complete! Created ${createdFeatures.length} features.`,
       projectPath: projectPath,
     });
+
+    // Create notification for spec generation completion
+    const notificationService = getNotificationService();
+    await notificationService.createNotification({
+      type: 'spec_regeneration_complete',
+      title: 'Spec Generation Complete',
+      message: `Created ${createdFeatures.length} features from the project specification.`,
+      projectPath: projectPath,
+    });
   } catch (error) {
     logger.error('❌ parseAndCreateFeatures() failed:');
     logger.error('Error:', error);
diff --git a/apps/server/src/routes/app-spec/routes/generate-features.ts b/apps/server/src/routes/app-spec/routes/generate-features.ts
index dc627964..670652ea 100644
--- a/apps/server/src/routes/app-spec/routes/generate-features.ts
+++ b/apps/server/src/routes/app-spec/routes/generate-features.ts
@@ -50,7 +50,7 @@ export function createGenerateFeaturesHandler(
     logAuthStatus('Before starting feature generation');
 
     const abortController = new AbortController();
-    setRunningState(projectPath, true, abortController);
+    setRunningState(projectPath, true, abortController, 'feature_generation');
 
     logger.info('Starting background feature generation task...');
     generateFeaturesFromSpec(projectPath, events, abortController, maxFeatures, settingsService)
diff --git a/apps/server/src/routes/app-spec/routes/sync.ts b/apps/server/src/routes/app-spec/routes/sync.ts
new file mode 100644
index 00000000..c6c34e68
--- /dev/null
+++ b/apps/server/src/routes/app-spec/routes/sync.ts
@@ -0,0 +1,76 @@
+/**
+ * POST /sync endpoint - Sync spec with codebase and features
+ */
+
+import type { Request, Response } from 'express';
+import type { EventEmitter } from '../../../lib/events.js';
+import { createLogger } from '@automaker/utils';
+import {
+  getSpecRegenerationStatus,
+  setRunningState,
+  logAuthStatus,
+  logError,
+  getErrorMessage,
+} from '../common.js';
+import { syncSpec } from '../sync-spec.js';
+import type { SettingsService } from '../../../services/settings-service.js';
+
+const logger = createLogger('SpecSync');
+
+export function createSyncHandler(events: EventEmitter, settingsService?: SettingsService) {
+  return async (req: Request, res: Response): Promise<void> => {
+    logger.info('========== /sync endpoint called ==========');
+    logger.debug('Request body:', JSON.stringify(req.body, null, 2));
+
+    try {
+      const { projectPath } = req.body as {
+        projectPath: string;
+      };
+
+      logger.debug('projectPath:', projectPath);
+
+      if (!projectPath) {
+        logger.error('Missing projectPath parameter');
+        res.status(400).json({ success: false, error: 'projectPath required' });
+        return;
+      }
+
+      const { isRunning } = getSpecRegenerationStatus(projectPath);
+      if (isRunning) {
+        logger.warn('Generation/sync already running for project:', projectPath);
+        res.json({ success: false, error: 'Operation already running for this project' });
+        return;
+      }
+
+      logAuthStatus('Before starting spec sync');
+
+      const abortController = new AbortController();
+      setRunningState(projectPath, true, abortController, 'sync');
+      logger.info('Starting background spec sync task...');
+
+      syncSpec(projectPath, events, abortController, settingsService)
+        .then((result) => {
+          logger.info('Spec sync completed successfully');
+          logger.info('Result:', JSON.stringify(result, null, 2));
+        })
+        .catch((error) => {
+          logError(error, 'Spec sync failed with error');
+          events.emit('spec-regeneration:event', {
+            type: 'spec_regeneration_error',
+            error: getErrorMessage(error),
+            projectPath,
+          });
+        })
+        .finally(() => {
+          logger.info('Spec sync task finished (success or error)');
+          setRunningState(projectPath, false, null);
+        });
+
+      logger.info('Returning success response (sync running in background)');
+      res.json({ success: true });
+    } catch (error) {
+      logError(error, 'Sync route handler failed');
+      res.status(500).json({ success: false, error: getErrorMessage(error) });
+    }
+  };
+}
diff --git a/apps/server/src/routes/app-spec/sync-spec.ts b/apps/server/src/routes/app-spec/sync-spec.ts
new file mode 100644
index 00000000..98352855
--- /dev/null
+++ b/apps/server/src/routes/app-spec/sync-spec.ts
@@ -0,0 +1,307 @@
+/**
+ * Sync spec with current codebase and feature state
+ *
+ * Updates the spec file based on:
+ * - Completed Automaker features
+ * - Code analysis for tech stack and implementations
+ * - Roadmap phase status updates
+ */
+
+import * as secureFs from '../../lib/secure-fs.js';
+import type { EventEmitter } from '../../lib/events.js';
+import { createLogger } from '@automaker/utils';
+import { DEFAULT_PHASE_MODELS } from '@automaker/types';
+import { resolvePhaseModel } from '@automaker/model-resolver';
+import { streamingQuery } from '../../providers/simple-query-service.js';
+import { getAppSpecPath } from '@automaker/platform';
+import type { SettingsService } from '../../services/settings-service.js';
+import { getAutoLoadClaudeMdSetting } from '../../lib/settings-helpers.js';
+import { FeatureLoader } from '../../services/feature-loader.js';
+import {
+  extractImplementedFeatures,
+  extractTechnologyStack,
+  extractRoadmapPhases,
+  updateImplementedFeaturesSection,
+  updateTechnologyStack,
+  updateRoadmapPhaseStatus,
+  type ImplementedFeature,
+  type RoadmapPhase,
+} from '../../lib/xml-extractor.js';
+import { getNotificationService } from '../../services/notification-service.js';
+
+const logger = createLogger('SpecSync');
+
+/**
+ * Result of a sync operation
+ */
+export interface SyncResult {
+  techStackUpdates: {
+    added: string[];
+    removed: string[];
+  };
+  implementedFeaturesUpdates: {
+    addedFromFeatures: string[];
+    removed: string[];
+  };
+  roadmapUpdates: Array<{ phaseName: string; newStatus: string }>;
+  summary: string;
+}
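+
+// Example of a populated SyncResult (illustrative values):
+//   {
+//     techStackUpdates: { added: ['Vitest'], removed: [] },
+//     implementedFeaturesUpdates: { addedFromFeatures: ['Login'], removed: [] },
+//     roadmapUpdates: [{ phaseName: 'MVP', newStatus: 'in_progress' }],
+//     summary: 'Added 1 implemented features, Added 1 technologies, Updated 1 roadmap phases',
+//   }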
+
+/**
+ * Sync the spec with current codebase and feature state
+ */
+export async function syncSpec(
+  projectPath: string,
+  events: EventEmitter,
+  abortController: AbortController,
+  settingsService?: SettingsService
+): Promise<SyncResult> {
+  logger.info('========== syncSpec() started ==========');
+  logger.info('projectPath:', projectPath);
+
+  const result: SyncResult = {
+    techStackUpdates: { added: [], removed: [] },
+    implementedFeaturesUpdates: { addedFromFeatures: [], removed: [] },
+    roadmapUpdates: [],
+    summary: '',
+  };
+
+  // Read existing spec
+  const specPath = getAppSpecPath(projectPath);
+  let specContent: string;
+
+  try {
+    specContent = (await secureFs.readFile(specPath, 'utf-8')) as string;
+    logger.info(`Spec loaded successfully (${specContent.length} chars)`);
+  } catch (readError) {
+    logger.error('Failed to read spec file:', readError);
+    events.emit('spec-regeneration:event', {
+      type: 'spec_regeneration_error',
+      error: 'No project spec found. Create or regenerate spec first.',
+      projectPath,
+    });
+    throw new Error('No project spec found');
+  }
+
+  events.emit('spec-regeneration:event', {
+    type: 'spec_regeneration_progress',
+    content: '[Phase: sync] Starting spec sync...\n',
+    projectPath,
+  });
+
+  // Extract current state from spec
+  const currentImplementedFeatures = extractImplementedFeatures(specContent);
+  const currentTechStack = extractTechnologyStack(specContent);
+  const currentRoadmapPhases = extractRoadmapPhases(specContent);
+
+  logger.info(`Current spec has ${currentImplementedFeatures.length} implemented features`);
+  logger.info(`Current spec has ${currentTechStack.length} technologies`);
+  logger.info(`Current spec has ${currentRoadmapPhases.length} roadmap phases`);
+
+  // Load completed Automaker features
+  const featureLoader = new FeatureLoader();
+  const allFeatures = await featureLoader.getAll(projectPath);
+  const completedFeatures = allFeatures.filter(
+    (f) => f.status === 'completed' || f.status === 'verified'
+  );
+
+  logger.info(`Found ${completedFeatures.length} completed/verified features in Automaker`);
+
+  events.emit('spec-regeneration:event', {
+    type: 'spec_regeneration_progress',
+    content: `Found ${completedFeatures.length} completed features to sync...\n`,
+    projectPath,
+  });
+
+  // Build new implemented features list from completed Automaker features
+  const newImplementedFeatures: ImplementedFeature[] = [];
+  const existingNames = new Set(currentImplementedFeatures.map((f) => f.name.toLowerCase()));
+
+  for (const feature of completedFeatures) {
+    const name = feature.title || `Feature: ${feature.id}`;
+    if (!existingNames.has(name.toLowerCase())) {
+      newImplementedFeatures.push({
+        name,
+        description: feature.description || '',
+      });
+      result.implementedFeaturesUpdates.addedFromFeatures.push(name);
+    }
+  }
+
+  // Merge: keep existing + add new from completed features
+  const mergedFeatures = [...currentImplementedFeatures, ...newImplementedFeatures];
+
+  // Update spec with merged features
+  if (result.implementedFeaturesUpdates.addedFromFeatures.length > 0) {
+    specContent = updateImplementedFeaturesSection(specContent, mergedFeatures);
+    logger.info(
+      `Added ${result.implementedFeaturesUpdates.addedFromFeatures.length} features to spec`
+    );
+  }
+
+  // Analyze codebase for tech stack updates using AI
+  events.emit('spec-regeneration:event', {
+    type: 'spec_regeneration_progress',
+    content: 'Analyzing codebase for technology updates...\n',
+    projectPath,
+  });
+
+  const autoLoadClaudeMd = await getAutoLoadClaudeMdSetting(
+    projectPath,
+    settingsService,
+    '[SpecSync]'
+  );
+
+  const settings = await settingsService?.getGlobalSettings();
+  const phaseModelEntry =
+    settings?.phaseModels?.specGenerationModel || DEFAULT_PHASE_MODELS.specGenerationModel;
+  const { model, thinkingLevel } = resolvePhaseModel(phaseModelEntry);
+
+  // Use AI to analyze tech stack
+  const techAnalysisPrompt = `Analyze this project and return ONLY a JSON object with the current technology stack.
+
+Current known technologies: ${currentTechStack.join(', ')}
+
+Look at package.json, config files, and source code to identify:
+- Frameworks (React, Vue, Express, etc.)
+- Languages (TypeScript, JavaScript, Python, etc.)
+- Build tools (Vite, Webpack, etc.)
+- Databases (PostgreSQL, MongoDB, etc.)
+- Key libraries and tools
+
+Return ONLY this JSON format, no other text:
+{
+  "technologies": ["Technology 1", "Technology 2", ...]
+}`;
+
+  try {
+    const techResult = await streamingQuery({
+      prompt: techAnalysisPrompt,
+      model,
+      cwd: projectPath,
+      maxTurns: 10,
+      allowedTools: ['Read', 'Glob', 'Grep'],
+      abortController,
+      thinkingLevel,
+      readOnly: true,
+      settingSources: autoLoadClaudeMd ? ['user', 'project', 'local'] : undefined,
+      onText: (text) => {
+        logger.debug(`Tech analysis text: ${text.substring(0, 100)}`);
+      },
+    });
+
+    // Parse tech stack from response
+    const jsonMatch = techResult.text.match(/\{[\s\S]*"technologies"[\s\S]*\}/);
+    if (jsonMatch) {
+      const parsed = JSON.parse(jsonMatch[0]);
+      if (Array.isArray(parsed.technologies)) {
+        const newTechStack = parsed.technologies as string[];
+
+        // Calculate differences (case-insensitive)
+        const currentSet = new Set(currentTechStack.map((t) => t.toLowerCase()));
+        const newSet = new Set(newTechStack.map((t) => t.toLowerCase()));
+
+        for (const tech of newTechStack) {
+          if (!currentSet.has(tech.toLowerCase())) {
+            result.techStackUpdates.added.push(tech);
+          }
+        }
+
+        for (const tech of currentTechStack) {
+          if (!newSet.has(tech.toLowerCase())) {
+            result.techStackUpdates.removed.push(tech);
+          }
+        }
+
+        // Update spec with new tech stack if there are changes
+        if (
+          result.techStackUpdates.added.length > 0 ||
+          result.techStackUpdates.removed.length > 0
+        ) {
+          specContent = updateTechnologyStack(specContent, newTechStack);
+          logger.info(
+            `Updated tech stack: +${result.techStackUpdates.added.length}, -${result.techStackUpdates.removed.length}`
+          );
+        }
+      }
+    }
+  } catch (error) {
+    logger.warn('Failed to analyze tech stack:', error);
+    // Continue with other sync operations
+  }
+
+  // Update roadmap phase statuses based on completed features
+  events.emit('spec-regeneration:event', {
+    type: 'spec_regeneration_progress',
+    content: 'Checking roadmap phase statuses...\n',
+    projectPath,
+  });
+
+  // Heuristic: a phase shows progress when its name appears in the title,
+  // description, or category of a completed feature.
+  for (const phase of currentRoadmapPhases) {
+    if (phase.status === 'completed') continue; // Already completed
+
+    const phaseNameLower = phase.name.toLowerCase();
+    const relatedCompletedFeatures = completedFeatures.filter(
+      (f) =>
+        f.title?.toLowerCase().includes(phaseNameLower) ||
+        f.description?.toLowerCase().includes(phaseNameLower) ||
+        f.category?.toLowerCase().includes(phaseNameLower)
+    );
+
+    // If related completed features exist, promote the phase to in_progress.
+    // Marking a phase completed is deliberately left to explicit verification,
+    // not to this name-matching heuristic.
+    if (relatedCompletedFeatures.length > 0) {
+      const newStatus = 'in_progress';
+      specContent = updateRoadmapPhaseStatus(specContent, phase.name, newStatus);
+      result.roadmapUpdates.push({ phaseName: phase.name, newStatus });
+      logger.info(`Updated phase "${phase.name}" to ${newStatus}`);
+    }
+  }
+
+  // Save updated spec
+  await secureFs.writeFile(specPath, specContent, 'utf-8');
+  logger.info('Spec saved successfully');
+
+  // Build summary
+  const summaryParts: string[] = [];
+  if (result.implementedFeaturesUpdates.addedFromFeatures.length > 0) {
+    summaryParts.push(
+      `Added ${result.implementedFeaturesUpdates.addedFromFeatures.length} implemented features`
+    );
+  }
+  if (result.techStackUpdates.added.length > 0) {
+    summaryParts.push(`Added ${result.techStackUpdates.added.length} technologies`);
+  }
+  if
(result.techStackUpdates.removed.length > 0) { + summaryParts.push(`Removed ${result.techStackUpdates.removed.length} technologies`); + } + if (result.roadmapUpdates.length > 0) { + summaryParts.push(`Updated ${result.roadmapUpdates.length} roadmap phases`); + } + + result.summary = summaryParts.length > 0 ? summaryParts.join(', ') : 'Spec is already up to date'; + + // Create notification + const notificationService = getNotificationService(); + await notificationService.createNotification({ + type: 'spec_regeneration_complete', + title: 'Spec Sync Complete', + message: result.summary, + projectPath, + }); + + events.emit('spec-regeneration:event', { + type: 'spec_regeneration_complete', + message: `Spec sync complete! ${result.summary}`, + projectPath, + }); + + logger.info('========== syncSpec() completed =========='); + logger.info('Summary:', result.summary); + + return result; +} diff --git a/apps/server/src/routes/event-history/common.ts b/apps/server/src/routes/event-history/common.ts new file mode 100644 index 00000000..bd0ad3fe --- /dev/null +++ b/apps/server/src/routes/event-history/common.ts @@ -0,0 +1,19 @@ +/** + * Common utilities for event history routes + */ + +import { createLogger } from '@automaker/utils'; +import { getErrorMessage as getErrorMessageShared, createLogError } from '../common.js'; + +/** Logger instance for event history operations */ +export const logger = createLogger('EventHistory'); + +/** + * Extract user-friendly error message from error objects + */ +export { getErrorMessageShared as getErrorMessage }; + +/** + * Log error with automatic logger binding + */ +export const logError = createLogError(logger); diff --git a/apps/server/src/routes/event-history/index.ts b/apps/server/src/routes/event-history/index.ts new file mode 100644 index 00000000..93297ddd --- /dev/null +++ b/apps/server/src/routes/event-history/index.ts @@ -0,0 +1,68 @@ +/** + * Event History routes - HTTP API for event history management + * + * Provides endpoints for: + * - Listing events with filtering + * - Getting individual event details + * - Deleting events + * - Clearing all events + * - Replaying events to test hooks + * + * Mounted at /api/event-history in the main server. 
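+ *
+ * Example (a hedged sketch, not authoritative wiring; assumes the Express `app`
+ * and both service instances are created during server bootstrap):
+ *
+ *   app.use(
+ *     '/api/event-history',
+ *     createEventHistoryRoutes(eventHistoryService, settingsService)
+ *   );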
+ */ + +import { Router } from 'express'; +import type { EventHistoryService } from '../../services/event-history-service.js'; +import type { SettingsService } from '../../services/settings-service.js'; +import { validatePathParams } from '../../middleware/validate-paths.js'; +import { createListHandler } from './routes/list.js'; +import { createGetHandler } from './routes/get.js'; +import { createDeleteHandler } from './routes/delete.js'; +import { createClearHandler } from './routes/clear.js'; +import { createReplayHandler } from './routes/replay.js'; + +/** + * Create event history router with all endpoints + * + * Endpoints: + * - POST /list - List events with optional filtering + * - POST /get - Get a single event by ID + * - POST /delete - Delete an event by ID + * - POST /clear - Clear all events for a project + * - POST /replay - Replay an event to trigger hooks + * + * @param eventHistoryService - Instance of EventHistoryService + * @param settingsService - Instance of SettingsService (for replay) + * @returns Express Router configured with all event history endpoints + */ +export function createEventHistoryRoutes( + eventHistoryService: EventHistoryService, + settingsService: SettingsService +): Router { + const router = Router(); + + // List events with filtering + router.post('/list', validatePathParams('projectPath'), createListHandler(eventHistoryService)); + + // Get single event + router.post('/get', validatePathParams('projectPath'), createGetHandler(eventHistoryService)); + + // Delete event + router.post( + '/delete', + validatePathParams('projectPath'), + createDeleteHandler(eventHistoryService) + ); + + // Clear all events + router.post('/clear', validatePathParams('projectPath'), createClearHandler(eventHistoryService)); + + // Replay event + router.post( + '/replay', + validatePathParams('projectPath'), + createReplayHandler(eventHistoryService, settingsService) + ); + + return router; +} diff --git a/apps/server/src/routes/event-history/routes/clear.ts b/apps/server/src/routes/event-history/routes/clear.ts new file mode 100644 index 00000000..c6e6bb58 --- /dev/null +++ b/apps/server/src/routes/event-history/routes/clear.ts @@ -0,0 +1,33 @@ +/** + * POST /api/event-history/clear - Clear all events for a project + * + * Request body: { projectPath: string } + * Response: { success: true, cleared: number } + */ + +import type { Request, Response } from 'express'; +import type { EventHistoryService } from '../../../services/event-history-service.js'; +import { getErrorMessage, logError } from '../common.js'; + +export function createClearHandler(eventHistoryService: EventHistoryService) { + return async (req: Request, res: Response): Promise => { + try { + const { projectPath } = req.body as { projectPath: string }; + + if (!projectPath || typeof projectPath !== 'string') { + res.status(400).json({ success: false, error: 'projectPath is required' }); + return; + } + + const cleared = await eventHistoryService.clearEvents(projectPath); + + res.json({ + success: true, + cleared, + }); + } catch (error) { + logError(error, 'Clear events failed'); + res.status(500).json({ success: false, error: getErrorMessage(error) }); + } + }; +} diff --git a/apps/server/src/routes/event-history/routes/delete.ts b/apps/server/src/routes/event-history/routes/delete.ts new file mode 100644 index 00000000..ea3f6b16 --- /dev/null +++ b/apps/server/src/routes/event-history/routes/delete.ts @@ -0,0 +1,43 @@ +/** + * POST /api/event-history/delete - Delete an event by ID + * + * Request body: { 
projectPath: string, eventId: string }
+ * Response: { success: true } or { success: false, error: string }
+ */
+
+import type { Request, Response } from 'express';
+import type { EventHistoryService } from '../../../services/event-history-service.js';
+import { getErrorMessage, logError } from '../common.js';
+
+export function createDeleteHandler(eventHistoryService: EventHistoryService) {
+  return async (req: Request, res: Response): Promise<void> => {
+    try {
+      const { projectPath, eventId } = req.body as {
+        projectPath: string;
+        eventId: string;
+      };
+
+      if (!projectPath || typeof projectPath !== 'string') {
+        res.status(400).json({ success: false, error: 'projectPath is required' });
+        return;
+      }
+
+      if (!eventId || typeof eventId !== 'string') {
+        res.status(400).json({ success: false, error: 'eventId is required' });
+        return;
+      }
+
+      const deleted = await eventHistoryService.deleteEvent(projectPath, eventId);
+
+      if (!deleted) {
+        res.status(404).json({ success: false, error: 'Event not found' });
+        return;
+      }
+
+      res.json({ success: true });
+    } catch (error) {
+      logError(error, 'Delete event failed');
+      res.status(500).json({ success: false, error: getErrorMessage(error) });
+    }
+  };
+}
diff --git a/apps/server/src/routes/event-history/routes/get.ts b/apps/server/src/routes/event-history/routes/get.ts
new file mode 100644
index 00000000..f892fd41
--- /dev/null
+++ b/apps/server/src/routes/event-history/routes/get.ts
@@ -0,0 +1,46 @@
+/**
+ * POST /api/event-history/get - Get a single event by ID
+ *
+ * Request body: { projectPath: string, eventId: string }
+ * Response: { success: true, event: StoredEvent } or { success: false, error: string }
+ */
+
+import type { Request, Response } from 'express';
+import type { EventHistoryService } from '../../../services/event-history-service.js';
+import { getErrorMessage, logError } from '../common.js';
+
+export function createGetHandler(eventHistoryService: EventHistoryService) {
+  return async (req: Request, res: Response): Promise<void> => {
+    try {
+      const { projectPath, eventId } = req.body as {
+        projectPath: string;
+        eventId: string;
+      };
+
+      if (!projectPath || typeof projectPath !== 'string') {
+        res.status(400).json({ success: false, error: 'projectPath is required' });
+        return;
+      }
+
+      if (!eventId || typeof eventId !== 'string') {
+        res.status(400).json({ success: false, error: 'eventId is required' });
+        return;
+      }
+
+      const event = await eventHistoryService.getEvent(projectPath, eventId);
+
+      if (!event) {
+        res.status(404).json({ success: false, error: 'Event not found' });
+        return;
+      }
+
+      res.json({
+        success: true,
+        event,
+      });
+    } catch (error) {
+      logError(error, 'Get event failed');
+      res.status(500).json({ success: false, error: getErrorMessage(error) });
+    }
+  };
+}
diff --git a/apps/server/src/routes/event-history/routes/list.ts b/apps/server/src/routes/event-history/routes/list.ts
new file mode 100644
index 00000000..551594f2
--- /dev/null
+++ b/apps/server/src/routes/event-history/routes/list.ts
@@ -0,0 +1,53 @@
+/**
+ * POST /api/event-history/list - List events for a project
+ *
+ * Request body: {
+ *   projectPath: string,
+ *   filter?: {
+ *     trigger?: EventHookTrigger,
+ *     featureId?: string,
+ *     since?: string,
+ *     until?: string,
+ *     limit?: number,
+ *     offset?: number
+ *   }
+ * }
+ * Response: { success: true, events: StoredEventSummary[], total: number }
+ */
+
+import type { Request, Response } from 'express';
+import type { EventHistoryService } from '../../../services/event-history-service.js';
+import type { EventHistoryFilter } from '@automaker/types';
+import { getErrorMessage, logError } from '../common.js';
+
+export function createListHandler(eventHistoryService: EventHistoryService) {
+  return async (req: Request, res: Response): Promise<void> => {
+    try {
+      const { projectPath, filter } = req.body as {
+        projectPath: string;
+        filter?: EventHistoryFilter;
+      };
+
+      if (!projectPath || typeof projectPath !== 'string') {
+        res.status(400).json({ success: false, error: 'projectPath is required' });
+        return;
+      }
+
+      const events = await eventHistoryService.getEvents(projectPath, filter);
+      const total = await eventHistoryService.getEventCount(projectPath, {
+        ...filter,
+        limit: undefined,
+        offset: undefined,
+      });
+
+      res.json({
+        success: true,
+        events,
+        total,
+      });
+    } catch (error) {
+      logError(error, 'List events failed');
+      res.status(500).json({ success: false, error: getErrorMessage(error) });
+    }
+  };
+}
diff --git a/apps/server/src/routes/event-history/routes/replay.ts b/apps/server/src/routes/event-history/routes/replay.ts
new file mode 100644
index 00000000..c6f27a40
--- /dev/null
+++ b/apps/server/src/routes/event-history/routes/replay.ts
@@ -0,0 +1,234 @@
+/**
+ * POST /api/event-history/replay - Replay an event to trigger hooks
+ *
+ * Request body: {
+ *   projectPath: string,
+ *   eventId: string,
+ *   hookIds?: string[] // Optional: specific hooks to run (if not provided, runs all enabled matching hooks)
+ * }
+ * Response: { success: true, result: EventReplayResult }
+ */
+
+import type { Request, Response } from 'express';
+import type { EventHistoryService } from '../../../services/event-history-service.js';
+import type { SettingsService } from '../../../services/settings-service.js';
+import type { EventReplayResult, EventReplayHookResult, EventHook } from '@automaker/types';
+import { exec } from 'child_process';
+import { promisify } from 'util';
+import { getErrorMessage, logError, logger } from '../common.js';
+
+const execAsync = promisify(exec);
+
+/** Default timeout for shell commands (30 seconds) */
+const DEFAULT_SHELL_TIMEOUT = 30000;
+
+/** Default timeout for HTTP requests (10 seconds) */
+const DEFAULT_HTTP_TIMEOUT = 10000;
+
+interface HookContext {
+  featureId?: string;
+  featureName?: string;
+  projectPath?: string;
+  projectName?: string;
+  error?: string;
+  errorType?: string;
+  timestamp: string;
+  eventType: string;
+}
+
+/**
+ * Substitute {{variable}} placeholders in a string
+ */
+function substituteVariables(template: string, context: HookContext): string {
+  return template.replace(/\{\{(\w+)\}\}/g, (match, variable) => {
+    const value = context[variable as keyof HookContext];
+    if (value === undefined || value === null) {
+      return '';
+    }
+    return String(value);
+  });
+}
+
+/**
+ * Execute a single hook and return the result
+ */
+async function executeHook(hook: EventHook, context: HookContext): Promise<EventReplayHookResult> {
+  const hookName = hook.name || hook.id;
+  const startTime = Date.now();
+
+  try {
+    if (hook.action.type === 'shell') {
+      const command = substituteVariables(hook.action.command, context);
+      const timeout = hook.action.timeout || DEFAULT_SHELL_TIMEOUT;
+
+      logger.info(`Replaying shell hook "${hookName}": ${command}`);
+
+      await execAsync(command, {
+        timeout,
+        maxBuffer: 1024 * 1024,
+      });
+
+      return {
+        hookId: hook.id,
+        hookName: hook.name,
+        success: true,
+        durationMs: Date.now() - startTime,
+      };
+    } else if (hook.action.type === 'http') {
+      const url = substituteVariables(hook.action.url, context);
+      const method = hook.action.method || 'POST';
+
+      const headers: Record<string, string> = {
+        'Content-Type': 'application/json',
+      };
+      if (hook.action.headers) {
+        for (const [key, value] of Object.entries(hook.action.headers)) {
+          headers[key] = substituteVariables(value, context);
+        }
+      }
+
+      let body: string | undefined;
+      if (hook.action.body) {
+        body = substituteVariables(hook.action.body, context);
+      } else if (method !== 'GET') {
+        body = JSON.stringify({
+          eventType: context.eventType,
+          timestamp: context.timestamp,
+          featureId: context.featureId,
+          projectPath: context.projectPath,
+          projectName: context.projectName,
+          error: context.error,
+        });
+      }
+
+      logger.info(`Replaying HTTP hook "${hookName}": ${method} ${url}`);
+
+      const controller = new AbortController();
+      const timeoutId = setTimeout(() => controller.abort(), DEFAULT_HTTP_TIMEOUT);
+
+      const response = await fetch(url, {
+        method,
+        headers,
+        body: method !== 'GET' ? body : undefined,
+        signal: controller.signal,
+      });
+
+      clearTimeout(timeoutId);
+
+      if (!response.ok) {
+        return {
+          hookId: hook.id,
+          hookName: hook.name,
+          success: false,
+          error: `HTTP ${response.status}: ${response.statusText}`,
+          durationMs: Date.now() - startTime,
+        };
+      }
+
+      return {
+        hookId: hook.id,
+        hookName: hook.name,
+        success: true,
+        durationMs: Date.now() - startTime,
+      };
+    }
+
+    return {
+      hookId: hook.id,
+      hookName: hook.name,
+      success: false,
+      error: 'Unknown hook action type',
+      durationMs: Date.now() - startTime,
+    };
+  } catch (error) {
+    const errorMessage =
+      error instanceof Error
+        ? error.name === 'AbortError'
+          ? 'Request timed out'
+          : error.message
+        : String(error);
+
+    return {
+      hookId: hook.id,
+      hookName: hook.name,
+      success: false,
+      error: errorMessage,
+      durationMs: Date.now() - startTime,
+    };
+  }
+}
+
+export function createReplayHandler(
+  eventHistoryService: EventHistoryService,
+  settingsService: SettingsService
+) {
+  return async (req: Request, res: Response): Promise<void> => {
+    try {
+      const { projectPath, eventId, hookIds } = req.body as {
+        projectPath: string;
+        eventId: string;
+        hookIds?: string[];
+      };
+
+      if (!projectPath || typeof projectPath !== 'string') {
+        res.status(400).json({ success: false, error: 'projectPath is required' });
+        return;
+      }
+
+      if (!eventId || typeof eventId !== 'string') {
+        res.status(400).json({ success: false, error: 'eventId is required' });
+        return;
+      }
+
+      // Get the event
+      const event = await eventHistoryService.getEvent(projectPath, eventId);
+      if (!event) {
+        res.status(404).json({ success: false, error: 'Event not found' });
+        return;
+      }
+
+      // Get hooks from settings
+      const settings = await settingsService.getGlobalSettings();
+      let hooks = settings.eventHooks || [];
+
+      // Filter to matching trigger and enabled hooks
+      hooks = hooks.filter((h) => h.enabled && h.trigger === event.trigger);
+
+      // If specific hook IDs requested, filter to those
+      if (hookIds && hookIds.length > 0) {
+        hooks = hooks.filter((h) => hookIds.includes(h.id));
+      }
+
+      // Build context for variable substitution
+      const context: HookContext = {
+        featureId: event.featureId,
+        featureName: event.featureName,
+        projectPath: event.projectPath,
+        projectName: event.projectName,
+        error: event.error,
+        errorType: event.errorType,
+        timestamp: event.timestamp,
+        eventType: event.trigger,
+      };
+
+      // Execute all hooks in parallel
+      const hookResults = await Promise.all(hooks.map((hook) => executeHook(hook, context)));
+
+      const result: EventReplayResult = {
+        eventId,
+        hooksTriggered: hooks.length,
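+        // Per-hook outcomes (success flag, duration, error message) gathered by executeHook above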
hookResults, + }; + + logger.info(`Replayed event ${eventId}: ${hooks.length} hooks triggered`); + + res.json({ + success: true, + result, + }); + } catch (error) { + logError(error, 'Replay event failed'); + res.status(500).json({ success: false, error: getErrorMessage(error) }); + } + }; +} diff --git a/apps/server/src/routes/features/index.ts b/apps/server/src/routes/features/index.ts index dd58e4aa..439ab6a9 100644 --- a/apps/server/src/routes/features/index.ts +++ b/apps/server/src/routes/features/index.ts @@ -5,6 +5,7 @@ import { Router } from 'express'; import { FeatureLoader } from '../../services/feature-loader.js'; import type { SettingsService } from '../../services/settings-service.js'; +import type { EventEmitter } from '../../lib/events.js'; import { validatePathParams } from '../../middleware/validate-paths.js'; import { createListHandler } from './routes/list.js'; import { createGetHandler } from './routes/get.js'; @@ -18,13 +19,18 @@ import { createGenerateTitleHandler } from './routes/generate-title.js'; export function createFeaturesRoutes( featureLoader: FeatureLoader, - settingsService?: SettingsService + settingsService?: SettingsService, + events?: EventEmitter ): Router { const router = Router(); router.post('/list', validatePathParams('projectPath'), createListHandler(featureLoader)); router.post('/get', validatePathParams('projectPath'), createGetHandler(featureLoader)); - router.post('/create', validatePathParams('projectPath'), createCreateHandler(featureLoader)); + router.post( + '/create', + validatePathParams('projectPath'), + createCreateHandler(featureLoader, events) + ); router.post('/update', validatePathParams('projectPath'), createUpdateHandler(featureLoader)); router.post( '/bulk-update', diff --git a/apps/server/src/routes/features/routes/create.ts b/apps/server/src/routes/features/routes/create.ts index 5f04ecdb..a4df45a6 100644 --- a/apps/server/src/routes/features/routes/create.ts +++ b/apps/server/src/routes/features/routes/create.ts @@ -4,10 +4,11 @@ import type { Request, Response } from 'express'; import { FeatureLoader } from '../../../services/feature-loader.js'; +import type { EventEmitter } from '../../../lib/events.js'; import type { Feature } from '@automaker/types'; import { getErrorMessage, logError } from '../common.js'; -export function createCreateHandler(featureLoader: FeatureLoader) { +export function createCreateHandler(featureLoader: FeatureLoader, events?: EventEmitter) { return async (req: Request, res: Response): Promise => { try { const { projectPath, feature } = req.body as { @@ -23,7 +24,30 @@ export function createCreateHandler(featureLoader: FeatureLoader) { return; } + // Check for duplicate title if title is provided + if (feature.title && feature.title.trim()) { + const duplicate = await featureLoader.findDuplicateTitle(projectPath, feature.title); + if (duplicate) { + res.status(409).json({ + success: false, + error: `A feature with title "${feature.title}" already exists`, + duplicateFeatureId: duplicate.id, + }); + return; + } + } + const created = await featureLoader.create(projectPath, feature); + + // Emit feature_created event for hooks + if (events) { + events.emit('feature:created', { + featureId: created.id, + featureName: created.name, + projectPath, + }); + } + res.json({ success: true, feature: created }); } catch (error) { logError(error, 'Create feature failed'); diff --git a/apps/server/src/routes/features/routes/update.ts b/apps/server/src/routes/features/routes/update.ts index 1a89cda3..a5b532c1 
100644 --- a/apps/server/src/routes/features/routes/update.ts +++ b/apps/server/src/routes/features/routes/update.ts @@ -4,8 +4,14 @@ import type { Request, Response } from 'express'; import { FeatureLoader } from '../../../services/feature-loader.js'; -import type { Feature } from '@automaker/types'; +import type { Feature, FeatureStatus } from '@automaker/types'; import { getErrorMessage, logError } from '../common.js'; +import { createLogger } from '@automaker/utils'; + +const logger = createLogger('features/update'); + +// Statuses that should trigger syncing to app_spec.txt +const SYNC_TRIGGER_STATUSES: FeatureStatus[] = ['verified', 'completed']; export function createUpdateHandler(featureLoader: FeatureLoader) { return async (req: Request, res: Response): Promise => { @@ -34,6 +40,28 @@ export function createUpdateHandler(featureLoader: FeatureLoader) { return; } + // Check for duplicate title if title is being updated + if (updates.title && updates.title.trim()) { + const duplicate = await featureLoader.findDuplicateTitle( + projectPath, + updates.title, + featureId // Exclude the current feature from duplicate check + ); + if (duplicate) { + res.status(409).json({ + success: false, + error: `A feature with title "${updates.title}" already exists`, + duplicateFeatureId: duplicate.id, + }); + return; + } + } + + // Get the current feature to detect status changes + const currentFeature = await featureLoader.get(projectPath, featureId); + const previousStatus = currentFeature?.status as FeatureStatus | undefined; + const newStatus = updates.status as FeatureStatus | undefined; + const updated = await featureLoader.update( projectPath, featureId, @@ -42,6 +70,22 @@ export function createUpdateHandler(featureLoader: FeatureLoader) { enhancementMode, preEnhancementDescription ); + + // Trigger sync to app_spec.txt when status changes to verified or completed + if (newStatus && SYNC_TRIGGER_STATUSES.includes(newStatus) && previousStatus !== newStatus) { + try { + const synced = await featureLoader.syncFeatureToAppSpec(projectPath, updated); + if (synced) { + logger.info( + `Synced feature "${updated.title || updated.id}" to app_spec.txt on status change to ${newStatus}` + ); + } + } catch (syncError) { + // Log the sync error but don't fail the update operation + logger.error(`Failed to sync feature to app_spec.txt:`, syncError); + } + } + res.json({ success: true, feature: updated }); } catch (error) { logError(error, 'Update feature failed'); diff --git a/apps/server/src/routes/notifications/common.ts b/apps/server/src/routes/notifications/common.ts new file mode 100644 index 00000000..707e3a0d --- /dev/null +++ b/apps/server/src/routes/notifications/common.ts @@ -0,0 +1,21 @@ +/** + * Common utilities for notification routes + * + * Provides logger and error handling utilities shared across all notification endpoints. 
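+ *
+ * Example (illustrative sketch; the route handlers below follow this shape):
+ *
+ *   try {
+ *     // ...handler work...
+ *   } catch (error) {
+ *     logError(error, 'Operation failed');
+ *     res.status(500).json({ success: false, error: getErrorMessage(error) });
+ *   }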
+ */ + +import { createLogger } from '@automaker/utils'; +import { getErrorMessage as getErrorMessageShared, createLogError } from '../common.js'; + +/** Logger instance for notification-related operations */ +export const logger = createLogger('Notifications'); + +/** + * Extract user-friendly error message from error objects + */ +export { getErrorMessageShared as getErrorMessage }; + +/** + * Log error with automatic logger binding + */ +export const logError = createLogError(logger); diff --git a/apps/server/src/routes/notifications/index.ts b/apps/server/src/routes/notifications/index.ts new file mode 100644 index 00000000..2def111a --- /dev/null +++ b/apps/server/src/routes/notifications/index.ts @@ -0,0 +1,62 @@ +/** + * Notifications routes - HTTP API for project-level notifications + * + * Provides endpoints for: + * - Listing notifications + * - Getting unread count + * - Marking notifications as read + * - Dismissing notifications + * + * All endpoints use handler factories that receive the NotificationService instance. + * Mounted at /api/notifications in the main server. + */ + +import { Router } from 'express'; +import type { NotificationService } from '../../services/notification-service.js'; +import { validatePathParams } from '../../middleware/validate-paths.js'; +import { createListHandler } from './routes/list.js'; +import { createUnreadCountHandler } from './routes/unread-count.js'; +import { createMarkReadHandler } from './routes/mark-read.js'; +import { createDismissHandler } from './routes/dismiss.js'; + +/** + * Create notifications router with all endpoints + * + * Endpoints: + * - POST /list - List all notifications for a project + * - POST /unread-count - Get unread notification count + * - POST /mark-read - Mark notification(s) as read + * - POST /dismiss - Dismiss notification(s) + * + * @param notificationService - Instance of NotificationService + * @returns Express Router configured with all notification endpoints + */ +export function createNotificationsRoutes(notificationService: NotificationService): Router { + const router = Router(); + + // List notifications + router.post('/list', validatePathParams('projectPath'), createListHandler(notificationService)); + + // Get unread count + router.post( + '/unread-count', + validatePathParams('projectPath'), + createUnreadCountHandler(notificationService) + ); + + // Mark as read (single or all) + router.post( + '/mark-read', + validatePathParams('projectPath'), + createMarkReadHandler(notificationService) + ); + + // Dismiss (single or all) + router.post( + '/dismiss', + validatePathParams('projectPath'), + createDismissHandler(notificationService) + ); + + return router; +} diff --git a/apps/server/src/routes/notifications/routes/dismiss.ts b/apps/server/src/routes/notifications/routes/dismiss.ts new file mode 100644 index 00000000..c609f170 --- /dev/null +++ b/apps/server/src/routes/notifications/routes/dismiss.ts @@ -0,0 +1,53 @@ +/** + * POST /api/notifications/dismiss - Dismiss notification(s) + * + * Request body: { projectPath: string, notificationId?: string } + * - If notificationId provided: dismisses that notification + * - If notificationId not provided: dismisses all notifications + * + * Response: { success: true, dismissed: boolean | count: number } + */ + +import type { Request, Response } from 'express'; +import type { NotificationService } from '../../../services/notification-service.js'; +import { getErrorMessage, logError } from '../common.js'; + +/** + * Create handler for POST 
/api/notifications/dismiss + * + * @param notificationService - Instance of NotificationService + * @returns Express request handler + */ +export function createDismissHandler(notificationService: NotificationService) { + return async (req: Request, res: Response): Promise => { + try { + const { projectPath, notificationId } = req.body; + + if (!projectPath || typeof projectPath !== 'string') { + res.status(400).json({ success: false, error: 'projectPath is required' }); + return; + } + + // If notificationId provided, dismiss single notification + if (notificationId) { + const dismissed = await notificationService.dismissNotification( + projectPath, + notificationId + ); + if (!dismissed) { + res.status(404).json({ success: false, error: 'Notification not found' }); + return; + } + res.json({ success: true, dismissed: true }); + return; + } + + // Otherwise dismiss all + const count = await notificationService.dismissAll(projectPath); + res.json({ success: true, count }); + } catch (error) { + logError(error, 'Dismiss failed'); + res.status(500).json({ success: false, error: getErrorMessage(error) }); + } + }; +} diff --git a/apps/server/src/routes/notifications/routes/list.ts b/apps/server/src/routes/notifications/routes/list.ts new file mode 100644 index 00000000..46197fe9 --- /dev/null +++ b/apps/server/src/routes/notifications/routes/list.ts @@ -0,0 +1,39 @@ +/** + * POST /api/notifications/list - List all notifications for a project + * + * Request body: { projectPath: string } + * Response: { success: true, notifications: Notification[] } + */ + +import type { Request, Response } from 'express'; +import type { NotificationService } from '../../../services/notification-service.js'; +import { getErrorMessage, logError } from '../common.js'; + +/** + * Create handler for POST /api/notifications/list + * + * @param notificationService - Instance of NotificationService + * @returns Express request handler + */ +export function createListHandler(notificationService: NotificationService) { + return async (req: Request, res: Response): Promise => { + try { + const { projectPath } = req.body; + + if (!projectPath || typeof projectPath !== 'string') { + res.status(400).json({ success: false, error: 'projectPath is required' }); + return; + } + + const notifications = await notificationService.getNotifications(projectPath); + + res.json({ + success: true, + notifications, + }); + } catch (error) { + logError(error, 'List notifications failed'); + res.status(500).json({ success: false, error: getErrorMessage(error) }); + } + }; +} diff --git a/apps/server/src/routes/notifications/routes/mark-read.ts b/apps/server/src/routes/notifications/routes/mark-read.ts new file mode 100644 index 00000000..4c9bfeb5 --- /dev/null +++ b/apps/server/src/routes/notifications/routes/mark-read.ts @@ -0,0 +1,50 @@ +/** + * POST /api/notifications/mark-read - Mark notification(s) as read + * + * Request body: { projectPath: string, notificationId?: string } + * - If notificationId provided: marks that notification as read + * - If notificationId not provided: marks all notifications as read + * + * Response: { success: true, count?: number, notification?: Notification } + */ + +import type { Request, Response } from 'express'; +import type { NotificationService } from '../../../services/notification-service.js'; +import { getErrorMessage, logError } from '../common.js'; + +/** + * Create handler for POST /api/notifications/mark-read + * + * @param notificationService - Instance of NotificationService + * @returns 
Express request handler + */ +export function createMarkReadHandler(notificationService: NotificationService) { + return async (req: Request, res: Response): Promise => { + try { + const { projectPath, notificationId } = req.body; + + if (!projectPath || typeof projectPath !== 'string') { + res.status(400).json({ success: false, error: 'projectPath is required' }); + return; + } + + // If notificationId provided, mark single notification + if (notificationId) { + const notification = await notificationService.markAsRead(projectPath, notificationId); + if (!notification) { + res.status(404).json({ success: false, error: 'Notification not found' }); + return; + } + res.json({ success: true, notification }); + return; + } + + // Otherwise mark all as read + const count = await notificationService.markAllAsRead(projectPath); + res.json({ success: true, count }); + } catch (error) { + logError(error, 'Mark read failed'); + res.status(500).json({ success: false, error: getErrorMessage(error) }); + } + }; +} diff --git a/apps/server/src/routes/notifications/routes/unread-count.ts b/apps/server/src/routes/notifications/routes/unread-count.ts new file mode 100644 index 00000000..98d8e198 --- /dev/null +++ b/apps/server/src/routes/notifications/routes/unread-count.ts @@ -0,0 +1,39 @@ +/** + * POST /api/notifications/unread-count - Get unread notification count + * + * Request body: { projectPath: string } + * Response: { success: true, count: number } + */ + +import type { Request, Response } from 'express'; +import type { NotificationService } from '../../../services/notification-service.js'; +import { getErrorMessage, logError } from '../common.js'; + +/** + * Create handler for POST /api/notifications/unread-count + * + * @param notificationService - Instance of NotificationService + * @returns Express request handler + */ +export function createUnreadCountHandler(notificationService: NotificationService) { + return async (req: Request, res: Response): Promise => { + try { + const { projectPath } = req.body; + + if (!projectPath || typeof projectPath !== 'string') { + res.status(400).json({ success: false, error: 'projectPath is required' }); + return; + } + + const count = await notificationService.getUnreadCount(projectPath); + + res.json({ + success: true, + count, + }); + } catch (error) { + logError(error, 'Get unread count failed'); + res.status(500).json({ success: false, error: getErrorMessage(error) }); + } + }; +} diff --git a/apps/server/src/routes/running-agents/routes/index.ts b/apps/server/src/routes/running-agents/routes/index.ts index 955ac93d..1eeb4ae6 100644 --- a/apps/server/src/routes/running-agents/routes/index.ts +++ b/apps/server/src/routes/running-agents/routes/index.ts @@ -5,6 +5,7 @@ import type { Request, Response } from 'express'; import type { AutoModeService } from '../../../services/auto-mode-service.js'; import { getBacklogPlanStatus, getRunningDetails } from '../../backlog-plan/common.js'; +import { getAllRunningGenerations } from '../../app-spec/common.js'; import path from 'path'; import { getErrorMessage, logError } from '../common.js'; @@ -26,6 +27,36 @@ export function createIndexHandler(autoModeService: AutoModeService) { }); } + // Add spec/feature generation tasks + const specGenerations = getAllRunningGenerations(); + for (const generation of specGenerations) { + let title: string; + let description: string; + + switch (generation.type) { + case 'feature_generation': + title = 'Generating features from spec'; + description = 'Creating features from the 
project specification'; + break; + case 'sync': + title = 'Syncing spec with code'; + description = 'Updating spec from codebase and completed features'; + break; + default: + title = 'Regenerating spec'; + description = 'Analyzing project and generating specification'; + } + + runningAgents.push({ + featureId: `spec-generation:${generation.projectPath}`, + projectPath: generation.projectPath, + projectName: path.basename(generation.projectPath), + isAutoMode: false, + title, + description, + }); + } + res.json({ success: true, runningAgents, diff --git a/apps/server/src/routes/worktree/routes/dev-server-logs.ts b/apps/server/src/routes/worktree/routes/dev-server-logs.ts index 66dfed92..79d3553b 100644 --- a/apps/server/src/routes/worktree/routes/dev-server-logs.ts +++ b/apps/server/src/routes/worktree/routes/dev-server-logs.ts @@ -34,6 +34,7 @@ export function createGetDevServerLogsHandler() { result: { worktreePath: result.result.worktreePath, port: result.result.port, + url: result.result.url, logs: result.result.logs, startedAt: result.result.startedAt, }, diff --git a/apps/server/src/services/auto-mode-service.ts b/apps/server/src/services/auto-mode-service.ts index 05722181..454b7ec0 100644 --- a/apps/server/src/services/auto-mode-service.ts +++ b/apps/server/src/services/auto-mode-service.ts @@ -29,6 +29,10 @@ import { appendLearning, recordMemoryUsage, createLogger, + atomicWriteJson, + readJsonWithRecovery, + logRecoveryWarning, + DEFAULT_BACKUP_COUNT, } from '@automaker/utils'; const logger = createLogger('AutoMode'); @@ -60,6 +64,7 @@ import { getMCPServersFromSettings, getPromptCustomization, } from '../lib/settings-helpers.js'; +import { getNotificationService } from './notification-service.js'; const execAsync = promisify(exec); @@ -386,6 +391,7 @@ export class AutoModeService { this.emitAutoModeEvent('auto_mode_error', { error: errorInfo.message, errorType: errorInfo.type, + projectPath, }); }); } @@ -1414,13 +1420,13 @@ Address the follow-up instructions above. Review the previous work and make the allImagePaths.push(...allPaths); } - // Save updated feature.json with new images + // Save updated feature.json with new images (atomic write with backup) if (copiedImagePaths.length > 0 && feature) { const featureDirForSave = getFeatureDir(projectPath, featureId); const featurePath = path.join(featureDirForSave, 'feature.json'); try { - await secureFs.writeFile(featurePath, JSON.stringify(feature, null, 2)); + await atomicWriteJson(featurePath, feature, { backupCount: DEFAULT_BACKUP_COUNT }); } catch (error) { logger.error(`Failed to save feature.json:`, error); } @@ -1547,6 +1553,7 @@ Address the follow-up instructions above. Review the previous work and make the message: allPassed ? 'All verification checks passed' : `Verification failed: ${results.find((r) => !r.passed)?.check || 'Unknown'}`, + projectPath, }); return allPassed; @@ -1620,6 +1627,7 @@ Address the follow-up instructions above. 
Review the previous work and make the featureId, passes: true, message: `Changes committed: ${hash.trim().substring(0, 8)}`, + projectPath, }); return hash.trim(); @@ -2088,8 +2096,20 @@ Format your response as a structured markdown document.`; const featurePath = path.join(featureDir, 'feature.json'); try { - const data = (await secureFs.readFile(featurePath, 'utf-8')) as string; - const feature = JSON.parse(data); + // Use recovery-enabled read for corrupted file handling + const result = await readJsonWithRecovery(featurePath, null, { + maxBackups: DEFAULT_BACKUP_COUNT, + autoRestore: true, + }); + + logRecoveryWarning(result, `Feature ${featureId}`, logger); + + const feature = result.data; + if (!feature) { + logger.warn(`Feature ${featureId} not found or could not be recovered`); + return; + } + feature.status = status; feature.updatedAt = new Date().toISOString(); // Set justFinishedAt timestamp when moving to waiting_approval (agent just completed) @@ -2100,9 +2120,41 @@ Format your response as a structured markdown document.`; // Clear the timestamp when moving to other statuses feature.justFinishedAt = undefined; } - await secureFs.writeFile(featurePath, JSON.stringify(feature, null, 2)); - } catch { - // Feature file may not exist + + // Use atomic write with backup support + await atomicWriteJson(featurePath, feature, { backupCount: DEFAULT_BACKUP_COUNT }); + + // Create notifications for important status changes + const notificationService = getNotificationService(); + if (status === 'waiting_approval') { + await notificationService.createNotification({ + type: 'feature_waiting_approval', + title: 'Feature Ready for Review', + message: `"${feature.name || featureId}" is ready for your review and approval.`, + featureId, + projectPath, + }); + } else if (status === 'verified') { + await notificationService.createNotification({ + type: 'feature_verified', + title: 'Feature Verified', + message: `"${feature.name || featureId}" has been verified and is complete.`, + featureId, + projectPath, + }); + } + + // Sync completed/verified features to app_spec.txt + if (status === 'verified' || status === 'completed') { + try { + await this.featureLoader.syncFeatureToAppSpec(projectPath, feature); + } catch (syncError) { + // Log but don't fail the status update if sync fails + logger.warn(`Failed to sync feature ${featureId} to app_spec.txt:`, syncError); + } + } + } catch (error) { + logger.error(`Failed to update feature status for ${featureId}:`, error); } } @@ -2114,11 +2166,24 @@ Format your response as a structured markdown document.`; featureId: string, updates: Partial ): Promise { - const featurePath = path.join(projectPath, '.automaker', 'features', featureId, 'feature.json'); + // Use getFeatureDir helper for consistent path resolution + const featureDir = getFeatureDir(projectPath, featureId); + const featurePath = path.join(featureDir, 'feature.json'); try { - const data = (await secureFs.readFile(featurePath, 'utf-8')) as string; - const feature = JSON.parse(data); + // Use recovery-enabled read for corrupted file handling + const result = await readJsonWithRecovery(featurePath, null, { + maxBackups: DEFAULT_BACKUP_COUNT, + autoRestore: true, + }); + + logRecoveryWarning(result, `Feature ${featureId}`, logger); + + const feature = result.data; + if (!feature) { + logger.warn(`Feature ${featureId} not found or could not be recovered`); + return; + } // Initialize planSpec if it doesn't exist if (!feature.planSpec) { @@ -2138,7 +2203,9 @@ Format your response as a structured 
markdown document.`; } feature.updatedAt = new Date().toISOString(); - await secureFs.writeFile(featurePath, JSON.stringify(feature, null, 2)); + + // Use atomic write with backup support + await atomicWriteJson(featurePath, feature, { backupCount: DEFAULT_BACKUP_COUNT }); } catch (error) { logger.error(`Failed to update planSpec for ${featureId}:`, error); } @@ -2155,25 +2222,34 @@ Format your response as a structured markdown document.`; const allFeatures: Feature[] = []; const pendingFeatures: Feature[] = []; - // Load all features (for dependency checking) + // Load all features (for dependency checking) with recovery support for (const entry of entries) { if (entry.isDirectory()) { const featurePath = path.join(featuresDir, entry.name, 'feature.json'); - try { - const data = (await secureFs.readFile(featurePath, 'utf-8')) as string; - const feature = JSON.parse(data); - allFeatures.push(feature); - // Track pending features separately - if ( - feature.status === 'pending' || - feature.status === 'ready' || - feature.status === 'backlog' - ) { - pendingFeatures.push(feature); - } - } catch { - // Skip invalid features + // Use recovery-enabled read for corrupted file handling + const result = await readJsonWithRecovery(featurePath, null, { + maxBackups: DEFAULT_BACKUP_COUNT, + autoRestore: true, + }); + + logRecoveryWarning(result, `Feature ${entry.name}`, logger); + + const feature = result.data; + if (!feature) { + // Skip features that couldn't be loaded or recovered + continue; + } + + allFeatures.push(feature); + + // Track pending features separately + if ( + feature.status === 'pending' || + feature.status === 'ready' || + feature.status === 'backlog' + ) { + pendingFeatures.push(feature); } } } @@ -3405,31 +3481,39 @@ After generating the revised spec, output: for (const entry of entries) { if (entry.isDirectory()) { const featurePath = path.join(featuresDir, entry.name, 'feature.json'); - try { - const data = (await secureFs.readFile(featurePath, 'utf-8')) as string; - const feature = JSON.parse(data) as Feature; - // Check if feature was interrupted (in_progress or pipeline_*) - if ( - feature.status === 'in_progress' || - (feature.status && feature.status.startsWith('pipeline_')) - ) { - // Verify it has existing context (agent-output.md) - const featureDir = getFeatureDir(projectPath, feature.id); - const contextPath = path.join(featureDir, 'agent-output.md'); - try { - await secureFs.access(contextPath); - interruptedFeatures.push(feature); - logger.info( - `Found interrupted feature: ${feature.id} (${feature.title}) - status: ${feature.status}` - ); - } catch { - // No context file, skip this feature - it will be restarted fresh - logger.info(`Interrupted feature ${feature.id} has no context, will restart fresh`); - } + // Use recovery-enabled read for corrupted file handling + const result = await readJsonWithRecovery(featurePath, null, { + maxBackups: DEFAULT_BACKUP_COUNT, + autoRestore: true, + }); + + logRecoveryWarning(result, `Feature ${entry.name}`, logger); + + const feature = result.data; + if (!feature) { + // Skip features that couldn't be loaded or recovered + continue; + } + + // Check if feature was interrupted (in_progress or pipeline_*) + if ( + feature.status === 'in_progress' || + (feature.status && feature.status.startsWith('pipeline_')) + ) { + // Verify it has existing context (agent-output.md) + const featureDir = getFeatureDir(projectPath, feature.id); + const contextPath = path.join(featureDir, 'agent-output.md'); + try { + await 
secureFs.access(contextPath); + interruptedFeatures.push(feature); + logger.info( + `Found interrupted feature: ${feature.id} (${feature.title}) - status: ${feature.status}` + ); + } catch { + // No context file, skip this feature - it will be restarted fresh + logger.info(`Interrupted feature ${feature.id} has no context, will restart fresh`); } - } catch { - // Skip invalid features } } } diff --git a/apps/server/src/services/claude-usage-service.ts b/apps/server/src/services/claude-usage-service.ts index 64dceb6a..aebed98b 100644 --- a/apps/server/src/services/claude-usage-service.ts +++ b/apps/server/src/services/claude-usage-service.ts @@ -22,6 +22,29 @@ export class ClaudeUsageService { private timeout = 30000; // 30 second timeout private isWindows = os.platform() === 'win32'; private isLinux = os.platform() === 'linux'; + // On Windows, ConPTY requires AttachConsole which fails in Electron/service mode + // Detect Electron by checking for electron-specific env vars or process properties + // When in Electron, always use winpty to avoid ConPTY's AttachConsole errors + private isElectron = + !!(process.versions && (process.versions as Record).electron) || + !!process.env.ELECTRON_RUN_AS_NODE; + private useConptyFallback = false; // Track if we need to use winpty fallback on Windows + + /** + * Kill a PTY process with platform-specific handling. + * Windows doesn't support Unix signals like SIGTERM, so we call kill() without arguments. + * On Unix-like systems (macOS, Linux), we can specify the signal. + * + * @param ptyProcess - The PTY process to kill + * @param signal - The signal to send on Unix-like systems (default: 'SIGTERM') + */ + private killPtyProcess(ptyProcess: pty.IPty, signal: string = 'SIGTERM'): void { + if (this.isWindows) { + ptyProcess.kill(); + } else { + ptyProcess.kill(signal); + } + } /** * Check if Claude CLI is available on the system @@ -181,37 +204,94 @@ export class ClaudeUsageService { ? ['/c', 'claude', '--add-dir', workingDirectory] : ['-c', `claude --add-dir "${workingDirectory}"`]; + // Using 'any' for ptyProcess because node-pty types don't include 'killed' property + // eslint-disable-next-line @typescript-eslint/no-explicit-any let ptyProcess: any = null; + // Build PTY spawn options + const ptyOptions: pty.IPtyForkOptions = { + name: 'xterm-256color', + cols: 120, + rows: 30, + cwd: workingDirectory, + env: { + ...process.env, + TERM: 'xterm-256color', + } as Record, + }; + + // On Windows, always use winpty instead of ConPTY + // ConPTY requires AttachConsole which fails in many contexts: + // - Electron apps without a console + // - VS Code integrated terminal + // - Spawned from other applications + // The error happens in a subprocess so we can't catch it - must proactively disable + if (this.isWindows) { + (ptyOptions as pty.IWindowsPtyForkOptions).useConpty = false; + logger.info( + '[executeClaudeUsageCommandPty] Using winpty on Windows (ConPTY disabled for compatibility)' + ); + } + try { - ptyProcess = pty.spawn(shell, args, { - name: 'xterm-256color', - cols: 120, - rows: 30, - cwd: workingDirectory, - env: { - ...process.env, - TERM: 'xterm-256color', - } as Record, - }); + ptyProcess = pty.spawn(shell, args, ptyOptions); } catch (spawnError) { const errorMessage = spawnError instanceof Error ? 
spawnError.message : String(spawnError); - logger.error('[executeClaudeUsageCommandPty] Failed to spawn PTY:', errorMessage); - // Return a user-friendly error instead of crashing - reject( - new Error( - `Unable to access terminal: ${errorMessage}. Claude CLI may not be available or PTY support is limited in this environment.` - ) - ); - return; + // Check for Windows ConPTY-specific errors + if (this.isWindows && errorMessage.includes('AttachConsole failed')) { + // ConPTY failed - try winpty fallback + if (!this.useConptyFallback) { + logger.warn( + '[executeClaudeUsageCommandPty] ConPTY AttachConsole failed, retrying with winpty fallback' + ); + this.useConptyFallback = true; + + try { + (ptyOptions as pty.IWindowsPtyForkOptions).useConpty = false; + ptyProcess = pty.spawn(shell, args, ptyOptions); + logger.info( + '[executeClaudeUsageCommandPty] Successfully spawned with winpty fallback' + ); + } catch (fallbackError) { + const fallbackMessage = + fallbackError instanceof Error ? fallbackError.message : String(fallbackError); + logger.error( + '[executeClaudeUsageCommandPty] Winpty fallback also failed:', + fallbackMessage + ); + reject( + new Error( + `Windows PTY unavailable: Both ConPTY and winpty failed. This typically happens when running in Electron without a console. ConPTY error: ${errorMessage}. Winpty error: ${fallbackMessage}` + ) + ); + return; + } + } else { + logger.error('[executeClaudeUsageCommandPty] Winpty fallback failed:', errorMessage); + reject( + new Error( + `Windows PTY unavailable: ${errorMessage}. The application is running without console access (common in Electron). Try running from a terminal window.` + ) + ); + return; + } + } else { + logger.error('[executeClaudeUsageCommandPty] Failed to spawn PTY:', errorMessage); + reject( + new Error( + `Unable to access terminal: ${errorMessage}. 
Claude CLI may not be available or PTY support is limited in this environment.` + ) + ); + return; + } } const timeoutId = setTimeout(() => { if (!settled) { settled = true; if (ptyProcess && !ptyProcess.killed) { - ptyProcess.kill(); + this.killPtyProcess(ptyProcess); } // Don't fail if we have data - return it instead if (output.includes('Current session')) { @@ -244,16 +324,23 @@ export class ClaudeUsageService { const cleanOutput = output.replace(/\x1B\[[0-9;]*[A-Za-z]/g, ''); // Check for specific authentication/permission errors - if ( - cleanOutput.includes('OAuth token does not meet scope requirement') || - cleanOutput.includes('permission_error') || - cleanOutput.includes('token_expired') || - cleanOutput.includes('authentication_error') - ) { + // Must be very specific to avoid false positives from garbled terminal encoding + // Removed permission_error check as it was causing false positives with winpty encoding + const authChecks = { + oauth: cleanOutput.includes('OAuth token does not meet scope requirement'), + tokenExpired: cleanOutput.includes('token_expired'), + // Only match if it looks like a JSON API error response + authError: + cleanOutput.includes('"type":"authentication_error"') || + cleanOutput.includes('"type": "authentication_error"'), + }; + const hasAuthError = authChecks.oauth || authChecks.tokenExpired || authChecks.authError; + + if (hasAuthError) { if (!settled) { settled = true; if (ptyProcess && !ptyProcess.killed) { - ptyProcess.kill(); + this.killPtyProcess(ptyProcess); } reject( new Error( @@ -265,11 +352,16 @@ export class ClaudeUsageService { } // Check if we've seen the usage data (look for "Current session" or the TUI Usage header) - if ( - !hasSeenUsageData && - (cleanOutput.includes('Current session') || - (cleanOutput.includes('Usage') && cleanOutput.includes('% left'))) - ) { + // Also check for percentage patterns that appear in usage output + const hasUsageIndicators = + cleanOutput.includes('Current session') || + (cleanOutput.includes('Usage') && cleanOutput.includes('% left')) || + // Additional patterns for winpty - look for percentage patterns + /\d+%\s*(left|used|remaining)/i.test(cleanOutput) || + cleanOutput.includes('Resets in') || + cleanOutput.includes('Current week'); + + if (!hasSeenUsageData && hasUsageIndicators) { hasSeenUsageData = true; // Wait for full output, then send escape to exit setTimeout(() => { @@ -277,9 +369,10 @@ export class ClaudeUsageService { ptyProcess.write('\x1b'); // Send escape key // Fallback: if ESC doesn't exit (Linux), use SIGTERM after 2s + // Windows doesn't support signals, so killPtyProcess handles platform differences setTimeout(() => { if (!settled && ptyProcess && !ptyProcess.killed) { - ptyProcess.kill('SIGTERM'); + this.killPtyProcess(ptyProcess); } }, 2000); } @@ -307,10 +400,18 @@ export class ClaudeUsageService { } // Detect REPL prompt and send /usage command - if ( - !hasSentCommand && - (cleanOutput.includes('❯') || cleanOutput.includes('? for shortcuts')) - ) { + // On Windows with winpty, Unicode prompt char ❯ gets garbled, so also check for ASCII indicators + const isReplReady = + cleanOutput.includes('❯') || + cleanOutput.includes('? 
for shortcuts') || + // Fallback for winpty garbled encoding - detect CLI welcome screen elements + (cleanOutput.includes('Welcome back') && cleanOutput.includes('Claude')) || + (cleanOutput.includes('Tips for getting started') && cleanOutput.includes('Claude')) || + // Detect model indicator which appears when REPL is ready + (cleanOutput.includes('Opus') && cleanOutput.includes('Claude API')) || + (cleanOutput.includes('Sonnet') && cleanOutput.includes('Claude API')); + + if (!hasSentCommand && isReplReady) { hasSentCommand = true; // Wait for REPL to fully settle setTimeout(() => { @@ -347,11 +448,9 @@ export class ClaudeUsageService { if (settled) return; settled = true; - if ( - output.includes('token_expired') || - output.includes('authentication_error') || - output.includes('permission_error') - ) { + // Check for auth errors - must be specific to avoid false positives + // Removed permission_error check as it was causing false positives with winpty encoding + if (output.includes('token_expired') || output.includes('"type":"authentication_error"')) { reject(new Error("Authentication required - please run 'claude login'")); return; } diff --git a/apps/server/src/services/dev-server-service.ts b/apps/server/src/services/dev-server-service.ts index 5187c0c8..49f5218c 100644 --- a/apps/server/src/services/dev-server-service.ts +++ b/apps/server/src/services/dev-server-service.ts @@ -379,10 +379,11 @@ class DevServerService { // Create server info early so we can reference it in handlers // We'll add it to runningServers after verifying the process started successfully + const hostname = process.env.HOSTNAME || 'localhost'; const serverInfo: DevServerInfo = { worktreePath, port, - url: `http://localhost:${port}`, + url: `http://${hostname}:${port}`, process: devProcess, startedAt: new Date(), scrollbackBuffer: '', @@ -474,7 +475,7 @@ class DevServerService { result: { worktreePath, port, - url: `http://localhost:${port}`, + url: `http://${hostname}:${port}`, message: `Dev server started on port ${port}`, }, }; @@ -594,6 +595,7 @@ class DevServerService { result?: { worktreePath: string; port: number; + url: string; logs: string; startedAt: string; }; @@ -613,6 +615,7 @@ class DevServerService { result: { worktreePath: server.worktreePath, port: server.port, + url: server.url, logs: server.scrollbackBuffer, startedAt: server.startedAt.toISOString(), }, diff --git a/apps/server/src/services/event-history-service.ts b/apps/server/src/services/event-history-service.ts new file mode 100644 index 00000000..b983af09 --- /dev/null +++ b/apps/server/src/services/event-history-service.ts @@ -0,0 +1,338 @@ +/** + * Event History Service - Stores and retrieves event records for debugging and replay + * + * Provides persistent storage for events in {projectPath}/.automaker/events/ + * Each event is stored as a separate JSON file with an index for quick listing. 
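+ *
+ * Illustrative on-disk layout (actual file names come from the
+ * @automaker/platform path helpers, so treat this as a sketch, not a contract):
+ *
+ *   {projectPath}/.automaker/events/
+ *     index.json                   - summary index for fast listing
+ *     evt-<timestamp>-<uuid>.json  - one full event record per file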
+ *
+ * Features:
+ * - Store events when they occur
+ * - List and filter historical events
+ * - Replay events to test hook configurations
+ * - Delete old events to manage disk space
+ */
+
+import { createLogger } from '@automaker/utils';
+import * as secureFs from '../lib/secure-fs.js';
+import {
+  getEventHistoryDir,
+  getEventHistoryIndexPath,
+  getEventPath,
+  ensureEventHistoryDir,
+} from '@automaker/platform';
+import type {
+  StoredEvent,
+  StoredEventIndex,
+  StoredEventSummary,
+  EventHistoryFilter,
+  EventHookTrigger,
+} from '@automaker/types';
+import { DEFAULT_EVENT_HISTORY_INDEX } from '@automaker/types';
+import { randomUUID } from 'crypto';
+
+const logger = createLogger('EventHistoryService');
+
+/** Maximum events to keep in the index (oldest are pruned) */
+const MAX_EVENTS_IN_INDEX = 1000;
+
+/**
+ * Atomic file write - write to temp file then rename
+ */
+async function atomicWriteJson(filePath: string, data: unknown): Promise<void> {
+  const tempPath = `${filePath}.tmp.${Date.now()}`;
+  const content = JSON.stringify(data, null, 2);
+
+  try {
+    await secureFs.writeFile(tempPath, content, 'utf-8');
+    await secureFs.rename(tempPath, filePath);
+  } catch (error) {
+    try {
+      await secureFs.unlink(tempPath);
+    } catch {
+      // Ignore cleanup errors
+    }
+    throw error;
+  }
+}
+
+/**
+ * Safely read JSON file with fallback to default
+ */
+async function readJsonFile<T>(filePath: string, defaultValue: T): Promise<T> {
+  try {
+    const content = (await secureFs.readFile(filePath, 'utf-8')) as string;
+    return JSON.parse(content) as T;
+  } catch (error) {
+    if ((error as NodeJS.ErrnoException).code === 'ENOENT') {
+      return defaultValue;
+    }
+    logger.error(`Error reading ${filePath}:`, error);
+    return defaultValue;
+  }
+}
+
+/**
+ * Input for storing a new event
+ */
+export interface StoreEventInput {
+  trigger: EventHookTrigger;
+  projectPath: string;
+  featureId?: string;
+  featureName?: string;
+  error?: string;
+  errorType?: string;
+  passes?: boolean;
+  metadata?: Record<string, unknown>;
+}
+
+/**
+ * EventHistoryService - Manages persistent storage of events
+ */
+export class EventHistoryService {
+  /**
+   * Store a new event to history
+   *
+   * @param input - Event data to store
+   * @returns Promise resolving to the stored event
+   */
+  async storeEvent(input: StoreEventInput): Promise<StoredEvent> {
+    const { projectPath, trigger, featureId, featureName, error, errorType, passes, metadata } =
+      input;
+
+    // Ensure events directory exists
+    await ensureEventHistoryDir(projectPath);
+
+    const eventId = `evt-${Date.now()}-${randomUUID().slice(0, 8)}`;
+    const timestamp = new Date().toISOString();
+    const projectName = this.extractProjectName(projectPath);
+
+    const event: StoredEvent = {
+      id: eventId,
+      trigger,
+      timestamp,
+      projectPath,
+      projectName,
+      featureId,
+      featureName,
+      error,
+      errorType,
+      passes,
+      metadata,
+    };
+
+    // Write the full event to its own file
+    const eventPath = getEventPath(projectPath, eventId);
+    await atomicWriteJson(eventPath, event);
+
+    // Update the index
+    await this.addToIndex(projectPath, event);
+
+    logger.info(`Stored event ${eventId} (${trigger}) for project ${projectName}`);
+
+    return event;
+  }
+
+  /**
+   * Get all events for a project with optional filtering
+   *
+   * @param projectPath - Absolute path to project directory
+   * @param filter - Optional filter criteria
+   * @returns Promise resolving to array of event summaries
+   */
+  async getEvents(projectPath: string, filter?: EventHistoryFilter): Promise<StoredEventSummary[]> {
+    const indexPath = getEventHistoryIndexPath(projectPath);
+    const index = await readJsonFile<StoredEventIndex>(indexPath, DEFAULT_EVENT_HISTORY_INDEX);
+
+    let events = [...index.events];
+
+    // Apply filters
+    if (filter) {
+      if (filter.trigger) {
+        events = events.filter((e) => e.trigger === filter.trigger);
+      }
+      if (filter.featureId) {
+        events = events.filter((e) => e.featureId === filter.featureId);
+      }
+      if (filter.since) {
+        const sinceDate = new Date(filter.since).getTime();
+        events = events.filter((e) => new Date(e.timestamp).getTime() >= sinceDate);
+      }
+      if (filter.until) {
+        const untilDate = new Date(filter.until).getTime();
+        events = events.filter((e) => new Date(e.timestamp).getTime() <= untilDate);
+      }
+    }
+
+    // Sort by timestamp (newest first)
+    events.sort((a, b) => new Date(b.timestamp).getTime() - new Date(a.timestamp).getTime());
+
+    // Apply pagination
+    if (filter?.offset) {
+      events = events.slice(filter.offset);
+    }
+    if (filter?.limit) {
+      events = events.slice(0, filter.limit);
+    }
+
+    return events;
+  }
+
+  /**
+   * Get a single event by ID
+   *
+   * @param projectPath - Absolute path to project directory
+   * @param eventId - Event identifier
+   * @returns Promise resolving to the full event or null if not found
+   */
+  async getEvent(projectPath: string, eventId: string): Promise<StoredEvent | null> {
+    const eventPath = getEventPath(projectPath, eventId);
+    try {
+      const content = (await secureFs.readFile(eventPath, 'utf-8')) as string;
+      return JSON.parse(content) as StoredEvent;
+    } catch (error) {
+      if ((error as NodeJS.ErrnoException).code === 'ENOENT') {
+        return null;
+      }
+      logger.error(`Error reading event ${eventId}:`, error);
+      return null;
+    }
+  }
+
+  /**
+   * Delete an event by ID
+   *
+   * @param projectPath - Absolute path to project directory
+   * @param eventId - Event identifier
+   * @returns Promise resolving to true if deleted
+   */
+  async deleteEvent(projectPath: string, eventId: string): Promise<boolean> {
+    // Remove from index
+    const indexPath = getEventHistoryIndexPath(projectPath);
+    const index = await readJsonFile<StoredEventIndex>(indexPath, DEFAULT_EVENT_HISTORY_INDEX);
+
+    const initialLength = index.events.length;
+    index.events = index.events.filter((e) => e.id !== eventId);
+
+    if (index.events.length === initialLength) {
+      return false; // Event not found in index
+    }
+
+    await atomicWriteJson(indexPath, index);
+
+    // Delete the event file
+    const eventPath = getEventPath(projectPath, eventId);
+    try {
+      await secureFs.unlink(eventPath);
+    } catch (error) {
+      if ((error as NodeJS.ErrnoException).code !== 'ENOENT') {
+        logger.error(`Error deleting event file ${eventId}:`, error);
+      }
+    }
+
+    logger.info(`Deleted event ${eventId}`);
+    return true;
+  }
+
+  /**
+   * Clear all events for a project
+   *
+   * @param projectPath - Absolute path to project directory
+   * @returns Promise resolving to number of events cleared
+   */
+  async clearEvents(projectPath: string): Promise<number> {
+    const indexPath = getEventHistoryIndexPath(projectPath);
+    const index = await readJsonFile<StoredEventIndex>(indexPath, DEFAULT_EVENT_HISTORY_INDEX);
+
+    const count = index.events.length;
+
+    // Delete all event files
+    for (const event of index.events) {
+      const eventPath = getEventPath(projectPath, event.id);
+      try {
+        await secureFs.unlink(eventPath);
+      } catch (error) {
+        if ((error as NodeJS.ErrnoException).code !== 'ENOENT') {
+          logger.error(`Error deleting event file ${event.id}:`, error);
+        }
+      }
+    }
+
+    // Reset the index
+    await atomicWriteJson(indexPath, DEFAULT_EVENT_HISTORY_INDEX);
+
+    logger.info(`Cleared ${count} events for project`);
+    return count;
+  }
+
+  /**
+   * Get event count for a project
+   *
+   * @param projectPath - Absolute path to project directory
+   * @param filter - Optional filter criteria
+   * @returns Promise resolving to event count
+   */
+  async getEventCount(projectPath: string, filter?: EventHistoryFilter): Promise<number> {
+    const events = await this.getEvents(projectPath, {
+      ...filter,
+      limit: undefined,
+      offset: undefined,
+    });
+    return events.length;
+  }
+
+  /**
+   * Add an event to the index (internal)
+   */
+  private async addToIndex(projectPath: string, event: StoredEvent): Promise<void> {
+    const indexPath = getEventHistoryIndexPath(projectPath);
+    const index = await readJsonFile<StoredEventIndex>(indexPath, DEFAULT_EVENT_HISTORY_INDEX);
+
+    const summary: StoredEventSummary = {
+      id: event.id,
+      trigger: event.trigger,
+      timestamp: event.timestamp,
+      featureName: event.featureName,
+      featureId: event.featureId,
+    };
+
+    // Add to beginning (newest first)
+    index.events.unshift(summary);
+
+    // Prune old events if over limit
+    if (index.events.length > MAX_EVENTS_IN_INDEX) {
+      const removed = index.events.splice(MAX_EVENTS_IN_INDEX);
+      // Delete the pruned event files
+      for (const oldEvent of removed) {
+        const eventPath = getEventPath(projectPath, oldEvent.id);
+        try {
+          await secureFs.unlink(eventPath);
+        } catch {
+          // Ignore deletion errors for pruned events
+        }
+      }
+      logger.info(`Pruned ${removed.length} old events from history`);
+    }
+
+    await atomicWriteJson(indexPath, index);
+  }
+
+  /**
+   * Extract project name from path
+   */
+  private extractProjectName(projectPath: string): string {
+    const parts = projectPath.split(/[/\\]/);
+    return parts[parts.length - 1] || projectPath;
+  }
+}
+
+// Singleton instance
+let eventHistoryServiceInstance: EventHistoryService | null = null;
+
+/**
+ * Get the singleton event history service instance
+ */
+export function getEventHistoryService(): EventHistoryService {
+  if (!eventHistoryServiceInstance) {
+    eventHistoryServiceInstance = new EventHistoryService();
+  }
+  return eventHistoryServiceInstance;
+}
diff --git a/apps/server/src/services/event-hook-service.ts b/apps/server/src/services/event-hook-service.ts
index d6d2d0b8..08da71dd 100644
--- a/apps/server/src/services/event-hook-service.ts
+++ b/apps/server/src/services/event-hook-service.ts
@@ -5,7 +5,10 @@
  * - Shell commands: Executed with configurable timeout
  * - HTTP webhooks: POST/GET/PUT/PATCH requests with variable substitution
  *
+ * Also stores events to history for debugging and replay.
+ *
  * Supported events:
+ * - feature_created: A new feature was created
  * - feature_success: Feature completed successfully
  * - feature_error: Feature failed with an error
  * - auto_mode_complete: Auto mode finished all features (idle state)
@@ -17,6 +20,7 @@ import { promisify } from 'util';
 import { createLogger } from '@automaker/utils';
 import type { EventEmitter } from '../lib/events.js';
 import type { SettingsService } from './settings-service.js';
+import type { EventHistoryService } from './event-history-service.js';
 import type {
   EventHook,
   EventHookTrigger,
@@ -60,27 +64,45 @@ interface AutoModeEventPayload {
   projectPath?: string;
 }
 
+/**
+ * Feature created event payload structure
+ */
+interface FeatureCreatedPayload {
+  featureId: string;
+  featureName?: string;
+  projectPath: string;
+}
+
 /**
  * Event Hook Service
  *
  * Manages execution of user-configured event hooks in response to system events.
+ * Also stores events to history for debugging and replay.
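+ *
+ * Illustrative wiring sketch (hypothetical call site; initialize() as defined below):
+ *   const hookService = new EventHookService();
+ *   hookService.initialize(emitter, settingsService, getEventHistoryService());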
 */
 export class EventHookService {
   private emitter: EventEmitter | null = null;
   private settingsService: SettingsService | null = null;
+  private eventHistoryService: EventHistoryService | null = null;
   private unsubscribe: (() => void) | null = null;
 
   /**
-   * Initialize the service with event emitter and settings service
+   * Initialize the service with event emitter, settings service, and event history service
    */
-  initialize(emitter: EventEmitter, settingsService: SettingsService): void {
+  initialize(
+    emitter: EventEmitter,
+    settingsService: SettingsService,
+    eventHistoryService?: EventHistoryService
+  ): void {
     this.emitter = emitter;
     this.settingsService = settingsService;
+    this.eventHistoryService = eventHistoryService || null;
 
-    // Subscribe to auto-mode events
+    // Subscribe to events
     this.unsubscribe = emitter.subscribe((type, payload) => {
       if (type === 'auto-mode:event') {
         this.handleAutoModeEvent(payload as AutoModeEventPayload);
+      } else if (type === 'feature:created') {
+        this.handleFeatureCreatedEvent(payload as FeatureCreatedPayload);
       }
     });
 
@@ -97,6 +119,7 @@
     }
     this.emitter = null;
     this.settingsService = null;
+    this.eventHistoryService = null;
   }
 
   /**
@@ -137,17 +160,51 @@
       eventType: trigger,
     };
 
-    // Execute matching hooks
-    await this.executeHooksForTrigger(trigger, context);
+    // Execute matching hooks (pass passes for feature completion events)
+    await this.executeHooksForTrigger(trigger, context, { passes: payload.passes });
   }
 
   /**
-   * Execute all enabled hooks matching the given trigger
+   * Handle feature:created events and trigger matching hooks
+   */
+  private async handleFeatureCreatedEvent(payload: FeatureCreatedPayload): Promise<void> {
+    const context: HookContext = {
+      featureId: payload.featureId,
+      featureName: payload.featureName,
+      projectPath: payload.projectPath,
+      projectName: this.extractProjectName(payload.projectPath),
+      timestamp: new Date().toISOString(),
+      eventType: 'feature_created',
+    };
+
+    await this.executeHooksForTrigger('feature_created', context);
+  }
+
+  /**
+   * Execute all enabled hooks matching the given trigger and store event to history
    */
   private async executeHooksForTrigger(
     trigger: EventHookTrigger,
-    context: HookContext
+    context: HookContext,
+    additionalData?: { passes?: boolean }
   ): Promise<void> {
+    // Store event to history (even if no hooks match)
+    if (this.eventHistoryService && context.projectPath) {
+      try {
+        await this.eventHistoryService.storeEvent({
+          trigger,
+          projectPath: context.projectPath,
+          featureId: context.featureId,
+          featureName: context.featureName,
+          error: context.error,
+          errorType: context.errorType,
+          passes: additionalData?.passes,
+        });
+      } catch (error) {
+        logger.error('Failed to store event to history:', error);
+      }
+    }
+
     if (!this.settingsService) {
       logger.warn('Settings service not available');
       return;
diff --git a/apps/server/src/services/feature-loader.ts b/apps/server/src/services/feature-loader.ts
index 409abd2a..b40a85f0 100644
--- a/apps/server/src/services/feature-loader.ts
+++ b/apps/server/src/services/feature-loader.ts
@@ -5,14 +5,22 @@
 import path from 'path';
 import type { Feature, DescriptionHistoryEntry } from '@automaker/types';
-import { createLogger } from '@automaker/utils';
+import {
+  createLogger,
+  atomicWriteJson,
+  readJsonWithRecovery,
+  logRecoveryWarning,
+  DEFAULT_BACKUP_COUNT,
+} from '@automaker/utils';
 import * as secureFs from '../lib/secure-fs.js';
 import {
   getFeaturesDir,
   getFeatureDir,
   getFeatureImagesDir,
+  getAppSpecPath,
   ensureAutomakerDir,
 } from '@automaker/platform';
+import { addImplementedFeature, type ImplementedFeature } from '../lib/xml-extractor.js';
 
 const logger = createLogger('FeatureLoader');
@@ -192,31 +200,31 @@ export class FeatureLoader {
       })) as any[];
       const featureDirs = entries.filter((entry) => entry.isDirectory());
 
-      // Load all features concurrently (secureFs has built-in concurrency limiting)
+      // Load all features concurrently with automatic recovery from backups
       const featurePromises = featureDirs.map(async (dir) => {
         const featureId = dir.name;
         const featureJsonPath = this.getFeatureJsonPath(projectPath, featureId);
 
-        try {
-          const content = (await secureFs.readFile(featureJsonPath, 'utf-8')) as string;
-          const feature = JSON.parse(content);
+        // Use recovery-enabled read to handle corrupted files
+        const result = await readJsonWithRecovery<Feature | null>(featureJsonPath, null, {
+          maxBackups: DEFAULT_BACKUP_COUNT,
+          autoRestore: true,
+        });
 
-          if (!feature.id) {
-            logger.warn(`Feature ${featureId} missing required 'id' field, skipping`);
-            return null;
-          }
+        logRecoveryWarning(result, `Feature ${featureId}`, logger);
 
-          return feature as Feature;
-        } catch (error) {
-          if ((error as NodeJS.ErrnoException).code === 'ENOENT') {
-            return null;
-          } else if (error instanceof SyntaxError) {
-            logger.warn(`Failed to parse feature.json for ${featureId}: ${error.message}`);
-          } else {
-            logger.error(`Failed to load feature ${featureId}:`, (error as Error).message);
-          }
+        const feature = result.data;
+
+        if (!feature) {
           return null;
         }
+
+        if (!feature.id) {
+          logger.warn(`Feature ${featureId} missing required 'id' field, skipping`);
+          return null;
+        }
+
+        return feature;
       });
 
       const results = await Promise.all(featurePromises);
@@ -236,21 +244,85 @@
     }
   }
 
+  /**
+   * Normalize a title for comparison (case-insensitive, trimmed)
+   */
+  private normalizeTitle(title: string): string {
+    return title.toLowerCase().trim();
+  }
+
+  /**
+   * Find a feature by its title (case-insensitive match)
+   * @param projectPath - Path to the project
+   * @param title - Title to search for
+   * @returns The matching feature or null if not found
+   */
+  async findByTitle(projectPath: string, title: string): Promise<Feature | null> {
+    if (!title || !title.trim()) {
+      return null;
+    }
+
+    const normalizedTitle = this.normalizeTitle(title);
+    const features = await this.getAll(projectPath);
+
+    for (const feature of features) {
+      if (feature.title && this.normalizeTitle(feature.title) === normalizedTitle) {
+        return feature;
+      }
+    }
+
+    return null;
+  }
+
+  /**
+   * Check if a title already exists on another feature (for duplicate detection)
+   * @param projectPath - Path to the project
+   * @param title - Title to check
+   * @param excludeFeatureId - Optional feature ID to exclude from the check (for updates)
+   * @returns The duplicate feature if found, null otherwise
+   */
+  async findDuplicateTitle(
+    projectPath: string,
+    title: string,
+    excludeFeatureId?: string
+  ): Promise<Feature | null> {
+    if (!title || !title.trim()) {
+      return null;
+    }
+
+    const normalizedTitle = this.normalizeTitle(title);
+    const features = await this.getAll(projectPath);
+
+    for (const feature of features) {
+      // Skip the feature being updated (if provided)
+      if (excludeFeatureId && feature.id === excludeFeatureId) {
+        continue;
+      }
+
+      if (feature.title && this.normalizeTitle(feature.title) === normalizedTitle) {
+        return feature;
+      }
+    }
+
+    return null;
+  }
+
   /**
    * Get a single feature by ID
+   * Uses automatic recovery from backups if the main file is corrupted
    */
   async get(projectPath: string, featureId: string): Promise<Feature | null> {
-    try {
-      const featureJsonPath = this.getFeatureJsonPath(projectPath, featureId);
-      const content = (await secureFs.readFile(featureJsonPath, 'utf-8')) as string;
-      return JSON.parse(content);
-    } catch (error) {
-      if ((error as NodeJS.ErrnoException).code === 'ENOENT') {
-        return null;
-      }
-      logger.error(`Failed to get feature ${featureId}:`, error);
-      throw error;
-    }
+    const featureJsonPath = this.getFeatureJsonPath(projectPath, featureId);
+
+    // Use recovery-enabled read to handle corrupted files
+    const result = await readJsonWithRecovery<Feature | null>(featureJsonPath, null, {
+      maxBackups: DEFAULT_BACKUP_COUNT,
+      autoRestore: true,
+    });
+
+    logRecoveryWarning(result, `Feature ${featureId}`, logger);
+
+    return result.data;
   }
 
   /**
@@ -294,8 +366,8 @@
       descriptionHistory: initialHistory,
     };
 
-    // Write feature.json
-    await secureFs.writeFile(featureJsonPath, JSON.stringify(feature, null, 2), 'utf-8');
+    // Write feature.json atomically with backup support
+    await atomicWriteJson(featureJsonPath, feature, { backupCount: DEFAULT_BACKUP_COUNT });
 
     logger.info(`Created feature ${featureId}`);
     return feature;
@@ -379,9 +451,9 @@
       descriptionHistory: updatedHistory,
     };
 
-    // Write back to file
+    // Write back to file atomically with backup support
     const featureJsonPath = this.getFeatureJsonPath(projectPath, featureId);
-    await secureFs.writeFile(featureJsonPath, JSON.stringify(updatedFeature, null, 2), 'utf-8');
+    await atomicWriteJson(featureJsonPath, updatedFeature, { backupCount: DEFAULT_BACKUP_COUNT });
 
     logger.info(`Updated feature ${featureId}`);
     return updatedFeature;
@@ -460,4 +532,64 @@
       }
     }
   }
+
+  /**
+   * Sync a completed feature to the app_spec.txt implemented_features section
+   *
+   * When a feature is completed, this method adds it to the implemented_features
+   * section of the project's app_spec.txt file. This keeps the spec in sync
+   * with the actual state of the codebase.
+   *
+   * @param projectPath - Path to the project
+   * @param feature - The feature to sync (must have title or description)
+   * @param fileLocations - Optional array of file paths where the feature was implemented
+   * @returns True if the spec was updated, false if no spec exists or feature was skipped
+   */
+  async syncFeatureToAppSpec(
+    projectPath: string,
+    feature: Feature,
+    fileLocations?: string[]
+  ): Promise<boolean> {
+    try {
+      const appSpecPath = getAppSpecPath(projectPath);
+
+      // Read the current app_spec.txt
+      let specContent: string;
+      try {
+        specContent = (await secureFs.readFile(appSpecPath, 'utf-8')) as string;
+      } catch (error) {
+        if ((error as NodeJS.ErrnoException).code === 'ENOENT') {
+          logger.info(`No app_spec.txt found for project, skipping sync for feature ${feature.id}`);
+          return false;
+        }
+        throw error;
+      }
+
+      // Build the implemented feature entry
+      const featureName = feature.title || `Feature: ${feature.id}`;
+      const implementedFeature: ImplementedFeature = {
+        name: featureName,
+        description: feature.description,
+        ...(fileLocations && fileLocations.length > 0 ? { file_locations: fileLocations } : {}),
+      };
+
+      // Add the feature to the implemented_features section
+      const updatedSpecContent = addImplementedFeature(specContent, implementedFeature);
+
+      // Check if the content actually changed (feature might already exist)
+      if (updatedSpecContent === specContent) {
+        logger.info(`Feature "${featureName}" already exists in app_spec.txt, skipping`);
+        return false;
+      }
+
+      // Write the updated spec back to the file
+      await secureFs.writeFile(appSpecPath, updatedSpecContent, 'utf-8');
+
+      logger.info(`Synced feature "${featureName}" to app_spec.txt`);
+      return true;
+    } catch (error) {
+      logger.error(`Failed to sync feature ${feature.id} to app_spec.txt:`, error);
+      throw error;
+    }
+  }
 }
diff --git a/apps/server/src/services/notification-service.ts b/apps/server/src/services/notification-service.ts
new file mode 100644
index 00000000..21685308
--- /dev/null
+++ b/apps/server/src/services/notification-service.ts
@@ -0,0 +1,280 @@
+/**
+ * Notification Service - Handles reading/writing notifications to JSON files
+ *
+ * Provides persistent storage for project-level notifications in
+ * {projectPath}/.automaker/notifications.json
+ *
+ * Notifications alert users when:
+ * - Features reach specific statuses (waiting_approval, verified)
+ * - Long-running operations complete (spec generation)
+ */
+
+import { createLogger } from '@automaker/utils';
+import * as secureFs from '../lib/secure-fs.js';
+import { getNotificationsPath, ensureAutomakerDir } from '@automaker/platform';
+import type { Notification, NotificationsFile, NotificationType } from '@automaker/types';
+import { DEFAULT_NOTIFICATIONS_FILE } from '@automaker/types';
+import type { EventEmitter } from '../lib/events.js';
+import { randomUUID } from 'crypto';
+
+const logger = createLogger('NotificationService');
+
+/**
+ * Atomic file write - write to temp file then rename
+ */
+async function atomicWriteJson(filePath: string, data: unknown): Promise<void> {
+  const tempPath = `${filePath}.tmp.${Date.now()}`;
+  const content = JSON.stringify(data, null, 2);
+
+  try {
+    await secureFs.writeFile(tempPath, content, 'utf-8');
+    await secureFs.rename(tempPath, filePath);
+  } catch (error) {
+    // Clean up temp file if it exists
+    try {
+      await secureFs.unlink(tempPath);
+    } catch {
+      // Ignore cleanup errors
+    }
+    throw error;
+  }
+}
+
+/**
+ * Safely read JSON file with fallback to default
+ */
+async function readJsonFile<T>(filePath: string, defaultValue: T): Promise<T> {
+  try {
+    const content = (await secureFs.readFile(filePath, 'utf-8')) as string;
+    return JSON.parse(content) as T;
+  } catch (error) {
+    if ((error as NodeJS.ErrnoException).code === 'ENOENT') {
+      return defaultValue;
+    }
+    logger.error(`Error reading ${filePath}:`, error);
+    return defaultValue;
+  }
+}
+
+/**
+ * Input for creating a new notification
+ */
+export interface CreateNotificationInput {
+  type: NotificationType;
+  title: string;
+  message: string;
+  featureId?: string;
+  projectPath: string;
+}
+
+/**
+ * NotificationService - Manages persistent storage of notifications
+ *
+ * Handles reading and writing notifications to JSON files with atomic operations
+ * for reliability. Each project has its own notifications.json file.
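+ *
+ * Illustrative usage sketch (hypothetical call site; the type value is an assumption):
+ *   const notifications = getNotificationService();
+ *   await notifications.createNotification({
+ *     type: 'feature_verified', // hypothetical NotificationType value
+ *     title: 'Feature verified',
+ *     message: 'Login flow passed verification',
+ *     projectPath,
+ *   });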
+ */
+export class NotificationService {
+  private events: EventEmitter | null = null;
+
+  /**
+   * Set the event emitter for broadcasting notification events
+   */
+  setEventEmitter(events: EventEmitter): void {
+    this.events = events;
+  }
+
+  /**
+   * Get all notifications for a project
+   *
+   * @param projectPath - Absolute path to project directory
+   * @returns Promise resolving to array of notifications
+   */
+  async getNotifications(projectPath: string): Promise<Notification[]> {
+    const notificationsPath = getNotificationsPath(projectPath);
+    const file = await readJsonFile<NotificationsFile>(
+      notificationsPath,
+      DEFAULT_NOTIFICATIONS_FILE
+    );
+    // Filter out dismissed notifications and sort by date (newest first)
+    return file.notifications
+      .filter((n) => !n.dismissed)
+      .sort((a, b) => new Date(b.createdAt).getTime() - new Date(a.createdAt).getTime());
+  }
+
+  /**
+   * Get unread notification count for a project
+   *
+   * @param projectPath - Absolute path to project directory
+   * @returns Promise resolving to unread count
+   */
+  async getUnreadCount(projectPath: string): Promise<number> {
+    const notifications = await this.getNotifications(projectPath);
+    return notifications.filter((n) => !n.read).length;
+  }
+
+  /**
+   * Create a new notification
+   *
+   * @param input - Notification creation input
+   * @returns Promise resolving to the created notification
+   */
+  async createNotification(input: CreateNotificationInput): Promise<Notification> {
+    const { projectPath, type, title, message, featureId } = input;
+
+    // Ensure automaker directory exists
+    await ensureAutomakerDir(projectPath);
+
+    const notificationsPath = getNotificationsPath(projectPath);
+    const file = await readJsonFile<NotificationsFile>(
+      notificationsPath,
+      DEFAULT_NOTIFICATIONS_FILE
+    );
+
+    const notification: Notification = {
+      id: randomUUID(),
+      type,
+      title,
+      message,
+      createdAt: new Date().toISOString(),
+      read: false,
+      dismissed: false,
+      featureId,
+      projectPath,
+    };
+
+    file.notifications.push(notification);
+    await atomicWriteJson(notificationsPath, file);
+
+    logger.info(`Created notification: ${title} for project ${projectPath}`);
+
+    // Emit event for real-time updates
+    if (this.events) {
+      this.events.emit('notification:created', notification);
+    }
+
+    return notification;
+  }
+
+  /**
+   * Mark a notification as read
+   *
+   * @param projectPath - Absolute path to project directory
+   * @param notificationId - ID of the notification to mark as read
+   * @returns Promise resolving to the updated notification or null if not found
+   */
+  async markAsRead(projectPath: string, notificationId: string): Promise<Notification | null> {
+    const notificationsPath = getNotificationsPath(projectPath);
+    const file = await readJsonFile<NotificationsFile>(
+      notificationsPath,
+      DEFAULT_NOTIFICATIONS_FILE
+    );
+
+    const notification = file.notifications.find((n) => n.id === notificationId);
+    if (!notification) {
+      return null;
+    }
+
+    notification.read = true;
+    await atomicWriteJson(notificationsPath, file);
+
+    logger.info(`Marked notification ${notificationId} as read`);
+    return notification;
+  }
+
+  /**
+   * Mark all notifications as read for a project
+   *
+   * @param projectPath - Absolute path to project directory
+   * @returns Promise resolving to number of notifications marked as read
+   */
+  async markAllAsRead(projectPath: string): Promise<number> {
+    const notificationsPath = getNotificationsPath(projectPath);
+    const file = await readJsonFile<NotificationsFile>(
+      notificationsPath,
+      DEFAULT_NOTIFICATIONS_FILE
+    );
+
+    let count = 0;
+    for (const notification of file.notifications) {
+      if (!notification.read && !notification.dismissed) {
+        notification.read = true;
+        count++;
+      }
+    }
+
+    if (count > 0) {
+      await atomicWriteJson(notificationsPath, file);
+      logger.info(`Marked ${count} notifications as read`);
+    }
+
+    return count;
+  }
+
+  /**
+   * Dismiss a notification
+   *
+   * @param projectPath - Absolute path to project directory
+   * @param notificationId - ID of the notification to dismiss
+   * @returns Promise resolving to true if notification was dismissed
+   */
+  async dismissNotification(projectPath: string, notificationId: string): Promise<boolean> {
+    const notificationsPath = getNotificationsPath(projectPath);
+    const file = await readJsonFile<NotificationsFile>(
+      notificationsPath,
+      DEFAULT_NOTIFICATIONS_FILE
+    );
+
+    const notification = file.notifications.find((n) => n.id === notificationId);
+    if (!notification) {
+      return false;
+    }
+
+    notification.dismissed = true;
+    await atomicWriteJson(notificationsPath, file);
+
+    logger.info(`Dismissed notification ${notificationId}`);
+    return true;
+  }
+
+  /**
+   * Dismiss all notifications for a project
+   *
+   * @param projectPath - Absolute path to project directory
+   * @returns Promise resolving to number of notifications dismissed
+   */
+  async dismissAll(projectPath: string): Promise<number> {
+    const notificationsPath = getNotificationsPath(projectPath);
+    const file = await readJsonFile<NotificationsFile>(
+      notificationsPath,
+      DEFAULT_NOTIFICATIONS_FILE
+    );
+
+    let count = 0;
+    for (const notification of file.notifications) {
+      if (!notification.dismissed) {
+        notification.dismissed = true;
+        count++;
+      }
+    }
+
+    if (count > 0) {
+      await atomicWriteJson(notificationsPath, file);
+      logger.info(`Dismissed ${count} notifications`);
+    }
+
+    return count;
+  }
+}
+
+// Singleton instance
+let notificationServiceInstance: NotificationService | null = null;
+
+/**
+ * Get the singleton notification service instance
+ */
+export function getNotificationService(): NotificationService {
+  if (!notificationServiceInstance) {
+    notificationServiceInstance = new NotificationService();
+  }
+  return notificationServiceInstance;
+}
diff --git a/apps/server/src/services/settings-service.ts b/apps/server/src/services/settings-service.ts
index 5f57ad83..e63b075c 100644
--- a/apps/server/src/services/settings-service.ts
+++ b/apps/server/src/services/settings-service.ts
@@ -7,7 +7,7 @@
  * - Per-project settings ({projectPath}/.automaker/settings.json)
  */
 
-import { createLogger } from '@automaker/utils';
+import { createLogger, atomicWriteJson, DEFAULT_BACKUP_COUNT } from '@automaker/utils';
 
 import * as secureFs from '../lib/secure-fs.js';
 import {
@@ -42,28 +42,8 @@
 const logger = createLogger('SettingsService');
 
 /**
- * Atomic file write - write to temp file then rename
- */
-async function atomicWriteJson(filePath: string, data: unknown): Promise<void> {
-  const tempPath = `${filePath}.tmp.${Date.now()}`;
-  const content = JSON.stringify(data, null, 2);
-
-  try {
-    await secureFs.writeFile(tempPath, content, 'utf-8');
-    await secureFs.rename(tempPath, filePath);
-  } catch (error) {
-    // Clean up temp file if it exists
-    try {
-      await secureFs.unlink(tempPath);
-    } catch {
-      // Ignore cleanup errors
-    }
-    throw error;
-  }
-}
-
-/**
- * Safely read JSON file with fallback to default
+ * Wrapper for readJsonFile from utils that uses the local secureFs
+ * to maintain compatibility with the server's secure file system
  */
 async function readJsonFile<T>(filePath: string, defaultValue: T): Promise<T> {
   try {
@@ -90,6 +70,13 @@ async function fileExists(filePath: string): Promise<boolean> {
   }
 }
 
+/**
+ * Write settings atomically with backup support
+ */
+async function writeSettingsJson(filePath: string, data: unknown): Promise<void> {
+  await atomicWriteJson(filePath, data, { backupCount: DEFAULT_BACKUP_COUNT });
+}
+
 /**
  * SettingsService - Manages persistent storage of user settings and credentials
  *
@@ -180,7 +167,7 @@ export class SettingsService {
     if (needsSave) {
       try {
         await ensureDataDir(this.dataDir);
-        await atomicWriteJson(settingsPath, result);
+        await writeSettingsJson(settingsPath, result);
         logger.info('Settings migration complete');
       } catch (error) {
         logger.error('Failed to save migrated settings:', error);
@@ -340,7 +327,7 @@
       };
     }
 
-    await atomicWriteJson(settingsPath, updated);
+    await writeSettingsJson(settingsPath, updated);
 
     logger.info('Global settings updated');
     return updated;
@@ -414,7 +401,7 @@
       };
     }
 
-    await atomicWriteJson(credentialsPath, updated);
+    await writeSettingsJson(credentialsPath, updated);
 
     logger.info('Credentials updated');
     return updated;
@@ -525,7 +512,7 @@
       };
     }
 
-    await atomicWriteJson(settingsPath, updated);
+    await writeSettingsJson(settingsPath, updated);
 
     logger.info(`Project settings updated for ${projectPath}`);
     return updated;
diff --git a/apps/server/src/services/terminal-service.ts b/apps/server/src/services/terminal-service.ts
index c309975c..f83aaede 100644
--- a/apps/server/src/services/terminal-service.ts
+++ b/apps/server/src/services/terminal-service.ts
@@ -70,6 +70,29 @@ export class TerminalService extends EventEmitter {
   private sessions: Map<string, TerminalSession> = new Map();
   private dataCallbacks: Set = new Set();
   private exitCallbacks: Set = new Set();
+  private isWindows = os.platform() === 'win32';
+  // On Windows, ConPTY requires AttachConsole which fails in Electron/service mode
+  // Detect Electron by checking for electron-specific env vars or process properties
+  private isElectron =
+    !!(process.versions && (process.versions as Record<string, string>).electron) ||
+    !!process.env.ELECTRON_RUN_AS_NODE;
+  private useConptyFallback = false; // Track if we need to use winpty fallback on Windows
+
+  /**
+   * Kill a PTY process with platform-specific handling.
+   * Windows doesn't support Unix signals like SIGTERM/SIGKILL, so we call kill() without arguments.
+   * On Unix-like systems (macOS, Linux), we can specify the signal.
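+   * For example, killPtyProcess(pty, 'SIGKILL') sends SIGKILL on Unix-like systems
+   * but falls back to a plain kill() on Windows.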
+ * + * @param ptyProcess - The PTY process to kill + * @param signal - The signal to send on Unix-like systems (default: 'SIGTERM') + */ + private killPtyProcess(ptyProcess: pty.IPty, signal: string = 'SIGTERM'): void { + if (this.isWindows) { + ptyProcess.kill(); + } else { + ptyProcess.kill(signal); + } + } /** * Detect the best shell for the current platform @@ -322,13 +345,60 @@ export class TerminalService extends EventEmitter { logger.info(`Creating session ${id} with shell: ${shell} in ${cwd}`); - const ptyProcess = pty.spawn(shell, shellArgs, { + // Build PTY spawn options + const ptyOptions: pty.IPtyForkOptions = { name: 'xterm-256color', cols: options.cols || 80, rows: options.rows || 24, cwd, env, - }); + }; + + // On Windows, always use winpty instead of ConPTY + // ConPTY requires AttachConsole which fails in many contexts: + // - Electron apps without a console + // - VS Code integrated terminal + // - Spawned from other applications + // The error happens in a subprocess so we can't catch it - must proactively disable + if (this.isWindows) { + (ptyOptions as pty.IWindowsPtyForkOptions).useConpty = false; + logger.info( + `[createSession] Using winpty for session ${id} (ConPTY disabled for compatibility)` + ); + } + + let ptyProcess: pty.IPty; + try { + ptyProcess = pty.spawn(shell, shellArgs, ptyOptions); + } catch (spawnError) { + const errorMessage = spawnError instanceof Error ? spawnError.message : String(spawnError); + + // Check for Windows ConPTY-specific errors + if (this.isWindows && errorMessage.includes('AttachConsole failed')) { + // ConPTY failed - try winpty fallback + if (!this.useConptyFallback) { + logger.warn(`[createSession] ConPTY AttachConsole failed, retrying with winpty fallback`); + this.useConptyFallback = true; + + try { + (ptyOptions as pty.IWindowsPtyForkOptions).useConpty = false; + ptyProcess = pty.spawn(shell, shellArgs, ptyOptions); + logger.info(`[createSession] Successfully spawned session ${id} with winpty fallback`); + } catch (fallbackError) { + const fallbackMessage = + fallbackError instanceof Error ? fallbackError.message : String(fallbackError); + logger.error(`[createSession] Winpty fallback also failed:`, fallbackMessage); + return null; + } + } else { + logger.error(`[createSession] PTY spawn failed (winpty):`, errorMessage); + return null; + } + } else { + logger.error(`[createSession] PTY spawn failed:`, errorMessage); + return null; + } + } const session: TerminalSession = { id, @@ -392,7 +462,11 @@ export class TerminalService extends EventEmitter { // Handle exit ptyProcess.onExit(({ exitCode }) => { - logger.info(`Session ${id} exited with code ${exitCode}`); + const exitMessage = + exitCode === undefined || exitCode === null + ? 
'Session terminated'
+          : `Session exited with code ${exitCode}`;
+      logger.info(`${exitMessage} (${id})`);
       this.sessions.delete(id);
       this.exitCallbacks.forEach((cb) => cb(id, exitCode));
       this.emit('exit', id, exitCode);
@@ -477,8 +551,9 @@
     }
 
     // First try graceful SIGTERM to allow process cleanup
+    // On Windows, killPtyProcess calls kill() without signal since Windows doesn't support Unix signals
     logger.info(`Session ${sessionId} sending SIGTERM`);
-    session.pty.kill('SIGTERM');
+    this.killPtyProcess(session.pty, 'SIGTERM');
 
     // Schedule SIGKILL fallback if process doesn't exit gracefully
     // The onExit handler will remove session from map when it actually exits
@@ -486,7 +561,7 @@
       if (this.sessions.has(sessionId)) {
         logger.info(`Session ${sessionId} still alive after SIGTERM, sending SIGKILL`);
         try {
-          session.pty.kill('SIGKILL');
+          this.killPtyProcess(session.pty, 'SIGKILL');
         } catch {
           // Process may have already exited
         }
@@ -588,7 +663,8 @@
       if (session.flushTimeout) {
         clearTimeout(session.flushTimeout);
       }
-      session.pty.kill();
+      // Use platform-specific kill to ensure proper termination on Windows
+      this.killPtyProcess(session.pty);
     } catch {
       // Ignore errors during cleanup
     }
diff --git a/apps/server/tests/unit/lib/xml-extractor.test.ts b/apps/server/tests/unit/lib/xml-extractor.test.ts
new file mode 100644
index 00000000..750a5f33
--- /dev/null
+++ b/apps/server/tests/unit/lib/xml-extractor.test.ts
@@ -0,0 +1,1027 @@
+import { describe, it, expect, vi, beforeEach } from 'vitest';
+import {
+  escapeXml,
+  unescapeXml,
+  extractXmlSection,
+  extractXmlElements,
+  extractImplementedFeatures,
+  extractImplementedFeatureNames,
+  featureToXml,
+  featuresToXml,
+  updateImplementedFeaturesSection,
+  addImplementedFeature,
+  removeImplementedFeature,
+  updateImplementedFeature,
+  hasImplementedFeature,
+  toSpecOutputFeatures,
+  fromSpecOutputFeatures,
+  type ImplementedFeature,
+  type XmlExtractorLogger,
+} from '@/lib/xml-extractor.js';
+
+describe('xml-extractor.ts', () => {
+  // Mock logger for testing custom logger functionality
+  const createMockLogger = (): XmlExtractorLogger & { calls: string[] } => {
+    const calls: string[] = [];
+    return {
+      calls,
+      debug: vi.fn((msg: string) => calls.push(`debug: ${msg}`)),
+      warn: vi.fn((msg: string) => calls.push(`warn: ${msg}`)),
+    };
+  };
+
+  beforeEach(() => {
+    vi.clearAllMocks();
+  });
+
+  describe('escapeXml', () => {
+    it('should escape ampersand', () => {
+      expect(escapeXml('foo & bar')).toBe('foo &amp; bar');
+    });
+
+    it('should escape less than', () => {
+      expect(escapeXml('a < b')).toBe('a &lt; b');
+    });
+
+    it('should escape greater than', () => {
+      expect(escapeXml('a > b')).toBe('a &gt; b');
+    });
+
+    it('should escape double quotes', () => {
+      expect(escapeXml('say "hello"')).toBe('say &quot;hello&quot;');
+    });
+
+    it('should escape single quotes', () => {
+      expect(escapeXml("it's" + ' fine')).toBe('it&#39;s fine');
+    });
+
+    it('should handle null', () => {
+      expect(escapeXml(null)).toBe('');
+    });
+
+    it('should handle undefined', () => {
+      expect(escapeXml(undefined)).toBe('');
+    });
+
+    it('should handle empty string', () => {
+      expect(escapeXml('')).toBe('');
+    });
+
+    it('should escape multiple special characters', () => {
+      expect(escapeXml('a < b & c > d "e" \'f\'')).toBe(
+        'a &lt; b &amp; c &gt; d &quot;e&quot; &#39;f&#39;'
+      );
+    });
+  });
+
+  describe('unescapeXml', () => {
+    it('should unescape ampersand', () => {
+      expect(unescapeXml('foo &amp; bar')).toBe('foo & bar');
+    });
+
+    it('should unescape less than', () => {
+      expect(unescapeXml('a &lt; b')).toBe('a < b');
+    });
+
+    it('should unescape greater than', () => {
+      expect(unescapeXml('a &gt; b')).toBe('a > b');
+    });
+
+    it('should unescape double quotes', () => {
+      expect(unescapeXml('say &quot;hello&quot;')).toBe('say "hello"');
+    });
+
+    it('should unescape single quotes', () => {
+      expect(unescapeXml('it&#39;s fine')).toBe("it's fine");
+    });
+
+    it('should handle empty string', () => {
+      expect(unescapeXml('')).toBe('');
+    });
+
+    it('should roundtrip with escapeXml', () => {
+      const original = 'Test & "quoted" \'apostrophe\'';
+      expect(unescapeXml(escapeXml(original))).toBe(original);
+    });
+  });
+
+  describe('extractXmlSection', () => {
+    it('should extract section content', () => {
+      const xml = '
<section>content here</section>
'; + expect(extractXmlSection(xml, 'section')).toBe('content here'); + }); + + it('should extract multiline section content', () => { + const xml = ` +
+<section>
+  line 1
+  line 2
+</section>
`; + expect(extractXmlSection(xml, 'section')).toContain('line 1'); + expect(extractXmlSection(xml, 'section')).toContain('line 2'); + }); + + it('should return null for non-existent section', () => { + const xml = 'content'; + expect(extractXmlSection(xml, 'section')).toBeNull(); + }); + + it('should be case-insensitive', () => { + const xml = '
<SECTION>content</SECTION>
';
+      expect(extractXmlSection(xml, 'section')).toBe('content');
+    });
+
+    it('should handle empty section', () => {
+      const xml = '<section></section>
'; + expect(extractXmlSection(xml, 'section')).toBe(''); + }); + }); + + describe('extractXmlElements', () => { + it('should extract all element values', () => { + const xml = 'onetwothree'; + expect(extractXmlElements(xml, 'item')).toEqual(['one', 'two', 'three']); + }); + + it('should return empty array for non-existent elements', () => { + const xml = 'value'; + expect(extractXmlElements(xml, 'item')).toEqual([]); + }); + + it('should trim whitespace', () => { + const xml = ' spaced '; + expect(extractXmlElements(xml, 'item')).toEqual(['spaced']); + }); + + it('should unescape XML entities', () => { + const xml = 'foo & bar'; + expect(extractXmlElements(xml, 'item')).toEqual(['foo & bar']); + }); + + it('should handle empty elements', () => { + const xml = 'value'; + expect(extractXmlElements(xml, 'item')).toEqual(['', 'value']); + }); + }); + + describe('extractImplementedFeatures', () => { + const sampleSpec = ` + + Test Project + + + Feature One + First feature description + + + Feature Two + Second feature description + + src/feature-two.ts + src/utils/helper.ts + + + +`; + + it('should extract all features', () => { + const features = extractImplementedFeatures(sampleSpec); + expect(features).toHaveLength(2); + }); + + it('should extract feature names', () => { + const features = extractImplementedFeatures(sampleSpec); + expect(features[0].name).toBe('Feature One'); + expect(features[1].name).toBe('Feature Two'); + }); + + it('should extract feature descriptions', () => { + const features = extractImplementedFeatures(sampleSpec); + expect(features[0].description).toBe('First feature description'); + expect(features[1].description).toBe('Second feature description'); + }); + + it('should extract file_locations when present', () => { + const features = extractImplementedFeatures(sampleSpec); + expect(features[0].file_locations).toBeUndefined(); + expect(features[1].file_locations).toEqual(['src/feature-two.ts', 'src/utils/helper.ts']); + }); + + it('should return empty array for missing section', () => { + const xml = + 'Test'; + expect(extractImplementedFeatures(xml)).toEqual([]); + }); + + it('should return empty array for empty section', () => { + const xml = ` + + + `; + expect(extractImplementedFeatures(xml)).toEqual([]); + }); + + it('should handle escaped content', () => { + const xml = ` + + Test & Feature + Uses <brackets> + + `; + const features = extractImplementedFeatures(xml); + expect(features[0].name).toBe('Test & Feature'); + expect(features[0].description).toBe('Uses '); + }); + }); + + describe('extractImplementedFeatureNames', () => { + it('should return only feature names', () => { + const xml = ` + + Feature A + Description A + + + Feature B + Description B + + `; + expect(extractImplementedFeatureNames(xml)).toEqual(['Feature A', 'Feature B']); + }); + + it('should return empty array for no features', () => { + const xml = ''; + expect(extractImplementedFeatureNames(xml)).toEqual([]); + }); + }); + + describe('featureToXml', () => { + it('should generate XML for feature without file_locations', () => { + const feature: ImplementedFeature = { + name: 'My Feature', + description: 'Feature description', + }; + const xml = featureToXml(feature); + expect(xml).toContain('My Feature'); + expect(xml).toContain('Feature description'); + expect(xml).not.toContain(''); + }); + + it('should generate XML for feature with file_locations', () => { + const feature: ImplementedFeature = { + name: 'My Feature', + description: 'Feature description', + file_locations: 
['src/index.ts', 'src/utils.ts'], + }; + const xml = featureToXml(feature); + expect(xml).toContain(''); + expect(xml).toContain('src/index.ts'); + expect(xml).toContain('src/utils.ts'); + }); + + it('should escape special characters', () => { + const feature: ImplementedFeature = { + name: 'Test & Feature', + description: 'Has ', + }; + const xml = featureToXml(feature); + expect(xml).toContain('Test & Feature'); + expect(xml).toContain('Has <tags>'); + }); + + it('should not include empty file_locations array', () => { + const feature: ImplementedFeature = { + name: 'Feature', + description: 'Desc', + file_locations: [], + }; + const xml = featureToXml(feature); + expect(xml).not.toContain(''); + }); + }); + + describe('featuresToXml', () => { + it('should generate XML for multiple features', () => { + const features: ImplementedFeature[] = [ + { name: 'Feature 1', description: 'Desc 1' }, + { name: 'Feature 2', description: 'Desc 2' }, + ]; + const xml = featuresToXml(features); + expect(xml).toContain('Feature 1'); + expect(xml).toContain('Feature 2'); + }); + + it('should handle empty array', () => { + expect(featuresToXml([])).toBe(''); + }); + }); + + describe('updateImplementedFeaturesSection', () => { + const baseSpec = ` + + Test + + Testing + + + + Old Feature + Old description + + +`; + + it('should replace existing section', () => { + const newFeatures: ImplementedFeature[] = [ + { name: 'New Feature', description: 'New description' }, + ]; + const result = updateImplementedFeaturesSection(baseSpec, newFeatures); + expect(result).toContain('New Feature'); + expect(result).not.toContain('Old Feature'); + }); + + it('should insert section after core_capabilities if missing', () => { + const specWithoutSection = ` + + Test + + Testing + +`; + const newFeatures: ImplementedFeature[] = [ + { name: 'New Feature', description: 'New description' }, + ]; + const result = updateImplementedFeaturesSection(specWithoutSection, newFeatures); + expect(result).toContain(''); + expect(result).toContain('New Feature'); + }); + + it('should handle multiple features', () => { + const newFeatures: ImplementedFeature[] = [ + { name: 'Feature A', description: 'Desc A' }, + { name: 'Feature B', description: 'Desc B', file_locations: ['src/b.ts'] }, + ]; + const result = updateImplementedFeaturesSection(baseSpec, newFeatures); + expect(result).toContain('Feature A'); + expect(result).toContain('Feature B'); + expect(result).toContain('src/b.ts'); + }); + }); + + describe('addImplementedFeature', () => { + const baseSpec = ` + + Existing Feature + Existing description + + `; + + it('should add new feature', () => { + const newFeature: ImplementedFeature = { + name: 'New Feature', + description: 'New description', + }; + const result = addImplementedFeature(baseSpec, newFeature); + expect(result).toContain('Existing Feature'); + expect(result).toContain('New Feature'); + }); + + it('should not add duplicate feature', () => { + const duplicate: ImplementedFeature = { + name: 'Existing Feature', + description: 'Different description', + }; + const result = addImplementedFeature(baseSpec, duplicate); + // Should still have only one instance + const matches = result.match(/Existing Feature/g); + expect(matches).toHaveLength(1); + }); + + it('should be case-insensitive for duplicates', () => { + const duplicate: ImplementedFeature = { + name: 'EXISTING FEATURE', + description: 'Different description', + }; + const result = addImplementedFeature(baseSpec, duplicate); + expect(result).not.toContain('EXISTING 
FEATURE'); + }); + }); + + describe('removeImplementedFeature', () => { + const baseSpec = ` + + Feature A + Description A + + + Feature B + Description B + + `; + + it('should remove feature by name', () => { + const result = removeImplementedFeature(baseSpec, 'Feature A'); + expect(result).not.toContain('Feature A'); + expect(result).toContain('Feature B'); + }); + + it('should be case-insensitive', () => { + const result = removeImplementedFeature(baseSpec, 'feature a'); + expect(result).not.toContain('Feature A'); + expect(result).toContain('Feature B'); + }); + + it('should return unchanged content if feature not found', () => { + const result = removeImplementedFeature(baseSpec, 'Nonexistent'); + expect(result).toContain('Feature A'); + expect(result).toContain('Feature B'); + }); + }); + + describe('updateImplementedFeature', () => { + const baseSpec = ` + + My Feature + Original description + + `; + + it('should update feature description', () => { + const result = updateImplementedFeature(baseSpec, 'My Feature', { + description: 'Updated description', + }); + expect(result).toContain('Updated description'); + expect(result).not.toContain('Original description'); + }); + + it('should add file_locations', () => { + const result = updateImplementedFeature(baseSpec, 'My Feature', { + file_locations: ['src/new.ts'], + }); + expect(result).toContain(''); + expect(result).toContain('src/new.ts'); + }); + + it('should preserve feature name if not updated', () => { + const result = updateImplementedFeature(baseSpec, 'My Feature', { + description: 'New desc', + }); + expect(result).toContain('My Feature'); + }); + + it('should be case-insensitive', () => { + const result = updateImplementedFeature(baseSpec, 'my feature', { + description: 'Updated', + }); + expect(result).toContain('Updated'); + }); + + it('should return unchanged content if feature not found', () => { + const result = updateImplementedFeature(baseSpec, 'Nonexistent', { + description: 'New', + }); + expect(result).toContain('Original description'); + }); + }); + + describe('hasImplementedFeature', () => { + const baseSpec = ` + + Existing Feature + Description + + `; + + it('should return true for existing feature', () => { + expect(hasImplementedFeature(baseSpec, 'Existing Feature')).toBe(true); + }); + + it('should return false for non-existing feature', () => { + expect(hasImplementedFeature(baseSpec, 'Nonexistent')).toBe(false); + }); + + it('should be case-insensitive', () => { + expect(hasImplementedFeature(baseSpec, 'existing feature')).toBe(true); + expect(hasImplementedFeature(baseSpec, 'EXISTING FEATURE')).toBe(true); + }); + }); + + describe('toSpecOutputFeatures', () => { + it('should convert to SpecOutput format', () => { + const features: ImplementedFeature[] = [ + { name: 'Feature 1', description: 'Desc 1' }, + { name: 'Feature 2', description: 'Desc 2', file_locations: ['src/f2.ts'] }, + ]; + const result = toSpecOutputFeatures(features); + expect(result).toEqual([ + { name: 'Feature 1', description: 'Desc 1' }, + { name: 'Feature 2', description: 'Desc 2', file_locations: ['src/f2.ts'] }, + ]); + }); + + it('should handle empty array', () => { + expect(toSpecOutputFeatures([])).toEqual([]); + }); + }); + + describe('fromSpecOutputFeatures', () => { + it('should convert from SpecOutput format', () => { + const specFeatures = [ + { name: 'Feature 1', description: 'Desc 1' }, + { name: 'Feature 2', description: 'Desc 2', file_locations: ['src/f2.ts'] }, + ]; + const result = 
fromSpecOutputFeatures(specFeatures); + expect(result).toEqual([ + { name: 'Feature 1', description: 'Desc 1' }, + { name: 'Feature 2', description: 'Desc 2', file_locations: ['src/f2.ts'] }, + ]); + }); + + it('should handle empty array', () => { + expect(fromSpecOutputFeatures([])).toEqual([]); + }); + }); + + describe('roundtrip', () => { + it('should maintain data integrity through extract -> update cycle', () => { + const originalSpec = ` + + Test + + Testing + + + + Test & Feature + Uses <special> chars + + src/test.ts + + + +`; + + // Extract features + const features = extractImplementedFeatures(originalSpec); + expect(features[0].name).toBe('Test & Feature'); + expect(features[0].description).toBe('Uses chars'); + + // Update with same features + const result = updateImplementedFeaturesSection(originalSpec, features); + + // Re-extract and verify + const reExtracted = extractImplementedFeatures(result); + expect(reExtracted[0].name).toBe('Test & Feature'); + expect(reExtracted[0].description).toBe('Uses chars'); + expect(reExtracted[0].file_locations).toEqual(['src/test.ts']); + }); + }); + + describe('custom logger', () => { + it('should use custom logger for extractXmlSection', () => { + const mockLogger = createMockLogger(); + const xml = '
<section>content</section>
';
+      extractXmlSection(xml, 'section', { logger: mockLogger });
+      expect(mockLogger.debug).toHaveBeenCalledWith('Extracted <section>
section'); + }); + + it('should log when section is not found', () => { + const mockLogger = createMockLogger(); + const xml = 'content'; + extractXmlSection(xml, 'missing', { logger: mockLogger }); + expect(mockLogger.debug).toHaveBeenCalledWith('Section not found'); + }); + + it('should use custom logger for extractXmlElements', () => { + const mockLogger = createMockLogger(); + const xml = 'onetwo'; + extractXmlElements(xml, 'item', { logger: mockLogger }); + expect(mockLogger.debug).toHaveBeenCalledWith('Extracted 2 elements'); + }); + + it('should use custom logger for extractImplementedFeatures', () => { + const mockLogger = createMockLogger(); + const xml = ` + + Test + Desc + + `; + extractImplementedFeatures(xml, { logger: mockLogger }); + expect(mockLogger.debug).toHaveBeenCalledWith('Extracted 1 implemented features'); + }); + + it('should log when no implemented_features section found', () => { + const mockLogger = createMockLogger(); + const xml = 'content'; + extractImplementedFeatures(xml, { logger: mockLogger }); + expect(mockLogger.debug).toHaveBeenCalledWith('No implemented_features section found'); + }); + + it('should use custom logger warn for missing insertion point', () => { + const mockLogger = createMockLogger(); + // XML without project_specification, core_capabilities, or implemented_features + const xml = 'content'; + const features: ImplementedFeature[] = [{ name: 'Test', description: 'Desc' }]; + updateImplementedFeaturesSection(xml, features, { logger: mockLogger }); + expect(mockLogger.warn).toHaveBeenCalledWith( + 'Could not find appropriate insertion point for implemented_features' + ); + }); + }); + + describe('edge cases', () => { + describe('escapeXml edge cases', () => { + it('should handle strings with only special characters', () => { + expect(escapeXml('<>&"\'')).toBe('<>&"''); + }); + + it('should handle very long strings', () => { + const longString = 'a'.repeat(10000) + '&' + 'b'.repeat(10000); + const escaped = escapeXml(longString); + expect(escaped).toContain('&'); + expect(escaped.length).toBe(20005); // +4 for & minus & + }); + + it('should handle unicode characters without escaping', () => { + const unicode = '日本語 emoji: 🚀 symbols: ∞ ≠ ≤'; + expect(escapeXml(unicode)).toBe(unicode); + }); + }); + + describe('unescapeXml edge cases', () => { + it('should handle strings with only entities', () => { + expect(unescapeXml('<>&"'')).toBe('<>&"\''); + }); + + it('should not double-unescape', () => { + // &lt; should become < (not <) + expect(unescapeXml('&lt;')).toBe('<'); + }); + + it('should handle partial/invalid entities gracefully', () => { + // Invalid entities should pass through unchanged + expect(unescapeXml('&unknown;')).toBe('&unknown;'); + expect(unescapeXml('&')).toBe('&'); // Missing semicolon + }); + }); + + describe('extractXmlSection edge cases', () => { + it('should handle nested tags with same name', () => { + // Note: regex-based parsing with non-greedy matching will match + // from first opening tag to first closing tag + const xml = 'inner'; + // Non-greedy [\s\S]*? matches from first to first + expect(extractXmlSection(xml, 'outer')).toBe('inner'); + }); + + it('should handle self-closing tags (returns null)', () => { + const xml = '
<section />';
+        // Regex expects content between tags, self-closing won't match
+        expect(extractXmlSection(xml, 'section')).toBeNull();
+      });
+
+      it('should handle tags with attributes', () => {
+        const xml = '
<section id="main">content</section>
'; + // The regex matches exact tag names, so this won't match + expect(extractXmlSection(xml, 'section')).toBeNull(); + }); + + it('should handle whitespace in tag content', () => { + const xml = '
<section> \n\t </section>
'; + expect(extractXmlSection(xml, 'section')).toBe(' \n\t '); + }); + }); + + describe('extractXmlElements edge cases', () => { + it('should handle elements across multiple lines', () => { + const xml = ` + + first + + second + `; + // Multiline content is now captured with [\s\S]*? pattern + const result = extractXmlElements(xml, 'item'); + expect(result).toHaveLength(2); + expect(result[0]).toBe('first'); + expect(result[1]).toBe('second'); + }); + + it('should handle consecutive elements without whitespace', () => { + const xml = '<item>a</item><item>b</item><item>c</item>'; + expect(extractXmlElements(xml, 'item')).toEqual(['a', 'b', 'c']); + }); + }); + + describe('extractImplementedFeatures edge cases', () => { + it('should skip features without names', () => { + const xml = ` + + Orphan description + + + Valid Feature + Has name + + `; + const features = extractImplementedFeatures(xml); + expect(features).toHaveLength(1); + expect(features[0].name).toBe('Valid Feature'); + }); + + it('should handle features with empty names', () => { + const xml = ` + + + Empty name + + `; + const features = extractImplementedFeatures(xml); + expect(features).toHaveLength(0); // Empty name is falsy + }); + + it('should handle features with whitespace-only names', () => { + const xml = ` + + + Whitespace name + + `; + const features = extractImplementedFeatures(xml); + expect(features).toHaveLength(0); // Trimmed whitespace is empty + }); + + it('should handle empty file_locations section', () => { + const xml = ` + + Test + Desc + + + + `; + const features = extractImplementedFeatures(xml); + expect(features[0].file_locations).toBeUndefined(); + }); + }); + + describe('featureToXml edge cases', () => { + it('should handle custom indentation', () => { + const feature: ImplementedFeature = { + name: 'Test', + description: 'Desc', + }; + const xml = featureToXml(feature, '\t'); + expect(xml).toContain('\t\t<feature>'); + expect(xml).toContain('\t\t\t<name>Test</name>'); + }); + + it('should handle empty description', () => { + const feature: ImplementedFeature = { + name: 'Test', + description: '', + }; + const xml = featureToXml(feature); + expect(xml).toContain('<description></description>'); + }); + + it('should handle undefined file_locations', () => { + const feature: ImplementedFeature = { + name: 'Test', + description: 'Desc', + file_locations: undefined, + }; + const xml = featureToXml(feature); + expect(xml).not.toContain('file_locations'); + }); + }); + + describe('updateImplementedFeaturesSection edge cases', () => { + it('should insert before </project_specification> as fallback', () => { + const specWithoutCoreCapabilities = ` + <project_specification> + Test +</project_specification>`; + const newFeatures: ImplementedFeature[] = [ + { name: 'New Feature', description: 'New description' }, + ]; + const result = updateImplementedFeaturesSection(specWithoutCoreCapabilities, newFeatures); + expect(result).toContain('<implemented_features>'); + expect(result).toContain('New Feature'); + expect(result.indexOf('<implemented_features>')).toBeLessThan( + result.indexOf('</project_specification>') + ); + }); + + it('should return unchanged content when no insertion point found', () => { + const invalidSpec = 'content'; + const newFeatures: ImplementedFeature[] = [{ name: 'Feature', description: 'Desc' }]; + const result = updateImplementedFeaturesSection(invalidSpec, newFeatures); + expect(result).toBe(invalidSpec); + }); + + it('should handle empty features array', () => { + const spec = ` + + Old + Old desc + + `; + const result = updateImplementedFeaturesSection(spec, []); + expect(result).toContain('<implemented_features>'); + expect(result).not.toContain('Old'); + }); + }); + + describe('addImplementedFeature edge cases', () => { + 
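+ // Usage sketch (an assumption inferred from the assertions in this block, not from the implementation): addImplementedFeature appends a single <feature> entry, creating the <implemented_features> section via the insertion-point fallback tested above when the spec lacks one, e.g.: + // const spec = '<project_specification></project_specification>'; + // const withFeature = addImplementedFeature(spec, { name: 'X', description: 'Y' }); + // withFeature now contains '<implemented_features>' with one '<feature>' entry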
it('should create section when adding to spec without implemented_features', () => { + const specWithoutSection = ` + + Testing + +`; + const newFeature: ImplementedFeature = { + name: 'First Feature', + description: 'First description', + }; + const result = addImplementedFeature(specWithoutSection, newFeature); + expect(result).toContain('<implemented_features>'); + expect(result).toContain('First Feature'); + }); + + it('should handle feature with all fields populated', () => { + const spec = ``; + const newFeature: ImplementedFeature = { + name: 'Complete Feature', + description: 'Full description', + file_locations: ['src/a.ts', 'src/b.ts', 'src/c.ts'], + }; + const result = addImplementedFeature(spec, newFeature); + expect(result).toContain('Complete Feature'); + expect(result).toContain('src/a.ts'); + expect(result).toContain('src/b.ts'); + expect(result).toContain('src/c.ts'); + }); + }); + + describe('updateImplementedFeature edge cases', () => { + it('should allow updating feature name', () => { + const spec = ` + + Old Name + Desc + + `; + const result = updateImplementedFeature(spec, 'Old Name', { + name: 'New Name', + }); + expect(result).toContain('New Name'); + expect(result).not.toContain('Old Name'); + }); + + it('should allow clearing file_locations', () => { + const spec = ` + + Test + Desc + + src/old.ts + + + `; + const result = updateImplementedFeature(spec, 'Test', { + file_locations: [], + }); + expect(result).not.toContain('file_locations'); + expect(result).not.toContain('src/old.ts'); + }); + + it('should handle updating multiple fields at once', () => { + const spec = ` + + Original + Original desc + + `; + const result = updateImplementedFeature(spec, 'Original', { + name: 'Updated', + description: 'Updated desc', + file_locations: ['new/path.ts'], + }); + expect(result).toContain('Updated'); + expect(result).toContain('Updated desc'); + expect(result).toContain('new/path.ts'); + }); + }); + + describe('toSpecOutputFeatures and fromSpecOutputFeatures edge cases', () => { + it('should handle features with empty file_locations array', () => { + const features: ImplementedFeature[] = [ + { name: 'Test', description: 'Desc', file_locations: [] }, + ]; + const specOutput = toSpecOutputFeatures(features); + expect(specOutput[0].file_locations).toBeUndefined(); + }); + + it('should handle round-trip conversion', () => { + const original: ImplementedFeature[] = [ + { name: 'Feature 1', description: 'Desc 1' }, + { name: 'Feature 2', description: 'Desc 2', file_locations: ['src/f.ts'] }, + ]; + const specOutput = toSpecOutputFeatures(original); + const restored = fromSpecOutputFeatures(specOutput); + expect(restored).toEqual(original); + }); + }); + }); + + describe('integration scenarios', () => { + it('should handle a complete spec file workflow', () => { + // Start with a minimal spec + let spec = ` + + My App + + User management + +`; + + // Add first feature + spec = addImplementedFeature(spec, { + name: 'User Authentication', + description: 'Login and logout functionality', + file_locations: ['src/auth/login.ts', 'src/auth/logout.ts'], + }); + expect(hasImplementedFeature(spec, 'User Authentication')).toBe(true); + + // Add second feature + spec = addImplementedFeature(spec, { + name: 'User Profile', + description: 'View and edit user profile', + }); + expect(extractImplementedFeatureNames(spec)).toEqual(['User Authentication', 'User Profile']); + + // Update first feature + spec = updateImplementedFeature(spec, 'User Authentication', { + file_locations: ['src/auth/login.ts', 
'src/auth/logout.ts', 'src/auth/session.ts'], + }); + const features = extractImplementedFeatures(spec); + expect(features[0].file_locations).toContain('src/auth/session.ts'); + + // Remove a feature + spec = removeImplementedFeature(spec, 'User Profile'); + expect(hasImplementedFeature(spec, 'User Profile')).toBe(false); + expect(hasImplementedFeature(spec, 'User Authentication')).toBe(true); + }); + + it('should handle special characters throughout workflow', () => { + const spec = ` + +`; + + const result = addImplementedFeature(spec, { + name: 'Search & Filter', + description: 'Supports syntax with "quoted" terms', + file_locations: ["src/search/parser's.ts"], + }); + + const features = extractImplementedFeatures(result); + expect(features[0].name).toBe('Search & Filter'); + expect(features[0].description).toBe('Supports syntax with "quoted" terms'); + expect(features[0].file_locations?.[0]).toBe("src/search/parser's.ts"); + }); + + it('should preserve other XML content when modifying features', () => { + const spec = ` + + Preserved Name + This should be preserved + + Capability 1 + Capability 2 + + + + Old Feature + Will be replaced + + + Keep this too +`; + + const result = updateImplementedFeaturesSection(spec, [ + { name: 'New Feature', description: 'New desc' }, + ]); + + expect(result).toContain('Preserved Name'); + expect(result).toContain('This should be preserved'); + expect(result).toContain('Capability 1'); + expect(result).toContain('Capability 2'); + expect(result).toContain('Keep this too'); + expect(result).not.toContain('Old Feature'); + expect(result).toContain('New Feature'); + }); + }); +}); diff --git a/apps/server/tests/unit/providers/codex-provider.test.ts b/apps/server/tests/unit/providers/codex-provider.test.ts index 6ca69d86..ee9c7bad 100644 --- a/apps/server/tests/unit/providers/codex-provider.test.ts +++ b/apps/server/tests/unit/providers/codex-provider.test.ts @@ -11,6 +11,11 @@ import { getCodexConfigDir, getCodexAuthIndicators, } from '@automaker/platform'; +import { + calculateReasoningTimeout, + REASONING_TIMEOUT_MULTIPLIERS, + DEFAULT_TIMEOUT_MS, +} from '@automaker/types'; const OPENAI_API_KEY_ENV = 'OPENAI_API_KEY'; const originalOpenAIKey = process.env[OPENAI_API_KEY_ENV]; @@ -289,5 +294,121 @@ describe('codex-provider.ts', () => { expect(codexRunMock).not.toHaveBeenCalled(); expect(spawnJSONLProcess).toHaveBeenCalled(); }); + + it('passes extended timeout for high reasoning effort', async () => { + vi.mocked(spawnJSONLProcess).mockReturnValue((async function* () {})()); + + await collectAsyncGenerator( + provider.executeQuery({ + prompt: 'Complex reasoning task', + model: 'gpt-5.1-codex-max', + cwd: '/tmp', + reasoningEffort: 'high', + }) + ); + + const call = vi.mocked(spawnJSONLProcess).mock.calls[0][0]; + // High reasoning effort should have 3x the default timeout (90000ms) + expect(call.timeout).toBe(DEFAULT_TIMEOUT_MS * REASONING_TIMEOUT_MULTIPLIERS.high); + }); + + it('passes extended timeout for xhigh reasoning effort', async () => { + vi.mocked(spawnJSONLProcess).mockReturnValue((async function* () {})()); + + await collectAsyncGenerator( + provider.executeQuery({ + prompt: 'Very complex reasoning task', + model: 'gpt-5.1-codex-max', + cwd: '/tmp', + reasoningEffort: 'xhigh', + }) + ); + + const call = vi.mocked(spawnJSONLProcess).mock.calls[0][0]; + // xhigh reasoning effort should have 4x the default timeout (120000ms) + expect(call.timeout).toBe(DEFAULT_TIMEOUT_MS * REASONING_TIMEOUT_MULTIPLIERS.xhigh); + }); + + it('uses default timeout 
when no reasoning effort is specified', async () => { + vi.mocked(spawnJSONLProcess).mockReturnValue((async function* () {})()); + + await collectAsyncGenerator( + provider.executeQuery({ + prompt: 'Simple task', + model: 'gpt-5.2', + cwd: '/tmp', + }) + ); + + const call = vi.mocked(spawnJSONLProcess).mock.calls[0][0]; + // No reasoning effort should use the default timeout + expect(call.timeout).toBe(DEFAULT_TIMEOUT_MS); + }); + }); + + describe('calculateReasoningTimeout', () => { + it('returns default timeout when no reasoning effort is specified', () => { + expect(calculateReasoningTimeout()).toBe(DEFAULT_TIMEOUT_MS); + expect(calculateReasoningTimeout(undefined)).toBe(DEFAULT_TIMEOUT_MS); + }); + + it('returns default timeout for none reasoning effort', () => { + expect(calculateReasoningTimeout('none')).toBe(DEFAULT_TIMEOUT_MS); + }); + + it('applies correct multiplier for minimal reasoning effort', () => { + const expected = Math.round(DEFAULT_TIMEOUT_MS * REASONING_TIMEOUT_MULTIPLIERS.minimal); + expect(calculateReasoningTimeout('minimal')).toBe(expected); + }); + + it('applies correct multiplier for low reasoning effort', () => { + const expected = Math.round(DEFAULT_TIMEOUT_MS * REASONING_TIMEOUT_MULTIPLIERS.low); + expect(calculateReasoningTimeout('low')).toBe(expected); + }); + + it('applies correct multiplier for medium reasoning effort', () => { + const expected = Math.round(DEFAULT_TIMEOUT_MS * REASONING_TIMEOUT_MULTIPLIERS.medium); + expect(calculateReasoningTimeout('medium')).toBe(expected); + }); + + it('applies correct multiplier for high reasoning effort', () => { + const expected = Math.round(DEFAULT_TIMEOUT_MS * REASONING_TIMEOUT_MULTIPLIERS.high); + expect(calculateReasoningTimeout('high')).toBe(expected); + }); + + it('applies correct multiplier for xhigh reasoning effort', () => { + const expected = Math.round(DEFAULT_TIMEOUT_MS * REASONING_TIMEOUT_MULTIPLIERS.xhigh); + expect(calculateReasoningTimeout('xhigh')).toBe(expected); + }); + + it('uses custom base timeout when provided', () => { + const customBase = 60000; + expect(calculateReasoningTimeout('high', customBase)).toBe( + Math.round(customBase * REASONING_TIMEOUT_MULTIPLIERS.high) + ); + }); + + it('falls back to 1.0 multiplier for invalid reasoning effort', () => { + // Test that invalid values fallback gracefully to default multiplier + // This tests the defensive ?? 
1.0 in calculateReasoningTimeout + const invalidEffort = 'invalid_effort' as never; + expect(calculateReasoningTimeout(invalidEffort)).toBe(DEFAULT_TIMEOUT_MS); + }); + + it('produces expected absolute timeout values', () => { + // Verify the actual timeout values that will be used: + // none: 30000ms (30s) + // minimal: 36000ms (36s) + // low: 45000ms (45s) + // medium: 60000ms (1m) + // high: 90000ms (1m 30s) + // xhigh: 120000ms (2m) + expect(calculateReasoningTimeout('none')).toBe(30000); + expect(calculateReasoningTimeout('minimal')).toBe(36000); + expect(calculateReasoningTimeout('low')).toBe(45000); + expect(calculateReasoningTimeout('medium')).toBe(60000); + expect(calculateReasoningTimeout('high')).toBe(90000); + expect(calculateReasoningTimeout('xhigh')).toBe(120000); + }); }); }); diff --git a/apps/server/tests/unit/services/claude-usage-service.test.ts b/apps/server/tests/unit/services/claude-usage-service.test.ts index af2d10c8..07ad13c9 100644 --- a/apps/server/tests/unit/services/claude-usage-service.test.ts +++ b/apps/server/tests/unit/services/claude-usage-service.test.ts @@ -518,7 +518,11 @@ Resets in 2h const promise = ptyService.fetchUsageData(); - dataCallback!('authentication_error'); + // Send data containing the authentication error pattern the service looks for + dataCallback!('"type":"authentication_error"'); + + // Trigger the exit handler which checks for auth errors + exitCallback!({ exitCode: 1 }); await expect(promise).rejects.toThrow( "Claude CLI authentication issue. Please run 'claude logout' and then 'claude login' in your terminal to refresh permissions." @@ -586,6 +590,8 @@ Resets in 2h it('should send SIGTERM after ESC if process does not exit', async () => { vi.useFakeTimers(); + // Mock Unix platform to test SIGTERM behavior (Windows calls kill() without signal) + vi.mocked(os.platform).mockReturnValue('darwin'); const ptyService = new ClaudeUsageService(); let dataCallback: Function | undefined; diff --git a/apps/server/tests/unit/services/feature-loader.test.ts b/apps/server/tests/unit/services/feature-loader.test.ts index dc540982..d3d0bbdc 100644 --- a/apps/server/tests/unit/services/feature-loader.test.ts +++ b/apps/server/tests/unit/services/feature-loader.test.ts @@ -190,9 +190,10 @@ describe('feature-loader.ts', () => { const result = await loader.getAll(testProjectPath); expect(result).toEqual([]); + // With recovery-enabled reads, warnings come from AtomicWriter and FeatureLoader expect(consoleSpy).toHaveBeenCalledWith( - expect.stringMatching(/WARN.*\[FeatureLoader\]/), - expect.stringContaining('Failed to parse feature.json') + expect.stringMatching(/WARN.*\[AtomicWriter\]/), + expect.stringContaining('unavailable') ); consoleSpy.mockRestore(); @@ -260,10 +261,13 @@ describe('feature-loader.ts', () => { expect(result).toBeNull(); }); - it('should throw on other errors', async () => { + it('should return null on other errors (with recovery attempt)', async () => { + // With recovery-enabled reads, get() returns null instead of throwing + // because it attempts to recover from backups before giving up vi.mocked(fs.readFile).mockRejectedValue(new Error('Permission denied')); - await expect(loader.get(testProjectPath, 'feature-123')).rejects.toThrow('Permission denied'); + const result = await loader.get(testProjectPath, 'feature-123'); + expect(result).toBeNull(); }); }); @@ -442,4 +446,471 @@ describe('feature-loader.ts', () => { ); }); }); + + describe('findByTitle', () => { + it('should find feature by exact title match 
(case-insensitive)', async () => { + vi.mocked(fs.access).mockResolvedValue(undefined); + vi.mocked(fs.readdir).mockResolvedValue([ + { name: 'feature-1', isDirectory: () => true } as any, + { name: 'feature-2', isDirectory: () => true } as any, + ]); + + vi.mocked(fs.readFile) + .mockResolvedValueOnce( + JSON.stringify({ + id: 'feature-1000-abc', + title: 'Login Feature', + category: 'auth', + description: 'Login implementation', + }) + ) + .mockResolvedValueOnce( + JSON.stringify({ + id: 'feature-2000-def', + title: 'Logout Feature', + category: 'auth', + description: 'Logout implementation', + }) + ); + + const result = await loader.findByTitle(testProjectPath, 'LOGIN FEATURE'); + + expect(result).not.toBeNull(); + expect(result?.id).toBe('feature-1000-abc'); + expect(result?.title).toBe('Login Feature'); + }); + + it('should return null when title is not found', async () => { + vi.mocked(fs.access).mockResolvedValue(undefined); + vi.mocked(fs.readdir).mockResolvedValue([ + { name: 'feature-1', isDirectory: () => true } as any, + ]); + + vi.mocked(fs.readFile).mockResolvedValueOnce( + JSON.stringify({ + id: 'feature-1000-abc', + title: 'Login Feature', + category: 'auth', + description: 'Login implementation', + }) + ); + + const result = await loader.findByTitle(testProjectPath, 'Nonexistent Feature'); + + expect(result).toBeNull(); + }); + + it('should return null for empty or whitespace title', async () => { + const result1 = await loader.findByTitle(testProjectPath, ''); + const result2 = await loader.findByTitle(testProjectPath, ' '); + + expect(result1).toBeNull(); + expect(result2).toBeNull(); + }); + + it('should skip features without titles', async () => { + vi.mocked(fs.access).mockResolvedValue(undefined); + vi.mocked(fs.readdir).mockResolvedValue([ + { name: 'feature-1', isDirectory: () => true } as any, + { name: 'feature-2', isDirectory: () => true } as any, + ]); + + vi.mocked(fs.readFile) + .mockResolvedValueOnce( + JSON.stringify({ + id: 'feature-1000-abc', + // no title + category: 'auth', + description: 'Login implementation', + }) + ) + .mockResolvedValueOnce( + JSON.stringify({ + id: 'feature-2000-def', + title: 'Login Feature', + category: 'auth', + description: 'Another login', + }) + ); + + const result = await loader.findByTitle(testProjectPath, 'Login Feature'); + + expect(result).not.toBeNull(); + expect(result?.id).toBe('feature-2000-def'); + }); + }); + + describe('findDuplicateTitle', () => { + it('should find duplicate title', async () => { + vi.mocked(fs.access).mockResolvedValue(undefined); + vi.mocked(fs.readdir).mockResolvedValue([ + { name: 'feature-1', isDirectory: () => true } as any, + ]); + + vi.mocked(fs.readFile).mockResolvedValueOnce( + JSON.stringify({ + id: 'feature-1000-abc', + title: 'My Feature', + category: 'ui', + description: 'Feature description', + }) + ); + + const result = await loader.findDuplicateTitle(testProjectPath, 'my feature'); + + expect(result).not.toBeNull(); + expect(result?.id).toBe('feature-1000-abc'); + }); + + it('should exclude specified feature ID from duplicate check', async () => { + vi.mocked(fs.access).mockResolvedValue(undefined); + vi.mocked(fs.readdir).mockResolvedValue([ + { name: 'feature-1', isDirectory: () => true } as any, + { name: 'feature-2', isDirectory: () => true } as any, + ]); + + vi.mocked(fs.readFile) + .mockResolvedValueOnce( + JSON.stringify({ + id: 'feature-1000-abc', + title: 'My Feature', + category: 'ui', + description: 'Feature 1', + }) + ) + .mockResolvedValueOnce( + JSON.stringify({ + 
id: 'feature-2000-def', + title: 'Other Feature', + category: 'ui', + description: 'Feature 2', + }) + ); + + // Should not find duplicate when excluding the feature that has the title + const result = await loader.findDuplicateTitle( + testProjectPath, + 'My Feature', + 'feature-1000-abc' + ); + + expect(result).toBeNull(); + }); + + it('should find duplicate when title exists on different feature', async () => { + vi.mocked(fs.access).mockResolvedValue(undefined); + vi.mocked(fs.readdir).mockResolvedValue([ + { name: 'feature-1', isDirectory: () => true } as any, + { name: 'feature-2', isDirectory: () => true } as any, + ]); + + vi.mocked(fs.readFile) + .mockResolvedValueOnce( + JSON.stringify({ + id: 'feature-1000-abc', + title: 'My Feature', + category: 'ui', + description: 'Feature 1', + }) + ) + .mockResolvedValueOnce( + JSON.stringify({ + id: 'feature-2000-def', + title: 'Other Feature', + category: 'ui', + description: 'Feature 2', + }) + ); + + // Should find duplicate because feature-1000-abc has the title and we're excluding feature-2000-def + const result = await loader.findDuplicateTitle( + testProjectPath, + 'My Feature', + 'feature-2000-def' + ); + + expect(result).not.toBeNull(); + expect(result?.id).toBe('feature-1000-abc'); + }); + + it('should return null for empty or whitespace title', async () => { + const result1 = await loader.findDuplicateTitle(testProjectPath, ''); + const result2 = await loader.findDuplicateTitle(testProjectPath, ' '); + + expect(result1).toBeNull(); + expect(result2).toBeNull(); + }); + + it('should handle titles with leading/trailing whitespace', async () => { + vi.mocked(fs.access).mockResolvedValue(undefined); + vi.mocked(fs.readdir).mockResolvedValue([ + { name: 'feature-1', isDirectory: () => true } as any, + ]); + + vi.mocked(fs.readFile).mockResolvedValueOnce( + JSON.stringify({ + id: 'feature-1000-abc', + title: 'My Feature', + category: 'ui', + description: 'Feature description', + }) + ); + + const result = await loader.findDuplicateTitle(testProjectPath, ' My Feature '); + + expect(result).not.toBeNull(); + expect(result?.id).toBe('feature-1000-abc'); + }); + }); + + describe('syncFeatureToAppSpec', () => { + const sampleAppSpec = ` + + Test Project + + Testing + + + + Existing Feature + Already implemented + + +`; + + const appSpecWithoutFeatures = ` + + Test Project + + Testing + +`; + + it('should add feature to app_spec.txt', async () => { + vi.mocked(fs.readFile).mockResolvedValueOnce(sampleAppSpec); + vi.mocked(fs.writeFile).mockResolvedValue(undefined); + + const feature = { + id: 'feature-1234-abc', + title: 'New Feature', + category: 'ui', + description: 'A new feature description', + }; + + const result = await loader.syncFeatureToAppSpec(testProjectPath, feature); + + expect(result).toBe(true); + expect(fs.writeFile).toHaveBeenCalledWith( + expect.stringContaining('app_spec.txt'), + expect.stringContaining('New Feature'), + 'utf-8' + ); + expect(fs.writeFile).toHaveBeenCalledWith( + expect.any(String), + expect.stringContaining('A new feature description'), + 'utf-8' + ); + }); + + it('should add feature with file locations', async () => { + vi.mocked(fs.readFile).mockResolvedValueOnce(sampleAppSpec); + vi.mocked(fs.writeFile).mockResolvedValue(undefined); + + const feature = { + id: 'feature-1234-abc', + title: 'Feature With Locations', + category: 'backend', + description: 'Feature with file locations', + }; + + const result = await loader.syncFeatureToAppSpec(testProjectPath, feature, [ + 'src/feature.ts', + 
'src/utils/helper.ts', + ]); + + expect(result).toBe(true); + expect(fs.writeFile).toHaveBeenCalledWith( + expect.any(String), + expect.stringContaining('src/feature.ts'), + 'utf-8' + ); + expect(fs.writeFile).toHaveBeenCalledWith( + expect.any(String), + expect.stringContaining('src/utils/helper.ts'), + 'utf-8' + ); + }); + + it('should return false when app_spec.txt does not exist', async () => { + const error: any = new Error('File not found'); + error.code = 'ENOENT'; + vi.mocked(fs.readFile).mockRejectedValueOnce(error); + + const feature = { + id: 'feature-1234-abc', + title: 'New Feature', + category: 'ui', + description: 'A new feature description', + }; + + const result = await loader.syncFeatureToAppSpec(testProjectPath, feature); + + expect(result).toBe(false); + expect(fs.writeFile).not.toHaveBeenCalled(); + }); + + it('should return false when feature already exists (duplicate)', async () => { + vi.mocked(fs.readFile).mockResolvedValueOnce(sampleAppSpec); + + const feature = { + id: 'feature-5678-xyz', + title: 'Existing Feature', // Same name as existing feature + category: 'ui', + description: 'Different description', + }; + + const result = await loader.syncFeatureToAppSpec(testProjectPath, feature); + + expect(result).toBe(false); + expect(fs.writeFile).not.toHaveBeenCalled(); + }); + + it('should use feature ID as fallback name when title is missing', async () => { + vi.mocked(fs.readFile).mockResolvedValueOnce(sampleAppSpec); + vi.mocked(fs.writeFile).mockResolvedValue(undefined); + + const feature = { + id: 'feature-1234-abc', + category: 'ui', + description: 'Feature without title', + // No title property + }; + + const result = await loader.syncFeatureToAppSpec(testProjectPath, feature); + + expect(result).toBe(true); + expect(fs.writeFile).toHaveBeenCalledWith( + expect.any(String), + expect.stringContaining('Feature: feature-1234-abc'), + 'utf-8' + ); + }); + + it('should handle app_spec without implemented_features section', async () => { + vi.mocked(fs.readFile).mockResolvedValueOnce(appSpecWithoutFeatures); + vi.mocked(fs.writeFile).mockResolvedValue(undefined); + + const feature = { + id: 'feature-1234-abc', + title: 'First Feature', + category: 'ui', + description: 'First implemented feature', + }; + + const result = await loader.syncFeatureToAppSpec(testProjectPath, feature); + + expect(result).toBe(true); + expect(fs.writeFile).toHaveBeenCalledWith( + expect.any(String), + expect.stringContaining('<implemented_features>'), + 'utf-8' + ); + expect(fs.writeFile).toHaveBeenCalledWith( + expect.any(String), + expect.stringContaining('First Feature'), + 'utf-8' + ); + }); + + it('should throw on non-ENOENT file read errors', async () => { + const error = new Error('Permission denied'); + vi.mocked(fs.readFile).mockRejectedValueOnce(error); + + const feature = { + id: 'feature-1234-abc', + title: 'New Feature', + category: 'ui', + description: 'A new feature description', + }; + + await expect(loader.syncFeatureToAppSpec(testProjectPath, feature)).rejects.toThrow( + 'Permission denied' + ); + }); + + it('should preserve existing features when adding a new one', async () => { + vi.mocked(fs.readFile).mockResolvedValueOnce(sampleAppSpec); + vi.mocked(fs.writeFile).mockResolvedValue(undefined); + + const feature = { + id: 'feature-1234-abc', + title: 'New Feature', + category: 'ui', + description: 'A new feature', + }; + + await loader.syncFeatureToAppSpec(testProjectPath, feature); + + // Verify both old and new features are in the output + expect(fs.writeFile).toHaveBeenCalledWith( + 
expect.any(String), + expect.stringContaining('Existing Feature'), + 'utf-8' + ); + expect(fs.writeFile).toHaveBeenCalledWith( + expect.any(String), + expect.stringContaining('New Feature'), + 'utf-8' + ); + }); + + it('should escape special characters in feature name and description', async () => { + vi.mocked(fs.readFile).mockResolvedValueOnce(sampleAppSpec); + vi.mocked(fs.writeFile).mockResolvedValue(undefined); + + const feature = { + id: 'feature-1234-abc', + title: 'Feature with <special> & "chars"', + category: 'ui', + description: 'Description with & "quotes"', + }; + + const result = await loader.syncFeatureToAppSpec(testProjectPath, feature); + + expect(result).toBe(true); + // The XML should have escaped characters + expect(fs.writeFile).toHaveBeenCalledWith( + expect.any(String), + expect.stringContaining('&lt;special&gt;'), + 'utf-8' + ); + expect(fs.writeFile).toHaveBeenCalledWith( + expect.any(String), + expect.stringContaining('&amp;'), + 'utf-8' + ); + }); + + it('should not add empty file_locations array', async () => { + vi.mocked(fs.readFile).mockResolvedValueOnce(sampleAppSpec); + vi.mocked(fs.writeFile).mockResolvedValue(undefined); + + const feature = { + id: 'feature-1234-abc', + title: 'Feature Without Locations', + category: 'ui', + description: 'No file locations', + }; + + await loader.syncFeatureToAppSpec(testProjectPath, feature, []); + + // File locations should not be included when array is empty + const writeCall = vi.mocked(fs.writeFile).mock.calls[0]; + const writtenContent = writeCall[1] as string; + + // Count occurrences of file_locations - should only have the one from Existing Feature if any + // The new feature should not add file_locations + expect(writtenContent).toContain('Feature Without Locations'); + }); + }); +}); diff --git a/apps/ui/package.json b/apps/ui/package.json index 8d5a5ddb..2515632e 100644 --- a/apps/ui/package.json +++ b/apps/ui/package.json @@ -1,6 +1,6 @@ { "name": "@automaker/ui", - "version": "0.11.0", + "version": "0.12.0", "description": "An autonomous AI development studio that helps you build software faster using AI-powered agents", "homepage": "https://github.com/AutoMaker-Org/automaker", "repository": { diff --git a/apps/ui/src/components/layout/project-switcher/components/notification-bell.tsx b/apps/ui/src/components/layout/project-switcher/components/notification-bell.tsx new file mode 100644 index 00000000..adcd7b64 --- /dev/null +++ b/apps/ui/src/components/layout/project-switcher/components/notification-bell.tsx @@ -0,0 +1,207 @@ +/** + * Notification Bell - Bell icon with unread count and popover + */ + +import { useCallback } from 'react'; +import { Bell, Check, Trash2, ExternalLink } from 'lucide-react'; +import { useNavigate } from '@tanstack/react-router'; +import { useNotificationsStore } from '@/store/notifications-store'; +import { useLoadNotifications, useNotificationEvents } from '@/hooks/use-notification-events'; +import { getHttpApiClient } from '@/lib/http-api-client'; +import { Button } from '@/components/ui/button'; +import { Popover, PopoverContent, PopoverTrigger } from '@/components/ui/popover'; +import type { Notification } from '@automaker/types'; +import { cn } from '@/lib/utils'; + +/** + * Format a date as relative time (e.g., "2 minutes ago", "3 hours ago") + */ +function formatRelativeTime(date: Date): string { + const now = new Date(); + const diffMs = now.getTime() - date.getTime(); + const diffSec = Math.floor(diffMs / 1000); + const diffMin = Math.floor(diffSec / 60); + const diffHour = 
Math.floor(diffMin / 60); + const diffDay = Math.floor(diffHour / 24); + + if (diffSec < 60) return 'just now'; + if (diffMin < 60) return `${diffMin} minute${diffMin === 1 ? '' : 's'} ago`; + if (diffHour < 24) return `${diffHour} hour${diffHour === 1 ? '' : 's'} ago`; + if (diffDay < 7) return `${diffDay} day${diffDay === 1 ? '' : 's'} ago`; + return date.toLocaleDateString(); +} + +interface NotificationBellProps { + projectPath: string | null; +} + +export function NotificationBell({ projectPath }: NotificationBellProps) { + const navigate = useNavigate(); + const { + notifications, + unreadCount, + isPopoverOpen, + setPopoverOpen, + markAsRead, + dismissNotification, + } = useNotificationsStore(); + + // Load notifications and subscribe to events + useLoadNotifications(projectPath); + useNotificationEvents(projectPath); + + const handleMarkAsRead = useCallback( + async (notificationId: string) => { + if (!projectPath) return; + + // Optimistic update + markAsRead(notificationId); + + // Sync with server + const api = getHttpApiClient(); + await api.notifications.markAsRead(projectPath, notificationId); + }, + [projectPath, markAsRead] + ); + + const handleDismiss = useCallback( + async (notificationId: string) => { + if (!projectPath) return; + + // Optimistic update + dismissNotification(notificationId); + + // Sync with server + const api = getHttpApiClient(); + await api.notifications.dismiss(projectPath, notificationId); + }, + [projectPath, dismissNotification] + ); + + const handleNotificationClick = useCallback( + (notification: Notification) => { + // Mark as read + handleMarkAsRead(notification.id); + setPopoverOpen(false); + + // Navigate to the relevant view based on notification type + if (notification.featureId) { + navigate({ to: '/board' }); + } + }, + [handleMarkAsRead, setPopoverOpen, navigate] + ); + + const handleViewAll = useCallback(() => { + setPopoverOpen(false); + navigate({ to: '/notifications' }); + }, [setPopoverOpen, navigate]); + + const getNotificationIcon = (type: string) => { + switch (type) { + case 'feature_waiting_approval': + return ; + case 'feature_verified': + return ; + case 'spec_regeneration_complete': + return ; + default: + return ; + } + }; + + // Show recent 3 notifications in popover + const recentNotifications = notifications.slice(0, 3); + + if (!projectPath) { + return null; + } + + return ( + + + + + +
+

Notifications

+ {unreadCount > 0 && ( + {unreadCount} unread + )} +
+ + {recentNotifications.length === 0 ? ( +
+ +

No notifications

+
+ ) : ( +
+ {recentNotifications.map((notification) => ( +
handleNotificationClick(notification)} + > +
{getNotificationIcon(notification.type)}
+
+
+

{notification.title}

+ {!notification.read && ( + + )} +
+

+ {notification.message} +

+

+ {formatRelativeTime(new Date(notification.createdAt))} +

+
+
+ +
+
+ ))} +
+ )} + + {notifications.length > 0 && ( +
+ +
+ )} +
+
+ ); +} diff --git a/apps/ui/src/components/layout/project-switcher/project-switcher.tsx b/apps/ui/src/components/layout/project-switcher/project-switcher.tsx index 0713df72..426777b5 100644 --- a/apps/ui/src/components/layout/project-switcher/project-switcher.tsx +++ b/apps/ui/src/components/layout/project-switcher/project-switcher.tsx @@ -7,6 +7,7 @@ import { useOSDetection } from '@/hooks/use-os-detection'; import { ProjectSwitcherItem } from './components/project-switcher-item'; import { ProjectContextMenu } from './components/project-context-menu'; import { EditProjectDialog } from './components/edit-project-dialog'; +import { NotificationBell } from './components/notification-bell'; import { NewProjectModal } from '@/components/dialogs/new-project-modal'; import { OnboardingDialog } from '@/components/layout/sidebar/dialogs'; import { useProjectCreation, useProjectTheme } from '@/components/layout/sidebar/hooks'; @@ -327,6 +328,11 @@ export function ProjectSwitcher() { v{appVersion} {versionSuffix} + + {/* Notification Bell */} +
+ +
diff --git a/apps/ui/src/components/layout/sidebar.tsx b/apps/ui/src/components/layout/sidebar.tsx index 5bdf8a92..0baa81cf 100644 --- a/apps/ui/src/components/layout/sidebar.tsx +++ b/apps/ui/src/components/layout/sidebar.tsx @@ -5,6 +5,7 @@ import { useNavigate, useLocation } from '@tanstack/react-router'; const logger = createLogger('Sidebar'); import { cn } from '@/lib/utils'; import { useAppStore, type ThemeMode } from '@/store/app-store'; +import { useNotificationsStore } from '@/store/notifications-store'; import { useKeyboardShortcuts, useKeyboardShortcutsConfig } from '@/hooks/use-keyboard-shortcuts'; import { getElectronAPI } from '@/lib/electron'; import { initializeProject, hasAppSpec, hasAutomakerDir } from '@/lib/project-init'; @@ -19,7 +20,10 @@ import { SidebarHeader, SidebarNavigation, SidebarFooter, + MobileSidebarToggle, } from './sidebar/components'; +import { useIsCompact } from '@/hooks/use-media-query'; +import { PanelLeftClose } from 'lucide-react'; import { TrashDialog, OnboardingDialog } from './sidebar/dialogs'; import { SIDEBAR_FEATURE_FLAGS } from './sidebar/constants'; import { @@ -43,9 +47,11 @@ export function Sidebar() { trashedProjects, currentProject, sidebarOpen, + mobileSidebarHidden, projectHistory, upsertAndSetCurrentProject, toggleSidebar, + toggleMobileSidebarHidden, restoreTrashedProject, deleteTrashedProject, emptyTrash, @@ -56,12 +62,17 @@ export function Sidebar() { setSpecCreatingForProject, } = useAppStore(); + const isCompact = useIsCompact(); + // Environment variable flags for hiding sidebar items const { hideTerminal, hideRunningAgents, hideContext, hideSpecEditor } = SIDEBAR_FEATURE_FLAGS; // Get customizable keyboard shortcuts const shortcuts = useKeyboardShortcutsConfig(); + // Get unread notifications count + const unreadNotificationsCount = useNotificationsStore((s) => s.unreadCount); + // State for delete project confirmation dialog const [showDeleteProjectDialog, setShowDeleteProjectDialog] = useState(false); @@ -238,6 +249,7 @@ export function Sidebar() { cyclePrevProject, cycleNextProject, unviewedValidationsCount, + unreadNotificationsCount, isSpecGenerating: isCurrentProjectGeneratingSpec, }); @@ -250,10 +262,16 @@ export function Sidebar() { return location.pathname === routePath; }; + // Check if sidebar should be completely hidden on mobile + const shouldHideSidebar = isCompact && mobileSidebarHidden; + return ( <> + {/* Floating toggle to show sidebar on mobile when hidden */} + + {/* Mobile backdrop overlay */} - {sidebarOpen && ( + {sidebarOpen && !shouldHideSidebar && ( - {/* Content */} -
- {eventHooks.length === 0 ? ( -
- -

No event hooks configured

-

- Add hooks to run commands or send webhooks when features complete -

-
- ) : ( -
- {/* Group by trigger type */} - {Object.entries(hooksByTrigger).map(([trigger, hooks]) => ( -
-

- {EVENT_HOOK_TRIGGER_LABELS[trigger as EventHookTrigger]} -

-
- {hooks.map((hook) => ( - handleEditHook(hook)} - onDelete={() => handleDeleteHook(hook.id)} - onToggle={(enabled) => handleToggleHook(hook.id, enabled)} - /> - ))} -
+ {/* Tabs */} + setActiveTab(v as 'hooks' | 'history')}> +
+ + + + Hooks + + + + History + + +
+ + {/* Hooks Tab */} + +
+ {eventHooks.length === 0 ? ( +
+ +

No event hooks configured

+

+ Add hooks to run commands or send webhooks when features complete +

- ))} + ) : ( +
+ {/* Group by trigger type */} + {Object.entries(hooksByTrigger).map(([trigger, hooks]) => ( +
+

+ {EVENT_HOOK_TRIGGER_LABELS[trigger as EventHookTrigger]} +

+
+ {hooks.map((hook) => ( + handleEditHook(hook)} + onDelete={() => handleDeleteHook(hook.id)} + onToggle={(enabled) => handleToggleHook(hook.id, enabled)} + /> + ))} +
+
+ ))} +
+ )}
- )} -
- {/* Variable reference */} -
-
-

Available variables:

- - {'{{featureId}}'} {'{{featureName}}'} {'{{projectPath}}'} {'{{projectName}}'}{' '} - {'{{error}}'} {'{{timestamp}}'} {'{{eventType}}'} - -
-
+ {/* Variable reference */} +
+
+

Available variables:

+ + {'{{featureId}}'} {'{{featureName}}'} {'{{projectPath}}'} {'{{projectName}}'}{' '} + {'{{error}}'} {'{{timestamp}}'} {'{{eventType}}'} + +
+
+ + + {/* History Tab */} + +
+ +
+
+ {/* Dialog */} void; } -interface InitScriptResponse { - success: boolean; - exists: boolean; - content: string; - path: string; - error?: string; -} - export function WorktreesSection({ useWorktrees, onUseWorktreesChange }: WorktreesSectionProps) { - const currentProject = useAppStore((s) => s.currentProject); - const getShowInitScriptIndicator = useAppStore((s) => s.getShowInitScriptIndicator); - const setShowInitScriptIndicator = useAppStore((s) => s.setShowInitScriptIndicator); - const getDefaultDeleteBranch = useAppStore((s) => s.getDefaultDeleteBranch); - const setDefaultDeleteBranch = useAppStore((s) => s.setDefaultDeleteBranch); - const getAutoDismissInitScriptIndicator = useAppStore((s) => s.getAutoDismissInitScriptIndicator); - const setAutoDismissInitScriptIndicator = useAppStore((s) => s.setAutoDismissInitScriptIndicator); - const [scriptContent, setScriptContent] = useState(''); - const [originalContent, setOriginalContent] = useState(''); - const [scriptExists, setScriptExists] = useState(false); - const [isLoading, setIsLoading] = useState(true); - const [isSaving, setIsSaving] = useState(false); - const [isDeleting, setIsDeleting] = useState(false); - - // Get the current show indicator setting - const showIndicator = currentProject?.path - ? getShowInitScriptIndicator(currentProject.path) - : true; - - // Get the default delete branch setting - const defaultDeleteBranch = currentProject?.path - ? getDefaultDeleteBranch(currentProject.path) - : false; - - // Get the auto-dismiss setting - const autoDismiss = currentProject?.path - ? getAutoDismissInitScriptIndicator(currentProject.path) - : true; - - // Check if there are unsaved changes - const hasChanges = scriptContent !== originalContent; - - // Load init script content when project changes - useEffect(() => { - if (!currentProject?.path) { - setScriptContent(''); - setOriginalContent(''); - setScriptExists(false); - setIsLoading(false); - return; - } - - const loadInitScript = async () => { - setIsLoading(true); - try { - const response = await apiGet( - `/api/worktree/init-script?projectPath=${encodeURIComponent(currentProject.path)}` - ); - if (response.success) { - const content = response.content || ''; - setScriptContent(content); - setOriginalContent(content); - setScriptExists(response.exists); - } - } catch (error) { - console.error('Failed to load init script:', error); - } finally { - setIsLoading(false); - } - }; - - loadInitScript(); - }, [currentProject?.path]); - - // Save script - const handleSave = useCallback(async () => { - if (!currentProject?.path) return; - - setIsSaving(true); - try { - const response = await apiPut<{ success: boolean; error?: string }>( - '/api/worktree/init-script', - { - projectPath: currentProject.path, - content: scriptContent, - } - ); - if (response.success) { - setOriginalContent(scriptContent); - setScriptExists(true); - toast.success('Init script saved'); - } else { - toast.error('Failed to save init script', { - description: response.error, - }); - } - } catch (error) { - console.error('Failed to save init script:', error); - toast.error('Failed to save init script'); - } finally { - setIsSaving(false); - } - }, [currentProject?.path, scriptContent]); - - // Reset to original content - const handleReset = useCallback(() => { - setScriptContent(originalContent); - }, [originalContent]); - - // Delete script - const handleDelete = useCallback(async () => { - if (!currentProject?.path) return; - - setIsDeleting(true); - try { - const response = await apiDelete<{ success: 
boolean; error?: string }>( - '/api/worktree/init-script', - { - body: { projectPath: currentProject.path }, - } - ); - if (response.success) { - setScriptContent(''); - setOriginalContent(''); - setScriptExists(false); - toast.success('Init script deleted'); - } else { - toast.error('Failed to delete init script', { - description: response.error, - }); - } - } catch (error) { - console.error('Failed to delete init script:', error); - toast.error('Failed to delete init script'); - } finally { - setIsDeleting(false); - } - }, [currentProject?.path]); - - // Handle content change (no auto-save) - const handleContentChange = useCallback((value: string) => { - setScriptContent(value); - }, []); - return (
Worktrees

- Configure git worktree isolation and initialization scripts. + Configure git worktree isolation for feature development.

@@ -212,217 +54,12 @@ export function WorktreesSection({ useWorktrees, onUseWorktreesChange }: Worktre
- {/* Show Init Script Indicator Toggle */} - {currentProject && ( -
- { - if (currentProject?.path) { - const value = checked === true; - setShowInitScriptIndicator(currentProject.path, value); - // Persist to server - try { - const httpClient = getHttpApiClient(); - await httpClient.settings.updateProject(currentProject.path, { - showInitScriptIndicator: value, - }); - } catch (error) { - console.error('Failed to persist showInitScriptIndicator:', error); - } - } - }} - className="mt-1" - /> -
- -

- Display a floating panel in the bottom-right corner showing init script execution - status and output when a worktree is created. -

-
-
- )} - - {/* Auto-dismiss Init Script Indicator Toggle */} - {currentProject && showIndicator && ( -
- { - if (currentProject?.path) { - const value = checked === true; - setAutoDismissInitScriptIndicator(currentProject.path, value); - // Persist to server - try { - const httpClient = getHttpApiClient(); - await httpClient.settings.updateProject(currentProject.path, { - autoDismissInitScriptIndicator: value, - }); - } catch (error) { - console.error('Failed to persist autoDismissInitScriptIndicator:', error); - } - } - }} - className="mt-1" - /> -
- -

- Automatically hide the indicator 5 seconds after the script completes. -

-
-
- )} - - {/* Default Delete Branch Toggle */} - {currentProject && ( -
- { - if (currentProject?.path) { - const value = checked === true; - setDefaultDeleteBranch(currentProject.path, value); - // Persist to server - try { - const httpClient = getHttpApiClient(); - await httpClient.settings.updateProject(currentProject.path, { - defaultDeleteBranch: value, - }); - } catch (error) { - console.error('Failed to persist defaultDeleteBranch:', error); - } - } - }} - className="mt-1" - /> -
- -

- When deleting a worktree, automatically check the "Also delete the branch" option. -

-
-
- )} - - {/* Separator */} -
- - {/* Init Script Section */} -
-
-
- - -
-
-

- Shell commands to run after a worktree is created. Runs once per worktree. Uses Git Bash - on Windows for cross-platform compatibility. + {/* Info about project-specific settings */} +

+

+ Project-specific worktree preferences (init script, delete branch behavior) can be + configured in each project's settings via the sidebar.

- - {currentProject ? ( - <> - {/* File path indicator */} -
- - .automaker/worktree-init.sh - {hasChanges && ( - (unsaved changes) - )} -
- - {isLoading ? ( -
- -
- ) : ( - <> - - - {/* Action buttons */} -
- - - -
- - )} - - ) : ( -
- Select a project to configure the init script. -
- )}
diff --git a/apps/ui/src/components/views/spec-view.tsx b/apps/ui/src/components/views/spec-view.tsx index 189e0f9a..616dc4dd 100644 --- a/apps/ui/src/components/views/spec-view.tsx +++ b/apps/ui/src/components/views/spec-view.tsx @@ -1,3 +1,4 @@ +import { useState } from 'react'; import { RefreshCw } from 'lucide-react'; import { useAppStore } from '@/store/app-store'; @@ -13,6 +14,9 @@ import { CreateSpecDialog, RegenerateSpecDialog } from './spec-view/dialogs'; export function SpecView() { const { currentProject, appSpec } = useAppStore(); + // Actions panel state (for tablet/mobile) + const [showActionsPanel, setShowActionsPanel] = useState(false); + // Loading state const { isLoading, specExists, isGenerationRunning, loadSpec } = useSpecLoading(); @@ -52,6 +56,9 @@ export function SpecView() { // Feature generation isGeneratingFeatures, + // Sync + isSyncing, + // Status currentPhase, errorMessage, @@ -59,6 +66,8 @@ export function SpecView() { // Handlers handleCreateSpec, handleRegenerate, + handleGenerateFeatures, + handleSync, } = useSpecGeneration({ loadSpec }); // Reset hasChanges when spec is reloaded @@ -82,10 +91,9 @@ export function SpecView() { ); } - // Empty state - no spec exists or generation is running - // When generation is running, we skip loading the spec to avoid 500 errors, - // so we show the empty state with generation indicator - if (!specExists || isGenerationRunning) { + // Empty state - only show when spec doesn't exist AND no generation is running + // If generation is running but no spec exists, show the generating UI + if (!specExists) { // If generation is running (from loading hook check), ensure we show the generating UI const showAsGenerating = isCreating || isGenerationRunning; @@ -123,15 +131,20 @@ export function SpecView() {
setShowRegenerateDialog(true)} + onGenerateFeaturesClick={handleGenerateFeatures} + onSyncClick={handleSync} onSaveClick={saveSpec} + showActionsPanel={showActionsPanel} + onToggleActionsPanel={() => setShowActionsPanel(!showActionsPanel)} /> diff --git a/apps/ui/src/components/views/spec-view/components/spec-header.tsx b/apps/ui/src/components/views/spec-view/components/spec-header.tsx index 8fd791bb..b38a6579 100644 --- a/apps/ui/src/components/views/spec-view/components/spec-header.tsx +++ b/apps/ui/src/components/views/spec-view/components/spec-header.tsx @@ -1,5 +1,9 @@ import { Button } from '@/components/ui/button'; -import { Save, Sparkles, Loader2, FileText, AlertCircle } from 'lucide-react'; +import { + HeaderActionsPanel, + HeaderActionsPanelTrigger, +} from '@/components/ui/header-actions-panel'; +import { Save, Sparkles, Loader2, FileText, AlertCircle, ListPlus, RefreshCcw } from 'lucide-react'; import { PHASE_LABELS } from '../constants'; interface SpecHeaderProps { @@ -7,12 +11,17 @@ interface SpecHeaderProps { isRegenerating: boolean; isCreating: boolean; isGeneratingFeatures: boolean; + isSyncing: boolean; isSaving: boolean; hasChanges: boolean; currentPhase: string; errorMessage: string; onRegenerateClick: () => void; + onGenerateFeaturesClick: () => void; + onSyncClick: () => void; onSaveClick: () => void; + showActionsPanel: boolean; + onToggleActionsPanel: () => void; } export function SpecHeader({ @@ -20,87 +29,200 @@ export function SpecHeader({ isRegenerating, isCreating, isGeneratingFeatures, + isSyncing, isSaving, hasChanges, currentPhase, errorMessage, onRegenerateClick, + onGenerateFeaturesClick, + onSyncClick, onSaveClick, + showActionsPanel, + onToggleActionsPanel, }: SpecHeaderProps) { - const isProcessing = isRegenerating || isCreating || isGeneratingFeatures; + const isProcessing = isRegenerating || isCreating || isGeneratingFeatures || isSyncing; const phaseLabel = PHASE_LABELS[currentPhase] || currentPhase; return ( -
-
- -
-

App Specification

-

{projectPath}/.automaker/app_spec.txt

+ <> +
+
+ +
+

App Specification

+

{projectPath}/.automaker/app_spec.txt

+
+
+
+ {/* Status indicators - always visible */} + {isProcessing && ( +
+
+ +
+
+
+ + {isSyncing + ? 'Syncing Specification' + : isGeneratingFeatures + ? 'Generating Features' + : isCreating + ? 'Generating Specification' + : 'Regenerating Specification'} + + {currentPhase && ( + + {phaseLabel} + + )} +
+
+ )} + {/* Mobile processing indicator */} + {isProcessing && ( +
+ + Processing... +
+ )} + {errorMessage && ( +
+ +
+ + Error + + + {errorMessage} + +
+
+ )} + {/* Mobile error indicator */} + {errorMessage && ( +
+ + Error +
+ )} + {/* Desktop: show actions inline - hidden when processing since status card shows progress */} + {!isProcessing && ( +
+ + + + +
+ )} + {/* Tablet/Mobile: show trigger for actions panel */} +
-
+ + {/* Actions Panel (tablet/mobile) */} + + {/* Status messages in panel */} {isProcessing && ( -
-
- -
-
-
- - {isGeneratingFeatures - ? 'Generating Features' - : isCreating - ? 'Generating Specification' - : 'Regenerating Specification'} +
+ +
+ + {isSyncing + ? 'Syncing Specification' + : isGeneratingFeatures + ? 'Generating Features' + : isCreating + ? 'Generating Specification' + : 'Regenerating Specification'} - {currentPhase && ( - - {phaseLabel} - - )} + {currentPhase && {phaseLabel}}
)} {errorMessage && ( -
- -
- - Error - - - {errorMessage} - +
+ +
+ Error + {errorMessage}
)} -
- + - -
-
-
+ Regenerate + + + + + )} + + ); } diff --git a/apps/ui/src/components/views/spec-view/constants.ts b/apps/ui/src/components/views/spec-view/constants.ts index 5b4a5a4a..f6c32a80 100644 --- a/apps/ui/src/components/views/spec-view/constants.ts +++ b/apps/ui/src/components/views/spec-view/constants.ts @@ -24,6 +24,7 @@ export const PHASE_LABELS: Record = { analysis: 'Analyzing project structure...', spec_complete: 'Spec created! Generating features...', feature_generation: 'Creating features from roadmap...', + working: 'Working...', complete: 'Complete!', error: 'Error occurred', }; diff --git a/apps/ui/src/components/views/spec-view/hooks/use-spec-generation.ts b/apps/ui/src/components/views/spec-view/hooks/use-spec-generation.ts index 69054125..30a8150f 100644 --- a/apps/ui/src/components/views/spec-view/hooks/use-spec-generation.ts +++ b/apps/ui/src/components/views/spec-view/hooks/use-spec-generation.ts @@ -39,6 +39,9 @@ export function useSpecGeneration({ loadSpec }: UseSpecGenerationOptions) { // Generate features only state const [isGeneratingFeatures, setIsGeneratingFeatures] = useState(false); + // Sync state + const [isSyncing, setIsSyncing] = useState(false); + // Logs state (kept for internal tracking) const [logs, setLogs] = useState(''); const logsRef = useRef(''); @@ -55,6 +58,7 @@ export function useSpecGeneration({ loadSpec }: UseSpecGenerationOptions) { setIsCreating(false); setIsRegenerating(false); setIsGeneratingFeatures(false); + setIsSyncing(false); setCurrentPhase(''); setErrorMessage(''); setLogs(''); @@ -135,7 +139,7 @@ export function useSpecGeneration({ loadSpec }: UseSpecGenerationOptions) { if ( !document.hidden && currentProject && - (isCreating || isRegenerating || isGeneratingFeatures) + (isCreating || isRegenerating || isGeneratingFeatures || isSyncing) ) { try { const api = getElectronAPI(); @@ -151,6 +155,7 @@ export function useSpecGeneration({ loadSpec }: UseSpecGenerationOptions) { setIsCreating(false); setIsRegenerating(false); setIsGeneratingFeatures(false); + setIsSyncing(false); setCurrentPhase(''); stateRestoredRef.current = false; loadSpec(); @@ -167,11 +172,12 @@ export function useSpecGeneration({ loadSpec }: UseSpecGenerationOptions) { return () => { document.removeEventListener('visibilitychange', handleVisibilityChange); }; - }, [currentProject, isCreating, isRegenerating, isGeneratingFeatures, loadSpec]); + }, [currentProject, isCreating, isRegenerating, isGeneratingFeatures, isSyncing, loadSpec]); // Periodic status check useEffect(() => { - if (!currentProject || (!isCreating && !isRegenerating && !isGeneratingFeatures)) return; + if (!currentProject || (!isCreating && !isRegenerating && !isGeneratingFeatures && !isSyncing)) + return; const intervalId = setInterval(async () => { try { @@ -187,6 +193,7 @@ export function useSpecGeneration({ loadSpec }: UseSpecGenerationOptions) { setIsCreating(false); setIsRegenerating(false); setIsGeneratingFeatures(false); + setIsSyncing(false); setCurrentPhase(''); stateRestoredRef.current = false; loadSpec(); @@ -205,7 +212,15 @@ export function useSpecGeneration({ loadSpec }: UseSpecGenerationOptions) { return () => { clearInterval(intervalId); }; - }, [currentProject, isCreating, isRegenerating, isGeneratingFeatures, currentPhase, loadSpec]); + }, [ + currentProject, + isCreating, + isRegenerating, + isGeneratingFeatures, + isSyncing, + currentPhase, + loadSpec, + ]); // Subscribe to spec regeneration events useEffect(() => { @@ -317,7 +332,8 @@ export function useSpecGeneration({ loadSpec }: 
UseSpecGenerationOptions) { event.message === 'All tasks completed!' || event.message === 'All tasks completed' || event.message === 'Spec regeneration complete!' || - event.message === 'Initial spec creation complete!'; + event.message === 'Initial spec creation complete!' || + event.message?.includes('Spec sync complete'); const hasCompletePhase = logsRef.current.includes('[Phase: complete]'); @@ -337,6 +353,7 @@ export function useSpecGeneration({ loadSpec }: UseSpecGenerationOptions) { setIsRegenerating(false); setIsCreating(false); setIsGeneratingFeatures(false); + setIsSyncing(false); setCurrentPhase(''); setShowRegenerateDialog(false); setShowCreateDialog(false); @@ -349,18 +366,23 @@ export function useSpecGeneration({ loadSpec }: UseSpecGenerationOptions) { loadSpec(); }, SPEC_FILE_WRITE_DELAY); + const isSyncComplete = event.message?.includes('sync'); const isRegeneration = event.message?.includes('regeneration'); const isFeatureGeneration = event.message?.includes('Feature generation'); toast.success( - isFeatureGeneration - ? 'Feature Generation Complete' - : isRegeneration - ? 'Spec Regeneration Complete' - : 'Spec Creation Complete', + isSyncComplete + ? 'Spec Sync Complete' + : isFeatureGeneration + ? 'Feature Generation Complete' + : isRegeneration + ? 'Spec Regeneration Complete' + : 'Spec Creation Complete', { - description: isFeatureGeneration - ? 'Features have been created from the app specification.' - : 'Your app specification has been saved.', + description: isSyncComplete + ? 'Your spec has been updated with the latest changes.' + : isFeatureGeneration + ? 'Features have been created from the app specification.' + : 'Your app specification has been saved.', icon: createElement(CheckCircle2, { className: 'w-4 h-4' }), } ); @@ -378,6 +400,7 @@ export function useSpecGeneration({ loadSpec }: UseSpecGenerationOptions) { setIsRegenerating(false); setIsCreating(false); setIsGeneratingFeatures(false); + setIsSyncing(false); setCurrentPhase('error'); setErrorMessage(event.error); stateRestoredRef.current = false; @@ -544,6 +567,46 @@ export function useSpecGeneration({ loadSpec }: UseSpecGenerationOptions) { } }, [currentProject]); + const handleSync = useCallback(async () => { + if (!currentProject) return; + + setIsSyncing(true); + setCurrentPhase('sync'); + setErrorMessage(''); + logsRef.current = ''; + setLogs(''); + logger.debug('[useSpecGeneration] Starting spec sync'); + try { + const api = getElectronAPI(); + if (!api.specRegeneration) { + logger.error('[useSpecGeneration] Spec regeneration not available'); + setIsSyncing(false); + return; + } + const result = await api.specRegeneration.sync(currentProject.path); + + if (!result.success) { + const errorMsg = result.error || 'Unknown error'; + logger.error('[useSpecGeneration] Failed to start spec sync:', errorMsg); + setIsSyncing(false); + setCurrentPhase('error'); + setErrorMessage(errorMsg); + const errorLog = `[Error] Failed to start spec sync: ${errorMsg}\n`; + logsRef.current = errorLog; + setLogs(errorLog); + } + } catch (error) { + const errorMsg = error instanceof Error ? 
error.message : String(error); + logger.error('[useSpecGeneration] Failed to sync spec:', errorMsg); + setIsSyncing(false); + setCurrentPhase('error'); + setErrorMessage(errorMsg); + const errorLog = `[Error] Failed to sync spec: ${errorMsg}\n`; + logsRef.current = errorLog; + setLogs(errorLog); + } + }, [currentProject]); + return { // Dialog state showCreateDialog, @@ -576,6 +639,9 @@ export function useSpecGeneration({ loadSpec }: UseSpecGenerationOptions) { // Feature generation state isGeneratingFeatures, + // Sync state + isSyncing, + // Status state currentPhase, errorMessage, @@ -584,6 +650,7 @@ export function useSpecGeneration({ loadSpec }: UseSpecGenerationOptions) { // Handlers handleCreateSpec, handleRegenerate, + handleSync, handleGenerateFeatures, }; } diff --git a/apps/ui/src/components/views/spec-view/hooks/use-spec-loading.ts b/apps/ui/src/components/views/spec-view/hooks/use-spec-loading.ts index 4343e300..9fc09b81 100644 --- a/apps/ui/src/components/views/spec-view/hooks/use-spec-loading.ts +++ b/apps/ui/src/components/views/spec-view/hooks/use-spec-loading.ts @@ -18,20 +18,21 @@ export function useSpecLoading() { try { const api = getElectronAPI(); - // Check if spec generation is running before trying to load - // This prevents showing "No App Specification Found" during generation + // Check if spec generation is running if (api.specRegeneration) { const status = await api.specRegeneration.status(currentProject.path); if (status.success && status.isRunning) { - logger.debug('Spec generation is running for this project, skipping load'); + logger.debug('Spec generation is running for this project'); setIsGenerationRunning(true); - setIsLoading(false); - return; + } else { + setIsGenerationRunning(false); } + } else { + setIsGenerationRunning(false); } - // Always reset when generation is not running (handles edge case where api.specRegeneration might not be available) - setIsGenerationRunning(false); + // Always try to load the spec file, even if generation is running + // This allows users to view their existing spec while generating features const result = await api.readFile(`${currentProject.path}/.automaker/app_spec.txt`); if (result.success && result.content) { diff --git a/apps/ui/src/hooks/use-media-query.ts b/apps/ui/src/hooks/use-media-query.ts index 6215498f..deb01a5a 100644 --- a/apps/ui/src/hooks/use-media-query.ts +++ b/apps/ui/src/hooks/use-media-query.ts @@ -56,3 +56,12 @@ export function useIsMobile(): boolean { export function useIsTablet(): boolean { return useMediaQuery('(max-width: 1024px)'); } + +/** + * Hook to detect compact layout (screen width <= 1240px) + * Used for collapsing top bar controls into mobile menu + * @returns boolean indicating if compact layout should be used + */ +export function useIsCompact(): boolean { + return useMediaQuery('(max-width: 1240px)'); +} diff --git a/apps/ui/src/hooks/use-notification-events.ts b/apps/ui/src/hooks/use-notification-events.ts new file mode 100644 index 00000000..5ac40d05 --- /dev/null +++ b/apps/ui/src/hooks/use-notification-events.ts @@ -0,0 +1,78 @@ +/** + * Hook to subscribe to notification WebSocket events and update the store. + */ + +import { useEffect } from 'react'; +import { useNotificationsStore } from '@/store/notifications-store'; +import { getHttpApiClient } from '@/lib/http-api-client'; +import { pathsEqual } from '@/lib/utils'; +import type { Notification } from '@automaker/types'; + +/** + * Hook to subscribe to notification events and update the store. 
+ * Should be used in a component that's always mounted when a project is open. + */ +export function useNotificationEvents(projectPath: string | null) { + const addNotification = useNotificationsStore((s) => s.addNotification); + + useEffect(() => { + if (!projectPath) return; + + const api = getHttpApiClient(); + + const unsubscribe = api.notifications.onNotificationCreated((notification: Notification) => { + // Only handle notifications for the current project + if (!pathsEqual(notification.projectPath, projectPath)) return; + + addNotification(notification); + }); + + return unsubscribe; + }, [projectPath, addNotification]); +} + +/** + * Hook to load notifications for a project. + * Should be called when switching projects or on initial load. + */ +export function useLoadNotifications(projectPath: string | null) { + const setNotifications = useNotificationsStore((s) => s.setNotifications); + const setUnreadCount = useNotificationsStore((s) => s.setUnreadCount); + const setLoading = useNotificationsStore((s) => s.setLoading); + const setError = useNotificationsStore((s) => s.setError); + const reset = useNotificationsStore((s) => s.reset); + + useEffect(() => { + if (!projectPath) { + reset(); + return; + } + + const loadNotifications = async () => { + setLoading(true); + setError(null); + + try { + const api = getHttpApiClient(); + const [listResult, countResult] = await Promise.all([ + api.notifications.list(projectPath), + api.notifications.getUnreadCount(projectPath), + ]); + + if (listResult.success && listResult.notifications) { + setNotifications(listResult.notifications); + } + + if (countResult.success && countResult.count !== undefined) { + setUnreadCount(countResult.count); + } + } catch (error) { + setError(error instanceof Error ? error.message : 'Failed to load notifications'); + } finally { + setLoading(false); + } + }; + + loadNotifications(); + }, [projectPath, setNotifications, setUnreadCount, setLoading, setError, reset]); +} diff --git a/apps/ui/src/lib/electron.ts b/apps/ui/src/lib/electron.ts index 773fdd82..97167e89 100644 --- a/apps/ui/src/lib/electron.ts +++ b/apps/ui/src/lib/electron.ts @@ -437,6 +437,10 @@ export interface SpecRegenerationAPI { success: boolean; error?: string; }>; + sync: (projectPath: string) => Promise<{ + success: boolean; + error?: string; + }>; stop: (projectPath?: string) => Promise<{ success: boolean; error?: string }>; status: (projectPath?: string) => Promise<{ success: boolean; @@ -550,6 +554,88 @@ export interface SaveImageResult { error?: string; } +// Notifications API interface +import type { + Notification, + StoredEvent, + StoredEventSummary, + EventHistoryFilter, + EventReplayResult, +} from '@automaker/types'; + +export interface NotificationsAPI { + list: (projectPath: string) => Promise<{ + success: boolean; + notifications?: Notification[]; + error?: string; + }>; + getUnreadCount: (projectPath: string) => Promise<{ + success: boolean; + count?: number; + error?: string; + }>; + markAsRead: ( + projectPath: string, + notificationId?: string + ) => Promise<{ + success: boolean; + notification?: Notification; + count?: number; + error?: string; + }>; + dismiss: ( + projectPath: string, + notificationId?: string + ) => Promise<{ + success: boolean; + dismissed?: boolean; + count?: number; + error?: string; + }>; +} + +// Event History API interface +export interface EventHistoryAPI { + list: ( + projectPath: string, + filter?: EventHistoryFilter + ) => Promise<{ + success: boolean; + events?: StoredEventSummary[]; + total?: 
number; + error?: string; + }>; + get: ( + projectPath: string, + eventId: string + ) => Promise<{ + success: boolean; + event?: StoredEvent; + error?: string; + }>; + delete: ( + projectPath: string, + eventId: string + ) => Promise<{ + success: boolean; + error?: string; + }>; + clear: (projectPath: string) => Promise<{ + success: boolean; + cleared?: number; + error?: string; + }>; + replay: ( + projectPath: string, + eventId: string, + hookIds?: string[] + ) => Promise<{ + success: boolean; + result?: EventReplayResult; + error?: string; + }>; +} + export interface ElectronAPI { ping: () => Promise; getApiKey?: () => Promise; @@ -760,6 +846,8 @@ export interface ElectronAPI { }>; }; ideation?: IdeationAPI; + notifications?: NotificationsAPI; + eventHistory?: EventHistoryAPI; codex?: { getUsage: () => Promise; getModels: (refresh?: boolean) => Promise<{ @@ -2658,6 +2746,30 @@ function createMockSpecRegenerationAPI(): SpecRegenerationAPI { return { success: true }; }, + sync: async (projectPath: string) => { + if (mockSpecRegenerationRunning) { + return { + success: false, + error: 'Spec sync is already running', + }; + } + + mockSpecRegenerationRunning = true; + console.log(`[Mock] Syncing spec for: ${projectPath}`); + + // Simulate async spec sync (similar to feature generation but simpler). + // The message must contain 'Spec sync complete' so the UI completion check + // in use-spec-generation.ts recognizes it. + setTimeout(() => { + emitSpecRegenerationEvent({ + type: 'spec_regeneration_complete', + message: 'Spec sync complete!', + projectPath, + }); + mockSpecRegenerationRunning = false; + }, 1000); + + return { success: true }; + }, + stop: async (_projectPath?: string) => { mockSpecRegenerationRunning = false; mockSpecRegenerationPhase = ''; diff --git a/apps/ui/src/lib/http-api-client.ts b/apps/ui/src/lib/http-api-client.ts index 90781b59..547fee7f 100644 --- a/apps/ui/src/lib/http-api-client.ts +++ b/apps/ui/src/lib/http-api-client.ts @@ -32,7 +32,10 @@ import type { CreateIdeaInput, UpdateIdeaInput, ConvertToFeatureOptions, + NotificationsAPI, + EventHistoryAPI, } from './electron'; +import type { EventHistoryFilter } from '@automaker/types'; import type { Message, SessionListItem } from '@/types/electron'; import type { Feature, ClaudeUsageResponse, CodexUsageResponse } from '@/store/app-store'; import type { WorktreeAPI, GitAPI, ModelDefinition, ProviderStatus } from '@/types/electron'; @@ -154,7 +157,9 @@ const getServerUrl = (): string => { const envUrl = import.meta.env.VITE_SERVER_URL; if (envUrl) return envUrl; } - return 'http://localhost:3008'; + // Use VITE_HOSTNAME if set, otherwise default to localhost + const hostname = import.meta.env.VITE_HOSTNAME || 'localhost'; + return `http://${hostname}:3008`; }; /** @@ -514,7 +519,8 @@ type EventType = | 'worktree:init-completed' | 'dev-server:started' | 'dev-server:output' - | 'dev-server:stopped'; + | 'dev-server:stopped' + | 'notification:created'; /** * Dev server log event payloads for WebSocket streaming @@ -553,6 +559,7 @@ export interface DevServerLogsResponse { result?: { worktreePath: string; port: number; + url: string; logs: string; startedAt: string; }; @@ -1875,6 +1882,7 @@ export class HttpApiClient implements ElectronAPI { projectPath, maxFeatures, }), + sync: (projectPath: string) => this.post('/api/spec-regeneration/sync', { projectPath }), stop: (projectPath?: string) => this.post('/api/spec-regeneration/stop', { projectPath }), status: (projectPath?: string) => this.get( @@ -2171,6 +2179,9 @@ export class HttpApiClient implements ElectronAPI { hideScrollbar: boolean; }; worktreePanelVisible?: boolean; +
showInitScriptIndicator?: boolean; + defaultDeleteBranchWithWorktree?: boolean; + autoDismissInitScriptIndicator?: boolean; lastSelectedSessionId?: string; }; error?: string; @@ -2440,6 +2451,43 @@ export class HttpApiClient implements ElectronAPI { }, }; + // Notifications API - project-level notifications + notifications: NotificationsAPI & { + onNotificationCreated: (callback: (notification: any) => void) => () => void; + } = { + list: (projectPath: string) => this.post('/api/notifications/list', { projectPath }), + + getUnreadCount: (projectPath: string) => + this.post('/api/notifications/unread-count', { projectPath }), + + markAsRead: (projectPath: string, notificationId?: string) => + this.post('/api/notifications/mark-read', { projectPath, notificationId }), + + dismiss: (projectPath: string, notificationId?: string) => + this.post('/api/notifications/dismiss', { projectPath, notificationId }), + + onNotificationCreated: (callback: (notification: any) => void): (() => void) => { + return this.subscribeToEvent('notification:created', callback as EventCallback); + }, + }; + + // Event History API - stored events for debugging and replay + eventHistory: EventHistoryAPI = { + list: (projectPath: string, filter?: EventHistoryFilter) => + this.post('/api/event-history/list', { projectPath, filter }), + + get: (projectPath: string, eventId: string) => + this.post('/api/event-history/get', { projectPath, eventId }), + + delete: (projectPath: string, eventId: string) => + this.post('/api/event-history/delete', { projectPath, eventId }), + + clear: (projectPath: string) => this.post('/api/event-history/clear', { projectPath }), + + replay: (projectPath: string, eventId: string, hookIds?: string[]) => + this.post('/api/event-history/replay', { projectPath, eventId, hookIds }), + }; + // MCP API - Test MCP server connections and list tools // SECURITY: Only accepts serverId, not arbitrary serverConfig, to prevent // drive-by command execution attacks. Servers must be saved first. 
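The `notifications` and `eventHistory` client surfaces added above are thin wrappers over POST endpoints plus one WebSocket subscription. A minimal usage sketch, assuming the client singleton exported from this file (the project path and logging are illustrative, not part of the diff):

```ts
import { getHttpApiClient } from '@/lib/http-api-client';

async function inspectProject(projectPath: string) {
  const api = getHttpApiClient();

  // Notifications: initial load, unread badge count, then live updates.
  const list = await api.notifications.list(projectPath);
  if (list.success && list.notifications) {
    console.log(`loaded ${list.notifications.length} notifications`);
  }
  const unread = await api.notifications.getUnreadCount(projectPath);
  console.log('unread:', unread.count ?? 0);

  const unsubscribe = api.notifications.onNotificationCreated((n) => {
    console.log('new notification:', n.title);
  });

  // Event history: the ten most recent feature_error events, one replayed through hooks.
  const errors = await api.eventHistory.list(projectPath, { trigger: 'feature_error', limit: 10 });
  if (errors.success && errors.events?.length) {
    const replay = await api.eventHistory.replay(projectPath, errors.events[0].id);
    console.log('hooks triggered:', replay.result?.hooksTriggered ?? 0);
  }

  unsubscribe();
}
```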
diff --git a/apps/ui/src/routes/__root.tsx b/apps/ui/src/routes/__root.tsx index 9472545b..f3a662e8 100644 --- a/apps/ui/src/routes/__root.tsx +++ b/apps/ui/src/routes/__root.tsx @@ -28,12 +28,12 @@ import { performSettingsMigration, } from '@/hooks/use-settings-migration'; import { Toaster } from 'sonner'; -import { Menu } from 'lucide-react'; import { ThemeOption, themeOptions } from '@/config/theme-options'; import { SandboxRiskDialog } from '@/components/dialogs/sandbox-risk-dialog'; import { SandboxRejectionScreen } from '@/components/dialogs/sandbox-rejection-screen'; import { LoadingState } from '@/components/ui/loading-state'; import { useProjectSettingsLoader } from '@/hooks/use-project-settings-loader'; +import { useIsCompact } from '@/hooks/use-media-query'; import type { Project } from '@/lib/electron'; const logger = createLogger('RootLayout'); @@ -176,6 +176,9 @@ function RootLayoutContent() { // Load project settings when switching projects useProjectSettingsLoader(); + // Check if we're in compact mode (< 1240px) to hide project switcher + const isCompact = useIsCompact(); + const isSetupRoute = location.pathname === '/setup'; const isLoginRoute = location.pathname === '/login'; const isLoggedOutRoute = location.pathname === '/logged-out'; @@ -805,8 +808,9 @@ function RootLayoutContent() { } // Show project switcher on all app pages (not on dashboard, setup, or login) + // Also hide on compact screens (< 1240px) - the sidebar will show a logo instead const showProjectSwitcher = - !isDashboardRoute && !isSetupRoute && !isLoginRoute && !isLoggedOutRoute; + !isDashboardRoute && !isSetupRoute && !isLoginRoute && !isLoggedOutRoute && !isCompact; return ( <> @@ -820,16 +824,6 @@ function RootLayoutContent() { )} {showProjectSwitcher && } - {/* Mobile menu toggle button - only shows when sidebar is closed on mobile */} - {!sidebarOpen && ( - - )}
diff --git a/apps/ui/src/store/app-store.ts b/apps/ui/src/store/app-store.ts lastSelectedSessionByProject: Record<string, string>; // projectPath -> sessionId @@ -730,6 +735,10 @@ export interface AppState { // Whether to auto-dismiss the indicator after completion (default: true) autoDismissInitScriptIndicatorByProject: Record<string, boolean>; + // Use Worktrees Override (per-project, keyed by project path) + // undefined = use global setting, true/false = project-specific override + useWorktreesByProject: Record<string, boolean | undefined>; + // UI State (previously in localStorage, now synced via API) /** Whether worktree panel is collapsed in board view */ worktreePanelCollapsed: boolean; @@ -902,6 +911,8 @@ export interface AppActions { setCurrentView: (view: ViewMode) => void; toggleSidebar: () => void; setSidebarOpen: (open: boolean) => void; + toggleMobileSidebarHidden: () => void; + setMobileSidebarHidden: (hidden: boolean) => void; // Theme actions setTheme: (theme: ThemeMode) => void; @@ -1183,6 +1194,11 @@ export interface AppActions { setAutoDismissInitScriptIndicator: (projectPath: string, autoDismiss: boolean) => void; getAutoDismissInitScriptIndicator: (projectPath: string) => boolean; + // Use Worktrees Override actions (per-project) + setProjectUseWorktrees: (projectPath: string, useWorktrees: boolean | null) => void; // null = use global + getProjectUseWorktrees: (projectPath: string) => boolean | undefined; // undefined = using global + getEffectiveUseWorktrees: (projectPath: string) => boolean; // Returns actual value (project or global fallback) + // UI State actions (previously in localStorage, now synced via API) setWorktreePanelCollapsed: (collapsed: boolean) => void; setLastProjectDir: (dir: string) => void; @@ -1239,6 +1255,7 @@ const initialState: AppState = { projectHistoryIndex: -1, currentView: 'welcome', sidebarOpen: true, + mobileSidebarHidden: false, // Sidebar visible by default on mobile lastSelectedSessionByProject: {}, theme: getStoredTheme() || 'dark', // Use localStorage theme as initial value, fallback to 'dark' features: [], @@ -1343,6 +1360,7 @@ const initialState: AppState = { showInitScriptIndicatorByProject: {}, defaultDeleteBranchByProject: {}, autoDismissInitScriptIndicatorByProject: {}, + useWorktreesByProject: {}, // UI State (previously in localStorage, now synced via API) worktreePanelCollapsed: false, lastProjectDir: '', @@ -1667,6 +1685,8 @@ export const useAppStore = create<AppState & AppActions>()((set, get) => ({ setCurrentView: (view) => set({ currentView: view }), toggleSidebar: () => set({ sidebarOpen: !get().sidebarOpen }), setSidebarOpen: (open) => set({ sidebarOpen: open }), + toggleMobileSidebarHidden: () => set({ mobileSidebarHidden: !get().mobileSidebarHidden }), + setMobileSidebarHidden: (hidden) => set({ mobileSidebarHidden: hidden }), // Theme actions setTheme: (theme) => { @@ -3526,6 +3546,31 @@ export const useAppStore = create<AppState & AppActions>()((set, get) => ({ return get().autoDismissInitScriptIndicatorByProject[projectPath] ?? true; }, + // Use Worktrees Override actions (per-project) + setProjectUseWorktrees: (projectPath, useWorktrees) => { + const newValue = useWorktrees === null ? undefined : useWorktrees; + set({ + useWorktreesByProject: { + ...get().useWorktreesByProject, + [projectPath]: newValue, + }, + }); + }, + + getProjectUseWorktrees: (projectPath) => { + // Returns undefined if using global setting, true/false if project-specific + return get().useWorktreesByProject[projectPath]; + }, + + getEffectiveUseWorktrees: (projectPath) => { + // Returns the actual value to use (project override or global fallback) + const projectSetting = get().useWorktreesByProject[projectPath]; + if (projectSetting !== undefined) { + return projectSetting; + } + return get().useWorktrees; + }, + // UI State actions (previously in localStorage, now synced via API) setWorktreePanelCollapsed: (collapsed) => set({ worktreePanelCollapsed: collapsed }), setLastProjectDir: (dir) => set({ lastProjectDir: dir }),
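The per-project worktrees override above resolves in two steps: a project-level value wins when one is set, otherwise the global `useWorktrees` flag applies, and passing `null` clears the override. A minimal sketch of the three states, assuming a non-React context where `useAppStore.getState()` is available (the project path is a placeholder):

```ts
import { useAppStore } from '@/store/app-store';

const { setProjectUseWorktrees, getProjectUseWorktrees, getEffectiveUseWorktrees } =
  useAppStore.getState();

setProjectUseWorktrees('/work/acme', false); // project override: disable worktrees
getProjectUseWorktrees('/work/acme'); // -> false (override is set)
getEffectiveUseWorktrees('/work/acme'); // -> false (override wins)

setProjectUseWorktrees('/work/acme', null); // clear the override
getProjectUseWorktrees('/work/acme'); // -> undefined (using global)
getEffectiveUseWorktrees('/work/acme'); // -> whatever the global useWorktrees flag is
```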
diff --git a/apps/ui/src/store/notifications-store.ts b/apps/ui/src/store/notifications-store.ts new file mode 100644 index 00000000..278f645d --- /dev/null +++ b/apps/ui/src/store/notifications-store.ts @@ -0,0 +1,129 @@ +/** + * Notifications Store - State management for project-level notifications + */ + +import { create } from 'zustand'; +import type { Notification } from '@automaker/types'; + +// ============================================================================ +// State Interface +// ============================================================================ + +interface NotificationsState { + // Notifications for the current project + notifications: Notification[]; + unreadCount: number; + isLoading: boolean; + error: string | null; + + // Popover state + isPopoverOpen: boolean; +} + +// ============================================================================ +// Actions Interface +// ============================================================================ + +interface NotificationsActions { + // Data management + setNotifications: (notifications: Notification[]) => void; + setUnreadCount: (count: number) => void; + addNotification: (notification: Notification) => void; + markAsRead: (notificationId: string) => void; + markAllAsRead: () => void; + dismissNotification: (notificationId: string) => void; + dismissAll: () => void; + + // Loading state + setLoading: (loading: boolean) => void; + setError: (error: string | null) => void; + + // Popover state + setPopoverOpen: (open: boolean) => void; + + // Reset + reset: () => void; +} + +// ============================================================================ +// Initial State +// ============================================================================ + +const initialState: NotificationsState = { + notifications: [], + unreadCount: 0, + isLoading: false, + error: null, + isPopoverOpen: false, +}; + +// ============================================================================ +// Store +// ============================================================================ + +export const useNotificationsStore = create<NotificationsState & NotificationsActions>( + (set, get) => ({ + ...initialState, + + // Data management + setNotifications: (notifications) => + set({ + notifications, + unreadCount: notifications.filter((n) => !n.read).length, + }), + + setUnreadCount: (count) => set({ unreadCount: count }), + + addNotification: (notification) => + set((state) => ({ + notifications: [notification, ...state.notifications], + unreadCount: notification.read ?
state.unreadCount : state.unreadCount + 1, + })), + + markAsRead: (notificationId) => + set((state) => { + const notification = state.notifications.find((n) => n.id === notificationId); + if (!notification || notification.read) return state; + + return { + notifications: state.notifications.map((n) => + n.id === notificationId ? { ...n, read: true } : n + ), + unreadCount: Math.max(0, state.unreadCount - 1), + }; + }), + + markAllAsRead: () => + set((state) => ({ + notifications: state.notifications.map((n) => ({ ...n, read: true })), + unreadCount: 0, + })), + + dismissNotification: (notificationId) => + set((state) => { + const notification = state.notifications.find((n) => n.id === notificationId); + if (!notification) return state; + + return { + notifications: state.notifications.filter((n) => n.id !== notificationId), + unreadCount: notification.read ? state.unreadCount : Math.max(0, state.unreadCount - 1), + }; + }), + + dismissAll: () => + set({ + notifications: [], + unreadCount: 0, + }), + + // Loading state + setLoading: (loading) => set({ isLoading: loading }), + setError: (error) => set({ error }), + + // Popover state + setPopoverOpen: (open) => set({ isPopoverOpen: open }), + + // Reset + reset: () => set(initialState), + }) +); diff --git a/apps/ui/src/styles/global.css b/apps/ui/src/styles/global.css index 54a32c4a..9ffd4978 100644 --- a/apps/ui/src/styles/global.css +++ b/apps/ui/src/styles/global.css @@ -483,6 +483,16 @@ background: oklch(0.45 0 0); } +/* Hidden scrollbar - still scrollable but no visible scrollbar */ +.scrollbar-hide { + -ms-overflow-style: none; /* IE and Edge */ + scrollbar-width: none; /* Firefox */ +} + +.scrollbar-hide::-webkit-scrollbar { + display: none; /* Chrome, Safari, Opera */ +} + /* Glass morphism utilities */ @layer utilities { .glass { diff --git a/apps/ui/src/types/electron.d.ts b/apps/ui/src/types/electron.d.ts index fbba4063..42e4200d 100644 --- a/apps/ui/src/types/electron.d.ts +++ b/apps/ui/src/types/electron.d.ts @@ -367,6 +367,11 @@ export interface SpecRegenerationAPI { error?: string; }>; + sync: (projectPath: string) => Promise<{ + success: boolean; + error?: string; + }>; + stop: (projectPath?: string) => Promise<{ success: boolean; error?: string; @@ -994,6 +999,7 @@ export interface WorktreeAPI { result?: { worktreePath: string; port: number; + url: string; logs: string; startedAt: string; }; diff --git a/apps/ui/vite.config.mts b/apps/ui/vite.config.mts index e40bce49..0d18997e 100644 --- a/apps/ui/vite.config.mts +++ b/apps/ui/vite.config.mts @@ -65,7 +65,9 @@ export default defineConfig(({ command }) => { }, }, server: { + host: process.env.HOST || '0.0.0.0', port: parseInt(process.env.TEST_PORT || '3007', 10), + allowedHosts: true, }, build: { outDir: 'dist', diff --git a/dev.mjs b/dev.mjs deleted file mode 100644 index 6d137d23..00000000 --- a/dev.mjs +++ /dev/null @@ -1,198 +0,0 @@ -#!/usr/bin/env node - -/** - * Automaker - Development Mode Launch Script - * - * This script starts the application in development mode with hot reloading. - * It uses Vite dev server for fast HMR during development. 
- * - * Usage: npm run dev - */ - -import path from 'path'; -import { fileURLToPath } from 'url'; - -import { - createRestrictedFs, - log, - runNpm, - runNpmAndWait, - runNpx, - printHeader, - printModeMenu, - resolvePortConfiguration, - createCleanupHandler, - setupSignalHandlers, - startServerAndWait, - ensureDependencies, - prompt, - launchDockerDevContainers, - launchDockerDevServerContainer, -} from './scripts/launcher-utils.mjs'; - -const __filename = fileURLToPath(import.meta.url); -const __dirname = path.dirname(__filename); - -// Create restricted fs for this script's directory -const fs = createRestrictedFs(__dirname, 'dev.mjs'); - -// Track background processes for cleanup -const processes = { - server: null, - web: null, - electron: null, - docker: null, -}; - -/** - * Install Playwright browsers (dev-only dependency) - */ -async function installPlaywrightBrowsers() { - log('Checking Playwright browsers...', 'yellow'); - try { - const exitCode = await new Promise((resolve) => { - const playwright = runNpx( - ['playwright', 'install', 'chromium'], - { stdio: 'inherit' }, - path.join(__dirname, 'apps', 'ui') - ); - playwright.on('close', (code) => resolve(code)); - playwright.on('error', () => resolve(1)); - }); - - if (exitCode === 0) { - log('Playwright browsers ready', 'green'); - } else { - log('Playwright installation failed (browser automation may not work)', 'yellow'); - } - } catch { - log('Playwright installation skipped', 'yellow'); - } -} - -/** - * Main function - */ -async function main() { - // Change to script directory - process.chdir(__dirname); - - printHeader('Automaker Development Environment'); - - // Ensure dependencies are installed - await ensureDependencies(fs, __dirname); - - // Install Playwright browsers (dev-only) - await installPlaywrightBrowsers(); - - // Resolve port configuration (check/kill/change ports) - const { webPort, serverPort, corsOriginEnv } = await resolvePortConfiguration(); - - // Show mode selection menu - printModeMenu({ isDev: true }); - - // Setup cleanup handlers - const cleanup = createCleanupHandler(processes); - setupSignalHandlers(cleanup); - - // Prompt for choice - while (true) { - const choice = await prompt('Enter your choice (1, 2, 3, or 4): '); - - if (choice === '1') { - console.log(''); - log('Launching Web Application (Development Mode)...', 'blue'); - - // Build shared packages once - log('Building shared packages...', 'blue'); - await runNpmAndWait(['run', 'build:packages'], { stdio: 'inherit' }, __dirname); - - // Start the backend server in dev mode - processes.server = await startServerAndWait({ - serverPort, - corsOriginEnv, - npmArgs: ['run', '_dev:server'], - cwd: __dirname, - fs, - baseDir: __dirname, - }); - - if (!processes.server) { - await cleanup(); - process.exit(1); - } - - log(`The application will be available at: http://localhost:${webPort}`, 'green'); - console.log(''); - - // Start web app with Vite dev server (HMR enabled) - processes.web = runNpm( - ['run', '_dev:web'], - { - stdio: 'inherit', - env: { - TEST_PORT: String(webPort), - VITE_SERVER_URL: `http://localhost:${serverPort}`, - VITE_APP_MODE: '1', - }, - }, - __dirname - ); - - await new Promise((resolve) => { - processes.web.on('close', resolve); - }); - - break; - } else if (choice === '2') { - console.log(''); - log('Launching Desktop Application (Development Mode)...', 'blue'); - log('(Electron will start its own backend server)', 'yellow'); - console.log(''); - - // Pass selected ports through to Vite + Electron backend - 
processes.electron = runNpm( - ['run', 'dev:electron'], - { - stdio: 'inherit', - env: { - TEST_PORT: String(webPort), - PORT: String(serverPort), - VITE_SERVER_URL: `http://localhost:${serverPort}`, - CORS_ORIGIN: corsOriginEnv, - VITE_APP_MODE: '2', - }, - }, - __dirname - ); - - await new Promise((resolve) => { - processes.electron.on('close', resolve); - }); - - break; - } else if (choice === '3') { - console.log(''); - await launchDockerDevContainers({ baseDir: __dirname, processes }); - break; - } else if (choice === '4') { - console.log(''); - await launchDockerDevServerContainer({ baseDir: __dirname, processes }); - break; - } else { - log('Invalid choice. Please enter 1, 2, 3, or 4.', 'red'); - } - } -} - -// Run main function -main().catch(async (err) => { - console.error(err); - const cleanup = createCleanupHandler(processes); - try { - await cleanup(); - } catch (cleanupErr) { - console.error('Cleanup error:', cleanupErr); - } - process.exit(1); -}); diff --git a/docker-compose.dev-server.yml b/docker-compose.dev-server.yml index 9de27928..aad8fcef 100644 --- a/docker-compose.dev-server.yml +++ b/docker-compose.dev-server.yml @@ -27,7 +27,7 @@ services: ports: - '3008:3008' environment: - # Required + # Optional - Anthropic API key - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY} # Optional - Claude CLI OAuth credentials @@ -43,6 +43,7 @@ services: - NODE_ENV=development - PORT=3008 - CORS_ORIGIN=http://localhost:3007 + - HUSKY=0 # Optional - restrict to specific directory within container - ALLOWED_ROOT_DIRECTORY=${ALLOWED_ROOT_DIRECTORY:-/projects} diff --git a/docker-compose.dev.yml b/docker-compose.dev.yml index ff83ea05..087bb095 100644 --- a/docker-compose.dev.yml +++ b/docker-compose.dev.yml @@ -28,7 +28,7 @@ services: ports: - '3008:3008' environment: - # Required + # Optional - Anthropic API key - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY} # Optional - Claude CLI OAuth credentials @@ -44,6 +44,7 @@ services: - NODE_ENV=development - PORT=3008 - CORS_ORIGIN=http://localhost:3007 + - HUSKY=0 # Optional - restrict to specific directory within container - ALLOWED_ROOT_DIRECTORY=${ALLOWED_ROOT_DIRECTORY:-/projects} @@ -112,6 +113,7 @@ services: - TEST_PORT=3007 - VITE_SKIP_ELECTRON=true - VITE_APP_MODE=3 + - HUSKY=0 volumes: # Mount source code for live reload - .:/app:cached diff --git a/docker-compose.yml b/docker-compose.yml index 2c4cb71e..4e74825d 100644 --- a/docker-compose.yml +++ b/docker-compose.yml @@ -38,7 +38,7 @@ services: ports: - '3008:3008' environment: - # Required + # Optional - Anthropic API key - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY} # Optional - Claude CLI OAuth credentials (for macOS users) diff --git a/libs/platform/src/index.ts b/libs/platform/src/index.ts index cd37da49..d51845f9 100644 --- a/libs/platform/src/index.ts +++ b/libs/platform/src/index.ts @@ -19,6 +19,12 @@ export { getAppSpecPath, getBranchTrackingPath, getExecutionStatePath, + getNotificationsPath, + // Event history paths + getEventHistoryDir, + getEventHistoryIndexPath, + getEventPath, + ensureEventHistoryDir, ensureAutomakerDir, getGlobalSettingsPath, getCredentialsPath, diff --git a/libs/platform/src/paths.ts b/libs/platform/src/paths.ts index 5a56a2a2..130f54e0 100644 --- a/libs/platform/src/paths.ts +++ b/libs/platform/src/paths.ts @@ -161,6 +161,18 @@ export function getAppSpecPath(projectPath: string): string { return path.join(getAutomakerDir(projectPath), 'app_spec.txt'); } +/** + * Get the notifications file path for a project + * + * Stores project-level notifications for feature 
status changes and operation completions. + * + * @param projectPath - Absolute path to project directory + * @returns Absolute path to {projectPath}/.automaker/notifications.json + */ +export function getNotificationsPath(projectPath: string): string { + return path.join(getAutomakerDir(projectPath), 'notifications.json'); +} + /** * Get the branch tracking file path for a project * @@ -335,6 +347,57 @@ export async function ensureIdeationDir(projectPath: string): Promise<string> { return ideationDir; } +// ============================================================================ +// Event History Paths +// ============================================================================ + +/** + * Get the event history directory for a project + * + * Contains stored event records for debugging and replay. + * + * @param projectPath - Absolute path to project directory + * @returns Absolute path to {projectPath}/.automaker/events + */ +export function getEventHistoryDir(projectPath: string): string { + return path.join(getAutomakerDir(projectPath), 'events'); +} + +/** + * Get the event history index file path + * + * Stores an index of all events for quick listing without scanning directory. + * + * @param projectPath - Absolute path to project directory + * @returns Absolute path to {projectPath}/.automaker/events/index.json + */ +export function getEventHistoryIndexPath(projectPath: string): string { + return path.join(getEventHistoryDir(projectPath), 'index.json'); +} + +/** + * Get the file path for a specific event + * + * @param projectPath - Absolute path to project directory + * @param eventId - Event identifier + * @returns Absolute path to {projectPath}/.automaker/events/{eventId}.json + */ +export function getEventPath(projectPath: string, eventId: string): string { + return path.join(getEventHistoryDir(projectPath), `${eventId}.json`); +} + +/** + * Create the event history directory for a project if it doesn't exist + * + * @param projectPath - Absolute path to project directory + * @returns Promise resolving to the created events directory path + */ +export async function ensureEventHistoryDir(projectPath: string): Promise<string> { + const eventsDir = getEventHistoryDir(projectPath); + await secureFs.mkdir(eventsDir, { recursive: true }); + return eventsDir; +} + // ============================================================================ // Global Settings Paths (stored in DATA_DIR from app.getPath('userData')) // ============================================================================ diff --git a/libs/prompts/src/defaults.ts b/libs/prompts/src/defaults.ts index a582313d..f9849813 100644 --- a/libs/prompts/src/defaults.ts +++ b/libs/prompts/src/defaults.ts @@ -685,6 +685,12 @@ Format as JSON: Generate features that build on each other logically. +CRITICAL RULES: +- If an "EXISTING FEATURES" section is provided above, you MUST NOT generate any features that duplicate or overlap with those existing features +- Check each feature you generate against the existing features list - if it already exists, DO NOT include it +- Only generate truly NEW features that add value beyond what already exists +- Generate unique IDs that don't conflict with existing feature IDs + IMPORTANT: Do not ask for clarification. The specification is provided above.
Generate the JSON immediately.`; /** diff --git a/libs/types/src/event-history.ts b/libs/types/src/event-history.ts new file mode 100644 index 00000000..09ff92aa --- /dev/null +++ b/libs/types/src/event-history.ts @@ -0,0 +1,123 @@ +/** + * Event History Types - Stored event records for debugging and replay + * + * Events are stored on disk to allow users to: + * - View historical events for debugging + * - Replay events with custom hooks + * - Test hook configurations against past events + */ + +import type { EventHookTrigger } from './settings.js'; + +/** + * StoredEvent - A single event record stored on disk + * + * Contains all information needed to replay the event or inspect what happened. + */ +export interface StoredEvent { + /** Unique identifier for this event record */ + id: string; + /** The hook trigger type this event maps to */ + trigger: EventHookTrigger; + /** ISO timestamp when the event occurred */ + timestamp: string; + /** ID of the feature involved (if applicable) */ + featureId?: string; + /** Name of the feature involved (if applicable) */ + featureName?: string; + /** Path to the project where the event occurred */ + projectPath: string; + /** Name of the project (extracted from path) */ + projectName: string; + /** Error message if this was an error event */ + error?: string; + /** Error classification if applicable */ + errorType?: string; + /** Whether the feature passed (for completion events) */ + passes?: boolean; + /** Additional context/metadata for the event */ + metadata?: Record<string, unknown>; +} + +/** + * StoredEventIndex - Quick lookup index for event history + * + * Stored separately for fast listing without loading full event data. + */ +export interface StoredEventIndex { + /** Version for future migrations */ + version: number; + /** Array of event summaries for quick listing */ + events: StoredEventSummary[]; +} + +/** + * StoredEventSummary - Minimal event info for listing + */ +export interface StoredEventSummary { + /** Event ID */ + id: string; + /** Trigger type */ + trigger: EventHookTrigger; + /** When it occurred */ + timestamp: string; + /** Feature name for display (if applicable) */ + featureName?: string; + /** Feature ID (if applicable) */ + featureId?: string; +} + +/** + * EventHistoryFilter - Options for filtering event history + */ +export interface EventHistoryFilter { + /** Filter by trigger type */ + trigger?: EventHookTrigger; + /** Filter by feature ID */ + featureId?: string; + /** Filter events after this timestamp */ + since?: string; + /** Filter events before this timestamp */ + until?: string; + /** Maximum number of events to return */ + limit?: number; + /** Number of events to skip (for pagination) */ + offset?: number; +} + +/** + * EventReplayResult - Result of replaying an event + */ +export interface EventReplayResult { + /** Event that was replayed */ + eventId: string; + /** Number of hooks that were triggered */ + hooksTriggered: number; + /** Results from each hook execution */ + hookResults: EventReplayHookResult[]; +} + +/** + * EventReplayHookResult - Result of a single hook execution during replay + */ +export interface EventReplayHookResult { + /** Hook ID */ + hookId: string; + /** Hook name (if set) */ + hookName?: string; + /** Whether the hook executed successfully */ + success: boolean; + /** Error message if failed */ + error?: string; + /** Execution time in milliseconds */ + durationMs: number; +} + +/** Current version of the event history index schema */ +export const EVENT_HISTORY_VERSION = 1; + +/**
Default empty event history index */ +export const DEFAULT_EVENT_HISTORY_INDEX: StoredEventIndex = { + version: EVENT_HISTORY_VERSION, + events: [], +}; diff --git a/libs/types/src/event.ts b/libs/types/src/event.ts index 6e723d86..c274ffb5 100644 --- a/libs/types/src/event.ts +++ b/libs/types/src/event.ts @@ -10,6 +10,7 @@ export type EventType = | 'auto-mode:idle' | 'auto-mode:error' | 'backlog-plan:event' + | 'feature:created' | 'feature:started' | 'feature:completed' | 'feature:stopped' @@ -45,6 +46,7 @@ export type EventType = | 'worktree:init-completed' | 'dev-server:started' | 'dev-server:output' - | 'dev-server:stopped'; + | 'dev-server:stopped' + | 'notification:created'; export type EventCallback = (type: EventType, payload: unknown) => void; diff --git a/libs/types/src/index.ts b/libs/types/src/index.ts index ba09a3b2..a0145782 100644 --- a/libs/types/src/index.ts +++ b/libs/types/src/index.ts @@ -21,6 +21,13 @@ export type { ReasoningEffort, } from './provider.js'; +// Provider constants and utilities +export { + DEFAULT_TIMEOUT_MS, + REASONING_TIMEOUT_MULTIPLIERS, + calculateReasoningTimeout, +} from './provider.js'; + // Codex CLI types export type { CodexSandboxMode, @@ -270,3 +277,18 @@ export type { IdeationStreamEvent, IdeationAnalysisEvent, } from './ideation.js'; + +// Notification types +export type { NotificationType, Notification, NotificationsFile } from './notification.js'; +export { NOTIFICATIONS_VERSION, DEFAULT_NOTIFICATIONS_FILE } from './notification.js'; + +// Event history types +export type { + StoredEvent, + StoredEventIndex, + StoredEventSummary, + EventHistoryFilter, + EventReplayResult, + EventReplayHookResult, +} from './event-history.js'; +export { EVENT_HISTORY_VERSION, DEFAULT_EVENT_HISTORY_INDEX } from './event-history.js'; diff --git a/libs/types/src/notification.ts b/libs/types/src/notification.ts new file mode 100644 index 00000000..e134e0ea --- /dev/null +++ b/libs/types/src/notification.ts @@ -0,0 +1,58 @@ +/** + * Notification Types - Types for project-level notification system + * + * Notifications alert users when features reach specific statuses + * or when long-running operations complete. 
+ */ + +/** + * NotificationType - Types of notifications that can be created + */ +export type NotificationType = + | 'feature_waiting_approval' + | 'feature_verified' + | 'spec_regeneration_complete' + | 'agent_complete'; + +/** + * Notification - A single notification entry + */ +export interface Notification { + /** Unique identifier for the notification */ + id: string; + /** Type of notification */ + type: NotificationType; + /** Short title for display */ + title: string; + /** Longer descriptive message */ + message: string; + /** ISO timestamp when notification was created */ + createdAt: string; + /** Whether the notification has been read */ + read: boolean; + /** Whether the notification has been dismissed */ + dismissed: boolean; + /** Associated feature ID if applicable */ + featureId?: string; + /** Project path this notification belongs to */ + projectPath: string; +} + +/** + * NotificationsFile - Structure of the notifications.json file + */ +export interface NotificationsFile { + /** Version for future migrations */ + version: number; + /** List of notifications */ + notifications: Notification[]; +} + +/** Current version of the notifications file schema */ +export const NOTIFICATIONS_VERSION = 1; + +/** Default notifications file structure */ +export const DEFAULT_NOTIFICATIONS_FILE: NotificationsFile = { + version: NOTIFICATIONS_VERSION, + notifications: [], +}; diff --git a/libs/types/src/provider.ts b/libs/types/src/provider.ts index aa62b561..e934e999 100644 --- a/libs/types/src/provider.ts +++ b/libs/types/src/provider.ts @@ -18,6 +18,61 @@ import type { CodexSandboxMode, CodexApprovalPolicy } from './codex.js'; */ export type ReasoningEffort = 'none' | 'minimal' | 'low' | 'medium' | 'high' | 'xhigh'; +/** + * Default timeout in milliseconds for provider operations. + * Used as the baseline timeout for API calls and CLI operations. + */ +export const DEFAULT_TIMEOUT_MS = 30000; + +/** + * Timeout multipliers for reasoning effort levels. + * Higher reasoning effort requires more time for the model to generate reasoning tokens. + * These multipliers are applied to DEFAULT_TIMEOUT_MS. + */ +export const REASONING_TIMEOUT_MULTIPLIERS: Record<ReasoningEffort, number> = { + none: 1.0, // No reasoning, baseline timeout + minimal: 1.2, // Very quick reasoning, slight increase + low: 1.5, // Quick reasoning, moderate increase + medium: 2.0, // Balanced reasoning, double baseline + high: 3.0, // Extended reasoning, triple baseline + xhigh: 4.0, // Maximum reasoning, quadruple baseline +}; + +/** + * Calculate timeout for provider operations based on reasoning effort. + * Higher reasoning effort requires more time for the model to generate reasoning tokens. + * + * This function addresses GitHub issue #530 where Codex CLI with GPT-5.2 "xtra thinking" + * (xhigh reasoning effort) mode would get stuck because the 30-second "no output" timeout + * would trigger during extended reasoning phases. + * + * @param reasoningEffort - The reasoning effort level, defaults to 'none' if undefined. + * If an invalid value is provided, falls back to multiplier 1.0.
+ * @param baseTimeoutMs - Optional custom base timeout, defaults to DEFAULT_TIMEOUT_MS (30000ms) + * @returns The calculated timeout in milliseconds, rounded to the nearest integer + * + * @example + * // Using default base timeout (30000ms) + * calculateReasoningTimeout('high') // Returns 90000 (30000 * 3.0) + * + * @example + * // Using custom base timeout + * calculateReasoningTimeout('medium', 60000) // Returns 120000 (60000 * 2.0) + * + * @example + * // No reasoning effort (default) + * calculateReasoningTimeout() // Returns 30000 (default timeout) + * calculateReasoningTimeout(undefined) // Returns 30000 + */ +export function calculateReasoningTimeout( + reasoningEffort?: ReasoningEffort, + baseTimeoutMs: number = DEFAULT_TIMEOUT_MS +): number { + const effort = reasoningEffort ?? 'none'; + const multiplier = REASONING_TIMEOUT_MULTIPLIERS[effort] ?? 1.0; + return Math.round(baseTimeoutMs * multiplier); +} + /** * Configuration for a provider instance */ diff --git a/libs/types/src/settings.ts b/libs/types/src/settings.ts index 6e807f66..844abf1e 100644 --- a/libs/types/src/settings.ts +++ b/libs/types/src/settings.ts @@ -108,12 +108,14 @@ export type ModelProvider = 'claude' | 'cursor' | 'codex' | 'opencode'; /** * EventHookTrigger - Event types that can trigger custom hooks * + * - feature_created: A new feature was created * - feature_success: Feature completed successfully * - feature_error: Feature failed with an error * - auto_mode_complete: Auto mode finished processing all features * - auto_mode_error: Auto mode encountered a critical error and paused */ export type EventHookTrigger = + | 'feature_created' | 'feature_success' | 'feature_error' | 'auto_mode_complete' @@ -186,6 +188,7 @@ export interface EventHook { /** Human-readable labels for event hook triggers */ export const EVENT_HOOK_TRIGGER_LABELS: Record<EventHookTrigger, string> = { + feature_created: 'Feature created', feature_success: 'Feature completed successfully', feature_error: 'Feature failed with error', auto_mode_complete: 'Auto mode completed all features', @@ -296,8 +299,12 @@ export interface KeyboardShortcuts { context: string; /** Open settings */ settings: string; + /** Open project settings */ + projectSettings: string; /** Open terminal */ terminal: string; + /** Open notifications */ + notifications: string; /** Toggle sidebar visibility */ toggleSidebar: string; /** Add new feature */ @@ -799,7 +806,9 @@ export const DEFAULT_KEYBOARD_SHORTCUTS: KeyboardShortcuts = { spec: 'D', context: 'C', settings: 'S', + projectSettings: 'Shift+S', terminal: 'T', + notifications: 'X', toggleSidebar: '`', addFeature: 'N', addContextFile: 'N', diff --git a/libs/utils/src/atomic-writer.ts b/libs/utils/src/atomic-writer.ts new file mode 100644 index 00000000..fe07e5eb --- /dev/null +++ b/libs/utils/src/atomic-writer.ts @@ -0,0 +1,362 @@ +/** + * Atomic file writing utilities for JSON data + * + * Provides atomic write operations using temp-file + rename pattern, + * ensuring data integrity even during crashes or power failures.
+ */ + +import { secureFs } from '@automaker/platform'; +import path from 'path'; +import { createLogger } from './logger.js'; +import { mkdirSafe } from './fs-utils.js'; + +const logger = createLogger('AtomicWriter'); + +/** Default maximum number of backup files to keep for crash recovery */ +export const DEFAULT_BACKUP_COUNT = 3; + +/** + * Options for atomic write operations + */ +export interface AtomicWriteOptions { + /** Number of spaces for JSON indentation (default: 2) */ + indent?: number; + /** Create parent directories if they don't exist (default: false) */ + createDirs?: boolean; + /** Number of backup files to keep (0 = no backups, default: 0). When > 0, rotates .bak1, .bak2, etc. */ + backupCount?: number; +} + +/** + * Rotate backup files (.bak1 -> .bak2 -> .bak3, oldest is deleted) + * and create a new backup from the current file. + * + * @param filePath - Absolute path to the file being backed up + * @param maxBackups - Maximum number of backup files to keep + */ +export async function rotateBackups( + filePath: string, + maxBackups: number = DEFAULT_BACKUP_COUNT +): Promise<void> { + // Check if the source file exists before attempting backup + try { + await secureFs.access(filePath); + } catch { + // No existing file to backup + return; + } + + // Rotate existing backups: .bak3 is deleted, .bak2 -> .bak3, .bak1 -> .bak2 + for (let i = maxBackups; i >= 1; i--) { + const currentBackup = `${filePath}.bak${i}`; + const nextBackup = `${filePath}.bak${i + 1}`; + + try { + if (i === maxBackups) { + // Delete the oldest backup + await secureFs.unlink(currentBackup); + } else { + // Rename current backup to next slot + await secureFs.rename(currentBackup, nextBackup); + } + } catch { + // Ignore errors - backup file may not exist + } + } + + // Copy current file to .bak1 + try { + await secureFs.copyFile(filePath, `${filePath}.bak1`); + } catch (error) { + logger.warn(`Failed to create backup of ${filePath}:`, error); + // Continue with write even if backup fails + } +} + +/** + * Atomically write JSON data to a file. + * + * Uses the temp-file + rename pattern for atomicity: + * 1. Writes data to a temporary file + * 2. Atomically renames temp file to target path + * 3.
Cleans up temp file on error + * + * @param filePath - Absolute path to the target file + * @param data - Data to serialize as JSON + * @param options - Optional write options + * @throws Error if write fails (temp file is cleaned up) + * + * @example + * ```typescript + * await atomicWriteJson('/path/to/config.json', { key: 'value' }); + * await atomicWriteJson('/path/to/data.json', data, { indent: 4, createDirs: true }); + * ``` + */ +export async function atomicWriteJson<T>( + filePath: string, + data: T, + options: AtomicWriteOptions = {} +): Promise<void> { + const { indent = 2, createDirs = false, backupCount = 0 } = options; + const resolvedPath = path.resolve(filePath); + const tempPath = `${resolvedPath}.tmp.${Date.now()}`; + + // Create parent directories if requested + if (createDirs) { + const dirPath = path.dirname(resolvedPath); + await mkdirSafe(dirPath); + } + + const content = JSON.stringify(data, null, indent); + + try { + // Rotate backups before writing (if backups are enabled) + if (backupCount > 0) { + await rotateBackups(resolvedPath, backupCount); + } + + await secureFs.writeFile(tempPath, content, 'utf-8'); + await secureFs.rename(tempPath, resolvedPath); + } catch (error) { + // Clean up temp file if it exists + try { + await secureFs.unlink(tempPath); + } catch { + // Ignore cleanup errors - best effort + } + logger.error(`Failed to atomically write to ${resolvedPath}:`, error); + throw error; + } +} + +/** + * Safely read JSON from a file with fallback to default value. + * + * Returns the default value if: + * - File doesn't exist (ENOENT) + * - File content is invalid JSON + * + * @param filePath - Absolute path to the file + * @param defaultValue - Value to return if file doesn't exist or is invalid + * @returns Parsed JSON data or default value + * + * @example + * ```typescript + * const config = await readJsonFile('/path/to/config.json', { version: 1 }); + * ``` + */ +export async function readJsonFile<T>(filePath: string, defaultValue: T): Promise<T> { + const resolvedPath = path.resolve(filePath); + + try { + const content = (await secureFs.readFile(resolvedPath, 'utf-8')) as string; + return JSON.parse(content) as T; + } catch (error) { + const nodeError = error as NodeJS.ErrnoException; + if (nodeError.code === 'ENOENT') { + return defaultValue; + } + logger.error(`Error reading JSON from ${resolvedPath}:`, error); + return defaultValue; + } +} + +/** + * Atomically update a JSON file by reading, transforming, and writing. + * + * Provides a safe read-modify-write pattern: + * 1. Reads existing file (or uses default) + * 2. Applies updater function + * 3.
Atomically writes result + * + * @param filePath - Absolute path to the file + * @param defaultValue - Default value if file doesn't exist + * @param updater - Function that transforms the data + * @param options - Optional write options + * + * @example + * ```typescript + * await updateJsonAtomically( + * '/path/to/counter.json', + * { count: 0 }, + * (data) => ({ ...data, count: data.count + 1 }) + * ); + * ``` + */ +export async function updateJsonAtomically<T>( + filePath: string, + defaultValue: T, + updater: (current: T) => T | Promise<T>, + options: AtomicWriteOptions = {} +): Promise<void> { + const current = await readJsonFile(filePath, defaultValue); + const updated = await updater(current); + await atomicWriteJson(filePath, updated, options); +} + +/** + * Result of a JSON read operation with recovery information + */ +export interface ReadJsonRecoveryResult<T> { + /** The data that was successfully read */ + data: T; + /** Whether recovery was needed (main file was corrupted or missing) */ + recovered: boolean; + /** Source of the data: 'main', 'backup', 'temp', or 'default' */ + source: 'main' | 'backup' | 'temp' | 'default'; + /** Error message if the main file had an issue */ + error?: string; +} + +/** + * Options for readJsonWithRecovery + */ +export interface ReadJsonRecoveryOptions { + /** Maximum number of backup files to check (.bak1, .bak2, etc.) Default: 3 */ + maxBackups?: number; + /** Whether to automatically restore main file from backup when corrupted. Default: true */ + autoRestore?: boolean; +} + +/** + * Log a warning if recovery was needed (from backup or temp file). + * + * Use this helper to reduce duplicate logging code when using readJsonWithRecovery. + * + * @param result - The result from readJsonWithRecovery + * @param identifier - A human-readable identifier for the file being recovered (e.g., "Feature abc123") + * @param loggerInstance - Optional logger instance to use (defaults to AtomicWriter logger) + * + * @example + * ```typescript + * const result = await readJsonWithRecovery(featurePath, null); + * logRecoveryWarning(result, `Feature ${featureId}`); + * ``` + */ +export function logRecoveryWarning<T>( + result: ReadJsonRecoveryResult<T>, + identifier: string, + loggerInstance: { warn: (msg: string, ...args: unknown[]) => void } = logger +): void { + if (result.recovered && result.source !== 'default') { + loggerInstance.warn(`${identifier} was recovered from ${result.source}: ${result.error}`); + } +} + +/** + * Read JSON file with automatic recovery from backups. + * + * This function attempts to read a JSON file with fallback to backups: + * 1. Try to read the main file + * 2. If corrupted, check for temp files (.tmp.*) that might have valid data + * 3. If no valid temp file, try backup files (.bak1, .bak2, .bak3) + * 4. If all fail, return the default value + * + * Optionally restores the main file from a valid backup (autoRestore: true).
+ * + * @param filePath - Absolute path to the file + * @param defaultValue - Value to return if no valid data found + * @param options - Recovery options + * @returns Result containing the data and recovery information + * + * @example + * ```typescript + * const result = await readJsonWithRecovery('/path/to/config.json', { version: 1 }); + * if (result.recovered) { + * console.log(`Recovered from ${result.source}: ${result.error}`); + * } + * const config = result.data; + * ``` + */ +export async function readJsonWithRecovery<T>( + filePath: string, + defaultValue: T, + options: ReadJsonRecoveryOptions = {} +): Promise<ReadJsonRecoveryResult<T>> { + const { maxBackups = 3, autoRestore = true } = options; + const resolvedPath = path.resolve(filePath); + const dirPath = path.dirname(resolvedPath); + const fileName = path.basename(resolvedPath); + + // Try to read the main file first + try { + const content = (await secureFs.readFile(resolvedPath, 'utf-8')) as string; + const data = JSON.parse(content) as T; + return { data, recovered: false, source: 'main' }; + } catch (mainError) { + const nodeError = mainError as NodeJS.ErrnoException; + const errorMessage = + nodeError.code === 'ENOENT' + ? 'File does not exist' + : `Failed to parse: ${mainError instanceof Error ? mainError.message : String(mainError)}`; + + // If file doesn't exist, check for temp files or backups + logger.warn(`Main file ${resolvedPath} unavailable: ${errorMessage}`); + + // Try to find and recover from temp files first (in case of interrupted write) + try { + const files = (await secureFs.readdir(dirPath)) as string[]; + const tempFiles = files + .filter((f: string) => f.startsWith(`${fileName}.tmp.`)) + .sort() + .reverse(); // Most recent first + + for (const tempFile of tempFiles) { + const tempPath = path.join(dirPath, tempFile); + try { + const content = (await secureFs.readFile(tempPath, 'utf-8')) as string; + const data = JSON.parse(content) as T; + + logger.info(`Recovered data from temp file: ${tempPath}`); + + // Optionally restore main file from temp + if (autoRestore) { + try { + await secureFs.rename(tempPath, resolvedPath); + logger.info(`Restored main file from temp: ${tempPath}`); + } catch (restoreError) { + logger.warn(`Failed to restore main file from temp: ${restoreError}`); + } + } + + return { data, recovered: true, source: 'temp', error: errorMessage }; + } catch { + // This temp file is also corrupted, try next + continue; + } + } + } catch { + // Could not read directory, skip temp file check + } + + // Try backup files (.bak1, .bak2, .bak3) + for (let i = 1; i <= maxBackups; i++) { + const backupPath = `${resolvedPath}.bak${i}`; + try { + const content = (await secureFs.readFile(backupPath, 'utf-8')) as string; + const data = JSON.parse(content) as T; + + logger.info(`Recovered data from backup: ${backupPath}`); + + // Optionally restore main file from backup + if (autoRestore) { + try { + await secureFs.copyFile(backupPath, resolvedPath); + logger.info(`Restored main file from backup: ${backupPath}`); + } catch (restoreError) { + logger.warn(`Failed to restore main file from backup: ${restoreError}`); + } + } + + return { data, recovered: true, source: 'backup', error: errorMessage }; + } catch { + // This backup doesn't exist or is corrupted, try next + continue; + } + } + + // All recovery attempts failed, return default + logger.warn(`All recovery attempts failed for ${resolvedPath}, using default value`); + return { data: defaultValue, recovered: true, source: 'default', error: errorMessage }; + } +}
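Taken together, the helpers above give a write path (atomic temp-file rename plus optional backup rotation) and a read path (recovery from temp files and backups). A small end-to-end sketch, assuming the package is published as `@automaker/utils` in line with the repo's other `libs/` packages, and an illustrative writable path:

```ts
import {
  atomicWriteJson,
  updateJsonAtomically,
  readJsonWithRecovery,
  logRecoveryWarning,
} from '@automaker/utils';

interface CounterFile {
  version: number;
  count: number;
}

async function main() {
  const file = '/tmp/automaker-demo/counter.json';
  const empty: CounterFile = { version: 1, count: 0 };

  // Atomic write with up to 3 rotating backups (.bak1 newest .. .bak3 oldest).
  await atomicWriteJson(file, empty, { createDirs: true, backupCount: 3 });

  // Read-modify-write without risk of a torn file on crash.
  await updateJsonAtomically(file, empty, (c) => ({ ...c, count: c.count + 1 }), {
    backupCount: 3,
  });

  // Read back, falling through temp files and backups if the main file is corrupt.
  const result = await readJsonWithRecovery<CounterFile>(file, empty);
  logRecoveryWarning(result, 'Counter file');
  console.log(result.data.count, 'read from', result.source); // 1 read from main
}

main().catch(console.error);
```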
diff --git a/libs/utils/src/index.ts b/libs/utils/src/index.ts
index cb831db3..e5e7ea16 100644
--- a/libs/utils/src/index.ts
+++ b/libs/utils/src/index.ts
@@ -53,6 +53,20 @@ export {
 // File system utilities
 export { mkdirSafe, existsSafe } from './fs-utils.js';
 
+// Atomic file operations
+export {
+  atomicWriteJson,
+  readJsonFile,
+  updateJsonAtomically,
+  readJsonWithRecovery,
+  rotateBackups,
+  logRecoveryWarning,
+  DEFAULT_BACKUP_COUNT,
+  type AtomicWriteOptions,
+  type ReadJsonRecoveryResult,
+  type ReadJsonRecoveryOptions,
+} from './atomic-writer.js';
+
 // Path utilities
 export { normalizePath, pathsEqual } from './path-utils.js';
diff --git a/libs/utils/tests/atomic-writer.test.ts b/libs/utils/tests/atomic-writer.test.ts
new file mode 100644
index 00000000..1efa57d5
--- /dev/null
+++ b/libs/utils/tests/atomic-writer.test.ts
@@ -0,0 +1,709 @@
+import { describe, it, expect, beforeEach, afterEach, vi, type MockInstance } from 'vitest';
+import fs from 'fs/promises';
+import path from 'path';
+import os from 'os';
+import { secureFs } from '@automaker/platform';
+import {
+  atomicWriteJson,
+  readJsonFile,
+  updateJsonAtomically,
+  readJsonWithRecovery,
+} from '../src/atomic-writer';
+
+// Mock secureFs
+vi.mock('@automaker/platform', () => ({
+  secureFs: {
+    writeFile: vi.fn(),
+    readFile: vi.fn(),
+    rename: vi.fn(),
+    unlink: vi.fn(),
+    readdir: vi.fn(),
+    copyFile: vi.fn(),
+    access: vi.fn(),
+    lstat: vi.fn(),
+    mkdir: vi.fn(),
+  },
+}));
+
+// Mock logger to suppress output during tests
+vi.mock('../src/logger.js', () => ({
+  createLogger: () => ({
+    info: vi.fn(),
+    warn: vi.fn(),
+    error: vi.fn(),
+    debug: vi.fn(),
+  }),
+}));
+
+describe('atomic-writer.ts', () => {
+  let tempDir: string;
+
+  beforeEach(async () => {
+    // Create a temporary directory for integration tests
+    tempDir = await fs.mkdtemp(path.join(os.tmpdir(), 'atomic-writer-test-'));
+    vi.clearAllMocks();
+  });
+
+  afterEach(async () => {
+    // Clean up temporary directory
+    try {
+      await fs.rm(tempDir, { recursive: true, force: true });
+    } catch {
+      // Ignore cleanup errors
+    }
+  });
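+
+  // The suites below exercise the helpers through the mocked secureFs layer:
+  // each test arranges mock return values, invokes the function under test,
+  // and asserts on the recorded mock calls rather than touching the real
+  // filesystem. tempDir is used only to build realistic absolute paths.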
+
+  describe('atomicWriteJson', () => {
+    it('should write JSON data atomically', async () => {
+      const filePath = path.join(tempDir, 'test.json');
+      const data = { key: 'value', number: 42 };
+
+      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
+      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
+
+      await atomicWriteJson(filePath, data);
+
+      // Verify writeFile was called with temp file path and JSON content
+      expect(secureFs.writeFile).toHaveBeenCalledTimes(1);
+      const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0];
+      expect(writeCall[0]).toMatch(/\.tmp\.\d+$/);
+      expect(writeCall[1]).toBe(JSON.stringify(data, null, 2));
+      expect(writeCall[2]).toBe('utf-8');
+
+      // Verify rename was called with temp -> target
+      expect(secureFs.rename).toHaveBeenCalledTimes(1);
+      const renameCall = (secureFs.rename as unknown as MockInstance).mock.calls[0];
+      expect(renameCall[0]).toMatch(/\.tmp\.\d+$/);
+      expect(renameCall[1]).toBe(path.resolve(filePath));
+    });
+
+    it('should use custom indentation', async () => {
+      const filePath = path.join(tempDir, 'test.json');
+      const data = { key: 'value' };
+
+      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
+      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
+
+      await atomicWriteJson(filePath, data, { indent: 4 });
+
+      const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0];
+      expect(writeCall[1]).toBe(JSON.stringify(data, null, 4));
+    });
+
+    it('should clean up temp file on write failure', async () => {
+      const filePath = path.join(tempDir, 'test.json');
+      const data = { key: 'value' };
+
+      const writeError = new Error('Write failed');
+      (secureFs.writeFile as unknown as MockInstance).mockRejectedValue(writeError);
+      (secureFs.unlink as unknown as MockInstance).mockResolvedValue(undefined);
+
+      await expect(atomicWriteJson(filePath, data)).rejects.toThrow('Write failed');
+
+      expect(secureFs.unlink).toHaveBeenCalledTimes(1);
+    });
+
+    it('should clean up temp file on rename failure', async () => {
+      const filePath = path.join(tempDir, 'test.json');
+      const data = { key: 'value' };
+
+      const renameError = new Error('Rename failed');
+      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
+      (secureFs.rename as unknown as MockInstance).mockRejectedValue(renameError);
+      (secureFs.unlink as unknown as MockInstance).mockResolvedValue(undefined);
+
+      await expect(atomicWriteJson(filePath, data)).rejects.toThrow('Rename failed');
+
+      expect(secureFs.unlink).toHaveBeenCalledTimes(1);
+    });
+
+    it('should ignore cleanup errors', async () => {
+      const filePath = path.join(tempDir, 'test.json');
+      const data = { key: 'value' };
+
+      const writeError = new Error('Write failed');
+      const unlinkError = new Error('Unlink failed');
+      (secureFs.writeFile as unknown as MockInstance).mockRejectedValue(writeError);
+      (secureFs.unlink as unknown as MockInstance).mockRejectedValue(unlinkError);
+
+      // Should still throw the original error, not the cleanup error
+      await expect(atomicWriteJson(filePath, data)).rejects.toThrow('Write failed');
+    });
+
+    it('should resolve relative paths', async () => {
+      const relativePath = 'test.json';
+      const data = { key: 'value' };
+
+      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
+      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
+
+      await atomicWriteJson(relativePath, data);
+
+      const renameCall = (secureFs.rename as unknown as MockInstance).mock.calls[0];
+      expect(renameCall[1]).toBe(path.resolve(relativePath));
+    });
+
+    it('should handle arrays as data', async () => {
+      const filePath = path.join(tempDir, 'array.json');
+      const data = [1, 2, 3, { nested: 'value' }];
+
+      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
+      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
+
+      await atomicWriteJson(filePath, data);
+
+      const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0];
+      expect(writeCall[1]).toBe(JSON.stringify(data, null, 2));
+    });
+
+    it('should handle null and primitive values', async () => {
+      const filePath = path.join(tempDir, 'primitive.json');
+
+      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
+      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
+
+      await atomicWriteJson(filePath, null);
+      expect((secureFs.writeFile as unknown as MockInstance).mock.calls[0][1]).toBe('null');
+
+      await atomicWriteJson(filePath, 'string');
+      expect((secureFs.writeFile as unknown as MockInstance).mock.calls[1][1]).toBe('"string"');
+
+      await atomicWriteJson(filePath, 123);
+      expect((secureFs.writeFile as unknown as MockInstance).mock.calls[2][1]).toBe('123');
+    });
+
+    it('should create directories when createDirs is true', async () => {
+      const filePath = path.join(tempDir, 'nested', 'deep', 'test.json');
+      const data = { key: 'value' };
+
+      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
+      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
+      // Mock lstat to indicate directory already exists
+      (secureFs.lstat as unknown as MockInstance).mockResolvedValue({
+        isDirectory: () => true,
+        isSymbolicLink: () => false,
+      });
+
+      await atomicWriteJson(filePath, data, { createDirs: true });
+
+      expect(secureFs.writeFile).toHaveBeenCalled();
+    });
+  });
+
+  describe('readJsonFile', () => {
+    it('should read and parse JSON file', async () => {
+      const filePath = path.join(tempDir, 'read.json');
+      const data = { key: 'value', count: 5 };
+
+      (secureFs.readFile as unknown as MockInstance).mockResolvedValue(JSON.stringify(data));
+
+      const result = await readJsonFile(filePath, {});
+
+      expect(result).toEqual(data);
+      expect(secureFs.readFile).toHaveBeenCalledWith(path.resolve(filePath), 'utf-8');
+    });
+
+    it('should return default value when file does not exist', async () => {
+      const filePath = path.join(tempDir, 'nonexistent.json');
+      const defaultValue = { default: true };
+
+      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
+      enoentError.code = 'ENOENT';
+      (secureFs.readFile as unknown as MockInstance).mockRejectedValue(enoentError);
+
+      const result = await readJsonFile(filePath, defaultValue);
+
+      expect(result).toEqual(defaultValue);
+    });
+
+    it('should return default value when JSON is invalid', async () => {
+      const filePath = path.join(tempDir, 'invalid.json');
+      const defaultValue = { default: true };
+
+      (secureFs.readFile as unknown as MockInstance).mockResolvedValue('not valid json');
+
+      const result = await readJsonFile(filePath, defaultValue);
+
+      expect(result).toEqual(defaultValue);
+    });
+
+    it('should return default value for other read errors', async () => {
+      const filePath = path.join(tempDir, 'error.json');
+      const defaultValue = { default: true };
+
+      const accessError = new Error('Access denied') as NodeJS.ErrnoException;
+      accessError.code = 'EACCES';
+      (secureFs.readFile as unknown as MockInstance).mockRejectedValue(accessError);
+
+      const result = await readJsonFile(filePath, defaultValue);
+
+      expect(result).toEqual(defaultValue);
+    });
+
+    it('should handle empty object as default', async () => {
+      const filePath = path.join(tempDir, 'nonexistent.json');
+
+      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
+      enoentError.code = 'ENOENT';
+      (secureFs.readFile as unknown as MockInstance).mockRejectedValue(enoentError);
+
+      const result = await readJsonFile<Record<string, unknown>>(filePath, {});
+
+      expect(result).toEqual({});
+    });
+
+    it('should handle array as default', async () => {
+      const filePath = path.join(tempDir, 'nonexistent.json');
+
+      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
+      enoentError.code = 'ENOENT';
+      (secureFs.readFile as unknown as MockInstance).mockRejectedValue(enoentError);
+
+      const result = await readJsonFile(filePath, []);
+
+      expect(result).toEqual([]);
+    });
+
+    it('should parse nested objects correctly', async () => {
+      const filePath = path.join(tempDir, 'nested.json');
+      const data = {
+        level1: {
+          level2: {
+            value: 'deep',
+            array: [1, 2, { nested: true }],
+          },
+        },
+      };
+
+      (secureFs.readFile as unknown as MockInstance).mockResolvedValue(JSON.stringify(data));
+
+      const result = await readJsonFile(filePath, {});
+
+      expect(result).toEqual(data);
+    });
+  });
+
+  describe('updateJsonAtomically', () => {
+    it('should read,
update, and write file atomically', async () => { + const filePath = path.join(tempDir, 'update.json'); + const initialData = { count: 5 }; + const defaultValue = { count: 0 }; + + (secureFs.readFile as unknown as MockInstance).mockResolvedValue(JSON.stringify(initialData)); + (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined); + (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined); + + await updateJsonAtomically(filePath, defaultValue, (data) => ({ + ...data, + count: data.count + 1, + })); + + // Verify the write was called with updated data + const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0]; + const writtenData = JSON.parse(writeCall[1]); + expect(writtenData.count).toBe(6); + }); + + it('should use default value when file does not exist', async () => { + const filePath = path.join(tempDir, 'new.json'); + const defaultValue = { count: 0 }; + + const enoentError = new Error('File not found') as NodeJS.ErrnoException; + enoentError.code = 'ENOENT'; + (secureFs.readFile as unknown as MockInstance).mockRejectedValue(enoentError); + (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined); + (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined); + + await updateJsonAtomically(filePath, defaultValue, (data) => ({ + ...data, + count: data.count + 1, + })); + + const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0]; + const writtenData = JSON.parse(writeCall[1]); + expect(writtenData.count).toBe(1); + }); + + it('should support async updater function', async () => { + const filePath = path.join(tempDir, 'async.json'); + const initialData = { value: 'initial' }; + + (secureFs.readFile as unknown as MockInstance).mockResolvedValue(JSON.stringify(initialData)); + (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined); + (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined); + + await updateJsonAtomically(filePath, {}, async (data) => { + await new Promise((resolve) => setTimeout(resolve, 10)); + return { ...data, value: 'updated' }; + }); + + const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0]; + const writtenData = JSON.parse(writeCall[1]); + expect(writtenData.value).toBe('updated'); + }); + + it('should pass through options to atomicWriteJson', async () => { + const filePath = path.join(tempDir, 'options.json'); + + const enoentError = new Error('File not found') as NodeJS.ErrnoException; + enoentError.code = 'ENOENT'; + (secureFs.readFile as unknown as MockInstance).mockRejectedValue(enoentError); + (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined); + (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined); + + await updateJsonAtomically(filePath, { key: 'value' }, (d) => d, { indent: 4 }); + + const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0]; + expect(writeCall[1]).toBe(JSON.stringify({ key: 'value' }, null, 4)); + }); + }); + + describe('readJsonWithRecovery', () => { + it('should return main file data when available', async () => { + const filePath = path.join(tempDir, 'main.json'); + const data = { main: true }; + + (secureFs.readFile as unknown as MockInstance).mockResolvedValue(JSON.stringify(data)); + + const result = await readJsonWithRecovery(filePath, {}); + + expect(result.data).toEqual(data); + expect(result.recovered).toBe(false); + expect(result.source).toBe('main'); + 
expect(result.error).toBeUndefined(); + }); + + it('should recover from temp file when main file is missing', async () => { + const filePath = path.join(tempDir, 'data.json'); + const tempData = { fromTemp: true }; + const fileName = path.basename(filePath); + + const enoentError = new Error('File not found') as NodeJS.ErrnoException; + enoentError.code = 'ENOENT'; + + (secureFs.readFile as unknown as MockInstance) + .mockRejectedValueOnce(enoentError) // Main file + .mockResolvedValueOnce(JSON.stringify(tempData)); // Temp file + + (secureFs.readdir as unknown as MockInstance).mockResolvedValue([ + `${fileName}.tmp.1234567890`, + 'other-file.json', + ]); + + (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined); + + const result = await readJsonWithRecovery(filePath, {}); + + expect(result.data).toEqual(tempData); + expect(result.recovered).toBe(true); + expect(result.source).toBe('temp'); + expect(result.error).toBe('File does not exist'); + }); + + it('should recover from backup file when main and temp are unavailable', async () => { + const filePath = path.join(tempDir, 'data.json'); + const backupData = { fromBackup: true }; + + const enoentError = new Error('File not found') as NodeJS.ErrnoException; + enoentError.code = 'ENOENT'; + + (secureFs.readFile as unknown as MockInstance) + .mockRejectedValueOnce(enoentError) // Main file + .mockRejectedValueOnce(enoentError) // backup1 + .mockResolvedValueOnce(JSON.stringify(backupData)); // backup2 + + (secureFs.readdir as unknown as MockInstance).mockResolvedValue([]); // No temp files + + (secureFs.copyFile as unknown as MockInstance).mockResolvedValue(undefined); + + const result = await readJsonWithRecovery(filePath, {}); + + expect(result.data).toEqual(backupData); + expect(result.recovered).toBe(true); + expect(result.source).toBe('backup'); + }); + + it('should return default when all recovery attempts fail', async () => { + const filePath = path.join(tempDir, 'data.json'); + const defaultValue = { default: true }; + + const enoentError = new Error('File not found') as NodeJS.ErrnoException; + enoentError.code = 'ENOENT'; + + (secureFs.readFile as unknown as MockInstance).mockRejectedValue(enoentError); + (secureFs.readdir as unknown as MockInstance).mockResolvedValue([]); + + const result = await readJsonWithRecovery(filePath, defaultValue); + + expect(result.data).toEqual(defaultValue); + expect(result.recovered).toBe(true); + expect(result.source).toBe('default'); + expect(result.error).toBe('File does not exist'); + }); + + it('should try multiple temp files in order', async () => { + const filePath = path.join(tempDir, 'data.json'); + const fileName = path.basename(filePath); + const validTempData = { valid: true }; + + const enoentError = new Error('File not found') as NodeJS.ErrnoException; + enoentError.code = 'ENOENT'; + + (secureFs.readFile as unknown as MockInstance) + .mockRejectedValueOnce(enoentError) // Main file + .mockResolvedValueOnce('invalid json') // First temp file (invalid) + .mockResolvedValueOnce(JSON.stringify(validTempData)); // Second temp file + + (secureFs.readdir as unknown as MockInstance).mockResolvedValue([ + `${fileName}.tmp.9999999999`, // Most recent + `${fileName}.tmp.1111111111`, // Older + ]); + + (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined); + + const result = await readJsonWithRecovery(filePath, {}); + + expect(result.data).toEqual(validTempData); + expect(result.source).toBe('temp'); + }); + + it('should try multiple backup files in order', 
async () => { + const filePath = path.join(tempDir, 'data.json'); + const backupData = { backup2: true }; + + const enoentError = new Error('File not found') as NodeJS.ErrnoException; + enoentError.code = 'ENOENT'; + + (secureFs.readFile as unknown as MockInstance) + .mockRejectedValueOnce(enoentError) // Main file + .mockRejectedValueOnce(enoentError) // .bak1 + .mockResolvedValueOnce(JSON.stringify(backupData)); // .bak2 + + (secureFs.readdir as unknown as MockInstance).mockResolvedValue([]); + + (secureFs.copyFile as unknown as MockInstance).mockResolvedValue(undefined); + + const result = await readJsonWithRecovery(filePath, {}); + + expect(result.data).toEqual(backupData); + expect(result.source).toBe('backup'); + + // Verify it tried .bak1 first + expect(secureFs.readFile).toHaveBeenNthCalledWith( + 2, + `${path.resolve(filePath)}.bak1`, + 'utf-8' + ); + }); + + it('should respect maxBackups option', async () => { + const filePath = path.join(tempDir, 'data.json'); + const defaultValue = { default: true }; + + const enoentError = new Error('File not found') as NodeJS.ErrnoException; + enoentError.code = 'ENOENT'; + + (secureFs.readFile as unknown as MockInstance).mockRejectedValue(enoentError); + (secureFs.readdir as unknown as MockInstance).mockResolvedValue([]); + + const result = await readJsonWithRecovery(filePath, defaultValue, { maxBackups: 1 }); + + expect(result.source).toBe('default'); + // Should only have tried main + 1 backup + expect(secureFs.readFile).toHaveBeenCalledTimes(2); + }); + + it('should not auto-restore when autoRestore is false', async () => { + const filePath = path.join(tempDir, 'data.json'); + const fileName = path.basename(filePath); + const tempData = { fromTemp: true }; + + const enoentError = new Error('File not found') as NodeJS.ErrnoException; + enoentError.code = 'ENOENT'; + + (secureFs.readFile as unknown as MockInstance) + .mockRejectedValueOnce(enoentError) + .mockResolvedValueOnce(JSON.stringify(tempData)); + + (secureFs.readdir as unknown as MockInstance).mockResolvedValue([`${fileName}.tmp.123`]); + + const result = await readJsonWithRecovery(filePath, {}, { autoRestore: false }); + + expect(result.data).toEqual(tempData); + expect(secureFs.rename).not.toHaveBeenCalled(); + expect(secureFs.copyFile).not.toHaveBeenCalled(); + }); + + it('should handle directory read errors gracefully', async () => { + const filePath = path.join(tempDir, 'data.json'); + const backupData = { backup: true }; + + const enoentError = new Error('File not found') as NodeJS.ErrnoException; + enoentError.code = 'ENOENT'; + + (secureFs.readFile as unknown as MockInstance) + .mockRejectedValueOnce(enoentError) // Main file + .mockResolvedValueOnce(JSON.stringify(backupData)); // backup1 + + (secureFs.readdir as unknown as MockInstance).mockRejectedValue(new Error('Dir read failed')); + (secureFs.copyFile as unknown as MockInstance).mockResolvedValue(undefined); + + const result = await readJsonWithRecovery(filePath, {}); + + // Should skip temp files and go to backups + expect(result.data).toEqual(backupData); + expect(result.source).toBe('backup'); + }); + + it('should handle corrupted main file with valid error message', async () => { + const filePath = path.join(tempDir, 'corrupted.json'); + const defaultValue = { default: true }; + + const parseError = new SyntaxError('Unexpected token'); + (secureFs.readFile as unknown as MockInstance).mockResolvedValueOnce('{{invalid'); + (secureFs.readdir as unknown as MockInstance).mockResolvedValue([]); + + // Mock to actually 
throw parse error
+      (secureFs.readFile as unknown as MockInstance).mockImplementationOnce(() => {
+        return Promise.resolve('{{invalid json');
+      });
+
+      const result = await readJsonWithRecovery(filePath, defaultValue);
+
+      expect(result.recovered).toBe(true);
+      expect(result.error).toContain('Failed to parse');
+    });
+
+    it('should handle restore failures gracefully', async () => {
+      const filePath = path.join(tempDir, 'data.json');
+      const fileName = path.basename(filePath);
+      const tempData = { fromTemp: true };
+
+      const enoentError = new Error('File not found') as NodeJS.ErrnoException;
+      enoentError.code = 'ENOENT';
+
+      (secureFs.readFile as unknown as MockInstance)
+        .mockRejectedValueOnce(enoentError)
+        .mockResolvedValueOnce(JSON.stringify(tempData));
+
+      (secureFs.readdir as unknown as MockInstance).mockResolvedValue([`${fileName}.tmp.123`]);
+      (secureFs.rename as unknown as MockInstance).mockRejectedValue(new Error('Restore failed'));
+
+      const result = await readJsonWithRecovery(filePath, {});
+
+      // Should still return data even if restore failed
+      expect(result.data).toEqual(tempData);
+      expect(result.source).toBe('temp');
+    });
+  });
+
+  describe('Edge cases', () => {
+    it('should handle empty file path gracefully', async () => {
+      (secureFs.readFile as unknown as MockInstance).mockRejectedValue(new Error('Invalid path'));
+
+      const result = await readJsonFile('', { default: true });
+
+      expect(result).toEqual({ default: true });
+    });
+
+    it('should handle special characters in file path', async () => {
+      const filePath = path.join(tempDir, 'file with spaces & special!.json');
+      const data = { special: 'chars' };
+
+      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
+      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
+
+      await atomicWriteJson(filePath, data);
+
+      expect(secureFs.writeFile).toHaveBeenCalled();
+    });
+
+    it('should handle very large objects', async () => {
+      const filePath = path.join(tempDir, 'large.json');
+      const largeArray = Array.from({ length: 10000 }, (_, i) => ({
+        id: i,
+        data: `item-${i}`,
+      }));
+
+      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
+      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
+
+      await atomicWriteJson(filePath, largeArray);
+
+      const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0];
+      expect(JSON.parse(writeCall[1])).toEqual(largeArray);
+    });
+
+    it('should handle unicode content', async () => {
+      const filePath = path.join(tempDir, 'unicode.json');
+      const data = { emoji: '🎉', japanese: 'こんにちは', chinese: '你好' };
+
+      (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined);
+      (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined);
+
+      await atomicWriteJson(filePath, data);
+
+      const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0];
+      expect(JSON.parse(writeCall[1])).toEqual(data);
+    });
+
+    it('should handle circular reference error in JSON', async () => {
+      const filePath = path.join(tempDir, 'circular.json');
+      const circular: Record<string, unknown> = { key: 'value' };
+      circular.self = circular;
+
+      await expect(atomicWriteJson(filePath, circular)).rejects.toThrow();
+    });
+  });
+
+  describe('Type safety', () => {
+    interface TestConfig {
+      version: number;
+      settings: {
+        enabled: boolean;
+        name: string;
+      };
+    }
+
+    it('should preserve types in readJsonFile', async () => {
+      const filePath = path.join(tempDir, 'config.json');
+      const
expected: TestConfig = { + version: 1, + settings: { enabled: true, name: 'test' }, + }; + + (secureFs.readFile as unknown as MockInstance).mockResolvedValue(JSON.stringify(expected)); + + const result = await readJsonFile(filePath, { + version: 0, + settings: { enabled: false, name: '' }, + }); + + expect(result.version).toBe(1); + expect(result.settings.enabled).toBe(true); + expect(result.settings.name).toBe('test'); + }); + + it('should preserve types in updateJsonAtomically', async () => { + const filePath = path.join(tempDir, 'counter.json'); + + interface Counter { + count: number; + } + + (secureFs.readFile as unknown as MockInstance).mockResolvedValue( + JSON.stringify({ count: 5 }) + ); + (secureFs.writeFile as unknown as MockInstance).mockResolvedValue(undefined); + (secureFs.rename as unknown as MockInstance).mockResolvedValue(undefined); + + await updateJsonAtomically(filePath, { count: 0 }, (data) => ({ + count: data.count + 1, + })); + + const writeCall = (secureFs.writeFile as unknown as MockInstance).mock.calls[0]; + const writtenData: Counter = JSON.parse(writeCall[1]); + expect(writtenData.count).toBe(6); + }); + }); +}); diff --git a/package-lock.json b/package-lock.json index 45fa6af8..d40c4f73 100644 --- a/package-lock.json +++ b/package-lock.json @@ -8950,9 +8950,9 @@ } }, "node_modules/diff": { - "version": "8.0.2", - "resolved": "https://registry.npmjs.org/diff/-/diff-8.0.2.tgz", - "integrity": "sha512-sSuxWU5j5SR9QQji/o2qMvqRNYRDOcBTgsJ/DeCf4iSN4gW+gNMXM7wFIP+fdXZxoNiAnHUTGjCr+TSWXdRDKg==", + "version": "8.0.3", + "resolved": "https://registry.npmjs.org/diff/-/diff-8.0.3.tgz", + "integrity": "sha512-qejHi7bcSD4hQAZE0tNAawRK1ZtafHDmMTMkrrIGgSLl7hTnQHmKCeB45xAcbfTqK2zowkM3j3bHt/4b/ARbYQ==", "dev": true, "license": "BSD-3-Clause", "engines": { @@ -10803,9 +10803,9 @@ } }, "node_modules/hono": { - "version": "4.11.3", - "resolved": "https://registry.npmjs.org/hono/-/hono-4.11.3.tgz", - "integrity": "sha512-PmQi306+M/ct/m5s66Hrg+adPnkD5jiO6IjA7WhWw0gSBSo1EcRegwuI1deZ+wd5pzCGynCcn2DprnE4/yEV4w==", + "version": "4.11.4", + "resolved": "https://registry.npmjs.org/hono/-/hono-4.11.4.tgz", + "integrity": "sha512-U7tt8JsyrxSRKspfhtLET79pU8K+tInj5QZXs1jSugO1Vq5dFj3kmZsRldo29mTBfcjDRVRXrEZ6LS63Cog9ZA==", "license": "MIT", "peer": true, "engines": { diff --git a/package.json b/package.json index 27ef27bb..815745e8 100644 --- a/package.json +++ b/package.json @@ -1,6 +1,6 @@ { "name": "automaker", - "version": "1.0.0", + "version": "0.12.0rc", "private": true, "engines": { "node": ">=22.0.0 <23.0.0" @@ -12,7 +12,8 @@ "scripts": { "postinstall": "node -e \"const fs=require('fs');if(process.platform==='darwin'){['darwin-arm64','darwin-x64'].forEach(a=>{const p='node_modules/node-pty/prebuilds/'+a+'/spawn-helper';if(fs.existsSync(p))fs.chmodSync(p,0o755)})}\" && node scripts/fix-lockfile-urls.mjs", "fix:lockfile": "node scripts/fix-lockfile-urls.mjs", - "dev": "node dev.mjs", + "dev": "./start-automaker.sh", + "start": "./start-automaker.sh --production", "_dev:web": "npm run dev:web --workspace=apps/ui", "_dev:electron": "npm run dev:electron --workspace=apps/ui", "_dev:electron:debug": "npm run dev:electron:debug --workspace=apps/ui", diff --git a/scripts/launcher-utils.mjs b/scripts/launcher-utils.mjs deleted file mode 100644 index 1dcdab7f..00000000 --- a/scripts/launcher-utils.mjs +++ /dev/null @@ -1,1095 +0,0 @@ -/** - * Shared utilities for Automaker launcher scripts (dev.mjs) - * - * This module contains cross-platform utilities for: - * - Process management (ports, killing 
processes) - * - Terminal output (colors, logging) - * - npm/npx command execution - * - User prompts - * - Health checks - * - * SECURITY NOTE: Uses a restricted fs wrapper that only allows - * operations within a specified base directory. - */ - -import { execSync } from 'child_process'; -import fsNative, { statSync } from 'fs'; -import http from 'http'; -import path from 'path'; -import readline from 'readline'; -import { createRequire } from 'module'; - -const require = createRequire(import.meta.url); -const treeKill = require('tree-kill'); -const crossSpawn = require('cross-spawn'); - -// ============================================================================= -// Terminal Colors -// ============================================================================= - -export const colors = { - green: '\x1b[0;32m', - blue: '\x1b[0;34m', - yellow: '\x1b[1;33m', - red: '\x1b[0;31m', - reset: '\x1b[0m', -}; - -export const isWindows = process.platform === 'win32'; - -// ============================================================================= -// Restricted fs wrapper - only allows operations within a base directory -// ============================================================================= - -/** - * Create a restricted fs wrapper for a given base directory - * @param {string} baseDir - The base directory to restrict operations to - * @param {string} scriptName - Name of the calling script for error messages - * @returns {object} - Restricted fs operations - */ -export function createRestrictedFs(baseDir, scriptName = 'launcher') { - const normalizedBase = path.resolve(baseDir); - - function validatePath(targetPath) { - const resolved = path.resolve(baseDir, targetPath); - if (!resolved.startsWith(normalizedBase + path.sep) && resolved !== normalizedBase) { - throw new Error( - `[${scriptName}] Security: Path access denied outside script directory: ${targetPath}` - ); - } - return resolved; - } - - return { - existsSync(targetPath) { - const validated = validatePath(targetPath); - return fsNative.existsSync(validated); - }, - mkdirSync(targetPath, options) { - const validated = validatePath(targetPath); - return fsNative.mkdirSync(validated, options); - }, - createWriteStream(targetPath) { - const validated = validatePath(targetPath); - return fsNative.createWriteStream(validated); - }, - }; -} - -// ============================================================================= -// Logging -// ============================================================================= - -/** - * Print colored output - * @param {string} message - Message to print - * @param {string} color - Color name (green, blue, yellow, red, reset) - */ -export function log(message, color = 'reset') { - console.log(`${colors[color]}${message}${colors.reset}`); -} - -// ============================================================================= -// Command Execution -// ============================================================================= - -/** - * Execute a command synchronously and return stdout - * @param {string} command - Command to execute - * @param {object} options - execSync options - * @returns {string|null} - Command output or null on error - */ -export function execCommand(command, options = {}) { - try { - return execSync(command, { - encoding: 'utf8', - stdio: 'pipe', - ...options, - }).trim(); - } catch { - return null; - } -} - -/** - * Run npm command using cross-spawn for Windows compatibility - * @param {string[]} args - npm command arguments - * @param {object} options - spawn 
options - * @param {string} cwd - Working directory - * @returns {ChildProcess} - Spawned process - */ -export function runNpm(args, options = {}, cwd = process.cwd()) { - const { env, ...restOptions } = options; - const spawnOptions = { - stdio: 'inherit', - cwd, - ...restOptions, - env: { - ...process.env, - ...(env || {}), - }, - }; - return crossSpawn('npm', args, spawnOptions); -} - -/** - * Run an npm command and wait for completion - * @param {string[]} args - npm command arguments - * @param {object} options - spawn options - * @param {string} cwd - Working directory - * @returns {Promise} - */ -export function runNpmAndWait(args, options = {}, cwd = process.cwd()) { - const child = runNpm(args, options, cwd); - return new Promise((resolve, reject) => { - child.on('close', (code) => { - if (code === 0) resolve(); - else reject(new Error(`npm ${args.join(' ')} failed with code ${code}`)); - }); - child.on('error', (err) => reject(err)); - }); -} - -/** - * Run npx command using cross-spawn for Windows compatibility - * @param {string[]} args - npx command arguments - * @param {object} options - spawn options - * @param {string} cwd - Working directory - * @returns {ChildProcess} - Spawned process - */ -export function runNpx(args, options = {}, cwd = process.cwd()) { - const { env, ...restOptions } = options; - const spawnOptions = { - stdio: 'inherit', - cwd, - ...restOptions, - env: { - ...process.env, - ...(env || {}), - }, - }; - return crossSpawn('npx', args, spawnOptions); -} - -// ============================================================================= -// Process Management -// ============================================================================= - -/** - * Get process IDs using a specific port (cross-platform) - * @param {number} port - Port number to check - * @returns {number[]} - Array of PIDs using the port - */ -export function getProcessesOnPort(port) { - const pids = new Set(); - - if (isWindows) { - try { - const output = execCommand(`netstat -ano | findstr :${port}`); - if (output) { - const lines = output.split('\n'); - for (const line of lines) { - const match = line.match(/:\d+\s+.*?(\d+)\s*$/); - if (match) { - const pid = parseInt(match[1], 10); - if (pid > 0) pids.add(pid); - } - } - } - } catch { - // Ignore errors - } - } else { - try { - const output = execCommand(`lsof -ti:${port}`); - if (output) { - output.split('\n').forEach((pid) => { - const parsed = parseInt(pid.trim(), 10); - if (parsed > 0) pids.add(parsed); - }); - } - } catch { - // Ignore errors - } - } - - return Array.from(pids); -} - -/** - * Kill a process by PID (cross-platform) - * @param {number} pid - Process ID to kill - * @returns {boolean} - Whether the kill succeeded - */ -export function killProcess(pid) { - try { - if (isWindows) { - execCommand(`taskkill /F /PID ${pid}`); - } else { - process.kill(pid, 'SIGKILL'); - } - return true; - } catch { - return false; - } -} - -/** - * Check if a port is in use (without killing) - * @param {number} port - Port number to check - * @returns {boolean} - Whether the port is in use - */ -export function isPortInUse(port) { - const pids = getProcessesOnPort(port); - return pids.length > 0; -} - -/** - * Kill processes on a port and wait for it to be freed - * @param {number} port - Port number to free - * @returns {Promise} - Whether the port was freed - */ -export async function killPort(port) { - const pids = getProcessesOnPort(port); - - if (pids.length === 0) { - log(`✓ Port ${port} is available`, 'green'); - return true; - } 
- - log(`Killing process(es) on port ${port}: ${pids.join(', ')}`, 'yellow'); - - for (const pid of pids) { - killProcess(pid); - } - - // Wait for port to be freed (max 5 seconds) - for (let i = 0; i < 10; i++) { - await sleep(500); - const remainingPids = getProcessesOnPort(port); - if (remainingPids.length === 0) { - log(`✓ Port ${port} is now free`, 'green'); - return true; - } - } - - log(`Warning: Port ${port} may still be in use`, 'red'); - return false; -} - -/** - * Kill a process tree using tree-kill - * @param {number} pid - Root process ID - * @returns {Promise} - */ -export function killProcessTree(pid) { - return new Promise((resolve) => { - if (!pid) { - resolve(); - return; - } - treeKill(pid, 'SIGTERM', (err) => { - if (err) { - treeKill(pid, 'SIGKILL', () => resolve()); - } else { - resolve(); - } - }); - }); -} - -// ============================================================================= -// Utilities -// ============================================================================= - -/** - * Sleep for a given number of milliseconds - * @param {number} ms - Milliseconds to sleep - * @returns {Promise} - */ -export function sleep(ms) { - return new Promise((resolve) => setTimeout(resolve, ms)); -} - -/** - * Check if the server health endpoint is responding - * @param {number} port - Server port (default 3008) - * @returns {Promise} - Whether the server is healthy - */ -export function checkHealth(port = 3008) { - return new Promise((resolve) => { - const req = http.get(`http://localhost:${port}/api/health`, (res) => { - resolve(res.statusCode === 200); - }); - req.on('error', () => resolve(false)); - req.setTimeout(2000, () => { - req.destroy(); - resolve(false); - }); - }); -} - -/** - * Prompt the user for input - * @param {string} question - Question to ask - * @returns {Promise} - User's answer - */ -export function prompt(question) { - const rl = readline.createInterface({ - input: process.stdin, - output: process.stdout, - }); - - return new Promise((resolve) => { - rl.question(question, (answer) => { - rl.close(); - resolve(answer.trim()); - }); - }); -} - -// ============================================================================= -// Port Configuration Flow -// ============================================================================= - -/** - * Check ports and prompt user for resolution if in use - * @param {object} options - Configuration options - * @param {number} options.defaultWebPort - Default web port (3007) - * @param {number} options.defaultServerPort - Default server port (3008) - * @returns {Promise<{webPort: number, serverPort: number, corsOriginEnv: string}>} - */ -export async function resolvePortConfiguration({ - defaultWebPort = 3007, - defaultServerPort = 3008, -} = {}) { - log(`Checking for processes on ports ${defaultWebPort} and ${defaultServerPort}...`, 'yellow'); - - const webPortInUse = isPortInUse(defaultWebPort); - const serverPortInUse = isPortInUse(defaultServerPort); - - let webPort = defaultWebPort; - let serverPort = defaultServerPort; - - if (webPortInUse || serverPortInUse) { - console.log(''); - if (webPortInUse) { - const pids = getProcessesOnPort(defaultWebPort); - log(`⚠ Port ${defaultWebPort} is in use by process(es): ${pids.join(', ')}`, 'yellow'); - } - if (serverPortInUse) { - const pids = getProcessesOnPort(defaultServerPort); - log(`⚠ Port ${defaultServerPort} is in use by process(es): ${pids.join(', ')}`, 'yellow'); - } - console.log(''); - - while (true) { - const choice = await prompt( - 'What would you 
like to do? (k)ill processes, (u)se different ports, or (c)ancel: ' - ); - const lowerChoice = choice.toLowerCase(); - - if (lowerChoice === 'k' || lowerChoice === 'kill') { - if (webPortInUse) { - await killPort(defaultWebPort); - } else { - log(`✓ Port ${defaultWebPort} is available`, 'green'); - } - if (serverPortInUse) { - await killPort(defaultServerPort); - } else { - log(`✓ Port ${defaultServerPort} is available`, 'green'); - } - break; - } else if (lowerChoice === 'u' || lowerChoice === 'use') { - webPort = await promptForPort('web', defaultWebPort); - serverPort = await promptForPort('server', defaultServerPort, webPort); - log(`Using ports: Web=${webPort}, Server=${serverPort}`, 'blue'); - break; - } else if (lowerChoice === 'c' || lowerChoice === 'cancel') { - log('Cancelled.', 'yellow'); - process.exit(0); - } else { - log( - 'Invalid choice. Please enter k (kill), u (use different ports), or c (cancel).', - 'red' - ); - } - } - } else { - log(`✓ Port ${defaultWebPort} is available`, 'green'); - log(`✓ Port ${defaultServerPort} is available`, 'green'); - } - - // Build CORS origin env - const existing = (process.env.CORS_ORIGIN || '') - .split(',') - .map((o) => o.trim()) - .filter(Boolean) - .filter((o) => o !== '*'); - const origins = new Set(existing); - origins.add(`http://localhost:${webPort}`); - origins.add(`http://127.0.0.1:${webPort}`); - const corsOriginEnv = Array.from(origins).join(','); - - console.log(''); - - return { webPort, serverPort, corsOriginEnv }; -} - -/** - * Prompt for a specific port with validation - * @param {string} name - Port name (web/server) - * @param {number} defaultPort - Default port value - * @param {number} excludePort - Port to exclude (optional) - * @returns {Promise} - */ -async function promptForPort(name, defaultPort, excludePort = null) { - while (true) { - const input = await prompt(`Enter ${name} port (default ${defaultPort}): `); - const parsed = input.trim() ? parseInt(input.trim(), 10) : defaultPort; - - if (isNaN(parsed) || parsed < 1024 || parsed > 65535) { - log('Invalid port. Please enter a number between 1024 and 65535.', 'red'); - continue; - } - - if (excludePort && parsed === excludePort) { - log(`${name} port cannot be the same as the other port.`, 'red'); - continue; - } - - if (isPortInUse(parsed)) { - const pids = getProcessesOnPort(parsed); - log(`Port ${parsed} is already in use by process(es): ${pids.join(', ')}`, 'red'); - const useAnyway = await prompt('Use this port anyway? 
(y/n): '); - if (useAnyway.toLowerCase() !== 'y' && useAnyway.toLowerCase() !== 'yes') { - continue; - } - } - - return parsed; - } -} - -// ============================================================================= -// UI Components -// ============================================================================= - -/** - * Print the application header banner - * @param {string} title - Header title - */ -export function printHeader(title) { - console.log('╔═══════════════════════════════════════════════════════╗'); - console.log(`║ ${title.padEnd(45)}║`); - console.log('╚═══════════════════════════════════════════════════════╝'); - console.log(''); -} - -/** - * Print the application mode menu - * @param {object} options - Menu options - * @param {boolean} options.isDev - Whether this is dev mode (changes Docker option description) - */ -export function printModeMenu({ isDev = false } = {}) { - console.log('═══════════════════════════════════════════════════════'); - console.log(' Select Application Mode:'); - console.log('═══════════════════════════════════════════════════════'); - console.log(' 1) Web Application (Browser)'); - console.log(' 2) Desktop Application (Electron)'); - if (isDev) { - console.log(' 3) Docker Container (Dev with Live Reload)'); - console.log(' 4) Electron + Docker API (Local Electron, Container API)'); - } else { - console.log(' 3) Docker Container (Isolated)'); - } - console.log('═══════════════════════════════════════════════════════'); - console.log(''); -} - -// ============================================================================= -// Process Cleanup -// ============================================================================= - -/** - * Create a cleanup handler for spawned processes - * @param {object} processes - Object with process references {server, web, electron, docker} - * @returns {Function} - Cleanup function - */ -export function createCleanupHandler(processes) { - return async function cleanup() { - console.log('\nCleaning up...'); - - const killPromises = []; - - if (processes.server && !processes.server.killed && processes.server.pid) { - killPromises.push(killProcessTree(processes.server.pid)); - } - - if (processes.web && !processes.web.killed && processes.web.pid) { - killPromises.push(killProcessTree(processes.web.pid)); - } - - if (processes.electron && !processes.electron.killed && processes.electron.pid) { - killPromises.push(killProcessTree(processes.electron.pid)); - } - - if (processes.docker && !processes.docker.killed && processes.docker.pid) { - killPromises.push(killProcessTree(processes.docker.pid)); - } - - await Promise.all(killPromises); - }; -} - -/** - * Setup signal handlers for graceful shutdown - * @param {Function} cleanup - Cleanup function - */ -export function setupSignalHandlers(cleanup) { - let cleaningUp = false; - - const handleExit = async () => { - if (cleaningUp) return; - cleaningUp = true; - await cleanup(); - process.exit(0); - }; - - process.on('SIGINT', () => handleExit()); - process.on('SIGTERM', () => handleExit()); -} - -// ============================================================================= -// Server Startup -// ============================================================================= - -/** - * Start the backend server and wait for it to be ready - * @param {object} options - Configuration options - * @returns {Promise} - Server process - */ -export async function startServerAndWait({ serverPort, corsOriginEnv, npmArgs, cwd, fs, baseDir }) { - log(`Starting backend 
server on port ${serverPort}...`, 'blue'); - - // Create logs directory - const logsDir = path.join(baseDir, 'logs'); - if (!fs.existsSync(logsDir)) { - fs.mkdirSync(logsDir, { recursive: true }); - } - - const logStream = fs.createWriteStream(path.join(baseDir, 'logs', 'server.log')); - const serverProcess = runNpm( - npmArgs, - { - stdio: ['ignore', 'pipe', 'pipe'], - env: { - PORT: String(serverPort), - CORS_ORIGIN: corsOriginEnv, - }, - }, - cwd - ); - - // Pipe to both log file and console - serverProcess.stdout?.on('data', (data) => { - process.stdout.write(data); - logStream.write(data); - }); - serverProcess.stderr?.on('data', (data) => { - process.stderr.write(data); - logStream.write(data); - }); - - log('Waiting for server to be ready...', 'yellow'); - - // Wait for server health check - const maxRetries = 30; - let serverReady = false; - - for (let i = 0; i < maxRetries; i++) { - if (await checkHealth(serverPort)) { - serverReady = true; - break; - } - process.stdout.write('.'); - await sleep(1000); - } - - console.log(''); - - if (!serverReady) { - log('Error: Server failed to start', 'red'); - console.log('Check logs/server.log for details'); - - // Clean up the spawned server process that failed health check - if (serverProcess && !serverProcess.killed && serverProcess.pid) { - log('Terminating failed server process...', 'yellow'); - try { - await killProcessTree(serverProcess.pid); - } catch (killErr) { - // Fallback: try direct kill if tree-kill fails - try { - serverProcess.kill('SIGKILL'); - } catch { - // Process may have already exited - } - } - } - - // Close the log stream - logStream.end(); - - return null; - } - - log('✓ Server is ready!', 'green'); - return serverProcess; -} - -// ============================================================================= -// Dependencies -// ============================================================================= - -/** - * Ensure node_modules exists, install if not - * @param {object} fs - Restricted fs object - * @param {string} baseDir - Base directory - */ -export async function ensureDependencies(fs, baseDir) { - if (!fs.existsSync(path.join(baseDir, 'node_modules'))) { - log('Installing dependencies...', 'blue'); - const install = runNpm(['install'], { stdio: 'inherit' }, baseDir); - await new Promise((resolve, reject) => { - install.on('close', (code) => { - if (code === 0) resolve(); - else reject(new Error(`npm install failed with code ${code}`)); - }); - }); - } -} - -// ============================================================================= -// Docker Utilities -// ============================================================================= - -/** - * Sanitize a project name to be safe for use in shell commands and Docker image names. - * Converts to lowercase and removes any characters that aren't alphanumeric. 
- * @param {string} name - Project name to sanitize - * @returns {string} - Sanitized project name - */ -export function sanitizeProjectName(name) { - return name.toLowerCase().replace(/[^a-z0-9]/g, ''); -} - -/** - * Get the current git commit SHA - * @param {string} baseDir - Base directory of the git repository - * @returns {string|null} - Current commit SHA or null if not available - */ -export function getCurrentCommitSha(baseDir) { - try { - const sha = execSync('git rev-parse HEAD', { - encoding: 'utf-8', - cwd: baseDir, - stdio: ['pipe', 'pipe', 'pipe'], - }).trim(); - return sha || null; - } catch { - return null; - } -} - -/** - * Get the commit SHA from a Docker image label - * @param {string} imageName - Docker image name - * @returns {string|null} - Commit SHA from image label or null if not found - */ -export function getImageCommitSha(imageName) { - try { - const labelValue = execSync( - `docker image inspect ${imageName} --format "{{index .Config.Labels \\"automaker.git.commit.sha\\"}}" 2>/dev/null`, - { encoding: 'utf-8' } - ).trim(); - return labelValue && labelValue !== 'unknown' && labelValue !== '' ? labelValue : null; - } catch { - return null; - } -} - -/** - * Check if Docker images need to be rebuilt based on git commit SHA - * Compares the current git commit with the commit SHA stored in the image labels - * @param {string} baseDir - Base directory containing Dockerfile and docker-compose.yml - * @returns {{needsRebuild: boolean, reason: string, currentSha: string|null, imageSha: string|null}} - */ -export function shouldRebuildDockerImages(baseDir) { - try { - // Get current git commit SHA - const currentSha = getCurrentCommitSha(baseDir); - if (!currentSha) { - return { - needsRebuild: true, - reason: 'Could not determine current git commit', - currentSha: null, - imageSha: null, - }; - } - - // Get project name from docker-compose config, falling back to directory name - let projectName; - try { - const composeConfig = execSync('docker compose config --format json', { - encoding: 'utf-8', - cwd: baseDir, - }); - const config = JSON.parse(composeConfig); - projectName = config.name; - } catch { - // Fallback handled below - } - - // Sanitize project name - const sanitizedProjectName = sanitizeProjectName(projectName || path.basename(baseDir)); - const serverImageName = `${sanitizedProjectName}-server`; - const uiImageName = `${sanitizedProjectName}-ui`; - - // Check if images exist - const serverExists = checkImageExists(serverImageName); - const uiExists = checkImageExists(uiImageName); - - if (!serverExists || !uiExists) { - return { - needsRebuild: true, - reason: 'Docker images do not exist', - currentSha, - imageSha: null, - }; - } - - // Get commit SHA from server image (both should have the same) - const imageSha = getImageCommitSha(serverImageName); - - if (!imageSha) { - return { - needsRebuild: true, - reason: 'Docker images have no commit SHA label (legacy build)', - currentSha, - imageSha: null, - }; - } - - // Compare commit SHAs - if (currentSha !== imageSha) { - return { - needsRebuild: true, - reason: `Code changed: ${imageSha.substring(0, 8)} -> ${currentSha.substring(0, 8)}`, - currentSha, - imageSha, - }; - } - - return { - needsRebuild: false, - reason: 'Images are up to date', - currentSha, - imageSha, - }; - } catch (error) { - return { - needsRebuild: true, - reason: 'Could not check Docker image status', - currentSha: null, - imageSha: null, - }; - } -} - -/** - * Check if a Docker image exists - * @param {string} imageName - Docker image 
name - * @returns {boolean} - Whether the image exists - */ -function checkImageExists(imageName) { - try { - execSync(`docker image inspect ${imageName} 2>/dev/null`, { - encoding: 'utf-8', - stdio: ['pipe', 'pipe', 'pipe'], - }); - return true; - } catch { - return false; - } -} - -/** - * Launch Docker containers for development with live reload - * Uses docker-compose.dev.yml which volume mounts the source code - * Also includes docker-compose.override.yml if it exists (for workspace mounts) - * @param {object} options - Configuration options - * @param {string} options.baseDir - Base directory containing docker-compose.dev.yml - * @param {object} options.processes - Processes object to track docker process - * @returns {Promise} - */ -export async function launchDockerDevContainers({ baseDir, processes }) { - log('Launching Docker Container (Development Mode with Live Reload)...', 'blue'); - console.log(''); - - // Check if ANTHROPIC_API_KEY is set - if (!process.env.ANTHROPIC_API_KEY) { - log('Warning: ANTHROPIC_API_KEY environment variable is not set.', 'yellow'); - log('The server will require an API key to function.', 'yellow'); - log('Set it with: export ANTHROPIC_API_KEY=your-key', 'yellow'); - console.log(''); - } - - log('Starting development container...', 'yellow'); - log('Source code is volume mounted for live reload', 'yellow'); - log('Running npm install inside container (this may take a moment on first run)...', 'yellow'); - console.log(''); - - // Build compose file arguments - // Start with dev compose file, then add override if it exists - const composeArgs = ['compose', '-f', 'docker-compose.dev.yml']; - - // Check if docker-compose.override.yml exists and include it for workspace mounts - const overridePath = path.join(baseDir, 'docker-compose.override.yml'); - if (fsNative.existsSync(overridePath)) { - composeArgs.push('-f', 'docker-compose.override.yml'); - log('Using docker-compose.override.yml for workspace mount', 'yellow'); - } - - composeArgs.push('up', '--build'); - - // Use docker-compose.dev.yml for development - processes.docker = crossSpawn('docker', composeArgs, { - stdio: 'inherit', - cwd: baseDir, - env: { - ...process.env, - }, - }); - - log('Development container starting...', 'blue'); - log('UI will be available at: http://localhost:3007 (with HMR)', 'green'); - log('API will be available at: http://localhost:3008', 'green'); - console.log(''); - log('Changes to source files will automatically reload.', 'yellow'); - log('Press Ctrl+C to stop the container.', 'yellow'); - - await new Promise((resolve) => { - processes.docker.on('close', resolve); - }); -} - -/** - * Launch only the Docker server container for use with local Electron - * Uses docker-compose.dev-server.yml which only runs the backend API - * Also includes docker-compose.override.yml if it exists (for workspace mounts) - * Automatically launches Electron once the server is healthy. 
- * @param {object} options - Configuration options - * @param {string} options.baseDir - Base directory containing docker-compose.dev-server.yml - * @param {object} options.processes - Processes object to track docker process - * @returns {Promise} - */ -export async function launchDockerDevServerContainer({ baseDir, processes }) { - log('Launching Docker Server Container + Local Electron...', 'blue'); - console.log(''); - - // Check if ANTHROPIC_API_KEY is set - if (!process.env.ANTHROPIC_API_KEY) { - log('Warning: ANTHROPIC_API_KEY environment variable is not set.', 'yellow'); - log('The server will require an API key to function.', 'yellow'); - log('Set it with: export ANTHROPIC_API_KEY=your-key', 'yellow'); - console.log(''); - } - - log('Starting server container...', 'yellow'); - log('Source code is volume mounted for live reload', 'yellow'); - log('Running npm install inside container (this may take a moment on first run)...', 'yellow'); - console.log(''); - - // Build compose file arguments - // Start with dev-server compose file, then add override if it exists - const composeArgs = ['compose', '-f', 'docker-compose.dev-server.yml']; - - // Check if docker-compose.override.yml exists and include it for workspace mounts - const overridePath = path.join(baseDir, 'docker-compose.override.yml'); - if (fsNative.existsSync(overridePath)) { - composeArgs.push('-f', 'docker-compose.override.yml'); - log('Using docker-compose.override.yml for workspace mount', 'yellow'); - } - - composeArgs.push('up', '--build'); - - // Use docker-compose.dev-server.yml for server-only development - // Run with piped stdio so we can still see output but also run Electron - processes.docker = crossSpawn('docker', composeArgs, { - stdio: 'inherit', - cwd: baseDir, - env: { - ...process.env, - }, - }); - - log('Server container starting...', 'blue'); - log('API will be available at: http://localhost:3008', 'green'); - console.log(''); - - // Wait for the server to become healthy - log('Waiting for server to be ready...', 'yellow'); - const serverPort = 3008; - const maxRetries = 120; // 2 minutes (first run may need npm install + build) - let serverReady = false; - - for (let i = 0; i < maxRetries; i++) { - if (await checkHealth(serverPort)) { - serverReady = true; - break; - } - await sleep(1000); - // Show progress dots every 5 seconds - if (i > 0 && i % 5 === 0) { - process.stdout.write('.'); - } - } - - if (!serverReady) { - console.log(''); - log('Error: Server container failed to become healthy', 'red'); - log('Check the Docker logs above for errors', 'red'); - return; - } - - console.log(''); - log('Server is ready! 
Launching Electron...', 'green'); - console.log(''); - - // Build shared packages before launching Electron - log('Building shared packages...', 'blue'); - try { - await runNpmAndWait(['run', 'build:packages'], { stdio: 'inherit' }, baseDir); - } catch (error) { - log('Failed to build packages: ' + error.message, 'red'); - return; - } - - // Launch Electron with SKIP_EMBEDDED_SERVER=true - // This tells Electron to connect to the external Docker server instead of starting its own - processes.electron = crossSpawn('npm', ['run', '_dev:electron'], { - stdio: 'inherit', - cwd: baseDir, - env: { - ...process.env, - SKIP_EMBEDDED_SERVER: 'true', - PORT: '3008', - VITE_SERVER_URL: 'http://localhost:3008', - VITE_APP_MODE: '4', - }, - }); - - log('Electron launched with SKIP_EMBEDDED_SERVER=true', 'green'); - log('Changes to server source files will automatically reload.', 'yellow'); - log('Press Ctrl+C to stop both Electron and the container.', 'yellow'); - console.log(''); - - // Wait for either process to exit - await Promise.race([ - new Promise((resolve) => processes.docker.on('close', resolve)), - new Promise((resolve) => processes.electron.on('close', resolve)), - ]); -} - -/** - * Launch Docker containers with docker-compose (production mode) - * Uses git commit SHA to determine if rebuild is needed - * @param {object} options - Configuration options - * @param {string} options.baseDir - Base directory containing docker-compose.yml - * @param {object} options.processes - Processes object to track docker process - * @returns {Promise} - */ -export async function launchDockerContainers({ baseDir, processes }) { - log('Launching Docker Container (Isolated Mode)...', 'blue'); - - // Check if ANTHROPIC_API_KEY is set - if (!process.env.ANTHROPIC_API_KEY) { - log('Warning: ANTHROPIC_API_KEY environment variable is not set.', 'yellow'); - log('The server will require an API key to function.', 'yellow'); - log('Set it with: export ANTHROPIC_API_KEY=your-key', 'yellow'); - console.log(''); - } - - // Check if rebuild is needed based on git commit SHA - const rebuildCheck = shouldRebuildDockerImages(baseDir); - - if (rebuildCheck.needsRebuild) { - log(`Rebuild needed: ${rebuildCheck.reason}`, 'yellow'); - - if (rebuildCheck.currentSha) { - log(`Building images for commit: ${rebuildCheck.currentSha.substring(0, 8)}`, 'blue'); - } - console.log(''); - - // Build with commit SHA label - const buildArgs = ['compose', 'build']; - if (rebuildCheck.currentSha) { - buildArgs.push('--build-arg', `GIT_COMMIT_SHA=${rebuildCheck.currentSha}`); - } - - const buildProcess = crossSpawn('docker', buildArgs, { - stdio: 'inherit', - cwd: baseDir, - }); - - await new Promise((resolve, reject) => { - buildProcess.on('close', (code) => { - if (code !== 0) { - log('Build failed. Exiting.', 'red'); - reject(new Error(`Docker build failed with code ${code}`)); - } else { - log('Build complete. 
Starting containers...', 'green'); - console.log(''); - resolve(); - } - }); - buildProcess.on('error', (err) => reject(err)); - }); - - // Start containers (already built above) - processes.docker = crossSpawn('docker', ['compose', 'up'], { - stdio: 'inherit', - cwd: baseDir, - env: { - ...process.env, - }, - }); - } else { - log( - `Images are up to date (commit: ${rebuildCheck.currentSha?.substring(0, 8) || 'unknown'})`, - 'green' - ); - log('Starting Docker containers...', 'yellow'); - console.log(''); - - // Start containers without rebuilding - processes.docker = crossSpawn('docker', ['compose', 'up'], { - stdio: 'inherit', - cwd: baseDir, - env: { - ...process.env, - }, - }); - } - - log('Docker containers starting...', 'blue'); - log('UI will be available at: http://localhost:3007', 'green'); - log('API will be available at: http://localhost:3008', 'green'); - console.log(''); - log('Press Ctrl+C to stop the containers.', 'yellow'); - - await new Promise((resolve) => { - processes.docker.on('close', resolve); - }); -} diff --git a/start-automaker.sh b/start-automaker.sh new file mode 100755 index 00000000..a2d3e54c --- /dev/null +++ b/start-automaker.sh @@ -0,0 +1,1342 @@ +#!/bin/bash +# Automaker TUI Launcher - Interactive menu for launching Automaker in different modes +# Supports: Web Browser, Desktop (Electron), Docker Dev, Electron + Docker API +# Platforms: Linux, macOS, Windows (Git Bash, WSL, MSYS2, Cygwin) +# Features: Terminal responsiveness, history, pre-flight checks, port management + +set -e + +# ============================================================================ +# CONFIGURATION & CONSTANTS +# ============================================================================ + +APP_NAME="Automaker" +SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)" +HISTORY_FILE="${HOME}/.automaker_launcher_history" +MIN_TERM_WIDTH=70 +MIN_TERM_HEIGHT=20 +MENU_BOX_WIDTH=66 +MENU_INNER_WIDTH=64 +LOGO_WIDTH=52 +INPUT_TIMEOUT=30 +SELECTED_OPTION=1 +MAX_OPTIONS=4 + +# Platform detection (set early for cross-platform compatibility) +IS_WINDOWS=false +IS_MACOS=false +if [[ "$OSTYPE" == "msys" || "$OSTYPE" == "cygwin" || "$OSTYPE" == "mingw"* ]]; then + IS_WINDOWS=true +elif [[ "$OSTYPE" == "darwin"* ]]; then + IS_MACOS=true +fi + +# Port configuration +DEFAULT_WEB_PORT=3007 +DEFAULT_SERVER_PORT=3008 +WEB_PORT=$DEFAULT_WEB_PORT +SERVER_PORT=$DEFAULT_SERVER_PORT + +# Extract VERSION from apps/ui/package.json (the actual app version, not monorepo version) +if command -v node &> /dev/null; then + VERSION="v$(node -p "require('$SCRIPT_DIR/apps/ui/package.json').version" 2>/dev/null || echo "0.11.0")" +else + VERSION=$(grep '"version"' "$SCRIPT_DIR/apps/ui/package.json" 2>/dev/null | head -1 | sed 's/.*"version"[^"]*"\([^"]*\)".*/v\1/' || echo "v0.11.0") +fi + +# ANSI Color codes (256-color palette) +ESC=$(printf '\033') +RESET="${ESC}[0m" +BOLD="${ESC}[1m" +DIM="${ESC}[2m" + +C_PRI="${ESC}[38;5;51m" # Primary cyan +C_SEC="${ESC}[38;5;39m" # Secondary blue +C_ACC="${ESC}[38;5;33m" # Accent darker blue +C_GREEN="${ESC}[38;5;118m" # Green +C_RED="${ESC}[38;5;196m" # Red +C_YELLOW="${ESC}[38;5;226m" # Yellow +C_GRAY="${ESC}[38;5;240m" # Dark gray +C_WHITE="${ESC}[38;5;255m" # White +C_MUTE="${ESC}[38;5;248m" # Muted gray + +# ============================================================================ +# ARGUMENT PARSING +# ============================================================================ + +MODE="" +USE_COLORS=true +CHECK_DEPS=false +NO_HISTORY=false +PRODUCTION_MODE=false + 
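+# Example: the defaults above are overwritten by parse_args below, e.g.
+#   ./start-automaker.sh electron --production --no-history
+# runs with MODE="electron", PRODUCTION_MODE=true, NO_HISTORY=true.
+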
+show_help() { + cat << 'EOF' +Automaker TUI Launcher - Interactive development environment starter + +USAGE: + start-automaker.sh [MODE] [OPTIONS] + +MODES: + web Launch in web browser (localhost:3007) + electron Launch as desktop app (Electron) + docker Launch in Docker container (dev with live reload) + docker-electron Launch Electron with Docker API backend + +OPTIONS: + --help Show this help message + --version Show version information + --no-colors Disable colored output + --check-deps Check dependencies before launching + --no-history Don't remember last choice + --production Run in production mode (builds first, faster React) + +EXAMPLES: + start-automaker.sh # Interactive menu (development) + start-automaker.sh --production # Interactive menu (production) + start-automaker.sh web # Launch web mode directly (dev) + start-automaker.sh web --production # Launch web mode (production) + start-automaker.sh electron # Launch desktop app directly + start-automaker.sh docker # Launch Docker dev container + start-automaker.sh --version # Show version + +KEYBOARD SHORTCUTS (in menu): + Up/Down arrows Navigate between options + Enter Select highlighted option + 1-4 Jump to and select mode + Q Exit + +HISTORY: + Your last selected mode is remembered in: ~/.automaker_launcher_history + Use --no-history to disable this feature + +PLATFORMS: + Linux, macOS, Windows (Git Bash, WSL, MSYS2, Cygwin) + +EOF +} + +show_version() { + echo "Automaker Launcher $VERSION" + echo "Node.js: $(node -v 2>/dev/null || echo 'not installed')" + echo "Bash: ${BASH_VERSION%.*}" +} + +parse_args() { + while [[ $# -gt 0 ]]; do + case "$1" in + --help) + show_help + exit 0 + ;; + --version) + show_version + exit 0 + ;; + --no-colors) + USE_COLORS=false + RESET="" + C_PRI="" C_SEC="" C_ACC="" C_GREEN="" C_RED="" C_YELLOW="" C_GRAY="" C_WHITE="" C_MUTE="" + ;; + --check-deps) + CHECK_DEPS=true + ;; + --no-history) + NO_HISTORY=true + ;; + --production) + PRODUCTION_MODE=true + ;; + web|electron|docker|docker-electron) + MODE="$1" + ;; + *) + echo "Unknown option: $1" >&2 + echo "Use --help for usage information" >&2 + exit 1 + ;; + esac + shift + done +} + +# ============================================================================ +# PRE-FLIGHT CHECKS +# ============================================================================ + +check_platform() { + # Platform already detected at script start + # This function is kept for any additional platform-specific checks + if [ "$IS_WINDOWS" = true ]; then + # Check if running in a proper terminal + if [ -z "$TERM" ]; then + echo "${C_YELLOW}Warning:${RESET} Running on Windows without proper terminal." + echo "For best experience, use Git Bash, WSL, or Windows Terminal." + fi + fi +} + +check_required_commands() { + local missing=() + + # Check for required commands + for cmd in node npm tput; do + if ! command -v "$cmd" &> /dev/null; then + missing+=("$cmd") + fi + done + + if [ ${#missing[@]} -gt 0 ]; then + echo "${C_RED}Error:${RESET} Missing required commands: ${missing[*]}" + echo "" + echo "Please install:" + for cmd in "${missing[@]}"; do + case "$cmd" in + node|npm) echo " - Node.js (includes npm) from https://nodejs.org/" ;; + tput) echo " - ncurses package (usually pre-installed on Unix systems)" ;; + esac + done + exit 1 + fi +} + +DOCKER_CMD="docker" + +check_docker() { + if ! 
command -v docker &> /dev/null; then
+        echo "${C_RED}Error:${RESET} Docker is not installed or not in PATH"
+        echo "Please install Docker from https://docs.docker.com/get-docker/"
+        return 1
+    fi
+
+    if ! docker info &> /dev/null 2>&1; then
+        if sg docker -c "docker info" &> /dev/null 2>&1; then
+            DOCKER_CMD="sg docker -c"
+        else
+            echo "${C_RED}Error:${RESET} Docker daemon is not running or not accessible"
+            echo ""
+            echo "To fix, run:"
+            echo "  sudo usermod -aG docker \$USER"
+            echo ""
+            echo "Then either log out and back in, or run:"
+            echo "  newgrp docker"
+            return 1
+        fi
+    fi
+
+    export DOCKER_CMD
+    return 0
+}
+
+check_running_electron() {
+    local electron_pids=""
+
+    if [ "$IS_WINDOWS" = true ]; then
+        # Windows: look for electron.exe or Automaker.exe
+        electron_pids=$(tasklist 2>/dev/null | grep -iE "electron|automaker" | awk '{print $2}' | tr '\n' ' ' || true)
+    else
+        # Unix: look for electron or Automaker processes
+        electron_pids=$(pgrep -f "electron.*automaker|Automaker" 2>/dev/null | tr '\n' ' ' || true)
+    fi
+
+    if [ -n "$electron_pids" ] && [ "$electron_pids" != " " ]; then
+        get_term_size
+        echo ""
+        center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY"
+        center_print "Running Electron App Detected" "$C_YELLOW"
+        center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY"
+        echo ""
+        center_print "Electron process(es): $electron_pids" "$C_MUTE"
+        echo ""
+        center_print "What would you like to do?" "$C_WHITE"
+        echo ""
+        center_print "[K] Kill Electron and continue" "$C_GREEN"
+        center_print "[I] Ignore and continue anyway" "$C_MUTE"
+        center_print "[C] Cancel" "$C_RED"
+        echo ""
+
+        while true; do
+            local choice_pad=$(( (TERM_COLS - 20) / 2 ))
+            printf "%${choice_pad}s" ""
+            read -r -p "Choice: " choice
+
+            case "$choice" in
+                [kK]|[kK][iI][lL][lL])
+                    echo ""
+                    center_print "Killing Electron processes..." "$C_YELLOW"
+                    if [ "$IS_WINDOWS" = true ]; then
+                        taskkill //F //IM "electron.exe" 2>/dev/null || true
+                        taskkill //F //IM "Automaker.exe" 2>/dev/null || true
+                    else
+                        pkill -f "electron.*automaker" 2>/dev/null || true
+                        pkill -f "Automaker" 2>/dev/null || true
+                    fi
+                    sleep 1
+                    center_print "✓ Electron stopped" "$C_GREEN"
+                    echo ""
+                    return 0
+                    ;;
+                [iI]|[iI][gG][nN][oO][rR][eE])
+                    echo ""
+                    center_print "Continuing without stopping Electron..." "$C_MUTE"
+                    echo ""
+                    return 0
+                    ;;
+                [cC]|[cC][aA][nN][cC][eE][lL])
+                    echo ""
+                    center_print "Cancelled." "$C_MUTE"
+                    echo ""
+                    exit 0
+                    ;;
+                *)
+                    center_print "Invalid choice. Please enter K, I, or C." "$C_RED"
+                    ;;
+            esac
+        done
+    fi
+
+    return 0
+}
+
+check_running_containers() {
+    local compose_file="$1"
+    local running_containers=""
+
+    # Get list of running automaker containers
+    if [ "$DOCKER_CMD" = "sg docker -c" ]; then
+        running_containers=$(sg docker -c "docker ps --filter 'name=automaker-dev' --format '{{.Names}}'" 2>/dev/null | tr '\n' ' ' || true)
+    else
+        running_containers=$($DOCKER_CMD ps --filter "name=automaker-dev" --format "{{.Names}}" 2>/dev/null | tr '\n' ' ' || true)
+    fi
+
+    if [ -n "$running_containers" ] && [ "$running_containers" != " " ]; then
+        get_term_size
+        echo ""
+        center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY"
+        center_print "Existing Containers Detected" "$C_YELLOW"
+        center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY"
+        echo ""
+        center_print "Running containers: $running_containers" "$C_MUTE"
+        echo ""
+        center_print "What would you like to do?" "$C_WHITE"
+        echo ""
+        center_print "[S] Stop containers and start fresh" "$C_GREEN"
+        center_print "[R] Restart containers (rebuild)" "$C_MUTE"
+        center_print "[A] Attach to existing containers" "$C_MUTE"
+        center_print "[C] Cancel" "$C_RED"
+        echo ""
+
+        while true; do
+            local choice_pad=$(( (TERM_COLS - 20) / 2 ))
+            printf "%${choice_pad}s" ""
+            read -r -p "Choice: " choice
+
+            case "$choice" in
+                [sS]|[sS][tT][oO][pP])
+                    echo ""
+                    center_print "Stopping existing containers..." "$C_YELLOW"
+                    if [ "$DOCKER_CMD" = "sg docker -c" ]; then
+                        sg docker -c "docker compose -f '$compose_file' down" 2>/dev/null || true
+                        sg docker -c "docker ps --filter 'name=automaker-dev' -q" 2>/dev/null | xargs -r sg docker -c "docker stop" 2>/dev/null || true
+                    else
+                        $DOCKER_CMD compose -f "$compose_file" down 2>/dev/null || true
+                        $DOCKER_CMD ps --filter "name=automaker-dev" -q 2>/dev/null | xargs -r $DOCKER_CMD stop 2>/dev/null || true
+                    fi
+                    center_print "✓ Containers stopped" "$C_GREEN"
+                    echo ""
+                    return 0  # Continue with fresh start
+                    ;;
+                [rR]|[rR][eE][sS][tT][aA][rR][tT])
+                    echo ""
+                    center_print "Stopping and rebuilding containers..." "$C_YELLOW"
+                    if [ "$DOCKER_CMD" = "sg docker -c" ]; then
+                        sg docker -c "docker compose -f '$compose_file' down" 2>/dev/null || true
+                    else
+                        $DOCKER_CMD compose -f "$compose_file" down 2>/dev/null || true
+                    fi
+                    center_print "✓ Ready to rebuild" "$C_GREEN"
+                    echo ""
+                    return 0  # Continue with rebuild
+                    ;;
+                [aA]|[aA][tT][tT][aA][cC][hH])
+                    echo ""
+                    center_print "Attaching to existing containers..." "$C_GREEN"
+                    echo ""
+                    return 2  # Special code for attach
+                    ;;
+                [cC]|[cC][aA][nN][cC][eE][lL])
+                    echo ""
+                    center_print "Cancelled." "$C_MUTE"
+                    echo ""
+                    exit 0
+                    ;;
+                *)
+                    center_print "Invalid choice. Please enter S, R, A, or C." "$C_RED"
+                    ;;
+            esac
+        done
+    fi
+
+    return 0  # No containers running, continue normally
+}
+
+check_dependencies() {
+    if [ "$CHECK_DEPS" = false ]; then
+        return 0
+    fi
+
+    echo "${C_MUTE}Checking project dependencies...${RESET}"
+
+    if [ ! -d "node_modules" ]; then
+        echo "${C_YELLOW}⚠${RESET} node_modules not found. Run 'npm install' before launching."
+        return 1
+    fi
+
+    if [ ! -f "package-lock.json" ]; then
+        echo "${C_YELLOW}⚠${RESET} package-lock.json not found."
+    fi
+
+    return 0
+}
+
+# ============================================================================
+# PORT MANAGEMENT (Cross-platform)
+# ============================================================================
+
+get_pids_on_port() {
+    local port=$1
+
+    if [ "$IS_WINDOWS" = true ]; then
+        # Windows: use netstat
+        netstat -ano 2>/dev/null | grep ":$port " | grep "LISTENING" | awk '{print $5}' | sort -u | tr '\n' ' ' || true
+    else
+        # Unix: use lsof
+        lsof -ti:"$port" 2>/dev/null || true
+    fi
+}
+
+is_port_in_use() {
+    local port=$1
+    local pids
+    pids=$(get_pids_on_port "$port")
+    [ -n "$pids" ] && [ "$pids" != " " ]
+}
+
+kill_port() {
+    local port=$1
+    local pids
+    pids=$(get_pids_on_port "$port")
+
+    if [ -z "$pids" ] || [ "$pids" = " " ]; then
+        echo "${C_GREEN}✓${RESET} Port $port is available"
+        return 0
+    fi
+
+    echo "${C_YELLOW}Killing process(es) on port $port: $pids${RESET}"
+
+    if [ "$IS_WINDOWS" = true ]; then
+        # Windows: use taskkill
+        for pid in $pids; do
+            taskkill //F //PID "$pid" 2>/dev/null || true
+        done
+    else
+        # Unix: use kill
+        echo "$pids" | xargs kill -9 2>/dev/null || true
+    fi
+
+    # Wait for port to be freed
+    local i=0
+    while [ $i -lt 10 ]; do
+        sleep 0.5 2>/dev/null || sleep 1
+        if ! is_port_in_use "$port"; then
+            echo "${C_GREEN}✓${RESET} Port $port is now free"
+            return 0
+        fi
+        i=$((i + 1))
+    done
+
+    echo "${C_RED}Warning:${RESET} Port $port may still be in use"
+    return 1
+}
+
+check_ports() {
+    show_cursor
+    stty echo icanon 2>/dev/null || true
+
+    local web_in_use=false
+    local server_in_use=false
+
+    if is_port_in_use "$DEFAULT_WEB_PORT"; then
+        web_in_use=true
+    fi
+    if is_port_in_use "$DEFAULT_SERVER_PORT"; then
+        server_in_use=true
+    fi
+
+    if [ "$web_in_use" = true ] || [ "$server_in_use" = true ]; then
+        echo ""
+        if [ "$web_in_use" = true ]; then
+            local pids
+            pids=$(get_pids_on_port "$DEFAULT_WEB_PORT")
+            echo "${C_YELLOW}⚠${RESET} Port $DEFAULT_WEB_PORT is in use by process(es): $pids"
+        fi
+        if [ "$server_in_use" = true ]; then
+            local pids
+            pids=$(get_pids_on_port "$DEFAULT_SERVER_PORT")
+            echo "${C_YELLOW}⚠${RESET} Port $DEFAULT_SERVER_PORT is in use by process(es): $pids"
+        fi
+        echo ""
+
+        while true; do
+            read -r -p "What would you like to do? (k)ill processes, (u)se different ports, or (c)ancel: " choice
+            case "$choice" in
+                [kK]|[kK][iI][lL][lL])
+                    if [ "$web_in_use" = true ]; then
+                        kill_port "$DEFAULT_WEB_PORT"
+                    else
+                        echo "${C_GREEN}✓${RESET} Port $DEFAULT_WEB_PORT is available"
+                    fi
+                    if [ "$server_in_use" = true ]; then
+                        kill_port "$DEFAULT_SERVER_PORT"
+                    else
+                        echo "${C_GREEN}✓${RESET} Port $DEFAULT_SERVER_PORT is available"
+                    fi
+                    break
+                    ;;
+                [uU]|[uU][sS][eE])
+                    read -r -p "Enter web port (default $DEFAULT_WEB_PORT): " input_web
+                    WEB_PORT=${input_web:-$DEFAULT_WEB_PORT}
+                    read -r -p "Enter server port (default $DEFAULT_SERVER_PORT): " input_server
+                    SERVER_PORT=${input_server:-$DEFAULT_SERVER_PORT}
+                    echo "${C_GREEN}Using ports: Web=$WEB_PORT, Server=$SERVER_PORT${RESET}"
+                    break
+                    ;;
+                [cC]|[cC][aA][nN][cC][eE][lL])
+                    echo "${C_MUTE}Cancelled.${RESET}"
+                    exit 0
+                    ;;
+                *)
+                    echo "${C_RED}Invalid choice. Please enter k, u, or c.${RESET}"
+                    ;;
+            esac
+        done
+        echo ""
+    else
+        echo "${C_GREEN}✓${RESET} Port $DEFAULT_WEB_PORT is available"
+        echo "${C_GREEN}✓${RESET} Port $DEFAULT_SERVER_PORT is available"
+    fi
+
+    hide_cursor
+    stty -echo -icanon 2>/dev/null || true
+}
+
+validate_terminal_size() {
+    if [ "$USE_COLORS" = false ]; then
+        return 0
+    fi
+
+    local term_width term_height
+    term_width=$(tput cols 2>/dev/null || echo 80)
+    term_height=$(tput lines 2>/dev/null || echo 24)
+
+    if [ "$term_width" -lt "$MIN_TERM_WIDTH" ] || [ "$term_height" -lt "$MIN_TERM_HEIGHT" ]; then
+        echo "${C_YELLOW}⚠${RESET} Terminal size ${term_width}x${term_height} is smaller than recommended ${MIN_TERM_WIDTH}x${MIN_TERM_HEIGHT}"
+        echo "  Some elements may not display correctly."
+        echo ""
+        return 0  # warn only; a non-zero status here would abort the script under set -e
+    fi
+}
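+
+# Hypothetical wrapper, defined purely as a usage illustration of the port API
+# above (nothing in this script calls it):
+ensure_port_free() {
+    local port=$1
+    if is_port_in_use "$port"; then
+        kill_port "$port" || return 1    # kills listeners, then waits up to ~5s for release
+    fi
+}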
+
+# ============================================================================
+# CURSOR & CLEANUP
+# ============================================================================
+
+hide_cursor() {
+    # `if` rather than `[ ... ] &&` so a false guard does not return non-zero under set -e
+    if [ "$USE_COLORS" = true ]; then printf "${ESC}[?25l"; fi
+}
+
+show_cursor() {
+    if [ "$USE_COLORS" = true ]; then printf "${ESC}[?25h"; fi
+}
+
+cleanup() {
+    show_cursor
+    # Restore terminal settings (echo and canonical mode)
+    stty echo icanon 2>/dev/null || true
+    # Kill server process if running in production mode
+    if [ -n "${SERVER_PID:-}" ]; then
+        kill $SERVER_PID 2>/dev/null || true
+    fi
+    printf "${RESET}\n"
+}
+
+trap cleanup EXIT INT TERM
+
+# ============================================================================
+# TERMINAL SIZE & UI UTILITIES
+# ============================================================================
+
+get_term_size() {
+    TERM_COLS=$(tput cols 2>/dev/null || echo 80)
+    TERM_LINES=$(tput lines 2>/dev/null || echo 24)
+}
+
+center_text() {
+    local text="$1"
+    local len=${#text}
+    local pad=$(( (TERM_COLS - len) / 2 ))
+    printf "%${pad}s%s\n" "" "$text"
+}
+
+draw_line() {
+    local char="${1:-─}"
+    local color="${2:-$C_GRAY}"
+    local width="${3:-58}"
+    printf "${color}"
+    local i
+    for ((i=0; i<width; i++)); do printf "%s" "$char"; done
+    printf "${RESET}\n"
+}
+
+# Minimal header/menu renderers: a centered title stands in where the full
+# ASCII-art logo (sized against LOGO_WIDTH/MENU_BOX_WIDTH) would be drawn.
+show_header() {
+    get_term_size
+    clear 2>/dev/null || true
+    echo ""
+    center_text "${C_PRI}${BOLD}${APP_NAME} ${VERSION}${RESET}"
+    center_text "${C_MUTE}Interactive Launcher${RESET}"
+    echo ""
+}
+
+show_menu() {
+    local -a labels=("Web Browser" "Desktop App (Electron)" "Docker Dev" "Electron + Docker API")
+    local i
+    for ((i=1; i<=MAX_OPTIONS; i++)); do
+        if [ "$i" -eq "$SELECTED_OPTION" ]; then
+            center_text "${C_PRI}${BOLD}> ${i}. ${labels[$((i-1))]}${RESET}"
+        else
+            center_text "${C_MUTE}  ${i}. ${labels[$((i-1))]}${RESET}"
+        fi
+    done
+    echo ""
+    center_text "${DIM}Up/Down + Enter, 1-4 to select, Q to quit${RESET}"
+    if [ "$NO_HISTORY" = false ]; then
+        local last_mode
+        last_mode=$(get_last_mode_from_history 2>/dev/null || echo "")
+        if [ -n "$last_mode" ]; then
+            local hint_text="(Last: $last_mode)"
+            local h_pad=$(( (TERM_COLS - ${#hint_text}) / 2 ))
+            printf "%${h_pad}s" ""
+            echo -e "${DIM}${hint_text}${RESET}"
+        fi
+    fi
+}
+
+# ============================================================================
+# SPINNER & INITIALIZATION
+# ============================================================================
+
+spinner() {
+    local text="$1"
+    local -a frames=('⠋' '⠙' '⠹' '⠸' '⠼' '⠴' '⠦' '⠧' '⠇' '⠏')
+    local i=0
+    local count=0
+    local max_frames=20  # Max 2 seconds
+
+    # Ensure TERM_COLS is set
+    [ -z "$TERM_COLS" ] && TERM_COLS=80
+
+    while [ $count -lt $max_frames ]; do
+        local len=${#text}
+        local pad_left=$(( (TERM_COLS - len - 4) / 2 ))
+        [ $pad_left -lt 0 ] && pad_left=0
+        printf "\r%${pad_left}s${C_PRI}${frames[$i]}${RESET} ${C_WHITE}%s${RESET}" "" "$text"
+        i=$(( (i + 1) % ${#frames[@]} ))
+        count=$((count + 1))
+        sleep 0.1 2>/dev/null || sleep 1
+    done
+
+    local len=${#text}
+    local pad_left=$(( (TERM_COLS - len - 4) / 2 ))
+    [ $pad_left -lt 0 ] && pad_left=0
+    printf "\r%${pad_left}s${C_GREEN}✓${RESET} ${C_WHITE}%s${RESET}  \n" "" "$text"
+}
+
+center_print() {
+    local text="$1"
+    local color="${2:-}"
+    local len=${#text}
+    local pad=$(( (TERM_COLS - len) / 2 ))
+    [ $pad -lt 0 ] && pad=0
+    printf "%${pad}s${color}%s${RESET}\n" "" "$text"
+}
+
+resolve_port_conflicts() {
+    # Ensure terminal is in proper state for input
+    show_cursor
+    stty echo icanon 2>/dev/null || true
+
+    local web_in_use=false
+    local server_in_use=false
+    local web_pids=""
+    local server_pids=""
+
+    if is_port_in_use "$DEFAULT_WEB_PORT"; then
+        web_in_use=true
+        web_pids=$(get_pids_on_port "$DEFAULT_WEB_PORT")
+    fi
+    if is_port_in_use "$DEFAULT_SERVER_PORT"; then
+        server_in_use=true
+        server_pids=$(get_pids_on_port "$DEFAULT_SERVER_PORT")
+    fi
+
+    if [ "$web_in_use" = true ] || [ "$server_in_use" = true ]; then
+        echo ""
+        if [ "$web_in_use" = true ]; then
+            center_print "⚠ Port $DEFAULT_WEB_PORT is in use by process(es): $web_pids" "$C_YELLOW"
+        fi
+        if [ "$server_in_use" = true ]; then
+            center_print "⚠ Port $DEFAULT_SERVER_PORT is in use by process(es): $server_pids" "$C_YELLOW"
+        fi
+        echo ""
+
+        # Show options
+        center_print "What would you like to do?" "$C_WHITE"
+        echo ""
+        center_print "[K] Kill processes and continue" "$C_GREEN"
+        center_print "[U] Use different ports" "$C_MUTE"
+        center_print "[C] Cancel" "$C_RED"
+        echo ""
+
+        while true; do
+            local choice_pad=$(( (TERM_COLS - 20) / 2 ))
+            printf "%${choice_pad}s" ""
+            read -r -p "Choice: " choice
+
+            case "$choice" in
+                [kK]|[kK][iI][lL][lL])
+                    echo ""
+                    if [ "$web_in_use" = true ]; then
+                        center_print "Killing process(es) on port $DEFAULT_WEB_PORT..." "$C_YELLOW"
+                        kill_port "$DEFAULT_WEB_PORT" > /dev/null 2>&1 || true
+                        center_print "✓ Port $DEFAULT_WEB_PORT is now free" "$C_GREEN"
+                    fi
+                    if [ "$server_in_use" = true ]; then
+                        center_print "Killing process(es) on port $DEFAULT_SERVER_PORT..." "$C_YELLOW"
+                        kill_port "$DEFAULT_SERVER_PORT" > /dev/null 2>&1 || true
+                        center_print "✓ Port $DEFAULT_SERVER_PORT is now free" "$C_GREEN"
+                    fi
+                    break
+                    ;;
+                [uU]|[uU][sS][eE])
+                    echo ""
+                    local input_pad=$(( (TERM_COLS - 40) / 2 ))
+                    printf "%${input_pad}s" ""
+                    read -r -p "Enter web port (default $DEFAULT_WEB_PORT): " input_web
+                    WEB_PORT=${input_web:-$DEFAULT_WEB_PORT}
+                    printf "%${input_pad}s" ""
+                    read -r -p "Enter server port (default $DEFAULT_SERVER_PORT): " input_server
+                    SERVER_PORT=${input_server:-$DEFAULT_SERVER_PORT}
+                    center_print "Using ports: Web=$WEB_PORT, Server=$SERVER_PORT" "$C_GREEN"
+                    break
+                    ;;
+                [cC]|[cC][aA][nN][cC][eE][lL])
+                    echo ""
+                    center_print "Cancelled." "$C_MUTE"
+                    echo ""
+                    exit 0
+                    ;;
+                *)
+                    center_print "Invalid choice. Please enter K, U, or C." "$C_RED"
+                    ;;
+            esac
+        done
+    else
+        center_print "✓ Port $DEFAULT_WEB_PORT is available" "$C_GREEN"
+        center_print "✓ Port $DEFAULT_SERVER_PORT is available" "$C_GREEN"
+    fi
+
+    # Restore terminal state
+    hide_cursor
+    stty -echo -icanon 2>/dev/null || true
+}
+
+launch_sequence() {
+    local mode_name="$1"
+
+    # Ensure terminal size is available
+    get_term_size
+
+    echo ""
+
+    # Show port checking for modes that use local ports
+    if [[ "$MODE" == "web" || "$MODE" == "electron" ]]; then
+        center_print "Checking ports ${DEFAULT_WEB_PORT} and ${DEFAULT_SERVER_PORT}..." "$C_MUTE"
+        resolve_port_conflicts
+        echo ""
+    fi
+
+    spinner "Initializing environment..."
+    spinner "Starting $mode_name..."
+
+    echo ""
+    local msg="Automaker is ready!"
+ local pad=$(( (TERM_COLS - ${#msg}) / 2 )) + printf "%${pad}s${C_GREEN}${BOLD}%s${RESET}\n" "" "$msg" + + case "$MODE" in + web) + local url="http://localhost:$WEB_PORT" + local upad=$(( (TERM_COLS - ${#url} - 10) / 2 )) + echo "" + printf "%${upad}s${DIM}Opening ${C_SEC}%s${RESET}\n" "" "$url" + ;; + docker|docker-electron) + echo "" + local ui_msg="UI: http://localhost:$DEFAULT_WEB_PORT" + local api_msg="API: http://localhost:$DEFAULT_SERVER_PORT" + center_text "${DIM}${ui_msg}${RESET}" + center_text "${DIM}${api_msg}${RESET}" + ;; + esac + echo "" +} + +# ============================================================================ +# HISTORY MANAGEMENT +# ============================================================================ + +save_mode_to_history() { + if [ "$NO_HISTORY" = false ]; then + echo "$1" > "$HISTORY_FILE" + fi +} + +get_last_mode_from_history() { + if [ -f "$HISTORY_FILE" ] && [ "$NO_HISTORY" = false ]; then + cat "$HISTORY_FILE" + fi +} + +# ============================================================================ +# PRODUCTION BUILD +# ============================================================================ + +build_for_production() { + echo "" + center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY" + center_print "Building for Production" "$C_PRI" + center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY" + echo "" + + center_print "Building shared packages..." "$C_YELLOW" + if ! npm run build:packages; then + center_print "✗ Failed to build packages" "$C_RED" + exit 1 + fi + center_print "✓ Packages built" "$C_GREEN" + echo "" + + center_print "Building server..." "$C_YELLOW" + if ! npm run build --workspace=apps/server; then + center_print "✗ Failed to build server" "$C_RED" + exit 1 + fi + center_print "✓ Server built" "$C_GREEN" + echo "" + + center_print "Building UI..." "$C_YELLOW" + if ! 
npm run build --workspace=apps/ui; then + center_print "✗ Failed to build UI" "$C_RED" + exit 1 + fi + center_print "✓ UI built" "$C_GREEN" + echo "" + + center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY" + center_print "Build Complete" "$C_GREEN" + center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY" + echo "" +} + +# Ensure production env is applied consistently for builds and runtime +apply_production_env() { + if [ "$PRODUCTION_MODE" = true ]; then + export NODE_ENV="production" + fi +} + +# ============================================================================ +# MAIN EXECUTION +# ============================================================================ + +parse_args "$@" + +apply_production_env + +# Pre-flight checks +check_platform +check_required_commands +validate_terminal_size + +if [ "$CHECK_DEPS" = true ]; then + check_dependencies || true +fi + +hide_cursor +# Disable echo and line buffering for single-key input +stty -echo -icanon 2>/dev/null || true + +# Function to read a single key, handling escape sequences for arrows +# Note: bash 3.2 (macOS) doesn't support fractional timeouts, so we use a different approach +read_key() { + local key + local escape_seq="" + + if [ -n "$ZSH_VERSION" ]; then + read -k 1 -s -t "$INPUT_TIMEOUT" key 2>/dev/null || key="" + else + # Use IFS= to preserve special characters + IFS= read -n 1 -s -t "$INPUT_TIMEOUT" -r key 2>/dev/null || key="" + fi + + # Check for escape sequence (arrow keys send ESC [ A/B/C/D) + if [[ "$key" == $'\x1b' ]]; then + # Read the rest of the escape sequence without timeout + # Arrow keys send 3 bytes: ESC [ A/B/C/D + IFS= read -n 1 -s -r escape_seq 2>/dev/null || escape_seq="" + if [[ "$escape_seq" == "[" ]] || [[ "$escape_seq" == "O" ]]; then + IFS= read -n 1 -s -r escape_seq 2>/dev/null || escape_seq="" + case "$escape_seq" in + A) echo "UP"; return ;; + B) echo "DOWN"; return ;; + C) echo "RIGHT"; return ;; + D) echo "LEFT"; return ;; + esac + fi + # Just ESC key pressed + echo "ESC" + return + fi + + echo "$key" +} + +# Interactive menu if no mode specified +if [ -z "$MODE" ]; then + while true; do + show_header + show_menu + + key=$(read_key) + + case $key in + UP) + SELECTED_OPTION=$((SELECTED_OPTION - 1)) + [ $SELECTED_OPTION -lt 1 ] && SELECTED_OPTION=$MAX_OPTIONS + ;; + DOWN) + SELECTED_OPTION=$((SELECTED_OPTION + 1)) + [ $SELECTED_OPTION -gt $MAX_OPTIONS ] && SELECTED_OPTION=1 + ;; + 1) SELECTED_OPTION=1; MODE="web"; break ;; + 2) SELECTED_OPTION=2; MODE="electron"; break ;; + 3) SELECTED_OPTION=3; MODE="docker"; break ;; + 4) SELECTED_OPTION=4; MODE="docker-electron"; break ;; + ""|$'\n'|$'\r') + # Enter key - select current option + case $SELECTED_OPTION in + 1) MODE="web" ;; + 2) MODE="electron" ;; + 3) MODE="docker" ;; + 4) MODE="docker-electron" ;; + esac + break + ;; + q|Q) + echo "" + echo "" + center_text "${C_MUTE}Goodbye! See you soon.${RESET}" + echo "" + exit 0 + ;; + *) + ;; + esac + done +fi + +# Validate mode +case $MODE in + web) MODE_NAME="Web Browser" ;; + electron) MODE_NAME="Desktop App" ;; + docker) MODE_NAME="Docker Dev" ;; + docker-electron) MODE_NAME="Electron + Docker" ;; + *) + echo "${C_RED}Error:${RESET} Invalid mode '$MODE'" + echo "Valid modes: web, electron, docker, docker-electron" + exit 1 + ;; +esac + +# Check Docker for Docker modes +if [[ "$MODE" == "docker" || "$MODE" == "docker-electron" ]]; then + show_cursor + stty echo icanon 2>/dev/null || true + if ! 
check_docker; then
+        exit 1
+    fi
+    hide_cursor
+    stty -echo -icanon 2>/dev/null || true
+fi
+
+# Save to history
+save_mode_to_history "$MODE"
+
+# Launch sequence
+launch_sequence "$MODE_NAME"
+
+# Restore terminal state before running npm
+show_cursor
+stty echo icanon 2>/dev/null || true
+
+# Build for production if needed
+if [ "$PRODUCTION_MODE" = true ]; then
+    build_for_production
+fi
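+
+# Each mode below waits for the API with an inline curl poll. A reusable sketch
+# of that loop (hypothetical helper — nothing below calls it, and the
+# /api/health path is an assumption about the server's health route):
+wait_for_server() {
+    local port=$1 retries=${2:-30} i
+    for ((i=0; i<retries; i++)); do
+        curl -sf "http://localhost:$port/api/health" > /dev/null 2>&1 && return 0
+        sleep 1
+    done
+    return 1
+}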
"$C_YELLOW" + center_print "(Electron will start its own backend server)" "$C_MUTE" + echo "" + npm run dev:electron + ;; + docker) + # Check for running Electron (user might be switching from option 4) + check_running_electron + + # Check for running containers + check_running_containers "docker-compose.dev.yml" + container_check=$? + + if [ $container_check -eq 2 ]; then + # Attach to existing containers + center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY" + center_print "Attaching to Docker Dev Containers" "$C_PRI" + center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY" + echo "" + center_print "UI: http://localhost:$DEFAULT_WEB_PORT" "$C_GREEN" + center_print "API: http://localhost:$DEFAULT_SERVER_PORT" "$C_GREEN" + center_print "Press Ctrl+C to detach" "$C_MUTE" + echo "" + if [ "$DOCKER_CMD" = "sg docker -c" ]; then + if [ -f "docker-compose.override.yml" ]; then + sg docker -c "docker compose -f 'docker-compose.dev.yml' -f 'docker-compose.override.yml' logs -f" + else + sg docker -c "docker compose -f 'docker-compose.dev.yml' logs -f" + fi + else + if [ -f "docker-compose.override.yml" ]; then + $DOCKER_CMD compose -f docker-compose.dev.yml -f docker-compose.override.yml logs -f + else + $DOCKER_CMD compose -f docker-compose.dev.yml logs -f + fi + fi + else + echo "" + center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY" + center_print "Docker Development Mode" "$C_PRI" + center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY" + echo "" + center_print "Starting UI + Server containers..." "$C_MUTE" + center_print "Source code is volume mounted for live reload" "$C_MUTE" + echo "" + center_print "UI: http://localhost:$DEFAULT_WEB_PORT" "$C_GREEN" + center_print "API: http://localhost:$DEFAULT_SERVER_PORT" "$C_GREEN" + echo "" + center_print "First run may take several minutes (building image + npm install)" "$C_YELLOW" + center_print "Press Ctrl+C to stop" "$C_MUTE" + echo "" + center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY" + echo "" + if [ "$DOCKER_CMD" = "sg docker -c" ]; then + if [ -f "docker-compose.override.yml" ]; then + sg docker -c "docker compose -f 'docker-compose.dev.yml' -f 'docker-compose.override.yml' up --build" + else + sg docker -c "docker compose -f 'docker-compose.dev.yml' up --build" + fi + else + if [ -f "docker-compose.override.yml" ]; then + $DOCKER_CMD compose -f docker-compose.dev.yml -f docker-compose.override.yml up --build + else + $DOCKER_CMD compose -f docker-compose.dev.yml up --build + fi + fi + fi + ;; + docker-electron) + # Check for running Electron (user might be switching from option 2) + check_running_electron + + # Check for running containers + check_running_containers "docker-compose.dev-server.yml" + container_check=$? + + echo "" + center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY" + center_print "Electron + Docker API Mode" "$C_PRI" + center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY" + echo "" + center_print "Server runs in Docker container" "$C_MUTE" + center_print "Electron runs locally on your machine" "$C_MUTE" + echo "" + center_print "API: http://localhost:$DEFAULT_SERVER_PORT (Docker)" "$C_GREEN" + echo "" + + # If attaching to existing, skip the build + if [ $container_check -eq 2 ]; then + center_print "Using existing server container..." 
"$C_MUTE" + else + center_print "First run may take several minutes (building image + npm install)" "$C_YELLOW" + fi + echo "" + center_print "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━" "$C_GRAY" + echo "" + + # Start docker in background (or skip if attaching) + if [ $container_check -eq 2 ]; then + center_print "Checking if server is healthy..." "$C_MUTE" + DOCKER_PID="" + else + center_print "Starting Docker server container..." "$C_MUTE" + echo "" + if [ "$DOCKER_CMD" = "sg docker -c" ]; then + if [ -f "docker-compose.override.yml" ]; then + sg docker -c "docker compose -f 'docker-compose.dev-server.yml' -f 'docker-compose.override.yml' up --build" & + else + sg docker -c "docker compose -f 'docker-compose.dev-server.yml' up --build" & + fi + else + if [ -f "docker-compose.override.yml" ]; then + $DOCKER_CMD compose -f docker-compose.dev-server.yml -f docker-compose.override.yml up --build & + else + $DOCKER_CMD compose -f docker-compose.dev-server.yml up --build & + fi + fi + DOCKER_PID=$! + fi + + # Wait for server to be healthy + echo "" + center_print "Waiting for server to become healthy..." "$C_YELLOW" + center_print "(This may take a while on first run)" "$C_MUTE" + echo "" + max_retries=180 + server_ready=false + dots="" + for ((i=0; i /dev/null 2>&1; then + server_ready=true + break + fi + sleep 1 + if (( i > 0 && i % 10 == 0 )); then + dots="${dots}." + center_print "Still waiting${dots}" "$C_MUTE" + fi + done + echo "" + + if [ "$server_ready" = false ]; then + center_print "✗ Server container failed to become healthy" "$C_RED" + center_print "Check Docker logs above for errors" "$C_MUTE" + [ -n "$DOCKER_PID" ] && kill $DOCKER_PID 2>/dev/null || true + exit 1 + fi + + center_print "✓ Server is healthy!" "$C_GREEN" + echo "" + center_print "Building packages and launching Electron..." "$C_MUTE" + echo "" + + # Build packages and launch Electron + npm run build:packages + SKIP_EMBEDDED_SERVER=true PORT=$DEFAULT_SERVER_PORT VITE_SERVER_URL="http://localhost:$DEFAULT_SERVER_PORT" VITE_APP_MODE="4" npm run _dev:electron + + # Cleanup docker when electron exits + echo "" + center_print "Shutting down Docker container..." "$C_MUTE" + [ -n "$DOCKER_PID" ] && kill $DOCKER_PID 2>/dev/null || true + if [ "$DOCKER_CMD" = "sg docker -c" ]; then + sg docker -c "docker compose -f 'docker-compose.dev-server.yml' down" 2>/dev/null || true + else + $DOCKER_CMD compose -f docker-compose.dev-server.yml down 2>/dev/null || true + fi + center_print "Done!" "$C_GREEN" + ;; +esac