Mirror of https://github.com/AutoMaker-Org/automaker.git (synced 2026-01-30 06:12:03 +00:00)
Merge pull request #42 from AutoMaker-Org/removing-electron-features-build-api
chore: update project management and API integration
@@ -1,202 +0,0 @@
|
||||
<project_specification>
|
||||
<project_name>Automaker - Autonomous AI Development Studio</project_name>
|
||||
|
||||
<overview>
|
||||
Automaker is a sophisticated desktop application that empowers developers to build software autonomously through AI-powered agents. Built with Electron and Next.js, it provides an intelligent GUI for project management, feature tracking via Kanban boards, and autonomous code generation. The application leverages multiple AI models (Claude, GPT) and supports complex workflows including git worktree isolation, testing automation, and multi-model agent execution. It acts as a complete development orchestrator, managing the entire lifecycle from specification to verified implementation.
|
||||
</overview>
|
||||
|
||||
<technology_stack>
|
||||
<frontend>
|
||||
<framework>Next.js 16.0.7 (App Router)</framework>
|
||||
<ui_library>shadcn/ui with Radix UI primitives</ui_library>
|
||||
<styling>Tailwind CSS 4.0</styling>
|
||||
<state_management>Zustand with persistence</state_management>
|
||||
<drag_drop>@dnd-kit for Kanban board</drag_drop>
|
||||
<icons>Lucide React</icons>
|
||||
<query_client>TanStack Query for server state</query_client>
|
||||
</frontend>
|
||||
<desktop_shell>
|
||||
<framework>Electron 39.2.6</framework>
|
||||
<language>TypeScript 5.x</language>
|
||||
<inter_process_communication>Electron IPC with security sandboxing</inter_process_communication>
|
||||
<file_system>Node.js fs/promises with path validation</file_system>
|
||||
</desktop_shell>
|
||||
<ai_engine>
|
||||
<primary_model>Claude 3.5 (Opus, Sonnet, Haiku) via Anthropic Claude Agent SDK</primary_model>
|
||||
<secondary_model>GPT-5.1 Codex family via OpenAI CLI</secondary_model>
|
||||
<orchestration>Custom Agent Service with streaming responses</orchestration>
|
||||
<model_registry>Dynamic model provider system with CLI detection</model_registry>
|
||||
</ai_engine>
|
||||
<testing>
|
||||
<framework>Playwright for E2E testing</framework>
|
||||
<unit>Jest/Vitest compatible</unit>
|
||||
<integration>Agent-driven test execution and verification</integration>
|
||||
</testing>
|
||||
<version_control>
|
||||
<system>Git with worktree isolation support</system>
|
||||
<branching>Feature branch management</branching>
|
||||
<workflow>Automated commit and merge capabilities</workflow>
|
||||
</version_control>
|
||||
</technology_stack>
|
||||
|
||||
<core_capabilities>
|
||||
<project_management>
|
||||
- Open and manage multiple local projects
|
||||
- Project-specific themes and configurations
|
||||
- Session management with project context
|
||||
- Recently used project cycling (Q/E shortcuts)
|
||||
- Project search and type-ahead selection
|
||||
- Trash and restore functionality for projects
|
||||
</project_management>
|
||||
|
||||
<intelligent_analysis>
|
||||
- Auto-generation and updating of app_spec.txt
|
||||
- Feature extraction from existing codebases
|
||||
- Technology stack detection and documentation
|
||||
- Project structure analysis with file tree visualization
- "Project Ingestion": analyzes existing codebases to understand their structure
|
||||
- Auto-generation of `.automaker/app_spec.txt` based on codebase analysis
|
||||
- Auto-generation of features in `.automaker/features/{id}/feature.json`:
|
||||
- Scans code for implemented features
|
||||
- Creates test cases for existing features
|
||||
- Marks existing features as "passes": true automatically
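For reference, a TypeScript sketch of the feature file shape, inferred from the example `.automaker/features/{id}/feature.json` included later in this commit (field list is illustrative, not exhaustive):

```typescript
type FeatureStatus = "backlog" | "in_progress" | "waiting_approval" | "verified";

// Inferred from the sample feature.json in this commit; the real schema
// may include additional fields.
interface FeatureRecord {
  id: string;            // e.g. "feature-1765426941179-om12nlapa"
  category: string;      // e.g. "Uncategorized"
  description: string;
  steps: string[];
  status: FeatureStatus;
  images: string[];
  imagePaths: string[];
  skipTests: boolean;
  model: string;         // e.g. "opus"
  thinkingLevel: string; // e.g. "none"
  passes?: boolean;      // true once an existing feature is verified
  startedAt?: string;    // ISO timestamp
  summary?: string;      // filled in after implementation
}
```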
|
||||
</intelligent_analysis>
|
||||
|
||||
<kanban_workflow>
|
||||
- Visual representation of features from `.automaker/features/` folder
|
||||
- Drag-and-drop interface to reprioritize tasks
|
||||
- Direct editing of feature details (steps, description) from the card
|
||||
- Visual Kanban board with drag-and-drop functionality
|
||||
- Multiple status columns: Backlog, In Progress, Waiting Approval, Verified
|
||||
- Feature cards with detailed information display (3 detail levels)
|
||||
- Real-time status updates during agent execution
|
||||
- Search and filtering capabilities
|
||||
- Category management and autocomplete
|
||||
- Image attachment support for feature descriptions
|
||||
</kanban_workflow>
|
||||
|
||||
<autonomous_agent_engine>
|
||||
- Multi-model agent system with profile-based execution
|
||||
- Streaming agent output with real-time logs
|
||||
- Git worktree isolation for safe feature development
|
||||
- Automatic testing and verification workflows
|
||||
- Context-aware prompt generation
|
||||
- Agent memory and learning capabilities
|
||||
- Concurrent feature processing with configurable limits
|
||||
- Follow-up and resume capabilities
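A minimal sketch of how a configurable concurrency limit could gate feature processing (illustrative only; the actual agent service implementation may differ):

```typescript
// Run async feature tasks with at most `maxConcurrency` in flight at once.
async function runWithLimit<T>(
  tasks: Array<() => Promise<T>>,
  maxConcurrency: number
): Promise<T[]> {
  const results: T[] = new Array(tasks.length);
  let next = 0;

  async function worker(): Promise<void> {
    while (next < tasks.length) {
      const index = next++; // claim the next task synchronously
      results[index] = await tasks[index]();
    }
  }

  const workers = Array.from(
    { length: Math.min(maxConcurrency, tasks.length) },
    () => worker()
  );
  await Promise.all(workers);
  return results;
}
```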
|
||||
</autonomous_agent_engine>
|
||||
|
||||
<advanced_workflows>
|
||||
- Git worktree management for isolated development
|
||||
- Feature-specific branching and merging
|
||||
- Automated commit generation with file tracking
|
||||
- Test-driven development support
|
||||
- Code review and approval workflows
|
||||
- Revert and rollback capabilities
|
||||
</advanced_workflows>
|
||||
|
||||
<user_interface>
|
||||
- Dark/Light theme support with 12 custom themes
|
||||
- Per-project theme configurations
|
||||
- Comprehensive keyboard shortcut system
|
||||
- Sidebar navigation with project switching
|
||||
- Multi-view architecture (Board, Spec, Agent, Context, Settings)
|
||||
- Setup wizard for first-time configuration
|
||||
- CLI integration status monitoring
|
||||
</user_interface>
|
||||
|
||||
<extensibility>
|
||||
- AI Profile system for model/thinking level presets
|
||||
- Keyboard shortcut customization
|
||||
- Model provider plugin architecture
|
||||
- Context file management for agent guidance
|
||||
- Feature suggestion generation
|
||||
- Spec regeneration workflows
|
||||
</extensibility>
|
||||
</core_capabilities>
|
||||
|
||||
<ui_layout>
|
||||
<window_structure>
|
||||
- Sidebar: Project List, Settings, Logs, Plugins
|
||||
- Main Content:
|
||||
- **Spec View**: Split editor for `.automaker/app_spec.txt`
|
||||
- **Board View**: Kanban board for `.automaker/features/` folder
|
||||
- **Code View**: Read-only Monaco editor to see what the agent is writing
|
||||
- **Agent View**: Chat-like interface showing agent thought process and tool usage. Also used for the "New Project Interview".
|
||||
</window_structure>
|
||||
<theme>
|
||||
- Dark/Light mode support (system sync)
|
||||
- "Hacker" aesthetic option (terminal-like)
|
||||
- Professional/Clean default
|
||||
</theme>
|
||||
</ui_layout>
|
||||
|
||||
<development_workflow>
|
||||
<local_testing>
|
||||
- "Browser Mode": Run the Next.js frontend in a standard browser with mocked Electron IPC for rapid UI iteration.
|
||||
- "Electron Mode": Full desktop app testing.
|
||||
- Hot Reloading for both Main and Renderer processes.
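Browser Mode implies the renderer sees a stand-in for the preload API. A hedged sketch of what such a shim could look like (method names mirror the simplified preload script elsewhere in this commit; the real mock lives in the app and may differ):

```typescript
// Install a minimal stand-in for window.electronAPI when running in a
// plain browser, so UI code that probes the preload bridge keeps working.
if (typeof window !== "undefined" && !(window as any).electronAPI) {
  (window as any).electronAPI = {
    platform: "browser",
    isElectron: false,
    ping: async () => "pong",
    getServerUrl: async () => "http://localhost:3008", // assumed dev port
    openDirectory: async () => ({ canceled: true, filePaths: [] }),
    openExternalLink: async (_url: string) => ({ success: true }),
  };
}
```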
|
||||
</local_testing>
|
||||
</development_workflow>
|
||||
|
||||
<implemented_features>
|
||||
- Complete Kanban board with drag-and-drop functionality
|
||||
- Multi-model AI agent execution (Claude + GPT/Codex)
|
||||
- Git worktree isolation for features
|
||||
- Real-time agent output streaming and logging
|
||||
- Project management with session persistence
|
||||
- Theme system with 12 themes + per-project themes
|
||||
- Comprehensive settings panel with all configurations
|
||||
- Feature image attachment and context system
|
||||
- Agent profiles with model/thinking level presets
|
||||
- Keyboard shortcut system with customization
|
||||
- CLI integration detection (Claude Code + Codex CLI)
|
||||
- Auto mode for autonomous feature processing
|
||||
- Feature suggestions generation
|
||||
- Spec regeneration and project analysis
|
||||
- Context file management
|
||||
- Chat history and session management
|
||||
- File diff viewing and git integration
|
||||
- Search and filtering across all features
|
||||
- Category management and autocomplete
|
||||
- Test automation and verification workflows
|
||||
</implemented_features>
|
||||
|
||||
<implementation_roadmap>
|
||||
<phase_1_foundation>
|
||||
- Enhanced error handling and recovery mechanisms
|
||||
- Performance optimization for large projects
|
||||
- Improved memory management for long-running sessions
|
||||
- Advanced logging and debugging capabilities
|
||||
</phase_1_foundation>
|
||||
|
||||
<phase_2_core_logic>
|
||||
- Plugin system for custom model providers
|
||||
- Advanced workflow customization engine
|
||||
- Team collaboration features
|
||||
- Cloud synchronization capabilities
|
||||
- Advanced project templates and scaffolding
|
||||
</phase_2_core_logic>
|
||||
|
||||
<phase_3_kanban_and_interaction>
|
||||
- Build Kanban board with drag-and-drop
|
||||
- Connect Kanban state to `.automaker/features/` filesystem
|
||||
- Implement "Run Feature" capability
|
||||
- Integrate standard prompts library
|
||||
</phase_3_kanban_and_interaction>
|
||||
|
||||
<phase_3_polish>
|
||||
- Enhanced accessibility features
|
||||
- Advanced theme customization
|
||||
- Performance monitoring and analytics
|
||||
- Documentation generation automation
|
||||
- Integration with external development tools
|
||||
- Advanced security auditing and sandboxing
|
||||
</phase_3_polish>
|
||||
|
||||
<phase_4_polish>
|
||||
- Advanced terminal integration
|
||||
- Settings & Extensibility
|
||||
- UI refinement
|
||||
</phase_4_polish>
|
||||
</implementation_roadmap>
|
||||
</project_specification>
|
||||
@@ -1,9 +0,0 @@
|
||||
[
|
||||
"Agent Runner",
|
||||
"Core",
|
||||
"Kanban",
|
||||
"Other",
|
||||
"Settings",
|
||||
"Uncategorized",
|
||||
"ka"
|
||||
]
|
||||
@@ -1,474 +0,0 @@
|
||||
# Clean Code Guidelines
|
||||
|
||||
## Overview
|
||||
|
||||
This document serves as a comprehensive guide for writing clean, maintainable, and extensible code. It outlines principles and practices that ensure code quality, reusability, and long-term maintainability. When writing or reviewing code, follow these guidelines to create software that is easy to understand, modify, and extend. This file is used by LLMs to understand and enforce coding standards throughout the codebase.
|
||||
|
||||
---
|
||||
|
||||
## Core Principles
|
||||
|
||||
### 1. DRY (Don't Repeat Yourself)
|
||||
|
||||
**Principle**: Every piece of knowledge should have a single, unambiguous representation within a system.
|
||||
|
||||
**Practices**:
|
||||
|
||||
- Extract repeated logic into reusable functions, classes, or modules
|
||||
- Use constants for repeated values
|
||||
- Create shared utilities for common operations
|
||||
- Avoid copy-pasting code blocks
|
||||
- When you find yourself writing similar code more than twice, refactor it
|
||||
|
||||
**Example - Bad**:
|
||||
|
||||
```typescript
|
||||
// Repeated validation logic
|
||||
if (email.includes("@") && email.length > 5) {
|
||||
// ...
|
||||
}
|
||||
if (email.includes("@") && email.length > 5) {
|
||||
// ...
|
||||
}
|
||||
```
|
||||
|
||||
**Example - Good**:
|
||||
|
||||
```typescript
|
||||
function isValidEmail(email: string): boolean {
|
||||
return email.includes("@") && email.length > 5;
|
||||
}
|
||||
|
||||
if (isValidEmail(email)) {
|
||||
// ...
|
||||
}
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### 2. Code Reusability
|
||||
|
||||
**Principle**: Write code that can be used in multiple contexts without modification or with minimal adaptation.
|
||||
|
||||
**Practices**:
|
||||
|
||||
- Create generic, parameterized functions instead of specific ones
|
||||
- Use composition over inheritance where appropriate
|
||||
- Design functions to be pure (no side effects) when possible
|
||||
- Create utility libraries for common operations
|
||||
- Use dependency injection to make components reusable
|
||||
- Design APIs that are flexible and configurable
|
||||
|
||||
**Example - Bad**:
|
||||
|
||||
```typescript
|
||||
function calculateUserTotal(userId: string) {
|
||||
const user = getUser(userId);
|
||||
return user.items.reduce((sum, item) => sum + item.price, 0);
|
||||
}
|
||||
```
|
||||
|
||||
**Example - Good**:
|
||||
|
||||
```typescript
|
||||
function calculateTotal<T extends { price: number }>(items: T[]): number {
|
||||
return items.reduce((sum, item) => sum + item.price, 0);
|
||||
}
|
||||
|
||||
function calculateUserTotal(userId: string) {
|
||||
const user = getUser(userId);
|
||||
return calculateTotal(user.items);
|
||||
}
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### 3. Abstract Functions and Abstractions
|
||||
|
||||
**Principle**: Create abstractions that hide implementation details and provide clear, simple interfaces.
|
||||
|
||||
**Practices**:
|
||||
|
||||
- Use interfaces and abstract classes to define contracts
|
||||
- Create abstraction layers between different concerns
|
||||
- Hide complex implementation behind simple function signatures
|
||||
- Use dependency inversion - depend on abstractions, not concretions
|
||||
- Create factory functions/classes for object creation
|
||||
- Use strategy pattern for interchangeable algorithms
|
||||
|
||||
**Example - Bad**:
|
||||
|
||||
```typescript
|
||||
function processPayment(amount: number, cardNumber: string, cvv: string) {
|
||||
// Direct implementation tied to specific payment processor
|
||||
fetch("https://stripe.com/api/charge", {
|
||||
method: "POST",
|
||||
body: JSON.stringify({ amount, cardNumber, cvv }),
|
||||
});
|
||||
}
|
||||
```
|
||||
|
||||
**Example - Good**:
|
||||
|
||||
```typescript
|
||||
interface PaymentProcessor {
|
||||
processPayment(
|
||||
amount: number,
|
||||
details: PaymentDetails
|
||||
): Promise<PaymentResult>;
|
||||
}
|
||||
|
||||
class StripeProcessor implements PaymentProcessor {
|
||||
async processPayment(
|
||||
amount: number,
|
||||
details: PaymentDetails
|
||||
): Promise<PaymentResult> {
|
||||
// Implementation
|
||||
}
|
||||
}
|
||||
|
||||
function processPayment(
|
||||
processor: PaymentProcessor,
|
||||
amount: number,
|
||||
details: PaymentDetails
|
||||
) {
|
||||
return processor.processPayment(amount, details);
|
||||
}
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### 4. Extensibility
|
||||
|
||||
**Principle**: Design code that can be easily extended with new features without modifying existing code.
|
||||
|
||||
**Practices**:
|
||||
|
||||
- Follow the Open/Closed Principle: open for extension, closed for modification
|
||||
- Use plugin architectures and hooks for extensibility
|
||||
- Design with future requirements in mind (but don't over-engineer)
|
||||
- Use configuration over hardcoding
|
||||
- Create extension points through interfaces and callbacks
|
||||
- Use composition and dependency injection
|
||||
- Design APIs that can accommodate new parameters/options
|
||||
|
||||
**Example - Bad**:
|
||||
|
||||
```typescript
|
||||
function sendNotification(user: User, type: string) {
|
||||
if (type === "email") {
|
||||
sendEmail(user.email);
|
||||
} else if (type === "sms") {
|
||||
sendSMS(user.phone);
|
||||
}
|
||||
// Adding new notification types requires modifying this function
|
||||
}
|
||||
```
|
||||
|
||||
**Example - Good**:
|
||||
|
||||
```typescript
|
||||
interface NotificationChannel {
|
||||
send(user: User): Promise<void>;
|
||||
}
|
||||
|
||||
class EmailChannel implements NotificationChannel {
|
||||
async send(user: User): Promise<void> {
|
||||
// Implementation
|
||||
}
|
||||
}
|
||||
|
||||
class SMSChannel implements NotificationChannel {
|
||||
async send(user: User): Promise<void> {
|
||||
// Implementation
|
||||
}
|
||||
}
|
||||
|
||||
class NotificationService {
|
||||
constructor(private channels: NotificationChannel[]) {}
|
||||
|
||||
async send(user: User): Promise<void> {
|
||||
await Promise.all(this.channels.map((channel) => channel.send(user)));
|
||||
}
|
||||
}
|
||||
// New notification types can be added without modifying existing code
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### 5. Avoid Magic Numbers and Strings
|
||||
|
||||
**Principle**: Use named constants instead of hardcoded values to improve readability and maintainability.
|
||||
|
||||
**Practices**:
|
||||
|
||||
- Extract all magic numbers into named constants
|
||||
- Use enums for related constants
|
||||
- Create configuration objects for settings
|
||||
- Use constants for API endpoints, timeouts, limits, etc.
|
||||
- Document why specific values are used
|
||||
|
||||
**Example - Bad**:
|
||||
|
||||
```typescript
|
||||
if (user.age >= 18) {
|
||||
// What does 18 mean?
|
||||
}
|
||||
|
||||
setTimeout(() => {
|
||||
// What does 3000 mean?
|
||||
}, 3000);
|
||||
|
||||
if (status === "active") {
|
||||
// What are the valid statuses?
|
||||
}
|
||||
```
|
||||
|
||||
**Example - Good**:
|
||||
|
||||
```typescript
|
||||
const MINIMUM_AGE_FOR_ADULTS = 18;
|
||||
const SESSION_TIMEOUT_MS = 3000;
|
||||
|
||||
enum UserStatus {
|
||||
ACTIVE = "active",
|
||||
INACTIVE = "inactive",
|
||||
SUSPENDED = "suspended",
|
||||
}
|
||||
|
||||
if (user.age >= MINIMUM_AGE_FOR_ADULTS) {
|
||||
// Clear intent
|
||||
}
|
||||
|
||||
setTimeout(() => {
|
||||
// Clear intent
|
||||
}, SESSION_TIMEOUT_MS);
|
||||
|
||||
if (status === UserStatus.ACTIVE) {
|
||||
// Type-safe and clear
|
||||
}
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Additional Best Practices
|
||||
|
||||
### 6. Single Responsibility Principle
|
||||
|
||||
Each function, class, or module should have one reason to change.
|
||||
|
||||
**Example**:
|
||||
|
||||
```typescript
|
||||
// Bad: Multiple responsibilities
|
||||
class User {
|
||||
save() {
|
||||
/* database logic */
|
||||
}
|
||||
sendEmail() {
|
||||
/* email logic */
|
||||
}
|
||||
validate() {
|
||||
/* validation logic */
|
||||
}
|
||||
}
|
||||
|
||||
// Good: Single responsibility
|
||||
class User {
|
||||
validate() {
|
||||
/* validation only */
|
||||
}
|
||||
}
|
||||
|
||||
class UserRepository {
|
||||
save(user: User) {
|
||||
/* database logic */
|
||||
}
|
||||
}
|
||||
|
||||
class EmailService {
|
||||
sendToUser(user: User) {
|
||||
/* email logic */
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### 7. Meaningful Names
|
||||
|
||||
- Use descriptive names that reveal intent
|
||||
- Avoid abbreviations unless they're widely understood
|
||||
- Use verbs for functions, nouns for classes
|
||||
- Be consistent with naming conventions
|
||||
|
||||
**Example**:
|
||||
|
||||
```typescript
|
||||
// Bad
|
||||
const d = new Date();
|
||||
const u = getUser();
|
||||
function calc(x, y) {}
|
||||
|
||||
// Good
|
||||
const currentDate = new Date();
|
||||
const currentUser = getUser();
|
||||
function calculateTotal(price: number, quantity: number): number {
  return price * quantity;
}
|
||||
```
|
||||
|
||||
### 8. Small Functions
|
||||
|
||||
- Functions should do one thing and do it well
|
||||
- Keep functions short (ideally under 20 lines)
|
||||
- Extract complex logic into separate functions
|
||||
- Use descriptive function names instead of comments
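**Example** (illustrative sketch, not from this codebase):

```typescript
// Bad: one function parses, validates, and formats
function handleSignup(raw: string): string {
  const data = JSON.parse(raw) as { email?: string; name?: string };
  if (!data.email || !data.email.includes("@")) {
    throw new Error("Invalid email");
  }
  return `Welcome, ${data.name ?? "friend"}!`;
}

// Good: each step is a small, named function
interface SignupData {
  email?: string;
  name?: string;
}

function parseSignup(raw: string): SignupData {
  return JSON.parse(raw) as SignupData;
}

function assertValidEmail(email?: string): void {
  if (!email || !email.includes("@")) {
    throw new Error("Invalid email");
  }
}

function welcomeMessage(name?: string): string {
  return `Welcome, ${name ?? "friend"}!`;
}

function handleSignupClean(raw: string): string {
  const data = parseSignup(raw);
  assertValidEmail(data.email);
  return welcomeMessage(data.name);
}
```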
|
||||
|
||||
### 9. Error Handling
|
||||
|
||||
- Handle errors explicitly
|
||||
- Use appropriate error types
|
||||
- Provide meaningful error messages
|
||||
- Don't swallow errors silently
|
||||
- Use try-catch appropriately
|
||||
|
||||
**Example**:
|
||||
|
||||
```typescript
|
||||
// Bad
|
||||
function divide(a: number, b: number) {
|
||||
return a / b; // Can throw division by zero
|
||||
}
|
||||
|
||||
// Good
|
||||
function divide(a: number, b: number): number {
|
||||
if (b === 0) {
|
||||
throw new Error("Division by zero is not allowed");
|
||||
}
|
||||
return a / b;
|
||||
}
|
||||
```
|
||||
|
||||
### 10. Comments and Documentation
|
||||
|
||||
- Write self-documenting code (code should explain itself)
|
||||
- Use comments to explain "why", not "what"
|
||||
- Document complex algorithms or business logic
|
||||
- Keep comments up-to-date with code changes
|
||||
- Use JSDoc/TSDoc for public APIs
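**Example** (illustrative TSDoc comment that explains "why" and leaves the "what" to the code):

```typescript
/**
 * Calculates the order total including tax.
 *
 * The tax rate is a fraction (e.g. 0.2 for 20%) because the upstream
 * pricing service reports it that way.
 *
 * @param subtotalCents - Order subtotal in cents
 * @param taxRate - Tax rate as a fraction of the subtotal
 * @returns Total in cents, rounded to the nearest cent
 */
function calculateOrderTotal(subtotalCents: number, taxRate: number): number {
  return Math.round(subtotalCents * (1 + taxRate));
}
```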
|
||||
|
||||
### 11. Type Safety
|
||||
|
||||
- Use TypeScript types/interfaces effectively
|
||||
- Avoid `any` type unless absolutely necessary
|
||||
- Use union types and discriminated unions
|
||||
- Leverage type inference where appropriate
|
||||
- Create custom types for domain concepts
|
||||
|
||||
**Example**:
|
||||
|
||||
```typescript
|
||||
// Bad
|
||||
function processUser(data: any) {
|
||||
return data.name;
|
||||
}
|
||||
|
||||
// Good
|
||||
interface User {
|
||||
id: string;
|
||||
name: string;
|
||||
email: string;
|
||||
}
|
||||
|
||||
function processUser(user: User): string {
|
||||
return user.name;
|
||||
}
|
||||
```
|
||||
|
||||
### 12. Testing Considerations
|
||||
|
||||
- Write testable code (pure functions, dependency injection)
|
||||
- Keep functions small and focused
|
||||
- Avoid hidden dependencies
|
||||
- Use mocks and stubs appropriately
|
||||
- Design for testability from the start
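**Example** (illustrative sketch; adapt the assertion style to the Jest/Vitest setup actually in use):

```typescript
// Production code: the clock is injected, so tests can control time.
type Clock = () => Date;

function isSessionExpired(
  expiresAt: Date,
  now: Clock = () => new Date()
): boolean {
  return now().getTime() > expiresAt.getTime();
}

// Test: no real clock, no hidden dependencies.
const fixedNow: Clock = () => new Date("2025-01-01T00:00:00Z");
console.assert(
  isSessionExpired(new Date("2024-12-31T23:59:59Z"), fixedNow),
  "expired session should be detected"
);
```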
|
||||
|
||||
### 13. Performance vs. Readability
|
||||
|
||||
- Prefer readability over premature optimization
|
||||
- Profile before optimizing
|
||||
- Use clear algorithms first, optimize if needed
|
||||
- Document performance-critical sections
|
||||
- Balance between clean code and performance requirements
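**Example** (illustrative; the fused loop is only worth keeping if profiling shows the clear version is a real bottleneck):

```typescript
interface User {
  email: string;
  active: boolean;
}

// Clear version: write this first.
function activeUserEmails(users: User[]): string[] {
  return users.filter((user) => user.active).map((user) => user.email);
}

// Optimized version: a single pass with no intermediate array. Only adopt
// it after profiling, and document why it exists.
function activeUserEmailsFast(users: User[]): string[] {
  const emails: string[] = [];
  for (const user of users) {
    if (user.active) emails.push(user.email);
  }
  return emails;
}
```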
|
||||
|
||||
### 14. Code Organization
|
||||
|
||||
- Group related functionality together
|
||||
- Use modules/packages to organize code
|
||||
- Follow consistent file and folder structures
|
||||
- Separate concerns (UI, business logic, data access)
|
||||
- Use barrel exports (index files) appropriately
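**Example** (illustrative separation of concerns; names are hypothetical, and in practice each piece would live in its own file behind a barrel `index.ts`):

```typescript
// Data access only: knows how records are stored, nothing about display.
interface FeatureStore {
  load(id: string): Promise<{ id: string; title: string }>;
}

class InMemoryFeatureStore implements FeatureStore {
  constructor(private records: Map<string, { id: string; title: string }>) {}

  async load(id: string) {
    const record = this.records.get(id);
    if (!record) throw new Error(`Unknown feature: ${id}`);
    return record;
  }
}

// Business logic only: depends on the FeatureStore abstraction, not on
// how or where records are persisted.
class FeatureTitleService {
  constructor(private store: FeatureStore) {}

  async displayTitle(id: string): Promise<string> {
    const feature = await this.store.load(id);
    return feature.title.trim() || "(untitled)";
  }
}
```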
|
||||
|
||||
### 15. Configuration Management
|
||||
|
||||
- Externalize configuration values
|
||||
- Use environment variables for environment-specific settings
|
||||
- Create configuration objects/interfaces
|
||||
- Validate configuration at startup
|
||||
- Provide sensible defaults
|
||||
|
||||
**Example**:
|
||||
|
||||
```typescript
|
||||
// Bad
|
||||
const apiUrl = "https://api.example.com";
|
||||
const timeout = 5000;
|
||||
|
||||
// Good
|
||||
interface Config {
|
||||
apiUrl: string;
|
||||
timeout: number;
|
||||
maxRetries: number;
|
||||
}
|
||||
|
||||
const config: Config = {
|
||||
apiUrl: process.env.API_URL || "https://api.example.com",
|
||||
timeout: parseInt(process.env.TIMEOUT || "5000"),
|
||||
maxRetries: parseInt(process.env.MAX_RETRIES || "3"),
|
||||
};
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Code Review Checklist
|
||||
|
||||
When reviewing code, check for:
|
||||
|
||||
- [ ] No code duplication (DRY principle)
|
||||
- [ ] Meaningful variable and function names
|
||||
- [ ] No magic numbers or strings
|
||||
- [ ] Functions are small and focused
|
||||
- [ ] Proper error handling
|
||||
- [ ] Type safety maintained
|
||||
- [ ] Code is testable
|
||||
- [ ] Documentation where needed
|
||||
- [ ] Consistent code style
|
||||
- [ ] Proper abstraction levels
|
||||
- [ ] Extensibility considered
|
||||
- [ ] Single responsibility principle followed
|
||||
|
||||
---
|
||||
|
||||
## Summary
|
||||
|
||||
Clean code is:
|
||||
|
||||
- **Readable**: Easy to understand at a glance
|
||||
- **Maintainable**: Easy to modify and update
|
||||
- **Testable**: Easy to write tests for
|
||||
- **Extensible**: Easy to add new features
|
||||
- **Reusable**: Can be used in multiple contexts
|
||||
- **Well-documented**: Clear intent and purpose
|
||||
- **Type-safe**: Leverages type system effectively
|
||||
- **DRY**: No unnecessary repetition
|
||||
- **Abstracted**: Proper separation of concerns
|
||||
- **Configurable**: Uses constants and configuration over hardcoding
|
||||
|
||||
Remember: Code is read far more often than it is written. Write code for your future self and your teammates.
|
||||
@@ -1,70 +0,0 @@
|
||||
You are a very strong reasoner and planner. Use these critical instructions to structure your plans, thoughts, and responses.
|
||||
|
||||
Before taking any action (either tool calls or responses to the user), you must proactively, methodically, and independently plan and reason about:
|
||||
|
||||
1. Logical dependencies and constraints:
|
||||
|
||||
Analyze the intended action against the following factors. Resolve conflicts in order of importance:
|
||||
|
||||
1.1) Policy-based rules, mandatory prerequisites, and constraints.
|
||||
1.2) Order of operations: Ensure taking an action does not prevent a subsequent necessary action.
|
||||
1.2.1) The user may request actions in a random order, but you may need to reorder operations to maximize successful completion of the task.
|
||||
1.3) Other prerequisites (information and/or actions needed).
|
||||
1.4) Explicit user constraints or preferences.
|
||||
|
||||
2. Risk assessment:
|
||||
|
||||
What are the consequences of taking the action? Will the new state cause any future issues?
|
||||
|
||||
2.1) For exploratory tasks (like searches), missing optional parameters is a LOW risk.
|
||||
Prefer calling the tool with the available information over asking the user, unless your Rule 1 (Logical Dependencies) reasoning determines that optional information is required for a later step in your plan.
|
||||
|
||||
3. Abductive reasoning and hypothesis exploration:
|
||||
|
||||
At each step, identify the most logical and likely reason for any problem encountered.
|
||||
|
||||
3.1) Look beyond immediate or obvious causes. The most likely reason may not be the simplest and may require deeper inference.
|
||||
3.2) Hypotheses may require additional research. Each hypothesis may take multiple steps to test.
|
||||
3.3) Prioritize hypotheses based on likelihood, but do not discard less likely ones prematurely. A low-probability event may still be the root cause.
|
||||
|
||||
4. Outcome evaluation and adaptability:
|
||||
|
||||
Does the previous observation require any changes to your plan?
|
||||
|
||||
4.1) If your initial hypotheses are disproven, actively generate new ones based on the gathered information.
|
||||
|
||||
5. Information availability:
|
||||
|
||||
Incorporate all applicable and alternative sources of information, including:
|
||||
|
||||
5.1) Using available tools and their capabilities
|
||||
5.2) All policies, rules, checklists, and constraints
|
||||
5.3) Previous observations and conversation history
|
||||
5.4) Information only available by asking the user
|
||||
|
||||
6. Precision and Grounding:
|
||||
|
||||
Ensure your reasoning is extremely precise and relevant to each exact ongoing situation.
|
||||
|
||||
6.1) Verify your claims by quoting the exact applicable information (including policies) when referring to them.
|
||||
|
||||
7. Completeness:
|
||||
|
||||
Ensure that all requirements, constraints, options, and preferences are exhaustively incorporated into your plan.
|
||||
|
||||
7.1) Resolve conflicts using the order of importance in #1.
|
||||
7.2) Avoid premature conclusions: There may be multiple relevant options for a given situation.
|
||||
7.2.1) To check for whether an option is relevant, reason about all information sources from #5.
|
||||
7.2.2) You may need to consult the user to even know whether something is applicable. Do not assume it is not applicable without checking.
|
||||
7.3) Review applicable sources of information from #5 to confirm which are relevant to the current state.
|
||||
|
||||
8. Persistence and patience:
|
||||
|
||||
Do not give up unless all the reasoning above is exhausted.
|
||||
|
||||
8.1) Don't be dissuaded by time taken or user frustration.
|
||||
8.2) This persistence must be intelligent: on transient errors (e.g., "please try again"), you must retry unless an explicit retry limit (e.g., max x tries) has been reached. If such a limit is hit, you must stop. On other errors, you must change your strategy or arguments, not repeat the same failed call.
|
||||
|
||||
9. Inhibit your response:
|
||||
|
||||
Only take an action after all the above reasoning is completed. Once you've taken an action, you cannot take it back.
|
||||
@@ -1,53 +0,0 @@
|
||||
📋 Planning implementation for: move the report bug request button to top in header next to the automaker logo
|
||||
⚡ Executing implementation for: move the report bug request button to top in header next to the automaker logo
|
||||
Starting code implementation...
|
||||
|
||||
🔧 Tool: TodoWrite
|
||||
Let me start by exploring the header component and understanding the current structure.
|
||||
🔧 Tool: Glob
|
||||
|
||||
🔧 Tool: Grep
|
||||
|
||||
🔧 Tool: Read
|
||||
|
||||
🔧 Tool: Glob
|
||||
|
||||
🔧 Tool: TodoWrite
|
||||
I found the sidebar component. The "Report Bug / Feature Request" button is currently in the bottom section of the sidebar (lines 1150-1180). The feature request is to move this button to the top header, next to the Automaker logo.
|
||||
|
||||
Looking at the current structure:
|
||||
1. The logo section is at lines 742-770
|
||||
2. The bug report button is in the bottom section at lines 1150-1180
|
||||
|
||||
I'll move the bug report button to be next to the logo in the header area. Let me make this change:
|
||||
🔧 Tool: Edit
|
||||
Now I need to remove the old bug report button from the bottom section:
|
||||
🔧 Tool: Edit
|
||||
The change is complete. Let me verify the build compiles correctly:
|
||||
🔧 Tool: Bash
|
||||
|
||||
🔧 Tool: Bash
|
||||
|
||||
🔧 Tool: Bash
|
||||
|
||||
🔧 Tool: Bash
|
||||
The TypeScript check passed with no errors. Let me mark the task as completed and update the feature status:
|
||||
🔧 Tool: TodoWrite
|
||||
|
||||
🔧 Tool: mcp__automaker-tools__UpdateFeatureStatus
|
||||
|
||||
🔧 Tool: TodoWrite
|
||||
I've completed the feature implementation. Here's a summary of what was done:
|
||||
|
||||
**Changes Made:**
|
||||
|
||||
1. **Modified `app/src/components/layout/sidebar.tsx`:**
|
||||
- Moved the "Report Bug / Feature Request" button from the bottom sidebar section (between Running Agents and Settings) to the header area, next to the AutoMaker logo
|
||||
- Changed the header container from `justify-center` to `justify-between` to accommodate the new button placement
|
||||
- The button is now a compact icon-only button with a tooltip on hover
|
||||
- Kept the same test ID (`bug-report-link`) for consistency
|
||||
- Removed the old button from the bottom section
|
||||
|
||||
The bug report button now appears as a small bug icon in the header, right next to the AutoMaker logo, making it more accessible and visible to users.

✅ Verifying implementation for: move the report bug request button to top in header next to the automaker logo
|
||||
Verifying implementation and checking test results...
|
||||
✓ Verification successful: All tests passed
|
||||
@@ -1,14 +0,0 @@
|
||||
{
|
||||
"category": "Uncategorized",
|
||||
"description": "move the report bug request button to top in header next to the automaker logo",
|
||||
"steps": [],
|
||||
"status": "waiting_approval",
|
||||
"images": [],
|
||||
"imagePaths": [],
|
||||
"skipTests": true,
|
||||
"model": "opus",
|
||||
"thinkingLevel": "none",
|
||||
"id": "feature-1765426941179-om12nlapa",
|
||||
"startedAt": "2025-12-11T04:22:21.750Z",
|
||||
"summary": "Moved the Report Bug / Feature Request button from the bottom sidebar section to the header, next to the AutoMaker logo. Modified: app/src/components/layout/sidebar.tsx. The button now appears as a compact icon button in the header area."
|
||||
}
|
||||
@@ -1,11 +0,0 @@
|
||||
{
|
||||
"timestamp": "2025-12-11T04:22:21.809Z",
|
||||
"modifiedFiles": [
|
||||
"app/src/components/views/analysis-view.tsx",
|
||||
"app/src/components/views/interview-view.tsx"
|
||||
],
|
||||
"untrackedFiles": [
|
||||
".automaker/features/feature-1765426941179-om12nlapa/feature.json",
|
||||
"marketing/index.html"
|
||||
]
|
||||
}
|
||||
@@ -1,172 +0,0 @@
|
||||
# Agent Memory - Lessons Learned
|
||||
|
||||
This file documents issues encountered by previous agents and their solutions. Read this before starting work to avoid repeating mistakes.
|
||||
|
||||
## Testing Issues
|
||||
|
||||
### Issue: Mock project setup not navigating to board view
|
||||
|
||||
**Problem:** Setting `currentProject` in localStorage didn't automatically show the board view - app stayed on welcome view.
|
||||
**Fix:** The `currentView` state is not persisted in localStorage. Instead of trying to set it, have tests click on the recent project from the welcome view to trigger `setCurrentProject()` which handles the view transition properly.
|
||||
|
||||
```typescript
|
||||
// Don't do this:
|
||||
await setupMockProject(page); // Sets localStorage
|
||||
await page.goto("/");
|
||||
await waitForElement(page, "board-view"); // ❌ Fails - still on welcome view
|
||||
|
||||
// Do this instead:
|
||||
await setupMockProject(page);
|
||||
await page.goto("/");
|
||||
await waitForElement(page, "welcome-view");
|
||||
const recentProject = page.locator(
|
||||
'[data-testid="recent-project-test-project-1"]'
|
||||
);
|
||||
await recentProject.click(); // ✅ Triggers proper view transition
|
||||
await waitForElement(page, "board-view");
|
||||
```
|
||||
|
||||
### Issue: View output button test IDs are conditional
|
||||
|
||||
**Problem:** Tests failed looking for `view-output-inprogress-${featureId}` when the actual button had `view-output-${featureId}`.
|
||||
**Fix:** The button test ID depends on whether the feature is actively running:
|
||||
|
||||
- `view-output-${featureId}` - shown when feature is in `runningAutoTasks` (actively running)
|
||||
- `view-output-inprogress-${featureId}` - shown when status is "in_progress" but NOT actively running
|
||||
|
||||
After dragging a feature to in_progress, wait for the `auto_mode_feature_start` event to fire before looking for the button:
|
||||
|
||||
```typescript
|
||||
// Wait for feature to start running
|
||||
const viewOutputButton = page
|
||||
.locator(
|
||||
`[data-testid="view-output-${featureId}"], [data-testid="view-output-inprogress-${featureId}"]`
|
||||
)
|
||||
.first();
|
||||
await expect(viewOutputButton).toBeVisible({ timeout: 8000 });
|
||||
```
|
||||
|
||||
### Issue: Elements not appearing due to async event timing
|
||||
|
||||
**Problem:** Tests checked for UI elements before async events (like `auto_mode_feature_start`) had fired and updated the UI.
|
||||
**Fix:** Add appropriate timeouts when waiting for elements that depend on async events. The mock auto mode takes ~2.4 seconds to complete, so allow sufficient time:
|
||||
|
||||
```typescript
|
||||
// Mock auto mode timing: ~2.4s + 1.5s delay = ~4s total
|
||||
await waitForAgentOutputModalHidden(page, { timeout: 10000 });
|
||||
```
|
||||
|
||||
### Issue: Slider interaction testing
|
||||
|
||||
**Problem:** Clicking on slider track didn't reliably set specific values.
|
||||
**Fix:** Use the slider's keyboard interaction or calculate the exact click position on the track. For max value, click on the rightmost edge of the track.
|
||||
|
||||
### Issue: Port binding blocked in sandbox mode
|
||||
|
||||
**Problem:** Playwright tests couldn't bind to port in sandbox mode.
|
||||
**Fix:** Tests don't need the sandbox disabled - the issue was the TEST_REUSE_SERVER environment variable. Make sure to start the dev server separately or let Playwright's webServer config handle it.
|
||||
|
||||
## Code Architecture
|
||||
|
||||
### Issue: Understanding store state persistence
|
||||
|
||||
**Problem:** Not all store state is persisted to localStorage.
|
||||
**Fix:** Check the `partialize` function in `app-store.ts` to see which state is persisted:
|
||||
|
||||
```typescript
|
||||
partialize: (state) => ({
|
||||
projects: state.projects,
|
||||
currentProject: state.currentProject,
|
||||
theme: state.theme,
|
||||
sidebarOpen: state.sidebarOpen,
|
||||
apiKeys: state.apiKeys,
|
||||
chatSessions: state.chatSessions,
|
||||
chatHistoryOpen: state.chatHistoryOpen,
|
||||
maxConcurrency: state.maxConcurrency, // Added for concurrency feature
|
||||
});
|
||||
```
|
||||
|
||||
Note: `currentView` is NOT persisted - it's managed through actions.
|
||||
|
||||
### Issue: Auto mode task lifecycle
|
||||
|
||||
**Problem:** Confusion about when features are considered "running" vs "in_progress".
|
||||
**Fix:** Understand the task lifecycle:
|
||||
|
||||
1. Feature dragged to "in_progress" column → status becomes "in_progress"
|
||||
2. `auto_mode_feature_start` event fires → feature added to `runningAutoTasks`
|
||||
3. Agent works on feature → periodic events sent
|
||||
4. `auto_mode_feature_complete` event fires → feature removed from `runningAutoTasks`
|
||||
5. If `passes: true` → status becomes "verified", if `passes: false` → stays "in_progress"
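A compact sketch of these transitions (types and handlers here are illustrative; the real logic lives in the app store and auto mode service):

```typescript
type FeatureStatus = "backlog" | "in_progress" | "waiting_approval" | "verified";

interface BoardState {
  statuses: Map<string, FeatureStatus>; // featureId -> column
  runningAutoTasks: Set<string>;        // featureIds currently running
}

// auto_mode_feature_start: fires after the card lands in "in_progress".
function onFeatureStart(state: BoardState, featureId: string): void {
  state.statuses.set(featureId, "in_progress");
  state.runningAutoTasks.add(featureId);
}

// auto_mode_feature_complete: feature stops running; verified only if it passed.
function onFeatureComplete(
  state: BoardState,
  featureId: string,
  passes: boolean
): void {
  state.runningAutoTasks.delete(featureId);
  state.statuses.set(featureId, passes ? "verified" : "in_progress");
}
```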
|
||||
|
||||
### Issue: waiting_approval features not draggable when skipTests=true
|
||||
|
||||
**Problem:** Features in `waiting_approval` status couldn't be dragged to `verified` column, even though the code appeared to handle it.
|
||||
**Fix:** The order of condition checks in `handleDragEnd` matters. The `skipTests` check was catching `waiting_approval` features before the `waiting_approval` status check could handle them. Move the `waiting_approval` status check **before** the `skipTests` check in `board-view.tsx`:
|
||||
|
||||
```typescript
|
||||
// Correct order in handleDragEnd:
|
||||
if (draggedFeature.status === "backlog") {
|
||||
// ...
|
||||
} else if (draggedFeature.status === "waiting_approval") {
|
||||
// Handle waiting_approval BEFORE skipTests check
|
||||
// because waiting_approval features often have skipTests=true
|
||||
} else if (draggedFeature.skipTests) {
|
||||
// Handle other skipTests features
|
||||
}
|
||||
```
|
||||
|
||||
## Best Practices Discovered
|
||||
|
||||
### Testing utilities are critical
|
||||
|
||||
Create comprehensive testing utilities in `tests/utils.ts` to avoid repeating selector logic:
|
||||
|
||||
- `waitForElement` - waits for elements to appear
|
||||
- `waitForElementHidden` - waits for elements to disappear
|
||||
- `setupMockProject` - sets up mock localStorage state
|
||||
- `navigateToBoard` - handles navigation from welcome to board view
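A sketch of what the first two helpers can look like (signatures are assumptions; check `tests/utils.ts` for the real ones):

```typescript
import { expect, type Page } from "@playwright/test";

// Wait until an element identified by data-testid becomes visible.
export async function waitForElement(
  page: Page,
  testId: string,
  timeout = 10_000
): Promise<void> {
  await expect(page.locator(`[data-testid="${testId}"]`)).toBeVisible({ timeout });
}

// Wait until an element identified by data-testid is hidden or detached.
export async function waitForElementHidden(
  page: Page,
  testId: string,
  timeout = 10_000
): Promise<void> {
  await expect(page.locator(`[data-testid="${testId}"]`)).toBeHidden({ timeout });
}
```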
|
||||
|
||||
### Always add data-testid attributes
|
||||
|
||||
When implementing features, immediately add `data-testid` attributes to key UI elements. This makes tests more reliable and easier to write.
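For example (illustrative component and locator, not taken from the codebase):

```tsx
import * as React from "react";

// Give the element a stable test id when you build it...
export function AddFeatureButton(props: { onClick: () => void }) {
  return (
    <button data-testid="add-feature-button" onClick={props.onClick}>
      Add Feature
    </button>
  );
}

// ...so tests can target it without depending on text or styling:
// await page.getByTestId("add-feature-button").click();
```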
|
||||
|
||||
### Test timeouts should be generous but not excessive
|
||||
|
||||
- Default timeout: 30s (set in playwright.config.ts)
|
||||
- Element waits: 5-15s for critical elements
|
||||
- Auto mode completion: 10s (accounts for ~4s mock duration)
|
||||
- Don't increase timeouts past 10s for individual operations
|
||||
|
||||
### Mock auto mode timing
|
||||
|
||||
The mock auto mode in `electron.ts` has predictable timing:
|
||||
|
||||
- Total duration: ~2.4 seconds (300+500+300+300+500+500ms)
|
||||
- Plus 1.5s delay before auto-closing modals
|
||||
- Total: ~4 seconds from start to completion
|
||||
|
||||
### Issue: HotkeyButton conflicting with useKeyboardShortcuts
|
||||
|
||||
**Problem:** Adding `HotkeyButton` with a simple key (like "N") to buttons that already had keyboard shortcuts registered via `useKeyboardShortcuts` caused the hotkey to stop working. Both registered duplicate listeners, and the HotkeyButton's `stopPropagation()` call could interfere.
|
||||
**Fix:** When a simple single-key hotkey is already handled by `useKeyboardShortcuts`, set `hotkeyActive={false}` on the `HotkeyButton` so it only displays the indicator badge without registering a duplicate listener:
|
||||
|
||||
```tsx
|
||||
// In views that already use useKeyboardShortcuts for the "N" key:
|
||||
<HotkeyButton
|
||||
onClick={() => setShowAddDialog(true)}
|
||||
hotkey={shortcuts.addFeature}
|
||||
hotkeyActive={false} // <-- Important! Prevents duplicate listener
|
||||
>
|
||||
Add Feature
|
||||
</HotkeyButton>
|
||||
|
||||
// HotkeyButton should only actively listen when it's the sole handler (e.g., Cmd+Enter in dialogs)
|
||||
<HotkeyButton
|
||||
onClick={handleSubmit}
|
||||
hotkey={{ key: "Enter", cmdCtrl: true }}
|
||||
hotkeyActive={isDialogOpen} // Active when dialog is open
|
||||
>
|
||||
Submit
|
||||
</HotkeyButton>
|
||||
```
|
||||
.github/scripts/upload-to-r2.js (vendored, new file, 148 lines)
@@ -0,0 +1,148 @@
|
||||
const { S3Client, PutObjectCommand, GetObjectCommand } = require('@aws-sdk/client-s3');
|
||||
const fs = require('fs');
|
||||
const path = require('path');
|
||||
|
||||
const s3Client = new S3Client({
|
||||
region: 'auto',
|
||||
endpoint: `https://${process.env.R2_ACCOUNT_ID}.r2.cloudflarestorage.com`,
|
||||
credentials: {
|
||||
accessKeyId: process.env.R2_ACCESS_KEY_ID,
|
||||
secretAccessKey: process.env.R2_SECRET_ACCESS_KEY,
|
||||
},
|
||||
});
|
||||
|
||||
const BUCKET = process.env.R2_BUCKET_NAME;
|
||||
const PUBLIC_URL = process.env.R2_PUBLIC_URL;
|
||||
const VERSION = process.env.RELEASE_VERSION;
|
||||
const GITHUB_REPO = process.env.GITHUB_REPOSITORY;
|
||||
|
||||
async function fetchExistingReleases() {
|
||||
try {
|
||||
const response = await s3Client.send(new GetObjectCommand({
|
||||
Bucket: BUCKET,
|
||||
Key: 'releases.json',
|
||||
}));
|
||||
const body = await response.Body.transformToString();
|
||||
return JSON.parse(body);
|
||||
} catch (error) {
|
||||
if (error.name === 'NoSuchKey' || error.$metadata?.httpStatusCode === 404) {
|
||||
console.log('No existing releases.json found, creating new one');
|
||||
return { latestVersion: null, releases: [] };
|
||||
}
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
async function uploadFile(localPath, r2Key, contentType) {
|
||||
const fileBuffer = fs.readFileSync(localPath);
|
||||
const stats = fs.statSync(localPath);
|
||||
|
||||
await s3Client.send(new PutObjectCommand({
|
||||
Bucket: BUCKET,
|
||||
Key: r2Key,
|
||||
Body: fileBuffer,
|
||||
ContentType: contentType,
|
||||
}));
|
||||
|
||||
console.log(`Uploaded: ${r2Key} (${stats.size} bytes)`);
|
||||
return stats.size;
|
||||
}
|
||||
|
||||
function findArtifacts(dir, pattern) {
|
||||
if (!fs.existsSync(dir)) return [];
|
||||
const files = fs.readdirSync(dir);
|
||||
return files.filter(f => pattern.test(f)).map(f => path.join(dir, f));
|
||||
}
|
||||
|
||||
async function main() {
|
||||
const artifactsDir = 'artifacts';
|
||||
|
||||
// Find all artifacts
|
||||
const artifacts = {
|
||||
windows: findArtifacts(
|
||||
path.join(artifactsDir, 'windows-builds'),
|
||||
/\.exe$/
|
||||
),
|
||||
macos: findArtifacts(
|
||||
path.join(artifactsDir, 'macos-builds'),
|
||||
/-x64\.dmg$/
|
||||
),
|
||||
macosArm: findArtifacts(
|
||||
path.join(artifactsDir, 'macos-builds'),
|
||||
/-arm64\.dmg$/
|
||||
),
|
||||
linux: findArtifacts(
|
||||
path.join(artifactsDir, 'linux-builds'),
|
||||
/\.AppImage$/
|
||||
),
|
||||
};
|
||||
|
||||
console.log('Found artifacts:');
|
||||
for (const [platform, files] of Object.entries(artifacts)) {
|
||||
console.log(` ${platform}: ${files.length > 0 ? files.map(f => path.basename(f)).join(', ') : 'none'}`);
|
||||
}
|
||||
|
||||
// Upload each artifact to R2
|
||||
const assets = {};
|
||||
const contentTypes = {
|
||||
windows: 'application/x-msdownload',
|
||||
macos: 'application/x-apple-diskimage',
|
||||
macosArm: 'application/x-apple-diskimage',
|
||||
linux: 'application/x-executable',
|
||||
};
|
||||
|
||||
for (const [platform, files] of Object.entries(artifacts)) {
|
||||
if (files.length === 0) {
|
||||
console.warn(`Warning: No artifact found for ${platform}`);
|
||||
continue;
|
||||
}
|
||||
|
||||
// Use the first matching file for each platform
|
||||
const localPath = files[0];
|
||||
const filename = path.basename(localPath);
|
||||
const r2Key = `releases/${VERSION}/${filename}`;
|
||||
const size = await uploadFile(localPath, r2Key, contentTypes[platform]);
|
||||
|
||||
assets[platform] = {
|
||||
url: `${PUBLIC_URL}/releases/${VERSION}/${filename}`,
|
||||
filename,
|
||||
size,
|
||||
arch: platform === 'macosArm' ? 'arm64' : 'x64',
|
||||
};
|
||||
}
|
||||
|
||||
// Fetch and update releases.json
|
||||
const releasesData = await fetchExistingReleases();
|
||||
|
||||
const newRelease = {
|
||||
version: VERSION,
|
||||
date: new Date().toISOString(),
|
||||
assets,
|
||||
githubReleaseUrl: `https://github.com/${GITHUB_REPO}/releases/tag/${VERSION}`,
|
||||
};
|
||||
|
||||
// Remove existing entry for this version if re-running
|
||||
releasesData.releases = releasesData.releases.filter(r => r.version !== VERSION);
|
||||
|
||||
// Prepend new release
|
||||
releasesData.releases.unshift(newRelease);
|
||||
releasesData.latestVersion = VERSION;
|
||||
|
||||
// Upload updated releases.json
|
||||
await s3Client.send(new PutObjectCommand({
|
||||
Bucket: BUCKET,
|
||||
Key: 'releases.json',
|
||||
Body: JSON.stringify(releasesData, null, 2),
|
||||
ContentType: 'application/json',
|
||||
CacheControl: 'public, max-age=60',
|
||||
}));
|
||||
|
||||
console.log('Successfully updated releases.json');
|
||||
console.log(`Latest version: ${VERSION}`);
|
||||
console.log(`Total releases: ${releasesData.releases.length}`);
|
||||
}
|
||||
|
||||
main().catch(err => {
|
||||
console.error('Failed to upload to R2:', err);
|
||||
process.exit(1);
|
||||
});
|
||||
.github/workflows/release.yml (vendored, 59 lines changed)
@@ -19,10 +19,13 @@ jobs:
|
||||
include:
|
||||
- os: macos-latest
|
||||
name: macOS
|
||||
artifact-name: macos-builds
|
||||
- os: windows-latest
|
||||
name: Windows
|
||||
artifact-name: windows-builds
|
||||
- os: ubuntu-latest
|
||||
name: Linux
|
||||
artifact-name: linux-builds
|
||||
|
||||
runs-on: ${{ matrix.os }}
|
||||
|
||||
@@ -78,3 +81,59 @@ jobs:
|
||||
prerelease: false
|
||||
env:
|
||||
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
|
||||
- name: Upload macOS artifacts for R2
|
||||
if: matrix.os == 'macos-latest'
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: ${{ matrix.artifact-name }}
|
||||
path: apps/app/dist/*.dmg
|
||||
retention-days: 1
|
||||
|
||||
- name: Upload Windows artifacts for R2
|
||||
if: matrix.os == 'windows-latest'
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: ${{ matrix.artifact-name }}
|
||||
path: apps/app/dist/*.exe
|
||||
retention-days: 1
|
||||
|
||||
- name: Upload Linux artifacts for R2
|
||||
if: matrix.os == 'ubuntu-latest'
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: ${{ matrix.artifact-name }}
|
||||
path: apps/app/dist/*.AppImage
|
||||
retention-days: 1
|
||||
|
||||
upload-to-r2:
|
||||
needs: build-and-release
|
||||
runs-on: ubuntu-latest
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v4
|
||||
|
||||
- name: Setup Node.js
|
||||
uses: actions/setup-node@v4
|
||||
with:
|
||||
node-version: "20"
|
||||
|
||||
- name: Download all artifacts
|
||||
uses: actions/download-artifact@v4
|
||||
with:
|
||||
path: artifacts
|
||||
|
||||
- name: Install AWS SDK
|
||||
run: npm install @aws-sdk/client-s3
|
||||
|
||||
- name: Upload to R2 and update releases.json
|
||||
env:
|
||||
R2_ACCOUNT_ID: ${{ secrets.R2_ACCOUNT_ID }}
|
||||
R2_ACCESS_KEY_ID: ${{ secrets.R2_ACCESS_KEY_ID }}
|
||||
R2_SECRET_ACCESS_KEY: ${{ secrets.R2_SECRET_ACCESS_KEY }}
|
||||
R2_BUCKET_NAME: ${{ secrets.R2_BUCKET_NAME }}
|
||||
R2_PUBLIC_URL: ${{ secrets.R2_PUBLIC_URL }}
|
||||
RELEASE_VERSION: ${{ github.event.inputs.version || github.ref_name }}
|
||||
GITHUB_REPOSITORY: ${{ github.repository }}
|
||||
run: node .github/scripts/upload-to-r2.js
|
||||
|
||||
.gitignore (vendored, 5 lines changed)
@@ -7,3 +7,8 @@ node_modules/
|
||||
# Build outputs
|
||||
dist/
|
||||
.next/
|
||||
node_modules
|
||||
.automaker/images/
|
||||
.automaker/
|
||||
/.automaker/*
|
||||
/.automaker/
|
||||
|
||||
apps/app/electron/main-simplified.js (new file, 241 lines)
@@ -0,0 +1,241 @@
|
||||
/**
|
||||
* Simplified Electron main process
|
||||
*
|
||||
* This version spawns the backend server and uses HTTP API for most operations.
|
||||
* Only native features (dialogs, shell) use IPC.
|
||||
*/
|
||||
|
||||
const path = require("path");
|
||||
const { spawn } = require("child_process");
|
||||
|
||||
// Load environment variables from .env file
|
||||
require("dotenv").config({ path: path.join(__dirname, "../.env") });
|
||||
|
||||
const { app, BrowserWindow, ipcMain, dialog, shell } = require("electron");
|
||||
|
||||
let mainWindow = null;
|
||||
let serverProcess = null;
|
||||
const SERVER_PORT = 3008;
|
||||
|
||||
// Get icon path - works in both dev and production
|
||||
function getIconPath() {
|
||||
return app.isPackaged
|
||||
? path.join(process.resourcesPath, "app", "public", "logo.png")
|
||||
: path.join(__dirname, "../public/logo.png");
|
||||
}
|
||||
|
||||
/**
|
||||
* Start the backend server
|
||||
*/
|
||||
async function startServer() {
|
||||
const isDev = !app.isPackaged;
|
||||
|
||||
// Server entry point
|
||||
const serverPath = isDev
|
||||
? path.join(__dirname, "../../server/dist/index.js")
|
||||
: path.join(process.resourcesPath, "server", "index.js");
|
||||
|
||||
// Set environment variables for server
|
||||
const env = {
|
||||
...process.env,
|
||||
PORT: SERVER_PORT.toString(),
|
||||
DATA_DIR: app.getPath("userData"),
|
||||
};
|
||||
|
||||
console.log("[Electron] Starting backend server...");
|
||||
|
||||
serverProcess = spawn("node", [serverPath], {
|
||||
env,
|
||||
stdio: ["ignore", "pipe", "pipe"],
|
||||
});
|
||||
|
||||
serverProcess.stdout.on("data", (data) => {
|
||||
console.log(`[Server] ${data.toString().trim()}`);
|
||||
});
|
||||
|
||||
serverProcess.stderr.on("data", (data) => {
|
||||
console.error(`[Server Error] ${data.toString().trim()}`);
|
||||
});
|
||||
|
||||
serverProcess.on("close", (code) => {
|
||||
console.log(`[Server] Process exited with code ${code}`);
|
||||
serverProcess = null;
|
||||
});
|
||||
|
||||
// Wait for server to be ready
|
||||
await waitForServer();
|
||||
}
|
||||
|
||||
/**
|
||||
* Wait for server to be available
|
||||
*/
|
||||
async function waitForServer(maxAttempts = 30) {
|
||||
const http = require("http");
|
||||
|
||||
for (let i = 0; i < maxAttempts; i++) {
|
||||
try {
|
||||
await new Promise((resolve, reject) => {
|
||||
const req = http.get(`http://localhost:${SERVER_PORT}/api/health`, (res) => {
|
||||
if (res.statusCode === 200) {
|
||||
resolve();
|
||||
} else {
|
||||
reject(new Error(`Status: ${res.statusCode}`));
|
||||
}
|
||||
});
|
||||
req.on("error", reject);
|
||||
req.setTimeout(1000, () => {
|
||||
req.destroy();
|
||||
reject(new Error("Timeout"));
|
||||
});
|
||||
});
|
||||
console.log("[Electron] Server is ready");
|
||||
return;
|
||||
} catch {
|
||||
await new Promise((r) => setTimeout(r, 500));
|
||||
}
|
||||
}
|
||||
|
||||
throw new Error("Server failed to start");
|
||||
}
|
||||
|
||||
/**
|
||||
* Create the main window
|
||||
*/
|
||||
function createWindow() {
|
||||
mainWindow = new BrowserWindow({
|
||||
width: 1400,
|
||||
height: 900,
|
||||
minWidth: 1024,
|
||||
minHeight: 700,
|
||||
icon: getIconPath(),
|
||||
webPreferences: {
|
||||
preload: path.join(__dirname, "preload-simplified.js"),
|
||||
contextIsolation: true,
|
||||
nodeIntegration: false,
|
||||
},
|
||||
titleBarStyle: "hiddenInset",
|
||||
backgroundColor: "#0a0a0a",
|
||||
});
|
||||
|
||||
// Load Next.js dev server in development or production build
|
||||
const isDev = !app.isPackaged;
|
||||
if (isDev) {
|
||||
mainWindow.loadURL("http://localhost:3007");
|
||||
if (process.env.OPEN_DEVTOOLS === "true") {
|
||||
mainWindow.webContents.openDevTools();
|
||||
}
|
||||
} else {
|
||||
mainWindow.loadFile(path.join(__dirname, "../.next/server/app/index.html"));
|
||||
}
|
||||
|
||||
mainWindow.on("closed", () => {
|
||||
mainWindow = null;
|
||||
});
|
||||
}
|
||||
|
||||
// App lifecycle
|
||||
app.whenReady().then(async () => {
|
||||
// Set app icon (dock icon on macOS)
|
||||
if (process.platform === "darwin" && app.dock) {
|
||||
app.dock.setIcon(getIconPath());
|
||||
}
|
||||
|
||||
try {
|
||||
// Start backend server
|
||||
await startServer();
|
||||
|
||||
// Create window
|
||||
createWindow();
|
||||
} catch (error) {
|
||||
console.error("[Electron] Failed to start:", error);
|
||||
app.quit();
|
||||
}
|
||||
|
||||
app.on("activate", () => {
|
||||
if (BrowserWindow.getAllWindows().length === 0) {
|
||||
createWindow();
|
||||
}
|
||||
});
|
||||
});
|
||||
|
||||
app.on("window-all-closed", () => {
|
||||
if (process.platform !== "darwin") {
|
||||
app.quit();
|
||||
}
|
||||
});
|
||||
|
||||
app.on("before-quit", () => {
|
||||
// Kill server process
|
||||
if (serverProcess) {
|
||||
console.log("[Electron] Stopping server...");
|
||||
serverProcess.kill();
|
||||
serverProcess = null;
|
||||
}
|
||||
});
|
||||
|
||||
// ============================================
|
||||
// IPC Handlers - Only native features
|
||||
// ============================================
|
||||
|
||||
// Native file dialogs
|
||||
ipcMain.handle("dialog:openDirectory", async () => {
|
||||
const result = await dialog.showOpenDialog(mainWindow, {
|
||||
properties: ["openDirectory", "createDirectory"],
|
||||
});
|
||||
return result;
|
||||
});
|
||||
|
||||
ipcMain.handle("dialog:openFile", async (_, options = {}) => {
|
||||
const result = await dialog.showOpenDialog(mainWindow, {
|
||||
properties: ["openFile"],
|
||||
...options,
|
||||
});
|
||||
return result;
|
||||
});
|
||||
|
||||
ipcMain.handle("dialog:saveFile", async (_, options = {}) => {
|
||||
const result = await dialog.showSaveDialog(mainWindow, options);
|
||||
return result;
|
||||
});
|
||||
|
||||
// Shell operations
|
||||
ipcMain.handle("shell:openExternal", async (_, url) => {
|
||||
try {
|
||||
await shell.openExternal(url);
|
||||
return { success: true };
|
||||
} catch (error) {
|
||||
return { success: false, error: error.message };
|
||||
}
|
||||
});
|
||||
|
||||
ipcMain.handle("shell:openPath", async (_, filePath) => {
|
||||
try {
|
||||
await shell.openPath(filePath);
|
||||
return { success: true };
|
||||
} catch (error) {
|
||||
return { success: false, error: error.message };
|
||||
}
|
||||
});
|
||||
|
||||
// App info
|
||||
ipcMain.handle("app:getPath", async (_, name) => {
|
||||
return app.getPath(name);
|
||||
});
|
||||
|
||||
ipcMain.handle("app:getVersion", async () => {
|
||||
return app.getVersion();
|
||||
});
|
||||
|
||||
ipcMain.handle("app:isPackaged", async () => {
|
||||
return app.isPackaged;
|
||||
});
|
||||
|
||||
// Ping - for connection check
|
||||
ipcMain.handle("ping", async () => {
|
||||
return "pong";
|
||||
});
|
||||
|
||||
// Get server URL for HTTP client
|
||||
ipcMain.handle("server:getUrl", async () => {
|
||||
return `http://localhost:${SERVER_PORT}`;
|
||||
});
|
||||
(File diff suppressed because it is too large.)
apps/app/electron/preload-simplified.js (new file, 37 lines)
@@ -0,0 +1,37 @@
|
||||
/**
|
||||
* Simplified Electron preload script
|
||||
*
|
||||
* Only exposes native features (dialogs, shell) and server URL.
|
||||
* All other operations go through HTTP API.
|
||||
*/
|
||||
|
||||
const { contextBridge, ipcRenderer } = require("electron");
|
||||
|
||||
// Expose minimal API for native features
|
||||
contextBridge.exposeInMainWorld("electronAPI", {
|
||||
// Platform info
|
||||
platform: process.platform,
|
||||
isElectron: true,
|
||||
|
||||
// Connection check
|
||||
ping: () => ipcRenderer.invoke("ping"),
|
||||
|
||||
// Get server URL for HTTP client
|
||||
getServerUrl: () => ipcRenderer.invoke("server:getUrl"),
|
||||
|
||||
// Native dialogs - better UX than prompt()
|
||||
openDirectory: () => ipcRenderer.invoke("dialog:openDirectory"),
|
||||
openFile: (options) => ipcRenderer.invoke("dialog:openFile", options),
|
||||
saveFile: (options) => ipcRenderer.invoke("dialog:saveFile", options),
|
||||
|
||||
// Shell operations
|
||||
openExternalLink: (url) => ipcRenderer.invoke("shell:openExternal", url),
|
||||
openPath: (filePath) => ipcRenderer.invoke("shell:openPath", filePath),
|
||||
|
||||
// App info
|
||||
getPath: (name) => ipcRenderer.invoke("app:getPath", name),
|
||||
getVersion: () => ipcRenderer.invoke("app:getVersion"),
|
||||
isPackaged: () => ipcRenderer.invoke("app:isPackaged"),
|
||||
});
|
||||
|
||||
console.log("[Preload] Electron API exposed (simplified mode)");
|
||||
@@ -1,404 +1,10 @@
|
||||
const { contextBridge, ipcRenderer } = require("electron");
|
||||
const { contextBridge } = require("electron");
|
||||
|
||||
// Expose protected methods that allow the renderer process to use
|
||||
// the ipcRenderer without exposing the entire object
|
||||
contextBridge.exposeInMainWorld("electronAPI", {
|
||||
// IPC test
|
||||
ping: () => ipcRenderer.invoke("ping"),
|
||||
|
||||
// Shell APIs
|
||||
openExternalLink: (url) => ipcRenderer.invoke("shell:openExternal", url),
|
||||
|
||||
// Dialog APIs
|
||||
openDirectory: () => ipcRenderer.invoke("dialog:openDirectory"),
|
||||
openFile: (options) => ipcRenderer.invoke("dialog:openFile", options),
|
||||
|
||||
// File system APIs
|
||||
readFile: (filePath) => ipcRenderer.invoke("fs:readFile", filePath),
|
||||
writeFile: (filePath, content) =>
|
||||
ipcRenderer.invoke("fs:writeFile", filePath, content),
|
||||
mkdir: (dirPath) => ipcRenderer.invoke("fs:mkdir", dirPath),
|
||||
readdir: (dirPath) => ipcRenderer.invoke("fs:readdir", dirPath),
|
||||
exists: (filePath) => ipcRenderer.invoke("fs:exists", filePath),
|
||||
stat: (filePath) => ipcRenderer.invoke("fs:stat", filePath),
|
||||
deleteFile: (filePath) => ipcRenderer.invoke("fs:deleteFile", filePath),
|
||||
trashItem: (filePath) => ipcRenderer.invoke("fs:trashItem", filePath),
|
||||
|
||||
// App APIs
|
||||
getPath: (name) => ipcRenderer.invoke("app:getPath", name),
|
||||
saveImageToTemp: (data, filename, mimeType, projectPath) =>
|
||||
ipcRenderer.invoke("app:saveImageToTemp", {
|
||||
data,
|
||||
filename,
|
||||
mimeType,
|
||||
projectPath,
|
||||
}),
|
||||
|
||||
// Agent APIs
|
||||
agent: {
|
||||
// Start or resume a conversation
|
||||
start: (sessionId, workingDirectory) =>
|
||||
ipcRenderer.invoke("agent:start", { sessionId, workingDirectory }),
|
||||
|
||||
// Send a message to the agent
|
||||
send: (sessionId, message, workingDirectory, imagePaths) =>
|
||||
ipcRenderer.invoke("agent:send", {
|
||||
sessionId,
|
||||
message,
|
||||
workingDirectory,
|
||||
imagePaths,
|
||||
}),
|
||||
|
||||
// Get conversation history
|
||||
getHistory: (sessionId) =>
|
||||
ipcRenderer.invoke("agent:getHistory", { sessionId }),
|
||||
|
||||
// Stop current execution
|
||||
stop: (sessionId) => ipcRenderer.invoke("agent:stop", { sessionId }),
|
||||
|
||||
// Clear conversation
|
||||
clear: (sessionId) => ipcRenderer.invoke("agent:clear", { sessionId }),
|
||||
|
||||
// Subscribe to streaming events
|
||||
onStream: (callback) => {
|
||||
const subscription = (_, data) => callback(data);
|
||||
ipcRenderer.on("agent:stream", subscription);
|
||||
// Return unsubscribe function
|
||||
return () => ipcRenderer.removeListener("agent:stream", subscription);
|
||||
},
|
||||
},
|
||||
|
||||
// Session Management APIs
|
||||
sessions: {
|
||||
// List all sessions
|
||||
list: (includeArchived) =>
|
||||
ipcRenderer.invoke("sessions:list", { includeArchived }),
|
||||
|
||||
// Create a new session
|
||||
create: (name, projectPath, workingDirectory) =>
|
||||
ipcRenderer.invoke("sessions:create", {
|
||||
name,
|
||||
projectPath,
|
||||
workingDirectory,
|
||||
}),
|
||||
|
||||
// Update session metadata
|
||||
update: (sessionId, name, tags) =>
|
||||
ipcRenderer.invoke("sessions:update", { sessionId, name, tags }),
|
||||
|
||||
// Archive a session
|
||||
archive: (sessionId) =>
|
||||
ipcRenderer.invoke("sessions:archive", { sessionId }),
|
||||
|
||||
// Unarchive a session
|
||||
unarchive: (sessionId) =>
|
||||
ipcRenderer.invoke("sessions:unarchive", { sessionId }),
|
||||
|
||||
// Delete a session permanently
|
||||
delete: (sessionId) => ipcRenderer.invoke("sessions:delete", { sessionId }),
|
||||
},
|
||||
|
||||
// Auto Mode API
|
||||
autoMode: {
|
||||
// Start auto mode for a specific project
|
||||
start: (projectPath, maxConcurrency) =>
|
||||
ipcRenderer.invoke("auto-mode:start", { projectPath, maxConcurrency }),
|
||||
|
||||
// Stop auto mode for a specific project
|
||||
stop: (projectPath) => ipcRenderer.invoke("auto-mode:stop", { projectPath }),
|
||||
|
||||
// Get auto mode status (optionally for a specific project)
|
||||
status: (projectPath) => ipcRenderer.invoke("auto-mode:status", { projectPath }),
|
||||
|
||||
// Run a specific feature
|
||||
runFeature: (projectPath, featureId, useWorktrees) =>
|
||||
ipcRenderer.invoke("auto-mode:run-feature", {
|
||||
projectPath,
|
||||
featureId,
|
||||
useWorktrees,
|
||||
}),
|
||||
|
||||
// Verify a specific feature by running its tests
|
||||
verifyFeature: (projectPath, featureId) =>
|
||||
ipcRenderer.invoke("auto-mode:verify-feature", {
|
||||
projectPath,
|
||||
featureId,
|
||||
}),
|
||||
|
||||
// Resume a specific feature with previous context
|
||||
resumeFeature: (projectPath, featureId) =>
|
||||
ipcRenderer.invoke("auto-mode:resume-feature", {
|
||||
projectPath,
|
||||
featureId,
|
||||
}),
|
||||
|
||||
// Check if context file exists for a feature
|
||||
contextExists: (projectPath, featureId) =>
|
||||
ipcRenderer.invoke("auto-mode:context-exists", {
|
||||
projectPath,
|
||||
featureId,
|
||||
}),
|
||||
|
||||
// Analyze a new project - kicks off an agent to analyze codebase
|
||||
analyzeProject: (projectPath) =>
|
||||
ipcRenderer.invoke("auto-mode:analyze-project", { projectPath }),
|
||||
|
||||
// Stop a specific feature
|
||||
stopFeature: (featureId) =>
|
||||
ipcRenderer.invoke("auto-mode:stop-feature", { featureId }),
|
||||
|
||||
// Follow-up on a feature with additional prompt
|
||||
followUpFeature: (projectPath, featureId, prompt, imagePaths) =>
|
||||
ipcRenderer.invoke("auto-mode:follow-up-feature", {
|
||||
projectPath,
|
||||
featureId,
|
||||
prompt,
|
||||
imagePaths,
|
||||
}),
|
||||
|
||||
// Commit changes for a feature
|
||||
commitFeature: (projectPath, featureId) =>
|
||||
ipcRenderer.invoke("auto-mode:commit-feature", {
|
||||
projectPath,
|
||||
featureId,
|
||||
}),
|
||||
|
||||
// Listen for auto mode events
|
||||
onEvent: (callback) => {
|
||||
const subscription = (_, data) => callback(data);
|
||||
ipcRenderer.on("auto-mode:event", subscription);
|
||||
|
||||
// Return unsubscribe function
|
||||
return () => {
|
||||
ipcRenderer.removeListener("auto-mode:event", subscription);
|
||||
};
|
||||
},
|
||||
},
|
||||
|
||||
// Claude CLI Detection API
|
||||
checkClaudeCli: () => ipcRenderer.invoke("claude:check-cli"),
|
||||
|
||||
// Codex CLI Detection API
|
||||
checkCodexCli: () => ipcRenderer.invoke("codex:check-cli"),
|
||||
|
||||
// Model Management APIs
|
||||
model: {
|
||||
// Get all available models from all providers
|
||||
getAvailable: () => ipcRenderer.invoke("model:get-available"),
|
||||
|
||||
// Check all provider installation status
|
||||
checkProviders: () => ipcRenderer.invoke("model:check-providers"),
|
||||
},
|
||||
|
||||
// OpenAI API
|
||||
testOpenAIConnection: (apiKey) =>
|
||||
ipcRenderer.invoke("openai:test-connection", { apiKey }),
|
||||
|
||||
// Worktree Management APIs
|
||||
worktree: {
|
||||
// Revert feature changes by removing the worktree
|
||||
revertFeature: (projectPath, featureId) =>
|
||||
ipcRenderer.invoke("worktree:revert-feature", { projectPath, featureId }),
|
||||
|
||||
// Merge feature worktree changes back to main branch
|
||||
mergeFeature: (projectPath, featureId, options) =>
|
||||
ipcRenderer.invoke("worktree:merge-feature", {
|
||||
projectPath,
|
||||
featureId,
|
||||
options,
|
||||
}),
|
||||
|
||||
// Get worktree info for a feature
|
||||
getInfo: (projectPath, featureId) =>
|
||||
ipcRenderer.invoke("worktree:get-info", { projectPath, featureId }),
|
||||
|
||||
// Get worktree status (changed files, commits)
|
||||
getStatus: (projectPath, featureId) =>
|
||||
ipcRenderer.invoke("worktree:get-status", { projectPath, featureId }),
|
||||
|
||||
// List all feature worktrees
|
||||
list: (projectPath) => ipcRenderer.invoke("worktree:list", { projectPath }),
|
||||
|
||||
// Get file diffs for a feature worktree
|
||||
getDiffs: (projectPath, featureId) =>
|
||||
ipcRenderer.invoke("worktree:get-diffs", { projectPath, featureId }),
|
||||
|
||||
// Get diff for a specific file in a worktree
|
||||
getFileDiff: (projectPath, featureId, filePath) =>
|
||||
ipcRenderer.invoke("worktree:get-file-diff", {
|
||||
projectPath,
|
||||
featureId,
|
||||
filePath,
|
||||
}),
|
||||
},
|
||||
|
||||
// Git Operations APIs (for non-worktree operations)
|
||||
git: {
|
||||
// Get file diffs for the main project
|
||||
getDiffs: (projectPath) =>
|
||||
ipcRenderer.invoke("git:get-diffs", { projectPath }),
|
||||
|
||||
// Get diff for a specific file in the main project
|
||||
getFileDiff: (projectPath, filePath) =>
|
||||
ipcRenderer.invoke("git:get-file-diff", { projectPath, filePath }),
|
||||
},
|
||||
|
||||
// Feature Suggestions API
|
||||
suggestions: {
|
||||
// Generate feature suggestions
|
||||
// suggestionType can be: "features", "refactoring", "security", "performance"
|
||||
generate: (projectPath, suggestionType = "features") =>
|
||||
ipcRenderer.invoke("suggestions:generate", { projectPath, suggestionType }),
|
||||
|
||||
// Stop generating suggestions
|
||||
stop: () => ipcRenderer.invoke("suggestions:stop"),
|
||||
|
||||
// Get suggestions status
|
||||
status: () => ipcRenderer.invoke("suggestions:status"),
|
||||
|
||||
// Listen for suggestions events
|
||||
onEvent: (callback) => {
|
||||
const subscription = (_, data) => callback(data);
|
||||
ipcRenderer.on("suggestions:event", subscription);
|
||||
|
||||
// Return unsubscribe function
|
||||
return () => {
|
||||
ipcRenderer.removeListener("suggestions:event", subscription);
|
||||
};
|
||||
},
|
||||
},
|
||||
|
||||
// Spec Regeneration API
|
||||
specRegeneration: {
|
||||
// Create initial app spec for a new project
|
||||
create: (projectPath, projectOverview, generateFeatures = true) =>
|
||||
ipcRenderer.invoke("spec-regeneration:create", {
|
||||
projectPath,
|
||||
projectOverview,
|
||||
generateFeatures,
|
||||
}),
|
||||
|
||||
// Regenerate the app spec
|
||||
generate: (projectPath, projectDefinition) =>
|
||||
ipcRenderer.invoke("spec-regeneration:generate", {
|
||||
projectPath,
|
||||
projectDefinition,
|
||||
}),
|
||||
|
||||
// Generate features from existing app_spec.txt
|
||||
generateFeatures: (projectPath) =>
|
||||
ipcRenderer.invoke("spec-regeneration:generate-features", {
|
||||
projectPath,
|
||||
}),
|
||||
|
||||
// Stop regenerating spec
|
||||
stop: () => ipcRenderer.invoke("spec-regeneration:stop"),
|
||||
|
||||
// Get regeneration status
|
||||
status: () => ipcRenderer.invoke("spec-regeneration:status"),
|
||||
|
||||
// Listen for regeneration events
|
||||
onEvent: (callback) => {
|
||||
const subscription = (_, data) => callback(data);
|
||||
ipcRenderer.on("spec-regeneration:event", subscription);
|
||||
|
||||
// Return unsubscribe function
|
||||
return () => {
|
||||
ipcRenderer.removeListener("spec-regeneration:event", subscription);
|
||||
};
|
||||
},
|
||||
},
|
||||
|
||||
// Setup & CLI Management API
|
||||
setup: {
|
||||
// Get comprehensive Claude CLI status
|
||||
getClaudeStatus: () => ipcRenderer.invoke("setup:claude-status"),
|
||||
|
||||
// Get comprehensive Codex CLI status
|
||||
getCodexStatus: () => ipcRenderer.invoke("setup:codex-status"),
|
||||
|
||||
// Install Claude CLI
|
||||
installClaude: () => ipcRenderer.invoke("setup:install-claude"),
|
||||
|
||||
// Install Codex CLI
|
||||
installCodex: () => ipcRenderer.invoke("setup:install-codex"),
|
||||
|
||||
// Authenticate Claude CLI
|
||||
authClaude: () => ipcRenderer.invoke("setup:auth-claude"),
|
||||
|
||||
// Authenticate Codex CLI with optional API key
|
||||
authCodex: (apiKey) => ipcRenderer.invoke("setup:auth-codex", { apiKey }),
|
||||
|
||||
// Store API key securely
|
||||
storeApiKey: (provider, apiKey) =>
|
||||
ipcRenderer.invoke("setup:store-api-key", { provider, apiKey }),
|
||||
|
||||
// Get stored API keys status
|
||||
getApiKeys: () => ipcRenderer.invoke("setup:get-api-keys"),
|
||||
|
||||
// Configure Codex MCP server for a project
|
||||
configureCodexMcp: (projectPath) =>
|
||||
ipcRenderer.invoke("setup:configure-codex-mcp", { projectPath }),
|
||||
|
||||
// Get platform information
|
||||
getPlatform: () => ipcRenderer.invoke("setup:get-platform"),
|
||||
|
||||
// Listen for installation progress
|
||||
onInstallProgress: (callback) => {
|
||||
const subscription = (_, data) => callback(data);
|
||||
ipcRenderer.on("setup:install-progress", subscription);
|
||||
return () => {
|
||||
ipcRenderer.removeListener("setup:install-progress", subscription);
|
||||
};
|
||||
},
|
||||
|
||||
// Listen for auth progress
|
||||
onAuthProgress: (callback) => {
|
||||
const subscription = (_, data) => callback(data);
|
||||
ipcRenderer.on("setup:auth-progress", subscription);
|
||||
return () => {
|
||||
ipcRenderer.removeListener("setup:auth-progress", subscription);
|
||||
};
|
||||
},
|
||||
},
|
||||
|
||||
// Features API
|
||||
features: {
|
||||
// Get all features for a project
|
||||
getAll: (projectPath) =>
|
||||
ipcRenderer.invoke("features:getAll", { projectPath }),
|
||||
|
||||
// Get a single feature by ID
|
||||
get: (projectPath, featureId) =>
|
||||
ipcRenderer.invoke("features:get", { projectPath, featureId }),
|
||||
|
||||
// Create a new feature
|
||||
create: (projectPath, feature) =>
|
||||
ipcRenderer.invoke("features:create", { projectPath, feature }),
|
||||
|
||||
// Update a feature (partial updates supported)
|
||||
update: (projectPath, featureId, updates) =>
|
||||
ipcRenderer.invoke("features:update", {
|
||||
projectPath,
|
||||
featureId,
|
||||
updates,
|
||||
}),
|
||||
|
||||
// Delete a feature and its folder
|
||||
delete: (projectPath, featureId) =>
|
||||
ipcRenderer.invoke("features:delete", { projectPath, featureId }),
|
||||
|
||||
// Get agent output for a feature
|
||||
getAgentOutput: (projectPath, featureId) =>
|
||||
ipcRenderer.invoke("features:getAgentOutput", { projectPath, featureId }),
|
||||
},
|
||||
|
||||
// Running Agents API
|
||||
runningAgents: {
|
||||
// Get all running agents across all projects
|
||||
getAll: () => ipcRenderer.invoke("running-agents:getAll"),
|
||||
},
|
||||
});
|
||||
|
||||
// Also expose a flag to detect if we're in Electron
|
||||
// Only expose a flag to detect Electron environment
|
||||
// All API calls go through HTTP to the backend server
|
||||
contextBridge.exposeInMainWorld("isElectron", true);
|
||||
|
||||
// Expose platform info for UI purposes
|
||||
contextBridge.exposeInMainWorld("electronPlatform", process.platform);
|
||||
|
||||
console.log("[Preload] Electron flag exposed (HTTP-only mode)");
|
||||
|
||||
@@ -22,7 +22,7 @@ class McpServerFactory {
"Create or update a feature. Use this tool to create new features with detailed information or update existing feature status. When creating features, provide comprehensive description, category, and implementation steps.",
{
featureId: z.string().describe("The ID of the feature (lowercase, hyphens for spaces). Example: 'user-authentication', 'budget-tracking'"),
status: z.enum(["backlog", "todo", "in_progress", "verified"]).describe("The status for the feature. Use 'backlog' or 'todo' for new features."),
status: z.enum(["backlog", "todo", "in_progress", "verified"]).describe("The status for the feature. For NEW features, ONLY use 'backlog' or 'verified'. NEVER use 'in_progress' for new features - the user will manually start them."),
summary: z.string().optional().describe("A brief summary of what was implemented/changed or what the feature does."),
description: z.string().optional().describe("A detailed description of the feature. Be comprehensive - explain what the feature does, its purpose, and key functionality."),
category: z.string().optional().describe("The category/phase for this feature. Example: 'Phase 1: Foundation', 'Phase 2: Core Logic', 'Phase 3: Polish', 'Authentication', 'UI/UX'"),
@@ -38,14 +38,16 @@ class McpServerFactory {
const feature = features.find((f) => f.id === args.featureId);

if (!feature) {
console.log(`[Feature Creation] Feature ${args.featureId} not found - this might be a new feature being created`);
// This might be a new feature - try to proceed anyway
console.log(`[Feature Creation] Feature ${args.featureId} not found - this is a new feature being created`);
// This is a new feature - enforce backlog status for any non-verified features
}

// If agent tries to mark as verified but feature has skipTests=true, convert to waiting_approval
let finalStatus = args.status;
// Convert 'todo' to 'backlog' for consistency, but only for new features
if (!feature && finalStatus === "todo") {
// For NEW features: Convert 'todo' or 'in_progress' to 'backlog' for consistency
// New features should ALWAYS go to backlog first, user must manually start them
if (!feature && (finalStatus === "todo" || finalStatus === "in_progress")) {
console.log(`[Feature Creation] New feature ${args.featureId} - converting "${finalStatus}" to "backlog" (user must manually start features)`);
finalStatus = "backlog";
}
if (feature && args.status === "verified" && feature.skipTests === true) {
@@ -53,6 +55,15 @@ class McpServerFactory {
finalStatus = "waiting_approval";
}

// IMPORTANT: Prevent agent from moving an in_progress feature back to backlog
// When a feature is being worked on, the agent should only be able to mark it as verified
// (which may be converted to waiting_approval for skipTests features)
// This prevents the agent from incorrectly putting completed work back in the backlog
if (feature && feature.status === "in_progress" && (args.status === "backlog" || args.status === "todo")) {
console.log(`[McpServerFactory] Feature ${args.featureId} is in_progress - preventing move to ${args.status}, converting to waiting_approval instead`);
finalStatus = "waiting_approval";
}

// Call the provided callback to update feature status
await updateFeatureStatusCallback(
args.featureId,

@@ -211,7 +211,16 @@ async function handleToolsCall(params, id) {
if (status === 'verified' && feature.skipTests === true) {
finalStatus = 'waiting_approval';
}

// IMPORTANT: Prevent agent from moving an in_progress feature back to backlog
// When a feature is being worked on, the agent should only be able to mark it as verified
// (which may be converted to waiting_approval for skipTests features)
// This prevents the agent from incorrectly putting completed work back in the backlog
if (feature.status === 'in_progress' && (status === 'backlog' || status === 'todo')) {
console.log(`[McpServerStdio] Feature ${featureId} is in_progress - preventing move to ${status}, converting to waiting_approval instead`);
finalStatus = 'waiting_approval';
}

// Call the update callback via IPC or direct call
// Since we're in a separate process, we need to use IPC to communicate back
// For now, we'll call the feature loader directly since it has the update method

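The same status rules now appear in both McpServerFactory and the stdio server. A hypothetical standalone version of that conversion logic, useful for testing it in isolation (names and types are illustrative, not taken from the codebase):

```ts
type FeatureStatus = "backlog" | "todo" | "in_progress" | "verified" | "waiting_approval";

// Sketch of the normalization enforced in both handlers above.
function normalizeStatus(
  existing: { status: FeatureStatus; skipTests?: boolean } | undefined,
  requested: FeatureStatus
): FeatureStatus {
  // New features never start as todo/in_progress - the user starts them manually.
  if (!existing && (requested === "todo" || requested === "in_progress")) return "backlog";
  // skipTests features cannot be auto-verified; they wait for manual approval.
  if (existing && requested === "verified" && existing.skipTests === true) return "waiting_approval";
  // An in_progress feature must not be pushed back to the backlog by the agent.
  if (existing?.status === "in_progress" && (requested === "backlog" || requested === "todo")) {
    return "waiting_approval";
  }
  return requested;
}
```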
@@ -390,15 +390,13 @@ class SpecRegenerationService {
3. For EACH feature in the implementation_roadmap:
- Determine if it's ALREADY IMPLEMENTED (fully or partially)
- If fully implemented: Create with status "verified" and note what's done
- If partially implemented: Create with status "in_progress" and note remaining work
- If not started: Create with status "backlog"
- If partially implemented OR not started: Create with status "backlog" and note what still needs to be done

**IMPORTANT - For each feature you MUST provide:**
- **featureId**: A descriptive ID (lowercase, hyphens for spaces). Example: "user-authentication", "budget-tracking"
- **status**:
- "verified" if feature is fully implemented in the codebase
- "in_progress" if partially implemented
- "backlog" if not yet started
- **status**:
- "verified" ONLY if feature is 100% fully implemented in the codebase
- "backlog" for ALL features that need ANY work (partial or not started) - the user will manually start these
- **description**: A DETAILED description (2-4 sentences) explaining what the feature does, its purpose, and key functionality
- **category**: The phase from the roadmap (e.g., "Phase 1: Foundation", "Phase 2: Core Logic", "Phase 3: Polish")
- **steps**: An array of 4-8 clear, actionable implementation steps. For verified features, these are what WAS done. For backlog, these are what NEEDS to be done.
@@ -407,10 +405,12 @@ class SpecRegenerationService {
**Example of analyzing existing code:**
If you find NextAuth.js configured in the codebase with working login pages, the user-authentication feature should be "verified" not "backlog".

**Example of a well-defined feature:**
**IMPORTANT: NEVER use "in_progress" status when creating features. Only use "verified" or "backlog".**

**Example of a well-defined feature (verified - fully complete):**
{
"featureId": "user-authentication",
"status": "verified", // Because we found it's already implemented
"status": "verified", // Because we found it's 100% already implemented
"description": "Secure user authentication system with email/password login and session management. Already implemented using NextAuth.js with email provider.",
"category": "Phase 1: Foundation",
"steps": [
@@ -422,6 +422,21 @@ If you find NextAuth.js configured in the codebase with working login pages, the
"summary": "Authentication implemented with NextAuth.js email provider"
}

**Example of a feature that needs work (backlog):**
{
"featureId": "user-profile",
"status": "backlog", // Needs work - user will manually start this
"description": "User profile page where users can view and edit their account settings, change password, and manage preferences.",
"category": "Phase 2: Core Features",
"steps": [
"Create profile page component",
"Add form for editing user details",
"Implement password change functionality",
"Add avatar upload feature"
],
"summary": "User profile management - needs implementation"
}

**Feature Storage:**
Features are stored in .automaker/features/{id}/feature.json - each feature has its own folder.
Use the UpdateFeatureStatus tool to create features with ALL the fields above.`,
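For illustration only (the exact on-disk schema is not shown in this diff): a stored `.automaker/features/user-profile/feature.json` would plausibly carry the same fields the prompt requires, along the lines of the sketch below.

```ts
// Hypothetical shape of a persisted feature record, mirroring the prompt's fields.
const exampleFeature = {
  id: "user-profile",
  status: "backlog",
  description: "User profile page where users can view and edit their account settings.",
  category: "Phase 2: Core Features",
  steps: ["Create profile page component", "Add form for editing user details"],
  summary: "User profile management - needs implementation",
};
```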
@@ -453,13 +468,15 @@ Use the UpdateFeatureStatus tool to create features with ALL the fields above.`,
2. **Then, read .automaker/app_spec.txt** to see the implementation roadmap

3. **For EACH feature in the roadmap, determine its status:**
- Is it ALREADY IMPLEMENTED in the codebase? → status: "verified"
- Is it PARTIALLY IMPLEMENTED? → status: "in_progress"
- Is it NOT STARTED? → status: "backlog"
- Is it 100% FULLY IMPLEMENTED in the codebase? → status: "verified"
- Is it PARTIALLY IMPLEMENTED or NOT STARTED? → status: "backlog"

**CRITICAL: NEVER use "in_progress" status. Only use "verified" or "backlog".**
The user will manually move features from backlog to in_progress when they want to start working on them.

4. **Create each feature with UpdateFeatureStatus including ALL fields:**
- featureId: Descriptive ID (lowercase, hyphens)
- status: "verified", "in_progress", or "backlog" based on your analysis
- status: "verified" or "backlog" ONLY (never in_progress)
- description: 2-4 sentences explaining the feature
- category: The phase name from the roadmap
- steps: Array of 4-8 implementation steps
@@ -588,7 +605,7 @@ You should:
4. Based on the user's project overview, create a comprehensive app specification
5. Be liberal and comprehensive when defining features - include everything needed for a complete, polished application
6. Use the XML template format provided
7. Write the specification to .automaker/app_spec.txt
7. **MANDATORY: Write the spec to EXACTLY \`.automaker/app_spec.txt\` - this exact filename, no alternatives**

When analyzing, look at:
- package.json, cargo.toml, requirements.txt or similar config files for tech stack
@@ -598,11 +615,17 @@ When analyzing, look at:
- API structures and patterns

You CAN and SHOULD modify:
- .automaker/app_spec.txt (this is your primary target)
- .automaker/app_spec.txt (this is your ONLY target file - use EXACTLY this filename)

You have access to file reading, writing, and search tools. Use them to understand the codebase and write the new spec.
You have access to file reading, writing, and search tools. Use them to understand the codebase and WRITE the new spec to .automaker/app_spec.txt.

**IMPORTANT:** Focus ONLY on creating the app_spec.txt file. Do NOT create any feature files or use any feature management tools during this phase.`;
**IMPORTANT:** Focus ONLY on creating the app_spec.txt file. Do NOT create any feature files or use any feature management tools during this phase.

**CRITICAL FILE NAMING RULES:**
- The spec file MUST be named exactly \`app_spec.txt\`
- Do NOT create project-spec.md, spec.md, or any other filename
- Do NOT use markdown (.md) extension - use .txt
- The full path must be: \`.automaker/app_spec.txt\``;
}

/**
@@ -639,7 +662,11 @@ ${APP_SPEC_XML_TEMPLATE}
- **development_workflow**: Note any testing or development patterns
- **implementation_roadmap**: Break down the features into phases - be VERY detailed here, listing every feature that needs to be built

4. **IMPORTANT**: Write the complete specification to the file \`.automaker/app_spec.txt\`
4. **MANDATORY FILE WRITE**: You MUST write the spec to EXACTLY this file path: \`.automaker/app_spec.txt\`
- The filename MUST be exactly \`app_spec.txt\` - do NOT use any other name
- Do NOT create \`project-spec.md\`, \`spec.md\`, or any other filename
- Do NOT output the spec in your response - write it to the file
- Use the Write tool with path \`.automaker/app_spec.txt\`

**Guidelines:**
- Be comprehensive! Include ALL features needed for a complete application
@@ -648,8 +675,9 @@ ${APP_SPEC_XML_TEMPLATE}
- The implementation_roadmap should reflect logical phases for building out the app - list EVERY feature individually
- Consider user flows, error states, and edge cases when defining features
- Each phase should have multiple specific, actionable features
- **CRITICAL: Write to EXACTLY \`.automaker/app_spec.txt\` - not project-spec.md or any other name!**

Begin by exploring the project structure.`;
Begin by exploring the project structure, then generate and WRITE the spec to \`.automaker/app_spec.txt\`.`;
}

/**
@@ -848,7 +876,7 @@ You should:
3. Understand the current architecture and patterns used
4. Based on the user's project definition, create a comprehensive app specification that includes ALL features needed to realize their vision
5. Be liberal and comprehensive when defining features - include everything needed for a complete, polished application
6. Write the specification to .automaker/app_spec.txt
6. **MANDATORY: Write the spec to EXACTLY \`.automaker/app_spec.txt\` - this exact filename, no alternatives**

When analyzing, look at:
- package.json, cargo.toml, or similar config files for tech stack
@@ -861,9 +889,15 @@ When analyzing, look at:
Your task is ONLY to update the app_spec.txt file - feature files will be managed separately.

You CAN and SHOULD modify:
- .automaker/app_spec.txt (this is your primary target)
- .automaker/app_spec.txt (this is your ONLY target file - use EXACTLY this filename)

You have access to file reading, writing, and search tools. Use them to understand the codebase and write the new spec.`;
You have access to file reading, writing, and search tools. Use them to understand the codebase and WRITE the new spec to .automaker/app_spec.txt.

**CRITICAL FILE NAMING RULES:**
- The spec file MUST be named exactly \`app_spec.txt\`
- Do NOT create project-spec.md, spec.md, or any other filename
- Do NOT use markdown (.md) extension - use .txt
- The full path must be: \`.automaker/app_spec.txt\``;
}

/**
@@ -892,37 +926,40 @@ ${projectDefinition}
- Think about user experience, error handling, edge cases, etc.
- Architecture Notes: Any important architectural decisions or patterns

3. **IMPORTANT**: Write the complete specification to the file \`.automaker/app_spec.txt\`
3. **MANDATORY FILE WRITE**: You MUST write the spec to EXACTLY this file path: \`.automaker/app_spec.txt\`
- The filename MUST be exactly \`app_spec.txt\` - do NOT use any other name
- Do NOT create \`project-spec.md\`, \`spec.md\`, or any other filename
- Do NOT output the spec in your response - write it to the file
- Use the Write tool with path \`.automaker/app_spec.txt\`

**Format Guidelines for the Spec:**
**Format Guidelines for the Spec (use XML format in app_spec.txt):**

Use this general structure:
Use this XML structure inside app_spec.txt:

\`\`\`
# [App Name] - Application Specification

## Product Overview
[Description of what the app does and its purpose]

## Tech Stack
- Frontend: [frameworks, libraries]
- Backend: [frameworks, APIs]
- Database: [if applicable]
- Other: [other relevant tech]

## Features

### [Category 1]
- **[Feature Name]**: [Detailed description of the feature]
- **[Feature Name]**: [Detailed description]
...

### [Category 2]
- **[Feature Name]**: [Detailed description]
...

## Architecture Notes
[Any important architectural notes, patterns, or conventions]
\`\`\`xml
<project_specification>
  <project_name>[App Name]</project_name>

  <overview>
  [Description of what the app does and its purpose]
  </overview>

  <technology_stack>
    <frontend>[frameworks, libraries]</frontend>
    <backend>[frameworks, APIs]</backend>
    <database>[if applicable]</database>
  </technology_stack>

  <core_capabilities>
  [List all the major capabilities]
  </core_capabilities>

  <implementation_roadmap>
    <phase_1>[Foundation features]</phase_1>
    <phase_2>[Core features]</phase_2>
    <phase_3>[Polish features]</phase_3>
  </implementation_roadmap>
</project_specification>
\`\`\`

**Remember:**
@@ -930,9 +967,9 @@ Use this general structure:
- Consider user flows, error states, loading states, etc.
- Include authentication, authorization if relevant
- Think about what would make this a polished, production-ready app
- The more detailed and complete the spec, the better
- **CRITICAL: Write to EXACTLY \`.automaker/app_spec.txt\` - not project-spec.md or any other name!**

Begin by exploring the project structure.`;
Begin by exploring the project structure, then generate and WRITE the spec to \`.automaker/app_spec.txt\`.`;
}

/**

@@ -58,6 +58,17 @@
"tailwind-merge": "^3.4.0",
"zustand": "^5.0.9"
},
"optionalDependencies": {
"lightningcss-darwin-arm64": "^1.29.2",
"lightningcss-darwin-x64": "^1.29.2",
"lightningcss-linux-arm-gnueabihf": "^1.29.2",
"lightningcss-linux-arm64-gnu": "^1.29.2",
"lightningcss-linux-arm64-musl": "^1.29.2",
"lightningcss-linux-x64-gnu": "^1.29.2",
"lightningcss-linux-x64-musl": "^1.29.2",
"lightningcss-win32-arm64-msvc": "^1.29.2",
"lightningcss-win32-x64-msvc": "^1.29.2"
},
"devDependencies": {
"@electron/rebuild": "^4.0.2",
"@playwright/test": "^1.57.0",

@@ -143,6 +143,80 @@
--running-indicator-text: oklch(0.6 0.22 265);
}

/* Apply dark mode immediately based on system preference (before JS runs) */
@media (prefers-color-scheme: dark) {
:root {
/* Deep dark backgrounds - zinc-950 family */
--background: oklch(0.04 0 0); /* zinc-950 */
--background-50: oklch(0.04 0 0 / 0.5); /* zinc-950/50 */
--background-80: oklch(0.04 0 0 / 0.8); /* zinc-950/80 */

/* Text colors following hierarchy */
--foreground: oklch(1 0 0); /* text-white */
--foreground-secondary: oklch(0.588 0 0); /* text-zinc-400 */
--foreground-muted: oklch(0.525 0 0); /* text-zinc-500 */

/* Card and popover backgrounds */
--card: oklch(0.14 0 0);
--card-foreground: oklch(1 0 0);
--popover: oklch(0.10 0 0);
--popover-foreground: oklch(1 0 0);

/* Brand colors - purple/violet theme */
--primary: oklch(0.55 0.25 265);
--primary-foreground: oklch(1 0 0);
--brand-400: oklch(0.6 0.22 265);
--brand-500: oklch(0.55 0.25 265);
--brand-600: oklch(0.5 0.28 270);

/* Glass morphism borders and accents */
--secondary: oklch(1 0 0 / 0.05);
--secondary-foreground: oklch(1 0 0);
--muted: oklch(0.176 0 0);
--muted-foreground: oklch(0.588 0 0);
--accent: oklch(1 0 0 / 0.1);
--accent-foreground: oklch(1 0 0);

/* Borders with transparency for glass effect */
--border: oklch(0.176 0 0);
--border-glass: oklch(1 0 0 / 0.1);
--destructive: oklch(0.6 0.25 25);
--input: oklch(0.04 0 0 / 0.8);
--ring: oklch(0.55 0.25 265);

/* Chart colors with brand theme */
--chart-1: oklch(0.55 0.25 265);
--chart-2: oklch(0.65 0.2 160);
--chart-3: oklch(0.75 0.2 70);
--chart-4: oklch(0.6 0.25 300);
--chart-5: oklch(0.6 0.25 20);

/* Sidebar with glass morphism */
--sidebar: oklch(0.04 0 0 / 0.5);
--sidebar-foreground: oklch(1 0 0);
--sidebar-primary: oklch(0.55 0.25 265);
--sidebar-primary-foreground: oklch(1 0 0);
--sidebar-accent: oklch(1 0 0 / 0.05);
--sidebar-accent-foreground: oklch(1 0 0);
--sidebar-border: oklch(1 0 0 / 0.1);
--sidebar-ring: oklch(0.55 0.25 265);

/* Action button colors */
--action-view: oklch(0.6 0.25 265);
--action-view-hover: oklch(0.55 0.27 270);
--action-followup: oklch(0.6 0.2 230);
--action-followup-hover: oklch(0.55 0.22 230);
--action-commit: oklch(0.55 0.2 140);
--action-commit-hover: oklch(0.5 0.22 140);
--action-verify: oklch(0.55 0.2 140);
--action-verify-hover: oklch(0.5 0.22 140);

/* Running indicator - Purple */
--running-indicator: oklch(0.6 0.25 265);
--running-indicator-text: oklch(0.65 0.22 265);
}
}

.light {
/* Explicit light mode - same as root but ensures it overrides any dark defaults */
--background: oklch(1 0 0); /* White */
@@ -211,10 +285,10 @@
--foreground-secondary: oklch(0.588 0 0); /* text-zinc-400 */
--foreground-muted: oklch(0.525 0 0); /* text-zinc-500 */

/* Glass morphism effects */
--card: oklch(0.04 0 0 / 0.5); /* zinc-950/50 with transparency */
/* Card and popover backgrounds */
--card: oklch(0.14 0 0); /* slightly lighter than background for contrast */
--card-foreground: oklch(1 0 0);
--popover: oklch(0.04 0 0 / 0.8); /* zinc-950/80 for popover */
--popover: oklch(0.10 0 0); /* slightly lighter than background */
--popover-foreground: oklch(1 0 0);

/* Brand colors - purple/violet theme */

||||
@@ -7,7 +7,6 @@ import { BoardView } from "@/components/views/board-view";
import { SpecView } from "@/components/views/spec-view";
import { AgentView } from "@/components/views/agent-view";
import { SettingsView } from "@/components/views/settings-view";
import { AgentToolsView } from "@/components/views/agent-tools-view";
import { InterviewView } from "@/components/views/interview-view";
import { ContextView } from "@/components/views/context-view";
import { ProfilesView } from "@/components/views/profiles-view";
@@ -16,12 +15,20 @@ import { RunningAgentsView } from "@/components/views/running-agents-view";
import { useAppStore } from "@/store/app-store";
import { useSetupStore } from "@/store/setup-store";
import { getElectronAPI, isElectron } from "@/lib/electron";
import { FileBrowserProvider, useFileBrowser, setGlobalFileBrowser } from "@/contexts/file-browser-context";

export default function Home() {
const { currentView, setCurrentView, setIpcConnected, theme, currentProject } = useAppStore();
function HomeContent() {
const {
currentView,
setCurrentView,
setIpcConnected,
theme,
currentProject,
} = useAppStore();
const { isFirstRun, setupComplete } = useSetupStore();
const [isMounted, setIsMounted] = useState(false);
const [streamerPanelOpen, setStreamerPanelOpen] = useState(false);
const { openFileBrowser } = useFileBrowser();

// Hidden streamer panel - opens with "\" key
const handleStreamerPanelShortcut = useCallback((event: KeyboardEvent) => {
@@ -29,7 +36,11 @@ export default function Home() {
const activeElement = document.activeElement;
if (activeElement) {
const tagName = activeElement.tagName.toLowerCase();
if (tagName === "input" || tagName === "textarea" || tagName === "select") {
if (
tagName === "input" ||
tagName === "textarea" ||
tagName === "select"
) {
return;
}
if (activeElement.getAttribute("contenteditable") === "true") {
@@ -70,6 +81,11 @@ export default function Home() {
setIsMounted(true);
}, []);

// Initialize global file browser for HttpApiClient
useEffect(() => {
setGlobalFileBrowser(openFileBrowser);
}, [openFileBrowser]);

// Check if this is first run and redirect to setup if needed
useEffect(() => {
console.log("[Setup Flow] Checking setup state:", {
@@ -81,7 +97,9 @@ export default function Home() {
});

if (isMounted && isFirstRun && !setupComplete) {
console.log("[Setup Flow] Redirecting to setup wizard (first run, not complete)");
console.log(
"[Setup Flow] Redirecting to setup wizard (first run, not complete)"
);
setCurrentView("setup");
} else if (isMounted && setupComplete) {
console.log("[Setup Flow] Setup already complete, showing normal view");
@@ -94,7 +112,7 @@ export default function Home() {
try {
const api = getElectronAPI();
const result = await api.ping();
setIpcConnected(result === "pong" || result === "pong (mock)");
setIpcConnected(result === "pong");
} catch (error) {
console.error("IPC connection failed:", error);
setIpcConnected(false);
@@ -171,8 +189,6 @@ export default function Home() {
return <AgentView />;
case "settings":
return <SettingsView />;
case "tools":
return <AgentToolsView />;
case "interview":
return <InterviewView />;
case "context":
@@ -193,8 +209,8 @@ export default function Home() {
<SetupView />
{/* Environment indicator */}
{isMounted && !isElectron() && (
<div className="fixed bottom-4 right-4 px-3 py-1.5 bg-yellow-500/10 text-yellow-500 text-xs rounded-full border border-yellow-500/20 pointer-events-none">
Web Mode (Mock IPC)
<div className="fixed bottom-4 right-4 px-3 py-1.5 bg-blue-500/10 text-blue-500 text-xs rounded-full border border-blue-500/20 pointer-events-none">
Web Mode
</div>
)}
</main>
@@ -204,23 +220,34 @@ export default function Home() {
return (
<main className="flex h-screen overflow-hidden" data-testid="app-container">
<Sidebar />
<div className="flex-1 flex flex-col overflow-hidden transition-all duration-300" style={{ marginRight: streamerPanelOpen ? '250px' : '0' }}>
<div
className="flex-1 flex flex-col overflow-hidden transition-all duration-300"
style={{ marginRight: streamerPanelOpen ? "250px" : "0" }}
>
{renderView()}
</div>

{/* Environment indicator - only show after mount to prevent hydration issues */}
{isMounted && !isElectron() && (
<div className="fixed bottom-4 right-4 px-3 py-1.5 bg-yellow-500/10 text-yellow-500 text-xs rounded-full border border-yellow-500/20 pointer-events-none">
Web Mode (Mock IPC)
<div className="fixed bottom-4 right-4 px-3 py-1.5 bg-blue-500/10 text-blue-500 text-xs rounded-full border border-blue-500/20 pointer-events-none">
Web Mode
</div>
)}

{/* Hidden streamer panel - opens with "\" key, pushes content */}
<div
className={`fixed top-0 right-0 h-full w-[250px] bg-background border-l border-border transition-transform duration-300 ${
streamerPanelOpen ? 'translate-x-0' : 'translate-x-full'
streamerPanelOpen ? "translate-x-0" : "translate-x-full"
}`}
/>
</main>
);
}

export default function Home() {
return (
<FileBrowserProvider>
<HomeContent />
</FileBrowserProvider>
);
}

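The page now calls `getElectronAPI()` instead of touching `window.electronAPI` directly. The actual implementation in `@/lib/electron` is not part of this diff; a plausible sketch of what it is assumed to do (the HTTP-client factory name is hypothetical):

```ts
// Sketch only: return the preload bridge when running inside Electron,
// otherwise an HTTP-backed client so web mode talks to the backend server directly.
export function getElectronAPI(): any {
  if (typeof window !== "undefined" && (window as any).electronAPI) {
    return (window as any).electronAPI; // Electron preload bridge
  }
  return createHttpApiClient(); // hypothetical web-mode client over the server's HTTP API
}
```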
231	apps/app/src/components/dialogs/file-browser-dialog.tsx	Normal file
@@ -0,0 +1,231 @@
|
||||
"use client";
|
||||
|
||||
import { useState, useEffect } from "react";
|
||||
import { FolderOpen, Folder, ChevronRight, Home, ArrowLeft, HardDrive } from "lucide-react";
|
||||
import {
|
||||
Dialog,
|
||||
DialogContent,
|
||||
DialogDescription,
|
||||
DialogFooter,
|
||||
DialogHeader,
|
||||
DialogTitle,
|
||||
} from "@/components/ui/dialog";
|
||||
import { Button } from "@/components/ui/button";
|
||||
|
||||
interface DirectoryEntry {
|
||||
name: string;
|
||||
path: string;
|
||||
}
|
||||
|
||||
interface BrowseResult {
|
||||
success: boolean;
|
||||
currentPath: string;
|
||||
parentPath: string | null;
|
||||
directories: DirectoryEntry[];
|
||||
drives?: string[];
|
||||
error?: string;
|
||||
}
|
||||
|
||||
interface FileBrowserDialogProps {
|
||||
open: boolean;
|
||||
onOpenChange: (open: boolean) => void;
|
||||
onSelect: (path: string) => void;
|
||||
title?: string;
|
||||
description?: string;
|
||||
}
|
||||
|
||||
export function FileBrowserDialog({
|
||||
open,
|
||||
onOpenChange,
|
||||
onSelect,
|
||||
title = "Select Project Directory",
|
||||
description = "Navigate to your project folder",
|
||||
}: FileBrowserDialogProps) {
|
||||
const [currentPath, setCurrentPath] = useState<string>("");
|
||||
const [parentPath, setParentPath] = useState<string | null>(null);
|
||||
const [directories, setDirectories] = useState<DirectoryEntry[]>([]);
|
||||
const [drives, setDrives] = useState<string[]>([]);
|
||||
const [loading, setLoading] = useState(false);
|
||||
const [error, setError] = useState("");
|
||||
|
||||
const browseDirectory = async (dirPath?: string) => {
|
||||
setLoading(true);
|
||||
setError("");
|
||||
|
||||
try {
|
||||
// Get server URL from environment or default
|
||||
const serverUrl = process.env.NEXT_PUBLIC_SERVER_URL || "http://localhost:3008";
|
||||
|
||||
const response = await fetch(`${serverUrl}/api/fs/browse`, {
|
||||
method: "POST",
|
||||
headers: { "Content-Type": "application/json" },
|
||||
body: JSON.stringify({ dirPath }),
|
||||
});
|
||||
|
||||
const result: BrowseResult = await response.json();
|
||||
|
||||
if (result.success) {
|
||||
setCurrentPath(result.currentPath);
|
||||
setParentPath(result.parentPath);
|
||||
setDirectories(result.directories);
|
||||
setDrives(result.drives || []);
|
||||
} else {
|
||||
setError(result.error || "Failed to browse directory");
|
||||
}
|
||||
} catch (err) {
|
||||
setError(err instanceof Error ? err.message : "Failed to load directories");
|
||||
} finally {
|
||||
setLoading(false);
|
||||
}
|
||||
};
|
||||
|
||||
// Load home directory on mount
|
||||
useEffect(() => {
|
||||
if (open && !currentPath) {
|
||||
browseDirectory();
|
||||
}
|
||||
}, [open]);
|
||||
|
||||
const handleSelectDirectory = (dir: DirectoryEntry) => {
|
||||
browseDirectory(dir.path);
|
||||
};
|
||||
|
||||
const handleGoToParent = () => {
|
||||
if (parentPath) {
|
||||
browseDirectory(parentPath);
|
||||
}
|
||||
};
|
||||
|
||||
const handleGoHome = () => {
|
||||
browseDirectory();
|
||||
};
|
||||
|
||||
const handleSelectDrive = (drivePath: string) => {
|
||||
browseDirectory(drivePath);
|
||||
};
|
||||
|
||||
const handleSelect = () => {
|
||||
if (currentPath) {
|
||||
onSelect(currentPath);
|
||||
onOpenChange(false);
|
||||
}
|
||||
};
|
||||
|
||||
return (
|
||||
<Dialog open={open} onOpenChange={onOpenChange}>
|
||||
<DialogContent className="bg-popover border-border max-w-2xl max-h-[80vh]">
|
||||
<DialogHeader>
|
||||
<DialogTitle className="flex items-center gap-2">
|
||||
<FolderOpen className="w-5 h-5 text-brand-500" />
|
||||
{title}
|
||||
</DialogTitle>
|
||||
<DialogDescription className="text-muted-foreground">
|
||||
{description}
|
||||
</DialogDescription>
|
||||
</DialogHeader>
|
||||
|
||||
<div className="flex flex-col gap-3 min-h-[400px]">
|
||||
{/* Drives selector (Windows only) */}
|
||||
{drives.length > 0 && (
|
||||
<div className="flex flex-wrap gap-2 p-3 rounded-lg bg-sidebar-accent/10 border border-sidebar-border">
|
||||
<div className="flex items-center gap-1 text-xs text-muted-foreground mr-2">
|
||||
<HardDrive className="w-3 h-3" />
|
||||
<span>Drives:</span>
|
||||
</div>
|
||||
{drives.map((drive) => (
|
||||
<Button
|
||||
key={drive}
|
||||
variant={currentPath.startsWith(drive) ? "default" : "outline"}
|
||||
size="sm"
|
||||
onClick={() => handleSelectDrive(drive)}
|
||||
className="h-7 px-3 text-xs"
|
||||
disabled={loading}
|
||||
>
|
||||
{drive.replace("\\", "")}
|
||||
</Button>
|
||||
))}
|
||||
</div>
|
||||
)}
|
||||
|
||||
{/* Current path breadcrumb */}
|
||||
<div className="flex items-center gap-2 p-3 rounded-lg bg-sidebar-accent/10 border border-sidebar-border">
|
||||
<Button
|
||||
variant="ghost"
|
||||
size="sm"
|
||||
onClick={handleGoHome}
|
||||
className="h-7 px-2"
|
||||
disabled={loading}
|
||||
>
|
||||
<Home className="w-4 h-4" />
|
||||
</Button>
|
||||
{parentPath && (
|
||||
<Button
|
||||
variant="ghost"
|
||||
size="sm"
|
||||
onClick={handleGoToParent}
|
||||
className="h-7 px-2"
|
||||
disabled={loading}
|
||||
>
|
||||
<ArrowLeft className="w-4 h-4" />
|
||||
</Button>
|
||||
)}
|
||||
<div className="flex-1 font-mono text-sm truncate text-muted-foreground">
|
||||
{currentPath || "Loading..."}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Directory list */}
|
||||
<div className="flex-1 overflow-y-auto border border-sidebar-border rounded-lg">
|
||||
{loading && (
|
||||
<div className="flex items-center justify-center h-full p-8">
|
||||
<div className="text-sm text-muted-foreground">Loading directories...</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{error && (
|
||||
<div className="flex items-center justify-center h-full p-8">
|
||||
<div className="text-sm text-destructive">{error}</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{!loading && !error && directories.length === 0 && (
|
||||
<div className="flex items-center justify-center h-full p-8">
|
||||
<div className="text-sm text-muted-foreground">No subdirectories found</div>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{!loading && !error && directories.length > 0 && (
|
||||
<div className="divide-y divide-sidebar-border">
|
||||
{directories.map((dir) => (
|
||||
<button
|
||||
key={dir.path}
|
||||
onClick={() => handleSelectDirectory(dir)}
|
||||
className="w-full flex items-center gap-3 p-3 hover:bg-sidebar-accent/10 transition-colors text-left group"
|
||||
>
|
||||
<Folder className="w-5 h-5 text-brand-500 shrink-0" />
|
||||
<span className="flex-1 truncate text-sm">{dir.name}</span>
|
||||
<ChevronRight className="w-4 h-4 text-muted-foreground opacity-0 group-hover:opacity-100 transition-opacity shrink-0" />
|
||||
</button>
|
||||
))}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
|
||||
<div className="text-xs text-muted-foreground">
|
||||
Click on a folder to navigate. Select the current folder or navigate to a subfolder.
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<DialogFooter className="gap-2 sm:gap-0">
|
||||
<Button variant="ghost" onClick={() => onOpenChange(false)}>
|
||||
Cancel
|
||||
</Button>
|
||||
<Button onClick={handleSelect} disabled={!currentPath || loading}>
|
||||
<FolderOpen className="w-4 h-4 mr-2" />
|
||||
Select Current Folder
|
||||
</Button>
|
||||
</DialogFooter>
|
||||
</DialogContent>
|
||||
</Dialog>
|
||||
);
|
||||
}
|
||||
@@ -3,6 +3,7 @@
import { useState, useMemo, useEffect, useCallback, useRef } from "react";
import { cn } from "@/lib/utils";
import { useAppStore, formatShortcut } from "@/store/app-store";
import { CoursePromoBadge } from "@/components/ui/course-promo-badge";
import {
FolderOpen,
Plus,
@@ -12,7 +13,6 @@ import {
Bot,
Folder,
X,
Wrench,
PanelLeft,
PanelLeftClose,
ChevronDown,
@@ -210,6 +210,7 @@ export function Sidebar() {
cycleNextProject,
clearProjectHistory,
setProjectTheme,
setTheme,
theme: globalTheme,
} = useAppStore();

@@ -413,14 +414,37 @@ export function Sidebar() {
return;
}

const project = {
id: `project-${Date.now()}`,
name,
path,
lastOpened: new Date().toISOString(),
};
// Check if project already exists (by path) to preserve theme and other settings
const existingProject = projects.find((p) => p.path === path);

let project: Project;
if (existingProject) {
// Update existing project, preserving theme and other properties
project = {
...existingProject,
name, // Update name in case it changed
lastOpened: new Date().toISOString(),
};
// Update the project in the store (this will update the existing entry)
const updatedProjects = projects.map((p) =>
p.id === existingProject.id ? project : p
);
useAppStore.setState({ projects: updatedProjects });
} else {
// Create new project - check for trashed project with same path first (preserves theme if deleted/recreated)
// Then fall back to current effective theme, then global theme
const trashedProject = trashedProjects.find((p) => p.path === path);
const effectiveTheme = trashedProject?.theme || currentProject?.theme || globalTheme;
project = {
id: `project-${Date.now()}`,
name,
path,
lastOpened: new Date().toISOString(),
theme: effectiveTheme,
};
addProject(project);
}

addProject(project);
setCurrentProject(project);

// Check if app_spec.txt exists
@@ -455,7 +479,7 @@ export function Sidebar() {
});
}
}
}, [addProject, setCurrentProject]);
}, [projects, trashedProjects, addProject, setCurrentProject, currentProject, globalTheme]);

const handleRestoreProject = useCallback(
(projectId: string) => {
@@ -557,12 +581,6 @@ export function Sidebar() {
icon: BookOpen,
shortcut: shortcuts.context,
},
{
id: "tools",
label: "Agent Tools",
icon: Wrench,
shortcut: shortcuts.tools,
},
{
id: "profiles",
label: "AI Profiles",
@@ -950,6 +968,10 @@ export function Sidebar() {
value={currentProject.theme || ""}
onValueChange={(value) => {
if (currentProject) {
// If selecting an actual theme (not "Use Global"), also update global
if (value !== "") {
setTheme(value as any);
}
setProjectTheme(
currentProject.id,
value === "" ? null : (value as any)
@@ -1122,6 +1144,8 @@ export function Sidebar() {

{/* Bottom Section - Running Agents / Bug Report / Settings */}
<div className="border-t border-sidebar-border bg-sidebar-accent/10 shrink-0">
{/* Course Promo Badge */}
<CoursePromoBadge sidebarOpen={sidebarOpen} />
{/* Running Agents Link */}
<div className="p-2 pb-0">
<button

@@ -4,14 +4,13 @@ import { useState, useEffect } from "react";
import {
Card,
CardContent,
CardDescription,
CardHeader,
CardTitle,
} from "@/components/ui/card";
import { Button } from "@/components/ui/button";
import { HotkeyButton } from "@/components/ui/hotkey-button";
import { Input } from "@/components/ui/input";
import { Tabs, TabsList, TabsTrigger, TabsContent } from "@/components/ui/tabs";
import { Tabs, TabsList, TabsTrigger } from "@/components/ui/tabs";
import {
Plus,
MessageSquare,
@@ -26,7 +25,7 @@ import {
import { cn } from "@/lib/utils";
import type { SessionListItem } from "@/types/electron";
import { useKeyboardShortcutsConfig } from "@/hooks/use-keyboard-shortcuts";
import { useAppStore } from "@/store/app-store";
import { getElectronAPI } from "@/lib/electron";

// Random session name generator
const adjectives = [
@@ -117,14 +116,15 @@ export function SessionManager({

// Check running state for all sessions
const checkRunningSessions = async (sessionList: SessionListItem[]) => {
if (!window.electronAPI?.agent) return;
const api = getElectronAPI();
if (!api?.agent) return;

const runningIds = new Set<string>();

// Check each session's running state
for (const session of sessionList) {
try {
const result = await window.electronAPI.agent.getHistory(session.id);
const result = await api.agent.getHistory(session.id);
if (result.success && result.isRunning) {
runningIds.add(session.id);
}
@@ -142,10 +142,11 @@ export function SessionManager({

// Load sessions
const loadSessions = async () => {
if (!window.electronAPI?.sessions) return;
const api = getElectronAPI();
if (!api?.sessions) return;

// Always load all sessions and filter client-side
const result = await window.electronAPI.sessions.list(true);
const result = await api.sessions.list(true);
if (result.success && result.sessions) {
setSessions(result.sessions);
// Check running state for all sessions
@@ -173,39 +174,41 @@ export function SessionManager({

// Create new session with random name
const handleCreateSession = async () => {
if (!window.electronAPI?.sessions) return;
const api = getElectronAPI();
if (!api?.sessions) return;

const sessionName = newSessionName.trim() || generateRandomSessionName();

const result = await window.electronAPI.sessions.create(
const result = await api.sessions.create(
sessionName,
projectPath,
projectPath
);

if (result.success && result.sessionId) {
if (result.success && result.session?.id) {
setNewSessionName("");
setIsCreating(false);
await loadSessions();
onSelectSession(result.sessionId);
onSelectSession(result.session.id);
}
};

// Create new session directly with a random name (one-click)
const handleQuickCreateSession = async () => {
if (!window.electronAPI?.sessions) return;
const api = getElectronAPI();
if (!api?.sessions) return;

const sessionName = generateRandomSessionName();

const result = await window.electronAPI.sessions.create(
const result = await api.sessions.create(
sessionName,
projectPath,
projectPath
);

if (result.success && result.sessionId) {
if (result.success && result.session?.id) {
await loadSessions();
onSelectSession(result.sessionId);
onSelectSession(result.session.id);
}
};

@@ -223,9 +226,10 @@ export function SessionManager({

// Rename session
const handleRenameSession = async (sessionId: string) => {
if (!editingName.trim() || !window.electronAPI?.sessions) return;
const api = getElectronAPI();
if (!editingName.trim() || !api?.sessions) return;

const result = await window.electronAPI.sessions.update(
const result = await api.sessions.update(
sessionId,
editingName,
undefined
@@ -240,34 +244,55 @@ export function SessionManager({

// Archive session
const handleArchiveSession = async (sessionId: string) => {
if (!window.electronAPI?.sessions) return;
const api = getElectronAPI();
if (!api?.sessions) {
console.error("[SessionManager] Sessions API not available");
return;
}

const result = await window.electronAPI.sessions.archive(sessionId);
if (result.success) {
// If the archived session was currently selected, deselect it
if (currentSessionId === sessionId) {
onSelectSession(null);
try {
const result = await api.sessions.archive(sessionId);
if (result.success) {
// If the archived session was currently selected, deselect it
if (currentSessionId === sessionId) {
onSelectSession(null);
}
await loadSessions();
} else {
console.error("[SessionManager] Archive failed:", result.error);
}
await loadSessions();
} catch (error) {
console.error("[SessionManager] Archive error:", error);
}
};

// Unarchive session
const handleUnarchiveSession = async (sessionId: string) => {
if (!window.electronAPI?.sessions) return;
const api = getElectronAPI();
if (!api?.sessions) {
console.error("[SessionManager] Sessions API not available");
return;
}

const result = await window.electronAPI.sessions.unarchive(sessionId);
if (result.success) {
await loadSessions();
try {
const result = await api.sessions.unarchive(sessionId);
if (result.success) {
await loadSessions();
} else {
console.error("[SessionManager] Unarchive failed:", result.error);
}
} catch (error) {
console.error("[SessionManager] Unarchive error:", error);
}
};

// Delete session
const handleDeleteSession = async (sessionId: string) => {
if (!window.electronAPI?.sessions) return;
const api = getElectronAPI();
if (!api?.sessions) return;
||||
if (!confirm("Are you sure you want to delete this session?")) return;
|
||||
|
||||
const result = await window.electronAPI.sessions.delete(sessionId);
|
||||
const result = await api.sessions.delete(sessionId);
|
||||
if (result.success) {
|
||||
await loadSessions();
|
||||
if (currentSessionId === sessionId) {
|
||||
|
||||
apps/app/src/components/ui/course-promo-badge.tsx (Normal file, 88 lines)
@@ -0,0 +1,88 @@
"use client";

import * as React from "react";
import { Sparkles, X } from "lucide-react";
import {
Tooltip,
TooltipContent,
TooltipProvider,
TooltipTrigger,
} from "@/components/ui/tooltip";

interface CoursePromoBadgeProps {
sidebarOpen?: boolean;
}

export function CoursePromoBadge({ sidebarOpen = true }: CoursePromoBadgeProps) {
const [dismissed, setDismissed] = React.useState(false);

if (dismissed) {
return null;
}

// Collapsed state - show only icon with tooltip
if (!sidebarOpen) {
return (
<div className="p-2 pb-0 flex justify-center">
<TooltipProvider delayDuration={300}>
<Tooltip>
<TooltipTrigger asChild>
<a
href="https://agenticjumpstart.com"
target="_blank"
rel="noopener noreferrer"
className="group cursor-pointer flex items-center justify-center w-10 h-10 bg-primary/10 text-primary rounded-lg hover:bg-primary/20 transition-all border border-primary/30"
data-testid="course-promo-badge-collapsed"
>
<Sparkles className="size-4 shrink-0" />
</a>
</TooltipTrigger>
<TooltipContent side="right" className="flex items-center gap-2">
<span>Become a 10x Dev</span>
<span
onClick={(e) => {
e.preventDefault();
e.stopPropagation();
setDismissed(true);
}}
className="p-0.5 rounded-full hover:bg-primary/30 transition-colors cursor-pointer"
aria-label="Dismiss"
>
<X className="size-3" />
</span>
</TooltipContent>
</Tooltip>
</TooltipProvider>
</div>
);
}

// Expanded state - show full badge
return (
<div className="p-2 pb-0">
<a
href="https://agenticjumpstart.com"
target="_blank"
rel="noopener noreferrer"
className="group cursor-pointer flex items-center justify-between w-full px-2 lg:px-3 py-2.5 bg-primary/10 text-primary rounded-lg font-medium text-sm hover:bg-primary/20 transition-all border border-primary/30"
data-testid="course-promo-badge"
>
<div className="flex items-center gap-2">
<Sparkles className="size-4 shrink-0" />
<span className="hidden lg:block">Become a 10x Dev</span>
</div>
<span
onClick={(e) => {
e.preventDefault();
e.stopPropagation();
setDismissed(true);
}}
className="hidden lg:block p-1 rounded-full hover:bg-primary/30 transition-colors cursor-pointer"
aria-label="Dismiss"
>
<X className="size-3.5" />
</span>
</a>
</div>
);
}
@@ -1,6 +1,6 @@
"use client";

import React, { useState, useRef, useCallback, useEffect } from "react";
import React, { useState, useRef, useCallback } from "react";
import { cn } from "@/lib/utils";
import { ImageIcon, X, Loader2 } from "lucide-react";
import { Textarea } from "@/components/ui/textarea";
@@ -83,6 +83,13 @@ export function DescriptionImageDropZone({
const fileInputRef = useRef<HTMLInputElement>(null);
const currentProject = useAppStore((state) => state.currentProject);

// Construct server URL for loading saved images
const getImageServerUrl = useCallback((imagePath: string): string => {
const serverUrl = process.env.NEXT_PUBLIC_SERVER_URL || "http://localhost:3008";
const projectPath = currentProject?.path || "";
return `${serverUrl}/api/fs/image?path=${encodeURIComponent(imagePath)}&projectPath=${encodeURIComponent(projectPath)}`;
}, [currentProject?.path]);

const fileToBase64 = (file: File): Promise<string> => {
return new Promise((resolve, reject) => {
const reader = new FileReader();
@@ -98,7 +105,7 @@ export function DescriptionImageDropZone({
});
};

const saveImageToTemp = async (
const saveImageToTemp = useCallback(async (
base64Data: string,
filename: string,
mimeType: string
@@ -107,8 +114,8 @@ export function DescriptionImageDropZone({
const api = getElectronAPI();
// Check if saveImageToTemp method exists
if (!api.saveImageToTemp) {
// Fallback for mock API - return a mock path in .automaker/images
console.log("[DescriptionImageDropZone] Using mock path for image");
// Fallback path when saveImageToTemp is not available
console.log("[DescriptionImageDropZone] Using fallback path for image");
return `.automaker/images/${Date.now()}_${filename}`;
}

@@ -124,7 +131,7 @@ export function DescriptionImageDropZone({
console.error("[DescriptionImageDropZone] Error saving image:", error);
return null;
}
};
}, [currentProject?.path]);

const processFiles = useCallback(
async (files: FileList) => {
@@ -193,7 +200,7 @@ export function DescriptionImageDropZone({

setIsProcessing(false);
},
[disabled, isProcessing, images, maxFiles, maxFileSize, onImagesChange, previewImages]
[disabled, isProcessing, images, maxFiles, maxFileSize, onImagesChange, previewImages, saveImageToTemp]
);

const handleDrop = useCallback(
@@ -374,7 +381,15 @@ export function DescriptionImageDropZone({
className="max-w-full max-h-full object-contain"
/>
) : (
<ImageIcon className="w-6 h-6 text-muted-foreground" />
<img
src={getImageServerUrl(image.path)}
alt={image.filename}
className="max-w-full max-h-full object-contain"
onError={(e) => {
// If image fails to load, hide it
(e.target as HTMLImageElement).style.display = 'none';
}}
/>
)}
</div>
{/* Remove button */}
@@ -88,7 +88,6 @@ const SHORTCUT_LABELS: Record<keyof KeyboardShortcuts, string> = {
agent: "Agent Runner",
spec: "Spec Editor",
context: "Context",
tools: "Agent Tools",
settings: "Settings",
profiles: "AI Profiles",
toggleSidebar: "Toggle Sidebar",
@@ -109,7 +108,6 @@ const SHORTCUT_CATEGORIES: Record<keyof KeyboardShortcuts, "navigation" | "ui" |
agent: "navigation",
spec: "navigation",
context: "navigation",
tools: "navigation",
settings: "navigation",
profiles: "navigation",
toggleSidebar: "ui",

@@ -149,12 +149,12 @@ export function AgentToolsView() {
setTerminalResult(null);

try {
// Simulate agent requesting terminal command execution
console.log(`[Agent Tool] Requesting to run command: ${terminalCommand}`);
// Terminal command simulation for demonstration purposes
console.log(`[Agent Tool] Simulating command: ${terminalCommand}`);

// In mock mode, simulate terminal output
// In real Electron mode, this would use child_process
const mockOutputs: Record<string, string> = {
// Simulated outputs for common commands (preview mode)
// In production, the agent executes commands via Claude SDK
const simulatedOutputs: Record<string, string> = {
ls: "app_spec.txt\nfeatures\nnode_modules\npackage.json\nsrc\ntests\ntsconfig.json",
pwd: currentProject?.path || "/Users/demo/project",
"echo hello": "hello",
@@ -168,8 +168,8 @@ export function AgentToolsView() {
await new Promise((resolve) => setTimeout(resolve, 500));

const output =
mockOutputs[terminalCommand.toLowerCase()] ||
`Command executed: ${terminalCommand}\n(Mock output - real execution requires Electron mode)`;
simulatedOutputs[terminalCommand.toLowerCase()] ||
`[Preview] ${terminalCommand}\n(Terminal commands are executed by the agent during feature implementation)`;

setTerminalResult({
success: true,

@@ -399,7 +399,7 @@ ${Object.entries(projectAnalysis.filesByExtension)
`;

// Write the spec file
const specPath = `${currentProject.path}/app_spec.txt`;
const specPath = `${currentProject.path}/.automaker/app_spec.txt`;
const writeResult = await api.writeFile(specPath, specContent);

if (writeResult.success) {
@@ -394,22 +394,25 @@ export function BoardView() {
}, []);

// Load features using features API
// IMPORTANT: Do NOT add 'features' to dependency array - it would cause infinite reload loop
const loadFeatures = useCallback(async () => {
if (!currentProject) return;

const currentPath = currentProject.path;
const previousPath = prevProjectPathRef.current;
const isProjectSwitch = previousPath !== null && currentPath !== previousPath;

// If project switched, clear features first to prevent cross-contamination
// Also treat this as an initial load for the new project
if (previousPath !== null && currentPath !== previousPath) {
// Get cached features from store (without adding to dependencies)
const cachedFeatures = useAppStore.getState().features;

// If project switched, mark it but don't clear features yet
// We'll clear after successful API load to prevent data loss
if (isProjectSwitch) {
console.log(
`[BoardView] Project switch detected: ${previousPath} -> ${currentPath}, clearing features`
`[BoardView] Project switch detected: ${previousPath} -> ${currentPath}`
);
isSwitchingProjectRef.current = true;
isInitialLoadRef.current = true;
setFeatures([]);
setPersistedCategories([]); // Also clear categories
}

// Update the ref to track current project
@@ -424,6 +427,7 @@ export function BoardView() {
const api = getElectronAPI();
if (!api.features) {
console.error("[BoardView] Features API not available");
// Keep cached features if API is unavailable
return;
}

@@ -441,10 +445,31 @@ export function BoardView() {
thinkingLevel: f.thinkingLevel || "none",
})
);
// Successfully loaded features - now safe to set them
setFeatures(featuresWithIds);

// Only clear categories on project switch AFTER successful load
if (isProjectSwitch) {
setPersistedCategories([]);
}
} else if (!result.success && result.error) {
console.error("[BoardView] API returned error:", result.error);
// If it's a new project or the error indicates no features found,
// that's expected - start with empty array
if (isProjectSwitch) {
setFeatures([]);
setPersistedCategories([]);
}
// Otherwise keep cached features
}
} catch (error) {
console.error("Failed to load features:", error);
// On error, keep existing cached features for the current project
// Only clear on project switch if we have no features from server
if (isProjectSwitch && cachedFeatures.length === 0) {
setFeatures([]);
setPersistedCategories([]);
}
} finally {
setIsLoading(false);
isInitialLoadRef.current = false;
@@ -558,10 +583,24 @@ export function BoardView() {
}

loadFeatures();
// Show error toast
toast.error("Agent encountered an error", {
description: event.error || "Check the logs for details",
});

// Check for authentication errors and show a more helpful message
const isAuthError = event.errorType === "authentication" ||
(event.error && (
event.error.includes("Authentication failed") ||
event.error.includes("Invalid API key")
));

if (isAuthError) {
toast.error("Authentication Failed", {
description: "Your API key is invalid or expired. Please check Settings or run 'claude login' in terminal.",
duration: 10000,
});
} else {
toast.error("Agent encountered an error", {
description: event.error || "Check the logs for details",
});
}
}
});

@@ -1475,8 +1514,14 @@ export function BoardView() {
if (isRunning) {
map.in_progress.push(f);
} else {
// Otherwise, use the feature's status
map[f.status].push(f);
// Otherwise, use the feature's status (fallback to backlog for unknown statuses)
const status = f.status as ColumnId;
if (map[status]) {
map[status].push(f);
} else {
// Unknown status, default to backlog
map.backlog.push(f);
}
}
});
@@ -207,10 +207,12 @@ export const KanbanCard = memo(function KanbanCard({
// - Backlog items can always be dragged
// - skipTests items can be dragged even when in_progress or verified (unless currently running)
// - waiting_approval items can always be dragged (to allow manual verification via drag)
// - Non-skipTests (TDD) items in progress or verified cannot be dragged
// - verified items can always be dragged (to allow moving back to waiting_approval or backlog)
// - Non-skipTests (TDD) items in progress cannot be dragged (they are running)
const isDraggable =
feature.status === "backlog" ||
feature.status === "waiting_approval" ||
feature.status === "verified" ||
(feature.skipTests && !isCurrentAutoTask);
const {
attributes,

@@ -82,12 +82,14 @@ export function SettingsView() {
// Compute the effective theme for the current project
const effectiveTheme = (settingsProject?.theme || theme) as Theme;

// Handler to set theme - saves to project if one is selected, otherwise to global
// Handler to set theme - always updates global theme (user's preference),
// and also sets per-project theme if a project is selected
const handleSetTheme = (newTheme: typeof theme) => {
// Always update global theme so user's preference persists across all projects
setTheme(newTheme);
// Also set per-project theme if a project is selected
if (currentProject) {
setProjectTheme(currentProject.id, newTheme);
} else {
setTheme(newTheme);
}
};
@@ -61,12 +61,16 @@ export function AuthenticationStatusDisplay({
{claudeAuthStatus.method === "oauth_token_env"
? "Using CLAUDE_CODE_OAUTH_TOKEN"
: claudeAuthStatus.method === "oauth_token"
? "Using stored OAuth token"
? "Using stored OAuth token (subscription)"
: claudeAuthStatus.method === "api_key_env"
? "Using ANTHROPIC_API_KEY"
: claudeAuthStatus.method === "api_key"
? "Using stored API key"
: "Unknown method"}
: claudeAuthStatus.method === "credentials_file"
? "Using credentials file"
: claudeAuthStatus.method === "cli_authenticated"
? "Using Claude CLI authentication"
: `Using ${claudeAuthStatus.method || "detected"} authentication`}
</span>
</div>
</>
@@ -107,14 +111,16 @@ export function AuthenticationStatusDisplay({
<div className="flex items-center gap-2 text-muted-foreground">
<Info className="w-3 h-3 shrink-0" />
<span>
{codexAuthStatus.method === "cli_verified" ||
codexAuthStatus.method === "cli_tokens"
{codexAuthStatus.method === "subscription"
? "Using Codex subscription (Plus/Team)"
: codexAuthStatus.method === "cli_verified" ||
codexAuthStatus.method === "cli_tokens"
? "Using CLI login (OpenAI account)"
: codexAuthStatus.method === "api_key"
? "Using stored API key"
: codexAuthStatus.method === "env"
? "Using OPENAI_API_KEY"
: "Unknown method"}
: `Using ${codexAuthStatus.method || "unknown"} authentication`}
</span>
</div>
</>
|
||||
try {
|
||||
const result = await api.setup.getClaudeStatus();
|
||||
if (result.success && result.auth) {
|
||||
const auth = result.auth;
|
||||
// Validate method is one of the expected values, default to "none"
|
||||
const validMethods = ["oauth_token_env", "oauth_token", "api_key", "api_key_env", "none"] as const;
|
||||
// Cast to extended type that includes server-added fields
|
||||
const auth = result.auth as typeof result.auth & {
|
||||
oauthTokenValid?: boolean;
|
||||
apiKeyValid?: boolean;
|
||||
};
|
||||
// Map server method names to client method types
|
||||
// Server returns: oauth_token_env, oauth_token, api_key_env, api_key, credentials_file, cli_authenticated, none
|
||||
const validMethods = ["oauth_token_env", "oauth_token", "api_key", "api_key_env", "credentials_file", "cli_authenticated", "none"] as const;
|
||||
type AuthMethod = typeof validMethods[number];
|
||||
const method: AuthMethod = validMethods.includes(auth.method as AuthMethod)
|
||||
? (auth.method as AuthMethod)
|
||||
: "none";
|
||||
: auth.authenticated ? "api_key" : "none"; // Default authenticated to api_key, not none
|
||||
const authStatus = {
|
||||
authenticated: auth.authenticated,
|
||||
method,
|
||||
hasCredentialsFile: auth.hasCredentialsFile ?? false,
|
||||
oauthTokenValid: auth.hasStoredOAuthToken || auth.hasEnvOAuthToken,
|
||||
apiKeyValid: auth.hasStoredApiKey || auth.hasEnvApiKey,
|
||||
oauthTokenValid: auth.oauthTokenValid || auth.hasStoredOAuthToken || auth.hasEnvOAuthToken,
|
||||
apiKeyValid: auth.apiKeyValid || auth.hasStoredApiKey || auth.hasEnvApiKey,
|
||||
hasEnvOAuthToken: auth.hasEnvOAuthToken,
|
||||
hasEnvApiKey: auth.hasEnvApiKey,
|
||||
};
|
||||
@@ -96,27 +101,30 @@ export function useCliStatus() {
|
||||
try {
|
||||
const result = await api.setup.getCodexStatus();
|
||||
if (result.success && result.auth) {
|
||||
const auth = result.auth;
|
||||
// Determine method - prioritize cli_verified and cli_tokens over auth_file
|
||||
const method =
|
||||
auth.method === "cli_verified" || auth.method === "cli_tokens"
|
||||
? auth.method === "cli_verified"
|
||||
? ("cli_verified" as const)
|
||||
: ("cli_tokens" as const)
|
||||
: auth.method === "auth_file"
|
||||
? ("api_key" as const)
|
||||
: auth.method === "env_var"
|
||||
? ("env" as const)
|
||||
: ("none" as const);
|
||||
// Cast to extended type that includes server-added fields
|
||||
const auth = result.auth as typeof result.auth & {
|
||||
hasSubscription?: boolean;
|
||||
cliLoggedIn?: boolean;
|
||||
hasEnvApiKey?: boolean;
|
||||
};
|
||||
// Map server method names to client method types
|
||||
// Server returns: subscription, cli_verified, cli_tokens, api_key, env, none
|
||||
const validMethods = ["subscription", "cli_verified", "cli_tokens", "api_key", "env", "none"] as const;
|
||||
type CodexMethod = typeof validMethods[number];
|
||||
const method: CodexMethod = validMethods.includes(auth.method as CodexMethod)
|
||||
? (auth.method as CodexMethod)
|
||||
: auth.authenticated ? "api_key" : "none"; // Default authenticated to api_key
|
||||
|
||||
const authStatus = {
|
||||
authenticated: auth.authenticated,
|
||||
method,
|
||||
// Only set apiKeyValid for actual API key methods, not CLI login
|
||||
// Only set apiKeyValid for actual API key methods, not CLI login or subscription
|
||||
apiKeyValid:
|
||||
method === "cli_verified" || method === "cli_tokens"
|
||||
method === "cli_verified" || method === "cli_tokens" || method === "subscription"
|
||||
? undefined
|
||||
: auth.hasAuthFile || auth.hasEnvKey,
|
||||
: auth.hasAuthFile || auth.hasEnvKey || auth.hasEnvApiKey,
|
||||
hasSubscription: auth.hasSubscription,
|
||||
cliLoggedIn: auth.cliLoggedIn,
|
||||
};
|
||||
setCodexAuthStatus(authStatus);
|
||||
}
|
||||
|
||||
@@ -40,6 +40,8 @@ export function useCliStatus({
|
||||
"oauth_token",
|
||||
"api_key",
|
||||
"api_key_env",
|
||||
"credentials_file",
|
||||
"cli_authenticated",
|
||||
"none",
|
||||
] as const;
|
||||
type AuthMethod = (typeof validMethods)[number];
|
||||
|
||||
@@ -14,7 +14,8 @@ import {
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
import { Save, RefreshCw, FileText, Sparkles, Loader2, FilePlus2, AlertCircle, ListPlus } from "lucide-react";
import { Save, RefreshCw, FileText, Sparkles, Loader2, FilePlus2, AlertCircle, ListPlus, CheckCircle2 } from "lucide-react";
import { toast } from "sonner";
import { Checkbox } from "@/components/ui/checkbox";
import { XmlSyntaxEditor } from "@/components/ui/xml-syntax-editor";
import type { SpecRegenerationEvent } from "@/types/electron";
@@ -311,14 +312,22 @@ export function SpecView() {
// The backend sends explicit signals for completion:
// 1. "All tasks completed" in the message
// 2. [Phase: complete] marker in logs
// 3. "Spec regeneration complete!" for regeneration
// 4. "Initial spec creation complete!" for creation without features
const isFinalCompletionMessage = event.message?.includes("All tasks completed") ||
event.message === "All tasks completed!" ||
event.message === "All tasks completed";
event.message === "All tasks completed" ||
event.message === "Spec regeneration complete!" ||
event.message === "Initial spec creation complete!";

const hasCompletePhase = logsRef.current.includes("[Phase: complete]");

// Intermediate completion means features are being generated after spec creation
const isIntermediateCompletion = event.message?.includes("Features are being generated") ||
event.message?.includes("features are being generated");

// Rely solely on explicit backend signals
const shouldComplete = isFinalCompletionMessage || hasCompletePhase;
const shouldComplete = (isFinalCompletionMessage || hasCompletePhase) && !isIntermediateCompletion;

if (shouldComplete) {
// Fully complete - clear all states immediately
@@ -337,9 +346,29 @@ export function SpecView() {
setProjectOverview("");
setErrorMessage("");
stateRestoredRef.current = false;
// Reload the spec to show the new content
loadSpec();
} else {

// Reload the spec with delay to ensure file is written to disk
setTimeout(() => {
loadSpec();
}, SPEC_FILE_WRITE_DELAY);

// Show success toast notification
const isRegeneration = event.message?.includes("regeneration");
const isFeatureGeneration = event.message?.includes("Feature generation");
toast.success(
isFeatureGeneration
? "Feature Generation Complete"
: isRegeneration
? "Spec Regeneration Complete"
: "Spec Creation Complete",
{
description: isFeatureGeneration
? "Features have been created from the app specification."
: "Your app specification has been saved.",
icon: <CheckCircle2 className="w-4 h-4" />,
}
);
} else if (isIntermediateCompletion) {
// Intermediate completion - keep state active for feature generation
setIsCreating(true);
setIsRegenerating(true);
@@ -21,7 +21,7 @@ import {
CardTitle,
} from "@/components/ui/card";
import { useAppStore } from "@/store/app-store";
import { getElectronAPI } from "@/lib/electron";
import { getElectronAPI, type Project } from "@/lib/electron";
import { initializeProject } from "@/lib/project-init";
import {
FolderOpen,
@@ -40,6 +40,8 @@ import {
DropdownMenuTrigger,
} from "@/components/ui/dropdown-menu";
import { toast } from "sonner";
import { WorkspacePickerModal } from "@/components/workspace-picker-modal";
import { getHttpApiClient } from "@/lib/http-api-client";

export function WelcomeView() {
const { projects, addProject, setCurrentProject, setCurrentView } =
@@ -57,6 +59,7 @@ export function WelcomeView() {
projectName: string;
projectPath: string;
} | null>(null);
const [showWorkspacePicker, setShowWorkspacePicker] = useState(false);

/**
* Kick off project analysis agent to analyze the codebase
@@ -105,14 +108,34 @@ export function WelcomeView() {
return;
}

const project = {
id: `project-${Date.now()}`,
name,
path,
lastOpened: new Date().toISOString(),
};
// Check if project already exists (by path) to preserve theme and other settings
const existingProject = projects.find((p) => p.path === path);

let project: Project;
if (existingProject) {
// Update existing project, preserving theme and other properties
project = {
...existingProject,
name, // Update name in case it changed
lastOpened: new Date().toISOString(),
};
// Update the project in the store (this will update the existing entry)
const updatedProjects = projects.map((p) =>
p.id === existingProject.id ? project : p
);
// We need to manually update projects since addProject would create a duplicate
useAppStore.setState({ projects: updatedProjects });
} else {
// Create new project
project = {
id: `project-${Date.now()}`,
name,
path,
lastOpened: new Date().toISOString(),
};
addProject(project);
}

addProject(project);
setCurrentProject(project);

// Show initialization dialog if files were created
@@ -148,21 +171,55 @@ export function WelcomeView() {
setIsOpening(false);
}
},
[addProject, setCurrentProject, analyzeProject]
[projects, addProject, setCurrentProject, analyzeProject]
);

const handleOpenProject = useCallback(async () => {
const api = getElectronAPI();
const result = await api.openDirectory();
try {
// Check if workspace is configured
const httpClient = getHttpApiClient();
const configResult = await httpClient.workspace.getConfig();

if (!result.canceled && result.filePaths[0]) {
const path = result.filePaths[0];
// Extract folder name from path (works on both Windows and Mac/Linux)
const name = path.split(/[/\\]/).filter(Boolean).pop() || "Untitled Project";
await initializeAndOpenProject(path, name);
if (configResult.success && configResult.configured) {
// Show workspace picker modal
setShowWorkspacePicker(true);
} else {
// Fall back to current behavior (native dialog or manual input)
const api = getElectronAPI();
const result = await api.openDirectory();

if (!result.canceled && result.filePaths[0]) {
const path = result.filePaths[0];
// Extract folder name from path (works on both Windows and Mac/Linux)
const name = path.split(/[/\\]/).filter(Boolean).pop() || "Untitled Project";
await initializeAndOpenProject(path, name);
}
}
} catch (error) {
console.error("[Welcome] Failed to check workspace config:", error);
// Fall back to current behavior on error
const api = getElectronAPI();
const result = await api.openDirectory();

if (!result.canceled && result.filePaths[0]) {
const path = result.filePaths[0];
const name = path.split(/[/\\]/).filter(Boolean).pop() || "Untitled Project";
await initializeAndOpenProject(path, name);
}
}
}, [initializeAndOpenProject]);

/**
* Handle selecting a project from workspace picker
*/
const handleWorkspaceSelect = useCallback(
async (path: string, name: string) => {
setShowWorkspacePicker(false);
await initializeAndOpenProject(path, name);
},
[initializeAndOpenProject]
);

/**
* Handle clicking on a recent project
*/
@@ -601,6 +658,13 @@ export function WelcomeView() {
</DialogContent>
</Dialog>

{/* Workspace Picker Modal */}
<WorkspacePickerModal
open={showWorkspacePicker}
onOpenChange={setShowWorkspacePicker}
onSelect={handleWorkspaceSelect}
/>

{/* Loading overlay when opening project */}
{isOpening && (
<div
apps/app/src/components/workspace-picker-modal.tsx (Normal file, 154 lines)
@@ -0,0 +1,154 @@
"use client";

import { useState, useEffect, useCallback } from "react";
import {
Dialog,
DialogContent,
DialogDescription,
DialogFooter,
DialogHeader,
DialogTitle,
} from "@/components/ui/dialog";
import { Button } from "@/components/ui/button";
import { Folder, Loader2, FolderOpen, AlertCircle } from "lucide-react";
import { getHttpApiClient } from "@/lib/http-api-client";

interface WorkspaceDirectory {
name: string;
path: string;
}

interface WorkspacePickerModalProps {
open: boolean;
onOpenChange: (open: boolean) => void;
onSelect: (path: string, name: string) => void;
}

export function WorkspacePickerModal({
open,
onOpenChange,
onSelect,
}: WorkspacePickerModalProps) {
const [isLoading, setIsLoading] = useState(false);
const [directories, setDirectories] = useState<WorkspaceDirectory[]>([]);
const [error, setError] = useState<string | null>(null);

const loadDirectories = useCallback(async () => {
setIsLoading(true);
setError(null);

try {
const client = getHttpApiClient();
const result = await client.workspace.getDirectories();

if (result.success && result.directories) {
setDirectories(result.directories);
} else {
setError(result.error || "Failed to load directories");
}
} catch (err) {
setError(err instanceof Error ? err.message : "Failed to load directories");
} finally {
setIsLoading(false);
}
}, []);

// Load directories when modal opens
useEffect(() => {
if (open) {
loadDirectories();
}
}, [open, loadDirectories]);

const handleSelect = (dir: WorkspaceDirectory) => {
onSelect(dir.path, dir.name);
};

return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="bg-card border-border max-w-lg max-h-[80vh] flex flex-col">
<DialogHeader>
<DialogTitle className="flex items-center gap-2 text-foreground">
<FolderOpen className="w-5 h-5 text-brand-500" />
Select Project
</DialogTitle>
<DialogDescription className="text-muted-foreground">
Choose a project from your workspace directory
</DialogDescription>
</DialogHeader>

<div className="flex-1 overflow-y-auto py-4 min-h-[200px]">
{isLoading && (
<div className="flex flex-col items-center justify-center h-full gap-3">
<Loader2 className="w-8 h-8 text-brand-500 animate-spin" />
<p className="text-sm text-muted-foreground">Loading projects...</p>
</div>
)}

{error && !isLoading && (
<div className="flex flex-col items-center justify-center h-full gap-3 text-center px-4">
<div className="w-12 h-12 rounded-full bg-destructive/10 flex items-center justify-center">
<AlertCircle className="w-6 h-6 text-destructive" />
</div>
<p className="text-sm text-destructive">{error}</p>
<Button
variant="secondary"
size="sm"
onClick={loadDirectories}
className="mt-2"
>
Try Again
</Button>
</div>
)}

{!isLoading && !error && directories.length === 0 && (
<div className="flex flex-col items-center justify-center h-full gap-3 text-center px-4">
<div className="w-12 h-12 rounded-full bg-muted flex items-center justify-center">
<Folder className="w-6 h-6 text-muted-foreground" />
</div>
<p className="text-sm text-muted-foreground">
No projects found in workspace directory
</p>
</div>
)}

{!isLoading && !error && directories.length > 0 && (
<div className="space-y-2">
{directories.map((dir) => (
<button
key={dir.path}
onClick={() => handleSelect(dir)}
className="w-full flex items-center gap-3 p-3 rounded-lg border border-border bg-card hover:bg-card/70 hover:border-brand-500/50 transition-all duration-200 text-left group"
data-testid={`workspace-dir-${dir.name}`}
>
<div className="w-10 h-10 rounded-lg bg-muted border border-border flex items-center justify-center group-hover:border-brand-500/50 transition-colors shrink-0">
<Folder className="w-5 h-5 text-muted-foreground group-hover:text-brand-500 transition-colors" />
</div>
<div className="flex-1 min-w-0">
<p className="font-medium text-foreground truncate group-hover:text-brand-500 transition-colors">
{dir.name}
</p>
<p className="text-xs text-muted-foreground/70 truncate">
{dir.path}
</p>
</div>
</button>
))}
</div>
)}
</div>

<DialogFooter>
<Button
variant="ghost"
onClick={() => onOpenChange(false)}
className="text-muted-foreground hover:text-foreground hover:bg-accent"
>
Cancel
</Button>
</DialogFooter>
</DialogContent>
</Dialog>
);
}
apps/app/src/contexts/file-browser-context.tsx (Normal file, 68 lines)
@@ -0,0 +1,68 @@
"use client";

import { createContext, useContext, useState, useCallback, type ReactNode } from "react";
import { FileBrowserDialog } from "@/components/dialogs/file-browser-dialog";

interface FileBrowserContextValue {
openFileBrowser: () => Promise<string | null>;
}

const FileBrowserContext = createContext<FileBrowserContextValue | null>(null);

export function FileBrowserProvider({ children }: { children: ReactNode }) {
const [isOpen, setIsOpen] = useState(false);
const [resolver, setResolver] = useState<((value: string | null) => void) | null>(null);

const openFileBrowser = useCallback((): Promise<string | null> => {
return new Promise((resolve) => {
setIsOpen(true);
setResolver(() => resolve);
});
}, []);

const handleSelect = useCallback((path: string) => {
if (resolver) {
resolver(path);
setResolver(null);
}
setIsOpen(false);
}, [resolver]);

const handleOpenChange = useCallback((open: boolean) => {
if (!open && resolver) {
resolver(null);
setResolver(null);
}
setIsOpen(open);
}, [resolver]);

return (
<FileBrowserContext.Provider value={{ openFileBrowser }}>
{children}
<FileBrowserDialog
open={isOpen}
onOpenChange={handleOpenChange}
onSelect={handleSelect}
/>
</FileBrowserContext.Provider>
);
}

export function useFileBrowser() {
const context = useContext(FileBrowserContext);
if (!context) {
throw new Error("useFileBrowser must be used within FileBrowserProvider");
}
return context;
}

// Global reference for non-React code (like HttpApiClient)
let globalFileBrowserFn: (() => Promise<string | null>) | null = null;

export function setGlobalFileBrowser(fn: () => Promise<string | null>) {
globalFileBrowserFn = fn;
}

export function getGlobalFileBrowser() {
return globalFileBrowserFn;
}
@@ -121,11 +121,26 @@ export function useAutoMode() {
case "auto_mode_error":
console.error("[AutoMode Error]", event.error);
if (event.featureId && event.error) {
// Check for authentication errors and provide a more helpful message
const isAuthError = event.errorType === "authentication" ||
event.error.includes("Authentication failed") ||
event.error.includes("Invalid API key");

const errorMessage = isAuthError
? `Authentication failed: Please check your API key in Settings or run 'claude login' in terminal to re-authenticate.`
: event.error;

addAutoModeActivity({
featureId: event.featureId,
type: "error",
message: event.error,
message: errorMessage,
errorType: isAuthError ? "authentication" : "execution",
});

// Remove the task from running since it failed
if (eventProjectId) {
removeRunningTask(eventProjectId, event.featureId);
}
}
break;
@@ -2,6 +2,7 @@ import { useState, useEffect, useCallback, useRef } from "react";
import type { Message, StreamEvent } from "@/types/electron";
import { useMessageQueue } from "./use-message-queue";
import type { ImageAttachment } from "@/store/app-store";
import { getElectronAPI } from "@/lib/electron";

interface UseElectronAgentOptions {
sessionId: string;
@@ -44,8 +45,9 @@ export function useElectronAgent({
// Send message directly to the agent (bypassing queue)
const sendMessageDirectly = useCallback(
async (content: string, images?: ImageAttachment[]) => {
if (!window.electronAPI?.agent) {
setError("Electron API not available");
const api = getElectronAPI();
if (!api?.agent) {
setError("API not available");
return;
}

@@ -64,10 +66,10 @@ export function useElectronAgent({

// Save images to .automaker/images and get paths
let imagePaths: string[] | undefined;
if (images && images.length > 0) {
if (images && images.length > 0 && api.saveImageToTemp) {
imagePaths = [];
for (const image of images) {
const result = await window.electronAPI.saveImageToTemp(
const result = await api.saveImageToTemp(
image.data,
image.filename,
image.mimeType,
@@ -82,7 +84,7 @@ export function useElectronAgent({
}
}

const result = await window.electronAPI.agent.send(
const result = await api.agent!.send(
sessionId,
content,
workingDirectory,
@@ -120,8 +122,9 @@ export function useElectronAgent({

// Initialize connection and load history
useEffect(() => {
if (!window.electronAPI?.agent) {
setError("Electron API not available. Please run in Electron.");
const api = getElectronAPI();
if (!api?.agent) {
setError("API not available.");
return;
}

@@ -142,7 +145,7 @@ export function useElectronAgent({

try {
console.log("[useElectronAgent] Starting session:", sessionId);
const result = await window.electronAPI.agent.start(
const result = await api.agent!.start(
sessionId,
workingDirectory
);
@@ -155,7 +158,7 @@ export function useElectronAgent({
setIsConnected(true);

// Check if the agent is currently running for this session
const historyResult = await window.electronAPI.agent.getHistory(sessionId);
const historyResult = await api.agent!.getHistory(sessionId);
if (mounted && historyResult.success) {
const isRunning = historyResult.isRunning || false;
console.log("[useElectronAgent] Session running state:", isRunning);
@@ -190,7 +193,8 @@ export function useElectronAgent({

// Subscribe to streaming events
useEffect(() => {
if (!window.electronAPI?.agent) return;
const api = getElectronAPI();
if (!api?.agent) return;
if (!sessionId) return; // Don't subscribe if no session

console.log("[useElectronAgent] Subscribing to stream events for session:", sessionId);
@@ -282,7 +286,7 @@ export function useElectronAgent({
}
};

unsubscribeRef.current = window.electronAPI.agent.onStream(handleStream);
unsubscribeRef.current = api.agent!.onStream(handleStream as (data: unknown) => void);

return () => {
if (unsubscribeRef.current) {
@@ -296,8 +300,9 @@ export function useElectronAgent({
// Send a message to the agent
const sendMessage = useCallback(
async (content: string, images?: ImageAttachment[]) => {
if (!window.electronAPI?.agent) {
setError("Electron API not available");
const api = getElectronAPI();
if (!api?.agent) {
setError("API not available");
return;
}

@@ -317,10 +322,10 @@ export function useElectronAgent({

// Save images to .automaker/images and get paths
let imagePaths: string[] | undefined;
if (images && images.length > 0) {
if (images && images.length > 0 && api.saveImageToTemp) {
imagePaths = [];
for (const image of images) {
const result = await window.electronAPI.saveImageToTemp(
const result = await api.saveImageToTemp(
image.data,
image.filename,
image.mimeType,
@@ -335,7 +340,7 @@ export function useElectronAgent({
}
}

const result = await window.electronAPI.agent.send(
const result = await api.agent!.send(
sessionId,
content,
workingDirectory,
@@ -359,14 +364,15 @@ export function useElectronAgent({

// Stop current execution
const stopExecution = useCallback(async () => {
if (!window.electronAPI?.agent) {
setError("Electron API not available");
const api = getElectronAPI();
if (!api?.agent) {
setError("API not available");
return;
}

try {
console.log("[useElectronAgent] Stopping execution");
const result = await window.electronAPI.agent.stop(sessionId);
const result = await api.agent!.stop(sessionId);

if (!result.success) {
setError(result.error || "Failed to stop execution");
@@ -381,14 +387,15 @@ export function useElectronAgent({

// Clear conversation history
const clearHistory = useCallback(async () => {
if (!window.electronAPI?.agent) {
setError("Electron API not available");
const api = getElectronAPI();
if (!api?.agent) {
setError("API not available");
return;
}

try {
console.log("[useElectronAgent] Clearing history");
const result = await window.electronAPI.agent.clear(sessionId);
const result = await api.agent!.clear(sessionId);

if (result.success) {
setMessages([]);
@@ -1,4 +1,5 @@
// Type definitions for Electron IPC API
import type { SessionListItem, Message } from "@/types/electron";

export interface FileEntry {
name: string;
@@ -413,6 +414,68 @@ export interface ElectronAPI {
onInstallProgress?: (callback: (progress: any) => void) => () => void;
onAuthProgress?: (callback: (progress: any) => void) => () => void;
};
agent?: {
start: (
sessionId: string,
workingDirectory?: string
) => Promise<{
success: boolean;
messages?: Message[];
error?: string;
}>;
send: (
sessionId: string,
message: string,
workingDirectory?: string,
imagePaths?: string[]
) => Promise<{ success: boolean; error?: string }>;
getHistory: (sessionId: string) => Promise<{
success: boolean;
messages?: Message[];
isRunning?: boolean;
error?: string;
}>;
stop: (sessionId: string) => Promise<{ success: boolean; error?: string }>;
clear: (sessionId: string) => Promise<{ success: boolean; error?: string }>;
onStream: (callback: (data: unknown) => void) => () => void;
};
sessions?: {
list: (includeArchived?: boolean) => Promise<{
success: boolean;
sessions?: SessionListItem[];
error?: string;
}>;
create: (
name: string,
projectPath: string,
workingDirectory?: string
) => Promise<{
success: boolean;
session?: {
id: string;
name: string;
projectPath: string;
workingDirectory?: string;
createdAt: string;
updatedAt: string;
};
error?: string;
}>;
update: (
sessionId: string,
name?: string,
tags?: string[]
) => Promise<{ success: boolean; error?: string }>;
archive: (
sessionId: string
) => Promise<{ success: boolean; error?: string }>;
unarchive: (
sessionId: string
) => Promise<{ success: boolean; error?: string }>;
delete: (
sessionId: string
) => Promise<{ success: boolean; error?: string }>;
};
}

// Note: Window interface is declared in @/types/electron.d.ts
@@ -438,18 +501,92 @@ const STORAGE_KEYS = {
// Mock file system using localStorage
const mockFileSystem: Record<string, string> = {};

// Check if we're in Electron
// Check if we're in Electron (for UI indicators only)
export const isElectron = (): boolean => {
return typeof window !== "undefined" && window.isElectron === true;
};

// Get the Electron API or a mock for web development
// Check if backend server is available
let serverAvailable: boolean | null = null;
let serverCheckPromise: Promise<boolean> | null = null;

export const checkServerAvailable = async (): Promise<boolean> => {
if (serverAvailable !== null) return serverAvailable;
if (serverCheckPromise) return serverCheckPromise;

serverCheckPromise = (async () => {
try {
const serverUrl =
process.env.NEXT_PUBLIC_SERVER_URL || "http://localhost:3008";
const response = await fetch(`${serverUrl}/api/health`, {
method: "GET",
signal: AbortSignal.timeout(2000),
});
serverAvailable = response.ok;
} catch {
serverAvailable = false;
}
return serverAvailable;
})();

return serverCheckPromise;
};

// Reset server check (useful for retrying connection)
export const resetServerCheck = (): void => {
serverAvailable = null;
serverCheckPromise = null;
};

// Cached HTTP client instance
let httpClientInstance: ElectronAPI | null = null;

/**
* Get the HTTP API client
*
* All API calls go through HTTP to the backend server.
* This is the only transport mode supported.
*/
export const getElectronAPI = (): ElectronAPI => {
if (isElectron() && window.electronAPI) {
return window.electronAPI;
if (typeof window === "undefined") {
throw new Error("Cannot get API during SSR");
}

// Return mock API for web development
if (!httpClientInstance) {
const { getHttpApiClient } = require("./http-api-client");
httpClientInstance = getHttpApiClient();
}
return httpClientInstance!;
};

// Async version (same as sync since HTTP client is synchronously instantiated)
export const getElectronAPIAsync = async (): Promise<ElectronAPI> => {
return getElectronAPI();
};

// Check if backend is connected (for showing connection status in UI)
export const isBackendConnected = async (): Promise<boolean> => {
return await checkServerAvailable();
};

/**
* Get the current API mode being used
* Always returns "http" since that's the only mode now
*/
export const getCurrentApiMode = (): "http" => {
return "http";
};

// Debug helpers
if (typeof window !== "undefined") {
(window as any).__checkApiMode = () => {
console.log("Current API mode:", getCurrentApiMode());
console.log("isElectron():", isElectron());
};
}

// Mock API for development/fallback when no backend is available
const getMockElectronAPI = (): ElectronAPI => {
return {
ping: async () => "pong (mock)",
@@ -748,17 +885,21 @@ interface SetupAPI {
getClaudeStatus: () => Promise<{
success: boolean;
status?: string;
installed?: boolean;
method?: string;
version?: string;
path?: string;
auth?: {
authenticated: boolean;
method: string;
hasCredentialsFile: boolean;
hasToken: boolean;
hasCredentialsFile?: boolean;
hasToken?: boolean;
hasStoredOAuthToken?: boolean;
hasStoredApiKey?: boolean;
hasEnvApiKey?: boolean;
hasEnvOAuthToken?: boolean;
hasCliAuth?: boolean;
hasRecentActivity?: boolean;
};
error?: string;
}>;
@@ -838,11 +979,14 @@ function createMockSetupAPI(): SetupAPI {
return {
success: true,
status: "not_installed",
installed: false,
auth: {
authenticated: false,
method: "none",
hasCredentialsFile: false,
hasToken: false,
hasCliAuth: false,
hasRecentActivity: false,
},
};
},
@@ -1866,7 +2010,9 @@ function createMockSpecRegenerationAPI(): SpecRegenerationAPI {
}

mockSpecRegenerationRunning = true;
console.log(`[Mock] Generating features from existing spec for: ${projectPath}`);
console.log(
`[Mock] Generating features from existing spec for: ${projectPath}`
);

// Simulate async feature generation
simulateFeatureGeneration(projectPath);
@@ -2053,7 +2199,8 @@ async function simulateFeatureGeneration(projectPath: string) {
mockSpecRegenerationPhase = "initialization";
emitSpecRegenerationEvent({
type: "spec_regeneration_progress",
content: "[Phase: initialization] Starting feature generation from existing app_spec.txt...\n",
content:
"[Phase: initialization] Starting feature generation from existing app_spec.txt...\n",
});

await new Promise((resolve) => {
apps/app/src/lib/file-picker.ts (Normal file, 279 lines)
@@ -0,0 +1,279 @@
/**
* File Picker Utility for Web Browsers
*
* Provides cross-platform file and directory selection using:
* 1. HTML5 webkitdirectory input - primary method (works on Windows)
* 2. File System Access API (showDirectoryPicker) - fallback for modern browsers
*
* Note: Browsers don't expose absolute file paths for security reasons.
* This implementation extracts directory information and may require
* user confirmation or server-side path resolution.
*/

/**
* Directory picker result with structure information for server-side resolution
*/
export interface DirectoryPickerResult {
directoryName: string;
sampleFiles: string[]; // Relative paths of sample files for identification
fileCount: number;
}

/**
* Opens a directory picker dialog
* @returns Promise resolving to directory information, or null if canceled
*
* Note: Browsers don't expose absolute file paths for security reasons.
* This function returns directory structure information that the server
* can use to locate the actual directory path.
*/
export async function openDirectoryPicker(): Promise<DirectoryPickerResult | null> {
// Use webkitdirectory (works on Windows and all modern browsers)
return new Promise<DirectoryPickerResult | null>((resolve) => {
let resolved = false;
const input = document.createElement("input");
input.type = "file";
input.webkitdirectory = true;
input.style.display = "none";

const cleanup = () => {
if (input.parentNode) {
document.body.removeChild(input);
}
};

let changeEventFired = false;
let focusTimeout: ReturnType<typeof setTimeout> | null = null;

const safeResolve = (value: DirectoryPickerResult | null) => {
if (!resolved) {
resolved = true;
changeEventFired = true;
if (focusTimeout) {
clearTimeout(focusTimeout);
focusTimeout = null;
}
cleanup();
resolve(value);
}
};

input.addEventListener("change", (e) => {
changeEventFired = true;
if (focusTimeout) {
clearTimeout(focusTimeout);
focusTimeout = null;
}

console.log("[FilePicker] Change event fired");
const files = input.files;
console.log("[FilePicker] Files selected:", files?.length || 0);

if (!files || files.length === 0) {
console.log("[FilePicker] No files selected");
safeResolve(null);
return;
}

const firstFile = files[0];
console.log("[FilePicker] First file:", {
name: firstFile.name,
webkitRelativePath: firstFile.webkitRelativePath,
// @ts-expect-error
path: firstFile.path,
});

// Extract directory name from webkitRelativePath
// webkitRelativePath format: "directoryName/subfolder/file.txt" or "directoryName/file.txt"
let directoryName = "Selected Directory";

// Method 1: Try to get absolute path from File object (non-standard, works in Electron/Chromium)
// @ts-expect-error - path property is non-standard but available in some browsers
if (firstFile.path) {
// @ts-expect-error
const filePath = firstFile.path as string;
console.log("[FilePicker] Found file.path:", filePath);
// Extract directory path (remove filename)
const lastSeparator = Math.max(
filePath.lastIndexOf("\\"),
filePath.lastIndexOf("/")
);
if (lastSeparator > 0) {
const absolutePath = filePath.substring(0, lastSeparator);
console.log("[FilePicker] Found absolute path:", absolutePath);
// Return as directory name for now - server can validate it directly
directoryName = absolutePath;
}
}

// Method 2: Extract directory name from webkitRelativePath
if (directoryName === "Selected Directory" && firstFile.webkitRelativePath) {
const relativePath = firstFile.webkitRelativePath;
console.log("[FilePicker] Using webkitRelativePath:", relativePath);
const pathParts = relativePath.split("/");
if (pathParts.length > 0) {
directoryName = pathParts[0]; // Top-level directory name
console.log("[FilePicker] Extracted directory name:", directoryName);
}
}

// Collect sample file paths for server-side directory matching
// Take first 10 files to identify the directory
const sampleFiles: string[] = [];
|
||||
const maxSamples = 10;
|
||||
for (let i = 0; i < Math.min(files.length, maxSamples); i++) {
|
||||
const file = files[i];
|
||||
if (file.webkitRelativePath) {
|
||||
sampleFiles.push(file.webkitRelativePath);
|
||||
} else if (file.name) {
|
||||
sampleFiles.push(file.name);
|
||||
}
|
||||
}
|
||||
|
||||
console.log("[FilePicker] Directory info:", {
|
||||
directoryName,
|
||||
fileCount: files.length,
|
||||
sampleFiles: sampleFiles.slice(0, 5), // Log first 5
|
||||
});
|
||||
|
||||
safeResolve({
|
||||
directoryName,
|
||||
sampleFiles,
|
||||
fileCount: files.length,
|
||||
});
|
||||
});
|
||||
|
||||
// Handle cancellation - but be very careful not to interfere with change event
|
||||
// On Windows, the dialog might take time to process, so we wait longer
|
||||
const handleFocus = () => {
|
||||
// Wait longer on Windows - the dialog might take time to process
|
||||
// Only resolve as canceled if change event hasn't fired after a delay
|
||||
focusTimeout = setTimeout(() => {
|
||||
if (!resolved && !changeEventFired && (!input.files || input.files.length === 0)) {
|
||||
console.log("[FilePicker] Dialog canceled (no files after focus and no change event)");
|
||||
safeResolve(null);
|
||||
}
|
||||
}, 2000); // Increased timeout for Windows - give it time
|
||||
};
|
||||
|
||||
// Add to DOM temporarily
|
||||
document.body.appendChild(input);
|
||||
console.log("[FilePicker] Opening directory picker...");
|
||||
|
||||
// Try to show picker programmatically
|
||||
if ("showPicker" in HTMLInputElement.prototype) {
|
||||
try {
|
||||
(input as any).showPicker();
|
||||
console.log("[FilePicker] Using showPicker()");
|
||||
} catch (error) {
|
||||
console.log("[FilePicker] showPicker() failed, using click()", error);
|
||||
input.click();
|
||||
}
|
||||
} else {
|
||||
console.log("[FilePicker] Using click()");
|
||||
input.click();
|
||||
}
|
||||
|
||||
// Set up cancellation detection with longer delay
|
||||
// Only add focus listener if we're not already resolved
|
||||
window.addEventListener("focus", handleFocus, { once: true });
|
||||
|
||||
// Also handle blur as a cancellation signal (but with delay)
|
||||
window.addEventListener("blur", () => {
|
||||
// Dialog opened, wait for it to close
|
||||
setTimeout(() => {
|
||||
window.addEventListener("focus", handleFocus, { once: true });
|
||||
}, 100);
|
||||
}, { once: true });
|
||||
});
|
||||
}
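// Example usage (illustrative; the caller and logging are hypothetical):
//
// const picked = await openDirectoryPicker();
// if (picked) {
//   console.log(picked.directoryName, picked.fileCount, picked.sampleFiles);
// }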
|
||||
|
||||
/**
|
||||
* Opens a file picker dialog
|
||||
* @param options Optional configuration (multiple files, file types, etc.)
|
||||
* @returns Promise resolving to selected file path(s), or null if canceled
|
||||
*/
|
||||
export async function openFilePicker(
|
||||
options?: {
|
||||
multiple?: boolean;
|
||||
accept?: string;
|
||||
}
|
||||
): Promise<string | string[] | null> {
|
||||
// Use standard file input (works on all browsers including Windows)
|
||||
return new Promise<string | string[] | null>((resolve) => {
|
||||
const input = document.createElement("input");
|
||||
input.type = "file";
|
||||
input.multiple = options?.multiple ?? false;
|
||||
if (options?.accept) {
|
||||
input.accept = options.accept;
|
||||
}
|
||||
input.style.display = "none";
|
||||
|
||||
const cleanup = () => {
|
||||
if (input.parentNode) {
|
||||
document.body.removeChild(input);
|
||||
}
|
||||
};
|
||||
|
||||
input.addEventListener("change", () => {
|
||||
const files = input.files;
|
||||
if (!files || files.length === 0) {
|
||||
cleanup();
|
||||
resolve(null);
|
||||
return;
|
||||
}
|
||||
|
||||
// Try to extract paths from File objects
|
||||
const extractPath = (file: File): string => {
|
||||
// Try to get path from File object (non-standard, but available in some browsers)
|
||||
// @ts-expect-error - path property is non-standard
|
||||
if (file.path) {
|
||||
// @ts-expect-error
|
||||
return file.path as string;
|
||||
}
|
||||
// Fallback to filename (server will need to resolve)
|
||||
return file.name;
|
||||
};
|
||||
|
||||
if (options?.multiple) {
|
||||
const paths = Array.from(files).map(extractPath);
|
||||
cleanup();
|
||||
resolve(paths);
|
||||
} else {
|
||||
const path = extractPath(files[0]);
|
||||
cleanup();
|
||||
resolve(path);
|
||||
}
|
||||
});
|
||||
|
||||
// Handle window focus (user may have canceled)
|
||||
const handleFocus = () => {
|
||||
setTimeout(() => {
|
||||
if (!input.files || input.files.length === 0) {
|
||||
cleanup();
|
||||
resolve(null);
|
||||
}
|
||||
}, 200);
|
||||
};
|
||||
|
||||
// Add to DOM temporarily
|
||||
document.body.appendChild(input);
|
||||
|
||||
// Try to show picker programmatically
|
||||
// Note: showPicker() is available in modern browsers; it can throw (for
// example, when not triggered by a user gesture), so we guard it with try/catch.
|
||||
if ("showPicker" in HTMLInputElement.prototype) {
|
||||
try {
|
||||
(input as any).showPicker();
|
||||
} catch {
|
||||
// Fallback to click if showPicker fails
|
||||
input.click();
|
||||
}
|
||||
} else {
|
||||
input.click();
|
||||
}
|
||||
|
||||
// Set up cancellation detection
|
||||
window.addEventListener("focus", handleFocus, { once: true });
|
||||
});
|
||||
}
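// Example usage (illustrative): selecting a single image. As noted above, in
// a plain browser the resolved value may be only a file name rather than an
// absolute path, so the server may still need to resolve it.
//
// const selection = await openFilePicker({ accept: "image/*" });
// if (typeof selection === "string") {
//   console.log("Selected:", selection);
// }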
|
||||
722  apps/app/src/lib/http-api-client.ts  (Normal file)
@@ -0,0 +1,722 @@
|
||||
/**
|
||||
* HTTP API Client for web mode
|
||||
*
|
||||
* This client provides the same API as the Electron IPC bridge,
|
||||
* but communicates with the backend server via HTTP/WebSocket.
|
||||
*/
|
||||
|
||||
import type {
|
||||
ElectronAPI,
|
||||
FileResult,
|
||||
WriteResult,
|
||||
ReaddirResult,
|
||||
StatResult,
|
||||
DialogResult,
|
||||
SaveImageResult,
|
||||
AutoModeAPI,
|
||||
FeaturesAPI,
|
||||
SuggestionsAPI,
|
||||
SpecRegenerationAPI,
|
||||
AutoModeEvent,
|
||||
SuggestionsEvent,
|
||||
SpecRegenerationEvent,
|
||||
FeatureSuggestion,
|
||||
SuggestionType,
|
||||
} from "./electron";
|
||||
import type { Message, SessionListItem } from "@/types/electron";
|
||||
import type { Feature } from "@/store/app-store";
|
||||
import type {
|
||||
WorktreeAPI,
|
||||
GitAPI,
|
||||
ModelDefinition,
|
||||
ProviderStatus,
|
||||
} from "@/types/electron";
|
||||
import { getGlobalFileBrowser } from "@/contexts/file-browser-context";
|
||||
|
||||
|
||||
// Server URL - configurable via environment variable
|
||||
const getServerUrl = (): string => {
|
||||
if (typeof window !== "undefined") {
|
||||
const envUrl = process.env.NEXT_PUBLIC_SERVER_URL;
|
||||
if (envUrl) return envUrl;
|
||||
}
|
||||
return "http://localhost:3008";
|
||||
};
|
||||
|
||||
|
||||
// Get API key from environment variable
|
||||
const getApiKey = (): string | null => {
|
||||
if (typeof window !== "undefined") {
|
||||
return process.env.NEXT_PUBLIC_AUTOMAKER_API_KEY || null;
|
||||
}
|
||||
return null;
|
||||
};
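// Example configuration (illustrative; values are placeholders). In a Next.js
// app these would typically be set in .env.local; without them the defaults
// above (http://localhost:3008, no API key) apply.
//
// NEXT_PUBLIC_SERVER_URL=http://localhost:3008
// NEXT_PUBLIC_AUTOMAKER_API_KEY=your-api-key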
|
||||
|
||||
type EventType =
|
||||
| "agent:stream"
|
||||
| "auto-mode:event"
|
||||
| "suggestions:event"
|
||||
| "spec-regeneration:event";
|
||||
|
||||
type EventCallback = (payload: unknown) => void;
|
||||
|
||||
/**
|
||||
* HTTP API Client that implements ElectronAPI interface
|
||||
*/
|
||||
export class HttpApiClient implements ElectronAPI {
|
||||
private serverUrl: string;
|
||||
private ws: WebSocket | null = null;
|
||||
private eventCallbacks: Map<EventType, Set<EventCallback>> = new Map();
|
||||
private reconnectTimer: NodeJS.Timeout | null = null;
|
||||
private isConnecting = false;
|
||||
|
||||
constructor() {
|
||||
this.serverUrl = getServerUrl();
|
||||
this.connectWebSocket();
|
||||
}
|
||||
|
||||
private connectWebSocket(): void {
|
||||
if (this.isConnecting || (this.ws && this.ws.readyState === WebSocket.OPEN)) {
|
||||
return;
|
||||
}
|
||||
|
||||
this.isConnecting = true;
|
||||
|
||||
try {
|
||||
const wsUrl = this.serverUrl.replace(/^http/, "ws") + "/api/events";
|
||||
this.ws = new WebSocket(wsUrl);
|
||||
|
||||
this.ws.onopen = () => {
|
||||
console.log("[HttpApiClient] WebSocket connected");
|
||||
this.isConnecting = false;
|
||||
if (this.reconnectTimer) {
|
||||
clearTimeout(this.reconnectTimer);
|
||||
this.reconnectTimer = null;
|
||||
}
|
||||
};
|
||||
|
||||
this.ws.onmessage = (event) => {
|
||||
try {
|
||||
const data = JSON.parse(event.data);
|
||||
const callbacks = this.eventCallbacks.get(data.type);
|
||||
if (callbacks) {
|
||||
callbacks.forEach((cb) => cb(data.payload));
|
||||
}
|
||||
} catch (error) {
|
||||
console.error("[HttpApiClient] Failed to parse WebSocket message:", error);
|
||||
}
|
||||
};
|
||||
|
||||
this.ws.onclose = () => {
|
||||
console.log("[HttpApiClient] WebSocket disconnected");
|
||||
this.isConnecting = false;
|
||||
this.ws = null;
|
||||
// Attempt to reconnect after 5 seconds
|
||||
if (!this.reconnectTimer) {
|
||||
this.reconnectTimer = setTimeout(() => {
|
||||
this.reconnectTimer = null;
|
||||
this.connectWebSocket();
|
||||
}, 5000);
|
||||
}
|
||||
};
|
||||
|
||||
this.ws.onerror = (error) => {
|
||||
console.error("[HttpApiClient] WebSocket error:", error);
|
||||
this.isConnecting = false;
|
||||
};
|
||||
} catch (error) {
|
||||
console.error("[HttpApiClient] Failed to create WebSocket:", error);
|
||||
this.isConnecting = false;
|
||||
}
|
||||
}
|
||||
|
||||
private subscribeToEvent(type: EventType, callback: EventCallback): () => void {
|
||||
if (!this.eventCallbacks.has(type)) {
|
||||
this.eventCallbacks.set(type, new Set());
|
||||
}
|
||||
this.eventCallbacks.get(type)!.add(callback);
|
||||
|
||||
// Ensure WebSocket is connected
|
||||
this.connectWebSocket();
|
||||
|
||||
return () => {
|
||||
const callbacks = this.eventCallbacks.get(type);
|
||||
if (callbacks) {
|
||||
callbacks.delete(callback);
|
||||
}
|
||||
};
|
||||
}
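// The public on* methods below (for example autoMode.onEvent and
// agent.onStream) wrap this helper, and each returns the unsubscribe function
// produced here. Illustrative usage, where `client` is a hypothetical
// HttpApiClient instance:
//
// const unsubscribe = client.autoMode.onEvent((event) => {
//   console.log("auto-mode event:", event);
// });
// // ...later, when the listener is no longer needed:
// unsubscribe();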
|
||||
|
||||
private getHeaders(): Record<string, string> {
|
||||
const headers: Record<string, string> = {
|
||||
"Content-Type": "application/json",
|
||||
};
|
||||
const apiKey = getApiKey();
|
||||
if (apiKey) {
|
||||
headers["X-API-Key"] = apiKey;
|
||||
}
|
||||
return headers;
|
||||
}
|
||||
|
||||
private async post<T>(endpoint: string, body?: unknown): Promise<T> {
|
||||
const response = await fetch(`${this.serverUrl}${endpoint}`, {
|
||||
method: "POST",
|
||||
headers: this.getHeaders(),
|
||||
body: body ? JSON.stringify(body) : undefined,
|
||||
});
|
||||
return response.json();
|
||||
}
|
||||
|
||||
private async get<T>(endpoint: string): Promise<T> {
|
||||
const headers = this.getHeaders();
|
||||
const response = await fetch(`${this.serverUrl}${endpoint}`, { headers });
|
||||
return response.json();
|
||||
}
|
||||
|
||||
private async put<T>(endpoint: string, body?: unknown): Promise<T> {
|
||||
const response = await fetch(`${this.serverUrl}${endpoint}`, {
|
||||
method: "PUT",
|
||||
headers: this.getHeaders(),
|
||||
body: body ? JSON.stringify(body) : undefined,
|
||||
});
|
||||
return response.json();
|
||||
}
|
||||
|
||||
private async httpDelete<T>(endpoint: string): Promise<T> {
|
||||
const response = await fetch(`${this.serverUrl}${endpoint}`, {
|
||||
method: "DELETE",
|
||||
headers: this.getHeaders(),
|
||||
});
|
||||
return response.json();
|
||||
}
|
||||
|
||||
// Basic operations
|
||||
async ping(): Promise<string> {
|
||||
const result = await this.get<{ status: string }>("/api/health");
|
||||
return result.status === "ok" ? "pong" : "error";
|
||||
}
|
||||
|
||||
async openExternalLink(url: string): Promise<{ success: boolean; error?: string }> {
|
||||
// Open in new tab
|
||||
window.open(url, "_blank", "noopener,noreferrer");
|
||||
return { success: true };
|
||||
}
|
||||
|
||||
// File picker - uses server-side file browser dialog
|
||||
async openDirectory(): Promise<DialogResult> {
|
||||
const fileBrowser = getGlobalFileBrowser();
|
||||
|
||||
if (!fileBrowser) {
|
||||
console.error("File browser not initialized");
|
||||
return { canceled: true, filePaths: [] };
|
||||
}
|
||||
|
||||
const path = await fileBrowser();
|
||||
|
||||
if (!path) {
|
||||
return { canceled: true, filePaths: [] };
|
||||
}
|
||||
|
||||
// Validate with server
|
||||
const result = await this.post<{
|
||||
success: boolean;
|
||||
path?: string;
|
||||
error?: string;
|
||||
}>("/api/fs/validate-path", { filePath: path });
|
||||
|
||||
if (result.success && result.path) {
|
||||
return { canceled: false, filePaths: [result.path] };
|
||||
}
|
||||
|
||||
console.error("Invalid directory:", result.error);
|
||||
return { canceled: true, filePaths: [] };
|
||||
}
|
||||
|
||||
async openFile(options?: object): Promise<DialogResult> {
|
||||
const fileBrowser = getGlobalFileBrowser();
|
||||
|
||||
if (!fileBrowser) {
|
||||
console.error("File browser not initialized");
|
||||
return { canceled: true, filePaths: [] };
|
||||
}
|
||||
|
||||
// For now, use the same directory browser (could be enhanced for file selection)
|
||||
const path = await fileBrowser();
|
||||
|
||||
if (!path) {
|
||||
return { canceled: true, filePaths: [] };
|
||||
}
|
||||
|
||||
const result = await this.post<{ success: boolean; exists: boolean }>(
|
||||
"/api/fs/exists",
|
||||
{ filePath: path }
|
||||
);
|
||||
|
||||
if (result.success && result.exists) {
|
||||
return { canceled: false, filePaths: [path] };
|
||||
}
|
||||
|
||||
console.error("File not found");
|
||||
return { canceled: true, filePaths: [] };
|
||||
}
|
||||
|
||||
// File system operations
|
||||
async readFile(filePath: string): Promise<FileResult> {
|
||||
return this.post("/api/fs/read", { filePath });
|
||||
}
|
||||
|
||||
async writeFile(filePath: string, content: string): Promise<WriteResult> {
|
||||
return this.post("/api/fs/write", { filePath, content });
|
||||
}
|
||||
|
||||
async mkdir(dirPath: string): Promise<WriteResult> {
|
||||
return this.post("/api/fs/mkdir", { dirPath });
|
||||
}
|
||||
|
||||
async readdir(dirPath: string): Promise<ReaddirResult> {
|
||||
return this.post("/api/fs/readdir", { dirPath });
|
||||
}
|
||||
|
||||
async exists(filePath: string): Promise<boolean> {
|
||||
const result = await this.post<{ success: boolean; exists: boolean }>(
|
||||
"/api/fs/exists",
|
||||
{ filePath }
|
||||
);
|
||||
return result.exists;
|
||||
}
|
||||
|
||||
async stat(filePath: string): Promise<StatResult> {
|
||||
return this.post("/api/fs/stat", { filePath });
|
||||
}
|
||||
|
||||
async deleteFile(filePath: string): Promise<WriteResult> {
|
||||
return this.post("/api/fs/delete", { filePath });
|
||||
}
|
||||
|
||||
async trashItem(filePath: string): Promise<WriteResult> {
|
||||
// In web mode, trash is just delete
|
||||
return this.deleteFile(filePath);
|
||||
}
|
||||
|
||||
async getPath(name: string): Promise<string> {
|
||||
// Server provides data directory
|
||||
if (name === "userData") {
|
||||
const result = await this.get<{ dataDir: string }>("/api/health/detailed");
|
||||
return result.dataDir || "/data";
|
||||
}
|
||||
return `/data/${name}`;
|
||||
}
|
||||
|
||||
async saveImageToTemp(
|
||||
data: string,
|
||||
filename: string,
|
||||
mimeType: string,
|
||||
projectPath?: string
|
||||
): Promise<SaveImageResult> {
|
||||
return this.post("/api/fs/save-image", { data, filename, mimeType, projectPath });
|
||||
}
|
||||
|
||||
// CLI checks - server-side
|
||||
async checkClaudeCli(): Promise<{
|
||||
success: boolean;
|
||||
status?: string;
|
||||
method?: string;
|
||||
version?: string;
|
||||
path?: string;
|
||||
recommendation?: string;
|
||||
installCommands?: {
|
||||
macos?: string;
|
||||
windows?: string;
|
||||
linux?: string;
|
||||
npm?: string;
|
||||
};
|
||||
error?: string;
|
||||
}> {
|
||||
return this.get("/api/setup/claude-status");
|
||||
}
|
||||
|
||||
async checkCodexCli(): Promise<{
|
||||
success: boolean;
|
||||
status?: string;
|
||||
method?: string;
|
||||
version?: string;
|
||||
path?: string;
|
||||
hasApiKey?: boolean;
|
||||
recommendation?: string;
|
||||
installCommands?: {
|
||||
macos?: string;
|
||||
windows?: string;
|
||||
linux?: string;
|
||||
npm?: string;
|
||||
};
|
||||
error?: string;
|
||||
}> {
|
||||
return this.get("/api/setup/codex-status");
|
||||
}
|
||||
|
||||
// Model API
|
||||
model = {
|
||||
getAvailable: async (): Promise<{
|
||||
success: boolean;
|
||||
models?: ModelDefinition[];
|
||||
error?: string;
|
||||
}> => {
|
||||
return this.get("/api/models/available");
|
||||
},
|
||||
checkProviders: async (): Promise<{
|
||||
success: boolean;
|
||||
providers?: Record<string, ProviderStatus>;
|
||||
error?: string;
|
||||
}> => {
|
||||
return this.get("/api/models/providers");
|
||||
},
|
||||
};
|
||||
|
||||
async testOpenAIConnection(apiKey?: string): Promise<{
|
||||
success: boolean;
|
||||
message?: string;
|
||||
error?: string;
|
||||
}> {
|
||||
return this.post("/api/setup/test-openai", { apiKey });
|
||||
}
|
||||
|
||||
// Setup API
|
||||
setup = {
|
||||
getClaudeStatus: (): Promise<{
|
||||
success: boolean;
|
||||
status?: string;
|
||||
installed?: boolean;
|
||||
method?: string;
|
||||
version?: string;
|
||||
path?: string;
|
||||
auth?: {
|
||||
authenticated: boolean;
|
||||
method: string;
|
||||
hasCredentialsFile?: boolean;
|
||||
hasToken?: boolean;
|
||||
hasStoredOAuthToken?: boolean;
|
||||
hasStoredApiKey?: boolean;
|
||||
hasEnvApiKey?: boolean;
|
||||
hasEnvOAuthToken?: boolean;
|
||||
hasCliAuth?: boolean;
|
||||
hasRecentActivity?: boolean;
|
||||
};
|
||||
error?: string;
|
||||
}> => this.get("/api/setup/claude-status"),
|
||||
|
||||
getCodexStatus: (): Promise<{
|
||||
success: boolean;
|
||||
status?: string;
|
||||
method?: string;
|
||||
version?: string;
|
||||
path?: string;
|
||||
auth?: {
|
||||
authenticated: boolean;
|
||||
method: string;
|
||||
hasAuthFile: boolean;
|
||||
hasEnvKey: boolean;
|
||||
hasStoredApiKey?: boolean;
|
||||
hasEnvApiKey?: boolean;
|
||||
};
|
||||
error?: string;
|
||||
}> => this.get("/api/setup/codex-status"),
|
||||
|
||||
installClaude: (): Promise<{
|
||||
success: boolean;
|
||||
message?: string;
|
||||
error?: string;
|
||||
}> => this.post("/api/setup/install-claude"),
|
||||
|
||||
installCodex: (): Promise<{
|
||||
success: boolean;
|
||||
message?: string;
|
||||
error?: string;
|
||||
}> => this.post("/api/setup/install-codex"),
|
||||
|
||||
authClaude: (): Promise<{
|
||||
success: boolean;
|
||||
token?: string;
|
||||
requiresManualAuth?: boolean;
|
||||
terminalOpened?: boolean;
|
||||
command?: string;
|
||||
error?: string;
|
||||
message?: string;
|
||||
output?: string;
|
||||
}> => this.post("/api/setup/auth-claude"),
|
||||
|
||||
authCodex: (apiKey?: string): Promise<{
|
||||
success: boolean;
|
||||
requiresManualAuth?: boolean;
|
||||
command?: string;
|
||||
error?: string;
|
||||
}> => this.post("/api/setup/auth-codex", { apiKey }),
|
||||
|
||||
storeApiKey: (provider: string, apiKey: string): Promise<{
|
||||
success: boolean;
|
||||
error?: string;
|
||||
}> => this.post("/api/setup/store-api-key", { provider, apiKey }),
|
||||
|
||||
getApiKeys: (): Promise<{
|
||||
success: boolean;
|
||||
hasAnthropicKey: boolean;
|
||||
hasOpenAIKey: boolean;
|
||||
hasGoogleKey: boolean;
|
||||
}> => this.get("/api/setup/api-keys"),
|
||||
|
||||
configureCodexMcp: (projectPath: string): Promise<{
|
||||
success: boolean;
|
||||
configPath?: string;
|
||||
error?: string;
|
||||
}> => this.post("/api/setup/configure-codex-mcp", { projectPath }),
|
||||
|
||||
getPlatform: (): Promise<{
|
||||
success: boolean;
|
||||
platform: string;
|
||||
arch: string;
|
||||
homeDir: string;
|
||||
isWindows: boolean;
|
||||
isMac: boolean;
|
||||
isLinux: boolean;
|
||||
}> => this.get("/api/setup/platform"),
|
||||
|
||||
onInstallProgress: (callback: (progress: unknown) => void) => {
|
||||
return this.subscribeToEvent("agent:stream", callback);
|
||||
},
|
||||
|
||||
onAuthProgress: (callback: (progress: unknown) => void) => {
|
||||
return this.subscribeToEvent("agent:stream", callback);
|
||||
},
|
||||
};
|
||||
|
||||
// Features API
|
||||
features: FeaturesAPI = {
|
||||
getAll: (projectPath: string) =>
|
||||
this.post("/api/features/list", { projectPath }),
|
||||
get: (projectPath: string, featureId: string) =>
|
||||
this.post("/api/features/get", { projectPath, featureId }),
|
||||
create: (projectPath: string, feature: Feature) =>
|
||||
this.post("/api/features/create", { projectPath, feature }),
|
||||
update: (projectPath: string, featureId: string, updates: Partial<Feature>) =>
|
||||
this.post("/api/features/update", { projectPath, featureId, updates }),
|
||||
delete: (projectPath: string, featureId: string) =>
|
||||
this.post("/api/features/delete", { projectPath, featureId }),
|
||||
getAgentOutput: (projectPath: string, featureId: string) =>
|
||||
this.post("/api/features/agent-output", { projectPath, featureId }),
|
||||
};
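// Illustrative usage, with `client` a hypothetical HttpApiClient instance and
// a placeholder project path:
//
// const features = await client.features.getAll("/path/to/project");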
|
||||
|
||||
// Auto Mode API
|
||||
autoMode: AutoModeAPI = {
|
||||
start: (projectPath: string, maxConcurrency?: number) =>
|
||||
this.post("/api/auto-mode/start", { projectPath, maxConcurrency }),
|
||||
stop: (projectPath: string) =>
|
||||
this.post("/api/auto-mode/stop", { projectPath }),
|
||||
stopFeature: (featureId: string) =>
|
||||
this.post("/api/auto-mode/stop-feature", { featureId }),
|
||||
status: (projectPath?: string) =>
|
||||
this.post("/api/auto-mode/status", { projectPath }),
|
||||
runFeature: (projectPath: string, featureId: string, useWorktrees?: boolean) =>
|
||||
this.post("/api/auto-mode/run-feature", { projectPath, featureId, useWorktrees }),
|
||||
verifyFeature: (projectPath: string, featureId: string) =>
|
||||
this.post("/api/auto-mode/verify-feature", { projectPath, featureId }),
|
||||
resumeFeature: (projectPath: string, featureId: string) =>
|
||||
this.post("/api/auto-mode/resume-feature", { projectPath, featureId }),
|
||||
contextExists: (projectPath: string, featureId: string) =>
|
||||
this.post("/api/auto-mode/context-exists", { projectPath, featureId }),
|
||||
analyzeProject: (projectPath: string) =>
|
||||
this.post("/api/auto-mode/analyze-project", { projectPath }),
|
||||
followUpFeature: (
|
||||
projectPath: string,
|
||||
featureId: string,
|
||||
prompt: string,
|
||||
imagePaths?: string[]
|
||||
) =>
|
||||
this.post("/api/auto-mode/follow-up-feature", {
|
||||
projectPath,
|
||||
featureId,
|
||||
prompt,
|
||||
imagePaths,
|
||||
}),
|
||||
commitFeature: (projectPath: string, featureId: string) =>
|
||||
this.post("/api/auto-mode/commit-feature", { projectPath, featureId }),
|
||||
onEvent: (callback: (event: AutoModeEvent) => void) => {
|
||||
return this.subscribeToEvent("auto-mode:event", callback as EventCallback);
|
||||
},
|
||||
};
|
||||
|
||||
// Worktree API
|
||||
worktree: WorktreeAPI = {
|
||||
revertFeature: (projectPath: string, featureId: string) =>
|
||||
this.post("/api/worktree/revert", { projectPath, featureId }),
|
||||
mergeFeature: (projectPath: string, featureId: string, options?: object) =>
|
||||
this.post("/api/worktree/merge", { projectPath, featureId, options }),
|
||||
getInfo: (projectPath: string, featureId: string) =>
|
||||
this.post("/api/worktree/info", { projectPath, featureId }),
|
||||
getStatus: (projectPath: string, featureId: string) =>
|
||||
this.post("/api/worktree/status", { projectPath, featureId }),
|
||||
list: (projectPath: string) =>
|
||||
this.post("/api/worktree/list", { projectPath }),
|
||||
getDiffs: (projectPath: string, featureId: string) =>
|
||||
this.post("/api/worktree/diffs", { projectPath, featureId }),
|
||||
getFileDiff: (projectPath: string, featureId: string, filePath: string) =>
|
||||
this.post("/api/worktree/file-diff", { projectPath, featureId, filePath }),
|
||||
};
|
||||
|
||||
// Git API
|
||||
git: GitAPI = {
|
||||
getDiffs: (projectPath: string) =>
|
||||
this.post("/api/git/diffs", { projectPath }),
|
||||
getFileDiff: (projectPath: string, filePath: string) =>
|
||||
this.post("/api/git/file-diff", { projectPath, filePath }),
|
||||
};
|
||||
|
||||
// Suggestions API
|
||||
suggestions: SuggestionsAPI = {
|
||||
generate: (projectPath: string, suggestionType?: SuggestionType) =>
|
||||
this.post("/api/suggestions/generate", { projectPath, suggestionType }),
|
||||
stop: () => this.post("/api/suggestions/stop"),
|
||||
status: () => this.get("/api/suggestions/status"),
|
||||
onEvent: (callback: (event: SuggestionsEvent) => void) => {
|
||||
return this.subscribeToEvent("suggestions:event", callback as EventCallback);
|
||||
},
|
||||
};
|
||||
|
||||
// Spec Regeneration API
|
||||
specRegeneration: SpecRegenerationAPI = {
|
||||
create: (projectPath: string, projectOverview: string, generateFeatures?: boolean) =>
|
||||
this.post("/api/spec-regeneration/create", {
|
||||
projectPath,
|
||||
projectOverview,
|
||||
generateFeatures,
|
||||
}),
|
||||
generate: (projectPath: string, projectDefinition: string) =>
|
||||
this.post("/api/spec-regeneration/generate", { projectPath, projectDefinition }),
|
||||
generateFeatures: (projectPath: string) =>
|
||||
this.post("/api/spec-regeneration/generate-features", { projectPath }),
|
||||
stop: () => this.post("/api/spec-regeneration/stop"),
|
||||
status: () => this.get("/api/spec-regeneration/status"),
|
||||
onEvent: (callback: (event: SpecRegenerationEvent) => void) => {
|
||||
return this.subscribeToEvent(
|
||||
"spec-regeneration:event",
|
||||
callback as EventCallback
|
||||
);
|
||||
},
|
||||
};
|
||||
|
||||
// Running Agents API
|
||||
runningAgents = {
|
||||
getAll: (): Promise<{
|
||||
success: boolean;
|
||||
runningAgents?: Array<{
|
||||
featureId: string;
|
||||
projectPath: string;
|
||||
projectName: string;
|
||||
isAutoMode: boolean;
|
||||
}>;
|
||||
totalCount?: number;
|
||||
autoLoopRunning?: boolean;
|
||||
error?: string;
|
||||
}> => this.get("/api/running-agents"),
|
||||
};
|
||||
|
||||
// Workspace API
|
||||
workspace = {
|
||||
getConfig: (): Promise<{
|
||||
success: boolean;
|
||||
configured: boolean;
|
||||
workspaceDir?: string;
|
||||
error?: string;
|
||||
}> => this.get("/api/workspace/config"),
|
||||
|
||||
getDirectories: (): Promise<{
|
||||
success: boolean;
|
||||
directories?: Array<{ name: string; path: string }>;
|
||||
error?: string;
|
||||
}> => this.get("/api/workspace/directories"),
|
||||
};
|
||||
|
||||
// Agent API
|
||||
agent = {
|
||||
start: (sessionId: string, workingDirectory?: string): Promise<{
|
||||
success: boolean;
|
||||
messages?: Message[];
|
||||
error?: string;
|
||||
}> => this.post("/api/agent/start", { sessionId, workingDirectory }),
|
||||
|
||||
send: (
|
||||
sessionId: string,
|
||||
message: string,
|
||||
workingDirectory?: string,
|
||||
imagePaths?: string[]
|
||||
): Promise<{ success: boolean; error?: string }> =>
|
||||
this.post("/api/agent/send", { sessionId, message, workingDirectory, imagePaths }),
|
||||
|
||||
getHistory: (sessionId: string): Promise<{
|
||||
success: boolean;
|
||||
messages?: Message[];
|
||||
isRunning?: boolean;
|
||||
error?: string;
|
||||
}> => this.post("/api/agent/history", { sessionId }),
|
||||
|
||||
stop: (sessionId: string): Promise<{ success: boolean; error?: string }> =>
|
||||
this.post("/api/agent/stop", { sessionId }),
|
||||
|
||||
clear: (sessionId: string): Promise<{ success: boolean; error?: string }> =>
|
||||
this.post("/api/agent/clear", { sessionId }),
|
||||
|
||||
onStream: (callback: (data: unknown) => void): (() => void) => {
|
||||
return this.subscribeToEvent("agent:stream", callback as EventCallback);
|
||||
},
|
||||
};
|
||||
|
||||
// Sessions API
|
||||
sessions = {
|
||||
list: (includeArchived?: boolean): Promise<{
|
||||
success: boolean;
|
||||
sessions?: SessionListItem[];
|
||||
error?: string;
|
||||
}> => this.get(`/api/sessions?includeArchived=${includeArchived || false}`),
|
||||
|
||||
create: (
|
||||
name: string,
|
||||
projectPath: string,
|
||||
workingDirectory?: string
|
||||
): Promise<{
|
||||
success: boolean;
|
||||
session?: {
|
||||
id: string;
|
||||
name: string;
|
||||
projectPath: string;
|
||||
workingDirectory?: string;
|
||||
createdAt: string;
|
||||
updatedAt: string;
|
||||
};
|
||||
error?: string;
|
||||
}> => this.post("/api/sessions", { name, projectPath, workingDirectory }),
|
||||
|
||||
update: (
|
||||
sessionId: string,
|
||||
name?: string,
|
||||
tags?: string[]
|
||||
): Promise<{ success: boolean; error?: string }> =>
|
||||
this.put(`/api/sessions/${sessionId}`, { name, tags }),
|
||||
|
||||
archive: (sessionId: string): Promise<{ success: boolean; error?: string }> =>
|
||||
this.post(`/api/sessions/${sessionId}/archive`, {}),
|
||||
|
||||
unarchive: (sessionId: string): Promise<{ success: boolean; error?: string }> =>
|
||||
this.post(`/api/sessions/${sessionId}/unarchive`, {}),
|
||||
|
||||
delete: (sessionId: string): Promise<{ success: boolean; error?: string }> =>
|
||||
this.httpDelete(`/api/sessions/${sessionId}`),
|
||||
};
|
||||
}
|
||||
|
||||
// Singleton instance
|
||||
let httpApiClientInstance: HttpApiClient | null = null;
|
||||
|
||||
export function getHttpApiClient(): HttpApiClient {
|
||||
if (!httpApiClientInstance) {
|
||||
httpApiClientInstance = new HttpApiClient();
|
||||
}
|
||||
return httpApiClientInstance;
|
||||
}
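// Example (illustrative): obtaining the shared client in web mode and calling
// it through the same surface the Electron preload bridge exposes. The file
// path is a placeholder; the server resolves and validates it.
//
// const api = getHttpApiClient();
// const pong = await api.ping(); // "pong" when /api/health reports ok
// const readme = await api.readFile("/path/to/project/README.md");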
|
||||
@@ -9,7 +9,6 @@ export type ViewMode =
|
||||
| "board"
|
||||
| "agent"
|
||||
| "settings"
|
||||
| "tools"
|
||||
| "interview"
|
||||
| "context"
|
||||
| "profiles"
|
||||
@@ -113,7 +112,6 @@ export interface KeyboardShortcuts {
|
||||
agent: string;
|
||||
spec: string;
|
||||
context: string;
|
||||
tools: string;
|
||||
settings: string;
|
||||
profiles: string;
|
||||
|
||||
@@ -139,7 +137,6 @@ export const DEFAULT_KEYBOARD_SHORTCUTS: KeyboardShortcuts = {
|
||||
agent: "A",
|
||||
spec: "D",
|
||||
context: "C",
|
||||
tools: "T",
|
||||
settings: "S",
|
||||
profiles: "M",
|
||||
|
||||
@@ -354,6 +351,7 @@ export interface AutoModeActivity {
|
||||
tool?: string;
|
||||
passes?: boolean;
|
||||
phase?: "planning" | "action" | "verification";
|
||||
errorType?: "authentication" | "execution";
|
||||
}
|
||||
|
||||
export interface AppActions {
|
||||
@@ -638,6 +636,7 @@ export const useAppStore = create<AppState & AppActions>()(
|
||||
name: trashed.name,
|
||||
path: trashed.path,
|
||||
lastOpened: new Date().toISOString(),
|
||||
theme: trashed.theme, // Preserve theme from trashed project
|
||||
};
|
||||
|
||||
set({
|
||||
@@ -1138,26 +1137,32 @@ export const useAppStore = create<AppState & AppActions>()(
|
||||
{
|
||||
name: "automaker-storage",
|
||||
partialize: (state) => ({
|
||||
// Project management
|
||||
projects: state.projects,
|
||||
currentProject: state.currentProject,
|
||||
trashedProjects: state.trashedProjects,
|
||||
projectHistory: state.projectHistory,
|
||||
projectHistoryIndex: state.projectHistoryIndex,
|
||||
// Features - cached locally for faster hydration (authoritative source is server)
|
||||
features: state.features,
|
||||
// UI state
|
||||
currentView: state.currentView,
|
||||
theme: state.theme,
|
||||
sidebarOpen: state.sidebarOpen,
|
||||
chatHistoryOpen: state.chatHistoryOpen,
|
||||
// Settings
|
||||
apiKeys: state.apiKeys,
|
||||
maxConcurrency: state.maxConcurrency,
|
||||
autoModeByProject: state.autoModeByProject,
|
||||
kanbanCardDetailLevel: state.kanbanCardDetailLevel,
|
||||
defaultSkipTests: state.defaultSkipTests,
|
||||
useWorktrees: state.useWorktrees,
|
||||
showProfilesOnly: state.showProfilesOnly,
|
||||
keyboardShortcuts: state.keyboardShortcuts,
|
||||
muteDoneSound: state.muteDoneSound,
|
||||
// Profiles and sessions
|
||||
aiProfiles: state.aiProfiles,
|
||||
chatSessions: state.chatSessions,
|
||||
lastSelectedSessionByProject: state.lastSelectedSessionByProject,
|
||||
}),
|
||||
}
|
||||
|
||||
@@ -10,10 +10,20 @@ export interface CliStatus {
|
||||
error?: string;
|
||||
}
|
||||
|
||||
// Claude Auth Method - all possible authentication sources
|
||||
export type ClaudeAuthMethod =
|
||||
| "oauth_token_env" // CLAUDE_CODE_OAUTH_TOKEN environment variable
|
||||
| "oauth_token" // Stored OAuth token from claude login
|
||||
| "api_key_env" // ANTHROPIC_API_KEY environment variable
|
||||
| "api_key" // Manually stored API key
|
||||
| "credentials_file" // Generic credentials file detection
|
||||
| "cli_authenticated" // Claude CLI is installed and has active sessions/activity
|
||||
| "none";
|
||||
|
||||
// Claude Auth Status
|
||||
export interface ClaudeAuthStatus {
|
||||
authenticated: boolean;
|
||||
method: ClaudeAuthMethod;
|
||||
hasCredentialsFile?: boolean;
|
||||
oauthTokenValid?: boolean;
|
||||
apiKeyValid?: boolean;
|
||||
@@ -22,12 +32,23 @@ export interface ClaudeAuthStatus {
|
||||
error?: string;
|
||||
}
|
||||
|
||||
// Codex Auth Method - all possible authentication sources
|
||||
export type CodexAuthMethod =
|
||||
| "subscription" // Codex/OpenAI Plus or Team subscription
|
||||
| "cli_verified" // CLI logged in with OpenAI account
|
||||
| "cli_tokens" // CLI with stored access tokens
|
||||
| "api_key" // Manually stored API key
|
||||
| "env" // OPENAI_API_KEY environment variable
|
||||
| "none";
|
||||
|
||||
// Codex Auth Status
|
||||
export interface CodexAuthStatus {
|
||||
authenticated: boolean;
|
||||
method: CodexAuthMethod;
|
||||
apiKeyValid?: boolean;
|
||||
mcpConfigured?: boolean;
|
||||
hasSubscription?: boolean;
|
||||
cliLoggedIn?: boolean;
|
||||
error?: string;
|
||||
}
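// Illustrative example values for the two status shapes above:
//
// const claudeAuth: ClaudeAuthStatus = {
//   authenticated: true,
//   method: "cli_authenticated",
//   hasCredentialsFile: true,
// };
//
// const codexAuth: CodexAuthStatus = {
//   authenticated: false,
//   method: "none",
// };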
|
||||
|
||||
|
||||
1  apps/app/src/types/electron.d.ts  (vendored)
@@ -192,6 +192,7 @@ export type AutoModeEvent =
|
||||
| {
|
||||
type: "auto_mode_error";
|
||||
error: string;
|
||||
errorType?: "authentication" | "execution";
|
||||
featureId?: string;
|
||||
projectId?: string;
|
||||
projectPath?: string;
|
||||
|
||||
@@ -298,6 +298,65 @@
|
||||
.feature-card:nth-child(4) { animation-delay: 0.4s; }
|
||||
.feature-card:nth-child(5) { animation-delay: 0.5s; }
|
||||
.feature-card:nth-child(6) { animation-delay: 0.6s; }
|
||||
|
||||
/* Download Buttons */
|
||||
.download-section {
|
||||
margin-top: 2.5rem;
|
||||
}
|
||||
|
||||
.download-label {
|
||||
color: var(--text-muted);
|
||||
font-size: 0.9rem;
|
||||
margin-bottom: 1rem;
|
||||
}
|
||||
|
||||
.download-buttons {
|
||||
display: flex;
|
||||
gap: 1rem;
|
||||
justify-content: center;
|
||||
flex-wrap: wrap;
|
||||
}
|
||||
|
||||
.btn-download {
|
||||
padding: 0.75rem 1.5rem;
|
||||
border-radius: 0.5rem;
|
||||
text-decoration: none;
|
||||
font-weight: 600;
|
||||
transition: all 0.3s;
|
||||
display: inline-flex;
|
||||
align-items: center;
|
||||
gap: 0.5rem;
|
||||
background: rgba(30, 41, 59, 0.8);
|
||||
color: var(--text);
|
||||
border: 1px solid rgba(148, 163, 184, 0.2);
|
||||
font-size: 0.9rem;
|
||||
}
|
||||
|
||||
.btn-download:hover {
|
||||
background: rgba(99, 102, 241, 0.2);
|
||||
border-color: var(--primary);
|
||||
transform: translateY(-2px);
|
||||
}
|
||||
|
||||
.btn-download svg {
|
||||
width: 20px;
|
||||
height: 20px;
|
||||
}
|
||||
|
||||
.download-subtitle {
|
||||
color: var(--text-muted);
|
||||
font-size: 0.9rem;
|
||||
margin-top: 1rem;
|
||||
}
|
||||
|
||||
.download-subtitle a {
|
||||
color: var(--primary);
|
||||
text-decoration: none;
|
||||
}
|
||||
|
||||
.download-subtitle a:hover {
|
||||
text-decoration: underline;
|
||||
}
|
||||
</style>
|
||||
</head>
|
||||
<body>
|
||||
@@ -307,6 +366,7 @@
|
||||
<ul class="nav-links">
|
||||
<li><a href="#features">Features</a></li>
|
||||
<li><a href="#tech">Tech Stack</a></li>
|
||||
<li><a href="releases.html">Releases</a></li>
|
||||
<li><a href="https://github.com/AutoMaker-Org/automaker" target="_blank">GitHub</a></li>
|
||||
</ul>
|
||||
</nav>
|
||||
@@ -321,6 +381,27 @@
|
||||
<a href="https://github.com/AutoMaker-Org/automaker" class="btn btn-primary" target="_blank">View on GitHub</a>
|
||||
<a href="https://github.com/AutoMaker-Org/automaker#getting-started" class="btn btn-secondary" target="_blank">Get Started</a>
|
||||
</div>
|
||||
|
||||
<div class="download-section" id="downloadSection" style="display: none;">
|
||||
<p class="download-label">Download for your platform:</p>
|
||||
<div class="download-buttons">
|
||||
<a href="#" class="btn-download" id="download-windows" style="display: none;">
|
||||
<svg viewBox="0 0 24 24" fill="currentColor"><path d="M0 3.449L9.75 2.1v9.451H0m10.949-9.602L24 0v11.4H10.949M0 12.6h9.75v9.451L0 20.699M10.949 12.6H24V24l-12.9-1.801"/></svg>
|
||||
Windows
|
||||
</a>
|
||||
<a href="#" class="btn-download" id="download-macos" style="display: none;">
|
||||
<svg viewBox="0 0 24 24" fill="currentColor"><path d="M18.71 19.5c-.83 1.24-1.71 2.45-3.05 2.47-1.34.03-1.77-.79-3.29-.79-1.53 0-2 .77-3.27.82-1.31.05-2.3-1.32-3.14-2.53C4.25 17 2.94 12.45 4.7 9.39c.87-1.52 2.43-2.48 4.12-2.51 1.28-.02 2.5.87 3.29.87.78 0 2.26-1.07 3.81-.91.65.03 2.47.26 3.64 1.98-.09.06-2.17 1.28-2.15 3.81.03 3.02 2.65 4.03 2.68 4.04-.03.07-.42 1.44-1.38 2.83M13 3.5c.73-.83 1.94-1.46 2.94-1.5.13 1.17-.34 2.35-1.04 3.19-.69.85-1.83 1.51-2.95 1.42-.15-1.15.41-2.35 1.05-3.11z"/></svg>
|
||||
macOS
|
||||
</a>
|
||||
<a href="#" class="btn-download" id="download-linux" style="display: none;">
|
||||
<svg viewBox="0 0 24 24" fill="currentColor"><path d="M12.504 0c-.155 0-.315.008-.48.021-4.226.333-3.105 4.807-3.17 6.298-.076 1.092-.3 1.953-1.05 3.02-.885 1.051-2.127 2.75-2.716 4.521-.278.832-.41 1.684-.287 2.489a.424.424 0 00-.11.135c-.26.268-.45.6-.663.839-.199.199-.485.267-.797.4-.313.136-.658.269-.864.68-.09.189-.136.394-.132.602 0 .199.027.4.055.536.058.399.116.728.04.97-.249.68-.28 1.145-.106 1.484.174.334.535.47.94.601.81.2 1.91.135 2.774.6.926.466 1.866.67 2.616.47.526-.116.97-.464 1.208-.946.587-.003 1.23-.269 2.26-.334.699-.058 1.574.267 2.577.2.025.134.063.198.114.333l.003.003c.391.778 1.113 1.132 1.884 1.071.771-.06 1.592-.536 2.257-1.306.631-.765 1.683-1.084 2.378-1.503.348-.199.629-.469.649-.853.023-.4-.2-.811-.714-1.376v-.097l-.003-.003c-.17-.2-.25-.535-.338-.926-.085-.401-.182-.786-.492-1.046h-.003c-.059-.054-.123-.067-.188-.135a.357.357 0 00-.19-.064c.431-1.278.264-2.55-.173-3.694-.533-1.41-1.465-2.638-2.175-3.483-.796-1.005-1.576-1.957-1.56-3.368.026-2.152.236-6.133-3.544-6.139zm.529 3.405h.013c.213 0 .396.062.584.198.19.135.33.332.438.533.105.259.158.459.166.724 0-.02.006-.04.006-.06v.105a.086.086 0 01-.004-.021l-.004-.024a1.807 1.807 0 01-.15.706.953.953 0 01-.213.335.71.71 0 00-.088-.042c-.104-.045-.198-.064-.284-.133a1.312 1.312 0 00-.22-.066c.05-.06.146-.133.183-.198.053-.128.082-.264.088-.402v-.02a1.21 1.21 0 00-.061-.4c-.045-.134-.101-.2-.183-.333-.084-.066-.167-.132-.267-.132h-.016c-.093 0-.176.03-.262.132a.8.8 0 00-.205.334 1.18 1.18 0 00-.09.4v.019c.002.089.008.179.02.267-.193-.067-.438-.135-.607-.202a1.635 1.635 0 01-.018-.2v-.02a1.772 1.772 0 01.15-.768c.082-.22.232-.406.43-.533a.985.985 0 01.594-.2zm-2.962.059h.036c.142 0 .27.048.399.135.146.129.264.288.344.465.09.199.14.4.153.667v.004c.007.134.006.2-.002.266v.08c-.03.007-.056.018-.083.024-.152.055-.274.135-.393.2.012-.09.013-.18.003-.267v-.015c-.012-.133-.04-.2-.082-.333a.613.613 0 00-.166-.267.248.248 0 00-.183-.064h-.021c-.071.006-.13.04-.186.132a.552.552 0 00-.12.27.944.944 0 00-.023.33v.015c.012.135.037.2.08.334.046.134.098.2.166.268.01.009.02.018.034.024-.07.057-.117.07-.176.136a.304.304 0 01-.131.068 2.62 2.62 0 01-.275-.402 1.772 1.772 0 01-.155-.667 1.759 1.759 0 01.08-.668 1.43 1.43 0 01.283-.535c.128-.133.26-.2.418-.2zm1.37 1.706c.332 0 .733.065 1.216.399.293.2.523.269 1.052.468h.003c.255.136.405.266.478.399v-.131a.571.571 0 01.016.47c-.123.31-.516.643-1.063.842v.002c-.268.135-.501.333-.775.465-.276.135-.588.292-1.012.267a1.139 1.139 0 01-.448-.067 3.566 3.566 0 01-.322-.198c-.195-.135-.363-.332-.612-.465v-.005h-.005c-.4-.246-.616-.512-.686-.71-.07-.268-.005-.47.193-.6.224-.135.38-.271.483-.336.104-.074.143-.102.176-.131h.002v-.003c.169-.202.436-.47.839-.601.139-.036.294-.065.466-.065zm2.8 2.142c.358 1.417 1.196 3.475 1.735 4.473.286.534.855 1.659 1.102 3.024.156-.005.33.018.513.064.646-1.671-.546-3.467-1.089-3.966-.22-.2-.232-.335-.123-.335.59.534 1.365 1.572 1.646 2.757.13.535.16 1.104.021 1.67.067.028.135.06.205.067 1.032.534 1.413.938 1.23 1.537v-.002c-.06-.003-.12 0-.18 0h-.016c.151-.467-.182-.825-1.065-1.224-.915-.4-1.646-.336-1.77.465-.008.043-.013.066-.018.135-.068.023-.139.053-.209.064-.43.268-.662.669-.793 1.187-.13.533-.17 1.156-.205 1.869v.003c-.02.482-.04 1.053-.158 1.425-.06.134-.133.27-.238.465h-.003c-.067-.004-.003-.401-.004-.469.006-.534.011-1.2.036-1.534.006-.468.011-.534-.021-.267-.18.936-.323 1.2-.608 1.67a1.016 1.016 0 
01-.112.134v.003l-.005-.003c-.07-.2-.044-.401-.044-.535-.002-.468.006-.869-.089-1.334-.066-.468-.353-.935-.711-1.469-.074-.104-.264-.333-.376-.533-.073-.133-.067-.267.123-.336.104-.037.2-.135.29-.2.09-.067.18-.136.27-.2.02-.015.04-.018.059-.036.14-.083.267-.2.368-.335a.838.838 0 00.145-.262l.002-.004c.028-.087.042-.133.034-.2-.034-.135-.232-.333-.393-.468-.226-.2-.4-.333-.673-.467l-.005-.002c-.569-.27-1.322-.534-1.927-.8a.082.082 0 01-.026-.013c-.136-.071-.27-.2-.406-.4-.466-.735-.727-1.536-.727-1.936 0-.2.067-.4.129-.533.032-.067.065-.135.102-.2.036-.067.257-.2.378-.267.143-.095.287-.191.441-.263z"/></svg>
|
||||
Linux
|
||||
</a>
|
||||
</div>
|
||||
<p class="download-subtitle">
|
||||
<span id="latestVersion"></span> | <a href="releases.html">All releases</a>
|
||||
</p>
|
||||
</div>
|
||||
</div>
|
||||
</section>
|
||||
|
||||
@@ -408,5 +489,53 @@
|
||||
</p>
|
||||
</div>
|
||||
</footer>
|
||||
|
||||
<script>
|
||||
(function() {
|
||||
const R2_RELEASES_URL = window.RELEASES_JSON_URL || 'https://releases.automaker.dev/releases.json';
|
||||
|
||||
async function loadLatestRelease() {
|
||||
try {
|
||||
const response = await fetch(R2_RELEASES_URL);
|
||||
if (!response.ok) throw new Error('Failed to fetch releases');
|
||||
|
||||
const data = await response.json();
|
||||
if (!data.releases || data.releases.length === 0) return;
|
||||
|
||||
const latest = data.releases[0];
|
||||
let hasAnyAsset = false;
|
||||
|
||||
if (latest.assets.windows) {
|
||||
const btn = document.getElementById('download-windows');
|
||||
btn.href = latest.assets.windows.url;
|
||||
btn.style.display = 'inline-flex';
|
||||
hasAnyAsset = true;
|
||||
}
|
||||
if (latest.assets.macos || latest.assets.macosArm) {
|
||||
const btn = document.getElementById('download-macos');
|
||||
const macAsset = latest.assets.macosArm || latest.assets.macos;
|
||||
btn.href = macAsset.url;
|
||||
btn.style.display = 'inline-flex';
|
||||
hasAnyAsset = true;
|
||||
}
|
||||
if (latest.assets.linux) {
|
||||
const btn = document.getElementById('download-linux');
|
||||
btn.href = latest.assets.linux.url;
|
||||
btn.style.display = 'inline-flex';
|
||||
hasAnyAsset = true;
|
||||
}
|
||||
|
||||
if (hasAnyAsset) {
|
||||
document.getElementById('latestVersion').textContent = latest.version;
|
||||
document.getElementById('downloadSection').style.display = 'block';
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Failed to load releases:', error);
|
||||
}
|
||||
}
|
||||
|
||||
loadLatestRelease();
|
||||
})();
|
||||
</script>
|
||||
</body>
|
||||
</html>
|
||||
422  apps/marketing/public/releases.html  (Normal file)
@@ -0,0 +1,422 @@
|
||||
<!DOCTYPE html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="UTF-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
||||
<title>Releases - Automaker</title>
|
||||
<style>
|
||||
* {
|
||||
margin: 0;
|
||||
padding: 0;
|
||||
box-sizing: border-box;
|
||||
}
|
||||
|
||||
:root {
|
||||
--primary: #6366f1;
|
||||
--primary-dark: #4f46e5;
|
||||
--secondary: #8b5cf6;
|
||||
--accent: #ec4899;
|
||||
--dark: #0f172a;
|
||||
--dark-light: #1e293b;
|
||||
--text: #e2e8f0;
|
||||
--text-muted: #94a3b8;
|
||||
}
|
||||
|
||||
body {
|
||||
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, 'Helvetica Neue', Arial, sans-serif;
|
||||
background: linear-gradient(135deg, #0f172a 0%, #1e293b 50%, #0f172a 100%);
|
||||
color: var(--text);
|
||||
line-height: 1.6;
|
||||
overflow-x: hidden;
|
||||
min-height: 100vh;
|
||||
}
|
||||
|
||||
.container {
|
||||
max-width: 1200px;
|
||||
margin: 0 auto;
|
||||
padding: 0 2rem;
|
||||
}
|
||||
|
||||
/* Header */
|
||||
header {
|
||||
padding: 2rem 0;
|
||||
position: sticky;
|
||||
top: 0;
|
||||
background: rgba(15, 23, 42, 0.8);
|
||||
backdrop-filter: blur(10px);
|
||||
z-index: 100;
|
||||
border-bottom: 1px solid rgba(148, 163, 184, 0.1);
|
||||
}
|
||||
|
||||
nav {
|
||||
display: flex;
|
||||
justify-content: space-between;
|
||||
align-items: center;
|
||||
}
|
||||
|
||||
.logo {
|
||||
font-size: 1.5rem;
|
||||
font-weight: 700;
|
||||
background: linear-gradient(135deg, var(--primary), var(--secondary));
|
||||
-webkit-background-clip: text;
|
||||
-webkit-text-fill-color: transparent;
|
||||
background-clip: text;
|
||||
text-decoration: none;
|
||||
}
|
||||
|
||||
.nav-links {
|
||||
display: flex;
|
||||
gap: 2rem;
|
||||
list-style: none;
|
||||
}
|
||||
|
||||
.nav-links a {
|
||||
color: var(--text-muted);
|
||||
text-decoration: none;
|
||||
transition: color 0.3s;
|
||||
}
|
||||
|
||||
.nav-links a:hover,
|
||||
.nav-links a.active {
|
||||
color: var(--text);
|
||||
}
|
||||
|
||||
/* Page Header */
|
||||
.page-header {
|
||||
padding: 4rem 0 2rem;
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
.page-header h1 {
|
||||
font-size: 3rem;
|
||||
font-weight: 800;
|
||||
margin-bottom: 1rem;
|
||||
background: linear-gradient(135deg, #ffffff 0%, var(--text-muted) 100%);
|
||||
-webkit-background-clip: text;
|
||||
-webkit-text-fill-color: transparent;
|
||||
background-clip: text;
|
||||
}
|
||||
|
||||
.page-header p {
|
||||
font-size: 1.2rem;
|
||||
color: var(--text-muted);
|
||||
max-width: 600px;
|
||||
margin: 0 auto;
|
||||
}
|
||||
|
||||
/* Releases Section */
|
||||
.releases-section {
|
||||
padding: 2rem 0 6rem;
|
||||
}
|
||||
|
||||
.releases-list {
|
||||
max-width: 800px;
|
||||
margin: 0 auto;
|
||||
}
|
||||
|
||||
.release-card {
|
||||
background: rgba(30, 41, 59, 0.5);
|
||||
border: 1px solid rgba(148, 163, 184, 0.1);
|
||||
border-radius: 1rem;
|
||||
padding: 2rem;
|
||||
margin-bottom: 1.5rem;
|
||||
transition: all 0.3s;
|
||||
}
|
||||
|
||||
.release-card:first-child {
|
||||
border-color: var(--primary);
|
||||
background: rgba(99, 102, 241, 0.1);
|
||||
}
|
||||
|
||||
.release-card:hover {
|
||||
border-color: rgba(99, 102, 241, 0.5);
|
||||
}
|
||||
|
||||
.release-header {
|
||||
display: flex;
|
||||
justify-content: space-between;
|
||||
align-items: center;
|
||||
margin-bottom: 1rem;
|
||||
flex-wrap: wrap;
|
||||
gap: 0.5rem;
|
||||
}
|
||||
|
||||
.release-version {
|
||||
font-size: 1.5rem;
|
||||
font-weight: 700;
|
||||
color: var(--text);
|
||||
}
|
||||
|
||||
.release-date {
|
||||
color: var(--text-muted);
|
||||
font-size: 0.9rem;
|
||||
}
|
||||
|
||||
.latest-badge {
|
||||
background: linear-gradient(135deg, var(--primary), var(--secondary));
|
||||
color: white;
|
||||
padding: 0.25rem 0.75rem;
|
||||
border-radius: 1rem;
|
||||
font-size: 0.75rem;
|
||||
font-weight: 600;
|
||||
text-transform: uppercase;
|
||||
}
|
||||
|
||||
.release-downloads {
|
||||
display: flex;
|
||||
gap: 1rem;
|
||||
flex-wrap: wrap;
|
||||
margin-top: 1rem;
|
||||
}
|
||||
|
||||
.download-link {
|
||||
display: inline-flex;
|
||||
align-items: center;
|
||||
gap: 0.5rem;
|
||||
padding: 0.5rem 1rem;
|
||||
background: rgba(148, 163, 184, 0.1);
|
||||
border: 1px solid rgba(148, 163, 184, 0.2);
|
||||
border-radius: 0.5rem;
|
||||
color: var(--text);
|
||||
text-decoration: none;
|
||||
font-size: 0.9rem;
|
||||
transition: all 0.3s;
|
||||
}
|
||||
|
||||
.download-link:hover {
|
||||
background: rgba(99, 102, 241, 0.2);
|
||||
border-color: var(--primary);
|
||||
}
|
||||
|
||||
.download-link svg {
|
||||
width: 18px;
|
||||
height: 18px;
|
||||
}
|
||||
|
||||
.download-size {
|
||||
color: var(--text-muted);
|
||||
font-size: 0.8rem;
|
||||
}
|
||||
|
||||
.release-notes-link {
|
||||
color: var(--primary);
|
||||
text-decoration: none;
|
||||
font-size: 0.9rem;
|
||||
display: inline-flex;
|
||||
align-items: center;
|
||||
gap: 0.25rem;
|
||||
margin-top: 1rem;
|
||||
}
|
||||
|
||||
.release-notes-link:hover {
|
||||
text-decoration: underline;
|
||||
}
|
||||
|
||||
.loading-spinner {
|
||||
text-align: center;
|
||||
padding: 4rem;
|
||||
color: var(--text-muted);
|
||||
}
|
||||
|
||||
.error-message {
|
||||
text-align: center;
|
||||
padding: 2rem;
|
||||
color: var(--accent);
|
||||
background: rgba(236, 72, 153, 0.1);
|
||||
border-radius: 0.5rem;
|
||||
}
|
||||
|
||||
.error-message a {
|
||||
color: var(--primary);
|
||||
}
|
||||
|
||||
.no-releases {
|
||||
text-align: center;
|
||||
padding: 4rem;
|
||||
color: var(--text-muted);
|
||||
}
|
||||
|
||||
/* Footer */
|
||||
footer {
|
||||
padding: 3rem 0;
|
||||
text-align: center;
|
||||
border-top: 1px solid rgba(148, 163, 184, 0.1);
|
||||
color: var(--text-muted);
|
||||
}
|
||||
|
||||
footer a {
|
||||
color: var(--primary);
|
||||
text-decoration: none;
|
||||
}
|
||||
|
||||
footer a:hover {
|
||||
text-decoration: underline;
|
||||
}
|
||||
|
||||
/* Responsive */
|
||||
@media (max-width: 768px) {
|
||||
.page-header h1 {
|
||||
font-size: 2rem;
|
||||
}
|
||||
|
||||
.nav-links {
|
||||
display: none;
|
||||
}
|
||||
|
||||
.release-header {
|
||||
flex-direction: column;
|
||||
align-items: flex-start;
|
||||
}
|
||||
|
||||
.release-downloads {
|
||||
flex-direction: column;
|
||||
}
|
||||
|
||||
.download-link {
|
||||
width: 100%;
|
||||
justify-content: center;
|
||||
}
|
||||
}
|
||||
</style>
|
||||
</head>
|
||||
<body>
|
||||
<header>
|
||||
<nav class="container">
|
||||
<a href="index.html" class="logo">Automaker</a>
|
||||
<ul class="nav-links">
|
||||
<li><a href="index.html#features">Features</a></li>
|
||||
<li><a href="index.html#tech">Tech Stack</a></li>
|
||||
<li><a href="releases.html" class="active">Releases</a></li>
|
||||
<li><a href="https://github.com/AutoMaker-Org/automaker" target="_blank">GitHub</a></li>
|
||||
</ul>
|
||||
</nav>
|
||||
</header>
|
||||
|
||||
<main>
|
||||
<section class="page-header">
|
||||
<div class="container">
|
||||
<h1>Releases</h1>
|
||||
<p>Download Automaker for your platform. All versions are available below.</p>
|
||||
</div>
|
||||
</section>
|
||||
|
||||
<section class="releases-section">
|
||||
<div class="container">
|
||||
<div class="releases-list" id="releasesList">
|
||||
<div class="loading-spinner">Loading releases...</div>
|
||||
</div>
|
||||
</div>
|
||||
</section>
|
||||
</main>
|
||||
|
||||
<footer>
|
||||
<div class="container">
|
||||
<p>Made with love by <a href="mailto:webdevcody@gmail.com">Cody Seibert</a></p>
|
||||
<p style="margin-top: 1rem;">
|
||||
<a href="https://github.com/AutoMaker-Org/automaker" target="_blank">GitHub</a> |
|
||||
<a href="https://github.com/AutoMaker-Org/automaker/blob/main/LICENSE" target="_blank">License</a>
|
||||
</p>
|
||||
</div>
|
||||
</footer>
|
||||
|
||||
<script>
|
||||
(function() {
|
||||
const R2_RELEASES_URL = window.RELEASES_JSON_URL || 'https://releases.automaker.dev/releases.json';
|
||||
|
||||
const platformIcons = {
|
||||
windows: '<svg viewBox="0 0 24 24" fill="currentColor"><path d="M0 3.449L9.75 2.1v9.451H0m10.949-9.602L24 0v11.4H10.949M0 12.6h9.75v9.451L0 20.699M10.949 12.6H24V24l-12.9-1.801"/></svg>',
|
||||
macos: '<svg viewBox="0 0 24 24" fill="currentColor"><path d="M18.71 19.5c-.83 1.24-1.71 2.45-3.05 2.47-1.34.03-1.77-.79-3.29-.79-1.53 0-2 .77-3.27.82-1.31.05-2.3-1.32-3.14-2.53C4.25 17 2.94 12.45 4.7 9.39c.87-1.52 2.43-2.48 4.12-2.51 1.28-.02 2.5.87 3.29.87.78 0 2.26-1.07 3.81-.91.65.03 2.47.26 3.64 1.98-.09.06-2.17 1.28-2.15 3.81.03 3.02 2.65 4.03 2.68 4.04-.03.07-.42 1.44-1.38 2.83M13 3.5c.73-.83 1.94-1.46 2.94-1.5.13 1.17-.34 2.35-1.04 3.19-.69.85-1.83 1.51-2.95 1.42-.15-1.15.41-2.35 1.05-3.11z"/></svg>',
|
||||
macosArm: '<svg viewBox="0 0 24 24" fill="currentColor"><path d="M18.71 19.5c-.83 1.24-1.71 2.45-3.05 2.47-1.34.03-1.77-.79-3.29-.79-1.53 0-2 .77-3.27.82-1.31.05-2.3-1.32-3.14-2.53C4.25 17 2.94 12.45 4.7 9.39c.87-1.52 2.43-2.48 4.12-2.51 1.28-.02 2.5.87 3.29.87.78 0 2.26-1.07 3.81-.91.65.03 2.47.26 3.64 1.98-.09.06-2.17 1.28-2.15 3.81.03 3.02 2.65 4.03 2.68 4.04-.03.07-.42 1.44-1.38 2.83M13 3.5c.73-.83 1.94-1.46 2.94-1.5.13 1.17-.34 2.35-1.04 3.19-.69.85-1.83 1.51-2.95 1.42-.15-1.15.41-2.35 1.05-3.11z"/></svg>',
|
||||
linux: '<svg viewBox="0 0 24 24" fill="currentColor"><path d="M12.504 0c-.155 0-.315.008-.48.021-4.226.333-3.105 4.807-3.17 6.298-.076 1.092-.3 1.953-1.05 3.02-.885 1.051-2.127 2.75-2.716 4.521-.278.832-.41 1.684-.287 2.489a.424.424 0 00-.11.135c-.26.268-.45.6-.663.839-.199.199-.485.267-.797.4-.313.136-.658.269-.864.68-.09.189-.136.394-.132.602 0 .199.027.4.055.536.058.399.116.728.04.97-.249.68-.28 1.145-.106 1.484.174.334.535.47.94.601.81.2 1.91.135 2.774.6.926.466 1.866.67 2.616.47.526-.116.97-.464 1.208-.946.587-.003 1.23-.269 2.26-.334.699-.058 1.574.267 2.577.2.025.134.063.198.114.333l.003.003c.391.778 1.113 1.132 1.884 1.071.771-.06 1.592-.536 2.257-1.306.631-.765 1.683-1.084 2.378-1.503.348-.199.629-.469.649-.853.023-.4-.2-.811-.714-1.376v-.097l-.003-.003c-.17-.2-.25-.535-.338-.926-.085-.401-.182-.786-.492-1.046h-.003c-.059-.054-.123-.067-.188-.135a.357.357 0 00-.19-.064c.431-1.278.264-2.55-.173-3.694-.533-1.41-1.465-2.638-2.175-3.483-.796-1.005-1.576-1.957-1.56-3.368.026-2.152.236-6.133-3.544-6.139zm.529 3.405h.013c.213 0 .396.062.584.198.19.135.33.332.438.533.105.259.158.459.166.724 0-.02.006-.04.006-.06v.105a.086.086 0 01-.004-.021l-.004-.024a1.807 1.807 0 01-.15.706.953.953 0 01-.213.335.71.71 0 00-.088-.042c-.104-.045-.198-.064-.284-.133a1.312 1.312 0 00-.22-.066c.05-.06.146-.133.183-.198.053-.128.082-.264.088-.402v-.02a1.21 1.21 0 00-.061-.4c-.045-.134-.101-.2-.183-.333-.084-.066-.167-.132-.267-.132h-.016c-.093 0-.176.03-.262.132a.8.8 0 00-.205.334 1.18 1.18 0 00-.09.4v.019c.002.089.008.179.02.267-.193-.067-.438-.135-.607-.202a1.635 1.635 0 01-.018-.2v-.02a1.772 1.772 0 01.15-.768c.082-.22.232-.406.43-.533a.985.985 0 01.594-.2zm-2.962.059h.036c.142 0 .27.048.399.135.146.129.264.288.344.465.09.199.14.4.153.667v.004c.007.134.006.2-.002.266v.08c-.03.007-.056.018-.083.024-.152.055-.274.135-.393.2.012-.09.013-.18.003-.267v-.015c-.012-.133-.04-.2-.082-.333a.613.613 0 00-.166-.267.248.248 0 00-.183-.064h-.021c-.071.006-.13.04-.186.132a.552.552 0 00-.12.27.944.944 0 00-.023.33v.015c.012.135.037.2.08.334.046.134.098.2.166.268.01.009.02.018.034.024-.07.057-.117.07-.176.136a.304.304 0 01-.131.068 2.62 2.62 0 01-.275-.402 1.772 1.772 0 01-.155-.667 1.759 1.759 0 01.08-.668 1.43 1.43 0 01.283-.535c.128-.133.26-.2.418-.2zm1.37 1.706c.332 0 .733.065 1.216.399.293.2.523.269 1.052.468h.003c.255.136.405.266.478.399v-.131a.571.571 0 01.016.47c-.123.31-.516.643-1.063.842v.002c-.268.135-.501.333-.775.465-.276.135-.588.292-1.012.267a1.139 1.139 0 01-.448-.067 3.566 3.566 0 01-.322-.198c-.195-.135-.363-.332-.612-.465v-.005h-.005c-.4-.246-.616-.512-.686-.71-.07-.268-.005-.47.193-.6.224-.135.38-.271.483-.336.104-.074.143-.102.176-.131h.002v-.003c.169-.202.436-.47.839-.601.139-.036.294-.065.466-.065zm2.8 2.142c.358 1.417 1.196 3.475 1.735 4.473.286.534.855 1.659 1.102 3.024.156-.005.33.018.513.064.646-1.671-.546-3.467-1.089-3.966-.22-.2-.232-.335-.123-.335.59.534 1.365 1.572 1.646 2.757.13.535.16 1.104.021 1.67.067.028.135.06.205.067 1.032.534 1.413.938 1.23 1.537v-.002c-.06-.003-.12 0-.18 0h-.016c.151-.467-.182-.825-1.065-1.224-.915-.4-1.646-.336-1.77.465-.008.043-.013.066-.018.135-.068.023-.139.053-.209.064-.43.268-.662.669-.793 1.187-.13.533-.17 1.156-.205 1.869v.003c-.02.482-.04 1.053-.158 1.425-.06.134-.133.27-.238.465h-.003c-.067-.004-.003-.401-.004-.469.006-.534.011-1.2.036-1.534.006-.468.011-.534-.021-.267-.18.936-.323 1.2-.608 1.67a1.016 1.016 0 
01-.112.134v.003l-.005-.003c-.07-.2-.044-.401-.044-.535-.002-.468.006-.869-.089-1.334-.066-.468-.353-.935-.711-1.469-.074-.104-.264-.333-.376-.533-.073-.133-.067-.267.123-.336.104-.037.2-.135.29-.2.09-.067.18-.136.27-.2.02-.015.04-.018.059-.036.14-.083.267-.2.368-.335a.838.838 0 00.145-.262l.002-.004c.028-.087.042-.133.034-.2-.034-.135-.232-.333-.393-.468-.226-.2-.4-.333-.673-.467l-.005-.002c-.569-.27-1.322-.534-1.927-.8a.082.082 0 01-.026-.013c-.136-.071-.27-.2-.406-.4-.466-.735-.727-1.536-.727-1.936 0-.2.067-.4.129-.533.032-.067.065-.135.102-.2.036-.067.257-.2.378-.267.143-.095.287-.191.441-.263z"/></svg>'
|
||||
};
|
||||
|
||||
const platformLabels = {
|
||||
windows: 'Windows',
|
||||
macos: 'macOS (Intel)',
|
||||
macosArm: 'macOS (Apple Silicon)',
|
||||
linux: 'Linux'
|
||||
};
|
||||
|
||||
function formatDate(isoString) {
|
||||
const date = new Date(isoString);
|
||||
return date.toLocaleDateString('en-US', {
|
||||
year: 'numeric',
|
||||
month: 'long',
|
||||
day: 'numeric'
|
||||
});
|
||||
}
|
||||
|
||||
function formatSize(bytes) {
|
||||
const mb = bytes / (1024 * 1024);
|
||||
return mb.toFixed(1) + ' MB';
|
||||
}
|
||||
|
||||
function renderRelease(release, isLatest) {
|
||||
const assets = Object.entries(release.assets)
|
||||
.filter(([_, asset]) => asset)
|
||||
.map(([platform, asset]) => `
|
||||
<a href="${asset.url}" class="download-link">
|
||||
${platformIcons[platform] || ''}
|
||||
<span>${platformLabels[platform] || platform}</span>
|
||||
<span class="download-size">${formatSize(asset.size)}</span>
|
||||
</a>
|
||||
`).join('');
|
||||
|
||||
return `
|
||||
<div class="release-card">
|
||||
<div class="release-header">
|
||||
<div>
|
||||
<span class="release-version">${release.version}</span>
|
||||
${isLatest ? '<span class="latest-badge">Latest</span>' : ''}
|
||||
</div>
|
||||
<span class="release-date">${formatDate(release.date)}</span>
|
||||
</div>
|
||||
<div class="release-downloads">
|
||||
${assets}
|
||||
</div>
|
||||
<a href="${release.githubReleaseUrl}" class="release-notes-link" target="_blank">
|
||||
View release notes on GitHub
|
||||
<svg width="14" height="14" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2">
|
||||
<path d="M18 13v6a2 2 0 0 1-2 2H5a2 2 0 0 1-2-2V8a2 2 0 0 1 2-2h6"></path>
|
||||
<polyline points="15 3 21 3 21 9"></polyline>
|
||||
<line x1="10" y1="14" x2="21" y2="3"></line>
|
||||
</svg>
|
||||
</a>
|
||||
</div>
|
||||
`;
|
||||
}
|
||||
|
||||
async function loadReleases() {
|
||||
const container = document.getElementById('releasesList');
|
||||
|
||||
try {
|
||||
const response = await fetch(R2_RELEASES_URL);
|
||||
if (!response.ok) throw new Error('Failed to fetch releases');
|
||||
|
||||
const data = await response.json();
|
||||
|
||||
if (!data.releases || data.releases.length === 0) {
|
||||
container.innerHTML = '<div class="no-releases">No releases available yet. Check back soon!</div>';
|
||||
return;
|
||||
}
|
||||
|
||||
container.innerHTML = data.releases
|
||||
.map((release, index) => renderRelease(release, index === 0))
|
||||
.join('');
|
||||
|
||||
} catch (error) {
|
||||
console.error('Failed to load releases:', error);
|
||||
container.innerHTML = `
|
||||
<div class="error-message">
|
||||
<p>Unable to load releases. Please try again later or visit our
|
||||
<a href="https://github.com/AutoMaker-Org/automaker/releases" target="_blank">GitHub releases page</a>.</p>
|
||||
</div>
|
||||
`;
|
||||
}
|
||||
}
|
||||
|
||||
loadReleases();
|
||||
})();
|
||||
</script>
|
||||
</body>
|
||||
</html>
|
||||
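For reference, a minimal TypeScript sketch of the manifest shape the script above expects from R2_RELEASES_URL; the field names are inferred from renderRelease() and loadReleases(), and the real manifest may carry additional fields:

// Shape assumed by loadReleases()/renderRelease() above (inferred, not authoritative).
interface ReleaseAsset {
  url: string;   // direct download link
  size: number;  // bytes, formatted by formatSize()
}

interface Release {
  version: string;          // e.g. "v0.1.0"
  date: string;             // ISO date string, formatted by formatDate()
  githubReleaseUrl: string; // target of the "View release notes" link
  assets: Partial<Record<"windows" | "macos" | "macosArm" | "linux", ReleaseAsset>>;
}

interface ReleasesManifest {
  releases: Release[];      // newest first; index 0 receives the "Latest" badge
}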
45
apps/server/.env.example
Normal file
@@ -0,0 +1,45 @@
# Automaker Server Configuration
# Copy this file to .env and configure your settings

# ============================================
# REQUIRED
# ============================================

# Your Anthropic API key for Claude models
ANTHROPIC_API_KEY=sk-ant-...

# ============================================
# OPTIONAL - Security
# ============================================

# API key for authenticating requests (leave empty to disable auth)
# If set, all API requests must include X-API-Key header
AUTOMAKER_API_KEY=

# Restrict file operations to these directories (comma-separated)
# Important for security in multi-tenant environments
ALLOWED_PROJECT_DIRS=/home/user/projects,/var/www

# CORS origin - which domains can access the API
# Use "*" for development, set specific origin for production
CORS_ORIGIN=*

# ============================================
# OPTIONAL - Server
# ============================================

# Port to run the server on
PORT=3008

# Data directory for sessions and metadata
DATA_DIR=./data

# ============================================
# OPTIONAL - Additional AI Providers
# ============================================

# OpenAI API key (for Codex CLI support)
OPENAI_API_KEY=

# Google API key (for future Gemini support)
GOOGLE_API_KEY=
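As a quick illustration of the AUTOMAKER_API_KEY setting above, a client sketch that passes the key in the X-API-Key header; the port and the /api/features/list route are taken from other files in this change, and the project path is a placeholder:

// Sketch: call an authenticated Automaker endpoint with the configured API key.
const BASE_URL = "http://localhost:3008";               // PORT above
const API_KEY = process.env.AUTOMAKER_API_KEY ?? "";    // same value the server reads

const res = await fetch(`${BASE_URL}/api/features/list`, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    ...(API_KEY ? { "X-API-Key": API_KEY } : {}),       // header only needed when auth is enabled
  },
  body: JSON.stringify({ projectPath: "/home/user/projects/my-app" }), // placeholder path
});
console.log(res.status, await res.json());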
1
apps/server/.gitignore
vendored
Normal file
@@ -0,0 +1 @@
.env
55
apps/server/Dockerfile
Normal file
@@ -0,0 +1,55 @@
# Automaker Backend Server
# Multi-stage build for minimal production image

# Build stage
FROM node:20-alpine AS builder

WORKDIR /app

# Copy package files
COPY package*.json ./
COPY apps/server/package*.json ./apps/server/

# Install dependencies
RUN npm ci --workspace=apps/server

# Copy source
COPY apps/server ./apps/server

# Build TypeScript
RUN npm run build --workspace=apps/server

# Production stage
FROM node:20-alpine

WORKDIR /app

# Create non-root user
RUN addgroup -g 1001 -S automaker && \
  adduser -S automaker -u 1001

# Copy built files and production dependencies
COPY --from=builder /app/apps/server/dist ./dist
COPY --from=builder /app/apps/server/package*.json ./
COPY --from=builder /app/node_modules ./node_modules

# Create data directory
RUN mkdir -p /data && chown automaker:automaker /data

# Switch to non-root user
USER automaker

# Environment variables
ENV NODE_ENV=production
ENV PORT=3008
ENV DATA_DIR=/data

# Expose port
EXPOSE 3008

# Health check
HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
  CMD wget --no-verbose --tries=1 --spider http://localhost:3008/api/health || exit 1

# Start server
CMD ["node", "dist/index.js"]
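The HEALTHCHECK above probes /api/health with wget from inside the container; the same check from the host could be a short sketch like this (it assumes the health route added elsewhere in this change responds with JSON on port 3008):

// Sketch: host-side equivalent of the container HEALTHCHECK.
const res = await fetch("http://localhost:3008/api/health");
if (!res.ok) throw new Error(`Server unhealthy: HTTP ${res.status}`);
console.log("healthy:", await res.json());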
@@ -0,0 +1,38 @@
|
||||
[
|
||||
{
|
||||
"id": "msg_1765523527444_d8xyk8rh9",
|
||||
"role": "user",
|
||||
"content": "this is a test",
|
||||
"timestamp": "2025-12-12T07:12:07.444Z"
|
||||
},
|
||||
{
|
||||
"id": "msg_1765523534075_351i0rcft",
|
||||
"role": "assistant",
|
||||
"content": "Hello! This is a test response. I'm working correctly and ready to help you with your software development projects.\n\nIs there something specific you'd like to work on today? I can help you with:\n\n- **Planning a new project** - Define requirements, architecture, and technical approach\n- **Writing code** - Create new files, implement features, or build components\n- **Editing existing code** - Modify, refactor, or fix bugs in your codebase\n- **Running commands** - Execute tests, builds, or other terminal commands\n- **Exploring your codebase** - Search for patterns, understand existing code structure\n\nJust let me know what you need!",
|
||||
"timestamp": "2025-12-12T07:12:14.075Z"
|
||||
},
|
||||
{
|
||||
"id": "msg_1765524006202_3d2cuya6b",
|
||||
"role": "user",
|
||||
"content": "test",
|
||||
"timestamp": "2025-12-12T07:20:06.202Z"
|
||||
},
|
||||
{
|
||||
"id": "msg_1765524011389_sdfbuai3k",
|
||||
"role": "assistant",
|
||||
"content": "Hello! I'm here to help you build software. How can I assist you today?\n\nI can help you with:\n\n- **Planning** - Define requirements, discuss architecture, and design your project\n- **Coding** - Write, edit, and modify code files\n- **Debugging** - Analyze issues and fix bugs\n- **Testing** - Run tests and verify functionality\n- **Exploring** - Search and understand existing codebases\n\nWhat would you like to work on?",
|
||||
"timestamp": "2025-12-12T07:20:11.389Z"
|
||||
},
|
||||
{
|
||||
"id": "msg_1765524491273_7ocghnbb5",
|
||||
"role": "user",
|
||||
"content": "test",
|
||||
"timestamp": "2025-12-12T07:28:11.273Z"
|
||||
},
|
||||
{
|
||||
"id": "msg_1765524498237_x8b4148gc",
|
||||
"role": "assistant",
|
||||
"content": "Hello! I'm here to help you build software. How can I assist you today?\n\nI can help you with:\n- **Planning** - Define requirements, architecture, and technical approaches\n- **Coding** - Write, edit, and modify code files\n- **Debugging** - Find and fix issues in your codebase\n- **Testing** - Run tests and help improve test coverage\n- **Exploring** - Search and analyze your existing codebase\n\nWhat would you like to work on?",
|
||||
"timestamp": "2025-12-12T07:28:18.237Z"
|
||||
}
|
||||
]
|
||||
File diff suppressed because one or more lines are too long
18
apps/server/data/sessions-metadata.json
Normal file
@@ -0,0 +1,18 @@
{
  "msg_1765523524581_xhk6u45v2": {
    "id": "msg_1765523524581_xhk6u45v2",
    "name": "Bright Agent 2",
    "projectPath": "/Users/webdevcody/Workspace/automaker",
    "workingDirectory": "/Users/webdevcody/Workspace/automaker",
    "createdAt": "2025-12-12T07:12:04.582Z",
    "updatedAt": "2025-12-12T07:28:18.571Z"
  },
  "msg_1765525491205_xeuqv7i9v": {
    "id": "msg_1765525491205_xeuqv7i9v",
    "name": "Optimal Helper 52",
    "projectPath": "/Users/webdevcody/Workspace/automaker",
    "workingDirectory": "/Users/webdevcody/Workspace/automaker",
    "createdAt": "2025-12-12T07:44:51.205Z",
    "updatedAt": "2025-12-12T07:46:03.339Z"
  }
}
29
apps/server/package.json
Normal file
@@ -0,0 +1,29 @@
{
  "name": "@automaker/server",
  "version": "0.1.0",
  "description": "Backend server for Automaker - provides API for both web and Electron modes",
  "private": true,
  "type": "module",
  "main": "dist/index.js",
  "scripts": {
    "dev": "tsx watch src/index.ts",
    "build": "tsc",
    "start": "node dist/index.js",
    "lint": "eslint src/"
  },
  "dependencies": {
    "@anthropic-ai/claude-agent-sdk": "^0.1.61",
    "cors": "^2.8.5",
    "dotenv": "^17.2.3",
    "express": "^5.1.0",
    "ws": "^8.18.0"
  },
  "devDependencies": {
    "@types/cors": "^2.8.18",
    "@types/express": "^5.0.1",
    "@types/node": "^20",
    "@types/ws": "^8.18.1",
    "tsx": "^4.19.4",
    "typescript": "^5"
  }
}
171
apps/server/src/index.ts
Normal file
@@ -0,0 +1,171 @@
|
||||
/**
|
||||
* Automaker Backend Server
|
||||
*
|
||||
* Provides HTTP/WebSocket API for both web and Electron modes.
|
||||
* In Electron mode, this server runs locally.
|
||||
* In web mode, this server runs on a remote host.
|
||||
*/
|
||||
|
||||
import express from "express";
|
||||
import cors from "cors";
|
||||
import { WebSocketServer, WebSocket } from "ws";
|
||||
import { createServer } from "http";
|
||||
import dotenv from "dotenv";
|
||||
|
||||
import { createEventEmitter, type EventEmitter } from "./lib/events.js";
|
||||
import { initAllowedPaths } from "./lib/security.js";
|
||||
import { authMiddleware } from "./lib/auth.js";
|
||||
import { createFsRoutes } from "./routes/fs.js";
|
||||
import { createHealthRoutes } from "./routes/health.js";
|
||||
import { createAgentRoutes } from "./routes/agent.js";
|
||||
import { createSessionsRoutes } from "./routes/sessions.js";
|
||||
import { createFeaturesRoutes } from "./routes/features.js";
|
||||
import { createAutoModeRoutes } from "./routes/auto-mode.js";
|
||||
import { createWorktreeRoutes } from "./routes/worktree.js";
|
||||
import { createGitRoutes } from "./routes/git.js";
|
||||
import { createSetupRoutes } from "./routes/setup.js";
|
||||
import { createSuggestionsRoutes } from "./routes/suggestions.js";
|
||||
import { createModelsRoutes } from "./routes/models.js";
|
||||
import { createSpecRegenerationRoutes } from "./routes/spec-regeneration.js";
|
||||
import { createRunningAgentsRoutes } from "./routes/running-agents.js";
|
||||
import { createWorkspaceRoutes } from "./routes/workspace.js";
|
||||
import { AgentService } from "./services/agent-service.js";
|
||||
import { FeatureLoader } from "./services/feature-loader.js";
|
||||
|
||||
// Load environment variables
|
||||
dotenv.config();
|
||||
|
||||
const PORT = parseInt(process.env.PORT || "3008", 10);
|
||||
const DATA_DIR = process.env.DATA_DIR || "./data";
|
||||
|
||||
// Check for required environment variables
|
||||
// Claude Agent SDK supports EITHER OAuth token (subscription) OR API key (pay-per-use)
|
||||
const hasAnthropicKey = !!process.env.ANTHROPIC_API_KEY;
|
||||
const hasOAuthToken = !!process.env.CLAUDE_CODE_OAUTH_TOKEN;
|
||||
|
||||
if (!hasAnthropicKey && !hasOAuthToken) {
|
||||
console.warn(`
|
||||
╔═══════════════════════════════════════════════════════════════════════╗
|
||||
║ ⚠️ WARNING: No Claude authentication configured ║
|
||||
║ ║
|
||||
║ The Claude Agent SDK requires authentication to function. ║
|
||||
║ ║
|
||||
║ Option 1 - Subscription (OAuth Token): ║
|
||||
║ export CLAUDE_CODE_OAUTH_TOKEN="your-oauth-token" ║
|
||||
║ ║
|
||||
║ Option 2 - Pay-per-use (API Key): ║
|
||||
║ export ANTHROPIC_API_KEY="sk-ant-..." ║
|
||||
║ ║
|
||||
║ Or use the setup wizard in Settings to configure authentication. ║
|
||||
╚═══════════════════════════════════════════════════════════════════════╝
|
||||
`);
|
||||
} else if (hasOAuthToken) {
|
||||
console.log("[Server] ✓ CLAUDE_CODE_OAUTH_TOKEN detected (subscription auth)");
|
||||
} else {
|
||||
console.log("[Server] ✓ ANTHROPIC_API_KEY detected (API key auth)");
|
||||
}
|
||||
|
||||
// Initialize security
|
||||
initAllowedPaths();
|
||||
|
||||
// Create Express app
|
||||
const app = express();
|
||||
|
||||
// Middleware
|
||||
app.use(
|
||||
cors({
|
||||
origin: process.env.CORS_ORIGIN || "*",
|
||||
credentials: true,
|
||||
})
|
||||
);
|
||||
app.use(express.json({ limit: "50mb" }));
|
||||
|
||||
// Create shared event emitter for streaming
|
||||
const events: EventEmitter = createEventEmitter();
|
||||
|
||||
// Create services
|
||||
const agentService = new AgentService(DATA_DIR, events);
|
||||
const featureLoader = new FeatureLoader();
|
||||
|
||||
// Initialize services
|
||||
(async () => {
|
||||
await agentService.initialize();
|
||||
console.log("[Server] Agent service initialized");
|
||||
})();
|
||||
|
||||
// Mount API routes - health is unauthenticated for monitoring
|
||||
app.use("/api/health", createHealthRoutes());
|
||||
|
||||
// Apply authentication to all other routes
|
||||
app.use("/api", authMiddleware);
|
||||
|
||||
app.use("/api/fs", createFsRoutes(events));
|
||||
app.use("/api/agent", createAgentRoutes(agentService, events));
|
||||
app.use("/api/sessions", createSessionsRoutes(agentService));
|
||||
app.use("/api/features", createFeaturesRoutes(featureLoader));
|
||||
app.use("/api/auto-mode", createAutoModeRoutes(events));
|
||||
app.use("/api/worktree", createWorktreeRoutes());
|
||||
app.use("/api/git", createGitRoutes());
|
||||
app.use("/api/setup", createSetupRoutes());
|
||||
app.use("/api/suggestions", createSuggestionsRoutes(events));
|
||||
app.use("/api/models", createModelsRoutes());
|
||||
app.use("/api/spec-regeneration", createSpecRegenerationRoutes(events));
|
||||
app.use("/api/running-agents", createRunningAgentsRoutes());
|
||||
app.use("/api/workspace", createWorkspaceRoutes());
|
||||
|
||||
// Create HTTP server
|
||||
const server = createServer(app);
|
||||
|
||||
// WebSocket server for streaming events
|
||||
const wss = new WebSocketServer({ server, path: "/api/events" });
|
||||
|
||||
wss.on("connection", (ws: WebSocket) => {
|
||||
console.log("[WebSocket] Client connected");
|
||||
|
||||
// Subscribe to all events and forward to this client
|
||||
const unsubscribe = events.subscribe((type, payload) => {
|
||||
if (ws.readyState === WebSocket.OPEN) {
|
||||
ws.send(JSON.stringify({ type, payload }));
|
||||
}
|
||||
});
|
||||
|
||||
ws.on("close", () => {
|
||||
console.log("[WebSocket] Client disconnected");
|
||||
unsubscribe();
|
||||
});
|
||||
|
||||
ws.on("error", (error) => {
|
||||
console.error("[WebSocket] Error:", error);
|
||||
unsubscribe();
|
||||
});
|
||||
});
|
||||
|
||||
// Start server
|
||||
server.listen(PORT, () => {
|
||||
console.log(`
|
||||
╔═══════════════════════════════════════════════════════╗
|
||||
║ Automaker Backend Server ║
|
||||
╠═══════════════════════════════════════════════════════╣
|
||||
║ HTTP API: http://localhost:${PORT} ║
|
||||
║ WebSocket: ws://localhost:${PORT}/api/events ║
|
||||
║ Health: http://localhost:${PORT}/api/health ║
|
||||
╚═══════════════════════════════════════════════════════╝
|
||||
`);
|
||||
});
|
||||
|
||||
// Graceful shutdown
|
||||
process.on("SIGTERM", () => {
|
||||
console.log("SIGTERM received, shutting down...");
|
||||
server.close(() => {
|
||||
console.log("Server closed");
|
||||
process.exit(0);
|
||||
});
|
||||
});
|
||||
|
||||
process.on("SIGINT", () => {
|
||||
console.log("SIGINT received, shutting down...");
|
||||
server.close(() => {
|
||||
console.log("Server closed");
|
||||
process.exit(0);
|
||||
});
|
||||
});
|
||||
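To illustrate the flow wired up above: agent replies are not returned from HTTP responses; they arrive as { type, payload } frames on the /api/events WebSocket. A minimal client sketch, using the same ws package this server depends on and assuming the defaults in this file (port 3008, no AUTOMAKER_API_KEY):

// Sketch: subscribe to the event stream, then trigger an agent message over HTTP.
import WebSocket from "ws";

const ws = new WebSocket("ws://localhost:3008/api/events");

ws.on("open", async () => {
  // Fire-and-forget; depending on AgentService (not shown here) a prior
  // POST /api/agent/start for the same sessionId may be required.
  await fetch("http://localhost:3008/api/agent/send", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ sessionId: "demo-session", message: "hello" }),
  });
});

ws.on("message", (raw) => {
  const { type, payload } = JSON.parse(raw.toString());
  if (type === "agent:stream") console.log(payload); // streamed agent output
});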
62
apps/server/src/lib/auth.ts
Normal file
@@ -0,0 +1,62 @@
/**
 * Authentication middleware for API security
 *
 * Supports API key authentication via header or environment variable.
 */

import type { Request, Response, NextFunction } from "express";

// API key from environment (optional - if not set, auth is disabled)
const API_KEY = process.env.AUTOMAKER_API_KEY;

/**
 * Authentication middleware
 *
 * If AUTOMAKER_API_KEY is set, requires matching key in X-API-Key header.
 * If not set, allows all requests (development mode).
 */
export function authMiddleware(req: Request, res: Response, next: NextFunction): void {
  // If no API key is configured, allow all requests
  if (!API_KEY) {
    next();
    return;
  }

  // Check for API key in header
  const providedKey = req.headers["x-api-key"] as string | undefined;

  if (!providedKey) {
    res.status(401).json({
      success: false,
      error: "Authentication required. Provide X-API-Key header.",
    });
    return;
  }

  if (providedKey !== API_KEY) {
    res.status(403).json({
      success: false,
      error: "Invalid API key.",
    });
    return;
  }

  next();
}

/**
 * Check if authentication is enabled
 */
export function isAuthEnabled(): boolean {
  return !!API_KEY;
}

/**
 * Get authentication status for health endpoint
 */
export function getAuthStatus(): { enabled: boolean; method: string } {
  return {
    enabled: !!API_KEY,
    method: API_KEY ? "api_key" : "none",
  };
}
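getAuthStatus() is exported for the health endpoint, which is not part of this excerpt; a plausible (hypothetical) consumer looks like the sketch below. Once AUTOMAKER_API_KEY is set, the middleware above answers 401 when the X-API-Key header is missing and 403 when it does not match:

// Hypothetical illustration only - the real health route is not shown in this diff.
import { Router } from "express";
import { getAuthStatus, isAuthEnabled } from "../lib/auth.js"; // assumes a file under src/routes/

export function createStatusRoute(): Router {
  const router = Router();
  router.get("/", (_req, res) => {
    // getAuthStatus() -> { enabled: boolean, method: "api_key" | "none" }
    res.json({ ok: true, auth: getAuthStatus(), authRequired: isAuthEnabled() });
  });
  return router;
}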
57
apps/server/src/lib/events.ts
Normal file
@@ -0,0 +1,57 @@
/**
 * Event emitter for streaming events to WebSocket clients
 */

export type EventType =
  | "agent:stream"
  | "auto-mode:event"
  | "auto-mode:started"
  | "auto-mode:stopped"
  | "auto-mode:idle"
  | "auto-mode:error"
  | "feature:started"
  | "feature:completed"
  | "feature:stopped"
  | "feature:error"
  | "feature:progress"
  | "feature:tool-use"
  | "feature:follow-up-started"
  | "feature:follow-up-completed"
  | "feature:verified"
  | "feature:committed"
  | "project:analysis-started"
  | "project:analysis-progress"
  | "project:analysis-completed"
  | "project:analysis-error"
  | "suggestions:event"
  | "spec-regeneration:event";

export type EventCallback = (type: EventType, payload: unknown) => void;

export interface EventEmitter {
  emit: (type: EventType, payload: unknown) => void;
  subscribe: (callback: EventCallback) => () => void;
}

export function createEventEmitter(): EventEmitter {
  const subscribers = new Set<EventCallback>();

  return {
    emit(type: EventType, payload: unknown) {
      for (const callback of subscribers) {
        try {
          callback(type, payload);
        } catch (error) {
          console.error("Error in event subscriber:", error);
        }
      }
    },

    subscribe(callback: EventCallback) {
      subscribers.add(callback);
      return () => {
        subscribers.delete(callback);
      };
    },
  };
}
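A short usage sketch of the emitter above, the same pattern index.ts uses to bridge events onto WebSocket clients (the import path assumes the snippet lives directly under src/; the payload shape is illustrative):

// Sketch: emit, subscribe, and unsubscribe with the in-memory emitter above.
import { createEventEmitter } from "./lib/events.js";

const events = createEventEmitter();

const unsubscribe = events.subscribe((type, payload) => {
  console.log(`[event] ${type}`, payload);
});

events.emit("feature:started", { featureId: "feature-001" });   // payload shape is illustrative
unsubscribe();                                                   // later emits are no longer delivered
events.emit("feature:completed", { featureId: "feature-001" }); // not logged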
72
apps/server/src/lib/security.ts
Normal file
@@ -0,0 +1,72 @@
/**
 * Security utilities for path validation
 */

import path from "path";

// Allowed project directories - loaded from environment
const allowedPaths = new Set<string>();

/**
 * Initialize allowed paths from environment variable
 */
export function initAllowedPaths(): void {
  const dirs = process.env.ALLOWED_PROJECT_DIRS;
  if (dirs) {
    for (const dir of dirs.split(",")) {
      const trimmed = dir.trim();
      if (trimmed) {
        allowedPaths.add(path.resolve(trimmed));
      }
    }
  }

  // Always allow the data directory
  const dataDir = process.env.DATA_DIR;
  if (dataDir) {
    allowedPaths.add(path.resolve(dataDir));
  }
}

/**
 * Add a path to the allowed list
 */
export function addAllowedPath(filePath: string): void {
  allowedPaths.add(path.resolve(filePath));
}

/**
 * Check if a path is allowed
 */
export function isPathAllowed(filePath: string): boolean {
  const resolved = path.resolve(filePath);

  // Check if the path is under any allowed directory
  for (const allowed of allowedPaths) {
    if (resolved.startsWith(allowed + path.sep) || resolved === allowed) {
      return true;
    }
  }

  return false;
}

/**
 * Validate a path and throw if not allowed
 */
export function validatePath(filePath: string): string {
  const resolved = path.resolve(filePath);

  if (!isPathAllowed(resolved)) {
    throw new Error(`Access denied: ${filePath} is not in an allowed directory`);
  }

  return resolved;
}

/**
 * Get list of allowed paths (for debugging)
 */
export function getAllowedPaths(): string[] {
  return Array.from(allowedPaths);
}
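A usage sketch of the helpers above, assuming ALLOWED_PROJECT_DIRS=/home/user/projects,/var/www as in the .env.example earlier in this change (the import path assumes the snippet lives directly under src/):

// Sketch: path allow-listing with the helpers above.
import { initAllowedPaths, isPathAllowed, validatePath, addAllowedPath } from "./lib/security.js";

initAllowedPaths();                                         // seeds the set from the environment

console.log(isPathAllowed("/home/user/projects/app/src"));  // true - under an allowed root
console.log(isPathAllowed("/etc/passwd"));                  // false - outside every allowed root

addAllowedPath("/tmp/scratch");                             // e.g. after a user picks a directory
const resolved = validatePath("/tmp/scratch/notes.txt");    // returns the resolved absolute path
console.log(resolved);
// validatePath("/etc/passwd") would throw "Access denied: ... is not in an allowed directory"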
132
apps/server/src/routes/agent.ts
Normal file
@@ -0,0 +1,132 @@
|
||||
/**
|
||||
* Agent routes - HTTP API for Claude agent interactions
|
||||
*/
|
||||
|
||||
import { Router, type Request, type Response } from "express";
|
||||
import { AgentService } from "../services/agent-service.js";
|
||||
import type { EventEmitter } from "../lib/events.js";
|
||||
|
||||
export function createAgentRoutes(
|
||||
agentService: AgentService,
|
||||
_events: EventEmitter
|
||||
): Router {
|
||||
const router = Router();
|
||||
|
||||
// Start a conversation
|
||||
router.post("/start", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { sessionId, workingDirectory } = req.body as {
|
||||
sessionId: string;
|
||||
workingDirectory?: string;
|
||||
};
|
||||
|
||||
if (!sessionId) {
|
||||
res.status(400).json({ success: false, error: "sessionId is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const result = await agentService.startConversation({
|
||||
sessionId,
|
||||
workingDirectory,
|
||||
});
|
||||
|
||||
res.json(result);
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Send a message
|
||||
router.post("/send", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { sessionId, message, workingDirectory, imagePaths } = req.body as {
|
||||
sessionId: string;
|
||||
message: string;
|
||||
workingDirectory?: string;
|
||||
imagePaths?: string[];
|
||||
};
|
||||
|
||||
if (!sessionId || !message) {
|
||||
res
|
||||
.status(400)
|
||||
.json({ success: false, error: "sessionId and message are required" });
|
||||
return;
|
||||
}
|
||||
|
||||
// Start the message processing (don't await - it streams via WebSocket)
|
||||
agentService
|
||||
.sendMessage({
|
||||
sessionId,
|
||||
message,
|
||||
workingDirectory,
|
||||
imagePaths,
|
||||
})
|
||||
.catch((error) => {
|
||||
console.error("[Agent Route] Error sending message:", error);
|
||||
});
|
||||
|
||||
// Return immediately - responses come via WebSocket
|
||||
res.json({ success: true, message: "Message sent" });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Get conversation history
|
||||
router.post("/history", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { sessionId } = req.body as { sessionId: string };
|
||||
|
||||
if (!sessionId) {
|
||||
res.status(400).json({ success: false, error: "sessionId is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const result = agentService.getHistory(sessionId);
|
||||
res.json(result);
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Stop execution
|
||||
router.post("/stop", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { sessionId } = req.body as { sessionId: string };
|
||||
|
||||
if (!sessionId) {
|
||||
res.status(400).json({ success: false, error: "sessionId is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const result = await agentService.stopExecution(sessionId);
|
||||
res.json(result);
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Clear conversation
|
||||
router.post("/clear", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { sessionId } = req.body as { sessionId: string };
|
||||
|
||||
if (!sessionId) {
|
||||
res.status(400).json({ success: false, error: "sessionId is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const result = await agentService.clearSession(sessionId);
|
||||
res.json(result);
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
return router;
|
||||
}
|
||||
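Putting the routes above together, a typical client flow is start, send, stream (via the /api/events WebSocket), then history; a sketch assuming no API key is configured and the defaults from index.ts:

// Sketch: drive the agent routes above over HTTP; streamed output arrives separately via /api/events.
const agentApi = (route: string, body: unknown) =>
  fetch(`http://localhost:3008/api/agent${route}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  }).then((r) => r.json());

const sessionId = "demo-session";                                       // any stable identifier
await agentApi("/start", { sessionId, workingDirectory: "/tmp/demo" }); // placeholder directory
await agentApi("/send", { sessionId, message: "List the files in this project" });
// ...responses stream over the WebSocket; once the run settles, fetch the transcript:
console.log(await agentApi("/history", { sessionId }));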
263
apps/server/src/routes/auto-mode.ts
Normal file
@@ -0,0 +1,263 @@
|
||||
/**
|
||||
* Auto Mode routes - HTTP API for autonomous feature implementation
|
||||
*
|
||||
* Uses the AutoModeService for real feature execution with Claude Agent SDK
|
||||
*/
|
||||
|
||||
import { Router, type Request, type Response } from "express";
|
||||
import type { EventEmitter } from "../lib/events.js";
|
||||
import { AutoModeService } from "../services/auto-mode-service.js";
|
||||
|
||||
export function createAutoModeRoutes(events: EventEmitter): Router {
|
||||
const router = Router();
|
||||
const autoModeService = new AutoModeService(events);
|
||||
|
||||
// Start auto mode loop
|
||||
router.post("/start", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, maxConcurrency } = req.body as {
|
||||
projectPath: string;
|
||||
maxConcurrency?: number;
|
||||
};
|
||||
|
||||
if (!projectPath) {
|
||||
res.status(400).json({ success: false, error: "projectPath is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
await autoModeService.startAutoLoop(projectPath, maxConcurrency || 3);
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Stop auto mode loop
|
||||
router.post("/stop", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const runningCount = await autoModeService.stopAutoLoop();
|
||||
res.json({ success: true, runningFeatures: runningCount });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Stop a specific feature
|
||||
router.post("/stop-feature", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { featureId } = req.body as { featureId: string };
|
||||
|
||||
if (!featureId) {
|
||||
res.status(400).json({ success: false, error: "featureId is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const stopped = await autoModeService.stopFeature(featureId);
|
||||
res.json({ success: true, stopped });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Get auto mode status
|
||||
router.post("/status", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const status = autoModeService.getStatus();
|
||||
res.json({
|
||||
success: true,
|
||||
...status,
|
||||
});
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Run a single feature
|
||||
router.post("/run-feature", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId, useWorktrees } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
useWorktrees?: boolean;
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId) {
|
||||
res
|
||||
.status(400)
|
||||
.json({ success: false, error: "projectPath and featureId are required" });
|
||||
return;
|
||||
}
|
||||
|
||||
// Start execution in background
|
||||
autoModeService
|
||||
.executeFeature(projectPath, featureId, useWorktrees ?? true, false)
|
||||
.catch((error) => {
|
||||
console.error(`[AutoMode] Feature ${featureId} error:`, error);
|
||||
});
|
||||
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Verify a feature
|
||||
router.post("/verify-feature", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId) {
|
||||
res
|
||||
.status(400)
|
||||
.json({ success: false, error: "projectPath and featureId are required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const passes = await autoModeService.verifyFeature(projectPath, featureId);
|
||||
res.json({ success: true, passes });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Resume a feature
|
||||
router.post("/resume-feature", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId, useWorktrees } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
useWorktrees?: boolean;
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId) {
|
||||
res
|
||||
.status(400)
|
||||
.json({ success: false, error: "projectPath and featureId are required" });
|
||||
return;
|
||||
}
|
||||
|
||||
// Start resume in background
|
||||
autoModeService
|
||||
.resumeFeature(projectPath, featureId, useWorktrees ?? true)
|
||||
.catch((error) => {
|
||||
console.error(`[AutoMode] Resume feature ${featureId} error:`, error);
|
||||
});
|
||||
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Check if context exists for a feature
|
||||
router.post("/context-exists", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId) {
|
||||
res
|
||||
.status(400)
|
||||
.json({ success: false, error: "projectPath and featureId are required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const exists = await autoModeService.contextExists(projectPath, featureId);
|
||||
res.json({ success: true, exists });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Analyze project
|
||||
router.post("/analyze-project", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath } = req.body as { projectPath: string };
|
||||
|
||||
if (!projectPath) {
|
||||
res.status(400).json({ success: false, error: "projectPath is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
// Start analysis in background
|
||||
autoModeService.analyzeProject(projectPath).catch((error) => {
|
||||
console.error(`[AutoMode] Project analysis error:`, error);
|
||||
});
|
||||
|
||||
res.json({ success: true, message: "Project analysis started" });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Follow up on a feature
|
||||
router.post("/follow-up-feature", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId, prompt, imagePaths } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
prompt: string;
|
||||
imagePaths?: string[];
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId || !prompt) {
|
||||
res.status(400).json({
|
||||
success: false,
|
||||
error: "projectPath, featureId, and prompt are required",
|
||||
});
|
||||
return;
|
||||
}
|
||||
|
||||
// Start follow-up in background
|
||||
autoModeService
|
||||
.followUpFeature(projectPath, featureId, prompt, imagePaths)
|
||||
.catch((error) => {
|
||||
console.error(`[AutoMode] Follow up feature ${featureId} error:`, error);
|
||||
});
|
||||
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Commit feature changes
|
||||
router.post("/commit-feature", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId) {
|
||||
res
|
||||
.status(400)
|
||||
.json({ success: false, error: "projectPath and featureId are required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const commitHash = await autoModeService.commitFeature(projectPath, featureId);
|
||||
res.json({ success: true, commitHash });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
return router;
|
||||
}
|
||||
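A sketch of driving the routes above for a single feature; progress itself is reported as feature:* events on the /api/events WebSocket (event names come from events.ts earlier in this change), and the project path and feature id below are placeholders:

// Sketch: run one feature and then check the auto-mode status.
const autoMode = (route: string, body: unknown) =>
  fetch(`http://localhost:3008/api/auto-mode${route}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  }).then((r) => r.json());

const projectPath = "/home/user/projects/my-app";  // placeholder allowed project
await autoMode("/run-feature", { projectPath, featureId: "feature-001", useWorktrees: true });

const status = await autoMode("/status", {});      // { success: true, ...autoModeService.getStatus() }
console.log(status);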
159
apps/server/src/routes/features.ts
Normal file
@@ -0,0 +1,159 @@
|
||||
/**
|
||||
* Features routes - HTTP API for feature management
|
||||
*/
|
||||
|
||||
import { Router, type Request, type Response } from "express";
|
||||
import { FeatureLoader, type Feature } from "../services/feature-loader.js";
|
||||
import { addAllowedPath } from "../lib/security.js";
|
||||
|
||||
export function createFeaturesRoutes(featureLoader: FeatureLoader): Router {
|
||||
const router = Router();
|
||||
|
||||
// List all features for a project
|
||||
router.post("/list", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath } = req.body as { projectPath: string };
|
||||
|
||||
if (!projectPath) {
|
||||
res.status(400).json({ success: false, error: "projectPath is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
// Add project path to allowed paths
|
||||
addAllowedPath(projectPath);
|
||||
|
||||
const features = await featureLoader.getAll(projectPath);
|
||||
res.json({ success: true, features });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Get a single feature
|
||||
router.post("/get", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId) {
|
||||
res
|
||||
.status(400)
|
||||
.json({ success: false, error: "projectPath and featureId are required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const feature = await featureLoader.get(projectPath, featureId);
|
||||
if (!feature) {
|
||||
res.status(404).json({ success: false, error: "Feature not found" });
|
||||
return;
|
||||
}
|
||||
|
||||
res.json({ success: true, feature });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Create a new feature
|
||||
router.post("/create", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, feature } = req.body as {
|
||||
projectPath: string;
|
||||
feature: Partial<Feature>;
|
||||
};
|
||||
|
||||
if (!projectPath || !feature) {
|
||||
res
|
||||
.status(400)
|
||||
.json({ success: false, error: "projectPath and feature are required" });
|
||||
return;
|
||||
}
|
||||
|
||||
// Add project path to allowed paths
|
||||
addAllowedPath(projectPath);
|
||||
|
||||
const created = await featureLoader.create(projectPath, feature);
|
||||
res.json({ success: true, feature: created });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Update a feature
|
||||
router.post("/update", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId, updates } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
updates: Partial<Feature>;
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId || !updates) {
|
||||
res.status(400).json({
|
||||
success: false,
|
||||
error: "projectPath, featureId, and updates are required",
|
||||
});
|
||||
return;
|
||||
}
|
||||
|
||||
const updated = await featureLoader.update(projectPath, featureId, updates);
|
||||
res.json({ success: true, feature: updated });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Delete a feature
|
||||
router.post("/delete", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId) {
|
||||
res
|
||||
.status(400)
|
||||
.json({ success: false, error: "projectPath and featureId are required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const success = await featureLoader.delete(projectPath, featureId);
|
||||
res.json({ success });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Get agent output for a feature
|
||||
router.post("/agent-output", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId) {
|
||||
res
|
||||
.status(400)
|
||||
.json({ success: false, error: "projectPath and featureId are required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const content = await featureLoader.getAgentOutput(projectPath, featureId);
|
||||
res.json({ success: true, content });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
return router;
|
||||
}
|
||||
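A sketch of the create/list round trip against the routes above; the Feature shape comes from feature-loader.ts, which is not part of this excerpt, so the fields below are illustrative:

// Sketch: create a feature, then list all features for the same project.
const featuresApi = (route: string, body: unknown) =>
  fetch(`http://localhost:3008/api/features${route}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  }).then((r) => r.json());

const projectPath = "/home/user/projects/my-app";  // placeholder
const created = await featuresApi("/create", {
  projectPath,
  feature: { title: "Add dark mode", description: "Toggle in settings" }, // illustrative fields
});
const listed = await featuresApi("/list", { projectPath });
console.log(created.feature, listed.features.length);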
622
apps/server/src/routes/fs.ts
Normal file
@@ -0,0 +1,622 @@
|
||||
/**
|
||||
* File system routes
|
||||
* Provides REST API equivalents for Electron IPC file operations
|
||||
*/
|
||||
|
||||
import { Router, type Request, type Response } from "express";
|
||||
import fs from "fs/promises";
|
||||
import os from "os";
|
||||
import path from "path";
|
||||
import { validatePath, addAllowedPath, isPathAllowed } from "../lib/security.js";
|
||||
import type { EventEmitter } from "../lib/events.js";
|
||||
|
||||
export function createFsRoutes(_events: EventEmitter): Router {
|
||||
const router = Router();
|
||||
|
||||
// Read file
|
||||
router.post("/read", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { filePath } = req.body as { filePath: string };
|
||||
|
||||
if (!filePath) {
|
||||
res.status(400).json({ success: false, error: "filePath is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const resolvedPath = validatePath(filePath);
|
||||
const content = await fs.readFile(resolvedPath, "utf-8");
|
||||
|
||||
res.json({ success: true, content });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Write file
|
||||
router.post("/write", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { filePath, content } = req.body as {
|
||||
filePath: string;
|
||||
content: string;
|
||||
};
|
||||
|
||||
if (!filePath) {
|
||||
res.status(400).json({ success: false, error: "filePath is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const resolvedPath = validatePath(filePath);
|
||||
|
||||
// Ensure parent directory exists
|
||||
await fs.mkdir(path.dirname(resolvedPath), { recursive: true });
|
||||
await fs.writeFile(resolvedPath, content, "utf-8");
|
||||
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Create directory
|
||||
router.post("/mkdir", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { dirPath } = req.body as { dirPath: string };
|
||||
|
||||
if (!dirPath) {
|
||||
res.status(400).json({ success: false, error: "dirPath is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const resolvedPath = validatePath(dirPath);
|
||||
await fs.mkdir(resolvedPath, { recursive: true });
|
||||
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Read directory
|
||||
router.post("/readdir", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { dirPath } = req.body as { dirPath: string };
|
||||
|
||||
if (!dirPath) {
|
||||
res.status(400).json({ success: false, error: "dirPath is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const resolvedPath = validatePath(dirPath);
|
||||
const entries = await fs.readdir(resolvedPath, { withFileTypes: true });
|
||||
|
||||
const result = entries.map((entry) => ({
|
||||
name: entry.name,
|
||||
isDirectory: entry.isDirectory(),
|
||||
isFile: entry.isFile(),
|
||||
}));
|
||||
|
||||
res.json({ success: true, entries: result });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Check if file/directory exists
|
||||
router.post("/exists", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { filePath } = req.body as { filePath: string };
|
||||
|
||||
if (!filePath) {
|
||||
res.status(400).json({ success: false, error: "filePath is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
// For exists, we check but don't require the path to be pre-allowed
|
||||
// This allows the UI to validate user-entered paths
|
||||
const resolvedPath = path.resolve(filePath);
|
||||
|
||||
try {
|
||||
await fs.access(resolvedPath);
|
||||
res.json({ success: true, exists: true });
|
||||
} catch {
|
||||
res.json({ success: true, exists: false });
|
||||
}
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Get file stats
|
||||
router.post("/stat", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { filePath } = req.body as { filePath: string };
|
||||
|
||||
if (!filePath) {
|
||||
res.status(400).json({ success: false, error: "filePath is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const resolvedPath = validatePath(filePath);
|
||||
const stats = await fs.stat(resolvedPath);
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
stats: {
|
||||
isDirectory: stats.isDirectory(),
|
||||
isFile: stats.isFile(),
|
||||
size: stats.size,
|
||||
mtime: stats.mtime,
|
||||
},
|
||||
});
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Delete file
|
||||
router.post("/delete", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { filePath } = req.body as { filePath: string };
|
||||
|
||||
if (!filePath) {
|
||||
res.status(400).json({ success: false, error: "filePath is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const resolvedPath = validatePath(filePath);
|
||||
await fs.rm(resolvedPath, { recursive: true });
|
||||
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Validate and add path to allowed list
|
||||
// This is the web equivalent of dialog:openDirectory
|
||||
router.post("/validate-path", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { filePath } = req.body as { filePath: string };
|
||||
|
||||
if (!filePath) {
|
||||
res.status(400).json({ success: false, error: "filePath is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const resolvedPath = path.resolve(filePath);
|
||||
|
||||
// Check if path exists
|
||||
try {
|
||||
const stats = await fs.stat(resolvedPath);
|
||||
|
||||
if (!stats.isDirectory()) {
|
||||
res.status(400).json({ success: false, error: "Path is not a directory" });
|
||||
return;
|
||||
}
|
||||
|
||||
// Add to allowed paths
|
||||
addAllowedPath(resolvedPath);
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
path: resolvedPath,
|
||||
isAllowed: isPathAllowed(resolvedPath),
|
||||
});
|
||||
} catch {
|
||||
res.status(400).json({ success: false, error: "Path does not exist" });
|
||||
}
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Resolve directory path from directory name and file structure
|
||||
// Used when browser file picker only provides directory name (not full path)
|
||||
router.post("/resolve-directory", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { directoryName, sampleFiles, fileCount } = req.body as {
|
||||
directoryName: string;
|
||||
sampleFiles?: string[];
|
||||
fileCount?: number;
|
||||
};
|
||||
|
||||
if (!directoryName) {
|
||||
res.status(400).json({ success: false, error: "directoryName is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
// If directoryName looks like an absolute path, try validating it directly
|
||||
if (path.isAbsolute(directoryName) || directoryName.includes(path.sep)) {
|
||||
try {
|
||||
const resolvedPath = path.resolve(directoryName);
|
||||
const stats = await fs.stat(resolvedPath);
|
||||
if (stats.isDirectory()) {
|
||||
addAllowedPath(resolvedPath);
|
||||
return res.json({
|
||||
success: true,
|
||||
path: resolvedPath,
|
||||
});
|
||||
}
|
||||
} catch {
|
||||
// Not a valid absolute path, continue to search
|
||||
}
|
||||
}
|
||||
|
||||
// Search for directory in common locations
|
||||
const searchPaths: string[] = [
|
||||
process.cwd(), // Current working directory
|
||||
process.env.HOME || process.env.USERPROFILE || "", // User home
|
||||
path.join(process.env.HOME || process.env.USERPROFILE || "", "Documents"),
|
||||
path.join(process.env.HOME || process.env.USERPROFILE || "", "Desktop"),
|
||||
// Common project locations
|
||||
path.join(process.env.HOME || process.env.USERPROFILE || "", "Projects"),
|
||||
].filter(Boolean);
|
||||
|
||||
// Also check parent of current working directory
|
||||
try {
|
||||
const parentDir = path.dirname(process.cwd());
|
||||
if (!searchPaths.includes(parentDir)) {
|
||||
searchPaths.push(parentDir);
|
||||
}
|
||||
} catch {
|
||||
// Ignore
|
||||
}
|
||||
|
||||
// Search for directory matching the name and file structure
|
||||
for (const searchPath of searchPaths) {
|
||||
try {
|
||||
const candidatePath = path.join(searchPath, directoryName);
|
||||
const stats = await fs.stat(candidatePath);
|
||||
|
||||
if (stats.isDirectory()) {
|
||||
// Verify it matches by checking for sample files
|
||||
if (sampleFiles && sampleFiles.length > 0) {
|
||||
let matches = 0;
|
||||
for (const sampleFile of sampleFiles.slice(0, 5)) {
|
||||
// Remove directory name prefix from sample file path
|
||||
const relativeFile = sampleFile.startsWith(directoryName + "/")
|
||||
? sampleFile.substring(directoryName.length + 1)
|
||||
: sampleFile.split("/").slice(1).join("/") || sampleFile.split("/").pop() || sampleFile;
|
||||
|
||||
try {
|
||||
const filePath = path.join(candidatePath, relativeFile);
|
||||
await fs.access(filePath);
|
||||
matches++;
|
||||
} catch {
|
||||
// File doesn't exist, continue checking
|
||||
}
|
||||
}
|
||||
|
||||
// If at least one file matches, consider it a match
|
||||
if (matches === 0 && sampleFiles.length > 0) {
|
||||
continue; // Try next candidate
|
||||
}
|
||||
}
|
||||
|
||||
// Found matching directory
|
||||
addAllowedPath(candidatePath);
|
||||
return res.json({
|
||||
success: true,
|
||||
path: candidatePath,
|
||||
});
|
||||
}
|
||||
} catch {
|
||||
// Directory doesn't exist at this location, continue searching
|
||||
continue;
|
||||
}
|
||||
}
|
||||
|
||||
// Directory not found
|
||||
res.status(404).json({
|
||||
success: false,
|
||||
error: `Directory "${directoryName}" not found in common locations. Please ensure the directory exists.`,
|
||||
});
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Save image to .automaker/images directory
|
||||
router.post("/save-image", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { data, filename, mimeType, projectPath } = req.body as {
|
||||
data: string;
|
||||
filename: string;
|
||||
mimeType: string;
|
||||
projectPath: string;
|
||||
};
|
||||
|
||||
if (!data || !filename || !projectPath) {
|
||||
res.status(400).json({
|
||||
success: false,
|
||||
error: "data, filename, and projectPath are required",
|
||||
});
|
||||
return;
|
||||
}
|
||||
|
||||
// Create .automaker/images directory if it doesn't exist
|
||||
const imagesDir = path.join(projectPath, ".automaker", "images");
|
||||
await fs.mkdir(imagesDir, { recursive: true });
|
||||
|
||||
// Decode base64 data (remove data URL prefix if present)
|
||||
const base64Data = data.replace(/^data:image\/\w+;base64,/, "");
|
||||
const buffer = Buffer.from(base64Data, "base64");
|
||||
|
||||
// Generate unique filename with timestamp
|
||||
const timestamp = Date.now();
|
||||
const ext = path.extname(filename) || ".png";
|
||||
const baseName = path.basename(filename, ext);
|
||||
const uniqueFilename = `${baseName}-${timestamp}${ext}`;
|
||||
const filePath = path.join(imagesDir, uniqueFilename);
|
||||
|
||||
// Write file
|
||||
await fs.writeFile(filePath, buffer);
|
||||
|
||||
// Add project path to allowed paths if not already
|
||||
addAllowedPath(projectPath);
|
||||
|
||||
res.json({ success: true, path: filePath });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
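For the /save-image route above, the request carries a base64 data URL plus a filename and project path, and the response returns the path written under .automaker/images; a sketch with placeholder values:

// Sketch: upload a small image to the save-image route above.
const result = await fetch("http://localhost:3008/api/fs/save-image", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    data: "data:image/png;base64,iVBORw0KGgo...",     // truncated placeholder payload
    filename: "screenshot.png",
    mimeType: "image/png",                            // accepted by the handler but not used
    projectPath: "/home/user/projects/my-app",        // placeholder
  }),
}).then((r) => r.json());
console.log(result.path); // e.g. <projectPath>/.automaker/images/screenshot-<timestamp>.png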
|
||||
// Browse directories - for file browser UI
|
||||
router.post("/browse", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { dirPath } = req.body as { dirPath?: string };
|
||||
|
||||
// Default to home directory if no path provided
|
||||
const targetPath = dirPath ? path.resolve(dirPath) : os.homedir();
|
||||
|
||||
// Detect available drives on Windows
|
||||
const detectDrives = async (): Promise<string[]> => {
|
||||
if (os.platform() !== "win32") {
|
||||
return [];
|
||||
}
|
||||
|
||||
const drives: string[] = [];
|
||||
const letters = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
|
||||
|
||||
for (const letter of letters) {
|
||||
const drivePath = `${letter}:\\`;
|
||||
try {
|
||||
await fs.access(drivePath);
|
||||
drives.push(drivePath);
|
||||
} catch {
|
||||
// Drive doesn't exist, skip it
|
||||
}
|
||||
}
|
||||
|
||||
return drives;
|
||||
};
|
||||
|
||||
try {
|
||||
const stats = await fs.stat(targetPath);
|
||||
|
||||
if (!stats.isDirectory()) {
|
||||
res.status(400).json({ success: false, error: "Path is not a directory" });
|
||||
return;
|
||||
}
|
||||
|
||||
// Read directory contents
|
||||
const entries = await fs.readdir(targetPath, { withFileTypes: true });
|
||||
|
||||
// Filter for directories only and add parent directory option
|
||||
const directories = entries
|
||||
.filter((entry) => entry.isDirectory() && !entry.name.startsWith("."))
|
||||
.map((entry) => ({
|
||||
name: entry.name,
|
||||
path: path.join(targetPath, entry.name),
|
||||
}))
|
||||
.sort((a, b) => a.name.localeCompare(b.name));
|
||||
|
||||
// Get parent directory
|
||||
const parentPath = path.dirname(targetPath);
|
||||
const hasParent = parentPath !== targetPath;
|
||||
|
||||
// Get available drives
|
||||
const drives = await detectDrives();
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
currentPath: targetPath,
|
||||
parentPath: hasParent ? parentPath : null,
|
||||
directories,
|
||||
drives,
|
||||
});
|
||||
} catch (error) {
|
||||
res.status(400).json({
|
||||
success: false,
|
||||
error: error instanceof Error ? error.message : "Failed to read directory",
|
||||
});
|
||||
}
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
  // Serve image files
  router.get("/image", async (req: Request, res: Response) => {
    try {
      const { path: imagePath, projectPath } = req.query as {
        path?: string;
        projectPath?: string;
      };

      if (!imagePath) {
        res.status(400).json({ success: false, error: "path is required" });
        return;
      }

      // Resolve full path
      const fullPath = path.isAbsolute(imagePath)
        ? imagePath
        : projectPath
          ? path.join(projectPath, imagePath)
          : imagePath;

      // Check if file exists
      try {
        await fs.access(fullPath);
      } catch {
        res.status(404).json({ success: false, error: "Image not found" });
        return;
      }

      // Read the file
      const buffer = await fs.readFile(fullPath);

      // Determine MIME type from extension
      const ext = path.extname(fullPath).toLowerCase();
      const mimeTypes: Record<string, string> = {
        ".png": "image/png",
        ".jpg": "image/jpeg",
        ".jpeg": "image/jpeg",
        ".gif": "image/gif",
        ".webp": "image/webp",
        ".svg": "image/svg+xml",
        ".bmp": "image/bmp",
      };

      res.setHeader("Content-Type", mimeTypes[ext] || "application/octet-stream");
      res.setHeader("Cache-Control", "public, max-age=3600");
      res.send(buffer);
    } catch (error) {
      const message = error instanceof Error ? error.message : "Unknown error";
      res.status(500).json({ success: false, error: message });
    }
  });

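For completeness, a sketch of how a client could load a saved image back through the GET /image route above; the /api/fs prefix is again an assumed mount path.

// Builds an <img>-compatible URL for an image stored relative to a project.
// The /api/fs prefix and port are assumptions, not shown in this diff.
function imageUrl(relativePath: string, projectPath: string): string {
  const params = new URLSearchParams({ path: relativePath, projectPath });
  return `http://localhost:3001/api/fs/image?${params.toString()}`;
}
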
// Browse directories for file picker
|
||||
// SECURITY: Restricted to home directory, allowed paths, and drive roots on Windows
|
||||
router.post("/browse", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { dirPath } = req.body as { dirPath?: string };
|
||||
const homeDir = os.homedir();
|
||||
|
||||
// Detect available drives on Windows
|
||||
const detectDrives = async (): Promise<string[]> => {
|
||||
if (os.platform() !== "win32") {
|
||||
return [];
|
||||
}
|
||||
|
||||
const drives: string[] = [];
|
||||
const letters = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
|
||||
|
||||
for (const letter of letters) {
|
||||
const drivePath = `${letter}:\\`;
|
||||
try {
|
||||
await fs.access(drivePath);
|
||||
drives.push(drivePath);
|
||||
} catch {
|
||||
// Drive doesn't exist, skip it
|
||||
}
|
||||
}
|
||||
|
||||
return drives;
|
||||
};
|
||||
|
||||
// Check if a path is safe to browse
|
||||
const isSafePath = (targetPath: string): boolean => {
|
||||
const resolved = path.resolve(targetPath);
|
||||
const normalizedHome = path.resolve(homeDir);
|
||||
|
||||
// Allow browsing within home directory
|
||||
if (resolved === normalizedHome || resolved.startsWith(normalizedHome + path.sep)) {
|
||||
return true;
|
||||
}
|
||||
|
||||
// Allow browsing already-allowed paths
|
||||
if (isPathAllowed(resolved)) {
|
||||
return true;
|
||||
}
|
||||
|
||||
// On Windows, allow drive roots for initial navigation
|
||||
if (os.platform() === "win32") {
|
||||
const driveRootMatch = /^[A-Z]:\\$/i.test(resolved);
|
||||
if (driveRootMatch) {
|
||||
return true;
|
||||
}
|
||||
}
|
||||
|
||||
// On Unix, allow root for initial navigation (but only list, not read files)
|
||||
if (os.platform() !== "win32" && resolved === "/") {
|
||||
return true;
|
||||
}
|
||||
|
||||
return false;
|
||||
};
|
||||
|
||||
// Default to home directory if no path provided
|
||||
const targetPath = dirPath ? path.resolve(dirPath) : homeDir;
|
||||
|
||||
// Security check: validate the path is safe to browse
|
||||
if (!isSafePath(targetPath)) {
|
||||
res.status(403).json({
|
||||
success: false,
|
||||
error: "Access denied: browsing is restricted to your home directory and allowed project paths",
|
||||
});
|
||||
return;
|
||||
}
|
||||
|
||||
try {
|
||||
const stats = await fs.stat(targetPath);
|
||||
|
||||
if (!stats.isDirectory()) {
|
||||
res.status(400).json({ success: false, error: "Path is not a directory" });
|
||||
return;
|
||||
}
|
||||
|
||||
// Read directory contents
|
||||
const entries = await fs.readdir(targetPath, { withFileTypes: true });
|
||||
|
||||
// Filter for directories only and exclude hidden directories
|
||||
const directories = entries
|
||||
.filter((entry) => entry.isDirectory() && !entry.name.startsWith("."))
|
||||
.map((entry) => ({
|
||||
name: entry.name,
|
||||
path: path.join(targetPath, entry.name),
|
||||
}))
|
||||
.sort((a, b) => a.name.localeCompare(b.name));
|
||||
|
||||
// Get parent directory (only if parent is also safe to browse)
|
||||
const parentPath = path.dirname(targetPath);
|
||||
const hasParent = parentPath !== targetPath && isSafePath(parentPath);
|
||||
|
||||
// Get available drives on Windows
|
||||
const drives = await detectDrives();
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
currentPath: targetPath,
|
||||
parentPath: hasParent ? parentPath : null,
|
||||
directories,
|
||||
drives,
|
||||
});
|
||||
} catch (error) {
|
||||
res.status(400).json({
|
||||
success: false,
|
||||
error: error instanceof Error ? error.message : "Failed to read directory",
|
||||
});
|
||||
}
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
return router;
|
||||
}
|
||||
apps/server/src/routes/git.ts (new file, 102 lines)
@@ -0,0 +1,102 @@
/**
 * Git routes - HTTP API for git operations (non-worktree)
 */

import { Router, type Request, type Response } from "express";
import { exec } from "child_process";
import { promisify } from "util";

const execAsync = promisify(exec);

export function createGitRoutes(): Router {
  const router = Router();

  // Get diffs for the main project
  router.post("/diffs", async (req: Request, res: Response) => {
    try {
      const { projectPath } = req.body as { projectPath: string };

      if (!projectPath) {
        res.status(400).json({ success: false, error: "projectPath required" });
        return;
      }

      try {
        const { stdout: diff } = await execAsync("git diff HEAD", {
          cwd: projectPath,
          maxBuffer: 10 * 1024 * 1024,
        });
        const { stdout: status } = await execAsync("git status --porcelain", {
          cwd: projectPath,
        });

        const files = status
          .split("\n")
          .filter(Boolean)
          .map((line) => {
            const statusChar = line[0];
            const filePath = line.slice(3);
            const statusMap: Record<string, string> = {
              M: "Modified",
              A: "Added",
              D: "Deleted",
              R: "Renamed",
              C: "Copied",
              U: "Updated",
              "?": "Untracked",
            };
            return {
              status: statusChar,
              path: filePath,
              statusText: statusMap[statusChar] || "Unknown",
            };
          });

        res.json({
          success: true,
          diff,
          files,
          hasChanges: files.length > 0,
        });
      } catch {
        res.json({ success: true, diff: "", files: [], hasChanges: false });
      }
    } catch (error) {
      const message = error instanceof Error ? error.message : "Unknown error";
      res.status(500).json({ success: false, error: message });
    }
  });

  // Get diff for a specific file
  router.post("/file-diff", async (req: Request, res: Response) => {
    try {
      const { projectPath, filePath } = req.body as {
        projectPath: string;
        filePath: string;
      };

      if (!projectPath || !filePath) {
        res
          .status(400)
          .json({ success: false, error: "projectPath and filePath required" });
        return;
      }

      try {
        const { stdout: diff } = await execAsync(`git diff HEAD -- "${filePath}"`, {
          cwd: projectPath,
          maxBuffer: 10 * 1024 * 1024,
        });

        res.json({ success: true, diff, filePath });
      } catch {
        res.json({ success: true, diff: "", filePath });
      }
    } catch (error) {
      const message = error instanceof Error ? error.message : "Unknown error";
      res.status(500).json({ success: false, error: message });
    }
  });

  return router;
}
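A sketch of wiring these git routes into the Express app and requesting the working-tree diff; the /api/git prefix and port are assumptions for illustration, since the mount point is defined elsewhere.

// Assumed wiring (the actual mount point is not part of this diff):
// app.use("/api/git", createGitRoutes());

async function fetchProjectDiff(projectPath: string) {
  const response = await fetch("http://localhost:3001/api/git/diffs", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ projectPath }),
  });
  return (await response.json()) as {
    success: boolean;
    diff: string;
    files: Array<{ status: string; path: string; statusText: string }>;
    hasChanges: boolean;
  };
}
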
apps/server/src/routes/health.ts (new file, 39 lines)
@@ -0,0 +1,39 @@
/**
 * Health check routes
 */

import { Router } from "express";
import { getAuthStatus } from "../lib/auth.js";

export function createHealthRoutes(): Router {
  const router = Router();

  // Basic health check
  router.get("/", (_req, res) => {
    res.json({
      status: "ok",
      timestamp: new Date().toISOString(),
      version: process.env.npm_package_version || "0.1.0",
    });
  });

  // Detailed health check
  router.get("/detailed", (_req, res) => {
    res.json({
      status: "ok",
      timestamp: new Date().toISOString(),
      version: process.env.npm_package_version || "0.1.0",
      uptime: process.uptime(),
      memory: process.memoryUsage(),
      dataDir: process.env.DATA_DIR || "./data",
      auth: getAuthStatus(),
      env: {
        nodeVersion: process.version,
        platform: process.platform,
        arch: process.arch,
      },
    });
  });

  return router;
}
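A quick sketch of polling the detailed health endpoint from a client, assuming the router is mounted at /api/health (the mount path is not shown in this diff).

// Assumes app.use("/api/health", createHealthRoutes()) somewhere in server setup.
async function checkServerHealth(): Promise<boolean> {
  try {
    const response = await fetch("http://localhost:3001/api/health/detailed");
    const health = (await response.json()) as { status: string; uptime: number };
    return health.status === "ok";
  } catch {
    return false; // server unreachable
  }
}
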
apps/server/src/routes/models.ts (new file, 128 lines)
@@ -0,0 +1,128 @@
|
||||
/**
|
||||
* Models routes - HTTP API for model providers and availability
|
||||
*/
|
||||
|
||||
import { Router, type Request, type Response } from "express";
|
||||
|
||||
interface ModelDefinition {
|
||||
id: string;
|
||||
name: string;
|
||||
provider: string;
|
||||
contextWindow: number;
|
||||
maxOutputTokens: number;
|
||||
supportsVision: boolean;
|
||||
supportsTools: boolean;
|
||||
}
|
||||
|
||||
interface ProviderStatus {
|
||||
available: boolean;
|
||||
hasApiKey: boolean;
|
||||
error?: string;
|
||||
}
|
||||
|
||||
export function createModelsRoutes(): Router {
|
||||
const router = Router();
|
||||
|
||||
// Get available models
|
||||
router.get("/available", async (_req: Request, res: Response) => {
|
||||
try {
|
||||
const models: ModelDefinition[] = [
|
||||
{
|
||||
id: "claude-opus-4-5-20251101",
|
||||
name: "Claude Opus 4.5",
|
||||
provider: "anthropic",
|
||||
contextWindow: 200000,
|
||||
maxOutputTokens: 16384,
|
||||
supportsVision: true,
|
||||
supportsTools: true,
|
||||
},
|
||||
{
|
||||
id: "claude-sonnet-4-20250514",
|
||||
name: "Claude Sonnet 4",
|
||||
provider: "anthropic",
|
||||
contextWindow: 200000,
|
||||
maxOutputTokens: 16384,
|
||||
supportsVision: true,
|
||||
supportsTools: true,
|
||||
},
|
||||
{
|
||||
id: "claude-3-5-sonnet-20241022",
|
||||
name: "Claude 3.5 Sonnet",
|
||||
provider: "anthropic",
|
||||
contextWindow: 200000,
|
||||
maxOutputTokens: 8192,
|
||||
supportsVision: true,
|
||||
supportsTools: true,
|
||||
},
|
||||
{
|
||||
id: "claude-3-5-haiku-20241022",
|
||||
name: "Claude 3.5 Haiku",
|
||||
provider: "anthropic",
|
||||
contextWindow: 200000,
|
||||
maxOutputTokens: 8192,
|
||||
supportsVision: true,
|
||||
supportsTools: true,
|
||||
},
|
||||
{
|
||||
id: "gpt-4o",
|
||||
name: "GPT-4o",
|
||||
provider: "openai",
|
||||
contextWindow: 128000,
|
||||
maxOutputTokens: 16384,
|
||||
supportsVision: true,
|
||||
supportsTools: true,
|
||||
},
|
||||
{
|
||||
id: "gpt-4o-mini",
|
||||
name: "GPT-4o Mini",
|
||||
provider: "openai",
|
||||
contextWindow: 128000,
|
||||
maxOutputTokens: 16384,
|
||||
supportsVision: true,
|
||||
supportsTools: true,
|
||||
},
|
||||
{
|
||||
id: "o1",
|
||||
name: "o1",
|
||||
provider: "openai",
|
||||
contextWindow: 200000,
|
||||
maxOutputTokens: 100000,
|
||||
supportsVision: true,
|
||||
supportsTools: false,
|
||||
},
|
||||
];
|
||||
|
||||
res.json({ success: true, models });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Check provider status
|
||||
router.get("/providers", async (_req: Request, res: Response) => {
|
||||
try {
|
||||
const providers: Record<string, ProviderStatus> = {
|
||||
anthropic: {
|
||||
available: !!process.env.ANTHROPIC_API_KEY,
|
||||
hasApiKey: !!process.env.ANTHROPIC_API_KEY,
|
||||
},
|
||||
openai: {
|
||||
available: !!process.env.OPENAI_API_KEY,
|
||||
hasApiKey: !!process.env.OPENAI_API_KEY,
|
||||
},
|
||||
google: {
|
||||
available: !!process.env.GOOGLE_API_KEY,
|
||||
hasApiKey: !!process.env.GOOGLE_API_KEY,
|
||||
},
|
||||
};
|
||||
|
||||
res.json({ success: true, providers });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
return router;
|
||||
}
|
||||
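A hedged sketch of consuming the models endpoints above from a client, for example to list only vision-capable models; /api/models is an assumed mount prefix.

interface ModelInfo {
  id: string;
  name: string;
  provider: string;
  supportsVision: boolean;
  supportsTools: boolean;
}

// Fetches the static model list served by GET /available and keeps vision-capable entries.
async function listVisionModels(): Promise<ModelInfo[]> {
  const response = await fetch("http://localhost:3001/api/models/available");
  const { models } = (await response.json()) as { success: boolean; models: ModelInfo[] };
  return models.filter((m) => m.supportsVision);
}
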
apps/server/src/routes/running-agents.ts (new file, 70 lines)
@@ -0,0 +1,70 @@
/**
 * Running Agents routes - HTTP API for tracking active agent executions
 */

import { Router, type Request, type Response } from "express";
import path from "path";

interface RunningAgent {
  featureId: string;
  projectPath: string;
  projectName: string;
  isAutoMode: boolean;
}

// In-memory tracking of running agents (shared with auto-mode service via reference)
const runningAgentsMap = new Map<string, RunningAgent>();
let autoLoopRunning = false;

export function createRunningAgentsRoutes(): Router {
  const router = Router();

  // Get all running agents
  router.get("/", async (_req: Request, res: Response) => {
    try {
      const runningAgents = Array.from(runningAgentsMap.values());

      res.json({
        success: true,
        runningAgents,
        totalCount: runningAgents.length,
        autoLoopRunning,
      });
    } catch (error) {
      const message = error instanceof Error ? error.message : "Unknown error";
      res.status(500).json({ success: false, error: message });
    }
  });

  return router;
}

// Export functions to update running agents from other services
export function registerRunningAgent(
  featureId: string,
  projectPath: string,
  isAutoMode: boolean
): void {
  runningAgentsMap.set(featureId, {
    featureId,
    projectPath,
    projectName: path.basename(projectPath),
    isAutoMode,
  });
}

export function unregisterRunningAgent(featureId: string): void {
  runningAgentsMap.delete(featureId);
}

export function setAutoLoopRunning(running: boolean): void {
  autoLoopRunning = running;
}

export function getRunningAgentsCount(): number {
  return runningAgentsMap.size;
}

export function isAgentRunning(featureId: string): boolean {
  return runningAgentsMap.has(featureId);
}
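A sketch of how another service might use the exported registry helpers around an agent run. The calling service and its runFeatureAgent function are hypothetical; only the imported helpers come from the file above.

// Hypothetical caller inside an agent-execution service (relative import path assumed).
import {
  registerRunningAgent,
  unregisterRunningAgent,
  isAgentRunning,
} from "../routes/running-agents.js";

async function runWithTracking(
  featureId: string,
  projectPath: string,
  runFeatureAgent: () => Promise<void> // hypothetical agent runner
): Promise<void> {
  if (isAgentRunning(featureId)) return; // avoid double execution for the same feature
  registerRunningAgent(featureId, projectPath, /* isAutoMode */ false);
  try {
    await runFeatureAgent();
  } finally {
    unregisterRunningAgent(featureId); // always clear the registry entry
  }
}
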
apps/server/src/routes/sessions.ts (new file, 149 lines)
@@ -0,0 +1,149 @@
|
||||
/**
|
||||
* Sessions routes - HTTP API for session management
|
||||
*/
|
||||
|
||||
import { Router, type Request, type Response } from "express";
|
||||
import { AgentService } from "../services/agent-service.js";
|
||||
|
||||
export function createSessionsRoutes(agentService: AgentService): Router {
|
||||
const router = Router();
|
||||
|
||||
// List all sessions
|
||||
router.get("/", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const includeArchived = req.query.includeArchived === "true";
|
||||
const sessionsRaw = await agentService.listSessions(includeArchived);
|
||||
|
||||
// Transform to match frontend SessionListItem interface
|
||||
const sessions = await Promise.all(
|
||||
sessionsRaw.map(async (s) => {
|
||||
const messages = await agentService.loadSession(s.id);
|
||||
const lastMessage = messages[messages.length - 1];
|
||||
const preview = lastMessage?.content?.slice(0, 100) || "";
|
||||
|
||||
return {
|
||||
id: s.id,
|
||||
name: s.name,
|
||||
projectPath: s.projectPath || s.workingDirectory,
|
||||
workingDirectory: s.workingDirectory,
|
||||
createdAt: s.createdAt,
|
||||
updatedAt: s.updatedAt,
|
||||
isArchived: s.archived || false,
|
||||
tags: s.tags || [],
|
||||
messageCount: messages.length,
|
||||
preview,
|
||||
};
|
||||
})
|
||||
);
|
||||
|
||||
res.json({ success: true, sessions });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Create a new session
|
||||
router.post("/", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { name, projectPath, workingDirectory } = req.body as {
|
||||
name: string;
|
||||
projectPath?: string;
|
||||
workingDirectory?: string;
|
||||
};
|
||||
|
||||
if (!name) {
|
||||
res.status(400).json({ success: false, error: "name is required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const session = await agentService.createSession(
|
||||
name,
|
||||
projectPath,
|
||||
workingDirectory
|
||||
);
|
||||
res.json({ success: true, session });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Update a session
|
||||
router.put("/:sessionId", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { sessionId } = req.params;
|
||||
const { name, tags } = req.body as {
|
||||
name?: string;
|
||||
tags?: string[];
|
||||
};
|
||||
|
||||
const session = await agentService.updateSession(sessionId, { name, tags });
|
||||
if (!session) {
|
||||
res.status(404).json({ success: false, error: "Session not found" });
|
||||
return;
|
||||
}
|
||||
|
||||
res.json({ success: true, session });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Archive a session
|
||||
router.post("/:sessionId/archive", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { sessionId } = req.params;
|
||||
const success = await agentService.archiveSession(sessionId);
|
||||
|
||||
if (!success) {
|
||||
res.status(404).json({ success: false, error: "Session not found" });
|
||||
return;
|
||||
}
|
||||
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Unarchive a session
|
||||
router.post("/:sessionId/unarchive", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { sessionId } = req.params;
|
||||
const success = await agentService.unarchiveSession(sessionId);
|
||||
|
||||
if (!success) {
|
||||
res.status(404).json({ success: false, error: "Session not found" });
|
||||
return;
|
||||
}
|
||||
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Delete a session
|
||||
router.delete("/:sessionId", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { sessionId } = req.params;
|
||||
const success = await agentService.deleteSession(sessionId);
|
||||
|
||||
if (!success) {
|
||||
res.status(404).json({ success: false, error: "Session not found" });
|
||||
return;
|
||||
}
|
||||
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
return router;
|
||||
}
|
||||
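A minimal sketch of the session lifecycle over this API (create, then archive); /api/sessions is an assumed mount prefix, and the created session object is assumed to carry an id.

// Creates a session and immediately archives it, illustrating the route shapes above.
async function createAndArchiveSession(name: string, projectPath: string): Promise<void> {
  const base = "http://localhost:3001/api/sessions"; // assumed mount path
  const createRes = await fetch(base, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name, projectPath }),
  });
  const { session } = (await createRes.json()) as { success: boolean; session: { id: string } };

  await fetch(`${base}/${session.id}/archive`, { method: "POST" });
}
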
apps/server/src/routes/setup.ts (new file, 521 lines)
@@ -0,0 +1,521 @@
|
||||
/**
|
||||
* Setup routes - HTTP API for CLI detection, API keys, and platform info
|
||||
*/
|
||||
|
||||
import { Router, type Request, type Response } from "express";
|
||||
import { exec } from "child_process";
|
||||
import { promisify } from "util";
|
||||
import os from "os";
|
||||
import path from "path";
|
||||
import fs from "fs/promises";
|
||||
|
||||
const execAsync = promisify(exec);
|
||||
|
||||
// Storage for API keys (in-memory cache)
|
||||
const apiKeys: Record<string, string> = {};
|
||||
|
||||
// Helper to persist API keys to .env file
|
||||
async function persistApiKeyToEnv(key: string, value: string): Promise<void> {
|
||||
const envPath = path.join(process.cwd(), ".env");
|
||||
|
||||
try {
|
||||
let envContent = "";
|
||||
try {
|
||||
envContent = await fs.readFile(envPath, "utf-8");
|
||||
} catch {
|
||||
// .env file doesn't exist, we'll create it
|
||||
}
|
||||
|
||||
// Parse existing env content
|
||||
const lines = envContent.split("\n");
|
||||
const keyRegex = new RegExp(`^${key}=`);
|
||||
let found = false;
|
||||
const newLines = lines.map((line) => {
|
||||
if (keyRegex.test(line)) {
|
||||
found = true;
|
||||
return `${key}=${value}`;
|
||||
}
|
||||
return line;
|
||||
});
|
||||
|
||||
if (!found) {
|
||||
// Add the key at the end
|
||||
newLines.push(`${key}=${value}`);
|
||||
}
|
||||
|
||||
await fs.writeFile(envPath, newLines.join("\n"));
|
||||
console.log(`[Setup] Persisted ${key} to .env file`);
|
||||
} catch (error) {
|
||||
console.error(`[Setup] Failed to persist ${key} to .env:`, error);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
export function createSetupRoutes(): Router {
|
||||
const router = Router();
|
||||
|
||||
// Get Claude CLI status
|
||||
router.get("/claude-status", async (_req: Request, res: Response) => {
|
||||
try {
|
||||
let installed = false;
|
||||
let version = "";
|
||||
let cliPath = "";
|
||||
let method = "none";
|
||||
|
||||
// Try to find Claude CLI
|
||||
try {
|
||||
const { stdout } = await execAsync("which claude || where claude 2>/dev/null");
|
||||
cliPath = stdout.trim();
|
||||
installed = true;
|
||||
method = "path";
|
||||
|
||||
// Get version
|
||||
try {
|
||||
const { stdout: versionOut } = await execAsync("claude --version");
|
||||
version = versionOut.trim();
|
||||
} catch {
|
||||
// Version command might not be available
|
||||
}
|
||||
} catch {
|
||||
// Not in PATH, try common locations
|
||||
const commonPaths = [
|
||||
path.join(os.homedir(), ".local", "bin", "claude"),
|
||||
path.join(os.homedir(), ".claude", "local", "claude"),
|
||||
"/usr/local/bin/claude",
|
||||
path.join(os.homedir(), ".npm-global", "bin", "claude"),
|
||||
];
|
||||
|
||||
for (const p of commonPaths) {
|
||||
try {
|
||||
await fs.access(p);
|
||||
cliPath = p;
|
||||
installed = true;
|
||||
method = "local";
|
||||
|
||||
// Get version from this path
|
||||
try {
|
||||
const { stdout: versionOut } = await execAsync(`"${p}" --version`);
|
||||
version = versionOut.trim();
|
||||
} catch {
|
||||
// Version command might not be available
|
||||
}
|
||||
break;
|
||||
} catch {
|
||||
// Not found at this path
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Check authentication - detect all possible auth methods
|
||||
// Note: apiKeys.anthropic_oauth_token stores OAuth tokens from subscription auth
|
||||
// apiKeys.anthropic stores direct API keys for pay-per-use
|
||||
let auth = {
|
||||
authenticated: false,
|
||||
method: "none" as string,
|
||||
hasCredentialsFile: false,
|
||||
hasToken: false,
|
||||
hasStoredOAuthToken: !!apiKeys.anthropic_oauth_token,
|
||||
hasStoredApiKey: !!apiKeys.anthropic,
|
||||
hasEnvApiKey: !!process.env.ANTHROPIC_API_KEY,
|
||||
hasEnvOAuthToken: !!process.env.CLAUDE_CODE_OAUTH_TOKEN,
|
||||
// Additional fields for detailed status
|
||||
oauthTokenValid: false,
|
||||
apiKeyValid: false,
|
||||
hasCliAuth: false,
|
||||
hasRecentActivity: false,
|
||||
};
|
||||
|
||||
const claudeDir = path.join(os.homedir(), ".claude");
|
||||
|
||||
// Check for recent Claude CLI activity - indicates working authentication
|
||||
// The stats-cache.json file is only populated when the CLI is working properly
|
||||
const statsCachePath = path.join(claudeDir, "stats-cache.json");
|
||||
try {
|
||||
const statsContent = await fs.readFile(statsCachePath, "utf-8");
|
||||
const stats = JSON.parse(statsContent);
|
||||
|
||||
// Check if there's any activity (which means the CLI is authenticated and working)
|
||||
if (stats.dailyActivity && stats.dailyActivity.length > 0) {
|
||||
auth.hasRecentActivity = true;
|
||||
auth.hasCliAuth = true;
|
||||
auth.authenticated = true;
|
||||
auth.method = "cli_authenticated";
|
||||
}
|
||||
} catch {
|
||||
// Stats file doesn't exist or is invalid
|
||||
}
|
||||
|
||||
// Check for settings.json - indicates CLI has been set up
|
||||
const settingsPath = path.join(claudeDir, "settings.json");
|
||||
try {
|
||||
await fs.access(settingsPath);
|
||||
// If settings exist but no activity, CLI might be set up but not authenticated
|
||||
if (!auth.hasCliAuth) {
|
||||
// Try to check for other indicators of auth
|
||||
const sessionsDir = path.join(claudeDir, "projects");
|
||||
try {
|
||||
const sessions = await fs.readdir(sessionsDir);
|
||||
if (sessions.length > 0) {
|
||||
auth.hasCliAuth = true;
|
||||
auth.authenticated = true;
|
||||
auth.method = "cli_authenticated";
|
||||
}
|
||||
} catch {
|
||||
// Sessions directory doesn't exist
|
||||
}
|
||||
}
|
||||
} catch {
|
||||
// Settings file doesn't exist
|
||||
}
|
||||
|
||||
// Check for credentials file (OAuth tokens from claude login) - legacy/alternative auth
|
||||
const credentialsPath = path.join(claudeDir, "credentials.json");
|
||||
try {
|
||||
const credentialsContent = await fs.readFile(credentialsPath, "utf-8");
|
||||
const credentials = JSON.parse(credentialsContent);
|
||||
auth.hasCredentialsFile = true;
|
||||
|
||||
// Check what type of token is in credentials
|
||||
if (credentials.oauth_token || credentials.access_token) {
|
||||
auth.hasStoredOAuthToken = true;
|
||||
auth.oauthTokenValid = true;
|
||||
auth.authenticated = true;
|
||||
auth.method = "oauth_token"; // Stored OAuth token from credentials file
|
||||
} else if (credentials.api_key) {
|
||||
auth.apiKeyValid = true;
|
||||
auth.authenticated = true;
|
||||
auth.method = "api_key"; // Stored API key in credentials file
|
||||
}
|
||||
} catch {
|
||||
// No credentials file or invalid format
|
||||
}
|
||||
|
||||
// Environment variables override stored credentials (higher priority)
|
||||
if (auth.hasEnvOAuthToken) {
|
||||
auth.authenticated = true;
|
||||
auth.oauthTokenValid = true;
|
||||
auth.method = "oauth_token_env"; // OAuth token from CLAUDE_CODE_OAUTH_TOKEN env var
|
||||
} else if (auth.hasEnvApiKey) {
|
||||
auth.authenticated = true;
|
||||
auth.apiKeyValid = true;
|
||||
auth.method = "api_key_env"; // API key from ANTHROPIC_API_KEY env var
|
||||
}
|
||||
|
||||
// In-memory stored OAuth token (from setup wizard - subscription auth)
|
||||
if (!auth.authenticated && apiKeys.anthropic_oauth_token) {
|
||||
auth.authenticated = true;
|
||||
auth.oauthTokenValid = true;
|
||||
auth.method = "oauth_token"; // Stored OAuth token from setup wizard
|
||||
}
|
||||
|
||||
// In-memory stored API key (from settings UI - pay-per-use)
|
||||
if (!auth.authenticated && apiKeys.anthropic) {
|
||||
auth.authenticated = true;
|
||||
auth.apiKeyValid = true;
|
||||
auth.method = "api_key"; // Manually stored API key
|
||||
}
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
status: installed ? "installed" : "not_installed",
|
||||
installed,
|
||||
method,
|
||||
version,
|
||||
path: cliPath,
|
||||
auth,
|
||||
});
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Get Codex CLI status
|
||||
router.get("/codex-status", async (_req: Request, res: Response) => {
|
||||
try {
|
||||
let installed = false;
|
||||
let version = "";
|
||||
let cliPath = "";
|
||||
let method = "none";
|
||||
|
||||
// Try to find Codex CLI
|
||||
try {
|
||||
const { stdout } = await execAsync("which codex || where codex 2>/dev/null");
|
||||
cliPath = stdout.trim();
|
||||
installed = true;
|
||||
method = "path";
|
||||
|
||||
try {
|
||||
const { stdout: versionOut } = await execAsync("codex --version");
|
||||
version = versionOut.trim();
|
||||
} catch {
|
||||
// Version command might not be available
|
||||
}
|
||||
} catch {
|
||||
// Not found
|
||||
}
|
||||
|
||||
// Check for OpenAI/Codex authentication
|
||||
let auth = {
|
||||
authenticated: false,
|
||||
method: "none" as string,
|
||||
hasAuthFile: false,
|
||||
hasEnvKey: !!process.env.OPENAI_API_KEY,
|
||||
hasStoredApiKey: !!apiKeys.openai,
|
||||
hasEnvApiKey: !!process.env.OPENAI_API_KEY,
|
||||
// Additional fields for subscription/account detection
|
||||
hasSubscription: false,
|
||||
cliLoggedIn: false,
|
||||
};
|
||||
|
||||
// Check for OpenAI CLI auth file (~/.codex/auth.json or similar)
|
||||
const codexAuthPaths = [
|
||||
path.join(os.homedir(), ".codex", "auth.json"),
|
||||
path.join(os.homedir(), ".openai", "credentials"),
|
||||
path.join(os.homedir(), ".config", "openai", "credentials.json"),
|
||||
];
|
||||
|
||||
for (const authPath of codexAuthPaths) {
|
||||
try {
|
||||
const authContent = await fs.readFile(authPath, "utf-8");
|
||||
const authData = JSON.parse(authContent);
|
||||
auth.hasAuthFile = true;
|
||||
|
||||
// Check for subscription/tokens
|
||||
if (authData.subscription || authData.plan || authData.account_type) {
|
||||
auth.hasSubscription = true;
|
||||
auth.authenticated = true;
|
||||
auth.method = "subscription"; // Codex subscription (Plus/Team)
|
||||
} else if (authData.access_token || authData.api_key) {
|
||||
auth.cliLoggedIn = true;
|
||||
auth.authenticated = true;
|
||||
auth.method = "cli_verified"; // CLI logged in with account
|
||||
}
|
||||
break;
|
||||
} catch {
|
||||
// Auth file not found at this path
|
||||
}
|
||||
}
|
||||
|
||||
// Environment variable has highest priority
|
||||
if (auth.hasEnvApiKey) {
|
||||
auth.authenticated = true;
|
||||
auth.method = "env"; // OPENAI_API_KEY environment variable
|
||||
}
|
||||
|
||||
// In-memory stored API key (from settings UI)
|
||||
if (!auth.authenticated && apiKeys.openai) {
|
||||
auth.authenticated = true;
|
||||
auth.method = "api_key"; // Manually stored API key
|
||||
}
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
status: installed ? "installed" : "not_installed",
|
||||
method,
|
||||
version,
|
||||
path: cliPath,
|
||||
auth,
|
||||
});
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Install Claude CLI
|
||||
router.post("/install-claude", async (_req: Request, res: Response) => {
|
||||
try {
|
||||
// In web mode, we can't install CLIs directly
|
||||
// Return instructions instead
|
||||
res.json({
|
||||
success: false,
|
||||
error:
|
||||
"CLI installation requires terminal access. Please install manually using: npm install -g @anthropic-ai/claude-code",
|
||||
});
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Install Codex CLI
|
||||
router.post("/install-codex", async (_req: Request, res: Response) => {
|
||||
try {
|
||||
res.json({
|
||||
success: false,
|
||||
error:
|
||||
"CLI installation requires terminal access. Please install manually using: npm install -g @openai/codex",
|
||||
});
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Auth Claude
|
||||
router.post("/auth-claude", async (_req: Request, res: Response) => {
|
||||
try {
|
||||
res.json({
|
||||
success: true,
|
||||
requiresManualAuth: true,
|
||||
command: "claude login",
|
||||
message: "Please run 'claude login' in your terminal to authenticate",
|
||||
});
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Auth Codex
|
||||
router.post("/auth-codex", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { apiKey } = req.body as { apiKey?: string };
|
||||
|
||||
if (apiKey) {
|
||||
apiKeys.openai = apiKey;
|
||||
process.env.OPENAI_API_KEY = apiKey;
|
||||
res.json({ success: true });
|
||||
} else {
|
||||
res.json({
|
||||
success: true,
|
||||
requiresManualAuth: true,
|
||||
command: "codex auth login",
|
||||
});
|
||||
}
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Store API key
|
||||
router.post("/store-api-key", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { provider, apiKey } = req.body as { provider: string; apiKey: string };
|
||||
|
||||
if (!provider || !apiKey) {
|
||||
res.status(400).json({ success: false, error: "provider and apiKey required" });
|
||||
return;
|
||||
}
|
||||
|
||||
apiKeys[provider] = apiKey;
|
||||
|
||||
// Also set as environment variable and persist to .env
|
||||
// IMPORTANT: OAuth tokens and API keys must be stored separately
|
||||
// - OAuth tokens (subscription auth) -> CLAUDE_CODE_OAUTH_TOKEN
|
||||
// - API keys (pay-per-use) -> ANTHROPIC_API_KEY
|
||||
if (provider === "anthropic_oauth_token") {
|
||||
// OAuth token from claude setup-token (subscription-based auth)
|
||||
process.env.CLAUDE_CODE_OAUTH_TOKEN = apiKey;
|
||||
await persistApiKeyToEnv("CLAUDE_CODE_OAUTH_TOKEN", apiKey);
|
||||
console.log("[Setup] Stored OAuth token as CLAUDE_CODE_OAUTH_TOKEN");
|
||||
} else if (provider === "anthropic") {
|
||||
// Direct API key (pay-per-use)
|
||||
process.env.ANTHROPIC_API_KEY = apiKey;
|
||||
await persistApiKeyToEnv("ANTHROPIC_API_KEY", apiKey);
|
||||
console.log("[Setup] Stored API key as ANTHROPIC_API_KEY");
|
||||
} else if (provider === "openai") {
|
||||
process.env.OPENAI_API_KEY = apiKey;
|
||||
await persistApiKeyToEnv("OPENAI_API_KEY", apiKey);
|
||||
} else if (provider === "google") {
|
||||
process.env.GOOGLE_API_KEY = apiKey;
|
||||
await persistApiKeyToEnv("GOOGLE_API_KEY", apiKey);
|
||||
}
|
||||
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Get API keys status
|
||||
router.get("/api-keys", async (_req: Request, res: Response) => {
|
||||
try {
|
||||
res.json({
|
||||
success: true,
|
||||
hasAnthropicKey: !!apiKeys.anthropic || !!process.env.ANTHROPIC_API_KEY,
|
||||
hasOpenAIKey: !!apiKeys.openai || !!process.env.OPENAI_API_KEY,
|
||||
hasGoogleKey: !!apiKeys.google || !!process.env.GOOGLE_API_KEY,
|
||||
});
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Configure Codex MCP
|
||||
router.post("/configure-codex-mcp", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath } = req.body as { projectPath: string };
|
||||
|
||||
if (!projectPath) {
|
||||
res.status(400).json({ success: false, error: "projectPath required" });
|
||||
return;
|
||||
}
|
||||
|
||||
// Create .codex directory and config
|
||||
const codexDir = path.join(projectPath, ".codex");
|
||||
await fs.mkdir(codexDir, { recursive: true });
|
||||
|
||||
const configPath = path.join(codexDir, "config.toml");
|
||||
const config = `# Codex configuration
|
||||
[mcp]
|
||||
enabled = true
|
||||
`;
|
||||
await fs.writeFile(configPath, config);
|
||||
|
||||
res.json({ success: true, configPath });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Get platform info
|
||||
router.get("/platform", async (_req: Request, res: Response) => {
|
||||
try {
|
||||
const platform = os.platform();
|
||||
res.json({
|
||||
success: true,
|
||||
platform,
|
||||
arch: os.arch(),
|
||||
homeDir: os.homedir(),
|
||||
isWindows: platform === "win32",
|
||||
isMac: platform === "darwin",
|
||||
isLinux: platform === "linux",
|
||||
});
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Test OpenAI connection
|
||||
router.post("/test-openai", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { apiKey } = req.body as { apiKey?: string };
|
||||
const key = apiKey || apiKeys.openai || process.env.OPENAI_API_KEY;
|
||||
|
||||
if (!key) {
|
||||
res.json({ success: false, error: "No OpenAI API key provided" });
|
||||
return;
|
||||
}
|
||||
|
||||
// Simple test - just verify the key format
|
||||
if (!key.startsWith("sk-")) {
|
||||
res.json({ success: false, error: "Invalid OpenAI API key format" });
|
||||
return;
|
||||
}
|
||||
|
||||
res.json({ success: true, message: "API key format is valid" });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
return router;
|
||||
}
|
||||
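A sketch of how a settings UI might persist an Anthropic API key through the store-api-key route above; /api/setup is an assumed mount prefix.

// Stores a pay-per-use Anthropic key. Per the route's handling, subscription OAuth
// tokens would instead be sent with provider: "anthropic_oauth_token".
async function storeAnthropicKey(apiKey: string): Promise<boolean> {
  const response = await fetch("http://localhost:3001/api/setup/store-api-key", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ provider: "anthropic", apiKey }),
  });
  const result = (await response.json()) as { success: boolean; error?: string };
  return result.success;
}
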
apps/server/src/routes/spec-regeneration.ts (new file, 597 lines)
@@ -0,0 +1,597 @@
|
||||
/**
|
||||
* Spec Regeneration routes - HTTP API for AI-powered spec generation
|
||||
*/
|
||||
|
||||
import { Router, type Request, type Response } from "express";
|
||||
import { query, type Options } from "@anthropic-ai/claude-agent-sdk";
|
||||
import path from "path";
|
||||
import fs from "fs/promises";
|
||||
import type { EventEmitter } from "../lib/events.js";
|
||||
|
||||
let isRunning = false;
|
||||
let currentAbortController: AbortController | null = null;
|
||||
|
||||
// Helper to log authentication status
|
||||
function logAuthStatus(context: string): void {
|
||||
const hasOAuthToken = !!process.env.CLAUDE_CODE_OAUTH_TOKEN;
|
||||
const hasApiKey = !!process.env.ANTHROPIC_API_KEY;
|
||||
|
||||
console.log(`[SpecRegeneration] ${context} - Auth Status:`);
|
||||
console.log(`[SpecRegeneration] CLAUDE_CODE_OAUTH_TOKEN: ${hasOAuthToken ? 'SET (' + process.env.CLAUDE_CODE_OAUTH_TOKEN?.substring(0, 20) + '...)' : 'NOT SET'}`);
|
||||
console.log(`[SpecRegeneration] ANTHROPIC_API_KEY: ${hasApiKey ? 'SET (' + process.env.ANTHROPIC_API_KEY?.substring(0, 20) + '...)' : 'NOT SET'}`);
|
||||
|
||||
if (!hasOAuthToken && !hasApiKey) {
|
||||
console.error(`[SpecRegeneration] ⚠️ WARNING: No authentication configured! SDK will fail.`);
|
||||
}
|
||||
}
|
||||
|
||||
export function createSpecRegenerationRoutes(events: EventEmitter): Router {
|
||||
const router = Router();
|
||||
|
||||
// Create project spec from overview
|
||||
router.post("/create", async (req: Request, res: Response) => {
|
||||
console.log("[SpecRegeneration] ========== /create endpoint called ==========");
|
||||
console.log("[SpecRegeneration] Request body:", JSON.stringify(req.body, null, 2));
|
||||
|
||||
try {
|
||||
const { projectPath, projectOverview, generateFeatures } = req.body as {
|
||||
projectPath: string;
|
||||
projectOverview: string;
|
||||
generateFeatures?: boolean;
|
||||
};
|
||||
|
||||
console.log(`[SpecRegeneration] Parsed params:`);
|
||||
console.log(`[SpecRegeneration] projectPath: ${projectPath}`);
|
||||
console.log(`[SpecRegeneration] projectOverview length: ${projectOverview?.length || 0} chars`);
|
||||
console.log(`[SpecRegeneration] generateFeatures: ${generateFeatures}`);
|
||||
|
||||
if (!projectPath || !projectOverview) {
|
||||
console.error("[SpecRegeneration] Missing required parameters");
|
||||
res.status(400).json({
|
||||
success: false,
|
||||
error: "projectPath and projectOverview required",
|
||||
});
|
||||
return;
|
||||
}
|
||||
|
||||
if (isRunning) {
|
||||
console.warn("[SpecRegeneration] Generation already running, rejecting request");
|
||||
res.json({ success: false, error: "Spec generation already running" });
|
||||
return;
|
||||
}
|
||||
|
||||
logAuthStatus("Before starting generation");
|
||||
|
||||
isRunning = true;
|
||||
currentAbortController = new AbortController();
|
||||
console.log("[SpecRegeneration] Starting background generation task...");
|
||||
|
||||
// Start generation in background
|
||||
generateSpec(
|
||||
projectPath,
|
||||
projectOverview,
|
||||
events,
|
||||
currentAbortController,
|
||||
generateFeatures
|
||||
)
|
||||
.catch((error) => {
|
||||
console.error("[SpecRegeneration] ❌ Generation failed with error:");
|
||||
console.error("[SpecRegeneration] Error name:", error?.name);
|
||||
console.error("[SpecRegeneration] Error message:", error?.message);
|
||||
console.error("[SpecRegeneration] Error stack:", error?.stack);
|
||||
console.error("[SpecRegeneration] Full error object:", JSON.stringify(error, Object.getOwnPropertyNames(error), 2));
|
||||
events.emit("spec-regeneration:event", {
|
||||
type: "spec_error",
|
||||
error: error.message || String(error),
|
||||
});
|
||||
})
|
||||
.finally(() => {
|
||||
console.log("[SpecRegeneration] Generation task finished (success or error)");
|
||||
isRunning = false;
|
||||
currentAbortController = null;
|
||||
});
|
||||
|
||||
console.log("[SpecRegeneration] Returning success response (generation running in background)");
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
console.error("[SpecRegeneration] ❌ Route handler exception:");
|
||||
console.error("[SpecRegeneration] Error:", error);
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Generate from project definition
|
||||
router.post("/generate", async (req: Request, res: Response) => {
|
||||
console.log("[SpecRegeneration] ========== /generate endpoint called ==========");
|
||||
console.log("[SpecRegeneration] Request body:", JSON.stringify(req.body, null, 2));
|
||||
|
||||
try {
|
||||
const { projectPath, projectDefinition } = req.body as {
|
||||
projectPath: string;
|
||||
projectDefinition: string;
|
||||
};
|
||||
|
||||
console.log(`[SpecRegeneration] Parsed params:`);
|
||||
console.log(`[SpecRegeneration] projectPath: ${projectPath}`);
|
||||
console.log(`[SpecRegeneration] projectDefinition length: ${projectDefinition?.length || 0} chars`);
|
||||
|
||||
if (!projectPath || !projectDefinition) {
|
||||
console.error("[SpecRegeneration] Missing required parameters");
|
||||
res.status(400).json({
|
||||
success: false,
|
||||
error: "projectPath and projectDefinition required",
|
||||
});
|
||||
return;
|
||||
}
|
||||
|
||||
if (isRunning) {
|
||||
console.warn("[SpecRegeneration] Generation already running, rejecting request");
|
||||
res.json({ success: false, error: "Spec generation already running" });
|
||||
return;
|
||||
}
|
||||
|
||||
logAuthStatus("Before starting generation");
|
||||
|
||||
isRunning = true;
|
||||
currentAbortController = new AbortController();
|
||||
console.log("[SpecRegeneration] Starting background generation task...");
|
||||
|
||||
generateSpec(
|
||||
projectPath,
|
||||
projectDefinition,
|
||||
events,
|
||||
currentAbortController,
|
||||
false
|
||||
)
|
||||
.catch((error) => {
|
||||
console.error("[SpecRegeneration] ❌ Generation failed with error:");
|
||||
console.error("[SpecRegeneration] Error name:", error?.name);
|
||||
console.error("[SpecRegeneration] Error message:", error?.message);
|
||||
console.error("[SpecRegeneration] Error stack:", error?.stack);
|
||||
console.error("[SpecRegeneration] Full error object:", JSON.stringify(error, Object.getOwnPropertyNames(error), 2));
|
||||
events.emit("spec-regeneration:event", {
|
||||
type: "spec_error",
|
||||
error: error.message || String(error),
|
||||
});
|
||||
})
|
||||
.finally(() => {
|
||||
console.log("[SpecRegeneration] Generation task finished (success or error)");
|
||||
isRunning = false;
|
||||
currentAbortController = null;
|
||||
});
|
||||
|
||||
console.log("[SpecRegeneration] Returning success response (generation running in background)");
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
console.error("[SpecRegeneration] ❌ Route handler exception:");
|
||||
console.error("[SpecRegeneration] Error:", error);
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Generate features from existing spec
|
||||
router.post("/generate-features", async (req: Request, res: Response) => {
|
||||
console.log("[SpecRegeneration] ========== /generate-features endpoint called ==========");
|
||||
console.log("[SpecRegeneration] Request body:", JSON.stringify(req.body, null, 2));
|
||||
|
||||
try {
|
||||
const { projectPath } = req.body as { projectPath: string };
|
||||
|
||||
console.log(`[SpecRegeneration] projectPath: ${projectPath}`);
|
||||
|
||||
if (!projectPath) {
|
||||
console.error("[SpecRegeneration] Missing projectPath parameter");
|
||||
res.status(400).json({ success: false, error: "projectPath required" });
|
||||
return;
|
||||
}
|
||||
|
||||
if (isRunning) {
|
||||
console.warn("[SpecRegeneration] Generation already running, rejecting request");
|
||||
res.json({ success: false, error: "Generation already running" });
|
||||
return;
|
||||
}
|
||||
|
||||
logAuthStatus("Before starting feature generation");
|
||||
|
||||
isRunning = true;
|
||||
currentAbortController = new AbortController();
|
||||
console.log("[SpecRegeneration] Starting background feature generation task...");
|
||||
|
||||
generateFeaturesFromSpec(projectPath, events, currentAbortController)
|
||||
.catch((error) => {
|
||||
console.error("[SpecRegeneration] ❌ Feature generation failed with error:");
|
||||
console.error("[SpecRegeneration] Error name:", error?.name);
|
||||
console.error("[SpecRegeneration] Error message:", error?.message);
|
||||
console.error("[SpecRegeneration] Error stack:", error?.stack);
|
||||
console.error("[SpecRegeneration] Full error object:", JSON.stringify(error, Object.getOwnPropertyNames(error), 2));
|
||||
events.emit("spec-regeneration:event", {
|
||||
type: "features_error",
|
||||
error: error.message || String(error),
|
||||
});
|
||||
})
|
||||
.finally(() => {
|
||||
console.log("[SpecRegeneration] Feature generation task finished (success or error)");
|
||||
isRunning = false;
|
||||
currentAbortController = null;
|
||||
});
|
||||
|
||||
console.log("[SpecRegeneration] Returning success response (generation running in background)");
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
console.error("[SpecRegeneration] ❌ Route handler exception:");
|
||||
console.error("[SpecRegeneration] Error:", error);
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Stop generation
|
||||
router.post("/stop", async (_req: Request, res: Response) => {
|
||||
try {
|
||||
if (currentAbortController) {
|
||||
currentAbortController.abort();
|
||||
}
|
||||
isRunning = false;
|
||||
res.json({ success: true });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Get status
|
||||
router.get("/status", async (_req: Request, res: Response) => {
|
||||
try {
|
||||
res.json({ success: true, isRunning });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
return router;
|
||||
}
|
||||
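Since generation runs in the background and the /create route returns immediately, a client typically starts it and then polls /status until the run finishes; completion details are emitted as "spec-regeneration:event" events. A minimal sketch, assuming a /api/spec-regeneration mount prefix.

// Kicks off spec generation and polls until the background task finishes.
async function regenerateSpec(projectPath: string, projectOverview: string): Promise<void> {
  const base = "http://localhost:3001/api/spec-regeneration"; // assumed mount path
  await fetch(`${base}/create`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ projectPath, projectOverview, generateFeatures: true }),
  });

  let running = true;
  while (running) {
    await new Promise((resolve) => setTimeout(resolve, 2000)); // poll every 2s
    const status = (await (await fetch(`${base}/status`)).json()) as { isRunning: boolean };
    running = status.isRunning;
  }
}
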
|
||||
async function generateSpec(
|
||||
projectPath: string,
|
||||
projectOverview: string,
|
||||
events: EventEmitter,
|
||||
abortController: AbortController,
|
||||
generateFeatures?: boolean
|
||||
) {
|
||||
console.log("[SpecRegeneration] ========== generateSpec() started ==========");
|
||||
console.log(`[SpecRegeneration] projectPath: ${projectPath}`);
|
||||
console.log(`[SpecRegeneration] projectOverview length: ${projectOverview.length} chars`);
|
||||
console.log(`[SpecRegeneration] generateFeatures: ${generateFeatures}`);
|
||||
|
||||
const prompt = `You are helping to define a software project specification.
|
||||
|
||||
Project Overview:
|
||||
${projectOverview}
|
||||
|
||||
Based on this overview, analyze the project and create a comprehensive specification that includes:
|
||||
|
||||
1. **Project Summary** - Brief description of what the project does
|
||||
2. **Core Features** - Main functionality the project needs
|
||||
3. **Technical Stack** - Recommended technologies and frameworks
|
||||
4. **Architecture** - High-level system design
|
||||
5. **Data Models** - Key entities and their relationships
|
||||
6. **API Design** - Main endpoints/interfaces needed
|
||||
7. **User Experience** - Key user flows and interactions
|
||||
|
||||
${generateFeatures ? `
|
||||
Also generate a list of features to implement. For each feature provide:
|
||||
- ID (lowercase-hyphenated)
|
||||
- Title
|
||||
- Description
|
||||
- Priority (1=high, 2=medium, 3=low)
|
||||
- Estimated complexity (simple, moderate, complex)
|
||||
` : ""}
|
||||
|
||||
Format your response as markdown. Be specific and actionable.`;
|
||||
|
||||
console.log(`[SpecRegeneration] Prompt length: ${prompt.length} chars`);
|
||||
|
||||
events.emit("spec-regeneration:event", {
|
||||
type: "spec_progress",
|
||||
content: "Starting spec generation...\n",
|
||||
});
|
||||
|
||||
const options: Options = {
|
||||
model: "claude-opus-4-5-20251101",
|
||||
maxTurns: 10,
|
||||
cwd: projectPath,
|
||||
allowedTools: ["Read", "Glob", "Grep"],
|
||||
permissionMode: "acceptEdits",
|
||||
abortController,
|
||||
};
|
||||
|
||||
console.log("[SpecRegeneration] SDK Options:", JSON.stringify(options, null, 2));
|
||||
console.log("[SpecRegeneration] Calling Claude Agent SDK query()...");
|
||||
|
||||
// Log auth status right before the SDK call
|
||||
logAuthStatus("Right before SDK query()");
|
||||
|
||||
let stream;
|
||||
try {
|
||||
stream = query({ prompt, options });
|
||||
console.log("[SpecRegeneration] query() returned stream successfully");
|
||||
} catch (queryError) {
|
||||
console.error("[SpecRegeneration] ❌ query() threw an exception:");
|
||||
console.error("[SpecRegeneration] Error:", queryError);
|
||||
throw queryError;
|
||||
}
|
||||
|
||||
let responseText = "";
|
||||
let messageCount = 0;
|
||||
|
||||
console.log("[SpecRegeneration] Starting to iterate over stream...");
|
||||
|
||||
try {
|
||||
for await (const msg of stream) {
|
||||
      messageCount++;
      console.log(`[SpecRegeneration] Stream message #${messageCount}:`, JSON.stringify({ type: msg.type, subtype: (msg as any).subtype }, null, 2));

      if (msg.type === "assistant" && msg.message.content) {
        for (const block of msg.message.content) {
          if (block.type === "text") {
            responseText = block.text;
            console.log(`[SpecRegeneration] Text block received (${block.text.length} chars)`);
            events.emit("spec-regeneration:event", {
              type: "spec_progress",
              content: block.text,
            });
          } else if (block.type === "tool_use") {
            console.log(`[SpecRegeneration] Tool use: ${block.name}`);
            events.emit("spec-regeneration:event", {
              type: "spec_tool",
              tool: block.name,
              input: block.input,
            });
          }
        }
      } else if (msg.type === "result" && (msg as any).subtype === "success") {
        console.log("[SpecRegeneration] Received success result");
        responseText = (msg as any).result || responseText;
      } else if (msg.type === "error") {
        console.error("[SpecRegeneration] ❌ Received error message from stream:");
        console.error("[SpecRegeneration] Error message:", JSON.stringify(msg, null, 2));
      }
    }
  } catch (streamError) {
    console.error("[SpecRegeneration] ❌ Error while iterating stream:");
    console.error("[SpecRegeneration] Stream error:", streamError);
    throw streamError;
  }

  console.log(`[SpecRegeneration] Stream iteration complete. Total messages: ${messageCount}`);
  console.log(`[SpecRegeneration] Response text length: ${responseText.length} chars`);

  // Save spec
  const specDir = path.join(projectPath, ".automaker");
  const specPath = path.join(specDir, "app_spec.txt");

  console.log(`[SpecRegeneration] Saving spec to: ${specPath}`);

  await fs.mkdir(specDir, { recursive: true });
  await fs.writeFile(specPath, responseText);

  console.log("[SpecRegeneration] Spec saved successfully");

  events.emit("spec-regeneration:event", {
    type: "spec_complete",
    specPath,
    content: responseText,
  });

  // If generate features was requested, parse and create them
  if (generateFeatures) {
    console.log("[SpecRegeneration] Starting feature generation...");
    await parseAndCreateFeatures(projectPath, responseText, events);
  }

  console.log("[SpecRegeneration] ========== generateSpec() completed ==========");
}

async function generateFeaturesFromSpec(
  projectPath: string,
  events: EventEmitter,
  abortController: AbortController
) {
  console.log("[SpecRegeneration] ========== generateFeaturesFromSpec() started ==========");
  console.log(`[SpecRegeneration] projectPath: ${projectPath}`);

  // Read existing spec
  const specPath = path.join(projectPath, ".automaker", "app_spec.txt");
  let spec: string;

  console.log(`[SpecRegeneration] Reading spec from: ${specPath}`);

  try {
    spec = await fs.readFile(specPath, "utf-8");
    console.log(`[SpecRegeneration] Spec loaded successfully (${spec.length} chars)`);
  } catch (readError) {
    console.error("[SpecRegeneration] ❌ Failed to read spec file:", readError);
    events.emit("spec-regeneration:event", {
      type: "features_error",
      error: "No project spec found. Generate spec first.",
    });
    return;
  }

  const prompt = `Based on this project specification:

${spec}

Generate a prioritized list of implementable features. For each feature provide:

1. **id**: A unique lowercase-hyphenated identifier
2. **title**: Short descriptive title
3. **description**: What this feature does (2-3 sentences)
4. **priority**: 1 (high), 2 (medium), or 3 (low)
5. **complexity**: "simple", "moderate", or "complex"
6. **dependencies**: Array of feature IDs this depends on (can be empty)

Format as JSON:
{
  "features": [
    {
      "id": "feature-id",
      "title": "Feature Title",
      "description": "What it does",
      "priority": 1,
      "complexity": "moderate",
      "dependencies": []
    }
  ]
}

Generate 5-15 features that build on each other logically.`;

  console.log(`[SpecRegeneration] Prompt length: ${prompt.length} chars`);

  events.emit("spec-regeneration:event", {
    type: "features_progress",
    content: "Analyzing spec and generating features...\n",
  });

  const options: Options = {
    model: "claude-sonnet-4-20250514",
    maxTurns: 5,
    cwd: projectPath,
    allowedTools: ["Read", "Glob"],
    permissionMode: "acceptEdits",
    abortController,
  };

  console.log("[SpecRegeneration] SDK Options:", JSON.stringify(options, null, 2));
  console.log("[SpecRegeneration] Calling Claude Agent SDK query() for features...");

  logAuthStatus("Right before SDK query() for features");

  let stream;
  try {
    stream = query({ prompt, options });
    console.log("[SpecRegeneration] query() returned stream successfully");
  } catch (queryError) {
    console.error("[SpecRegeneration] ❌ query() threw an exception:");
    console.error("[SpecRegeneration] Error:", queryError);
    throw queryError;
  }

  let responseText = "";
  let messageCount = 0;

  console.log("[SpecRegeneration] Starting to iterate over feature stream...");

  try {
    for await (const msg of stream) {
      messageCount++;
      console.log(`[SpecRegeneration] Feature stream message #${messageCount}:`, JSON.stringify({ type: msg.type, subtype: (msg as any).subtype }, null, 2));

      if (msg.type === "assistant" && msg.message.content) {
        for (const block of msg.message.content) {
          if (block.type === "text") {
            responseText = block.text;
            console.log(`[SpecRegeneration] Feature text block received (${block.text.length} chars)`);
            events.emit("spec-regeneration:event", {
              type: "features_progress",
              content: block.text,
            });
          }
        }
      } else if (msg.type === "result" && (msg as any).subtype === "success") {
        console.log("[SpecRegeneration] Received success result for features");
        responseText = (msg as any).result || responseText;
      } else if (msg.type === "error") {
        console.error("[SpecRegeneration] ❌ Received error message from feature stream:");
        console.error("[SpecRegeneration] Error message:", JSON.stringify(msg, null, 2));
      }
    }
  } catch (streamError) {
    console.error("[SpecRegeneration] ❌ Error while iterating feature stream:");
    console.error("[SpecRegeneration] Stream error:", streamError);
    throw streamError;
  }

  console.log(`[SpecRegeneration] Feature stream complete. Total messages: ${messageCount}`);
  console.log(`[SpecRegeneration] Feature response length: ${responseText.length} chars`);

  await parseAndCreateFeatures(projectPath, responseText, events);

  console.log("[SpecRegeneration] ========== generateFeaturesFromSpec() completed ==========");
}

async function parseAndCreateFeatures(
  projectPath: string,
  content: string,
  events: EventEmitter
) {
  console.log("[SpecRegeneration] ========== parseAndCreateFeatures() started ==========");
  console.log(`[SpecRegeneration] Content length: ${content.length} chars`);

  try {
    // Extract JSON from response
    console.log("[SpecRegeneration] Extracting JSON from response...");
    const jsonMatch = content.match(/\{[\s\S]*"features"[\s\S]*\}/);
    if (!jsonMatch) {
      console.error("[SpecRegeneration] ❌ No valid JSON found in response");
      console.error("[SpecRegeneration] Content preview:", content.substring(0, 500));
      throw new Error("No valid JSON found in response");
    }

    console.log(`[SpecRegeneration] JSON match found (${jsonMatch[0].length} chars)`);

    const parsed = JSON.parse(jsonMatch[0]);
    console.log(`[SpecRegeneration] Parsed ${parsed.features?.length || 0} features`);

    const featuresDir = path.join(projectPath, ".automaker", "features");
    await fs.mkdir(featuresDir, { recursive: true });

    const createdFeatures: Array<{ id: string; title: string }> = [];

    for (const feature of parsed.features) {
      console.log(`[SpecRegeneration] Creating feature: ${feature.id}`);
      const featureDir = path.join(featuresDir, feature.id);
      await fs.mkdir(featureDir, { recursive: true });

      const featureData = {
        id: feature.id,
        title: feature.title,
        description: feature.description,
        status: "backlog", // Features go to backlog - user must manually start them
        priority: feature.priority || 2,
        complexity: feature.complexity || "moderate",
        dependencies: feature.dependencies || [],
        createdAt: new Date().toISOString(),
        updatedAt: new Date().toISOString(),
      };

      await fs.writeFile(
        path.join(featureDir, "feature.json"),
        JSON.stringify(featureData, null, 2)
      );

      createdFeatures.push({ id: feature.id, title: feature.title });
    }

    console.log(`[SpecRegeneration] ✓ Created ${createdFeatures.length} features successfully`);

    events.emit("spec-regeneration:event", {
      type: "features_complete",
      features: createdFeatures,
      count: createdFeatures.length,
    });
  } catch (error) {
    console.error("[SpecRegeneration] ❌ parseAndCreateFeatures() failed:");
    console.error("[SpecRegeneration] Error:", error);
    events.emit("spec-regeneration:event", {
      type: "features_error",
      error: (error as Error).message,
    });
  }

  console.log("[SpecRegeneration] ========== parseAndCreateFeatures() completed ==========");
}
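For reference, parseAndCreateFeatures() above writes one folder per generated feature under .automaker/features/<id>/. A minimal sketch of the resulting feature.json, using the fields built in featureData (the id, title, and timestamps below are illustrative, not taken from a real run):

// Shape of .automaker/features/<id>/feature.json as written by parseAndCreateFeatures()
const exampleFeature = {
  id: "kanban-board",            // illustrative id
  title: "Kanban Board",
  description: "What it does",
  status: "backlog",             // always created in backlog; the user starts it manually
  priority: 1,
  complexity: "moderate",
  dependencies: [],
  createdAt: "2025-01-01T00:00:00.000Z",
  updatedAt: "2025-01-01T00:00:00.000Z",
};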
192  apps/server/src/routes/suggestions.ts  Normal file
@@ -0,0 +1,192 @@
/**
 * Suggestions routes - HTTP API for AI-powered feature suggestions
 */

import { Router, type Request, type Response } from "express";
import { query, type Options } from "@anthropic-ai/claude-agent-sdk";
import type { EventEmitter } from "../lib/events.js";

let isRunning = false;
let currentAbortController: AbortController | null = null;

export function createSuggestionsRoutes(events: EventEmitter): Router {
  const router = Router();

  // Generate suggestions
  router.post("/generate", async (req: Request, res: Response) => {
    try {
      const { projectPath, suggestionType = "features" } = req.body as {
        projectPath: string;
        suggestionType?: string;
      };

      if (!projectPath) {
        res.status(400).json({ success: false, error: "projectPath required" });
        return;
      }

      if (isRunning) {
        res.json({ success: false, error: "Suggestions generation is already running" });
        return;
      }

      isRunning = true;
      currentAbortController = new AbortController();

      // Start generation in background
      generateSuggestions(projectPath, suggestionType, events, currentAbortController)
        .catch((error) => {
          console.error("[Suggestions] Error:", error);
          events.emit("suggestions:event", {
            type: "suggestions_error",
            error: error.message,
          });
        })
        .finally(() => {
          isRunning = false;
          currentAbortController = null;
        });

      res.json({ success: true });
    } catch (error) {
      const message = error instanceof Error ? error.message : "Unknown error";
      res.status(500).json({ success: false, error: message });
    }
  });

  // Stop suggestions generation
  router.post("/stop", async (_req: Request, res: Response) => {
    try {
      if (currentAbortController) {
        currentAbortController.abort();
      }
      isRunning = false;
      res.json({ success: true });
    } catch (error) {
      const message = error instanceof Error ? error.message : "Unknown error";
      res.status(500).json({ success: false, error: message });
    }
  });

  // Get status
  router.get("/status", async (_req: Request, res: Response) => {
    try {
      res.json({ success: true, isRunning });
    } catch (error) {
      const message = error instanceof Error ? error.message : "Unknown error";
      res.status(500).json({ success: false, error: message });
    }
  });

  return router;
}

async function generateSuggestions(
  projectPath: string,
  suggestionType: string,
  events: EventEmitter,
  abortController: AbortController
) {
  const typePrompts: Record<string, string> = {
    features: "Analyze this project and suggest new features that would add value.",
    refactoring: "Analyze this project and identify refactoring opportunities.",
    security: "Analyze this project for security vulnerabilities and suggest fixes.",
    performance: "Analyze this project for performance issues and suggest optimizations.",
  };

  const prompt = `${typePrompts[suggestionType] || typePrompts.features}

Look at the codebase and provide 3-5 concrete suggestions.

For each suggestion, provide:
1. A category (e.g., "User Experience", "Security", "Performance")
2. A clear description of what to implement
3. Concrete steps to implement it
4. Priority (1=high, 2=medium, 3=low)
5. Brief reasoning for why this would help

Format your response as JSON:
{
  "suggestions": [
    {
      "id": "suggestion-123",
      "category": "Category",
      "description": "What to implement",
      "steps": ["Step 1", "Step 2"],
      "priority": 1,
      "reasoning": "Why this helps"
    }
  ]
}`;

  events.emit("suggestions:event", {
    type: "suggestions_progress",
    content: `Starting ${suggestionType} analysis...\n`,
  });

  const options: Options = {
    model: "claude-opus-4-5-20251101",
    maxTurns: 5,
    cwd: projectPath,
    allowedTools: ["Read", "Glob", "Grep"],
    permissionMode: "acceptEdits",
    abortController,
  };

  const stream = query({ prompt, options });
  let responseText = "";

  for await (const msg of stream) {
    if (msg.type === "assistant" && msg.message.content) {
      for (const block of msg.message.content) {
        if (block.type === "text") {
          responseText = block.text;
          events.emit("suggestions:event", {
            type: "suggestions_progress",
            content: block.text,
          });
        } else if (block.type === "tool_use") {
          events.emit("suggestions:event", {
            type: "suggestions_tool",
            tool: block.name,
            input: block.input,
          });
        }
      }
    } else if (msg.type === "result" && msg.subtype === "success") {
      responseText = msg.result || responseText;
    }
  }

  // Parse suggestions from response
  try {
    const jsonMatch = responseText.match(/\{[\s\S]*"suggestions"[\s\S]*\}/);
    if (jsonMatch) {
      const parsed = JSON.parse(jsonMatch[0]);
      events.emit("suggestions:event", {
        type: "suggestions_complete",
        suggestions: parsed.suggestions.map((s: Record<string, unknown>, i: number) => ({
          ...s,
          id: s.id || `suggestion-${Date.now()}-${i}`,
        })),
      });
    } else {
      throw new Error("No valid JSON found in response");
    }
  } catch (error) {
    // Return generic suggestions if parsing fails
    events.emit("suggestions:event", {
      type: "suggestions_complete",
      suggestions: [
        {
          id: `suggestion-${Date.now()}-0`,
          category: "Analysis",
          description: "Review the AI analysis output for insights",
          steps: ["Review the generated analysis"],
          priority: 1,
          reasoning: "The AI provided analysis but suggestions need manual review",
        },
      ],
    });
  }
}
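A minimal client sketch for driving these routes, assuming the router is mounted at /api/suggestions and the server listens on localhost:3001 (neither is shown in this file); the event shapes in the comments mirror the suggestions_* events emitted above.

// Start a "security" analysis for a project (mount path and port are assumptions).
const res = await fetch("http://localhost:3001/api/suggestions/generate", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ projectPath: "/path/to/project", suggestionType: "security" }),
});
console.log(await res.json()); // { success: true } or { success: false, error: "..." }

// Progress and results then arrive on the "suggestions:event" channel as:
//   { type: "suggestions_progress", content: string }
//   { type: "suggestions_tool", tool: string, input: unknown }
//   { type: "suggestions_complete", suggestions: Array<{ id, category, description, steps, priority, reasoning }> }
//   { type: "suggestions_error", error: string }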
113  apps/server/src/routes/workspace.ts  Normal file
@@ -0,0 +1,113 @@
/**
 * Workspace routes
 * Provides API endpoints for workspace directory management
 */

import { Router, type Request, type Response } from "express";
import fs from "fs/promises";
import path from "path";
import { addAllowedPath } from "../lib/security.js";

export function createWorkspaceRoutes(): Router {
  const router = Router();

  // Get workspace configuration status
  router.get("/config", async (_req: Request, res: Response) => {
    try {
      const workspaceDir = process.env.WORKSPACE_DIR;

      if (!workspaceDir) {
        res.json({
          success: true,
          configured: false,
        });
        return;
      }

      // Check if the directory exists
      try {
        const stats = await fs.stat(workspaceDir);
        if (!stats.isDirectory()) {
          res.json({
            success: true,
            configured: false,
            error: "WORKSPACE_DIR is not a valid directory",
          });
          return;
        }

        // Add workspace dir to allowed paths
        addAllowedPath(workspaceDir);

        res.json({
          success: true,
          configured: true,
          workspaceDir,
        });
      } catch {
        res.json({
          success: true,
          configured: false,
          error: "WORKSPACE_DIR path does not exist",
        });
      }
    } catch (error) {
      const message = error instanceof Error ? error.message : "Unknown error";
      res.status(500).json({ success: false, error: message });
    }
  });

  // List directories in workspace
  router.get("/directories", async (_req: Request, res: Response) => {
    try {
      const workspaceDir = process.env.WORKSPACE_DIR;

      if (!workspaceDir) {
        res.status(400).json({
          success: false,
          error: "WORKSPACE_DIR is not configured",
        });
        return;
      }

      // Check if directory exists
      try {
        await fs.stat(workspaceDir);
      } catch {
        res.status(400).json({
          success: false,
          error: "WORKSPACE_DIR path does not exist",
        });
        return;
      }

      // Add workspace dir to allowed paths
      addAllowedPath(workspaceDir);

      // Read directory contents
      const entries = await fs.readdir(workspaceDir, { withFileTypes: true });

      // Filter to directories only and map to result format
      const directories = entries
        .filter((entry) => entry.isDirectory() && !entry.name.startsWith("."))
        .map((entry) => ({
          name: entry.name,
          path: path.join(workspaceDir, entry.name),
        }))
        .sort((a, b) => a.name.localeCompare(b.name));

      // Add each directory to allowed paths
      directories.forEach((dir) => addAllowedPath(dir.path));

      res.json({
        success: true,
        directories,
      });
    } catch (error) {
      const message = error instanceof Error ? error.message : "Unknown error";
      res.status(500).json({ success: false, error: message });
    }
  });

  return router;
}
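A minimal sketch of consuming the workspace routes, assuming they are mounted at /api/workspace and that WORKSPACE_DIR is set in the server environment (both are assumptions; only the route handlers are shown in this file).

// Check whether a workspace directory is configured.
const config = await fetch("http://localhost:3001/api/workspace/config").then((r) => r.json());
// -> { success: true, configured: true, workspaceDir: "/home/user/workspace" } when valid,
//    or { success: true, configured: false, error?: string } otherwise.

// List top-level project directories inside the workspace (hidden folders are filtered out).
const dirs = await fetch("http://localhost:3001/api/workspace/directories").then((r) => r.json());
// -> { success: true, directories: [{ name: "my-app", path: "/home/user/workspace/my-app" }, ...] }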
355  apps/server/src/routes/worktree.ts  Normal file
@@ -0,0 +1,355 @@
|
||||
/**
|
||||
* Worktree routes - HTTP API for git worktree operations
|
||||
*/
|
||||
|
||||
import { Router, type Request, type Response } from "express";
|
||||
import { exec } from "child_process";
|
||||
import { promisify } from "util";
|
||||
import path from "path";
|
||||
import fs from "fs/promises";
|
||||
|
||||
const execAsync = promisify(exec);
|
||||
|
||||
export function createWorktreeRoutes(): Router {
|
||||
const router = Router();
|
||||
|
||||
// Check if a path is a git repo
|
||||
async function isGitRepo(repoPath: string): Promise<boolean> {
|
||||
try {
|
||||
await execAsync("git rev-parse --is-inside-work-tree", { cwd: repoPath });
|
||||
return true;
|
||||
} catch {
|
||||
return false;
|
||||
}
|
||||
}
|
||||
|
||||
// Get worktree info
|
||||
router.post("/info", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId) {
|
||||
res
|
||||
.status(400)
|
||||
.json({ success: false, error: "projectPath and featureId required" });
|
||||
return;
|
||||
}
|
||||
|
||||
// Check if worktree exists
|
||||
const worktreePath = path.join(projectPath, ".automaker", "worktrees", featureId);
|
||||
try {
|
||||
await fs.access(worktreePath);
|
||||
const { stdout } = await execAsync("git rev-parse --abbrev-ref HEAD", {
|
||||
cwd: worktreePath,
|
||||
});
|
||||
res.json({
|
||||
success: true,
|
||||
worktreePath,
|
||||
branchName: stdout.trim(),
|
||||
});
|
||||
} catch {
|
||||
res.json({ success: true, worktreePath: null, branchName: null });
|
||||
}
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Get worktree status
|
||||
router.post("/status", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId) {
|
||||
res
|
||||
.status(400)
|
||||
.json({ success: false, error: "projectPath and featureId required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const worktreePath = path.join(projectPath, ".automaker", "worktrees", featureId);
|
||||
|
||||
try {
|
||||
await fs.access(worktreePath);
|
||||
const { stdout: status } = await execAsync("git status --porcelain", {
|
||||
cwd: worktreePath,
|
||||
});
|
||||
const files = status
|
||||
.split("\n")
|
||||
.filter(Boolean)
|
||||
.map((line) => line.slice(3));
|
||||
const { stdout: diffStat } = await execAsync("git diff --stat", {
|
||||
cwd: worktreePath,
|
||||
});
|
||||
const { stdout: logOutput } = await execAsync(
|
||||
'git log --oneline -5 --format="%h %s"',
|
||||
{ cwd: worktreePath }
|
||||
);
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
modifiedFiles: files.length,
|
||||
files,
|
||||
diffStat: diffStat.trim(),
|
||||
recentCommits: logOutput.trim().split("\n").filter(Boolean),
|
||||
});
|
||||
} catch {
|
||||
res.json({
|
||||
success: true,
|
||||
modifiedFiles: 0,
|
||||
files: [],
|
||||
diffStat: "",
|
||||
recentCommits: [],
|
||||
});
|
||||
}
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// List all worktrees
|
||||
router.post("/list", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath } = req.body as { projectPath: string };
|
||||
|
||||
if (!projectPath) {
|
||||
res.status(400).json({ success: false, error: "projectPath required" });
|
||||
return;
|
||||
}
|
||||
|
||||
if (!(await isGitRepo(projectPath))) {
|
||||
res.json({ success: true, worktrees: [] });
|
||||
return;
|
||||
}
|
||||
|
||||
const { stdout } = await execAsync("git worktree list --porcelain", {
|
||||
cwd: projectPath,
|
||||
});
|
||||
|
||||
const worktrees: Array<{ path: string; branch: string }> = [];
|
||||
const lines = stdout.split("\n");
|
||||
let current: { path?: string; branch?: string } = {};
|
||||
|
||||
for (const line of lines) {
|
||||
if (line.startsWith("worktree ")) {
|
||||
current.path = line.slice(9);
|
||||
} else if (line.startsWith("branch ")) {
|
||||
current.branch = line.slice(7).replace("refs/heads/", "");
|
||||
} else if (line === "") {
|
||||
if (current.path && current.branch) {
|
||||
worktrees.push({ path: current.path, branch: current.branch });
|
||||
}
|
||||
current = {};
|
||||
}
|
||||
}
|
||||
|
||||
res.json({ success: true, worktrees });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
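// The parsing loop above consumes `git worktree list --porcelain`, which prints one
// stanza per worktree, separated by blank lines, e.g. (paths are illustrative):
//
//   worktree /home/user/project
//   HEAD 1a2b3c4d...
//   branch refs/heads/main
//
//   worktree /home/user/project/.automaker/worktrees/kanban-board
//   HEAD 5e6f7a8b...
//   branch refs/heads/feature/kanban-board
//
// which the loop reduces to:
//   [{ path: "/home/user/project", branch: "main" },
//    { path: ".../worktrees/kanban-board", branch: "feature/kanban-board" }]
// Worktrees with a detached HEAD print "detached" instead of a "branch" line and are skipped.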
|
||||
|
||||
// Get diffs for a worktree
|
||||
router.post("/diffs", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId) {
|
||||
res
|
||||
.status(400)
|
||||
.json({ success: false, error: "projectPath and featureId required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const worktreePath = path.join(projectPath, ".automaker", "worktrees", featureId);
|
||||
|
||||
try {
|
||||
await fs.access(worktreePath);
|
||||
const { stdout: diff } = await execAsync("git diff HEAD", {
|
||||
cwd: worktreePath,
|
||||
maxBuffer: 10 * 1024 * 1024,
|
||||
});
|
||||
const { stdout: status } = await execAsync("git status --porcelain", {
|
||||
cwd: worktreePath,
|
||||
});
|
||||
|
||||
const files = status
|
||||
.split("\n")
|
||||
.filter(Boolean)
|
||||
.map((line) => {
|
||||
const statusChar = line[0];
|
||||
const filePath = line.slice(3);
|
||||
const statusMap: Record<string, string> = {
|
||||
M: "Modified",
|
||||
A: "Added",
|
||||
D: "Deleted",
|
||||
R: "Renamed",
|
||||
C: "Copied",
|
||||
U: "Updated",
|
||||
"?": "Untracked",
|
||||
};
|
||||
return {
|
||||
status: statusChar,
|
||||
path: filePath,
|
||||
statusText: statusMap[statusChar] || "Unknown",
|
||||
};
|
||||
});
|
||||
|
||||
res.json({
|
||||
success: true,
|
||||
diff,
|
||||
files,
|
||||
hasChanges: files.length > 0,
|
||||
});
|
||||
} catch {
|
||||
res.json({ success: true, diff: "", files: [], hasChanges: false });
|
||||
}
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Get diff for a specific file
|
||||
router.post("/file-diff", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId, filePath } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
filePath: string;
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId || !filePath) {
|
||||
res.status(400).json({
|
||||
success: false,
|
||||
error: "projectPath, featureId, and filePath required",
|
||||
});
|
||||
return;
|
||||
}
|
||||
|
||||
const worktreePath = path.join(projectPath, ".automaker", "worktrees", featureId);
|
||||
|
||||
try {
|
||||
await fs.access(worktreePath);
|
||||
const { stdout: diff } = await execAsync(`git diff HEAD -- "${filePath}"`, {
|
||||
cwd: worktreePath,
|
||||
maxBuffer: 10 * 1024 * 1024,
|
||||
});
|
||||
|
||||
res.json({ success: true, diff, filePath });
|
||||
} catch {
|
||||
res.json({ success: true, diff: "", filePath });
|
||||
}
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Revert feature (remove worktree)
|
||||
router.post("/revert", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId) {
|
||||
res
|
||||
.status(400)
|
||||
.json({ success: false, error: "projectPath and featureId required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const worktreePath = path.join(projectPath, ".automaker", "worktrees", featureId);
|
||||
|
||||
try {
|
||||
// Remove worktree
|
||||
await execAsync(`git worktree remove "${worktreePath}" --force`, {
|
||||
cwd: projectPath,
|
||||
});
|
||||
// Delete branch
|
||||
await execAsync(`git branch -D feature/${featureId}`, { cwd: projectPath });
|
||||
|
||||
res.json({ success: true, removedPath: worktreePath });
|
||||
} catch (error) {
|
||||
// Worktree might not exist
|
||||
res.json({ success: true, removedPath: null });
|
||||
}
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
|
||||
|
||||
// Merge feature (merge worktree branch into main)
|
||||
router.post("/merge", async (req: Request, res: Response) => {
|
||||
try {
|
||||
const { projectPath, featureId, options } = req.body as {
|
||||
projectPath: string;
|
||||
featureId: string;
|
||||
options?: { squash?: boolean; message?: string };
|
||||
};
|
||||
|
||||
if (!projectPath || !featureId) {
|
||||
res
|
||||
.status(400)
|
||||
.json({ success: false, error: "projectPath and featureId required" });
|
||||
return;
|
||||
}
|
||||
|
||||
const branchName = `feature/${featureId}`;
|
||||
const worktreePath = path.join(projectPath, ".automaker", "worktrees", featureId);
|
||||
|
||||
// Get current branch
|
||||
const { stdout: currentBranch } = await execAsync(
|
||||
"git rev-parse --abbrev-ref HEAD",
|
||||
{ cwd: projectPath }
|
||||
);
|
||||
|
||||
// Merge the feature branch
|
||||
const mergeCmd = options?.squash
|
||||
? `git merge --squash ${branchName}`
|
||||
: `git merge ${branchName} -m "${options?.message || `Merge ${branchName}`}"`;
|
||||
|
||||
await execAsync(mergeCmd, { cwd: projectPath });
|
||||
|
||||
// If squash merge, need to commit
|
||||
if (options?.squash) {
|
||||
await execAsync(
|
||||
`git commit -m "${options?.message || `Merge ${branchName} (squash)`}"`,
|
||||
{ cwd: projectPath }
|
||||
);
|
||||
}
|
||||
|
||||
// Clean up worktree and branch
|
||||
try {
|
||||
await execAsync(`git worktree remove "${worktreePath}" --force`, {
|
||||
cwd: projectPath,
|
||||
});
|
||||
await execAsync(`git branch -D ${branchName}`, { cwd: projectPath });
|
||||
} catch {
|
||||
// Cleanup errors are non-fatal
|
||||
}
|
||||
|
||||
res.json({ success: true, mergedBranch: branchName });
|
||||
} catch (error) {
|
||||
const message = error instanceof Error ? error.message : "Unknown error";
|
||||
res.status(500).json({ success: false, error: message });
|
||||
}
|
||||
});
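// Example request body for this endpoint (values are illustrative):
//   { "projectPath": "/home/user/project", "featureId": "kanban-board",
//     "options": { "squash": true, "message": "feat: kanban board" } }
// With options.squash the changes are staged by `git merge --squash` and committed by the
// follow-up `git commit` above; without it a regular merge commit is created directly.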
|
||||
|
||||
return router;
|
||||
}
|
||||
562  apps/server/src/services/agent-service.ts  Normal file
@@ -0,0 +1,562 @@
|
||||
/**
|
||||
* Agent Service - Runs Claude agents via the Claude Agent SDK
|
||||
* Manages conversation sessions and streams responses via WebSocket
|
||||
*/
|
||||
|
||||
import { query, AbortError, type Options } from "@anthropic-ai/claude-agent-sdk";
|
||||
import path from "path";
|
||||
import fs from "fs/promises";
|
||||
import type { EventEmitter } from "../lib/events.js";
|
||||
|
||||
interface Message {
|
||||
id: string;
|
||||
role: "user" | "assistant";
|
||||
content: string;
|
||||
images?: Array<{
|
||||
data: string;
|
||||
mimeType: string;
|
||||
filename: string;
|
||||
}>;
|
||||
timestamp: string;
|
||||
isError?: boolean;
|
||||
}
|
||||
|
||||
interface Session {
|
||||
messages: Message[];
|
||||
isRunning: boolean;
|
||||
abortController: AbortController | null;
|
||||
workingDirectory: string;
|
||||
}
|
||||
|
||||
interface SessionMetadata {
|
||||
id: string;
|
||||
name: string;
|
||||
projectPath?: string;
|
||||
workingDirectory: string;
|
||||
createdAt: string;
|
||||
updatedAt: string;
|
||||
archived?: boolean;
|
||||
tags?: string[];
|
||||
}
|
||||
|
||||
export class AgentService {
|
||||
private sessions = new Map<string, Session>();
|
||||
private stateDir: string;
|
||||
private metadataFile: string;
|
||||
private events: EventEmitter;
|
||||
|
||||
constructor(dataDir: string, events: EventEmitter) {
|
||||
this.stateDir = path.join(dataDir, "agent-sessions");
|
||||
this.metadataFile = path.join(dataDir, "sessions-metadata.json");
|
||||
this.events = events;
|
||||
}
|
||||
|
||||
async initialize(): Promise<void> {
|
||||
await fs.mkdir(this.stateDir, { recursive: true });
|
||||
}
|
||||
|
||||
/**
|
||||
* Start or resume a conversation
|
||||
*/
|
||||
async startConversation({
|
||||
sessionId,
|
||||
workingDirectory,
|
||||
}: {
|
||||
sessionId: string;
|
||||
workingDirectory?: string;
|
||||
}) {
|
||||
if (!this.sessions.has(sessionId)) {
|
||||
const messages = await this.loadSession(sessionId);
|
||||
this.sessions.set(sessionId, {
|
||||
messages,
|
||||
isRunning: false,
|
||||
abortController: null,
|
||||
workingDirectory: workingDirectory || process.cwd(),
|
||||
});
|
||||
}
|
||||
|
||||
const session = this.sessions.get(sessionId)!;
|
||||
return {
|
||||
success: true,
|
||||
messages: session.messages,
|
||||
sessionId,
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Send a message to the agent and stream responses
|
||||
*/
|
||||
async sendMessage({
|
||||
sessionId,
|
||||
message,
|
||||
workingDirectory,
|
||||
imagePaths,
|
||||
}: {
|
||||
sessionId: string;
|
||||
message: string;
|
||||
workingDirectory?: string;
|
||||
imagePaths?: string[];
|
||||
}) {
|
||||
const session = this.sessions.get(sessionId);
|
||||
if (!session) {
|
||||
throw new Error(`Session ${sessionId} not found`);
|
||||
}
|
||||
|
||||
if (session.isRunning) {
|
||||
throw new Error("Agent is already processing a message");
|
||||
}
|
||||
|
||||
// Read images and convert to base64
|
||||
const images: Message["images"] = [];
|
||||
if (imagePaths && imagePaths.length > 0) {
|
||||
for (const imagePath of imagePaths) {
|
||||
try {
|
||||
const imageBuffer = await fs.readFile(imagePath);
|
||||
const base64Data = imageBuffer.toString("base64");
|
||||
const ext = path.extname(imagePath).toLowerCase();
|
||||
const mimeTypeMap: Record<string, string> = {
|
||||
".jpg": "image/jpeg",
|
||||
".jpeg": "image/jpeg",
|
||||
".png": "image/png",
|
||||
".gif": "image/gif",
|
||||
".webp": "image/webp",
|
||||
};
|
||||
const mediaType = mimeTypeMap[ext] || "image/png";
|
||||
|
||||
images.push({
|
||||
data: base64Data,
|
||||
mimeType: mediaType,
|
||||
filename: path.basename(imagePath),
|
||||
});
|
||||
} catch (error) {
|
||||
console.error(`[AgentService] Failed to load image ${imagePath}:`, error);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Add user message
|
||||
const userMessage: Message = {
|
||||
id: this.generateId(),
|
||||
role: "user",
|
||||
content: message,
|
||||
images: images.length > 0 ? images : undefined,
|
||||
timestamp: new Date().toISOString(),
|
||||
};
|
||||
|
||||
session.messages.push(userMessage);
|
||||
session.isRunning = true;
|
||||
session.abortController = new AbortController();
|
||||
|
||||
// Emit user message event
|
||||
this.emitAgentEvent(sessionId, {
|
||||
type: "message",
|
||||
message: userMessage,
|
||||
});
|
||||
|
||||
await this.saveSession(sessionId, session.messages);
|
||||
|
||||
try {
|
||||
const options: Options = {
|
||||
model: "claude-opus-4-5-20251101",
|
||||
systemPrompt: this.getSystemPrompt(),
|
||||
maxTurns: 20,
|
||||
cwd: workingDirectory || session.workingDirectory,
|
||||
allowedTools: [
|
||||
"Read",
|
||||
"Write",
|
||||
"Edit",
|
||||
"Glob",
|
||||
"Grep",
|
||||
"Bash",
|
||||
"WebSearch",
|
||||
"WebFetch",
|
||||
],
|
||||
permissionMode: "acceptEdits",
|
||||
sandbox: {
|
||||
enabled: true,
|
||||
autoAllowBashIfSandboxed: true,
|
||||
},
|
||||
abortController: session.abortController!,
|
||||
};
|
||||
|
||||
// Build prompt content
|
||||
let promptContent: string | Array<{ type: string; text?: string; source?: object }> =
|
||||
message;
|
||||
|
||||
if (imagePaths && imagePaths.length > 0) {
|
||||
const contentBlocks: Array<{ type: string; text?: string; source?: object }> = [];
|
||||
|
||||
if (message && message.trim()) {
|
||||
contentBlocks.push({ type: "text", text: message });
|
||||
}
|
||||
|
||||
for (const imagePath of imagePaths) {
|
||||
try {
|
||||
const imageBuffer = await fs.readFile(imagePath);
|
||||
const base64Data = imageBuffer.toString("base64");
|
||||
const ext = path.extname(imagePath).toLowerCase();
|
||||
const mimeTypeMap: Record<string, string> = {
|
||||
".jpg": "image/jpeg",
|
||||
".jpeg": "image/jpeg",
|
||||
".png": "image/png",
|
||||
".gif": "image/gif",
|
||||
".webp": "image/webp",
|
||||
};
|
||||
const mediaType = mimeTypeMap[ext] || "image/png";
|
||||
|
||||
contentBlocks.push({
|
||||
type: "image",
|
||||
source: {
|
||||
type: "base64",
|
||||
media_type: mediaType,
|
||||
data: base64Data,
|
||||
},
|
||||
});
|
||||
} catch (error) {
|
||||
console.error(`[AgentService] Failed to load image ${imagePath}:`, error);
|
||||
}
|
||||
}
|
||||
|
||||
if (contentBlocks.length > 1 || contentBlocks[0]?.type === "image") {
|
||||
promptContent = contentBlocks;
|
||||
}
|
||||
}
|
||||
|
||||
// Build payload
|
||||
const promptPayload = Array.isArray(promptContent)
|
||||
? (async function* () {
|
||||
yield {
|
||||
type: "user" as const,
|
||||
session_id: "",
|
||||
message: {
|
||||
role: "user" as const,
|
||||
content: promptContent,
|
||||
},
|
||||
parent_tool_use_id: null,
|
||||
};
|
||||
})()
|
||||
: promptContent;
|
||||
|
||||
const stream = query({ prompt: promptPayload, options });
|
||||
|
||||
let currentAssistantMessage: Message | null = null;
|
||||
let responseText = "";
|
||||
const toolUses: Array<{ name: string; input: unknown }> = [];
|
||||
|
||||
for await (const msg of stream) {
|
||||
if (msg.type === "assistant") {
|
||||
if (msg.message.content) {
|
||||
for (const block of msg.message.content) {
|
||||
if (block.type === "text") {
|
||||
responseText += block.text;
|
||||
|
||||
if (!currentAssistantMessage) {
|
||||
currentAssistantMessage = {
|
||||
id: this.generateId(),
|
||||
role: "assistant",
|
||||
content: responseText,
|
||||
timestamp: new Date().toISOString(),
|
||||
};
|
||||
session.messages.push(currentAssistantMessage);
|
||||
} else {
|
||||
currentAssistantMessage.content = responseText;
|
||||
}
|
||||
|
||||
this.emitAgentEvent(sessionId, {
|
||||
type: "stream",
|
||||
messageId: currentAssistantMessage.id,
|
||||
content: responseText,
|
||||
isComplete: false,
|
||||
});
|
||||
} else if (block.type === "tool_use") {
|
||||
const toolUse = {
|
||||
name: block.name,
|
||||
input: block.input,
|
||||
};
|
||||
toolUses.push(toolUse);
|
||||
|
||||
this.emitAgentEvent(sessionId, {
|
||||
type: "tool_use",
|
||||
tool: toolUse,
|
||||
});
|
||||
}
|
||||
}
|
||||
}
|
||||
} else if (msg.type === "result") {
|
||||
if (msg.subtype === "success" && msg.result) {
|
||||
if (currentAssistantMessage) {
|
||||
currentAssistantMessage.content = msg.result;
|
||||
responseText = msg.result;
|
||||
}
|
||||
}
|
||||
|
||||
this.emitAgentEvent(sessionId, {
|
||||
type: "complete",
|
||||
messageId: currentAssistantMessage?.id,
|
||||
content: responseText,
|
||||
toolUses,
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
await this.saveSession(sessionId, session.messages);
|
||||
|
||||
session.isRunning = false;
|
||||
session.abortController = null;
|
||||
|
||||
return {
|
||||
success: true,
|
||||
message: currentAssistantMessage,
|
||||
};
|
||||
} catch (error) {
|
||||
if (error instanceof AbortError || (error as Error)?.name === "AbortError") {
|
||||
session.isRunning = false;
|
||||
session.abortController = null;
|
||||
return { success: false, aborted: true };
|
||||
}
|
||||
|
||||
console.error("[AgentService] Error:", error);
|
||||
|
||||
session.isRunning = false;
|
||||
session.abortController = null;
|
||||
|
||||
const errorMessage: Message = {
|
||||
id: this.generateId(),
|
||||
role: "assistant",
|
||||
content: `Error: ${(error as Error).message}`,
|
||||
timestamp: new Date().toISOString(),
|
||||
isError: true,
|
||||
};
|
||||
|
||||
session.messages.push(errorMessage);
|
||||
await this.saveSession(sessionId, session.messages);
|
||||
|
||||
this.emitAgentEvent(sessionId, {
|
||||
type: "error",
|
||||
error: (error as Error).message,
|
||||
message: errorMessage,
|
||||
});
|
||||
|
||||
throw error;
|
||||
}
|
||||
}
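  // During sendMessage() the service emits "agent:stream" events via emitAgentEvent():
  //   { sessionId, type: "message",  message: Message }                // the echoed user message
  //   { sessionId, type: "stream",   messageId, content, isComplete }  // incremental assistant text
  //   { sessionId, type: "tool_use", tool: { name, input } }           // each tool invocation
  //   { sessionId, type: "complete", messageId, content, toolUses }    // final result
  //   { sessionId, type: "error",    error, message }                  // failures (non-abort)
  // A transport layer elsewhere in the server is expected to forward these to the UI;
  // that wiring is not shown in this file.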
|
||||
|
||||
/**
|
||||
* Get conversation history
|
||||
*/
|
||||
getHistory(sessionId: string) {
|
||||
const session = this.sessions.get(sessionId);
|
||||
if (!session) {
|
||||
return { success: false, error: "Session not found" };
|
||||
}
|
||||
|
||||
return {
|
||||
success: true,
|
||||
messages: session.messages,
|
||||
isRunning: session.isRunning,
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Stop current agent execution
|
||||
*/
|
||||
async stopExecution(sessionId: string) {
|
||||
const session = this.sessions.get(sessionId);
|
||||
if (!session) {
|
||||
return { success: false, error: "Session not found" };
|
||||
}
|
||||
|
||||
if (session.abortController) {
|
||||
session.abortController.abort();
|
||||
session.isRunning = false;
|
||||
session.abortController = null;
|
||||
}
|
||||
|
||||
return { success: true };
|
||||
}
|
||||
|
||||
/**
|
||||
* Clear conversation history
|
||||
*/
|
||||
async clearSession(sessionId: string) {
|
||||
const session = this.sessions.get(sessionId);
|
||||
if (session) {
|
||||
session.messages = [];
|
||||
session.isRunning = false;
|
||||
await this.saveSession(sessionId, []);
|
||||
}
|
||||
|
||||
return { success: true };
|
||||
}
|
||||
|
||||
// Session management
|
||||
|
||||
async loadSession(sessionId: string): Promise<Message[]> {
|
||||
const sessionFile = path.join(this.stateDir, `${sessionId}.json`);
|
||||
|
||||
try {
|
||||
const data = await fs.readFile(sessionFile, "utf-8");
|
||||
return JSON.parse(data);
|
||||
} catch {
|
||||
return [];
|
||||
}
|
||||
}
|
||||
|
||||
async saveSession(sessionId: string, messages: Message[]): Promise<void> {
|
||||
const sessionFile = path.join(this.stateDir, `${sessionId}.json`);
|
||||
|
||||
try {
|
||||
await fs.writeFile(sessionFile, JSON.stringify(messages, null, 2), "utf-8");
|
||||
await this.updateSessionTimestamp(sessionId);
|
||||
} catch (error) {
|
||||
console.error("[AgentService] Failed to save session:", error);
|
||||
}
|
||||
}
|
||||
|
||||
async loadMetadata(): Promise<Record<string, SessionMetadata>> {
|
||||
try {
|
||||
const data = await fs.readFile(this.metadataFile, "utf-8");
|
||||
return JSON.parse(data);
|
||||
} catch {
|
||||
return {};
|
||||
}
|
||||
}
|
||||
|
||||
async saveMetadata(metadata: Record<string, SessionMetadata>): Promise<void> {
|
||||
await fs.writeFile(this.metadataFile, JSON.stringify(metadata, null, 2), "utf-8");
|
||||
}
|
||||
|
||||
async updateSessionTimestamp(sessionId: string): Promise<void> {
|
||||
const metadata = await this.loadMetadata();
|
||||
if (metadata[sessionId]) {
|
||||
metadata[sessionId].updatedAt = new Date().toISOString();
|
||||
await this.saveMetadata(metadata);
|
||||
}
|
||||
}
|
||||
|
||||
async listSessions(includeArchived = false): Promise<SessionMetadata[]> {
|
||||
const metadata = await this.loadMetadata();
|
||||
let sessions = Object.values(metadata);
|
||||
|
||||
if (!includeArchived) {
|
||||
sessions = sessions.filter((s) => !s.archived);
|
||||
}
|
||||
|
||||
return sessions.sort(
|
||||
(a, b) => new Date(b.updatedAt).getTime() - new Date(a.updatedAt).getTime()
|
||||
);
|
||||
}
|
||||
|
||||
async createSession(
|
||||
name: string,
|
||||
projectPath?: string,
|
||||
workingDirectory?: string
|
||||
): Promise<SessionMetadata> {
|
||||
const sessionId = this.generateId();
|
||||
const metadata = await this.loadMetadata();
|
||||
|
||||
const session: SessionMetadata = {
|
||||
id: sessionId,
|
||||
name,
|
||||
projectPath,
|
||||
workingDirectory: workingDirectory || projectPath || process.cwd(),
|
||||
createdAt: new Date().toISOString(),
|
||||
updatedAt: new Date().toISOString(),
|
||||
};
|
||||
|
||||
metadata[sessionId] = session;
|
||||
await this.saveMetadata(metadata);
|
||||
|
||||
return session;
|
||||
}
|
||||
|
||||
async updateSession(
|
||||
sessionId: string,
|
||||
updates: Partial<SessionMetadata>
|
||||
): Promise<SessionMetadata | null> {
|
||||
const metadata = await this.loadMetadata();
|
||||
if (!metadata[sessionId]) return null;
|
||||
|
||||
metadata[sessionId] = {
|
||||
...metadata[sessionId],
|
||||
...updates,
|
||||
updatedAt: new Date().toISOString(),
|
||||
};
|
||||
|
||||
await this.saveMetadata(metadata);
|
||||
return metadata[sessionId];
|
||||
}
|
||||
|
||||
async archiveSession(sessionId: string): Promise<boolean> {
|
||||
const result = await this.updateSession(sessionId, { archived: true });
|
||||
return result !== null;
|
||||
}
|
||||
|
||||
async unarchiveSession(sessionId: string): Promise<boolean> {
|
||||
const result = await this.updateSession(sessionId, { archived: false });
|
||||
return result !== null;
|
||||
}
|
||||
|
||||
async deleteSession(sessionId: string): Promise<boolean> {
|
||||
const metadata = await this.loadMetadata();
|
||||
if (!metadata[sessionId]) return false;
|
||||
|
||||
delete metadata[sessionId];
|
||||
await this.saveMetadata(metadata);
|
||||
|
||||
// Delete session file
|
||||
try {
|
||||
const sessionFile = path.join(this.stateDir, `${sessionId}.json`);
|
||||
await fs.unlink(sessionFile);
|
||||
} catch {
|
||||
// File may not exist
|
||||
}
|
||||
|
||||
// Clear from memory
|
||||
this.sessions.delete(sessionId);
|
||||
|
||||
return true;
|
||||
}
|
||||
|
||||
private emitAgentEvent(sessionId: string, data: Record<string, unknown>): void {
|
||||
this.events.emit("agent:stream", { sessionId, ...data });
|
||||
}
|
||||
|
||||
private getSystemPrompt(): string {
|
||||
return `You are an AI assistant helping users build software. You are part of the Automaker application,
|
||||
which is designed to help developers plan, design, and implement software projects autonomously.
|
||||
|
||||
**Feature Storage:**
|
||||
Features are stored in .automaker/features/{id}/feature.json - each feature has its own folder.
|
||||
Use the UpdateFeatureStatus tool to manage features, not direct file edits.
|
||||
|
||||
Your role is to:
|
||||
- Help users define their project requirements and specifications
|
||||
- Ask clarifying questions to better understand their needs
|
||||
- Suggest technical approaches and architectures
|
||||
- Guide them through the development process
|
||||
- Be conversational and helpful
|
||||
- Write, edit, and modify code files as requested
|
||||
- Execute commands and tests
|
||||
- Search and analyze the codebase
|
||||
|
||||
When discussing projects, help users think through:
|
||||
- Core functionality and features
|
||||
- Technical stack choices
|
||||
- Data models and architecture
|
||||
- User experience considerations
|
||||
- Testing strategies
|
||||
|
||||
You have full access to the codebase and can:
|
||||
- Read files to understand existing code
|
||||
- Write new files
|
||||
- Edit existing files
|
||||
- Run bash commands
|
||||
- Search for code patterns
|
||||
- Execute tests and builds`;
|
||||
}
|
||||
|
||||
private generateId(): string {
|
||||
return `msg_${Date.now()}_${Math.random().toString(36).substring(2, 11)}`;
|
||||
}
|
||||
}
|
||||
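A minimal usage sketch for AgentService, assuming an EventEmitter instance from ../lib/events.js and a writable data directory (both are wired up elsewhere in the server; the paths, session name, and message below are illustrative):

// Sketch only: dataDir and events come from the server bootstrap in a real run.
const agentService = new AgentService("/path/to/data-dir", events);
await agentService.initialize();

const meta = await agentService.createSession("My project chat", "/path/to/project");
await agentService.startConversation({ sessionId: meta.id, workingDirectory: meta.workingDirectory });
const result = await agentService.sendMessage({
  sessionId: meta.id,
  message: "Summarize the project structure",
});
console.log(result.message?.content);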
845  apps/server/src/services/auto-mode-service.ts  Normal file
@@ -0,0 +1,845 @@
|
||||
/**
|
||||
* Auto Mode Service - Autonomous feature implementation using Claude Agent SDK
|
||||
*
|
||||
* Manages:
|
||||
* - Worktree creation for isolated development
|
||||
* - Feature execution with Claude
|
||||
* - Concurrent execution with max concurrency limits
|
||||
* - Progress streaming via events
|
||||
* - Verification and merge workflows
|
||||
*/
|
||||
|
||||
import { query, AbortError, type Options } from "@anthropic-ai/claude-agent-sdk";
|
||||
import { exec } from "child_process";
|
||||
import { promisify } from "util";
|
||||
import path from "path";
|
||||
import fs from "fs/promises";
|
||||
import type { EventEmitter, EventType } from "../lib/events.js";
|
||||
|
||||
const execAsync = promisify(exec);
|
||||
|
||||
interface Feature {
|
||||
id: string;
|
||||
title: string;
|
||||
description: string;
|
||||
status: string;
|
||||
priority?: number;
|
||||
spec?: string;
|
||||
}
|
||||
|
||||
interface RunningFeature {
|
||||
featureId: string;
|
||||
projectPath: string;
|
||||
worktreePath: string | null;
|
||||
branchName: string | null;
|
||||
abortController: AbortController;
|
||||
isAutoMode: boolean;
|
||||
startTime: number;
|
||||
}
|
||||
|
||||
interface AutoModeConfig {
|
||||
maxConcurrency: number;
|
||||
useWorktrees: boolean;
|
||||
projectPath: string;
|
||||
}
|
||||
|
||||
export class AutoModeService {
|
||||
private events: EventEmitter;
|
||||
private runningFeatures = new Map<string, RunningFeature>();
|
||||
private autoLoopRunning = false;
|
||||
private autoLoopAbortController: AbortController | null = null;
|
||||
private config: AutoModeConfig | null = null;
|
||||
|
||||
constructor(events: EventEmitter) {
|
||||
this.events = events;
|
||||
}
|
||||
|
||||
/**
|
||||
* Start the auto mode loop - continuously picks and executes pending features
|
||||
*/
|
||||
async startAutoLoop(projectPath: string, maxConcurrency = 3): Promise<void> {
|
||||
if (this.autoLoopRunning) {
|
||||
throw new Error("Auto mode is already running");
|
||||
}
|
||||
|
||||
this.autoLoopRunning = true;
|
||||
this.autoLoopAbortController = new AbortController();
|
||||
this.config = {
|
||||
maxConcurrency,
|
||||
useWorktrees: true,
|
||||
projectPath,
|
||||
};
|
||||
|
||||
this.emitAutoModeEvent("auto_mode_complete", {
|
||||
message: `Auto mode started with max ${maxConcurrency} concurrent features`,
|
||||
projectPath,
|
||||
});
|
||||
|
||||
// Run the loop in the background
|
||||
this.runAutoLoop().catch((error) => {
|
||||
console.error("[AutoMode] Loop error:", error);
|
||||
this.emitAutoModeEvent("auto_mode_error", {
|
||||
error: error.message,
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
private async runAutoLoop(): Promise<void> {
|
||||
while (this.autoLoopRunning && this.autoLoopAbortController && !this.autoLoopAbortController.signal.aborted) {
|
||||
try {
|
||||
// Check if we have capacity
|
||||
if (this.runningFeatures.size >= (this.config?.maxConcurrency || 3)) {
|
||||
await this.sleep(5000);
|
||||
continue;
|
||||
}
|
||||
|
||||
// Load pending features
|
||||
const pendingFeatures = await this.loadPendingFeatures(this.config!.projectPath);
|
||||
|
||||
if (pendingFeatures.length === 0) {
|
||||
this.emitAutoModeEvent("auto_mode_complete", {
|
||||
message: "No pending features - auto mode idle",
|
||||
});
|
||||
await this.sleep(10000);
|
||||
continue;
|
||||
}
|
||||
|
||||
// Find a feature not currently running
|
||||
const nextFeature = pendingFeatures.find((f) => !this.runningFeatures.has(f.id));
|
||||
|
||||
if (nextFeature) {
|
||||
// Start feature execution in background
|
||||
this.executeFeature(
|
||||
this.config!.projectPath,
|
||||
nextFeature.id,
|
||||
this.config!.useWorktrees,
|
||||
true
|
||||
).catch((error) => {
|
||||
console.error(`[AutoMode] Feature ${nextFeature.id} error:`, error);
|
||||
});
|
||||
}
|
||||
|
||||
await this.sleep(2000);
|
||||
} catch (error) {
|
||||
console.error("[AutoMode] Loop iteration error:", error);
|
||||
await this.sleep(5000);
|
||||
}
|
||||
}
|
||||
|
||||
this.autoLoopRunning = false;
|
||||
this.emitAutoModeEvent("auto_mode_complete", {
|
||||
message: "Auto mode stopped",
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Stop the auto mode loop
|
||||
*/
|
||||
async stopAutoLoop(): Promise<number> {
|
||||
this.autoLoopRunning = false;
|
||||
if (this.autoLoopAbortController) {
|
||||
this.autoLoopAbortController.abort();
|
||||
this.autoLoopAbortController = null;
|
||||
}
|
||||
|
||||
return this.runningFeatures.size;
|
||||
}
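  // Sketch of driving the loop from the routes layer (names are illustrative; the actual
  // wiring lives in the server routes, not in this file):
  //   const autoMode = new AutoModeService(events);
  //   await autoMode.startAutoLoop("/path/to/project", 2); // at most 2 features in flight
  //   ...
  //   const stillRunning = await autoMode.stopAutoLoop();  // returns count of features still executing
  // Progress is reported through the auto_mode_* events emitted via emitAutoModeEvent().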
|
||||
|
||||
/**
|
||||
* Execute a single feature
|
||||
*/
|
||||
async executeFeature(
|
||||
projectPath: string,
|
||||
featureId: string,
|
||||
useWorktrees = true,
|
||||
isAutoMode = false
|
||||
): Promise<void> {
|
||||
if (this.runningFeatures.has(featureId)) {
|
||||
throw new Error(`Feature ${featureId} is already running`);
|
||||
}
|
||||
|
||||
const abortController = new AbortController();
|
||||
const branchName = `feature/${featureId}`;
|
||||
let worktreePath: string | null = null;
|
||||
|
||||
// Setup worktree if enabled
|
||||
if (useWorktrees) {
|
||||
worktreePath = await this.setupWorktree(projectPath, featureId, branchName);
|
||||
}
|
||||
|
||||
const workDir = worktreePath || projectPath;
|
||||
|
||||
this.runningFeatures.set(featureId, {
|
||||
featureId,
|
||||
projectPath,
|
||||
worktreePath,
|
||||
branchName,
|
||||
abortController,
|
||||
isAutoMode,
|
||||
startTime: Date.now(),
|
||||
});
|
||||
|
||||
// Emit feature start event
|
||||
this.emitAutoModeEvent("auto_mode_feature_start", {
|
||||
featureId,
|
||||
projectPath,
|
||||
feature: { id: featureId, title: "Loading...", description: "Feature is starting" },
|
||||
});
|
||||
|
||||
try {
|
||||
// Load feature details
|
||||
const feature = await this.loadFeature(projectPath, featureId);
|
||||
if (!feature) {
|
||||
throw new Error(`Feature ${featureId} not found`);
|
||||
}
|
||||
|
||||
// Update feature status to in_progress
|
||||
await this.updateFeatureStatus(projectPath, featureId, "in_progress");
|
||||
|
||||
// Build the prompt
|
||||
const prompt = this.buildFeaturePrompt(feature);
|
||||
|
||||
// Run the agent
|
||||
await this.runAgent(workDir, featureId, prompt, abortController);
|
||||
|
||||
// Mark as waiting_approval for user review
|
||||
await this.updateFeatureStatus(projectPath, featureId, "waiting_approval");
|
||||
|
||||
this.emitAutoModeEvent("auto_mode_feature_complete", {
|
||||
featureId,
|
||||
passes: true,
|
||||
message: `Feature completed in ${Math.round((Date.now() - this.runningFeatures.get(featureId)!.startTime) / 1000)}s`,
|
||||
projectPath,
|
||||
});
|
||||
} catch (error) {
|
||||
if (error instanceof AbortError || (error as Error)?.name === "AbortError") {
|
||||
this.emitAutoModeEvent("auto_mode_feature_complete", {
|
||||
featureId,
|
||||
passes: false,
|
||||
message: "Feature stopped by user",
|
||||
projectPath,
|
||||
});
|
||||
} else {
|
||||
const errorMessage = (error as Error).message || "Unknown error";
|
||||
const isAuthError = errorMessage.includes("Authentication failed") ||
|
||||
errorMessage.includes("Invalid API key") ||
|
||||
errorMessage.includes("authentication_failed");
|
||||
|
||||
console.error(`[AutoMode] Feature ${featureId} failed:`, error);
|
||||
await this.updateFeatureStatus(projectPath, featureId, "backlog");
|
||||
this.emitAutoModeEvent("auto_mode_error", {
|
||||
featureId,
|
||||
error: errorMessage,
|
||||
errorType: isAuthError ? "authentication" : "execution",
|
||||
projectPath,
|
||||
});
|
||||
}
|
||||
} finally {
|
||||
this.runningFeatures.delete(featureId);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Stop a specific feature
|
||||
*/
|
||||
async stopFeature(featureId: string): Promise<boolean> {
|
||||
const running = this.runningFeatures.get(featureId);
|
||||
if (!running) {
|
||||
return false;
|
||||
}
|
||||
|
||||
running.abortController.abort();
|
||||
return true;
|
||||
}
|
||||
|
||||
/**
|
||||
* Resume a feature (continues from saved context)
|
||||
*/
|
||||
async resumeFeature(
|
||||
projectPath: string,
|
||||
featureId: string,
|
||||
useWorktrees = true
|
||||
): Promise<void> {
|
||||
// Check if context exists
|
||||
const contextPath = path.join(
|
||||
projectPath,
|
||||
".automaker",
|
||||
"features",
|
||||
featureId,
|
||||
"agent-output.md"
|
||||
);
|
||||
|
||||
let hasContext = false;
|
||||
try {
|
||||
await fs.access(contextPath);
|
||||
hasContext = true;
|
||||
} catch {
|
||||
// No context
|
||||
}
|
||||
|
||||
if (hasContext) {
|
||||
// Load previous context and continue
|
||||
const context = await fs.readFile(contextPath, "utf-8");
|
||||
return this.executeFeatureWithContext(projectPath, featureId, context, useWorktrees);
|
||||
}
|
||||
|
||||
// No context, start fresh
|
||||
return this.executeFeature(projectPath, featureId, useWorktrees, false);
|
||||
}
|
||||
|
||||
/**
|
||||
* Follow up on a feature with additional instructions
|
||||
*/
|
||||
async followUpFeature(
|
||||
projectPath: string,
|
||||
featureId: string,
|
||||
prompt: string,
|
||||
imagePaths?: string[]
|
||||
): Promise<void> {
|
||||
if (this.runningFeatures.has(featureId)) {
|
||||
throw new Error(`Feature ${featureId} is already running`);
|
||||
}
|
||||
|
||||
const abortController = new AbortController();
|
||||
|
||||
// Check if worktree exists
|
||||
const worktreePath = path.join(projectPath, ".automaker", "worktrees", featureId);
|
||||
let workDir = projectPath;
|
||||
|
||||
try {
|
||||
await fs.access(worktreePath);
|
||||
workDir = worktreePath;
|
||||
} catch {
|
||||
// No worktree, use project path
|
||||
}
|
||||
|
||||
this.runningFeatures.set(featureId, {
|
||||
featureId,
|
||||
projectPath,
|
||||
worktreePath: workDir !== projectPath ? worktreePath : null,
|
||||
branchName: `feature/${featureId}`,
|
||||
abortController,
|
||||
isAutoMode: false,
|
||||
startTime: Date.now(),
|
||||
});
|
||||
|
||||
this.emitAutoModeEvent("auto_mode_feature_start", {
|
||||
featureId,
|
||||
projectPath,
|
||||
feature: { id: featureId, title: "Follow-up", description: prompt.substring(0, 100) },
|
||||
});
|
||||
|
||||
try {
|
||||
await this.runAgent(workDir, featureId, prompt, abortController, imagePaths);
|
||||
|
||||
this.emitAutoModeEvent("auto_mode_feature_complete", {
|
||||
featureId,
|
||||
passes: true,
|
||||
message: "Follow-up completed successfully",
|
||||
projectPath,
|
||||
});
|
||||
} catch (error) {
|
||||
if (!(error instanceof AbortError)) {
|
||||
this.emitAutoModeEvent("auto_mode_error", {
|
||||
featureId,
|
||||
error: (error as Error).message,
|
||||
projectPath,
|
||||
});
|
||||
}
|
||||
} finally {
|
||||
this.runningFeatures.delete(featureId);
|
||||
}
|
||||
}

  /**
   * Verify a feature's implementation
   */
  async verifyFeature(projectPath: string, featureId: string): Promise<boolean> {
    const worktreePath = path.join(projectPath, ".automaker", "worktrees", featureId);
    let workDir = projectPath;

    try {
      await fs.access(worktreePath);
      workDir = worktreePath;
    } catch {
      // No worktree
    }

    // Run verification - check if tests pass, build works, etc.
    const verificationChecks = [
      { cmd: "npm run lint", name: "Lint" },
      { cmd: "npm run typecheck", name: "Type check" },
      { cmd: "npm test", name: "Tests" },
      { cmd: "npm run build", name: "Build" },
    ];

    let allPassed = true;
    const results: Array<{ check: string; passed: boolean; output?: string }> = [];

    for (const check of verificationChecks) {
      try {
        const { stdout, stderr } = await execAsync(check.cmd, {
          cwd: workDir,
          timeout: 120000,
        });
        results.push({ check: check.name, passed: true, output: stdout || stderr });
      } catch (error) {
        allPassed = false;
        results.push({
          check: check.name,
          passed: false,
          output: (error as Error).message,
        });
        break; // Stop on first failure
      }
    }

    this.emitAutoModeEvent("auto_mode_feature_complete", {
      featureId,
      passes: allPassed,
      message: allPassed
        ? "All verification checks passed"
        : `Verification failed: ${results.find(r => !r.passed)?.check || "Unknown"}`,
    });

    return allPassed;
  }

  /**
   * Commit feature changes
   */
  async commitFeature(projectPath: string, featureId: string): Promise<string | null> {
    const worktreePath = path.join(projectPath, ".automaker", "worktrees", featureId);
    let workDir = projectPath;

    try {
      await fs.access(worktreePath);
      workDir = worktreePath;
    } catch {
      // No worktree
    }

    try {
      // Check for changes
      const { stdout: status } = await execAsync("git status --porcelain", { cwd: workDir });
      if (!status.trim()) {
        return null; // No changes
      }

      // Load feature for commit message
      const feature = await this.loadFeature(projectPath, featureId);
      const commitMessage = feature
        ? `feat: ${feature.title}\n\nImplemented by Automaker auto-mode`
        : `feat: Feature ${featureId}`;

      // Stage and commit
      await execAsync("git add -A", { cwd: workDir });
      await execAsync(`git commit -m "${commitMessage.replace(/"/g, '\\"')}"`, {
        cwd: workDir,
      });

      // Get commit hash
      const { stdout: hash } = await execAsync("git rev-parse HEAD", { cwd: workDir });

      this.emitAutoModeEvent("auto_mode_feature_complete", {
        featureId,
        passes: true,
        message: `Changes committed: ${hash.trim().substring(0, 8)}`,
      });

      return hash.trim();
    } catch (error) {
      console.error(`[AutoMode] Commit failed for ${featureId}:`, error);
      return null;
    }
  }

  /**
   * Check if context exists for a feature
   */
  async contextExists(projectPath: string, featureId: string): Promise<boolean> {
    const contextPath = path.join(
      projectPath,
      ".automaker",
      "features",
      featureId,
      "agent-output.md"
    );

    try {
      await fs.access(contextPath);
      return true;
    } catch {
      return false;
    }
  }

  /**
   * Analyze project to gather context
   */
  async analyzeProject(projectPath: string): Promise<void> {
    const abortController = new AbortController();

    const analysisFeatureId = `analysis-${Date.now()}`;
    this.emitAutoModeEvent("auto_mode_feature_start", {
      featureId: analysisFeatureId,
      projectPath,
      feature: { id: analysisFeatureId, title: "Project Analysis", description: "Analyzing project structure" },
    });

    const prompt = `Analyze this project and provide a summary of:
1. Project structure and architecture
2. Main technologies and frameworks used
3. Key components and their responsibilities
4. Build and test commands
5. Any existing conventions or patterns

Format your response as a structured markdown document.`;

    try {
      const options: Options = {
        model: "claude-sonnet-4-20250514",
        maxTurns: 5,
        cwd: projectPath,
        allowedTools: ["Read", "Glob", "Grep"],
        permissionMode: "acceptEdits",
        abortController,
      };

      const stream = query({ prompt, options });
      let analysisResult = "";

      for await (const msg of stream) {
        if (msg.type === "assistant" && msg.message.content) {
          for (const block of msg.message.content) {
            if (block.type === "text") {
              analysisResult = block.text;
              this.emitAutoModeEvent("auto_mode_progress", {
                featureId: analysisFeatureId,
                content: block.text,
                projectPath,
              });
            }
          }
        } else if (msg.type === "result" && msg.subtype === "success") {
          analysisResult = msg.result || analysisResult;
        }
      }

      // Save analysis
      const analysisPath = path.join(projectPath, ".automaker", "project-analysis.md");
      await fs.mkdir(path.dirname(analysisPath), { recursive: true });
      await fs.writeFile(analysisPath, analysisResult);

      this.emitAutoModeEvent("auto_mode_feature_complete", {
        featureId: analysisFeatureId,
        passes: true,
        message: "Project analysis completed",
        projectPath,
      });
    } catch (error) {
      this.emitAutoModeEvent("auto_mode_error", {
        featureId: analysisFeatureId,
        error: (error as Error).message,
        projectPath,
      });
    }
  }

  /**
   * Get current status
   */
  getStatus(): {
    isRunning: boolean;
    autoLoopRunning: boolean;
    runningFeatures: string[];
    runningCount: number;
  } {
    return {
      isRunning: this.autoLoopRunning || this.runningFeatures.size > 0,
      autoLoopRunning: this.autoLoopRunning,
      runningFeatures: Array.from(this.runningFeatures.keys()),
      runningCount: this.runningFeatures.size,
    };
  }

  // Private helpers

  private async setupWorktree(
    projectPath: string,
    featureId: string,
    branchName: string
  ): Promise<string> {
    const worktreesDir = path.join(projectPath, ".automaker", "worktrees");
    const worktreePath = path.join(worktreesDir, featureId);

    await fs.mkdir(worktreesDir, { recursive: true });

    // Check if worktree already exists
    try {
      await fs.access(worktreePath);
      return worktreePath;
    } catch {
      // Create new worktree
    }

    // Create branch if it doesn't exist
    try {
      await execAsync(`git branch ${branchName}`, { cwd: projectPath });
    } catch {
      // Branch may already exist
    }

    // Create worktree
    try {
      await execAsync(`git worktree add "${worktreePath}" ${branchName}`, {
        cwd: projectPath,
      });
    } catch (error) {
      // Worktree creation failed, fall back to direct execution
      console.error(`[AutoMode] Worktree creation failed:`, error);
      return projectPath;
    }

    return worktreePath;
  }

  private async loadFeature(projectPath: string, featureId: string): Promise<Feature | null> {
    const featurePath = path.join(
      projectPath,
      ".automaker",
      "features",
      featureId,
      "feature.json"
    );

    try {
      const data = await fs.readFile(featurePath, "utf-8");
      return JSON.parse(data);
    } catch {
      return null;
    }
  }

  private async updateFeatureStatus(
    projectPath: string,
    featureId: string,
    status: string
  ): Promise<void> {
    const featurePath = path.join(
      projectPath,
      ".automaker",
      "features",
      featureId,
      "feature.json"
    );

    try {
      const data = await fs.readFile(featurePath, "utf-8");
      const feature = JSON.parse(data);
      feature.status = status;
      feature.updatedAt = new Date().toISOString();
      await fs.writeFile(featurePath, JSON.stringify(feature, null, 2));
    } catch {
      // Feature file may not exist
    }
  }

  private async loadPendingFeatures(projectPath: string): Promise<Feature[]> {
    const featuresDir = path.join(projectPath, ".automaker", "features");

    try {
      const entries = await fs.readdir(featuresDir, { withFileTypes: true });
      const features: Feature[] = [];

      for (const entry of entries) {
        if (entry.isDirectory()) {
          const featurePath = path.join(featuresDir, entry.name, "feature.json");
          try {
            const data = await fs.readFile(featurePath, "utf-8");
            const feature = JSON.parse(data);
            if (feature.status === "pending" || feature.status === "ready") {
              features.push(feature);
            }
          } catch {
            // Skip invalid features
          }
        }
      }

      // Sort by priority
      return features.sort((a, b) => (a.priority || 999) - (b.priority || 999));
    } catch {
      return [];
    }
  }

  private buildFeaturePrompt(feature: Feature): string {
    let prompt = `## Feature Implementation Task

**Feature ID:** ${feature.id}
**Title:** ${feature.title}
**Description:** ${feature.description}
`;

    if (feature.spec) {
      prompt += `
**Specification:**
${feature.spec}
`;
    }

    prompt += `
## Instructions

Implement this feature by:
1. First, explore the codebase to understand the existing structure
2. Plan your implementation approach
3. Write the necessary code changes
4. Add or update tests as needed
5. Ensure the code follows existing patterns and conventions

When done, summarize what you implemented and any notes for the developer.`;

    return prompt;
  }

  private async runAgent(
    workDir: string,
    featureId: string,
    prompt: string,
    abortController: AbortController,
    imagePaths?: string[]
  ): Promise<void> {
    const options: Options = {
      model: "claude-opus-4-5-20251101",
      maxTurns: 50,
      cwd: workDir,
      allowedTools: [
        "Read",
        "Write",
        "Edit",
        "Glob",
        "Grep",
        "Bash",
      ],
      permissionMode: "acceptEdits",
      sandbox: {
        enabled: true,
        autoAllowBashIfSandboxed: true,
      },
      abortController,
    };

    // Build prompt - include image paths for the agent to read
    let finalPrompt = prompt;

    if (imagePaths && imagePaths.length > 0) {
      finalPrompt = `${prompt}\n\n## Reference Images\nThe following images are available for reference. Use the Read tool to view them:\n${imagePaths.map((p) => `- ${p}`).join("\n")}`;
    }

    const stream = query({ prompt: finalPrompt, options });
    let responseText = "";
    const outputPath = path.join(workDir, ".automaker", "features", featureId, "agent-output.md");

    for await (const msg of stream) {
      if (msg.type === "assistant" && msg.message.content) {
        for (const block of msg.message.content) {
          if (block.type === "text") {
            responseText = block.text;

            // Check for authentication errors in the response
            if (block.text.includes("Invalid API key") ||
                block.text.includes("authentication_failed") ||
                block.text.includes("Fix external API key")) {
              throw new Error(
                "Authentication failed: Invalid or expired API key. " +
                "Please check your ANTHROPIC_API_KEY or run 'claude login' to re-authenticate."
              );
            }

            this.emitAutoModeEvent("auto_mode_progress", {
              featureId,
              content: block.text,
            });
          } else if (block.type === "tool_use") {
            this.emitAutoModeEvent("auto_mode_tool", {
              featureId,
              tool: block.name,
              input: block.input,
            });
          }
        }
      } else if (msg.type === "assistant" && (msg as { error?: string }).error === "authentication_failed") {
        // Handle authentication error from the SDK
        throw new Error(
          "Authentication failed: Invalid or expired API key. " +
          "Please set a valid ANTHROPIC_API_KEY environment variable or run 'claude login' to authenticate."
        );
      } else if (msg.type === "result" && msg.subtype === "success") {
        // Check if result indicates an error
        if (msg.is_error && msg.result?.includes("Invalid API key")) {
          throw new Error(
            "Authentication failed: Invalid or expired API key. " +
            "Please set a valid ANTHROPIC_API_KEY environment variable or run 'claude login' to authenticate."
          );
        }
        responseText = msg.result || responseText;
      }
    }

    // Save agent output
    try {
      await fs.mkdir(path.dirname(outputPath), { recursive: true });
      await fs.writeFile(outputPath, responseText);
    } catch {
      // May fail if directory doesn't exist
    }
  }

  private async executeFeatureWithContext(
    projectPath: string,
    featureId: string,
    context: string,
    useWorktrees: boolean
  ): Promise<void> {
    const feature = await this.loadFeature(projectPath, featureId);
    if (!feature) {
      throw new Error(`Feature ${featureId} not found`);
    }

    const prompt = `## Continuing Feature Implementation

${this.buildFeaturePrompt(feature)}

## Previous Context
The following is the output from a previous implementation attempt. Continue from where you left off:

${context}

## Instructions
Review the previous work and continue the implementation. If the feature appears complete, verify it works correctly.`;

    // NOTE: the assembled context prompt is not passed along here; execution
    // currently restarts through executeFeature rather than resuming with it.
    return this.executeFeature(projectPath, featureId, useWorktrees, false);
  }

  /**
   * Emit an auto-mode event wrapped in the correct format for the client.
   * All auto-mode events are sent as type "auto-mode:event" with the actual
   * event type and data in the payload.
   */
  private emitAutoModeEvent(
    eventType: string,
    data: Record<string, unknown>
  ): void {
    // Wrap the event in auto-mode:event format expected by the client
    this.events.emit("auto-mode:event", {
      type: eventType,
      ...data,
    });
  }

  private sleep(ms: number): Promise<void> {
    return new Promise((resolve) => setTimeout(resolve, ms));
  }
}
410  apps/server/src/services/feature-loader.ts  Normal file
@@ -0,0 +1,410 @@
/**
 * Feature Loader - Handles loading and managing features from individual feature folders
 * Each feature is stored in .automaker/features/{featureId}/feature.json
 */

import path from "path";
import fs from "fs/promises";

export interface Feature {
  id: string;
  category: string;
  description: string;
  steps?: string[];
  passes?: boolean;
  priority?: number;
  imagePaths?: Array<string | { path: string; [key: string]: unknown }>;
  [key: string]: unknown;
}

export class FeatureLoader {
  /**
   * Get the features directory path
   */
  getFeaturesDir(projectPath: string): string {
    return path.join(projectPath, ".automaker", "features");
  }

  /**
   * Get the images directory path for a feature
   */
  getFeatureImagesDir(projectPath: string, featureId: string): string {
    return path.join(this.getFeatureDir(projectPath, featureId), "images");
  }

  /**
   * Delete images that were removed from a feature
   */
  private async deleteOrphanedImages(
    projectPath: string,
    oldPaths: Array<string | { path: string; [key: string]: unknown }> | undefined,
    newPaths: Array<string | { path: string; [key: string]: unknown }> | undefined
  ): Promise<void> {
    if (!oldPaths || oldPaths.length === 0) {
      return;
    }

    // Build sets of paths for comparison
    const oldPathSet = new Set(
      oldPaths.map((p) => (typeof p === "string" ? p : p.path))
    );
    const newPathSet = new Set(
      (newPaths || []).map((p) => (typeof p === "string" ? p : p.path))
    );

    // Find images that were removed
    for (const oldPath of oldPathSet) {
      if (!newPathSet.has(oldPath)) {
        try {
          const fullPath = path.isAbsolute(oldPath)
            ? oldPath
            : path.join(projectPath, oldPath);

          await fs.unlink(fullPath);
          console.log(`[FeatureLoader] Deleted orphaned image: ${oldPath}`);
        } catch (error) {
          // Ignore errors when deleting (file may already be gone)
          console.warn(`[FeatureLoader] Failed to delete image: ${oldPath}`, error);
        }
      }
    }
  }

  /**
   * Copy images from temp directory to feature directory and update paths
   */
  private async migrateImages(
    projectPath: string,
    featureId: string,
    imagePaths?: Array<string | { path: string; [key: string]: unknown }>
  ): Promise<Array<string | { path: string; [key: string]: unknown }> | undefined> {
    if (!imagePaths || imagePaths.length === 0) {
      return imagePaths;
    }

    const featureImagesDir = this.getFeatureImagesDir(projectPath, featureId);
    await fs.mkdir(featureImagesDir, { recursive: true });

    const updatedPaths: Array<string | { path: string; [key: string]: unknown }> = [];

    for (const imagePath of imagePaths) {
      try {
        const originalPath = typeof imagePath === "string" ? imagePath : imagePath.path;

        // Skip if already in feature directory
        if (originalPath.includes(`/features/${featureId}/images/`)) {
          updatedPaths.push(imagePath);
          continue;
        }

        // Resolve the full path
        const fullOriginalPath = path.isAbsolute(originalPath)
          ? originalPath
          : path.join(projectPath, originalPath);

        // Check if file exists
        try {
          await fs.access(fullOriginalPath);
        } catch {
          console.warn(`[FeatureLoader] Image not found, skipping: ${fullOriginalPath}`);
          continue;
        }

        // Get filename and create new path
        const filename = path.basename(originalPath);
        const newPath = path.join(featureImagesDir, filename);
        const relativePath = `.automaker/features/${featureId}/images/${filename}`;

        // Copy the file
        await fs.copyFile(fullOriginalPath, newPath);
        console.log(`[FeatureLoader] Copied image: ${originalPath} -> ${relativePath}`);

        // Try to delete the original temp file
        try {
          await fs.unlink(fullOriginalPath);
        } catch {
          // Ignore errors when deleting temp file
        }

        // Update the path in the result
        if (typeof imagePath === "string") {
          updatedPaths.push(relativePath);
        } else {
          updatedPaths.push({ ...imagePath, path: relativePath });
        }
      } catch (error) {
        console.error(`[FeatureLoader] Failed to migrate image:`, error);
        // Keep original path if migration fails
        updatedPaths.push(imagePath);
      }
    }

    return updatedPaths;
  }

  /**
   * Get the path to a specific feature folder
   */
  getFeatureDir(projectPath: string, featureId: string): string {
    return path.join(this.getFeaturesDir(projectPath), featureId);
  }

  /**
   * Get the path to a feature's feature.json file
   */
  getFeatureJsonPath(projectPath: string, featureId: string): string {
    return path.join(this.getFeatureDir(projectPath, featureId), "feature.json");
  }

  /**
   * Get the path to a feature's agent-output.md file
   */
  getAgentOutputPath(projectPath: string, featureId: string): string {
    return path.join(this.getFeatureDir(projectPath, featureId), "agent-output.md");
  }

  /**
   * Generate a new feature ID
   */
  generateFeatureId(): string {
    return `feature-${Date.now()}-${Math.random().toString(36).substring(2, 11)}`;
  }

  /**
   * Get all features for a project
   */
  async getAll(projectPath: string): Promise<Feature[]> {
    try {
      const featuresDir = this.getFeaturesDir(projectPath);

      // Check if features directory exists
      try {
        await fs.access(featuresDir);
      } catch {
        return [];
      }

      // Read all feature directories
      const entries = await fs.readdir(featuresDir, { withFileTypes: true });
      const featureDirs = entries.filter((entry) => entry.isDirectory());

      // Load each feature
      const features: Feature[] = [];
      for (const dir of featureDirs) {
        const featureId = dir.name;
        const featureJsonPath = this.getFeatureJsonPath(projectPath, featureId);

        try {
          const content = await fs.readFile(featureJsonPath, "utf-8");
          const feature = JSON.parse(content);

          if (!feature.id) {
            console.warn(
              `[FeatureLoader] Feature ${featureId} missing required 'id' field, skipping`
            );
            continue;
          }

          features.push(feature);
        } catch (error) {
          if ((error as NodeJS.ErrnoException).code === "ENOENT") {
            continue;
          } else if (error instanceof SyntaxError) {
            console.warn(
              `[FeatureLoader] Failed to parse feature.json for ${featureId}: ${error.message}`
            );
          } else {
            console.error(
              `[FeatureLoader] Failed to load feature ${featureId}:`,
              (error as Error).message
            );
          }
        }
      }

      // Sort by creation order (feature IDs contain timestamp)
      features.sort((a, b) => {
        const aTime = a.id ? parseInt(a.id.split("-")[1] || "0") : 0;
        const bTime = b.id ? parseInt(b.id.split("-")[1] || "0") : 0;
        return aTime - bTime;
      });

      return features;
    } catch (error) {
      console.error("[FeatureLoader] Failed to get all features:", error);
      return [];
    }
  }

  /**
   * Get a single feature by ID
   */
  async get(projectPath: string, featureId: string): Promise<Feature | null> {
    try {
      const featureJsonPath = this.getFeatureJsonPath(projectPath, featureId);
      const content = await fs.readFile(featureJsonPath, "utf-8");
      return JSON.parse(content);
    } catch (error) {
      if ((error as NodeJS.ErrnoException).code === "ENOENT") {
        return null;
      }
      console.error(`[FeatureLoader] Failed to get feature ${featureId}:`, error);
      throw error;
    }
  }

  /**
   * Create a new feature
   */
  async create(projectPath: string, featureData: Partial<Feature>): Promise<Feature> {
    const featureId = featureData.id || this.generateFeatureId();
    const featureDir = this.getFeatureDir(projectPath, featureId);
    const featureJsonPath = this.getFeatureJsonPath(projectPath, featureId);

    // Ensure features directory exists
    const featuresDir = this.getFeaturesDir(projectPath);
    await fs.mkdir(featuresDir, { recursive: true });

    // Create feature directory
    await fs.mkdir(featureDir, { recursive: true });

    // Migrate images from temp directory to feature directory
    const migratedImagePaths = await this.migrateImages(
      projectPath,
      featureId,
      featureData.imagePaths
    );

    // Ensure feature has required fields
    const feature: Feature = {
      category: featureData.category || "Uncategorized",
      description: featureData.description || "",
      ...featureData,
      id: featureId,
      imagePaths: migratedImagePaths,
    };

    // Write feature.json
    await fs.writeFile(featureJsonPath, JSON.stringify(feature, null, 2), "utf-8");

    console.log(`[FeatureLoader] Created feature ${featureId}`);
    return feature;
  }

  /**
   * Update a feature (partial updates supported)
   */
  async update(
    projectPath: string,
    featureId: string,
    updates: Partial<Feature>
  ): Promise<Feature> {
    const feature = await this.get(projectPath, featureId);
    if (!feature) {
      throw new Error(`Feature ${featureId} not found`);
    }

    // Handle image path changes
    let updatedImagePaths = updates.imagePaths;
    if (updates.imagePaths !== undefined) {
      // Delete orphaned images (images that were removed)
      await this.deleteOrphanedImages(
        projectPath,
        feature.imagePaths,
        updates.imagePaths
      );

      // Migrate any new images
      updatedImagePaths = await this.migrateImages(
        projectPath,
        featureId,
        updates.imagePaths
      );
    }

    // Merge updates
    const updatedFeature: Feature = {
      ...feature,
      ...updates,
      ...(updatedImagePaths !== undefined ? { imagePaths: updatedImagePaths } : {}),
    };

    // Write back to file
    const featureJsonPath = this.getFeatureJsonPath(projectPath, featureId);
    await fs.writeFile(
      featureJsonPath,
      JSON.stringify(updatedFeature, null, 2),
      "utf-8"
    );

    console.log(`[FeatureLoader] Updated feature ${featureId}`);
    return updatedFeature;
  }

  /**
   * Delete a feature
   */
  async delete(projectPath: string, featureId: string): Promise<boolean> {
    try {
      const featureDir = this.getFeatureDir(projectPath, featureId);
      await fs.rm(featureDir, { recursive: true, force: true });
      console.log(`[FeatureLoader] Deleted feature ${featureId}`);
      return true;
    } catch (error) {
      console.error(`[FeatureLoader] Failed to delete feature ${featureId}:`, error);
      return false;
    }
  }

  /**
   * Get agent output for a feature
   */
  async getAgentOutput(
    projectPath: string,
    featureId: string
  ): Promise<string | null> {
    try {
      const agentOutputPath = this.getAgentOutputPath(projectPath, featureId);
      const content = await fs.readFile(agentOutputPath, "utf-8");
      return content;
    } catch (error) {
      if ((error as NodeJS.ErrnoException).code === "ENOENT") {
        return null;
      }
      console.error(
        `[FeatureLoader] Failed to get agent output for ${featureId}:`,
        error
      );
      throw error;
    }
  }

  /**
   * Save agent output for a feature
   */
  async saveAgentOutput(
    projectPath: string,
    featureId: string,
    content: string
  ): Promise<void> {
    const featureDir = this.getFeatureDir(projectPath, featureId);
    await fs.mkdir(featureDir, { recursive: true });

    const agentOutputPath = this.getAgentOutputPath(projectPath, featureId);
    await fs.writeFile(agentOutputPath, content, "utf-8");
  }

  /**
   * Delete agent output for a feature
   */
  async deleteAgentOutput(projectPath: string, featureId: string): Promise<void> {
    try {
      const agentOutputPath = this.getAgentOutputPath(projectPath, featureId);
      await fs.unlink(agentOutputPath);
    } catch (error) {
      if ((error as NodeJS.ErrnoException).code !== "ENOENT") {
        throw error;
      }
    }
  }
}
20  apps/server/tsconfig.json  Normal file
@@ -0,0 +1,20 @@
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "lib": ["ES2022"],
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "resolveJsonModule": true,
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}
40  docker-compose.yml  Normal file
@@ -0,0 +1,40 @@
# Automaker Docker Compose
# For self-hosting the Automaker backend server

services:
  server:
    build:
      context: .
      dockerfile: apps/server/Dockerfile
    container_name: automaker-server
    restart: unless-stopped
    ports:
      - "3008:3008"
    environment:
      # Required
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}

      # Optional - authentication (leave empty to disable)
      - AUTOMAKER_API_KEY=${AUTOMAKER_API_KEY:-}

      # Optional - restrict to specific directories (comma-separated)
      - ALLOWED_PROJECT_DIRS=${ALLOWED_PROJECT_DIRS:-/projects}

      # Optional - data directory for sessions, etc.
      - DATA_DIR=/data

      # Optional - CORS origin (default allows all)
      - CORS_ORIGIN=${CORS_ORIGIN:-*}

      # Optional - additional API keys
      - OPENAI_API_KEY=${OPENAI_API_KEY:-}
      - GOOGLE_API_KEY=${GOOGLE_API_KEY:-}
    volumes:
      # Persist data between restarts
      - automaker-data:/data

      # Mount your projects directory (read-write access)
      - ${PROJECTS_DIR:-./projects}:/projects

volumes:
  automaker-data:
2769  package-lock.json  generated
File diff suppressed because it is too large
@@ -13,9 +13,13 @@
    "dev:electron:debug": "npm run dev:electron:debug --workspace=apps/app",
    "dev:electron:wsl": "npm run dev:electron:wsl --workspace=apps/app",
    "dev:electron:wsl:gpu": "npm run dev:electron:wsl:gpu --workspace=apps/app",
    "dev:server": "npm run dev --workspace=apps/server",
    "dev:full": "concurrently \"npm run dev:server\" \"npm run dev:web\"",
    "build": "npm run build --workspace=apps/app",
    "build:server": "npm run build --workspace=apps/server",
    "build:electron": "npm run build:electron --workspace=apps/app",
    "start": "npm run start --workspace=apps/app",
    "start:server": "npm run start --workspace=apps/server",
    "lint": "npm run lint --workspace=apps/app",
    "test": "npm run test --workspace=apps/app",
    "test:headed": "npm run test:headed --workspace=apps/app",
267  plan.md  Normal file
@@ -0,0 +1,267 @@
# Plan: Full Web Support for Automaker

## Goal
Make the app work fully in web browsers while keeping Electron support. Web mode connects to a backend server (self-hosted or cloud). Electron embeds the same server locally.

## Architecture

```
┌─────────────────────────────────────┐
│          Next.js Frontend           │
│       (same code both modes)        │
└───────────────┬─────────────────────┘
                │
    ┌───────────┴───────────┐
    │                       │
[Web Mode]            [Electron Mode]
    │                       │
HTTP/WebSocket        HTTP/WebSocket
to remote server      to localhost:3008
    │                       │
    └───────────┬───────────┘
                │
┌───────────────▼─────────────────────┐
│          Backend Server             │
│          (apps/server)              │
│  - Express + WebSocket              │
│  - All services from electron/      │
│  - Claude Agent SDK                 │
│  - File ops, Git, PTY               │
└─────────────────────────────────────┘
```

**Key insight**: Electron uses the same HTTP API; it just connects to localhost instead of a remote server.
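
A minimal sketch of how the frontend could pick its base URL under that model (illustrative only; the real client may resolve this differently, and the `isElectron` flag is the one the preload script is planned to expose):

```
// Sketch only: resolve the server URL once; the same HTTP client works in both modes.
export function resolveServerUrl(): string {
  // Electron mode: the shell spawns the server locally on port 3008.
  if (
    typeof window !== "undefined" &&
    (window as unknown as { isElectron?: boolean }).isElectron
  ) {
    return "http://localhost:3008";
  }
  // Web mode: point at the self-hosted or cloud server.
  return process.env.NEXT_PUBLIC_SERVER_URL || "http://localhost:3008";
}
```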
|
||||
---
|
||||
|
||||
## New Package: `apps/server`
|
||||
|
||||
```
|
||||
apps/server/
|
||||
├── package.json
|
||||
├── src/
|
||||
│ ├── index.ts # Express server entry
|
||||
│ ├── routes/
|
||||
│ │ ├── fs.ts # File system routes
|
||||
│ │ ├── agent.ts # Agent routes
|
||||
│ │ ├── sessions.ts # Session routes
|
||||
│ │ ├── auto-mode.ts # Auto mode routes
|
||||
│ │ ├── features.ts # Features routes
|
||||
│ │ ├── worktree.ts # Git worktree routes
|
||||
│ │ ├── setup.ts # Setup/config routes
|
||||
│ │ └── suggestions.ts # Feature suggestions routes
|
||||
│ ├── services/ # Moved from electron/services/
|
||||
│ │ ├── agent-service.ts
|
||||
│ │ ├── auto-mode-service.ts
|
||||
│ │ ├── worktree-manager.ts
|
||||
│ │ ├── feature-loader.ts
|
||||
│ │ ├── feature-executor.ts
|
||||
│ │ └── ...
|
||||
│ └── lib/
|
||||
│ ├── events.ts # Event emitter for streaming
|
||||
│ └── security.ts # Path validation
|
||||
```
|
||||
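
For orientation, a rough sketch of what `src/index.ts` might contain (Express plus the `/api/events` WebSocket endpoint); the wiring in the actual `apps/server` package may differ, and the `./lib/events` import is assumed to export a shared Node `EventEmitter`:

```
// Sketch only: Express app plus a WebSocket endpoint that fans out server events.
import express from "express";
import http from "http";
import { WebSocketServer } from "ws";
import { events } from "./lib/events"; // assumed: a shared Node EventEmitter

const app = express();
app.use(express.json());

// Phase 1 health check.
app.get("/api/health", (_req, res) => res.json({ ok: true }));

const server = http.createServer(app);
const wss = new WebSocketServer({ server, path: "/api/events" });

wss.on("connection", (socket) => {
  // Forward internal auto-mode events to the connected client.
  const forward = (payload: unknown) =>
    socket.send(JSON.stringify({ type: "auto-mode:event", payload }));
  events.on("auto-mode:event", forward);
  socket.on("close", () => events.off("auto-mode:event", forward));
});

server.listen(Number(process.env.PORT) || 3008, () => {
  console.log(`Automaker server listening on port ${process.env.PORT || 3008}`);
});
```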

---

## Critical Files to Modify

| File | Change |
|------|--------|
| `apps/app/src/lib/electron.ts` | Add `HttpApiClient` class that implements `ElectronAPI` using fetch/WebSocket |
| `apps/app/electron/main.js` | Simplify to: spawn server + create window (remove 1500+ lines of IPC handlers) |
| `apps/app/electron/preload.js` | Simplify to just expose `isElectron` flag |
| `apps/app/package.json` | Remove server-side deps (Claude SDK, pty) |
| Root `package.json` | Add `apps/server` workspace |
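
To make the `HttpApiClient` row concrete, here is a sketch of one method mapped onto the HTTP API (one representative call only; the `x-api-key` header name and the response shape are assumptions, not the finished client):

```
// Sketch only: one representative method; the real client covers the full ElectronAPI surface.
export class HttpApiClient {
  constructor(private baseUrl: string, private apiKey?: string) {}

  private async post<T>(route: string, body: unknown): Promise<T> {
    const res = await fetch(`${this.baseUrl}${route}`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        ...(this.apiKey ? { "x-api-key": this.apiKey } : {}),
      },
      body: JSON.stringify(body),
    });
    if (!res.ok) throw new Error(`Request failed with status ${res.status}`);
    return (await res.json()) as T;
  }

  // Replaces the former fs:readFile IPC call.
  readFile(filePath: string): Promise<{ content: string }> {
    return this.post("/api/fs/read", { filePath });
  }
}
```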

---

## Implementation Phases

### Phase 1: Create Server Package (Foundation)
1. Create `apps/server` with Express + TypeScript setup
2. Add health check endpoint: `GET /api/health`
3. Copy one simple service (feature-loader) and create route
4. Test with curl/Postman

### Phase 2: File System API
1. Create `POST /api/fs/read`, `POST /api/fs/write`, etc.
2. Add path security (allowlist validation)
3. Update `electron.ts` with `HttpApiClient` for fs operations
4. Test: file operations work in web mode

### Phase 3: Agent API with Streaming
1. Add WebSocket server for events (`/api/events`)
2. Migrate `agent-service.js` to TypeScript
3. Create routes: `POST /api/agent/send`, etc.
4. Events stream via WebSocket instead of IPC
5. Test: chat works in web mode

### Phase 4: Sessions & Features API
1. Migrate session management routes
2. Migrate features CRUD routes
3. Test: project/feature management works

### Phase 5: Auto Mode & Worktree
1. Migrate `auto-mode-service.js` (complex - has streaming)
2. Migrate `worktree-manager.js`
3. Test: auto mode runs features in web

### Phase 6: Remaining Services
1. Spec regeneration
2. Feature suggestions
3. Setup/CLI detection
4. Model provider checks

### Phase 7: Simplify Electron
1. Update `main.js` to spawn server process + create window
2. Remove all IPC handlers
3. Electron app uses HTTP like web
4. Test: Electron still works

### Phase 8: Production Ready
1. Add authentication (API key header); a middleware sketch follows this list
2. Configure CORS for production
3. Add `ALLOWED_PROJECT_DIRS` env for security
4. Docker setup for deployment
5. Update build scripts
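
For the authentication step in Phase 8, the middleware can be as small as this sketch; it assumes the key arrives in an `x-api-key` header and that `AUTOMAKER_API_KEY` holds the expected value, matching `docker-compose.yml`:

```
// Sketch only: reject requests unless they carry the configured API key.
import type { Request, Response, NextFunction } from "express";

export function apiKeyAuth(req: Request, res: Response, next: NextFunction): void {
  const expected = process.env.AUTOMAKER_API_KEY;
  // Auth is optional: leaving AUTOMAKER_API_KEY empty disables the check.
  if (!expected || req.header("x-api-key") === expected) {
    next();
    return;
  }
  res.status(401).json({ error: "Invalid or missing API key" });
}
```

It would typically be mounted with `app.use("/api", apiKeyAuth)` before the routes.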

---

## API Design Pattern

Convert IPC handlers to REST:

```
IPC: dialog:openDirectory  →  Web: User types path, POST /api/fs/validate
IPC: fs:readFile           →  POST /api/fs/read { filePath }
IPC: agent:send            →  POST /api/agent/send { sessionId, message, ... }
IPC: auto-mode:start       →  POST /api/auto-mode/start { projectPath }
IPC: features:getAll       →  GET /api/projects/:path/features
```
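
As an example of the `POST /api/fs/read` shape combined with the `ALLOWED_PROJECT_DIRS` allowlist, a sketch (the helper names are illustrative, not the shipped route):

```
// Sketch only: read a file after checking it lives under an allowed project directory.
import path from "path";
import fs from "fs/promises";
import { Router } from "express";

const allowedDirs = (process.env.ALLOWED_PROJECT_DIRS || "")
  .split(",")
  .map((dir) => dir.trim())
  .filter(Boolean);

function isAllowed(filePath: string): boolean {
  const resolved = path.resolve(filePath);
  return allowedDirs.some((dir) => resolved.startsWith(path.resolve(dir) + path.sep));
}

export const fsRouter = Router();

fsRouter.post("/read", async (req, res) => {
  const { filePath } = req.body as { filePath?: string };
  if (!filePath || !isAllowed(filePath)) {
    res.status(403).json({ error: "Path not allowed" });
    return;
  }
  try {
    const content = await fs.readFile(filePath, "utf-8");
    res.json({ content });
  } catch {
    res.status(404).json({ error: "File not found" });
  }
});
```

Mounted under `/api/fs`, this yields the `POST /api/fs/read` route from the mapping above.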

Streaming via WebSocket:

```
ws://server/api/events

Events: agent:stream, auto-mode:event, suggestions:event
```
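
On the client side, subscribing to that stream might look roughly like this (the payload shape is an assumption, and `resolveServerUrl` refers to the earlier sketch):

```
// Sketch only: subscribe to streamed events in the browser.
const ws = new WebSocket(resolveServerUrl().replace(/^http/, "ws") + "/api/events");

ws.onmessage = (event) => {
  const { type, payload } = JSON.parse(String(event.data));
  if (type === "auto-mode:event") {
    // e.g. update the Kanban board as features start, progress, and complete
    console.log("auto-mode event", payload);
  }
};
```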

---

## Web-Specific Handling

| Feature | Electron | Web |
|---------|----------|-----|
| File picker | Native dialog | Text input + server validation |
| Open link | shell.openExternal | window.open() |
| Data directory | app.getPath('userData') | Server's DATA_DIR env |

---

## Configuration

**Server `.env`:**
```
PORT=3008
DATA_DIR=/path/to/data
ANTHROPIC_API_KEY=xxx
ALLOWED_PROJECT_DIRS=/home/user/projects
```

**Frontend `.env.local`:**
```
NEXT_PUBLIC_SERVER_URL=http://localhost:3008
```

---

## Estimated Scope

- New files: ~15-20 (server package)
- Modified files: ~5 (electron.ts, main.js, preload.js, package.jsons)
- Deleted lines: ~1500 (IPC handlers from main.js)
- Services to migrate: ~10

---

## Implementation Status

### ✅ ALL PHASES COMPLETE

- [x] **Phase 1**: Server package foundation (`apps/server`)
  - Express server with WebSocket support
  - Event emitter for streaming
  - Security module for path validation
  - Health check endpoint

- [x] **Phase 2**: HttpApiClient in frontend
  - `apps/app/src/lib/http-api-client.ts` - full implementation
  - Modified `electron.ts` to use HTTP client when not in Electron
  - No mocks - all calls go through HTTP

- [x] **Phase 3**: Agent API with streaming
  - `apps/server/src/services/agent-service.ts`
  - `apps/server/src/routes/agent.ts`
  - WebSocket streaming for responses

- [x] **Phase 4**: Sessions & Features API
  - `apps/server/src/routes/sessions.ts`
  - `apps/server/src/services/feature-loader.ts`
  - `apps/server/src/routes/features.ts`

- [x] **Phase 5**: Auto Mode & Worktree
  - `apps/server/src/services/auto-mode-service.ts` - full implementation with Claude SDK
  - `apps/server/src/routes/auto-mode.ts`
  - `apps/server/src/routes/worktree.ts`
  - `apps/server/src/routes/git.ts`

- [x] **Phase 6**: Remaining services
  - `apps/server/src/routes/setup.ts` - CLI detection, API keys, platform info
  - `apps/server/src/routes/suggestions.ts` - AI-powered feature suggestions
  - `apps/server/src/routes/spec-regeneration.ts` - spec generation from overview
  - `apps/server/src/routes/models.ts` - model providers and availability
  - `apps/server/src/routes/running-agents.ts` - active agent tracking

- [x] **Phase 7**: Simplify Electron
  - `apps/app/electron/main-simplified.js` - spawns server, minimal IPC
  - `apps/app/electron/preload-simplified.js` - only native features exposed
  - Updated `electron.ts` to detect simplified mode
  - Updated `http-api-client.ts` to use native dialogs when available

- [x] **Phase 8**: Production ready
  - `apps/server/src/lib/auth.ts` - API key authentication middleware
  - `apps/server/Dockerfile` - multi-stage Docker build
  - `docker-compose.yml` - easy deployment configuration
  - `apps/server/.env.example` - documented configuration

---

## Additional Fixes Applied

### State Persistence
- Features now cached in localStorage via Zustand persist middleware (see the sketch below)
- Board view properly handles API failures by keeping cached data
- Theme and UI state properly persisted across refreshes
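
The localStorage caching is the standard Zustand `persist` pattern; a sketch with an illustrative store shape and storage key (the real store and its `Feature` typing live in the app code):

```
// Sketch only: a features store whose state survives page refreshes via localStorage.
import { create } from "zustand";
import { persist } from "zustand/middleware";

interface FeaturesState {
  features: unknown[]; // the real store uses the app's Feature type
  setFeatures: (features: unknown[]) => void;
}

export const useFeaturesStore = create<FeaturesState>()(
  persist(
    (set) => ({
      features: [],
      setFeatures: (features) => set({ features }),
    }),
    { name: "automaker-features" } // localStorage key (name is illustrative)
  )
);
```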

### Authentication Display
- Server now returns proper auth method names: `oauth_token_env`, `oauth_token`, `api_key_env`, `api_key`
- Settings view displays correct auth source (OAuth token, API key, subscription, etc.)
- Added support for Codex subscription detection
- Fixed "Unknown method" display issue

### Bug Fixes
- Fixed board-view.tsx crash when feature status is unknown (defaults to backlog)
- Removed "Mock IPC" label from web mode indicator
- Fixed unused imports and dependency warnings
- Updated API key authentication header support in HTTP client

---

## Summary

The architecture is simple: **one backend server, two ways to access it** (web browser or Electron shell).

- **Web users**: Connect the browser to your cloud-hosted server
- **Electron users**: The app spawns the server locally and connects to localhost
- **Same codebase**: Frontend code unchanged, backend services extracted to a standalone server