Here's a summary of potential additions and adjustments:
**New Technical Epics/Stories (or significant additions to existing ones):**
1. **New Technical Stories (or Sub-tasks under Epic 1): Workflow Orchestration Setup**
- **Goal:** Implement the core workflow orchestration mechanism.
- **Stories/Tasks:**
- **Story: Define and Implement `workflow_runs` Table and `WorkflowTrackerService`**
- Acceptance Criteria:
- Supabase migration created for the `workflow_runs` table as defined in the architecture document.
- `WorkflowTrackerService` implemented in `supabase/functions/_shared/` with methods for initiating runs, updating step details, incrementing counters, and failing or completing workflow runs (a service sketch follows this list).
- Service includes robust error handling and logging via Pino.
- Unit tests for `WorkflowTrackerService` achieve >80% coverage.
- **Story: Implement `CheckWorkflowCompletionService` (Supabase Cron Function)**
- Acceptance Criteria:
- Supabase Function `check-workflow-completion-service` created.
- Function queries `workflow_runs` and related tables to determine if a workflow run is ready to progress to the next major stage (e.g., from summarization to newsletter generation, or from podcast initiation to delivery).
- Function correctly updates `workflow_runs.status` and invokes the next appropriate service function (e.g., `NewsletterGenerationService`) via a Supabase database webhook trigger or direct HTTP call if preferred.
- Logic for handling podcast link availability (delay/retry/timeout before sending email) is implemented here or in conjunction with `NewsletterGenerationService`.
- The function is configurable to be run periodically via Vercel Cron Jobs (or `pg_cron`).
- Comprehensive logging implemented using Pino.
- Unit tests achieve >80% coverage.
- **Story: Implement Workflow Status API Endpoint (`/api/system/workflow-status/{jobId}`)**
- Acceptance Criteria:
- Next.js API Route Handler created at `/api/system/workflow-status/{jobId}`.
- Endpoint secured with API Key authentication.
- Retrieves and returns status details from the `workflow_runs` table for the given `jobId`.
- Handles cases where `jobId` is not found (404).
- Unit and integration tests for the API endpoint.
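To make the intended service surface concrete, here is a minimal sketch of the `workflow_runs` row shape and `WorkflowTrackerService`. The status values, method names, and the `details` shape are illustrative assumptions, not taken from the architecture document, and the import specifiers are simplified:

```typescript
// Hypothetical sketch -- not the architecture document's actual definition.
import { SupabaseClient } from "@supabase/supabase-js";
import pino from "pino";

// Assumed status values; the real set comes from the workflow design.
export type WorkflowStatus =
  | "pending"
  | "fetching_content"
  | "summarizing"
  | "generating_newsletter"
  | "generating_podcast"
  | "completed"
  | "failed";

export interface WorkflowRun {
  id: string;                       // the jobId returned by the trigger API
  status: WorkflowStatus;
  details: Record<string, unknown>; // free-form counters, e.g. posts_fetched
  error_message: string | null;
  created_at: string;
  updated_at: string;
}

export class WorkflowTrackerService {
  constructor(
    private db: SupabaseClient,
    private logger = pino({ name: "WorkflowTrackerService" }),
  ) {}

  async initiateRun(): Promise<string> {
    const { data, error } = await this.db
      .from("workflow_runs")
      .insert({ status: "pending", details: {} })
      .select("id")
      .single();
    if (error) throw error;
    this.logger.info({ workflow_run_id: data.id }, "Workflow run initiated");
    return data.id;
  }

  async updateStatus(runId: string, status: WorkflowStatus): Promise<void> {
    const { error } = await this.db
      .from("workflow_runs")
      .update({ status, updated_at: new Date().toISOString() })
      .eq("id", runId);
    if (error) throw error;
  }

  async failRun(runId: string, message: string): Promise<void> {
    this.logger.error({ workflow_run_id: runId, message }, "Workflow failed");
    await this.db
      .from("workflow_runs")
      .update({ status: "failed", error_message: message })
      .eq("id", runId);
  }
}
```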
2. **New Technical Story (under Epic 3: AI-Powered Content Summarization): Implement LLM Facade and Configuration**
- **Goal:** Create a flexible interface for interacting with different LLM providers for summarization (a facade sketch follows this item).
- **Story: Design and Implement `LLMFacade`**
- Acceptance Criteria:
- `LLMFacade` interface and concrete implementations (e.g., `OllamaAdapter`, `RemoteLLMApiAdapter`) created in `supabase/functions/_shared/llm-facade.ts`.
- Factory function implemented to select the LLM adapter based on environment variables (`LLM_PROVIDER_TYPE`, `OLLAMA_API_URL`, `REMOTE_LLM_API_KEY`, `REMOTE_LLM_API_URL`, `LLM_MODEL_NAME`).
- The adapters handle making requests to their respective LLM APIs and parsing responses.
- Error handling and retry logic for transient API errors implemented within the facade.
- Unit tests for the facade and adapters (mocking actual HTTP calls) achieve >80% coverage.
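A minimal sketch of what the `LLMFacade` contract and factory could look like, using the environment variables named above. The method name `generateSummary`, the Ollama request shape, and especially the remote-API request/response fields are assumptions to be adjusted to the real providers:

```typescript
// supabase/functions/_shared/llm-facade.ts -- hypothetical sketch.
export interface LLMFacade {
  generateSummary(prompt: string, text: string): Promise<string>;
}

class OllamaAdapter implements LLMFacade {
  constructor(private apiUrl: string, private model: string) {}

  async generateSummary(prompt: string, text: string): Promise<string> {
    // Ollama's /api/generate endpoint; non-streaming for simplicity.
    const res = await fetch(`${this.apiUrl}/api/generate`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model: this.model,
        prompt: `${prompt}\n\n${text}`,
        stream: false,
      }),
    });
    if (!res.ok) throw new Error(`Ollama error: ${res.status}`);
    const json = await res.json();
    return json.response;
  }
}

class RemoteLLMApiAdapter implements LLMFacade {
  constructor(
    private apiUrl: string,
    private apiKey: string,
    private model: string,
  ) {}

  async generateSummary(prompt: string, text: string): Promise<string> {
    // The remote API's request/response shape is an assumption; adjust to
    // the real provider.
    const res = await fetch(this.apiUrl, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${this.apiKey}`,
      },
      body: JSON.stringify({ model: this.model, prompt, input: text }),
    });
    if (!res.ok) throw new Error(`Remote LLM error: ${res.status}`);
    const json = await res.json();
    return json.output;
  }
}

// Factory: selects the adapter based on environment variables.
export function createLLMFacade(env = Deno.env): LLMFacade {
  const model = env.get("LLM_MODEL_NAME") ?? "llama3";
  switch (env.get("LLM_PROVIDER_TYPE")) {
    case "ollama":
      return new OllamaAdapter(env.get("OLLAMA_API_URL")!, model);
    case "remote":
      return new RemoteLLMApiAdapter(
        env.get("REMOTE_LLM_API_URL")!,
        env.get("REMOTE_LLM_API_KEY")!,
        model,
      );
    default:
      throw new Error("Unknown LLM_PROVIDER_TYPE");
  }
}
```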
3. **New Technical Story (or Sub-task under relevant Epics): Implement Facades for External Services**
- **Goal:** Encapsulate all external service interactions.
- **Stories/Tasks:**
- **Story: Implement `HNAlgoliaFacade`** (for HN Content Service)
- **Story: Implement `PlayHTFacade`** (for Podcast Generation Service)
- **Story: Implement `NodemailerFacade`** (for Newsletter Generation Service)
- Acceptance Criteria (General for each facade):
- Facade created in `supabase/functions/_shared/`.
- Handles API authentication, request formation, and response parsing for the specific service.
- Implements basic retry logic for transient errors (see the shared helper sketch after this list).
- Unit tests (mocking actual HTTP/library calls) achieve >80% coverage.
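Since each facade's criteria call for basic retry logic on transient errors, a small shared helper in `supabase/functions/_shared/` could be reused by all of them. The helper name, backoff schedule, and attempt count below are assumptions:

```typescript
// Hypothetical shared retry helper: exponential backoff over a small, fixed
// number of attempts. What counts as "transient" (thrown fetch/network
// errors) is an assumption; facades may also want to inspect HTTP status.
export async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i === attempts - 1) break; // out of attempts; rethrow below
      // Back off: 500ms, 1000ms, 2000ms, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
    }
  }
  throw lastError;
}

// Usage inside a facade method, e.g. PlayHTFacade:
// const res = await withRetry(() => fetch(url, init));
```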
**Adjustments to Existing Epics/Stories:**
- **Epic 1: Project Initialization, Setup, and HN Content Acquisition**
- **Story 1.3 (API and CLI trigger):**
- Modify AC: "The API endpoint (`/api/system/trigger-workflow`) creates an entry in the `workflow_runs` table and returns the `jobId`." (A route-handler sketch follows this epic's stories.)
- Add AC: "The API endpoint is secured via an API key."
- Add AC: "The CLI command invokes the `/api/system/trigger-workflow` endpoint or directly interacts with `WorkflowTrackerService` to start a new workflow run."
- Add AC: "All interactions with the API or CLI that initiate a workflow must record the `workflow_run_id` in logs."
- **Story 1.4 (Retrieve HN Posts & Comments):**
- Modify AC: "Retrieved data (posts and comments) is stored in Supabase database, linked to the current `workflow_run_id`."
- Add AC: "Upon completion, the service updates the `workflow_runs` table with status and details (e.g., number of posts fetched) via `WorkflowTrackerService`."
- Add AC: "For each new `hn_posts` record, a database event/webhook is configured to trigger the Article Scraping Service (Epic 2), passing `hn_post_id` and `workflow_run_id`." (A consumer-side sketch of this webhook follows Epic 2's stories.)
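For the trigger endpoint, a sketch of a Next.js App Router handler is shown below. It assumes the `WorkflowTrackerService` sketched earlier; the `x-api-key` header and `WORKFLOW_TRIGGER_API_KEY` variable names are placeholders:

```typescript
// app/api/system/trigger-workflow/route.ts -- hypothetical sketch.
import { NextRequest, NextResponse } from "next/server";
import { createClient } from "@supabase/supabase-js";
// Import path is illustrative; per the architecture the service lives in
// supabase/functions/_shared/.
import { WorkflowTrackerService } from "@/lib/workflow-tracker-service";

export async function POST(req: NextRequest) {
  // AC: endpoint is secured via an API key (header name is an assumption).
  if (req.headers.get("x-api-key") !== process.env.WORKFLOW_TRIGGER_API_KEY) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  const supabase = createClient(
    process.env.SUPABASE_URL!,
    process.env.SUPABASE_SERVICE_ROLE_KEY!,
  );
  const tracker = new WorkflowTrackerService(supabase);

  // Creates the workflow_runs row; downstream services are kicked off by
  // database webhooks/triggers, not by this handler.
  const jobId = await tracker.initiateRun();
  return NextResponse.json({ jobId }, { status: 202 });
}
```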
- **Epic 2: Article Scraping**
- **Story 2.2 & 2.3 (Scrape & Store):**
- Modify AC: "Scraped article content is stored in the `scraped_articles` table, linked to the `hn_post_id` and the current `workflow_run_id`."
- Add AC: "The `scraping_status` and any `error_message` are recorded in the `scraped_articles` table."
- Add AC: "Upon completion of scraping an article (success or failure), the service updates the `workflow_runs.details` (e.g., incrementing scraped counts) via `WorkflowTrackerService`."
- Add AC: "For each successfully scraped article, a database event/webhook is configured to trigger the Summarization Service (Epic 3) for that article, passing `scraped_article_id` and `workflow_run_id`."
- **Story 2.4 (Trigger scraping via API/CLI):** This story might become redundant if the main workflow trigger (Story 1.3) handles the entire pipeline initiation. If retained for isolated testing, ensure it also works with the `workflow_run_id` concept.
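On the consuming side, the Article Scraping Service would receive the standard Supabase database-webhook payload (`type`, `table`, `record`, ...). A sketch, with assumed `hn_posts` column names:

```typescript
// supabase/functions/article-scraping-service/index.ts -- hypothetical
// sketch of consuming the database webhook fired on hn_posts INSERTs.
// The payload envelope follows Supabase's standard database-webhook format;
// the hn_posts column names are assumptions.
interface HNPostWebhookPayload {
  type: "INSERT";
  table: string;
  record: {
    id: string;              // hn_post_id
    url: string | null;
    workflow_run_id: string; // propagated so every step can log/track the run
  };
}

Deno.serve(async (req) => {
  const payload: HNPostWebhookPayload = await req.json();
  const { id: hnPostId, url, workflow_run_id } = payload.record;

  if (!url) {
    // Nothing to scrape (e.g., an Ask HN post); record the outcome and exit.
    return new Response("skipped", { status: 200 });
  }

  // ... scrape `url`, write to scraped_articles with hn_post_id and
  // workflow_run_id, then update workflow_runs.details via
  // WorkflowTrackerService ...
  return new Response("ok", { status: 200 });
});
```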
- **Epic 3: AI-Powered Content Summarization**
- **Story 3.1 (Integrate AI summarization):** Refine to "Integrate with `LLMFacade` for summarization."
- **Story 3.2 & 3.3 (Generate Summaries):**
- Modify AC: "Summaries are stored in `article_summaries` or `comment_summaries` tables, linked to the `scraped_article_id` (for articles) or `hn_post_id` (for comments), and the current `workflow_run_id`."
- Add AC: "The service retrieves prompts from the `summarization_prompts` table." (A sketch of this flow follows this epic's stories.)
- Add AC: "Upon completion of each summarization task, the service updates `workflow_runs.details` (e.g., incrementing summaries generated counts) via `WorkflowTrackerService`."
- Add AC (Implicit): The `CheckWorkflowCompletionService` will monitor these tables to determine when all summarization for a `workflow_run_id` is complete.
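A sketch of the summarization core tying these criteria together: fetch the active prompt from `summarization_prompts`, call the `LLMFacade`, and persist the result. Column names such as `prompt_text` and `is_active` are assumptions:

```typescript
// Hypothetical core of the Summarization Service for one article.
import { createClient } from "@supabase/supabase-js";
import { createLLMFacade } from "../_shared/llm-facade.ts";

export async function summarizeArticle(
  scrapedArticleId: string,
  workflowRunId: string,
  articleText: string,
) {
  const supabase = createClient(
    Deno.env.get("SUPABASE_URL")!,
    Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")!,
  );

  // AC: prompts come from the summarization_prompts table, not from code.
  const { data: prompt, error } = await supabase
    .from("summarization_prompts")
    .select("prompt_text")
    .eq("is_active", true)
    .single();
  if (error) throw error;

  const llm = createLLMFacade();
  const summary = await llm.generateSummary(prompt.prompt_text, articleText);

  // AC: summary is linked to both the article and the workflow run.
  await supabase.from("article_summaries").insert({
    scraped_article_id: scrapedArticleId,
    workflow_run_id: workflowRunId,
    summary_text: summary,
  });
}
```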
- **Epic 4: Automated Newsletter Creation and Distribution**
- **Story 4.1 & 4.2 (Retrieve template & Generate Newsletter):**
- Modify AC: "The `NewsletterGenerationService` is triggered by the `CheckWorkflowCompletionService` when all summaries for a `workflow_run_id` are ready."
- Add AC: "The service retrieves the newsletter template from `newsletter_templates` table and summaries associated with the `workflow_run_id`."
- Modify AC: "Generated newsletter is stored in the `newsletters` table, linked to the `workflow_run_id`."
- Add AC: "The service updates `workflow_runs.status` to 'generating_podcast' (or similar) after initiating podcast generation."
- **Story 4.3 (Send newsletter):**
- Modify AC: "The `NewsletterGenerationService` (specifically, its delivery part) is triggered by `CheckWorkflowCompletionService` once the podcast link is available for the `workflow_run_id` (or timeout/failure occurred for podcast step)."
- Modify AC: "Implements conditional logic for podcast link inclusion and handles delay/retry as per PRD, coordinated by `CheckWorkflowCompletionService`." (A decision-logic sketch follows this epic's stories.)
- Add AC: "Updates `newsletters.delivery_status` and `workflow_runs.status` to 'completed' or 'failed' via `WorkflowTrackerService`."
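The delay/retry/timeout coordination for the podcast link could reduce to a small pure decision function evaluated on each cron tick of `CheckWorkflowCompletionService`. The timeout value and column names below are assumptions; the PRD governs the actual policy:

```typescript
// Hypothetical "send with or without podcast link" decision.
const PODCAST_LINK_TIMEOUT_MS = 30 * 60 * 1000; // assumed: 30 minutes

type Decision = "wait" | "send_with_podcast" | "send_without_podcast";

export function decideDelivery(newsletter: {
  podcast_status: string;
  podcast_url: string | null;
  podcast_requested_at: string;
}): Decision {
  if (newsletter.podcast_status === "completed" && newsletter.podcast_url) {
    return "send_with_podcast";
  }
  const elapsed = Date.now() - Date.parse(newsletter.podcast_requested_at);
  if (
    newsletter.podcast_status === "failed" ||
    elapsed > PODCAST_LINK_TIMEOUT_MS
  ) {
    // Podcast step failed or timed out: deliver the newsletter without it.
    return "send_without_podcast";
  }
  // Still within the window: let the next cron tick re-check.
  return "wait";
}
```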
- **Epic 5: Podcast Generation Integration**
- **Story 5.1, 5.2, 5.3 (Integrate Play.ht, Send Content, Webhook Handler):**
- Modify AC: "The `PodcastGenerationService` is invoked by `NewsletterGenerationService` (or `CheckWorkflowCompletionService`) for a specific `workflow_run_id` and `newsletter_id`."
- Modify AC: "The `podcast_playht_job_id` and `podcast_status` are stored in the `newsletters` table."
- Modify AC: "The `PlayHTWebhookHandlerAPI` (`/api/webhooks/playht`) updates the `newsletters` table with the podcast URL and status."
- Add AC: "The `PlayHTWebhookHandlerAPI` also updates `workflow_runs.details` with the podcast status via `WorkflowTrackerService` for the relevant `workflow_run_id` (looked up from the `newsletter_id` or `playht_job_id`)." (A handler sketch follows this list.)
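A sketch of the `PlayHTWebhookHandlerAPI` route, including the `workflow_run_id` lookup via the `newsletters` row. The Play.ht payload field names are assumptions; verify them against the Play.ht webhook documentation:

```typescript
// app/api/webhooks/playht/route.ts -- hypothetical sketch.
import { NextRequest, NextResponse } from "next/server";
import { createClient } from "@supabase/supabase-js";

export async function POST(req: NextRequest) {
  // Field names (id, audioUrl, status) are assumed, not confirmed.
  const { id: playhtJobId, audioUrl, status } = await req.json();

  const supabase = createClient(
    process.env.SUPABASE_URL!,
    process.env.SUPABASE_SERVICE_ROLE_KEY!,
  );

  // Update the newsletter row and recover the workflow_run_id from it,
  // since the webhook only carries the Play.ht job id.
  const { data: newsletter, error } = await supabase
    .from("newsletters")
    .update({ podcast_url: audioUrl, podcast_status: status })
    .eq("podcast_playht_job_id", playhtJobId)
    .select("workflow_run_id")
    .single();
  if (error) {
    return NextResponse.json({ error: "Unknown job id" }, { status: 404 });
  }

  // Record podcast progress on the run; a real implementation would go
  // through WorkflowTrackerService and merge into details, not overwrite.
  await supabase
    .from("workflow_runs")
    .update({ details: { podcast_status: status } })
    .eq("id", newsletter.workflow_run_id);

  return NextResponse.json({ ok: true });
}
```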
**General Considerations for all Epics:**
- **Error Handling & Logging:** Reinforce that all services must implement robust error handling, log extensively using Pino (including the `workflow_run_id`), and update `workflow_runs` appropriately on success or failure via `WorkflowTrackerService` (a child-logger sketch follows this list).
- **Database Webhook/Trigger Configuration:** Add technical tasks/stories related to setting up the actual Supabase database webhooks (e.g., `pg_net` calls) or triggers that connect the pipeline steps (e.g., `hn_posts` insert triggering `ArticleScrapingService`). This is a crucial implementation detail of the event-driven flow.
- **Environment Variable Management:** A story to create and document `docs/environment-vars.md` and set up `.env.example`.
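For the logging consideration above, one lightweight convention is a Pino child logger bound to the run, so every log line automatically carries the `workflow_run_id` (the logger name and helper are arbitrary choices):

```typescript
// Hypothetical logging convention: bind workflow_run_id once via a Pino
// child logger so every subsequent log line carries it automatically.
import pino from "pino";

const baseLogger = pino({ name: "podcast-pipeline" });

export function loggerForRun(workflowRunId: string) {
  return baseLogger.child({ workflow_run_id: workflowRunId });
}

// Usage inside any service:
// const log = loggerForRun(workflowRunId);
// log.info({ posts_fetched: 30 }, "HN content acquisition complete");
// log.error({ err }, "Scraping failed"); // still tagged with workflow_run_id
```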