# BMad DiCaster Architecture Document
## Introduction / Preamble
This document outlines the overall project architecture for BMad DiCaster, including backend systems, shared services, and non-UI specific concerns. Its primary goal is to serve as the guiding architectural blueprint for AI-driven development, ensuring consistency and adherence to chosen patterns and technologies.
**Relationship to Frontend Architecture:**
This project includes a significant user interface. A separate Frontend Architecture Document (expected to be named `frontend-architecture.md` and linked in "Key Reference Documents" once created) will detail the frontend-specific design and MUST be used in conjunction with this document. Core technology stack choices documented herein (see "Definitive Tech Stack Selections") are definitive for the entire project, including any frontend components.
## Table of Contents
1. [Introduction / Preamble](#introduction--preamble)
2. [Table of Contents](#table-of-contents)
3. [Technical Summary](#technical-summary)
4. [High-Level Overview](#high-level-overview)
5. [Component View](#component-view)
- [Architectural / Design Patterns Adopted](#architectural--design-patterns-adopted)
6. [Workflow Orchestration and Status Management](#workflow-orchestration-and-status-management)
7. [Project Structure](#project-structure)
- [Key Directory Descriptions](#key-directory-descriptions)
- [Monorepo Management](#monorepo-management)
- [Notes](#notes)
8. [API Reference](#api-reference)
- [External APIs Consumed](#external-apis-consumed)
- [Internal APIs Provided (by BMad DiCaster)](#internal-apis-provided-by-bmad-dicaster)
9. [Data Models](#data-models)
- [Core Application Entities / Domain Objects](#core-application-entities--domain-objects)
- [Database Schemas (Supabase PostgreSQL)](#database-schemas-supabase-postgresql)
10. [Core Workflow / Sequence Diagrams](#core-workflow--sequence-diagrams)
- [1. Daily Workflow Initiation & HN Content Acquisition](#1-daily-workflow-initiation--hn-content-acquisition)
- [2. Article Scraping & Summarization Flow](#2-article-scraping--summarization-flow)
- [3. Newsletter, Podcast, and Delivery Flow](#3-newsletter-podcast-and-delivery-flow)
11. [Definitive Tech Stack Selections](#definitive-tech-stack-selections)
12. [Infrastructure and Deployment Overview](#infrastructure-and-deployment-overview)
13. [Error Handling Strategy](#error-handling-strategy)
14. [Coding Standards](#coding-standards)
- [Detailed Language & Framework Conventions](#detailed-language--framework-conventions)
15. [Overall Testing Strategy](#overall-testing-strategy)
16. [Security Best Practices](#security-best-practices)
17. [Key Reference Documents](#key-reference-documents)
18. [Change Log](#change-log)
19. [Prompt for Design Architect: Frontend Architecture Definition](#prompt-for-design-architect-frontend-architecture-definition)
## Technical Summary
BMad DiCaster is a web application designed to provide daily, concise summaries of top Hacker News (HN) posts, delivered as an HTML newsletter and an optional AI-generated podcast, accessible via a Next.js web interface. The system employs a serverless, event-driven architecture hosted on Vercel, with Supabase providing PostgreSQL database services and function hosting. Key components include services for HN content retrieval, article scraping (using Cheerio), AI-powered summarization (via a configurable LLM facade for Ollama/remote APIs), podcast generation (Play.ht), newsletter generation (Nodemailer), and workflow orchestration. The architecture emphasizes modularity, clear separation of concerns (pragmatic hexagonal approach for complex functions), and robust error handling, aiming for efficient development, particularly by AI developer agents.
## High-Level Overview
The BMad DiCaster application will adopt a **serverless, event-driven architecture** hosted entirely on Vercel, with Supabase providing backend services (database and functions). The project will be structured as a **monorepo**, containing both the Next.js frontend application and the backend Supabase functions.
The core data processing flow is designed as an event-driven pipeline:
1. A scheduled mechanism (Vercel Cron Job) or manual trigger (API/CLI) initiates the daily workflow, creating a `workflow_run` job.
2. Hacker News posts and comments are retrieved (HN Algolia API) and stored in Supabase.
3. This data insertion triggers a Supabase function (via database webhook) to scrape linked articles.
4. Successful article scraping and storage trigger further Supabase functions for AI-powered summarization of articles and comments.
5. The completion of summarization steps for a workflow run is tracked, and once all prerequisites are met, a newsletter generation service is triggered.
6. The newsletter content is sent to the Play.ht API to generate a podcast.
7. Play.ht calls a webhook to notify our system when the podcast is ready, providing the podcast URL.
8. The newsletter data in Supabase is updated with the podcast URL.
9. The newsletter is then delivered to subscribers via Nodemailer, after considering podcast availability (with delay/retry logic).
10. The Next.js frontend allows users to view current and past newsletters and listen to the podcasts.
This event-driven approach, using Supabase Database Webhooks (via `pg_net` or native functionality) to trigger Vercel-hosted Supabase Functions, aims to create a resilient and scalable system. It mitigates potential timeout issues by breaking down long-running processes into smaller, asynchronously triggered units.
Below is a system context diagram illustrating the primary services and user interactions:
```mermaid
graph TD
User[Developer/Admin] -- "Triggers Daily Workflow (API/CLI/Cron)" --> BMadDiCasterBE[BMad DiCaster Backend Logic]
UserWeb[End User] -- "Accesses Web Interface" --> BMadDiCasterFE["BMad DiCaster Frontend (Next.js on Vercel)"]
BMadDiCasterFE -- "Displays Data From" --> SupabaseDB[Supabase PostgreSQL]
BMadDiCasterFE -- "Interacts With for Data/Triggers" --> SupabaseFunctions[Supabase Functions on Vercel]
subgraph "BMad DiCaster Backend Logic (Supabase Functions & Vercel)"
direction LR
SupabaseFunctions
HNAPI[Hacker News Algolia API]
ArticleScraper[Article Scraper Service]
Summarizer["Summarization Service (LLM Facade)"]
PlayHTAPI[Play.ht API]
NewsletterService[Newsletter Generation & Delivery Service]
Nodemailer[Nodemailer Service]
end
BMadDiCasterBE --> SupabaseDB
SupabaseFunctions -- "Fetches HN Data" --> HNAPI
SupabaseFunctions -- "Scrapes Articles" --> ArticleScraper
ArticleScraper -- "Gets URLs from" --> SupabaseDB
ArticleScraper -- "Stores Content" --> SupabaseDB
SupabaseFunctions -- "Summarizes Content" --> Summarizer
Summarizer -- "Uses Prompts from / Stores Summaries" --> SupabaseDB
SupabaseFunctions -- "Generates Podcast" --> PlayHTAPI
PlayHTAPI -- "Sends Webhook (Podcast URL)" --> SupabaseFunctions
SupabaseFunctions -- "Updates Podcast URL" --> SupabaseDB
SupabaseFunctions -- "Generates Newsletter" --> NewsletterService
NewsletterService -- "Uses Template/Data from" --> SupabaseDB
NewsletterService -- "Sends Emails Via" --> Nodemailer
SupabaseDB -- "Stores Subscriber List" --> NewsletterService
classDef user fill:#9cf,stroke:#333,stroke-width:2px;
classDef fe fill:#f9f,stroke:#333,stroke-width:2px;
classDef be fill:#ccf,stroke:#333,stroke-width:2px;
classDef external fill:#ffc,stroke:#333,stroke-width:2px;
classDef db fill:#cfc,stroke:#333,stroke-width:2px;
class User,UserWeb user;
class BMadDiCasterFE fe;
class BMadDiCasterBE,SupabaseFunctions,ArticleScraper,Summarizer,NewsletterService be;
class HNAPI,PlayHTAPI,Nodemailer external;
class SupabaseDB db;
```
## Component View
The BMad DiCaster system is composed of several key logical components, primarily implemented as serverless functions (Supabase Functions deployed on Vercel) and a Next.js frontend application. These components work together in an event-driven manner.
```mermaid
graph TD
subgraph FrontendApp ["Frontend Application (Next.js)"]
direction LR
WebAppUI["Web Application UI (React Components)"]
APIServiceFE["API Service (Frontend - Next.js Route Handlers)"]
end
subgraph BackendServices ["Backend Services (Supabase Functions & Core Logic)"]
direction TB
WorkflowTriggerAPI["Workflow Trigger API (/api/system/trigger-workflow)"]
HNContentService["HN Content Service (Supabase Fn)"]
ArticleScrapingService["Article Scraping Service (Supabase Fn)"]
SummarizationService["Summarization Service (LLM Facade - Supabase Fn)"]
PodcastGenerationService["Podcast Generation Service (Supabase Fn)"]
NewsletterGenerationService["Newsletter Generation Service (Supabase Fn)"]
PlayHTWebhookHandlerAPI["Play.ht Webhook API (/api/webhooks/playht)"]
CheckWorkflowCompletionService["CheckWorkflowCompletionService (Supabase Cron Fn)"]
end
subgraph ExternalIntegrations [External APIs & Services]
direction TB
HNAlgoliaAPI["Hacker News Algolia API"]
PlayHTAPI["Play.ht API"]
LLMProvider["LLM Provider (Ollama/Remote API)"]
NodemailerService["Nodemailer (Email Delivery)"]
end
subgraph DataStorage ["Data Storage (Supabase PostgreSQL)"]
direction TB
DB_WorkflowRuns["workflow_runs Table"]
DB_Posts["hn_posts Table"]
DB_Comments["hn_comments Table"]
DB_Articles["scraped_articles Table"]
DB_Summaries["article_summaries / comment_summaries Tables"]
DB_Newsletters["newsletters Table"]
DB_Subscribers["subscribers Table"]
DB_Prompts["summarization_prompts Table"]
DB_NewsletterTemplates["newsletter_templates Table"]
end
UserWeb[End User] --> WebAppUI
WebAppUI --> APIServiceFE
APIServiceFE --> WorkflowTriggerAPI
APIServiceFE --> DataStorage
DevAdmin[Developer/Admin/Cron] --> WorkflowTriggerAPI
WorkflowTriggerAPI --> DB_WorkflowRuns
DB_WorkflowRuns -- "Triggers (via CheckWorkflowCompletion or direct)" --> HNContentService
HNContentService --> HNAlgoliaAPI
HNContentService --> DB_Posts
HNContentService --> DB_Comments
HNContentService --> DB_WorkflowRuns
DB_Posts -- "Triggers (via DB Webhook)" --> ArticleScrapingService
ArticleScrapingService --> DB_Articles
ArticleScrapingService --> DB_WorkflowRuns
DB_Articles -- "Triggers (via DB Webhook)" --> SummarizationService
SummarizationService --> LLMProvider
SummarizationService --> DB_Prompts
SummarizationService --> DB_Summaries
SummarizationService --> DB_WorkflowRuns
CheckWorkflowCompletionService -- "Monitors & Triggers Next Steps Based On" --> DB_WorkflowRuns
CheckWorkflowCompletionService -- "Monitors & Triggers Next Steps Based On" --> DB_Summaries
CheckWorkflowCompletionService -- "Monitors & Triggers Next Steps Based On" --> DB_Newsletters
CheckWorkflowCompletionService --> NewsletterGenerationService
NewsletterGenerationService --> DB_NewsletterTemplates
NewsletterGenerationService --> DB_Summaries
NewsletterGenerationService --> DB_Newsletters
NewsletterGenerationService --> DB_WorkflowRuns
CheckWorkflowCompletionService --> PodcastGenerationService
PodcastGenerationService --> PlayHTAPI
PodcastGenerationService --> DB_Newsletters
PodcastGenerationService --> DB_WorkflowRuns
PlayHTAPI -- "Webhook" --> PlayHTWebhookHandlerAPI
PlayHTWebhookHandlerAPI --> DB_Newsletters
PlayHTWebhookHandlerAPI --> DB_WorkflowRuns
CheckWorkflowCompletionService -- "Triggers Delivery" --> NewsletterGenerationService
NewsletterGenerationService -- "(For Delivery)" --> NodemailerService
NewsletterGenerationService -- "(For Delivery)" --> DB_Subscribers
NewsletterGenerationService -- "(For Delivery)" --> DB_Newsletters
NewsletterGenerationService -- "(For Delivery)" --> DB_WorkflowRuns
classDef user fill:#9cf,stroke:#333,stroke-width:2px;
classDef feapp fill:#f9d,stroke:#333,stroke-width:2px;
classDef beapp fill:#cdf,stroke:#333,stroke-width:2px;
classDef external fill:#ffc,stroke:#333,stroke-width:2px;
classDef db fill:#cfc,stroke:#333,stroke-width:2px;
class UserWeb,DevAdmin user;
class FrontendApp,WebAppUI,APIServiceFE feapp;
class BackendServices,WorkflowTriggerAPI,HNContentService,ArticleScrapingService,SummarizationService,PodcastGenerationService,NewsletterGenerationService,PlayHTWebhookHandlerAPI,CheckWorkflowCompletionService beapp;
class ExternalIntegrations,HNAlgoliaAPI,PlayHTAPI,LLMProvider,NodemailerService external;
class DataStorage,DB_WorkflowRuns,DB_Posts,DB_Comments,DB_Articles,DB_Summaries,DB_Newsletters,DB_Subscribers,DB_Prompts,DB_NewsletterTemplates db;
```
- **Frontend Application (Next.js on Vercel):**
- **Web Application UI (React Components):** Renders UI, displays newsletters/podcasts, handles user interactions.
- **API Service (Frontend - Next.js Route Handlers):** Handles frontend-initiated API calls (e.g., for future admin functions) and receives incoming webhooks (Play.ht).
- **Backend Services (Supabase Functions & Core Logic):**
- **Workflow Trigger API (`/api/system/trigger-workflow`):** Secure Next.js API route to manually initiate the daily workflow.
- **HN Content Service (Supabase Fn):** Retrieves posts/comments from HN Algolia API, stores them.
- **Article Scraping Service (Supabase Fn):** Triggered by new HN posts, scrapes article content.
- **Summarization Service (LLM Facade - Supabase Fn):** Triggered by new articles/comments, generates summaries using LLM.
- **Podcast Generation Service (Supabase Fn):** Sends newsletter content to Play.ht API.
- **Newsletter Generation Service (Supabase Fn):** Compiles newsletter, handles podcast link logic, triggers email delivery.
- **Play.ht Webhook API (`/api/webhooks/playht`):** Next.js API route to receive podcast status from Play.ht.
- **CheckWorkflowCompletionService (Supabase Cron Fn):** Periodically monitors `workflow_runs` and related tables to orchestrate the progression between pipeline stages (e.g., from summarization to newsletter generation, then to delivery).
- **Data Storage (Supabase PostgreSQL):** Stores all application data including workflow state, content, summaries, newsletters, subscribers, prompts, and templates.
- **External APIs & Services:** HN Algolia API, Play.ht API, LLM Provider (Ollama/Remote), Nodemailer.
### Architectural / Design Patterns Adopted
- **Event-Driven Architecture:** Core backend processing is a series of steps triggered by database events (Supabase Database Webhooks calling Supabase Functions hosted on Vercel) and orchestrated via the `workflow_runs` table and the `CheckWorkflowCompletionService`.
- **Serverless Functions:** Backend logic is encapsulated in Supabase Functions (running on Vercel).
- **Monorepo:** All code resides in a single repository.
- **Facade Pattern:** Encapsulates interactions with external services (HN API, Play.ht API, LLM, Nodemailer) within `supabase/functions/_shared/`.
- **Factory Pattern (for LLM Service):** The `LLMFacade` will use a factory to instantiate the appropriate LLM client based on environment configuration.
- **Hexagonal Architecture (Pragmatic Application):** For complex Supabase Functions, core business logic will be separated from framework-specific handlers and data interaction code (adapters) to improve testability and maintainability. Simpler functions may have a more direct implementation.
- **Repository Pattern (for Data Access - Conceptual):** Data access logic within services will be organized, conceptually resembling repositories, even if not strictly implemented with separate repository classes for all entities in MVP Supabase Functions.
- **Configuration via Environment Variables:** All sensitive and environment-specific configurations managed via environment variables.
## Workflow Orchestration and Status Management
The BMad DiCaster application employs an event-driven pipeline for its daily content processing. To manage, monitor, and ensure the robust execution of this multi-step workflow, the following orchestration strategy is implemented:
**1. Central Workflow Tracking (`workflow_runs` Table):**
- A dedicated table, `public.workflow_runs` (defined in Data Models), serves as the single source of truth for the state and progress of each initiated daily workflow.
- Each workflow execution is identified by a unique `id` (jobId) in this table.
- Key fields include `status`, `current_step_details`, `error_message`, and a `details` JSONB column to store metadata and progress counters (e.g., `posts_fetched`, `articles_scraped_successfully`, `summaries_generated`, `podcast_playht_job_id`, `podcast_status`).
**2. Workflow Initiation:**
- A workflow is initiated via the `POST /api/system/trigger-workflow` API endpoint (callable manually, by CLI, or by a cron job).
- Upon successful trigger, a new record is created in `workflow_runs` with an initial status (e.g., 'pending' or 'fetching_hn'), and the `jobId` is returned to the caller.
- This initial record creation triggers the first service in the pipeline (`HNContentService`) via a database webhook or an initial direct call from the trigger API logic.
**3. Service Function Responsibilities:**
- Each backend Supabase Function (`HNContentService`, `ArticleScrapingService`, `SummarizationService`, `PodcastGenerationService`, `NewsletterGenerationService`) participating in the workflow **must**:
- Be aware of the `workflow_run_id` for the job it is processing.
- **Before starting its primary task:** Update the `workflow_runs` table for the current `workflow_run_id` to reflect its `current_step_details`.
- **Upon successful completion of its task:** Update relevant data tables and the `workflow_runs.details` JSONB field.
- **Upon failure:** Update the `workflow_runs` table to set `status` to 'failed', and populate `error_message` and `current_step_details`.
- Utilize the shared `WorkflowTrackerService` for consistent status updates.
- The `PlayHTWebhookHandlerAPI` updates the `newsletters` table and then the `workflow_runs.details` with podcast status.
**4. Orchestration and Progression (`CheckWorkflowCompletionService`):**
- A dedicated Supabase Function, `CheckWorkflowCompletionService`, is scheduled to run periodically (e.g., every 5-10 minutes) via a Vercel Cron Job invoking a dedicated HTTP endpoint for this service, or via Supabase's `pg_cron` if DB-centric scheduling is preferred.
- This service orchestrates progression between major stages by:
- Querying `workflow_runs` for jobs in intermediate statuses.
- Verifying if all prerequisites for the next stage are met (e.g., all summaries done before newsletter generation, podcast ready before delivery).
- If conditions are met, it updates `workflow_runs.status` and invokes the appropriate next service, passing the `workflow_run_id`.
**5. Shared `WorkflowTrackerService`:**
- A utility service in `supabase/functions/_shared/` will provide standardized methods for backend functions to interact with the `workflow_runs` table.
**6. Podcast Link Before Email Delivery:**
- The `NewsletterGenerationService` initiates podcast creation.
- The `CheckWorkflowCompletionService` monitors `newsletters.podcast_url` (populated by `PlayHTWebhookHandlerAPI`) or `newsletters.podcast_status`.
- Email delivery is triggered by `CheckWorkflowCompletionService` once the podcast URL is available, a timeout is reached, or podcast generation fails (as per PRD's delay/retry logic).
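To make this orchestration contract concrete, below is a minimal sketch of the shared `WorkflowTrackerService` described above. Method names and update payloads are assumptions for illustration; the actual implementation lives in `supabase/functions/_shared/workflow-tracker-service.ts` and is used by every pipeline function.

```typescript
// supabase/functions/_shared/workflow-tracker-service.ts (illustrative sketch)
import { SupabaseClient } from "@supabase/supabase-js";

export class WorkflowTrackerService {
  constructor(private supabase: SupabaseClient) {}

  // Record the step a function is about to perform, optionally moving the run to a new status.
  async updateWorkflowStep(
    workflowRunId: string,
    currentStepDetails: string,
    status?: string
  ): Promise<void> {
    const { error } = await this.supabase
      .from("workflow_runs")
      .update({
        current_step_details: currentStepDetails,
        ...(status ? { status } : {}),
        last_updated_at: new Date().toISOString(),
      })
      .eq("id", workflowRunId);
    if (error) throw new Error(`Failed to update workflow step: ${error.message}`);
  }

  // Mark a run as failed with a human-readable error message.
  async failWorkflow(workflowRunId: string, errorMessage: string): Promise<void> {
    const { error } = await this.supabase
      .from("workflow_runs")
      .update({
        status: "failed",
        error_message: errorMessage,
        last_updated_at: new Date().toISOString(),
      })
      .eq("id", workflowRunId);
    if (error) throw new Error(`Failed to mark workflow as failed: ${error.message}`);
  }
}
```

Pipeline functions and the `CheckWorkflowCompletionService` would instantiate this service with the admin Supabase client from `supabase-admin-client.ts`, keeping all `workflow_runs` writes consistent.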
## Project Structure
The BMad DiCaster project is organized as a monorepo, leveraging the Vercel/Supabase Next.js App Router template as its foundation.
```plaintext
{project-root}/
├── app/ # Next.js App Router
│ ├── (api)/ # API route handlers
│ │ ├── system/
│ │ │ ├── trigger-workflow/route.ts
│ │ │ └── workflow-status/[jobId]/route.ts
│ │ └── webhooks/
│ │ └── playht/route.ts
│ ├── components/ # Application-specific UI react components
│ │ └── core/ # e.g., NewsletterCard, PodcastPlayer
│ ├── newsletters/
│ │ ├── [newsletterId]/page.tsx
│ │ └── page.tsx
│ ├── auth/ # Auth-related pages and components (from template)
│ ├── login/page.tsx # Login page (from template)
│ ├── layout.tsx
│ └── page.tsx # Homepage
├── components/ # Shadcn UI components root (as configured by components.json)
│ ├── tutorial/ # Example/template components (can be removed)
│ ├── typography/ # Example/template components (can be removed)
│ └── ui/ # Base UI elements (button.tsx, card.tsx etc.)
├── docs/ # Project documentation
│ ├── prd.md
│ ├── architecture.md # This document
│ ├── ui-ux-spec.md
│ ├── ADR/ # Architecture Decision Records (to be created as needed)
│ └── environment-vars.md # (To be created)
├── lib/ # General utility functions for frontend (e.g., utils.ts from template)
│ └── utils.ts
├── supabase/ # Supabase specific project files (backend logic)
│ ├── functions/ # Supabase Edge Functions (for event-driven pipeline)
│ │ ├── hn-content-service/index.ts
│ │ ├── article-scraper-service/index.ts
│ │ ├── summarization-service/index.ts
│ │ ├── podcast-generation-service/index.ts
│ │ ├── newsletter-generation-service/index.ts
│ │ ├── check-workflow-completion-service/index.ts # Cron-triggered orchestrator
│ │ └── _shared/ # Shared utilities/facades FOR Supabase backend functions
│ │ ├── supabase-admin-client.ts
│ │ ├── llm-facade.ts
│ │ ├── playht-facade.ts
│ │ ├── nodemailer-facade.ts
│ │ └── workflow-tracker-service.ts # For updating workflow_runs table
│ ├── migrations/ # Database schema migrations
│ │ └── YYYYMMDDHHMMSS_initial_schema.sql
│ └── config.toml # Supabase project configuration (for CLI)
├── public/ # Static assets (images, favicon, etc.)
├── shared/ # Shared code/types between frontend and Supabase functions
│ └── types/
│ ├── api-schemas.ts # Request/response types for app/(api) routes
│ ├── domain-models.ts # Core entity types (HNPost, ArticleSummary etc.)
│ └── index.ts # Barrel file for shared types
├── styles/ # Global styles (e.g., globals.css for Tailwind base)
├── tests/ # Automated tests
│ ├── e2e/ # Playwright E2E tests
│ │ ├── newsletter-view.spec.ts
│ │ └── playwright.config.ts
│ └── integration/ # Integration tests
│ └── api-trigger-workflow.integration.test.ts
│ # Unit tests are co-located with source files, e.g., app/components/core/MyComponent.test.tsx
├── utils/ # Root utilities (from template)
│ └── supabase/ # Supabase helper functions FOR FRONTEND (from template)
│ ├── client.ts # Client-side Supabase client
│ ├── middleware.ts # Logic for Next.js middleware
│ └── server.ts # Server-side Supabase client
├── .env.example
├── .gitignore
├── components.json # Shadcn UI configuration
├── middleware.ts # Next.js middleware (root, uses utils/supabase/middleware.ts)
├── next-env.d.ts
├── next.config.mjs
├── package.json
├── postcss.config.js
├── README.md
├── tailwind.config.ts
└── tsconfig.json
```
### Key Directory Descriptions:
- **`app/`**: Next.js frontend (pages, UI components, Next.js API routes).
- **`app/(api)/`**: Backend API routes hosted on Vercel, including webhook receivers and system triggers.
- **`app/components/core/`**: Application-specific reusable React components.
- **`components/`**: Root for Shadcn UI components.
- **`docs/`**: All project documentation.
- **`lib/`**: Frontend-specific utility functions.
- **`supabase/functions/`**: Backend serverless functions (event-driven pipeline steps).
- **`supabase/functions/_shared/`**: Utilities and facades for these backend functions, including `WorkflowTrackerService`.
- **`supabase/migrations/`**: Database migrations managed by Supabase CLI.
- **`shared/types/`**: TypeScript types/interfaces shared between frontend and `supabase/functions/`. Path alias `@shared/*` to be configured in `tsconfig.json`.
- **`tests/`**: Contains E2E and integration tests. Unit tests are co-located with source files.
- **`utils/supabase/`**: Frontend-focused Supabase client helpers provided by the starter template.
### Monorepo Management:
- Standard `npm` (or `pnpm`/`yarn` workspaces if adopted later) for managing dependencies.
- The root `tsconfig.json` includes path aliases (`@/*`, `@shared/*`).
### Notes:
- Supabase functions in `supabase/functions/` are deployed to Vercel via Supabase CLI and Vercel integration.
- The `CheckWorkflowCompletionService` might be invoked via a Vercel Cron Job calling a simple HTTP trigger endpoint for that function, or via `pg_cron` if direct database scheduling is preferred.
## API Reference
### External APIs Consumed
#### 1\. Hacker News (HN) Algolia API
- **Purpose:** To retrieve top Hacker News posts and their associated comments.
- **Base URL(s):** Production: `http://hn.algolia.com/api/v1/`
- **Authentication:** None required.
- **Key Endpoints Used:**
- **`GET /search` (for top posts)**
- Description: Retrieves stories currently on the Hacker News front page.
- Request Parameters: `tags=front_page`
- Example Request: `curl "http://hn.algolia.com/api/v1/search?tags=front_page"`
- Post-processing: The application sorts fetched stories by `points` (descending) and selects up to the top 30.
- Success Response Schema (Code: `200 OK`): (Standard Algolia search response containing 'hits' array with story objects).
- **`GET /items/{objectID}` (for comments)**
- Description: Retrieves a story by `objectID` to get its full comment tree from the `children` field. Called for each selected top story.
- Success Response Schema (Code: `200 OK`): (Standard Algolia item response, `children` array contains comment tree).
- **Rate Limits:** Generous for public, non-commercial use; the application's low daily call volume is well within them.
- **Link to Official Docs:** [https://hn.algolia.com/api](https://hn.algolia.com/api)
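As an illustration of the post-processing rule above, a minimal sketch of how the HN Content Service might fetch the front page and keep the top 30 stories by points (the `AlgoliaHit` type and function name are illustrative, not part of the codebase yet):

```typescript
// Illustrative: fetch front-page stories from HN Algolia and keep the top 30 by points.
interface AlgoliaHit {
  objectID: string;
  title: string;
  url: string | null;
  author: string;
  points: number;
  created_at: string;
  num_comments: number;
}

async function fetchTopFrontPagePosts(limit = 30): Promise<AlgoliaHit[]> {
  const res = await fetch("http://hn.algolia.com/api/v1/search?tags=front_page");
  if (!res.ok) throw new Error(`HN Algolia API returned ${res.status}`);
  const body = (await res.json()) as { hits: AlgoliaHit[] };
  // Sort by points (descending) and take up to the top `limit` stories.
  return [...body.hits].sort((a, b) => b.points - a.points).slice(0, limit);
}
```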
#### 2\. Play.ht API
- **Purpose:** To generate AI-powered podcast versions of the newsletter content.
- **Base URL(s):** Production: `https://api.play.ai/api/v1`
- **Authentication:** API Key (`X-USER-ID` header) and Bearer Token (`Authorization` header). Stored as `PLAYHT_USER_ID` and `PLAYHT_API_KEY`.
- **Key Endpoints Used:**
- **`POST /playnotes`**
- Description: Initiates text-to-speech conversion.
- Request Body: `multipart/form-data` including `sourceFile` (HTML newsletter content), `synthesisStyle` ("podcast"), voice parameters, and `webHookUrl` (pointing to `/api/webhooks/playht` on our Vercel deployment).
- **Note on Content Delivery:** MVP uses `sourceFile` (direct upload). Fallback: upload content to Supabase Storage and provide `sourceFileUrl`.
- Success Response Schema (Code: `201 Created`): JSON object with `id` (PlayNote ID), `status`, etc.
- **Webhook Handling:** Our endpoint `/api/webhooks/playht` receives `POST` requests from Play.ht with `id`, `audioUrl`, and `status`.
- **Rate Limits:** Refer to official Play.ht documentation.
- **Link to Official Docs:** [https://docs.play.ai/api-reference/playnote/post](https://docs.play.ai/api-reference/playnote/post)
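A hedged sketch of how the `PlayHTFacade` might submit a PlayNote request. Field names mirror the parameters listed above; the exact multipart schema and response shape should be verified against the official Play.ht docs before implementation.

```typescript
// Illustrative PlayHTFacade call; verify field names against the official Play.ht docs.
export async function createPlayNote(
  htmlContent: string,
  webHookUrl: string
): Promise<{ id: string; status: string }> {
  const form = new FormData();
  // MVP approach: upload the HTML newsletter directly as `sourceFile`.
  form.append("sourceFile", new Blob([htmlContent], { type: "text/html" }), "newsletter.html");
  form.append("synthesisStyle", "podcast");
  form.append("webHookUrl", webHookUrl);

  const res = await fetch("https://api.play.ai/api/v1/playnotes", {
    method: "POST",
    headers: {
      "X-USER-ID": process.env.PLAYHT_USER_ID!,
      Authorization: `Bearer ${process.env.PLAYHT_API_KEY}`,
    },
    body: form,
  });
  if (!res.ok) throw new Error(`Play.ht request failed: ${res.status}`);
  return (await res.json()) as { id: string; status: string };
}
```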
#### 3\. LLM Provider (Facade for Summarization)
- **Purpose:** To generate summaries for articles and comment threads.
- **Configuration:** Via environment variables (`LLM_PROVIDER_TYPE`, `OLLAMA_API_URL`, `REMOTE_LLM_API_KEY`, `REMOTE_LLM_API_URL`, `LLM_MODEL_NAME`).
- **Facade Interface (`LLMFacade` in `supabase/functions/_shared/llm-facade.ts`):**
```typescript
export interface LLMSummarizationOptions {
/* ... */
}
export interface LLMFacade {
generateSummary(
textToSummarize: string,
options?: LLMSummarizationOptions
): Promise<string>;
}
```
- **Implementations:**
- **Local Ollama Adapter:** HTTP requests to `OLLAMA_API_URL` (e.g., `POST /api/generate` or `/api/chat`).
- **Remote LLM API Adapter:** Authenticated HTTP requests to `REMOTE_LLM_API_URL`.
- **Rate Limits:** Provider-dependent.
- **Link to Official Docs:** Ollama: [https://github.com/ollama/ollama/blob/main/docs/api.md](https://github.com/ollama/ollama/blob/main/docs/api.md)
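Tying this to the Factory Pattern noted in the design patterns section, a minimal sketch of how the facade could be instantiated from environment configuration. The adapter class names, the factory function, and the remote adapter body are assumptions; only the interface and environment variable names come from this document.

```typescript
// Illustrative factory for the LLMFacade, assumed to live alongside it in supabase/functions/_shared/.
import { LLMFacade, LLMSummarizationOptions } from "./llm-facade";

class OllamaAdapter implements LLMFacade {
  async generateSummary(textToSummarize: string, options?: LLMSummarizationOptions): Promise<string> {
    // Ollama's generate endpoint returns { response: "..." } when stream is false.
    const res = await fetch(`${process.env.OLLAMA_API_URL}/api/generate`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model: process.env.LLM_MODEL_NAME,
        prompt: textToSummarize,
        stream: false,
      }),
    });
    if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
    const body = (await res.json()) as { response: string };
    return body.response;
  }
}

class RemoteLLMAdapter implements LLMFacade {
  async generateSummary(textToSummarize: string, options?: LLMSummarizationOptions): Promise<string> {
    // Request shape depends on the chosen remote provider; omitted from this sketch.
    throw new Error("RemoteLLMAdapter not implemented in this sketch");
  }
}

export function createLLMFacade(): LLMFacade {
  return process.env.LLM_PROVIDER_TYPE === "ollama" ? new OllamaAdapter() : new RemoteLLMAdapter();
}
```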
#### 4\. Nodemailer (Email Delivery Service)
- **Purpose:** To send generated HTML newsletters.
- **Interaction Type:** Library integration within `NewsletterGenerationService` (Supabase Function) via `NodemailerFacade` in `supabase/functions/_shared/nodemailer-facade.ts`.
- **Configuration:** Via SMTP environment variables (`SMTP_HOST`, `SMTP_PORT`, `SMTP_USER`, `SMTP_PASS`, etc.).
- **Key Operations:** Create transporter, construct email (From, To, Subject, HTML), send email.
- **Link to Official Docs:** [https://nodemailer.com/](https://nodemailer.com/)
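A brief sketch of the `NodemailerFacade`'s key operations under the SMTP configuration listed above (the `SMTP_FROM` variable and function name are assumptions for illustration):

```typescript
// Illustrative NodemailerFacade operation: create a transporter and send one HTML email.
import nodemailer from "nodemailer";

export async function sendNewsletterEmail(to: string, subject: string, html: string): Promise<void> {
  const transporter = nodemailer.createTransport({
    host: process.env.SMTP_HOST,
    port: Number(process.env.SMTP_PORT ?? 587),
    secure: Number(process.env.SMTP_PORT) === 465, // use implicit TLS only for port 465
    auth: { user: process.env.SMTP_USER, pass: process.env.SMTP_PASS },
  });

  await transporter.sendMail({
    from: process.env.SMTP_FROM ?? process.env.SMTP_USER, // SMTP_FROM is an assumed variable
    to,
    subject,
    html,
  });
}
```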
### Internal APIs Provided (by BMad DiCaster)
#### 1\. Workflow Trigger API
- **Purpose:** To manually initiate the daily content processing pipeline.
- **Endpoint Path:** `/api/system/trigger-workflow` (Next.js API Route Handler)
- **Method:** `POST`
- **Authentication:** API Key in `X-API-KEY` header (matches `WORKFLOW_TRIGGER_API_KEY` env var).
- **Request Body:** MVP: Empty or `{}`.
- **Success Response (`202 Accepted`):** `{"message": "Daily workflow triggered...", "jobId": "<UUID>"}`
- **Action:** Creates a record in `workflow_runs` and initiates the pipeline.
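A minimal sketch of this route handler. The Supabase environment variable names (`NEXT_PUBLIC_SUPABASE_URL`, `SUPABASE_SERVICE_ROLE_KEY`) follow the starter template's conventions and are assumptions here; pipeline kick-off after the insert is left to the database webhook / orchestrator described earlier.

```typescript
// app/(api)/system/trigger-workflow/route.ts (illustrative sketch)
import { NextResponse } from "next/server";
import { createClient } from "@supabase/supabase-js";

export async function POST(request: Request) {
  // Authenticate the caller via the X-API-KEY header.
  if (request.headers.get("x-api-key") !== process.env.WORKFLOW_TRIGGER_API_KEY) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  // Service-role client: this route runs server-side only and writes to workflow_runs.
  const supabase = createClient(
    process.env.NEXT_PUBLIC_SUPABASE_URL!,
    process.env.SUPABASE_SERVICE_ROLE_KEY!
  );

  const { data, error } = await supabase
    .from("workflow_runs")
    .insert({ status: "pending", details: {} })
    .select("id")
    .single();

  if (error) {
    return NextResponse.json({ error: "Failed to create workflow run" }, { status: 500 });
  }

  // The new record triggers the first pipeline step via DB webhook (or a direct call).
  return NextResponse.json(
    { message: "Daily workflow triggered successfully", jobId: data.id },
    { status: 202 }
  );
}
```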
#### 2\. Workflow Status API
- **Purpose:** Allow developers/admins to check the status of a workflow run.
- **Endpoint Path:** `/api/system/workflow-status/{jobId}` (Next.js API Route Handler)
- **Method:** `GET`
- **Authentication:** API Key in `X-API-KEY` header.
- **Success Response (`200 OK`):** JSON object with `jobId`, `status`, `currentStep`, `details`, etc. (from `workflow_runs` table).
#### 3\. Play.ht Webhook Receiver
- **Purpose:** To receive status updates and podcast audio URLs from Play.ht.
- **Endpoint Path:** `/api/webhooks/playht` (Next.js API Route Handler)
- **Method:** `POST`
- **Authentication:** Implement verification (shared secret or signature if Play.ht supports).
- **Request Body (from Play.ht):** JSON with `id`, `audioUrl`, `status`.
- **Action:** Updates `newsletters` and `workflow_runs` tables.
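A hedged sketch of the webhook receiver. The shared-secret header (`x-playht-secret`) and `PLAYHT_WEBHOOK_SECRET` variable are assumptions pending confirmation of Play.ht's verification mechanism; the corresponding `workflow_runs.details` update is omitted for brevity.

```typescript
// app/(api)/webhooks/playht/route.ts (illustrative sketch)
import { NextResponse } from "next/server";
import { createClient } from "@supabase/supabase-js";

export async function POST(request: Request) {
  // Assumed verification: a shared secret header agreed with Play.ht.
  if (request.headers.get("x-playht-secret") !== process.env.PLAYHT_WEBHOOK_SECRET) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  const payload = (await request.json()) as { id: string; audioUrl?: string; status: string };

  const supabase = createClient(
    process.env.NEXT_PUBLIC_SUPABASE_URL!,
    process.env.SUPABASE_SERVICE_ROLE_KEY!
  );

  // Match the newsletter by its PlayNote job id and record the podcast outcome.
  const { error } = await supabase
    .from("newsletters")
    .update({ podcast_url: payload.audioUrl ?? null, podcast_status: payload.status })
    .eq("podcast_playht_job_id", payload.id);

  if (error) {
    return NextResponse.json({ error: "Failed to update newsletter" }, { status: 500 });
  }
  return NextResponse.json({ received: true }, { status: 200 });
}
```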
## Data Models
### Core Application Entities / Domain Objects
(Conceptual types, typically defined in `shared/types/domain-models.ts`)
#### 1\. `WorkflowRun`
- **Description:** A single execution of the daily workflow.
- **Schema:** `id (string UUID)`, `createdAt (string ISO)`, `lastUpdatedAt (string ISO)`, `status (enum string)`, `currentStepDetails (string?)`, `errorMessage (string?)`, `details (object?)`
#### 2\. `HNPost`
- **Description:** A post from Hacker News.
- **Schema:** `id (string HN_objectID)`, `hnNumericId (number?)`, `title (string)`, `url (string?)`, `author (string)`, `points (number)`, `createdAt (string ISO)`, `retrievedAt (string ISO)`, `numComments (number?)`, `workflowRunId (string UUID?)`
#### 3\. `HNComment`
- **Description:** A comment on an HN post.
- **Schema:** `id (string HN_commentID)`, `hnPostId (string)`, `parentId (string?)`, `author (string?)`, `text (string HTML)`, `createdAt (string ISO)`
#### 4\. `ScrapedArticle`
- **Description:** Content scraped from an article URL.
- **Schema:** `id (string UUID)`, `hnPostId (string)`, `originalUrl (string)`, `title (string?)`, `mainTextContent (string?)`, `scrapedAt (string ISO)`, `scrapingStatus (enum string)`, `workflowRunId (string UUID?)`
#### 5\. `ArticleSummary`
- **Description:** AI-generated summary of a `ScrapedArticle`.
- **Schema:** `id (string UUID)`, `scrapedArticleId (string UUID)`, `summaryText (string)`, `generatedAt (string ISO)`, `llmModelUsed (string?)`, `workflowRunId (string UUID?)`
#### 6\. `CommentSummary`
- **Description:** AI-generated summary of comments for an `HNPost`.
- **Schema:** `id (string UUID)`, `hnPostId (string)`, `summaryText (string)`, `generatedAt (string ISO)`, `llmModelUsed (string?)`, `workflowRunId (string UUID?)`
#### 7\. `Newsletter`
- **Description:** The daily generated newsletter.
- **Schema:** `id (string UUID)`, `workflowRunId (string UUID)`, `title (string)`, `generatedAt (string ISO)`, `htmlContent (string)`, `podcastPlayhtJobId (string?)`, `podcastUrl (string?)`, `podcastStatus (enum string?)`, `deliveryStatus (enum string)`, `targetDate (string YYYY-MM-DD)`
#### 8\. `Subscriber`
- **Description:** An email subscriber.
- **Schema:** `id (string UUID)`, `email (string)`, `subscribedAt (string ISO)`, `isActive (boolean)`
#### 9\. `SummarizationPrompt`
- **Description:** Stores prompts for AI summarization.
- **Schema:** `id (string UUID)`, `promptName (string)`, `promptText (string)`, `version (string)`, `isDefaultArticlePrompt (boolean)`, `isDefaultCommentPrompt (boolean)`
#### 10\. `NewsletterTemplate`
- **Description:** HTML/MJML templates for newsletters.
- **Schema:** `id (string UUID)`, `templateName (string)`, `mjmlContent (string?)`, `htmlContent (string)`, `version (string)`, `isDefault (boolean)`
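As a brief illustration, two of these entities expressed as TypeScript interfaces in `shared/types/domain-models.ts`; enum values mirror the database schemas below, and the exact declarations remain to be finalized.

```typescript
// shared/types/domain-models.ts (illustrative excerpt)
export type WorkflowStatus =
  | "pending"
  | "fetching_hn"
  | "scraping_articles"
  | "summarizing_content"
  | "generating_podcast"
  | "generating_newsletter"
  | "delivering_newsletter"
  | "completed"
  | "failed";

export interface WorkflowRun {
  id: string; // UUID
  createdAt: string; // ISO timestamp
  lastUpdatedAt: string; // ISO timestamp
  status: WorkflowStatus;
  currentStepDetails?: string;
  errorMessage?: string;
  details?: Record<string, unknown>;
}

export interface Newsletter {
  id: string; // UUID
  workflowRunId: string;
  title: string;
  generatedAt: string; // ISO timestamp
  htmlContent: string;
  podcastPlayhtJobId?: string;
  podcastUrl?: string;
  podcastStatus?: "pending" | "generating" | "completed" | "failed";
  deliveryStatus: "pending" | "sending" | "sent" | "failed" | "partially_failed";
  targetDate: string; // YYYY-MM-DD
}
```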
### Database Schemas (Supabase PostgreSQL)
#### 1\. `workflow_runs`
```sql
CREATE TABLE public.workflow_runs (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
last_updated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
status TEXT NOT NULL DEFAULT 'pending', -- pending, fetching_hn, scraping_articles, summarizing_content, generating_podcast, generating_newsletter, delivering_newsletter, completed, failed
current_step_details TEXT NULL,
error_message TEXT NULL,
details JSONB NULL -- {postsFetched, articlesAttempted, articlesScrapedSuccessfully, summariesGenerated, podcastJobId, podcastStatus, newsletterSentAt, subscribersNotified}
);
```
#### 2\. `hn_posts`
```sql
CREATE TABLE public.hn_posts (
id TEXT PRIMARY KEY, -- HN's objectID
hn_numeric_id BIGINT NULL UNIQUE,
title TEXT NOT NULL,
url TEXT NULL,
author TEXT NULL,
points INTEGER NOT NULL DEFAULT 0,
created_at TIMESTAMPTZ NOT NULL,
retrieved_at TIMESTAMPTZ NOT NULL DEFAULT now(),
hn_story_text TEXT NULL,
num_comments INTEGER NULL DEFAULT 0,
tags TEXT[] NULL,
workflow_run_id UUID NULL REFERENCES public.workflow_runs(id) ON DELETE SET NULL -- The run that fetched this instance of the post
);
```
#### 3\. `hn_comments`
```sql
CREATE TABLE public.hn_comments (
id TEXT PRIMARY KEY, -- HN's comment ID
hn_post_id TEXT NOT NULL REFERENCES public.hn_posts(id) ON DELETE CASCADE,
parent_comment_id TEXT NULL REFERENCES public.hn_comments(id) ON DELETE CASCADE,
author TEXT NULL,
comment_text TEXT NOT NULL,
created_at TIMESTAMPTZ NOT NULL,
retrieved_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
CREATE INDEX idx_hn_comments_post_id ON public.hn_comments(hn_post_id);
```
#### 4\. `scraped_articles`
```sql
CREATE TABLE public.scraped_articles (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
hn_post_id TEXT NOT NULL REFERENCES public.hn_posts(id) ON DELETE CASCADE, -- Should be unique if one article per post processing for a workflow run
original_url TEXT NOT NULL,
resolved_url TEXT NULL,
title TEXT NULL,
author TEXT NULL,
publication_date TIMESTAMPTZ NULL,
main_text_content TEXT NULL,
scraped_at TIMESTAMPTZ NOT NULL DEFAULT now(),
scraping_status TEXT NOT NULL DEFAULT 'pending', -- pending, success, failed_unreachable, failed_paywall, failed_parsing
error_message TEXT NULL,
workflow_run_id UUID NULL REFERENCES public.workflow_runs(id) ON DELETE SET NULL
);
CREATE UNIQUE INDEX idx_scraped_articles_hn_post_id_workflow_run_id ON public.scraped_articles(hn_post_id, workflow_run_id);
```
#### 5\. `article_summaries`
```sql
CREATE TABLE public.article_summaries (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
scraped_article_id UUID NOT NULL REFERENCES public.scraped_articles(id) ON DELETE CASCADE,
summary_text TEXT NOT NULL,
generated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
llm_prompt_version TEXT NULL,
llm_model_used TEXT NULL,
workflow_run_id UUID NOT NULL REFERENCES public.workflow_runs(id) ON DELETE CASCADE -- Summary is specific to a workflow run
);
CREATE UNIQUE INDEX idx_article_summaries_scraped_article_id_workflow_run_id ON public.article_summaries(scraped_article_id, workflow_run_id);
```
#### 6\. `comment_summaries`
```sql
CREATE TABLE public.comment_summaries (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
hn_post_id TEXT NOT NULL REFERENCES public.hn_posts(id) ON DELETE CASCADE,
summary_text TEXT NOT NULL,
generated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
llm_prompt_version TEXT NULL,
llm_model_used TEXT NULL,
workflow_run_id UUID NOT NULL REFERENCES public.workflow_runs(id) ON DELETE CASCADE -- Summary is specific to a workflow run
);
CREATE UNIQUE INDEX idx_comment_summaries_hn_post_id_workflow_run_id ON public.comment_summaries(hn_post_id, workflow_run_id);
```
#### 7\. `newsletters`
```sql
CREATE TABLE public.newsletters (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
workflow_run_id UUID NOT NULL UNIQUE REFERENCES public.workflow_runs(id) ON DELETE CASCADE,
target_date DATE NOT NULL UNIQUE,
title TEXT NOT NULL,
generated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
html_content TEXT NOT NULL,
mjml_template_version TEXT NULL,
podcast_playht_job_id TEXT NULL,
podcast_url TEXT NULL,
podcast_status TEXT NULL DEFAULT 'pending', -- pending, generating, completed, failed
delivery_status TEXT NOT NULL DEFAULT 'pending', -- pending, sending, sent, failed, partially_failed
scheduled_send_at TIMESTAMPTZ NULL,
sent_at TIMESTAMPTZ NULL
);
```
#### 8\. `subscribers`
```sql
CREATE TABLE public.subscribers (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
email TEXT NOT NULL UNIQUE,
subscribed_at TIMESTAMPTZ NOT NULL DEFAULT now(),
is_active BOOLEAN NOT NULL DEFAULT TRUE,
unsubscribed_at TIMESTAMPTZ NULL
);
```
#### 9\. `summarization_prompts`
```sql
CREATE TABLE public.summarization_prompts (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
prompt_name TEXT NOT NULL UNIQUE,
prompt_text TEXT NOT NULL,
version TEXT NOT NULL DEFAULT '1.0',
created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
is_default_article_prompt BOOLEAN NOT NULL DEFAULT FALSE,
is_default_comment_prompt BOOLEAN NOT NULL DEFAULT FALSE
-- Note: Logic to enforce single default will be in application layer or via more complex DB constraints/triggers.
);
```
#### 10\. `newsletter_templates`
```sql
CREATE TABLE public.newsletter_templates (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
template_name TEXT NOT NULL UNIQUE,
mjml_content TEXT NULL,
html_content TEXT NOT NULL,
version TEXT NOT NULL DEFAULT '1.0',
created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
is_default BOOLEAN NOT NULL DEFAULT FALSE
-- Note: Logic to enforce single default will be in application layer.
);
```
## Core Workflow / Sequence Diagrams
### 1\. Daily Workflow Initiation & HN Content Acquisition
```mermaid
sequenceDiagram
actor Caller as Manual/API/CLI/Cron
participant TriggerAPI as POST /api/system/trigger-workflow
participant WorkflowRunsDB as workflow_runs (DB Table)
participant WorkflowTracker as WorkflowTrackerService
participant HNContentService as HNContentService (Supabase Fn)
participant HNAlgoliaAPI as HN Algolia API
participant HNPostsDB as hn_posts (DB Table)
participant HNCommentsDB as hn_comments (DB Table)
participant EventTrigger1 as DB Event/Webhook (on hn_posts insert)
Caller->>+TriggerAPI: Request to start daily workflow
TriggerAPI->>+WorkflowTracker: initiateNewWorkflow()
WorkflowTracker->>+WorkflowRunsDB: INSERT new run (status='pending', details={})
WorkflowRunsDB-->>-WorkflowTracker: new_workflow_run_id
WorkflowTracker-->>TriggerAPI: { jobId: new_workflow_run_id }
TriggerAPI-->>-Caller: HTTP 202 Accepted { jobId }
alt Initial Trigger for HN Content Fetch
WorkflowTracker->>+HNContentService: triggerFetch(workflow_run_id)
else Alternative: Event from WorkflowRunsDB insert
WorkflowRunsDB-->>EventTrigger1: New workflow_run record
EventTrigger1->>+HNContentService: Invoke(workflow_run_id, event_payload)
end
HNContentService->>+WorkflowTracker: updateWorkflowStep(workflow_run_id, 'fetching_hn_posts', 'fetching_hn')
WorkflowTracker->>+WorkflowRunsDB: UPDATE workflow_runs (status, current_step_details)
HNContentService->>+HNAlgoliaAPI: GET /search?tags=front_page
HNAlgoliaAPI-->>-HNContentService: Front page story items
loop For each story item (up to 30 after sorting by points)
HNContentService->>+HNPostsDB: INSERT story (hn_post_id, ..., workflow_run_id)
HNPostsDB-->>EventTrigger1: Notifies: New hn_post inserted
EventTrigger1-->>ArticleScrapingService: (Async) Trigger ArticleScrapingService(hn_post_id, workflow_run_id)
HNContentService->>+HNAlgoliaAPI: GET /items/{story_objectID} (to fetch comments)
HNAlgoliaAPI-->>-HNContentService: Story details with comments
loop For each comment
HNContentService->>+HNCommentsDB: INSERT comment
end
end
HNContentService->>+WorkflowTracker: updateWorkflowDetails(workflow_run_id, {posts_fetched: X, comments_fetched: Y})
WorkflowTracker->>WorkflowRunsDB: UPDATE workflow_runs (details)
```
### 2\. Article Scraping & Summarization Flow
```mermaid
sequenceDiagram
participant EventTrigger1 as DB Event/Webhook (on hn_posts insert)
participant ArticleScrapingService as ArticleScrapingService (Supabase Fn)
participant ScrapedArticlesDB as scraped_articles (DB Table)
participant WorkflowTracker as WorkflowTrackerService
participant WorkflowRunsDB as workflow_runs (DB Table)
participant EventTrigger2 as DB Event/Webhook (on scraped_articles insert/update)
participant SummarizationService as SummarizationService (Supabase Fn)
participant LLMFacade as LLMFacade (shared function)
participant LLMProvider as LLM Provider (Ollama/Remote)
participant SummariesDB as article_summaries / comment_summaries (DB Tables)
participant PromptsDB as summarization_prompts (DB Table)
EventTrigger1->>+ArticleScrapingService: Invoke(hn_post_id, workflow_run_id, article_url)
ArticleScrapingService->>+WorkflowTracker: updateWorkflowStep(workflow_run_id, 'scraping_article_for_post_' + hn_post_id, 'scraping_articles')
ArticleScrapingService->>+ScrapedArticlesDB: INSERT new article (status='pending', workflow_run_id)
opt Article URL is valid
ArticleScrapingService->>ArticleScrapingService: Fetch & Parse HTML with Cheerio
ArticleScrapingService->>+ScrapedArticlesDB: UPDATE scraped_articles SET content, status='success'
else Scraping fails
ArticleScrapingService->>+ScrapedArticlesDB: UPDATE scraped_articles SET status='failed_...'
end
ScrapedArticlesDB-->>EventTrigger2: Notifies: New/Updated scraped_article (status='success')
EventTrigger2-->>SummarizationService: (Async) Trigger SummarizationService(scraped_article_id, workflow_run_id, 'article')
ArticleScrapingService->>+WorkflowTracker: updateWorkflowDetails(workflow_run_id, {articles_attempted_increment: 1, ...})
SummarizationService->>+WorkflowTracker: updateWorkflowStep(workflow_run_id, 'summarizing_content_for_post_' + hn_post_id, 'summarizing_content')
alt Summarize Article
SummarizationService->>SummarizationService: Get article_text
SummarizationService->>+PromptsDB: Get article_prompt
SummarizationService->>+LLMFacade: generateSummary(article_text, article_prompt)
LLMFacade->>+LLMProvider: Request summary
LLMProvider-->>-LLMFacade: article_summary
SummarizationService->>+SummariesDB: INSERT into article_summaries
end
alt Summarize Comments (for hn_post_id)
SummarizationService->>SummarizationService: Get comment_texts
SummarizationService->>+PromptsDB: Get comment_prompt
SummarizationService->>+LLMFacade: generateSummary(comment_texts, comment_prompt)
LLMFacade->>+LLMProvider: Request summary
LLMProvider-->>-LLMFacade: comment_summary
SummarizationService->>+SummariesDB: INSERT into comment_summaries
end
SummarizationService->>+WorkflowTracker: updateWorkflowDetails(workflow_run_id, {summaries_generated_increment: N})
```
### 3\. Newsletter, Podcast, and Delivery Flow
```mermaid
sequenceDiagram
participant CheckWorkflowService as CheckWorkflowCompletionService (Supabase Cron Fn)
participant WorkflowRunsDB as workflow_runs (DB Table)
participant WorkflowTracker as WorkflowTrackerService
participant NewsletterGenService as NewsletterGenerationService (Supabase Fn)
participant PodcastGenService as PodcastGenerationService (Supabase Fn)
participant PlayHTAPI as Play.ht API
participant NewsletterTemplatesDB as newsletter_templates (DB Table)
participant SummariesDB as article_summaries / comment_summaries (DB Tables)
participant NewslettersDB as newsletters (DB Table)
participant PlayHTWebhook as POST /api/webhooks/playht (Next.js API Route)
participant NodemailerService as NodemailerFacade (shared function)
participant SubscribersDB as subscribers (DB Table)
participant ExternalEmailService as Email Service (e.g., Gmail SMTP)
CheckWorkflowService->>+WorkflowRunsDB: Query runs (status='summarizing_content', all summaries done?)
WorkflowRunsDB-->>-CheckWorkflowService: workflow_run_id (ready for newsletter)
CheckWorkflowService->>+WorkflowTracker: updateWorkflowStep(workflow_run_id, 'starting_newsletter_generation', 'generating_newsletter')
CheckWorkflowService->>+NewsletterGenService: Invoke(workflow_run_id)
NewsletterGenService->>+NewsletterTemplatesDB: Get default_template
NewsletterGenService->>+SummariesDB: Get summaries for workflow_run_id
NewsletterGenService->>NewsletterGenService: Compile HTML newsletter
NewsletterGenService->>+NewslettersDB: INSERT newsletter (html_content, podcast_status='pending')
NewsletterGenService->>+PodcastGenService: initiatePodcast(newsletter_id, html_content)
PodcastGenService->>+PlayHTAPI: POST /playnotes (webHookUrl=...)
PlayHTAPI-->>-PodcastGenService: { playht_job_id, status: 'generating' }
PodcastGenService->>+NewslettersDB: UPDATE newsletters SET podcast_playht_job_id, podcast_status='generating'
PodcastGenService->>+WorkflowTracker: updateWorkflowStep(workflow_run_id, 'podcast_generation_initiated', 'generating_podcast')
PlayHTAPI-->>+PlayHTWebhook: POST (status='completed', audioUrl='...')
PlayHTWebhook->>+NewslettersDB: UPDATE newsletters SET podcast_url, podcast_status='completed'
PlayHTWebhook->>+WorkflowTracker: updateWorkflowDetails(workflow_run_id, {podcast_status: 'completed'})
PlayHTWebhook-->>-PlayHTAPI: HTTP 200 OK
CheckWorkflowService->>+WorkflowRunsDB: Query runs (status='generating_podcast', podcast_status IN ('completed', 'failed') OR timeout)
WorkflowRunsDB-->>-CheckWorkflowService: workflow_run_id (ready for delivery)
CheckWorkflowService->>+WorkflowTracker: updateWorkflowStep(workflow_run_id, 'starting_newsletter_delivery', 'delivering_newsletter')
CheckWorkflowService->>+NewsletterGenService: triggerDelivery(newsletter_id)
NewsletterGenService->>+NewslettersDB: Get newsletter_data (html, podcast_url)
NewsletterGenService->>NewsletterGenService: (Embed podcast_url in HTML if available)
NewsletterGenService->>+SubscribersDB: Get active_subscribers
loop For each subscriber
NewsletterGenService->>+NodemailerService: sendEmail(to, subject, html)
NodemailerService->>+ExternalEmailService: SMTP send
end
NewsletterGenService->>+NewslettersDB: UPDATE newsletters SET delivery_status='sent', sent_at=now()
NewsletterGenService->>+WorkflowTracker: completeWorkflow(workflow_run_id, {delivery_status: 'sent'})
```
## Definitive Tech Stack Selections
This section outlines the definitive technology choices for the BMad DiCaster project.
- **Preferred Starter Template Frontend & Backend:** Vercel/Supabase Next.js App Router Template ([https://vercel.com/templates/next.js/supabase](https://vercel.com/templates/next.js/supabase))
| Category | Technology | Version / Details | Description / Purpose | Justification (Optional, from PRD/User) |
| :------------------- | :-------------------------- | :-------------------------- | :------------------------------------------------------------------------ | :----------------------------------------------------------- |
| **Languages** | TypeScript | `5.7.2` | Primary language for backend/frontend | Strong typing, community support, aligns with Next.js/React |
| **Runtime** | Node.js | `22.10.2` | Server-side execution environment for Next.js & Supabase Functions | Compatible with Next.js, Vercel environment |
| **Frameworks** | Next.js | `latest` (e.g., 15.x) | Full-stack React framework | App Router, SSR, API routes, Vercel synergy |
| | React | `19.0.0` | Frontend UI library | Component-based, declarative |
| **UI Libraries** | Tailwind CSS | `3.4.17` | Utility-first CSS framework | Rapid UI development, consistent styling |
| | Shadcn UI | `latest` | React component library (via CLI) | Pre-styled, accessible components, built on Radix & Tailwind |
| **Databases** | PostgreSQL | (via Supabase) | Primary relational data store | Provided by Supabase, robust, scalable |
| **Cloud Platform** | Vercel | N/A | Hosting platform for Next.js app & Supabase Functions | Seamless Next.js/Supabase deployment, Edge Network |
| **Cloud Services** | Supabase Functions | N/A (via Vercel deploy) | Serverless compute for backend pipeline & APIs | Integrated with Supabase DB, event-driven capabilities |
| | Supabase Auth | N/A | User authentication and management | Integrated with Supabase, RLS |
| | Supabase Storage | N/A | File storage (e.g., for temporary newsletter files if needed for Play.ht) | Integrated with Supabase |
| **Infrastructure** | Supabase CLI | `latest` | Local development, migrations, function deployment | Official tool for Supabase development |
| | Docker | `latest` (via Supabase CLI) | Containerization for local Supabase services | Local development consistency |
| **State Management** | Zustand | `latest` | Frontend state management | Simple, unopinionated, performant for React |
| **Testing** | React Testing Library (RTL) | `latest` | Testing React components | User-centric testing, works well with Jest |
| | Jest | `latest` | Unit/Integration testing framework for JS/TS | Widely used, good support for Next.js/React |
| | Playwright | `latest` | End-to-end testing framework | Modern, reliable, cross-browser |
| **CI/CD** | GitHub Actions | N/A | Continuous Integration/Deployment | Integration with GitHub, automation of build/deploy/test |
| **Other Tools** | Cheerio | `latest` | HTML parsing/scraping for articles | Server-side HTML manipulation |
| | Nodemailer | `latest` | Email sending library for newsletters | Robust email sending from Node.js |
| | Zod | `latest` | TypeScript-first schema declaration and validation | Data validation for API inputs, environment variables etc. |
| | `tsx` / `ts-node` | `latest` (for scripts) | TypeScript execution for Node.js scripts (e.g. `scripts/`) | Running TS scripts directly |
| | Prettier | `3.3.3` | Code formatter | Consistent code style |
| | ESLint | `latest` | Linter for TypeScript/JavaScript | Code quality and error prevention |
| | Pino | `latest` | High-performance JSON logger for Node.js | Structured and efficient logging |
## Infrastructure and Deployment Overview
- **Cloud Provider(s):** Vercel (for hosting Next.js app and Supabase Functions) and Supabase (managed PostgreSQL, Auth, Storage; runs on underlying cloud like AWS).
- **Core Services Used:** Vercel (Next.js Hosting, Serverless/Edge Functions, CDN, CI/CD, Cron Jobs), Supabase (PostgreSQL, Auth, Storage, Functions, Database Webhooks).
- **Infrastructure as Code (IaC):** Supabase Migrations (`supabase/migrations/`) for database schema; Vercel project settings (`vercel.json` if needed).
- **Deployment Strategy:** GitHub Actions for CI/CD. Frontend (Next.js) via Vercel Git integration. Backend (Supabase Functions) via Supabase CLI within GitHub Actions. Database migrations via Supabase CLI.
- **Environments:** Local (Next.js dev server, Supabase CLI local stack), Development/Preview (Vercel preview deployments linked to dev Supabase instance), Production (Vercel production deployment linked to prod Supabase instance).
- **Environment Promotion:** Local -> Dev/Preview (PR) -> Production (merge to main).
- **Rollback Strategy:** Vercel dashboard/CLI for app/function rollbacks; Supabase migrations or Point-in-Time Recovery for database.
## Error Handling Strategy
- **General Approach:** Use standard `Error` objects or custom extensions. Supabase Functions catch errors, log via Pino, update `workflow_runs`, and avoid unhandled rejections. Next.js API routes return appropriate HTTP error responses with JSON payloads.
- **Logging (Pino):**
- Library: Pino (`pino`) for structured JSON logging in Supabase Functions and Next.js API routes.
- Configuration: Shared Pino logger instance (`supabase/functions/_shared/logger.ts`).
- Format: JSON.
- Levels: `trace`, `debug`, `info`, `warn`, `error`, `fatal`.
- Context: Logs include `timestamp`, `severity`, `workflowRunId`, `service`/`functionName`, `message`, and relevant `details`. **No sensitive data logged.**
- **Specific Handling Patterns:**
- **External API Calls:** Through facades with timeouts and limited retries (exponential backoff) for transient errors. Standardized custom errors thrown by facades.
- **Internal Errors/Business Logic:** Caught within functions; log details, update `workflow_runs` to 'failed'. API routes return generic errors to clients.
- **Database Operations:** Critical errors lead to 'failed' workflow status.
- **Scraping/Summarization Failures:** Individual item failures are logged and status updated (e.g., `scraped_articles.scraping_status`), but may not halt the entire workflow run if other items succeed.
- **Podcast/Delivery Failures:** Logged, status updated in `newsletters` and `workflow_runs`. Newsletter may be sent without podcast after timeout/failure.
- **`CheckWorkflowCompletionService`:** Designed for resilience; errors in processing one run should not prevent processing of others or future scheduled runs.
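A minimal sketch of the shared Pino logger described above (the `LOG_LEVEL` variable and child-logger helper are assumptions for illustration):

```typescript
// supabase/functions/_shared/logger.ts (illustrative sketch)
import pino from "pino";

// Base structured JSON logger shared by Supabase Functions and Next.js API routes.
export const logger = pino({
  level: process.env.LOG_LEVEL ?? "info",
  base: { service: "bmad-dicaster" },
});

// Child loggers bind per-run context so every entry carries workflowRunId and functionName.
export function createWorkflowLogger(workflowRunId: string, functionName: string) {
  return logger.child({ workflowRunId, functionName });
}
```

Usage: a pipeline function calls `createWorkflowLogger(workflowRunId, "hn-content-service")` once and logs through the returned child logger, so all entries for that run are correlated without repeating context fields.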
## Coding Standards
(As detailed previously, including TypeScript, Node.js, ESLint, Prettier, naming conventions, co-located unit tests `*.test.ts(x)`/`*.spec.ts(x)`, async/await, strict type safety, Pino logging, and specific framework/anti-pattern guidelines.)
## Overall Testing Strategy
(As detailed previously, covering Unit Tests with Jest/RTL, Integration Tests, E2E Tests with Playwright, 80% unit test coverage target, specific mocking strategies for facades and external dependencies, and test data management.)
## Security Best Practices
(As detailed previously, including Zod for input validation, output encoding, secrets management via environment variables, dependency security scanning, API key authentication for system APIs, Play.ht webhook verification, Supabase RLS, principle of least privilege, HTTPS, and secure error information disclosure.)
## Key Reference Documents
1. **Product Requirements Document (PRD):** `docs/prd-incremental-full-agile-mode.txt`
2. **UI/UX Specification:** `docs/ui-ux-spec.txt`
3. **Technical Preferences:** `docs/technical-preferences copy.txt`
4. **Environment Variables Documentation:** `docs/environment-vars.md` (To be created)
5. **(Optional) Frontend Architecture Document:** `docs/frontend-architecture.md` (To be created by Design Architect)
6. **Play.ht API Documentation:** [https://docs.play.ai/api-reference/playnote/post](https://docs.play.ai/api-reference/playnote/post)
7. **Hacker News Algolia API:** [https://hn.algolia.com/api](https://hn.algolia.com/api)
8. **Ollama API Documentation:** [https://github.com/ollama/ollama/blob/main/docs/api.md](https://github.com/ollama/ollama/blob/main/docs/api.md)
9. **Supabase Documentation:** [https://supabase.com/docs](https://supabase.com/docs)
10. **Next.js Documentation:** [https://nextjs.org/docs](https://nextjs.org/docs)
11. **Vercel Documentation:** [https://vercel.com/docs](https://vercel.com/docs)
12. **Pino Logging Documentation:** [https://getpino.io/](https://getpino.io/)
## Change Log
| Change | Date | Version | Description | Author |
| :----------------------------------------- | :--------- | :------ | :----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | :------------- |
| Initial Draft based on PRD and discussions | 2025-05-13 | 0.1 | First complete draft covering project overview, components, data models, tech stack, deployment, error handling, coding standards, testing strategy, security, and workflow orchestration. | 3-arch (Agent) |
---
## Prompt for Design Architect: Frontend Architecture Definition
**To the Design Architect (Agent Specializing in Frontend Architecture):**
You are now tasked with defining the detailed **Frontend Architecture** for the BMad DiCaster project. This main Architecture Document and the `docs/ui-ux-spec.txt` are your primary input artifacts. Your goal is to produce a dedicated `frontend-architecture.md` document.
**Key Inputs & Constraints (from this Main Architecture Document & UI/UX Spec):**
1. **Overall Project Architecture:** Familiarize yourself with the "High-Level Overview," "Component View," "Data Models" (especially any shared types in `shared/types/`), and "API Reference" (particularly internal APIs like `/api/system/trigger-workflow` and `/api/webhooks/playht` that the frontend might indirectly be aware of or need to interact with for admin purposes in the future, though MVP frontend primarily reads newsletter data).
2. **UI/UX Specification (`docs/ui-ux-spec.txt`):** This document contains user flows, wireframes, core screens (Newsletter List, Newsletter Detail), component inventory (NewsletterCard, PodcastPlayer, DownloadButton, BackButton), branding considerations (synthwave, minimalist), and accessibility aspirations.
3. **Definitive Technology Stack (Frontend Relevant):**
- Framework: Next.js (`latest`, App Router)
- Language: React (`19.0.0`) with TypeScript (`5.7.2`)
- UI Libraries: Tailwind CSS (`3.4.17`), Shadcn UI (`latest`)
- State Management: Zustand (`latest`)
- Testing: React Testing Library (RTL) (`latest`), Jest (`latest`)
- Starter Template: Vercel/Supabase Next.js App Router template ([https://vercel.com/templates/next.js/supabase](https://vercel.com/templates/next.js/supabase)). Leverage its existing structure for `app/`, `components/ui/` (from Shadcn), `lib/utils.ts`, and `utils/supabase/` (client, server, middleware helpers for Supabase).
4. **Project Structure (Frontend Relevant):** Refer to the "Project Structure" section in this document, particularly the `app/` directory, `components/` (for Shadcn `ui` and your `core` application components), `lib/`, and `utils/supabase/`.
5. **Existing Frontend Files (from template):** Be aware of `middleware.ts` (for Supabase auth) and any existing components or utility functions provided by the starter template.
**Tasks for Frontend Architecture Document (`frontend-architecture.md`):**
1. **Refine Frontend Project Structure:**
- Detail the specific folder structure within `app/`. Propose organization for pages (routes), layouts, application-specific components (`app/components/core/`), data fetching logic, context providers, and Zustand stores.
- How will Shadcn UI components (`components/ui/`) be used and potentially customized?
2. **Component Architecture:**
- For each core screen identified in the UI/UX spec (Newsletter List, Newsletter Detail), define the primary React component hierarchy.
- Specify responsibilities and key props for major reusable application components (e.g., `NewsletterCard`, `NewsletterDetailView`, `PodcastPlayerControls`).
- How will components fetch and display data from Supabase? (e.g., Server Components, Client Components using Supabase client from `utils/supabase/client.ts` or `utils/supabase/server.ts`).
3. **State Management (Zustand):**
- Identify global and local state needs.
- Define specific Zustand store(s): what data they will hold (e.g., current newsletter list, selected newsletter details, podcast player state), and what actions they will expose.
- How will components interact with these stores?
4. **Data Fetching & Caching (Frontend):**
- Specify patterns for fetching newsletter data (lists and individual items) and podcast information.
- How will Next.js data fetching capabilities (Server Components, Route Handlers, `fetch` with caching options) be utilized with the Supabase client?
- Address loading and error states for data fetching in the UI.
5. **Routing:**
- Confirm Next.js App Router usage and define URL structure for the newsletter list and detail pages.
6. **Styling Approach:**
- Reiterate use of Tailwind CSS and Shadcn UI.
- Define any project-specific conventions for applying Tailwind classes or extending the theme (beyond what's in `tailwind.config.ts`).
- How will the "synthwave technical glowing purple vibes" be implemented using Tailwind?
7. **Error Handling (Frontend):**
- How will errors from API calls (to Supabase or internal Next.js API routes if any) be handled and displayed to the user?
- Strategy for UI error boundaries.
8. **Accessibility (AX):**
- Elaborate on how the WCAG 2.1 Level A requirements (keyboard navigation, semantic HTML, alt text, color contrast) will be met in component design and implementation, leveraging Next.js and Shadcn UI capabilities.
9. **Testing (Frontend):**
- Reiterate the use of Jest and RTL for unit/integration testing of React components.
- Provide examples or guidelines for writing effective frontend tests.
10. **Key Frontend Libraries & Versioning:** Confirm versions from the main tech stack and list any additional frontend-only libraries required.
Your output should be a clean, well-formatted `frontend-architecture.md` document ready for AI developer agents to use for frontend implementation. Adhere to the output formatting guidelines. You are now operating in **Frontend Architecture Mode**.