full beta demo - few minor issues to tweak, but 90% there!
@@ -0,0 +1,191 @@
# API Reference

> This document is a granulated shard from the main "3-architecture.md" focusing on "API Reference".

### External APIs Consumed

#### 1. Hacker News (HN) Algolia API

- **Purpose:** To retrieve top Hacker News posts and their associated comments.
- **Base URL(s):** Production: `http://hn.algolia.com/api/v1/`
- **Authentication:** None required.
- **Key Endpoints Used:**
  - **`GET /search` (for top posts)**
    - Description: Retrieves stories currently on the Hacker News front page.
    - Request Parameters: `tags=front_page`
    - Example Request: `curl "http://hn.algolia.com/api/v1/search?tags=front_page"`
    - Post-processing: The application sorts the fetched stories by `points` (descending) and selects up to the top 30 (see the sketch after this section).
    - Success Response Schema (Code: `200 OK`): Standard Algolia search response containing a `hits` array of story objects.

      ```json
      {
        "hits": [
          {
            "objectID": "string",
            "created_at": "string",
            "title": "string",
            "url": "string",
            "author": "string",
            "points": "number",
            "story_text": "string",
            "num_comments": "number",
            "_tags": ["string"]
          }
        ],
        "nbHits": "number",
        "page": "number",
        "nbPages": "number",
        "hitsPerPage": "number"
      }
      ```
  - **`GET /items/{objectID}` (for comments)**
    - Description: Retrieves a specific story item by its `objectID` to get its full comment tree from the `children` field. Called for each selected top story.
    - Success Response Schema (Code: `200 OK`): Standard Algolia item response.

      ```json
      {
        "id": "number",
        "created_at": "string",
        "author": "string",
        "text": "string",
        "parent_id": "number",
        "story_id": "number",
        "children": [
          {
            /* nested comment structure */
          }
        ]
      }
      ```
- **Rate Limits:** The public API is rate limited but generous for this use case; the application's low daily call volume stays comfortably within it.
- **Link to Official Docs:** [https://hn.algolia.com/api](https://hn.algolia.com/api)
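
To make the post-processing step concrete, here is a minimal TypeScript sketch (function names, types, and error handling are illustrative, not part of the documented services) that fetches the front page, ranks stories by points, keeps the top 30, and fetches each item's comment tree:

```typescript
// Minimal sketch: fetch front-page stories, rank by points, keep the top 30,
// then fetch each selected story item to obtain its full comment tree.
interface HNHit {
  objectID: string;
  title: string;
  url?: string;
  author: string;
  points: number;
  num_comments?: number;
}

const HN_API_BASE = "http://hn.algolia.com/api/v1";

export async function fetchTopStories(limit = 30): Promise<HNHit[]> {
  const res = await fetch(`${HN_API_BASE}/search?tags=front_page`);
  if (!res.ok) throw new Error(`HN Algolia search failed: ${res.status}`);
  const body = (await res.json()) as { hits: HNHit[] };
  // Post-processing described above: sort by points (descending), take up to `limit`.
  return body.hits.sort((a, b) => b.points - a.points).slice(0, limit);
}

export async function fetchCommentTree(objectID: string): Promise<unknown> {
  // The `children` field of the item response holds the nested comment structure.
  const res = await fetch(`${HN_API_BASE}/items/${objectID}`);
  if (!res.ok) throw new Error(`HN Algolia item fetch failed: ${res.status}`);
  return res.json();
}
```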

#### 2. Play.ht API

- **Purpose:** To generate AI-powered podcast versions of the newsletter content.
- **Base URL(s):** Production: `https://api.play.ai/api/v1`
- **Authentication:** API Key (`X-USER-ID` header) and Bearer Token (`Authorization` header). Stored as `PLAYHT_USER_ID` and `PLAYHT_API_KEY`.
- **Key Endpoints Used:**
  - **`POST /playnotes`**
    - Description: Initiates the text-to-speech conversion.
    - Request Headers: `Authorization: Bearer {PLAYHT_API_KEY}`, `X-USER-ID: {PLAYHT_USER_ID}`, `Content-Type: multipart/form-data`, `Accept: application/json`.
    - Request Body Schema: `multipart/form-data`
      - `sourceFile`: `string (binary)` (Preferred: HTML newsletter content as file upload.)
      - `sourceFileUrl`: `string (uri)` (Alternative: URL to hosted newsletter content if `sourceFile` is problematic.)
      - `synthesisStyle`: `string` (Required, e.g., "podcast")
      - `voice1`: `string` (Required, Voice ID)
      - `voice1Name`: `string` (Required)
      - `voice1Gender`: `string` (Required)
      - `webHookUrl`: `string (uri)` (Required, e.g., `<YOUR_APP_DOMAIN>/api/webhooks/playht`)
    - **Note on Content Delivery:** MVP uses `sourceFile`. If issues arise, pivot to `sourceFileUrl` (e.g., content temporarily in Supabase Storage).
    - Success Response Schema (Code: `201 Created`):

      ```json
      {
        "id": "string",
        "ownerId": "string",
        "name": "string",
        "sourceFileUrl": "string",
        "audioUrl": "string",
        "synthesisStyle": "string",
        "voice1": "string",
        "voice1Name": "string",
        "voice1Gender": "string",
        "webHookUrl": "string",
        "status": "string",
        "duration": "number",
        "requestedAt": "string",
        "createdAt": "string"
      }
      ```
- **Webhook Handling:** Endpoint `/api/webhooks/playht` receives `POST` from Play.ht.
  - Request Body Schema (from Play.ht):

    ```json
    { "id": "string", "audioUrl": "string", "status": "string" }
    ```
- **Rate Limits:** Refer to official Play.ht documentation.
- **Link to Official Docs:** [https://docs.play.ai/api-reference/playnote/post](https://docs.play.ai/api-reference/playnote/post)
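
A minimal sketch (not the project's actual implementation) of calling `POST /playnotes` with `multipart/form-data`. Field and header names follow the schema above; the helper signature and config type are illustrative assumptions.

```typescript
// Sketch of submitting newsletter HTML to the PlayNote endpoint; the webhook
// configured via `webHookUrl` later delivers the audio URL.
export interface PlayNoteConfig {
  apiKey: string;      // value of PLAYHT_API_KEY
  userId: string;      // value of PLAYHT_USER_ID
  voice1: string;      // Voice ID
  voice1Name: string;
  voice1Gender: string;
  webHookUrl: string;  // e.g. <YOUR_APP_DOMAIN>/api/webhooks/playht
}

export async function requestPodcastGeneration(
  newsletterHtml: string,
  cfg: PlayNoteConfig,
): Promise<{ id: string }> {
  const form = new FormData();
  // MVP path: upload the HTML newsletter content directly as `sourceFile`.
  form.append("sourceFile", new Blob([newsletterHtml], { type: "text/html" }), "newsletter.html");
  form.append("synthesisStyle", "podcast");
  form.append("voice1", cfg.voice1);
  form.append("voice1Name", cfg.voice1Name);
  form.append("voice1Gender", cfg.voice1Gender);
  form.append("webHookUrl", cfg.webHookUrl);

  const res = await fetch("https://api.play.ai/api/v1/playnotes", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${cfg.apiKey}`,
      "X-USER-ID": cfg.userId,
      Accept: "application/json",
      // The multipart Content-Type (with boundary) is set automatically for a FormData body.
    },
    body: form,
  });
  if (res.status !== 201) throw new Error(`PlayNote request failed: ${res.status}`);
  return (await res.json()) as { id: string };
}
```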

#### 3. LLM Provider (Facade for Summarization)

- **Purpose:** To generate summaries for articles and comment threads.
- **Configuration:** Via environment variables (`LLM_PROVIDER_TYPE`, `OLLAMA_API_URL`, `REMOTE_LLM_API_KEY`, `REMOTE_LLM_API_URL`, `LLM_MODEL_NAME`).
- **Facade Interface (`LLMFacade` in `supabase/functions/_shared/llm-facade.ts`):**

  ```typescript
  // Located in supabase/functions/_shared/llm-facade.ts
  export interface LLMSummarizationOptions {
    prompt?: string;
    maxLength?: number;
  }

  export interface LLMFacade {
    generateSummary(
      textToSummarize: string,
      options?: LLMSummarizationOptions
    ): Promise<string>;
  }
  ```

- **Implementations:**
  - **Local Ollama Adapter:** HTTP requests to `OLLAMA_API_URL`.
    - Request Body (example for `/api/generate`): `{"model": "string", "prompt": "string", "stream": false}`
    - Response Body (example): `{"model": "string", "response": "string", ...}`
  - **Remote LLM API Adapter:** Authenticated HTTP requests to `REMOTE_LLM_API_URL`. Schemas depend on the provider.
- **Rate Limits:** Provider-dependent.
- **Link to Official Docs:** Ollama: [https://github.com/ollama/ollama/blob/main/docs/api.md](https://github.com/ollama/ollama/blob/main/docs/api.md)
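
As one possible shape for the Local Ollama Adapter, the sketch below implements the `LLMFacade` interface against the `/api/generate` request/response bodies listed in this section. The class name, file location, constructor signature, and default prompt wording are assumptions.

```typescript
// Minimal sketch of a local Ollama adapter behind the LLMFacade interface above.
import type { LLMFacade, LLMSummarizationOptions } from "./llm-facade.ts";

export class OllamaLLMAdapter implements LLMFacade {
  constructor(
    private readonly apiUrl: string, // e.g. the value of OLLAMA_API_URL
    private readonly model: string,  // e.g. the value of LLM_MODEL_NAME
  ) {}

  async generateSummary(textToSummarize: string, options?: LLMSummarizationOptions): Promise<string> {
    // Prepend the caller-supplied prompt (if any) to the text to summarize.
    const prompt = options?.prompt
      ? `${options.prompt}\n\n${textToSummarize}`
      : `Summarize the following text:\n\n${textToSummarize}`;

    const res = await fetch(`${this.apiUrl}/api/generate`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: this.model, prompt, stream: false }),
    });
    if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);

    // The non-streaming response carries the generated text in `response`.
    const body = (await res.json()) as { response: string };
    return body.response;
  }
}
```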

#### 4. Nodemailer (Email Delivery Service)

- **Purpose:** To send generated HTML newsletters.
- **Interaction Type:** Library integration within `NewsletterGenerationService` via `NodemailerFacade` in `supabase/functions/_shared/nodemailer-facade.ts`.
- **Configuration:** Via SMTP environment variables (`SMTP_HOST`, `SMTP_PORT`, `SMTP_USER`, `SMTP_PASS`).
- **Key Operations:** Create transporter, construct email message (From, To, Subject, HTML), send email.
- **Link to Official Docs:** [https://nodemailer.com/](https://nodemailer.com/)
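
A minimal sketch of what the key operations of a `NodemailerFacade` could look like, assuming a Node-compatible runtime and the SMTP environment variables above; the function and type names are illustrative.

```typescript
// Sketch: create an SMTP transporter from environment variables and send one HTML newsletter.
import nodemailer from "nodemailer";

export interface SendNewsletterInput {
  from: string;
  to: string[];
  subject: string;
  html: string;
}

export async function sendNewsletterEmail(input: SendNewsletterInput): Promise<void> {
  const transporter = nodemailer.createTransport({
    host: process.env.SMTP_HOST,
    port: Number(process.env.SMTP_PORT ?? 587),
    secure: Number(process.env.SMTP_PORT) === 465, // common TLS convention; adjust to the provider
    auth: { user: process.env.SMTP_USER, pass: process.env.SMTP_PASS },
  });

  await transporter.sendMail({
    from: input.from,
    to: input.to.join(", "),
    subject: input.subject,
    html: input.html,
  });
}
```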

### Internal APIs Provided (by BMad DiCaster)

#### 1. Workflow Trigger API

- **Purpose:** To manually initiate the daily content processing pipeline.
- **Endpoint Path:** `/api/system/trigger-workflow` (Next.js API Route Handler)
- **Method:** `POST`
- **Authentication:** API Key in `X-API-KEY` header (matches `WORKFLOW_TRIGGER_API_KEY` env var).
- **Request Body:** MVP: Empty or `{}`.
- **Success Response (`202 Accepted`):** `{"message": "Daily workflow triggered successfully. Processing will occur asynchronously.", "jobId": "<UUID_of_the_workflow_run>"}`
- **Error Response:** `400 Bad Request`, `401 Unauthorized`, `500 Internal Server Error`.
- **Action:** Creates a record in `workflow_runs` table and initiates the pipeline.
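
A hedged sketch of the route handler (assumed path `app/api/system/trigger-workflow/route.ts`; the Supabase URL and service-role env var names follow common starter-template conventions and are assumptions, not part of this specification):

```typescript
// Sketch: validate the API key, create a workflow_runs record, return 202 with the jobId.
import { NextResponse } from "next/server";
import { createClient } from "@supabase/supabase-js";

export async function POST(request: Request) {
  if (request.headers.get("x-api-key") !== process.env.WORKFLOW_TRIGGER_API_KEY) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  const supabase = createClient(
    process.env.NEXT_PUBLIC_SUPABASE_URL!,   // assumed env var names from the Supabase starter
    process.env.SUPABASE_SERVICE_ROLE_KEY!,
  );

  const { data, error } = await supabase
    .from("workflow_runs")
    .insert({ status: "pending" })
    .select("id")
    .single();

  if (error) {
    return NextResponse.json({ error: error.message }, { status: 500 });
  }

  return NextResponse.json(
    {
      message: "Daily workflow triggered successfully. Processing will occur asynchronously.",
      jobId: data.id,
    },
    { status: 202 },
  );
}
```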

#### 2. Workflow Status API

- **Purpose:** Allow developers/admins to check the status of a specific workflow run.
- **Endpoint Path:** `/api/system/workflow-status/{jobId}` (Next.js API Route Handler)
- **Method:** `GET`
- **Authentication:** API Key in `X-API-KEY` header.
- **Request Parameters:** `jobId` (Path parameter).
- **Success Response (`200 OK`):**

  ```json
  {
    "jobId": "<UUID>",
    "createdAt": "timestamp",
    "lastUpdatedAt": "timestamp",
    "status": "string",
    "currentStep": "string",
    "errorMessage": "string?",
    "details": {
      /* JSONB object with step-specific progress */
    }
  }
  ```

- **Error Response:** `401 Unauthorized`, `404 Not Found`, `500 Internal Server Error`.
- **Action:** Retrieves record from `workflow_runs` for the given `jobId`.
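
An illustrative client-side call for polling this endpoint (the domain and job id are placeholders):

```typescript
// Illustrative only: fetch a run's status and log the current step.
async function checkRun(jobId: string, apiKey: string): Promise<void> {
  const res = await fetch(`https://<YOUR_APP_DOMAIN>/api/system/workflow-status/${jobId}`, {
    headers: { "X-API-KEY": apiKey },
  });
  if (!res.ok) throw new Error(`Status request failed: ${res.status}`);
  const run = await res.json();
  console.log(run.status, run.currentStep ?? "");
}
```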

#### 3. Play.ht Webhook Receiver

- **Purpose:** To receive status updates and podcast audio URLs from Play.ht.
- **Endpoint Path:** `/api/webhooks/playht` (Next.js API Route Handler)
- **Method:** `POST`
- **Authentication:** Implement verification (e.g., shared secret token).
- **Request Body Schema (Expected from Play.ht):**

  ```json
  { "id": "string", "audioUrl": "string", "status": "string" }
  ```

- **Success Response (`200 OK`):** `{"message": "Webhook received successfully"}`
- **Action:** Updates `newsletters` and `workflow_runs` tables.
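
A minimal sketch of the webhook receiver (assumed path `app/api/webhooks/playht/route.ts`; the shared-secret header name and env var names are assumptions), updating the `newsletters` row matched by the PlayNote job id:

```typescript
// Sketch: verify a shared secret, then persist the podcast URL and status.
import { NextResponse } from "next/server";
import { createClient } from "@supabase/supabase-js";

export async function POST(request: Request) {
  if (request.headers.get("x-playht-webhook-secret") !== process.env.PLAYHT_WEBHOOK_SECRET) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  const payload = (await request.json()) as { id: string; audioUrl?: string; status: string };

  const supabase = createClient(
    process.env.NEXT_PUBLIC_SUPABASE_URL!,   // assumed env var names from the Supabase starter
    process.env.SUPABASE_SERVICE_ROLE_KEY!,
  );

  // Match the newsletter by the PlayNote job id recorded when generation was requested.
  const { error } = await supabase
    .from("newsletters")
    .update({ podcast_url: payload.audioUrl ?? null, podcast_status: payload.status })
    .eq("podcast_playht_job_id", payload.id);

  if (error) {
    return NextResponse.json({ error: error.message }, { status: 500 });
  }
  return NextResponse.json({ message: "Webhook received successfully" }, { status: 200 });
}
```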
@@ -0,0 +1,318 @@

# BMad DiCaster Architecture Document

## Introduction / Preamble

This document outlines the overall project architecture for BMad DiCaster, including backend systems, shared services, and non-UI specific concerns. Its primary goal is to serve as the guiding architectural blueprint for AI-driven development, ensuring consistency and adherence to chosen patterns and technologies.

**Relationship to Frontend Architecture:**
This project includes a significant user interface. A separate Frontend Architecture Document (expected to be named `frontend-architecture.md` and linked in "Key Reference Documents" once created) will detail the frontend-specific design and MUST be used in conjunction with this document. Core technology stack choices documented herein (see "Definitive Tech Stack Selections") are definitive for the entire project, including any frontend components.

## Table of Contents

- [Introduction / Preamble](#introduction--preamble)
- [Technical Summary](#technical-summary)
- [High-Level Overview](#high-level-overview)
- [Component View](#component-view)
  - [Architectural / Design Patterns Adopted](#architectural--design-patterns-adopted)
- [Workflow Orchestration and Status Management](#workflow-orchestration-and-status-management)
- [Project Structure](#project-structure)
  - [Key Directory Descriptions](#key-directory-descriptions)
  - [Monorepo Management](#monorepo-management)
  - [Notes](#notes)
- [API Reference](#api-reference)
  - [External APIs Consumed](#external-apis-consumed)
    - [1. Hacker News (HN) Algolia API](#1-hacker-news-hn-algolia-api)
    - [2. Play.ht API](#2-playht-api)
    - [3. LLM Provider (Facade for Summarization)](#3-llm-provider-facade-for-summarization)
    - [4. Nodemailer (Email Delivery Service)](#4-nodemailer-email-delivery-service)
  - [Internal APIs Provided (by BMad DiCaster)](#internal-apis-provided-by-bmad-dicaster)
    - [1. Workflow Trigger API](#1-workflow-trigger-api)
    - [2. Workflow Status API](#2-workflow-status-api)
    - [3. Play.ht Webhook Receiver](#3-playht-webhook-receiver)
- [Data Models](#data-models)
  - [Core Application Entities / Domain Objects](#core-application-entities--domain-objects)
    - [1. `WorkflowRun`](#1-workflowrun)
    - [2. `HNPost`](#2-hnpost)
    - [3. `HNComment`](#3-hncomment)
    - [4. `ScrapedArticle`](#4-scrapedarticle)
    - [5. `ArticleSummary`](#5-articlesummary)
    - [6. `CommentSummary`](#6-commentsummary)
    - [7. `Newsletter`](#7-newsletter)
    - [8. `Subscriber`](#8-subscriber)
    - [9. `SummarizationPrompt`](#9-summarizationprompt)
    - [10. `NewsletterTemplate`](#10-newslettertemplate)
  - [Database Schemas (Supabase PostgreSQL)](#database-schemas-supabase-postgresql)
    - [1. `workflow_runs`](#1-workflow_runs)
    - [2. `hn_posts`](#2-hn_posts)
    - [3. `hn_comments`](#3-hn_comments)
    - [4. `scraped_articles`](#4-scraped_articles)
    - [5. `article_summaries`](#5-article_summaries)
    - [6. `comment_summaries`](#6-comment_summaries)
    - [7. `newsletters`](#7-newsletters)
    - [8. `subscribers`](#8-subscribers)
    - [9. `summarization_prompts`](#9-summarization_prompts)
    - [10. `newsletter_templates`](#10-newsletter_templates)
- [Core Workflow / Sequence Diagrams](#core-workflow--sequence-diagrams)
  - [1. Daily Workflow Initiation & HN Content Acquisition](#1-daily-workflow-initiation--hn-content-acquisition)
  - [2. Article Scraping & Summarization Flow](#2-article-scraping--summarization-flow)
  - [3. Newsletter, Podcast, and Delivery Flow](#3-newsletter-podcast-and-delivery-flow)
- [Definitive Tech Stack Selections](#definitive-tech-stack-selections)
- [Infrastructure and Deployment Overview](#infrastructure-and-deployment-overview)
- [Error Handling Strategy](#error-handling-strategy)
- [Coding Standards](#coding-standards)
  - [Detailed Language & Framework Conventions](#detailed-language--framework-conventions)
    - [TypeScript/Node.js (Next.js & Supabase Functions) Specifics](#typescriptnodejs-nextjs--supabase-functions-specifics)
- [Overall Testing Strategy](#overall-testing-strategy)
- [Security Best Practices](#security-best-practices)
- [Key Reference Documents](#key-reference-documents)
- [Change Log](#change-log)
- [Prompt for Design Architect: Frontend Architecture Definition](#prompt-for-design-architect-frontend-architecture-definition)

## Technical Summary

BMad DiCaster is a web application designed to provide daily, concise summaries of top Hacker News (HN) posts, delivered as an HTML newsletter and an optional AI-generated podcast, accessible via a Next.js web interface. The system employs a serverless, event-driven architecture hosted on Vercel, with Supabase providing PostgreSQL database services and function hosting. Key components include services for HN content retrieval, article scraping (using Cheerio), AI-powered summarization (via a configurable LLM facade for Ollama/remote APIs), podcast generation (Play.ht), newsletter generation (Nodemailer), and workflow orchestration. The architecture emphasizes modularity, clear separation of concerns (pragmatic hexagonal approach for complex functions), and robust error handling, aiming for efficient development, particularly by AI developer agents.

## High-Level Overview

The BMad DiCaster application will adopt a **serverless, event-driven architecture** hosted entirely on Vercel, with Supabase providing backend services (database and functions). The project will be structured as a **monorepo**, containing both the Next.js frontend application and the backend Supabase functions.

The core data processing flow is designed as an event-driven pipeline:

1. A scheduled mechanism (Vercel Cron Job) or manual trigger (API/CLI) initiates the daily workflow, creating a `workflow_run` job.
2. Hacker News posts and comments are retrieved (HN Algolia API) and stored in Supabase.
3. This data insertion triggers a Supabase function (via database webhook) to scrape linked articles.
4. Successful article scraping and storage trigger further Supabase functions for AI-powered summarization of articles and comments.
5. The completion of summarization steps for a workflow run is tracked, and once all prerequisites are met, a newsletter generation service is triggered.
6. The newsletter content is sent to the Play.ht API to generate a podcast.
7. Play.ht calls a webhook to notify our system when the podcast is ready, providing the podcast URL.
8. The newsletter data in Supabase is updated with the podcast URL.
9. The newsletter is then delivered to subscribers via Nodemailer, after considering podcast availability (with delay/retry logic).
10. The Next.js frontend allows users to view current and past newsletters and listen to the podcasts.

This event-driven approach, using Supabase Database Webhooks (via `pg_net` or native functionality) to trigger Vercel-hosted Supabase Functions, aims to create a resilient and scalable system. It mitigates potential timeout issues by breaking down long-running processes into smaller, asynchronously triggered units.

Below is a system context diagram illustrating the primary services and user interactions:

```mermaid
graph TD
    User[Developer/Admin] -- "Triggers Daily Workflow (API/CLI/Cron)" --> BMadDiCasterBE[BMad DiCaster Backend Logic]
    UserWeb[End User] -- "Accesses Web Interface" --> BMadDiCasterFE["BMad DiCaster Frontend (Next.js on Vercel)"]
    BMadDiCasterFE -- "Displays Data From" --> SupabaseDB[Supabase PostgreSQL]
    BMadDiCasterFE -- "Interacts With for Data/Triggers" --> SupabaseFunctions[Supabase Functions on Vercel]

    subgraph "BMad DiCaster Backend Logic (Supabase Functions & Vercel)"
        direction LR
        SupabaseFunctions
        HNAPI[Hacker News Algolia API]
        ArticleScraper[Article Scraper Service]
        Summarizer["Summarization Service (LLM Facade)"]
        PlayHTAPI[Play.ht API]
        NewsletterService["Newsletter Generation & Delivery Service"]
        Nodemailer[Nodemailer Service]
    end

    BMadDiCasterBE --> SupabaseDB
    SupabaseFunctions -- "Fetches HN Data" --> HNAPI
    SupabaseFunctions -- "Scrapes Articles" --> ArticleScraper
    ArticleScraper -- "Gets URLs from" --> SupabaseDB
    ArticleScraper -- "Stores Content" --> SupabaseDB
    SupabaseFunctions -- "Summarizes Content" --> Summarizer
    Summarizer -- "Uses Prompts from / Stores Summaries" --> SupabaseDB
    SupabaseFunctions -- "Generates Podcast" --> PlayHTAPI
    PlayHTAPI -- "Sends Webhook (Podcast URL)" --> SupabaseFunctions
    SupabaseFunctions -- "Updates Podcast URL" --> SupabaseDB
    SupabaseFunctions -- "Generates Newsletter" --> NewsletterService
    NewsletterService -- "Uses Template/Data from" --> SupabaseDB
    NewsletterService -- "Sends Emails Via" --> Nodemailer
    SupabaseDB -- "Stores Subscriber List" --> NewsletterService

    classDef user fill:#9cf,stroke:#333,stroke-width:2px;
    classDef fe fill:#f9f,stroke:#333,stroke-width:2px;
    classDef be fill:#ccf,stroke:#333,stroke-width:2px;
    classDef external fill:#ffc,stroke:#333,stroke-width:2px;
    classDef db fill:#cfc,stroke:#333,stroke-width:2px;

    class User,UserWeb user;
    class BMadDiCasterFE fe;
    class BMadDiCasterBE,SupabaseFunctions,ArticleScraper,Summarizer,NewsletterService be;
    class HNAPI,PlayHTAPI,Nodemailer external;
    class SupabaseDB db;
```

## Component View

> This section has been moved to a dedicated document: [Component View](./component-view.md)

## Workflow Orchestration and Status Management

The BMad DiCaster application employs an event-driven pipeline for its daily content processing. To manage, monitor, and ensure the robust execution of this multi-step workflow, the following orchestration strategy is implemented:

**1. Central Workflow Tracking (`workflow_runs` Table):**

- A dedicated table, `public.workflow_runs` (defined in Data Models), serves as the single source of truth for the state and progress of each initiated daily workflow.
- Each workflow execution is identified by a unique `id` (jobId) in this table.
- Key fields include `status`, `current_step_details`, `error_message`, and a `details` JSONB column to store metadata and progress counters (e.g., `posts_fetched`, `articles_scraped_successfully`, `summaries_generated`, `podcast_playht_job_id`, `podcast_status`).

**2. Workflow Initiation:**

- A workflow is initiated via the `POST /api/system/trigger-workflow` API endpoint (callable manually, by CLI, or by a cron job).
- Upon successful trigger, a new record is created in `workflow_runs` with an initial status (e.g., 'pending' or 'fetching_hn'), and the `jobId` is returned to the caller.
- This initial record creation triggers the first service in the pipeline (`HNContentService`) via a database webhook or an initial direct call from the trigger API logic.

**3. Service Function Responsibilities:**

- Each backend Supabase Function (`HNContentService`, `ArticleScrapingService`, `SummarizationService`, `PodcastGenerationService`, `NewsletterGenerationService`) participating in the workflow **must**:
  - Be aware of the `workflow_run_id` for the job it is processing. This ID should be passed along or retrievable based on the triggering event/data.
  - **Before starting its primary task:** Update the `workflow_runs` table for the current `workflow_run_id` to reflect its `current_step_details` (e.g., "Started scraping article X for workflow Y").
  - **Upon successful completion of its task:**
    - Update any relevant data tables (e.g., `scraped_articles`, `article_summaries`).
    - Update the `workflow_runs.details` JSONB field with relevant output or counters (e.g., increment `articles_scraped_successfully_count`).
  - **Upon failure:** Update the `workflow_runs` table for the `workflow_run_id` to set `status` to 'failed', and populate `error_message` and `current_step_details` with failure information.
  - Utilize the shared `WorkflowTrackerService` (see point 5) for consistent status updates.
- The `PlayHTWebhookHandlerAPI` (Next.js API route) updates the `newsletters` table and then the `workflow_runs.details` with podcast status.

**4. Orchestration and Progression (`CheckWorkflowCompletionService`):**

- A dedicated Supabase Function, `CheckWorkflowCompletionService`, will be scheduled to run periodically (e.g., every 5-10 minutes via Vercel Cron Jobs invoking a dedicated HTTP endpoint for this service, or Supabase's `pg_cron` if preferred for DB-centric scheduling).
- This service orchestrates progression between major stages (a sketch of one such check follows this list) by:
  - Querying `workflow_runs` for jobs in intermediate statuses.
  - Verifying if all prerequisite tasks for the next stage are complete by:
    - Querying related data tables (e.g., `scraped_articles`, `article_summaries`, `comment_summaries`) based on the `workflow_run_id`.
    - Checking expected counts against actual completed counts (e.g., all articles intended for summarization have an `article_summaries` entry for the current `workflow_run_id`).
    - Checking the status of the podcast generation in the `newsletters` table (linked to `workflow_run_id`) before proceeding to email delivery.
  - If conditions for the next stage are met, it updates the `workflow_runs.status` (e.g., to 'generating_newsletter') and then invokes the appropriate next service (e.g., `NewsletterGenerationService`), passing the `workflow_run_id`.
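
For illustration only, a sketch of one such completion check using `@supabase/supabase-js`; the function name and the exact transition rule are assumptions, while table and column names follow the Data Models:

```typescript
// Sketch: advance a run from 'summarizing_content' to 'generating_newsletter' once every
// successfully scraped article for that run has a corresponding article summary.
import { SupabaseClient } from "@supabase/supabase-js";

export async function advanceIfSummarizationComplete(
  supabase: SupabaseClient,
  workflowRunId: string,
): Promise<boolean> {
  const { count: scraped } = await supabase
    .from("scraped_articles")
    .select("id", { count: "exact", head: true })
    .eq("workflow_run_id", workflowRunId)
    .eq("scraping_status", "success");

  const { count: summarized } = await supabase
    .from("article_summaries")
    .select("id", { count: "exact", head: true })
    .eq("workflow_run_id", workflowRunId);

  // Not all prerequisites are met yet; leave the run untouched.
  if ((scraped ?? 0) === 0 || (summarized ?? 0) < (scraped ?? 0)) return false;

  await supabase
    .from("workflow_runs")
    .update({ status: "generating_newsletter", last_updated_at: new Date().toISOString() })
    .eq("id", workflowRunId);

  // The next service (NewsletterGenerationService) would then be invoked with this workflowRunId.
  return true;
}
```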

**5. Shared `WorkflowTrackerService`:**

- A utility service, `WorkflowTrackerService`, will be created in `supabase/functions/_shared/`.
- It will provide standardized methods for all backend functions to interact with the `workflow_runs` table (e.g., `updateWorkflowStep()`, `incrementWorkflowDetailCounter()`, `failWorkflow()`, `completeWorkflowStep()`).
- This promotes consistency in status updates and reduces redundant code.
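
A possible TypeScript shape for this shared service, using the method names listed above (the parameter lists are assumptions):

```typescript
// Sketch of the shared tracker interface in supabase/functions/_shared/ (file name assumed).
export interface WorkflowTrackerService {
  // Record the current status and step details for a run.
  updateWorkflowStep(workflowRunId: string, status: string, currentStepDetails: string): Promise<void>;
  // Increment a counter inside the details JSONB column (e.g. articlesScrapedSuccessfully).
  incrementWorkflowDetailCounter(workflowRunId: string, counterKey: string, incrementBy?: number): Promise<void>;
  // Mark a step as done and optionally merge extra metadata into details.
  completeWorkflowStep(workflowRunId: string, detailsPatch?: Record<string, unknown>): Promise<void>;
  // Mark the whole run as failed with an error message.
  failWorkflow(workflowRunId: string, errorMessage: string): Promise<void>;
}
```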

**6. Podcast Link Before Email Delivery:**

- The `NewsletterGenerationService`, after generating the HTML and initiating podcast creation (via `PodcastGenerationService`), will set the `newsletters.podcast_status` to 'generating'.
- The `CheckWorkflowCompletionService` (or the `NewsletterGenerationService` itself if designed for polling/delay) will monitor the `newsletters.podcast_url` (populated by the `PlayHTWebhookHandlerAPI`) or `newsletters.podcast_status`.
- Email delivery is triggered by `CheckWorkflowCompletionService` once the podcast URL is available, a timeout is reached, or podcast generation fails (as per PRD's delay/retry logic). The final delivery status will be updated in `workflow_runs` and `newsletters`.

## Project Structure

> This section has been moved to a dedicated document: [Project Structure](./project-structure.md)

## API Reference

> This section has been moved to a dedicated document: [API Reference](./api-reference.md)

## Data Models

> This section has been moved to a dedicated document: [Data Models](./data-models.md)

## Core Workflow / Sequence Diagrams

> This section has been moved to a dedicated document: [Core Workflow / Sequence Diagrams](./sequence-diagrams.md)

## Definitive Tech Stack Selections

> This section has been moved to a dedicated document: [Definitive Tech Stack Selections](./tech-stack.md)

## Infrastructure and Deployment Overview

> This section has been moved to a dedicated document: [Infrastructure and Deployment Overview](./infra-deployment.md)

## Error Handling Strategy

> This section is part of the consolidated [Operational Guidelines](./operational-guidelines.md#error-handling-strategy).

## Coding Standards

> This section is part of the consolidated [Operational Guidelines](./operational-guidelines.md#coding-standards).

## Overall Testing Strategy

> This section is part of the consolidated [Operational Guidelines](./operational-guidelines.md#overall-testing-strategy).

## Security Best Practices

> This section is part of the consolidated [Operational Guidelines](./operational-guidelines.md#security-best-practices).

## Key Reference Documents

1. **Product Requirements Document (PRD):** `docs/prd-incremental-full-agile-mode.txt`
2. **UI/UX Specification:** `docs/ui-ux-spec.txt`
3. **Technical Preferences:** `docs/technical-preferences copy.txt`
4. **Environment Variables Documentation:** [Environment Variables Documentation](./environment-vars.md)
5. **(Optional) Frontend Architecture Document:** `docs/frontend-architecture.md` (To be created by Design Architect)
6. **Play.ht API Documentation:** [https://docs.play.ai/api-reference/playnote/post](https://docs.play.ai/api-reference/playnote/post)
7. **Hacker News Algolia API:** [https://hn.algolia.com/api](https://hn.algolia.com/api)
8. **Ollama API Documentation:** [https://github.com/ollama/ollama/blob/main/docs/api.md](https://github.com/ollama/ollama/blob/main/docs/api.md)
9. **Supabase Documentation:** [https://supabase.com/docs](https://supabase.com/docs)
10. **Next.js Documentation:** [https://nextjs.org/docs](https://nextjs.org/docs)
11. **Vercel Documentation:** [https://vercel.com/docs](https://vercel.com/docs)
12. **Pino Logging Documentation:** [https://getpino.io/](https://getpino.io/)
13. **Zod Documentation:** [https://zod.dev/](https://zod.dev/)

## Change Log

| Change                                     | Date       | Version | Description                                                                                                                                                                                 | Author         |
| :----------------------------------------- | :--------- | :------ | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | :------------- |
| Initial Draft based on PRD and discussions | 2025-05-13 | 0.1     | First complete draft covering project overview, components, data models, tech stack, deployment, error handling, coding standards, testing strategy, security, and workflow orchestration. | 3-arch (Agent) |

---

## Prompt for Design Architect: Frontend Architecture Definition

**To the Design Architect (Agent Specializing in Frontend Architecture):**

You are now tasked with defining the detailed **Frontend Architecture** for the BMad DiCaster project. This main Architecture Document and the `docs/ui-ux-spec.txt` are your primary input artifacts. Your goal is to produce a dedicated `frontend-architecture.md` document.

**Key Inputs & Constraints (from this Main Architecture Document & UI/UX Spec):**

1. **Overall Project Architecture:** Familiarize yourself with the "High-Level Overview," "Component View," "Data Models" (especially any shared types in `shared/types/`), and "API Reference" (particularly internal APIs like `/api/system/trigger-workflow` and `/api/webhooks/playht` that the frontend might indirectly be aware of or need to interact with for admin purposes in the future, though the MVP frontend primarily reads newsletter data).
2. **UI/UX Specification (`docs/ui-ux-spec.txt`):** This document contains user flows, wireframes, core screens (Newsletter List, Newsletter Detail), component inventory (NewsletterCard, PodcastPlayer, DownloadButton, BackButton), branding considerations (synthwave, minimalist), and accessibility aspirations.
3. **Definitive Technology Stack (Frontend Relevant):**
   - Framework: Next.js (`latest`, App Router)
   - Language: React (`19.0.0`) with TypeScript (`5.7.2`)
   - UI Libraries: Tailwind CSS (`3.4.17`), Shadcn UI (`latest`)
   - State Management: Zustand (`latest`)
   - Testing: React Testing Library (RTL) (`latest`), Jest (`latest`)
   - Starter Template: Vercel/Supabase Next.js App Router template ([https://vercel.com/templates/next.js/supabase](https://vercel.com/templates/next.js/supabase)). Leverage its existing structure for `app/`, `components/ui/` (from Shadcn), `lib/utils.ts`, and `utils/supabase/` (client, server, middleware helpers for Supabase).
4. **Project Structure (Frontend Relevant):** Refer to the "Project Structure" section in this document, particularly the `app/` directory, `components/` (for Shadcn `ui` and your `core` application components), `lib/`, and `utils/supabase/`.
5. **Existing Frontend Files (from template):** Be aware of `middleware.ts` (for Supabase auth) and any existing components or utility functions provided by the starter template.

**Tasks for Frontend Architecture Document (`frontend-architecture.md`):**

1. **Refine Frontend Project Structure:**
   - Detail the specific folder structure within `app/`. Propose organization for pages (routes), layouts, application-specific components (`app/components/core/`), data fetching logic, context providers, and Zustand stores.
   - How will Shadcn UI components (`components/ui/`) be used and potentially customized?
2. **Component Architecture:**
   - For each core screen identified in the UI/UX spec (Newsletter List, Newsletter Detail), define the primary React component hierarchy.
   - Specify responsibilities and key props for major reusable application components (e.g., `NewsletterCard`, `NewsletterDetailView`, `PodcastPlayerControls`).
   - How will components fetch and display data from Supabase? (e.g., Server Components, Client Components using the Supabase client from `utils/supabase/client.ts` or `utils/supabase/server.ts`).
3. **State Management (Zustand):**
   - Identify global and local state needs.
   - Define specific Zustand store(s): what data they will hold (e.g., current newsletter list, selected newsletter details, podcast player state), and what actions they will expose.
   - How will components interact with these stores?
4. **Data Fetching & Caching (Frontend):**
   - Specify patterns for fetching newsletter data (lists and individual items) and podcast information.
   - How will Next.js data fetching capabilities (Server Components, Route Handlers, `fetch` with caching options) be utilized with the Supabase client?
   - Address loading and error states for data fetching in the UI.
5. **Routing:**
   - Confirm Next.js App Router usage and define the URL structure for the newsletter list and detail pages.
6. **Styling Approach:**
   - Reiterate use of Tailwind CSS and Shadcn UI.
   - Define any project-specific conventions for applying Tailwind classes or extending the theme (beyond what's in `tailwind.config.ts`).
   - How will the "synthwave technical glowing purple vibes" be implemented using Tailwind?
7. **Error Handling (Frontend):**
   - How will errors from API calls (to Supabase or internal Next.js API routes, if any) be handled and displayed to the user?
   - Strategy for UI error boundaries.
8. **Accessibility (AX):**
   - Elaborate on how the WCAG 2.1 Level A requirements (keyboard navigation, semantic HTML, alt text, color contrast) will be met in component design and implementation, leveraging Next.js and Shadcn UI capabilities.
9. **Testing (Frontend):**
   - Reiterate the use of Jest and RTL for unit/integration testing of React components.
   - Provide examples or guidelines for writing effective frontend tests.
10. **Key Frontend Libraries & Versioning:** Confirm versions from the main tech stack and list any additional frontend-only libraries required.

Your output should be a clean, well-formatted `frontend-architecture.md` document ready for AI developer agents to use for frontend implementation. Adhere to the output formatting guidelines. You are now operating in **Frontend Architecture Mode**.

---

This concludes the BMad DiCaster Architecture Document.
@@ -0,0 +1,141 @@

# Component View

> This document is a granulated shard from the main "3-architecture.md" focusing on "Component View".

The BMad DiCaster system is composed of several key logical components, primarily implemented as serverless functions (Supabase Functions deployed on Vercel) and a Next.js frontend application. These components work together in an event-driven manner.

```mermaid
graph TD
    subgraph FrontendApp["Frontend Application (Next.js)"]
        direction LR
        WebAppUI["Web Application UI (React Components)"]
        APIServiceFE["API Service (Frontend - Next.js Route Handlers)"]
    end

    subgraph BackendServices["Backend Services (Supabase Functions & Core Logic)"]
        direction TB
        WorkflowTriggerAPI["Workflow Trigger API (/api/system/trigger-workflow)"]
        HNContentService["HN Content Service (Supabase Fn)"]
        ArticleScrapingService["Article Scraping Service (Supabase Fn)"]
        SummarizationService["Summarization Service (LLM Facade - Supabase Fn)"]
        PodcastGenerationService["Podcast Generation Service (Supabase Fn)"]
        NewsletterGenerationService["Newsletter Generation Service (Supabase Fn)"]
        PlayHTWebhookHandlerAPI["Play.ht Webhook API (/api/webhooks/playht)"]
        CheckWorkflowCompletionService["CheckWorkflowCompletionService (Supabase Cron Fn)"]
    end

    subgraph ExternalIntegrations["External APIs & Services"]
        direction TB
        HNAlgoliaAPI["Hacker News Algolia API"]
        PlayHTAPI["Play.ht API"]
        LLMProvider["LLM Provider (Ollama/Remote API)"]
        NodemailerService["Nodemailer (Email Delivery)"]
    end

    subgraph DataStorage["Data Storage (Supabase PostgreSQL)"]
        direction TB
        DB_WorkflowRuns["workflow_runs Table"]
        DB_Posts["hn_posts Table"]
        DB_Comments["hn_comments Table"]
        DB_Articles["scraped_articles Table"]
        DB_Summaries["article_summaries / comment_summaries Tables"]
        DB_Newsletters["newsletters Table"]
        DB_Subscribers["subscribers Table"]
        DB_Prompts["summarization_prompts Table"]
        DB_NewsletterTemplates["newsletter_templates Table"]
    end

    UserWeb[End User] --> WebAppUI
    WebAppUI --> APIServiceFE
    APIServiceFE --> WorkflowTriggerAPI
    APIServiceFE --> DataStorage

    DevAdmin[Developer/Admin/Cron] --> WorkflowTriggerAPI

    WorkflowTriggerAPI --> DB_WorkflowRuns

    DB_WorkflowRuns -- "Triggers (via CheckWorkflowCompletion or direct)" --> HNContentService
    HNContentService --> HNAlgoliaAPI
    HNContentService --> DB_Posts
    HNContentService --> DB_Comments
    HNContentService --> DB_WorkflowRuns

    DB_Posts -- "Triggers (via DB Webhook)" --> ArticleScrapingService
    ArticleScrapingService --> DB_Articles
    ArticleScrapingService --> DB_WorkflowRuns

    DB_Articles -- "Triggers (via DB Webhook)" --> SummarizationService
    SummarizationService --> LLMProvider
    SummarizationService --> DB_Prompts
    SummarizationService --> DB_Summaries
    SummarizationService --> DB_WorkflowRuns

    CheckWorkflowCompletionService -- "Monitors & Triggers Next Steps Based On" --> DB_WorkflowRuns
    CheckWorkflowCompletionService -- "Monitors & Triggers Next Steps Based On" --> DB_Summaries
    CheckWorkflowCompletionService -- "Monitors & Triggers Next Steps Based On" --> DB_Newsletters

    CheckWorkflowCompletionService --> NewsletterGenerationService
    NewsletterGenerationService --> DB_NewsletterTemplates
    NewsletterGenerationService --> DB_Summaries
    NewsletterGenerationService --> DB_Newsletters
    NewsletterGenerationService --> DB_WorkflowRuns

    CheckWorkflowCompletionService --> PodcastGenerationService
    PodcastGenerationService --> PlayHTAPI
    PodcastGenerationService --> DB_Newsletters
    PodcastGenerationService --> DB_WorkflowRuns

    PlayHTAPI -- "Webhook" --> PlayHTWebhookHandlerAPI
    PlayHTWebhookHandlerAPI --> DB_Newsletters
    PlayHTWebhookHandlerAPI --> DB_WorkflowRuns

    CheckWorkflowCompletionService -- "Triggers Delivery" --> NewsletterGenerationService
    NewsletterGenerationService -- "(For Delivery)" --> NodemailerService
    NewsletterGenerationService -- "(For Delivery)" --> DB_Subscribers
    NewsletterGenerationService -- "(For Delivery)" --> DB_Newsletters
    NewsletterGenerationService -- "(For Delivery)" --> DB_WorkflowRuns

    classDef user fill:#9cf,stroke:#333,stroke-width:2px;
    classDef feapp fill:#f9d,stroke:#333,stroke-width:2px;
    classDef beapp fill:#cdf,stroke:#333,stroke-width:2px;
    classDef external fill:#ffc,stroke:#333,stroke-width:2px;
    classDef db fill:#cfc,stroke:#333,stroke-width:2px;

    class UserWeb,DevAdmin user;
    class FrontendApp,WebAppUI,APIServiceFE feapp;
    class BackendServices,WorkflowTriggerAPI,HNContentService,ArticleScrapingService,SummarizationService,PodcastGenerationService,NewsletterGenerationService,PlayHTWebhookHandlerAPI,CheckWorkflowCompletionService beapp;
    class ExternalIntegrations,HNAlgoliaAPI,PlayHTAPI,LLMProvider,NodemailerService external;
    class DataStorage,DB_WorkflowRuns,DB_Posts,DB_Comments,DB_Articles,DB_Summaries,DB_Newsletters,DB_Subscribers,DB_Prompts,DB_NewsletterTemplates db;
```

- **Frontend Application (Next.js on Vercel):**
  - **Web Application UI (React Components):** Renders UI, displays newsletters/podcasts, handles user interactions.
  - **API Service (Frontend - Next.js Route Handlers):** Handles frontend-initiated API calls (e.g., for future admin functions) and receives incoming webhooks (Play.ht).
- **Backend Services (Supabase Functions & Core Logic):**
  - **Workflow Trigger API (`/api/system/trigger-workflow`):** Secure Next.js API route to manually initiate the daily workflow.
  - **HN Content Service (Supabase Fn):** Retrieves posts/comments from the HN Algolia API, stores them.
  - **Article Scraping Service (Supabase Fn):** Triggered by new HN posts, scrapes article content.
  - **Summarization Service (LLM Facade - Supabase Fn):** Triggered by new articles/comments, generates summaries using the LLM.
  - **Podcast Generation Service (Supabase Fn):** Sends newsletter content to the Play.ht API.
  - **Newsletter Generation Service (Supabase Fn):** Compiles the newsletter, handles podcast link logic, triggers email delivery.
  - **Play.ht Webhook API (`/api/webhooks/playht`):** Next.js API route to receive podcast status from Play.ht.
  - **CheckWorkflowCompletionService (Supabase Cron Fn):** Periodically monitors `workflow_runs` and related tables to orchestrate the progression between pipeline stages (e.g., from summarization to newsletter generation, then to delivery).
- **Data Storage (Supabase PostgreSQL):** Stores all application data including workflow state, content, summaries, newsletters, subscribers, prompts, and templates.
- **External APIs & Services:** HN Algolia API, Play.ht API, LLM Provider (Ollama/Remote), Nodemailer.

### Architectural / Design Patterns Adopted

- **Event-Driven Architecture:** Core backend processing is a series of steps triggered by database events (Supabase Database Webhooks calling Supabase Functions hosted on Vercel) and orchestrated via the `workflow_runs` table and the `CheckWorkflowCompletionService`.
- **Serverless Functions:** Backend logic is encapsulated in Supabase Functions (running on Vercel).
- **Monorepo:** All code resides in a single repository.
- **Facade Pattern:** Encapsulates interactions with external services (HN API, Play.ht API, LLM, Nodemailer) within `supabase/functions/_shared/`.
- **Factory Pattern (for LLM Service):** The `LLMFacade` will use a factory to instantiate the appropriate LLM client based on environment configuration (see the sketch after this list).
- **Hexagonal Architecture (Pragmatic Application):** For complex Supabase Functions, core business logic will be separated from framework-specific handlers and data interaction code (adapters) to improve testability and maintainability. Simpler functions may have a more direct implementation.
- **Repository Pattern (for Data Access - Conceptual):** Data access logic within services will be organized, conceptually resembling repositories, even if not strictly implemented with separate repository classes for all entities in MVP Supabase Functions.
- **Configuration via Environment Variables:** All sensitive and environment-specific configurations are managed via environment variables.
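
A sketch of the factory mentioned above; the adapter class names and module paths are illustrative assumptions, while the environment variable names follow the API Reference.

```typescript
// Sketch: choose an LLMFacade implementation from LLM_PROVIDER_TYPE.
import type { LLMFacade } from "./llm-facade.ts";
import { OllamaLLMAdapter } from "./ollama-llm-adapter.ts";   // assumed module
import { RemoteLLMAdapter } from "./remote-llm-adapter.ts";   // assumed module

export function createLLMFacade(env: Record<string, string | undefined>): LLMFacade {
  const model = env.LLM_MODEL_NAME ?? "llama3"; // fallback model name is an assumption
  switch (env.LLM_PROVIDER_TYPE) {
    case "remote":
      return new RemoteLLMAdapter(env.REMOTE_LLM_API_URL ?? "", env.REMOTE_LLM_API_KEY ?? "", model);
    case "ollama":
    default:
      return new OllamaLLMAdapter(env.OLLAMA_API_URL ?? "http://localhost:11434", model);
  }
}
```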
@@ -0,0 +1,232 @@

# Data Models

> This document is a granulated shard from the main "3-architecture.md" focusing on "Data Models".

This section defines the core data structures used within the BMad DiCaster application, including conceptual domain entities and their corresponding database schemas in Supabase PostgreSQL.

### Core Application Entities / Domain Objects

(Conceptual types, typically defined in `shared/types/domain-models.ts`)

#### 1. `WorkflowRun`

- **Description:** A single execution of the daily workflow.
- **Schema:** `id (string UUID)`, `createdAt (string ISO)`, `lastUpdatedAt (string ISO)`, `status (enum string: 'pending' | 'fetching_hn' | 'scraping_articles' | 'summarizing_content' | 'generating_podcast' | 'generating_newsletter' | 'delivering_newsletter' | 'completed' | 'failed')`, `currentStepDetails (string?)`, `errorMessage (string?)`, `details (object?: { postsFetched?: number, articlesAttempted?: number, articlesScrapedSuccessfully?: number, summariesGenerated?: number, podcastJobId?: string, podcastStatus?: string, newsletterGeneratedAt?: string, subscribersNotified?: number })`
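
Rendered as TypeScript (as it might appear in `shared/types/domain-models.ts`), the `WorkflowRun` entity above maps directly to:

```typescript
// Direct TypeScript rendering of the WorkflowRun schema listed above.
export type WorkflowRunStatus =
  | "pending" | "fetching_hn" | "scraping_articles" | "summarizing_content"
  | "generating_podcast" | "generating_newsletter" | "delivering_newsletter"
  | "completed" | "failed";

export interface WorkflowRun {
  id: string;             // UUID
  createdAt: string;      // ISO timestamp
  lastUpdatedAt: string;  // ISO timestamp
  status: WorkflowRunStatus;
  currentStepDetails?: string;
  errorMessage?: string;
  details?: {
    postsFetched?: number;
    articlesAttempted?: number;
    articlesScrapedSuccessfully?: number;
    summariesGenerated?: number;
    podcastJobId?: string;
    podcastStatus?: string;
    newsletterGeneratedAt?: string;
    subscribersNotified?: number;
  };
}
```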

#### 2. `HNPost`

- **Description:** A post from Hacker News.
- **Schema:** `id (string HN_objectID)`, `hnNumericId (number?)`, `title (string)`, `url (string?)`, `author (string)`, `points (number)`, `createdAt (string ISO)`, `retrievedAt (string ISO)`, `hnStoryText (string?)`, `numComments (number?)`, `tags (string[]?)`, `workflowRunId (string UUID?)`

#### 3. `HNComment`

- **Description:** A comment on an HN post.
- **Schema:** `id (string HN_commentID)`, `hnPostId (string)`, `parentId (string?)`, `author (string?)`, `text (string HTML)`, `createdAt (string ISO)`, `retrievedAt (string ISO)`, `children (HNComment[]?)`

#### 4. `ScrapedArticle`

- **Description:** Content scraped from an article URL.
- **Schema:** `id (string UUID)`, `hnPostId (string)`, `originalUrl (string)`, `resolvedUrl (string?)`, `title (string?)`, `author (string?)`, `publicationDate (string ISO?)`, `mainTextContent (string?)`, `scrapedAt (string ISO)`, `scrapingStatus (enum string: 'pending' | 'success' | 'failed_unreachable' | 'failed_paywall' | 'failed_parsing')`, `errorMessage (string?)`, `workflowRunId (string UUID?)`

#### 5. `ArticleSummary`

- **Description:** AI-generated summary of a `ScrapedArticle`.
- **Schema:** `id (string UUID)`, `scrapedArticleId (string UUID)`, `summaryText (string)`, `generatedAt (string ISO)`, `llmPromptVersion (string?)`, `llmModelUsed (string?)`, `workflowRunId (string UUID)`

#### 6. `CommentSummary`

- **Description:** AI-generated summary of comments for an `HNPost`.
- **Schema:** `id (string UUID)`, `hnPostId (string)`, `summaryText (string)`, `generatedAt (string ISO)`, `llmPromptVersion (string?)`, `llmModelUsed (string?)`, `workflowRunId (string UUID)`

#### 7. `Newsletter`

- **Description:** The daily generated newsletter.
- **Schema:** `id (string UUID)`, `workflowRunId (string UUID)`, `targetDate (string YYYY-MM-DD)`, `title (string)`, `generatedAt (string ISO)`, `htmlContent (string)`, `mjmlTemplateVersion (string?)`, `podcastPlayhtJobId (string?)`, `podcastUrl (string?)`, `podcastStatus (enum string?: 'pending' | 'generating' | 'completed' | 'failed')`, `deliveryStatus (enum string: 'pending' | 'sending' | 'sent' | 'partially_failed' | 'failed')`, `scheduledSendAt (string ISO?)`, `sentAt (string ISO?)`

#### 8. `Subscriber`

- **Description:** An email subscriber.
- **Schema:** `id (string UUID)`, `email (string)`, `subscribedAt (string ISO)`, `isActive (boolean)`, `unsubscribedAt (string ISO?)`

#### 9. `SummarizationPrompt`

- **Description:** Stores prompts for AI summarization.
- **Schema:** `id (string UUID)`, `promptName (string)`, `promptText (string)`, `version (string)`, `createdAt (string ISO)`, `updatedAt (string ISO)`, `isDefaultArticlePrompt (boolean)`, `isDefaultCommentPrompt (boolean)`

#### 10. `NewsletterTemplate`

- **Description:** HTML/MJML templates for newsletters.
- **Schema:** `id (string UUID)`, `templateName (string)`, `mjmlContent (string?)`, `htmlContent (string)`, `version (string)`, `createdAt (string ISO)`, `updatedAt (string ISO)`, `isDefault (boolean)`

### Database Schemas (Supabase PostgreSQL)

#### 1. `workflow_runs`

```sql
CREATE TABLE public.workflow_runs (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
  last_updated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
  status TEXT NOT NULL DEFAULT 'pending', -- pending, fetching_hn, scraping_articles, summarizing_content, generating_podcast, generating_newsletter, delivering_newsletter, completed, failed
  current_step_details TEXT NULL,
  error_message TEXT NULL,
  details JSONB NULL -- {postsFetched, articlesAttempted, articlesScrapedSuccessfully, summariesGenerated, podcastJobId, podcastStatus, newsletterGeneratedAt, subscribersNotified}
);
COMMENT ON COLUMN public.workflow_runs.status IS 'Possible values: pending, fetching_hn, scraping_articles, summarizing_content, generating_podcast, generating_newsletter, delivering_newsletter, completed, failed';
COMMENT ON COLUMN public.workflow_runs.details IS 'Stores step-specific progress or metadata like postsFetched, articlesScraped, podcastJobId, etc.';
```

#### 2. `hn_posts`

```sql
CREATE TABLE public.hn_posts (
  id TEXT PRIMARY KEY, -- HN's objectID
  hn_numeric_id BIGINT NULL UNIQUE,
  title TEXT NOT NULL,
  url TEXT NULL,
  author TEXT NULL,
  points INTEGER NOT NULL DEFAULT 0,
  created_at TIMESTAMPTZ NOT NULL, -- HN post creation time
  retrieved_at TIMESTAMPTZ NOT NULL DEFAULT now(),
  hn_story_text TEXT NULL,
  num_comments INTEGER NULL DEFAULT 0,
  tags TEXT[] NULL,
  workflow_run_id UUID NULL REFERENCES public.workflow_runs(id) ON DELETE SET NULL -- The run that fetched this instance of the post
);
COMMENT ON COLUMN public.hn_posts.id IS 'Hacker News objectID for the story.';
```

#### 3. `hn_comments`

```sql
CREATE TABLE public.hn_comments (
  id TEXT PRIMARY KEY, -- HN's comment ID
  hn_post_id TEXT NOT NULL REFERENCES public.hn_posts(id) ON DELETE CASCADE,
  parent_comment_id TEXT NULL REFERENCES public.hn_comments(id) ON DELETE CASCADE,
  author TEXT NULL,
  comment_text TEXT NOT NULL, -- HTML content of the comment
  created_at TIMESTAMPTZ NOT NULL, -- HN comment creation time
  retrieved_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
CREATE INDEX idx_hn_comments_post_id ON public.hn_comments(hn_post_id);
```

#### 4. `scraped_articles`

```sql
CREATE TABLE public.scraped_articles (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  hn_post_id TEXT NOT NULL REFERENCES public.hn_posts(id) ON DELETE CASCADE,
  original_url TEXT NOT NULL,
  resolved_url TEXT NULL,
  title TEXT NULL,
  author TEXT NULL,
  publication_date TIMESTAMPTZ NULL,
  main_text_content TEXT NULL,
  scraped_at TIMESTAMPTZ NOT NULL DEFAULT now(),
  scraping_status TEXT NOT NULL DEFAULT 'pending', -- pending, success, failed_unreachable, failed_paywall, failed_parsing
  error_message TEXT NULL,
  workflow_run_id UUID NULL REFERENCES public.workflow_runs(id) ON DELETE SET NULL
);
CREATE UNIQUE INDEX idx_scraped_articles_hn_post_id_workflow_run_id ON public.scraped_articles(hn_post_id, workflow_run_id);
COMMENT ON COLUMN public.scraped_articles.scraping_status IS 'Possible values: pending, success, failed_unreachable, failed_paywall, failed_parsing, failed_generic';
```

#### 5. `article_summaries`

```sql
CREATE TABLE public.article_summaries (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  scraped_article_id UUID NOT NULL REFERENCES public.scraped_articles(id) ON DELETE CASCADE,
  summary_text TEXT NOT NULL,
  generated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
  llm_prompt_version TEXT NULL,
  llm_model_used TEXT NULL,
  workflow_run_id UUID NOT NULL REFERENCES public.workflow_runs(id) ON DELETE CASCADE
);
CREATE UNIQUE INDEX idx_article_summaries_scraped_article_id_workflow_run_id ON public.article_summaries(scraped_article_id, workflow_run_id);
COMMENT ON COLUMN public.article_summaries.llm_prompt_version IS 'Version or identifier of the summarization prompt used.';
```

#### 6. `comment_summaries`

```sql
CREATE TABLE public.comment_summaries (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  hn_post_id TEXT NOT NULL REFERENCES public.hn_posts(id) ON DELETE CASCADE,
  summary_text TEXT NOT NULL,
  generated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
  llm_prompt_version TEXT NULL,
  llm_model_used TEXT NULL,
  workflow_run_id UUID NOT NULL REFERENCES public.workflow_runs(id) ON DELETE CASCADE
);
CREATE UNIQUE INDEX idx_comment_summaries_hn_post_id_workflow_run_id ON public.comment_summaries(hn_post_id, workflow_run_id);
```

#### 7. `newsletters`

```sql
CREATE TABLE public.newsletters (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  workflow_run_id UUID NOT NULL UNIQUE REFERENCES public.workflow_runs(id) ON DELETE CASCADE,
  target_date DATE NOT NULL UNIQUE,
  title TEXT NOT NULL,
  generated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
  html_content TEXT NOT NULL,
  mjml_template_version TEXT NULL,
  podcast_playht_job_id TEXT NULL,
  podcast_url TEXT NULL,
  podcast_status TEXT NULL DEFAULT 'pending', -- pending, generating, completed, failed
  delivery_status TEXT NOT NULL DEFAULT 'pending', -- pending, sending, sent, failed, partially_failed
  scheduled_send_at TIMESTAMPTZ NULL,
  sent_at TIMESTAMPTZ NULL
);
CREATE INDEX idx_newsletters_target_date ON public.newsletters(target_date);
COMMENT ON COLUMN public.newsletters.target_date IS 'The date this newsletter pertains to. Ensures uniqueness.';
```

#### 8. `subscribers`

```sql
CREATE TABLE public.subscribers (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  email TEXT NOT NULL UNIQUE,
  subscribed_at TIMESTAMPTZ NOT NULL DEFAULT now(),
  is_active BOOLEAN NOT NULL DEFAULT TRUE,
  unsubscribed_at TIMESTAMPTZ NULL
);
CREATE INDEX idx_subscribers_email_active ON public.subscribers(email, is_active);
```

#### 9. `summarization_prompts`

```sql
CREATE TABLE public.summarization_prompts (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  prompt_name TEXT NOT NULL UNIQUE,
  prompt_text TEXT NOT NULL,
  version TEXT NOT NULL DEFAULT '1.0',
  created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
  updated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
  is_default_article_prompt BOOLEAN NOT NULL DEFAULT FALSE,
  is_default_comment_prompt BOOLEAN NOT NULL DEFAULT FALSE
);
COMMENT ON COLUMN public.summarization_prompts.prompt_name IS 'Unique identifier for the prompt, e.g., article_summary_v2.1';
-- Application logic will enforce that only one prompt of each type is marked as default.
```

#### 10. `newsletter_templates`

```sql
CREATE TABLE public.newsletter_templates (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  template_name TEXT NOT NULL UNIQUE,
  mjml_content TEXT NULL,
  html_content TEXT NOT NULL,
  version TEXT NOT NULL DEFAULT '1.0',
  created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
  updated_at TIMESTAMPTZ NOT NULL DEFAULT now(),
  is_default BOOLEAN NOT NULL DEFAULT FALSE
);
-- Application logic will enforce that only one template is marked as default.
```
@@ -0,0 +1,9 @@

# Environment Variables Documentation

> This document is a granulated shard from the main "3-architecture.md" focusing on "Environment Variables Documentation".

The BMad DiCaster Architecture Document (`3-architecture.md`) indicates that detailed environment variable documentation is intended to be consolidated, potentially in a file named `docs/environment-vars.md`. This file is marked as "(To be created)" within the "Key Reference Documents" section of `3-architecture.md`.

While specific environment variables are mentioned contextually throughout `3-architecture.md` (e.g., for Play.ht API keys, LLM provider configuration, SMTP settings, and workflow trigger API keys), a dedicated, centralized list of all variables, their purposes, and example values is not present as a single extractable section suitable for verbatim sharding at this time.

This sharded document serves as a placeholder, reflecting the sharding plan's intent to capture "Environment Variables Documentation". For specific variables mentioned in context, please refer to the full `3-architecture.md` (particularly sections like API Reference, Infrastructure Overview, and Security Best Practices) until a dedicated and consolidated list is formally compiled as intended.
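
As an interim aid, the sketch below gathers the variables that are mentioned contextually across these shards into one illustrative TypeScript config helper; the helper itself and its grouping are assumptions, not part of the source documents.

```typescript
// Illustrative consolidation of the environment variables referenced in this architecture shard.
export interface AppConfig {
  workflowTriggerApiKey: string;
  playhtUserId: string;
  playhtApiKey: string;
  llmProviderType: string;   // 'ollama' | 'remote'
  ollamaApiUrl?: string;
  remoteLlmApiUrl?: string;
  remoteLlmApiKey?: string;
  llmModelName: string;
  smtpHost: string;
  smtpPort: number;
  smtpUser: string;
  smtpPass: string;
}

export function loadConfig(env: Record<string, string | undefined>): AppConfig {
  return {
    workflowTriggerApiKey: env.WORKFLOW_TRIGGER_API_KEY ?? "",
    playhtUserId: env.PLAYHT_USER_ID ?? "",
    playhtApiKey: env.PLAYHT_API_KEY ?? "",
    llmProviderType: env.LLM_PROVIDER_TYPE ?? "ollama",
    ollamaApiUrl: env.OLLAMA_API_URL,
    remoteLlmApiUrl: env.REMOTE_LLM_API_URL,
    remoteLlmApiKey: env.REMOTE_LLM_API_KEY,
    llmModelName: env.LLM_MODEL_NAME ?? "",
    smtpHost: env.SMTP_HOST ?? "",
    smtpPort: Number(env.SMTP_PORT ?? 587),
    smtpUser: env.SMTP_USER ?? "",
    smtpPass: env.SMTP_PASS ?? "",
  };
}
```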

BETA-V3/v3-demos/full-stack-app-demo/10-sharded-docs/epic-1.md
@@ -0,0 +1,111 @@
|
||||
# Epic 1: Project Initialization, Setup, and HN Content Acquisition
|
||||
|
||||
> This document is a granulated shard from the main "BETA-V3/v3-demos/full-stack-app-demo/8-prd-po-updated.md" focusing on "Epic 1: Project Initialization, Setup, and HN Content Acquisition".
|
||||
|
||||
- Goal: Establish the foundational project structure, including the Next.js application, Supabase integration, deployment pipeline, API/CLI triggers, core workflow orchestration, and implement functionality to retrieve, process, and store Hacker News posts/comments via a `ContentAcquisitionFacade`, providing data for newsletter generation. Implement the database event mechanism to trigger subsequent processing. Define core configuration tables, seed data, and set up testing frameworks.
|
||||
|
||||
- **Story 1.1:** As a developer, I want to set up the Next.js project with Supabase integration, so that I have a functional foundation for building the application.
|
||||
- Acceptance Criteria:
|
||||
- The Next.js project is initialized using the Vercel/Supabase template.
|
||||
- Supabase is successfully integrated with the Next.js project.
|
||||
- The project codebase is initialized in a Git repository.
|
||||
- A basic project `README.md` is created in the root of the repository, including a project overview, links to main documentation (PRD, architecture), and essential developer setup/run commands.
|
||||
- **Story 1.2:** As a developer, I want to configure the deployment pipeline to Vercel with separate development and production environments, so that I can easily deploy and update the application.
|
||||
- Acceptance Criteria:
|
||||
- The project is successfully linked to a Vercel project with separate environments.
|
||||
- Automated deployments are configured for the main branch to the production environment.
|
||||
- Environment variables are set up for local development and Vercel deployments.
|
||||
- **Story 1.3:** As a developer, I want to implement the API and CLI trigger mechanisms, so that I can manually trigger the workflow during development and testing.
|
||||
- Acceptance Criteria:
|
||||
- A secure API endpoint is created.
|
||||
- The API endpoint requires authentication (API key).
|
||||
- The API endpoint (`/api/system/trigger-workflow`) creates an entry in the `workflow_runs` table and returns the `jobId`.
|
||||
- The API endpoint returns an appropriate response to indicate success or failure.
|
||||
- The API endpoint is secured via an API key.
|
||||
- A CLI command is created.
|
||||
- The CLI command invokes the `/api/system/trigger-workflow` endpoint or directly interacts with `WorkflowTrackerService` to start a new workflow run.
|
||||
- The CLI command provides informative output to the console.
|
||||
- All API requests and CLI command executions are logged, including timestamps and any relevant data.
|
||||
- All interactions with the API or CLI that initiate a workflow must record the `workflow_run_id` in logs.
|
||||
- The API and CLI interfaces adhere to mobile responsiveness and Tailwind/theming principles.
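
_Illustrative sketch (not part of the acceptance criteria):_ one possible shape for the `/api/system/trigger-workflow` Route Handler described in Story 1.3. The header name, environment variable name, and `workflow_runs` column names used here are assumptions.

```typescript
// app/api/system/trigger-workflow/route.ts (illustrative sketch)
import { NextResponse } from "next/server";
import { createClient } from "@/utils/supabase/server";

export async function POST(request: Request) {
  // Assumed header and env-var names for the API key check.
  const apiKey = request.headers.get("x-api-key");
  if (!apiKey || apiKey !== process.env.WORKFLOW_TRIGGER_API_KEY) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  const supabase = createClient();
  const { data, error } = await supabase
    .from("workflow_runs")
    .insert({ status: "pending" }) // column names assumed
    .select("id")
    .single();

  if (error) {
    console.error("Failed to create workflow run", error);
    return NextResponse.json({ error: "Failed to start workflow" }, { status: 500 });
  }
  return NextResponse.json({ jobId: data.id }, { status: 202 });
}
```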
|
||||
- **Story 1.4:** As a system, I want to retrieve the top 30 Hacker News posts and associated comments daily using a configurable `ContentAcquisitionFacade`, so that the data is available for summarization and newsletter generation.
|
||||
- Acceptance Criteria:
|
||||
- A `ContentAcquisitionFacade` is implemented in `supabase/functions/_shared/` to abstract interaction with the news data source (initially HN Algolia API).
|
||||
- The facade handles API authentication (if any), request formation, and response parsing for the specific news source.
|
||||
- The facade implements basic retry logic for transient errors.
|
||||
- Unit tests for the `ContentAcquisitionFacade` (mocking actual HTTP calls to the HN Algolia API) achieve >80% coverage.
|
||||
- The system retrieves the top 30 Hacker News posts daily via the `ContentAcquisitionFacade`.
|
||||
- The system retrieves associated comments for the top 30 posts via the `ContentAcquisitionFacade`.
|
||||
- Retrieved data (posts and comments) is stored in Supabase database, linked to the current `workflow_run_id`.
|
||||
- This functionality can be triggered via the API and CLI.
|
||||
- The system logs the start and completion of the retrieval process, including any errors.
|
||||
- Upon completion, the service updates the `workflow_runs` table with status and details (e.g., number of posts fetched) via `WorkflowTrackerService`.
|
||||
- Supabase migrations for `hn_posts` and `hn_comments` tables (as defined in `architecture.txt`) are created and applied before data operations.
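
_Illustrative sketch (not part of the acceptance criteria):_ a compact shape for the `ContentAcquisitionFacade` in Story 1.4. Method names, types, and retry parameters are illustrative; the HN Algolia endpoint matches the current public API but should be confirmed.

```typescript
// supabase/functions/_shared/content-acquisition-facade.ts (illustrative sketch)
export interface HNStory {
  objectID: string;
  title: string;
  url: string | null;
  points: number;
  num_comments: number;
}

export class AlgoliaContentAcquisitionFacade {
  private baseUrl = "https://hn.algolia.com/api/v1";

  // Fetch front-page stories, sort by points (descending), and keep the top `limit`.
  async getTopStories(limit = 30): Promise<HNStory[]> {
    const res = await this.fetchWithRetry(`${this.baseUrl}/search?tags=front_page`);
    const body = await res.json();
    return (body.hits as HNStory[])
      .sort((a, b) => b.points - a.points)
      .slice(0, limit);
  }

  // Basic retry for transient errors (attempt count and backoff are arbitrary here).
  private async fetchWithRetry(url: string, attempts = 3): Promise<Response> {
    for (let i = 0; i < attempts; i++) {
      try {
        const res = await fetch(url);
        if (res.ok) return res;
      } catch (_) {
        // Network error: fall through and retry.
      }
      await new Promise((r) => setTimeout(r, 500 * (i + 1)));
    }
    throw new Error(`Failed to fetch ${url} after ${attempts} attempts`);
  }
}
```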
|
||||
- **Story 1.5: Define and Implement `workflow_runs` Table and `WorkflowTrackerService`**
|
||||
- Goal: Implement the core workflow orchestration mechanism (tracking part).
|
||||
- Acceptance Criteria:
|
||||
- Supabase migration created for the `workflow_runs` table as defined in the architecture document.
|
||||
- `WorkflowTrackerService` implemented in `supabase/functions/_shared/` with methods for initiating, updating step details, incrementing counters, failing, and completing workflow runs.
|
||||
- Service includes robust error handling and logging via Pino.
|
||||
- Unit tests for `WorkflowTrackerService` achieve >80% coverage.
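
_Illustrative sketch (not part of the acceptance criteria):_ the method surface implied by Story 1.5 could look roughly like the following. Method names, parameter shapes, and status values are illustrative.

```typescript
// supabase/functions/_shared/workflow-tracker-service.ts (illustrative sketch)
export type WorkflowStatus =
  | "pending"
  | "fetching_content"
  | "summarizing"
  | "generating_newsletter"
  | "generating_podcast"
  | "completed"
  | "failed";

export interface WorkflowTrackerService {
  initiateRun(): Promise<{ workflowRunId: string }>;
  updateStepDetails(workflowRunId: string, details: Record<string, unknown>): Promise<void>;
  incrementCounter(workflowRunId: string, counter: string, by?: number): Promise<void>;
  failRun(workflowRunId: string, errorMessage: string): Promise<void>;
  completeRun(workflowRunId: string): Promise<void>;
}
```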
|
||||
- **Story 1.6: Implement `CheckWorkflowCompletionService` (Supabase Cron Function)**
|
||||
- Goal: Implement the core workflow orchestration mechanism (progression part).
|
||||
- Acceptance Criteria:
|
||||
- Supabase Function `check-workflow-completion-service` created.
|
||||
- Function queries `workflow_runs` and related tables to determine if a workflow run is ready to progress to the next major stage.
|
||||
- Function correctly updates `workflow_runs.status` and invokes the next appropriate service function.
|
||||
- Logic for handling podcast link availability is implemented here or in conjunction with `NewsletterGenerationService`.
|
||||
- The function is configurable to be run periodically.
|
||||
- Comprehensive logging implemented using Pino.
|
||||
- Unit tests achieve >80% coverage.
|
||||
- **Story 1.7: Implement Workflow Status API Endpoint (`/api/system/workflow-status/{jobId}`)**
|
||||
- Goal: Allow developers/admins to check the status of a workflow run.
|
||||
- Acceptance Criteria:
|
||||
- Next.js API Route Handler created at `/api/system/workflow-status/{jobId}`.
|
||||
- Endpoint secured with API Key authentication.
|
||||
- Retrieves and returns status details from the `workflow_runs` table.
|
||||
- Handles cases where `jobId` is not found (404).
|
||||
- Unit and integration tests for the API endpoint.
|
||||
- **Story 1.8: Create and document `docs/environment-vars.md` and set up `.env.example`**
|
||||
- Goal: Ensure environment variables are properly documented and managed.
|
||||
- Acceptance Criteria:
|
||||
- A `docs/environment-vars.md` file is created.
|
||||
- An `.env.example` file is created.
|
||||
- Sensitive information in examples is masked.
|
||||
- For each third-party service requiring credentials, `docs/environment-vars.md` includes:
|
||||
- A brief note or link guiding the user on where to typically sign up for the service and obtain the necessary API key or credential.
|
||||
- A recommendation for the user to check the service's current free/low-tier API rate limits against expected MVP usage.
|
||||
- A note that usage beyond free tier limits for commercial services (like Play.ht, remote LLMs, or email providers) may incur costs, and the user should review the provider's pricing.
|
||||
- **Story 1.9 (New): Implement Database Event/Webhook: `hn_posts` Insert to Article Scraping Service**
|
||||
- Goal: To ensure that the successful insertion of a new Hacker News post into the `hn_posts` table automatically triggers the `ArticleScrapingService`.
|
||||
- Acceptance Criteria:
|
||||
- A Supabase database trigger or webhook mechanism (e.g., using `pg_net` or native triggers calling a function) is implemented on the `hn_posts` table for INSERT operations.
|
||||
- The trigger successfully invokes the `ArticleScrapingService` (Supabase Function).
|
||||
- The invocation passes necessary parameters like `hn_post_id` and `workflow_run_id` to the `ArticleScrapingService`.
|
||||
- The mechanism is robust and includes error handling/logging for the trigger/webhook itself.
|
||||
- Unit/integration tests are created to verify the trigger fires correctly and the service is invoked with correct parameters.
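
_Illustrative sketch (not part of the acceptance criteria):_ on the receiving side, a database webhook on `hn_posts` INSERTs would POST a JSON payload to the `ArticleScrapingService` function. The payload shape shown is the typical Supabase database-webhook format and the column names are assumptions; both should be verified against the actual configuration.

```typescript
// supabase/functions/article-scraping-service/index.ts (illustrative sketch)
Deno.serve(async (req: Request) => {
  // Typical database-webhook payload: { type, table, schema, record, old_record }
  const payload = await req.json();
  if (payload.type !== "INSERT" || payload.table !== "hn_posts") {
    return new Response("Ignored", { status: 200 });
  }

  const hnPostId = payload.record?.id; // column names assumed
  const workflowRunId = payload.record?.workflow_run_id;
  console.log("Scraping requested", { hnPostId, workflowRunId });

  // ...invoke the scraping logic for this post here...
  return new Response(JSON.stringify({ ok: true }), {
    headers: { "Content-Type": "application/json" },
  });
});
```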
|
||||
- **Story 1.10 (New): Define and Implement Core Configuration Tables**
|
||||
- Goal: To establish the database tables necessary for storing core application configurations like summarization prompts, newsletter templates, and subscriber lists.
|
||||
- Acceptance Criteria:
|
||||
- A Supabase migration is created and applied to define the `summarization_prompts` table schema as specified in `architecture.txt`.
|
||||
- A Supabase migration is created and applied to define the `newsletter_templates` table schema as specified in `architecture.txt`.
|
||||
- A Supabase migration is created and applied to define the `subscribers` table schema as specified in `architecture.txt`.
|
||||
- These tables are ready for data population (e.g., via seeding or manual entry for MVP).
|
||||
- **Story 1.11 (New): Create Seed Data for Initial Configuration**
|
||||
- Goal: To populate the database with initial configuration data (prompts, templates, test subscribers) necessary for development and testing of MVP features.
|
||||
- Acceptance Criteria:
|
||||
- A `supabase/seed.sql` file (or an equivalent, documented seeding mechanism) is created.
|
||||
- The seed mechanism populates the `summarization_prompts` table with at least one default article prompt and one default comment prompt.
|
||||
- The seed mechanism populates the `newsletter_templates` table with at least one default newsletter template (HTML format for MVP).
|
||||
- The seed mechanism populates the `subscribers` table with a small list of 1-3 test email addresses for MVP delivery testing.
|
||||
- Instructions on how to apply the seed data to a local or development Supabase instance are documented (e.g., in the project `README.md`).
|
||||
- **Story 1.12 (New): Set up and Configure Project Testing Frameworks**
|
||||
- Goal: To ensure that the primary testing frameworks (Jest, React Testing Library, Playwright) are installed and configured early in the project lifecycle, enabling test-driven development practices and adherence to the testing strategy.
|
||||
- Acceptance Criteria:
|
||||
- Jest and React Testing Library (RTL) are installed as project dependencies.
|
||||
- Jest and RTL are configured for unit and integration testing of Next.js components and JavaScript/TypeScript code (e.g., `jest.config.js` is set up, necessary Babel/TS transformations are in place).
|
||||
- A sample unit test (e.g., for a simple component or utility function) is created and runs successfully using the Jest/RTL setup.
|
||||
- Playwright is installed as a project dependency.
|
||||
- Playwright is configured for end-to-end testing (e.g., `playwright.config.ts` is set up, browser configurations are defined).
|
||||
- A sample E2E test (e.g., navigating to the application's homepage on the local development server) is created and runs successfully using Playwright.
|
||||
- Scripts to execute tests (e.g., unit tests, E2E tests) are added to `package.json`.
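
_Illustrative sketch (not part of the acceptance criteria):_ the kind of sample unit test Story 1.12 calls for. The component, file path, and props are placeholders, and `toBeInTheDocument` assumes `@testing-library/jest-dom` is configured.

```typescript
// __tests__/page-wrapper.test.tsx (illustrative sample Jest + RTL test)
import { render, screen } from "@testing-library/react";
import PageWrapper from "@/app/components/layout/PageWrapper";

test("renders its children", () => {
  render(<PageWrapper>hello</PageWrapper>);
  expect(screen.getByText("hello")).toBeInTheDocument();
});
```

A matching Playwright smoke test would simply navigate to the local homepage and assert that a known heading is visible.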
|
||||
@@ -0,0 +1,39 @@
|
||||
# Epic 2: Article Scraping
|
||||
|
||||
> This document is a granulated shard from the main "BETA-V3/v3-demos/full-stack-app-demo/8-prd-po-updated.md" focusing on "Epic 2: Article Scraping".
|
||||
|
||||
- Goal: Implement the functionality to scrape and store linked articles from HN posts, enriching the data available for summarization and the newsletter. Ensure this functionality is triggered by database events and can be tested via API/CLI (if retained). Implement the database event mechanism to trigger subsequent processing.
|
||||
|
||||
- **Story 2.1:** As a system, I want to identify URLs within the top 30 (configurable via environment variable) Hacker News posts, so that I can extract the content of linked articles.
|
||||
- Acceptance Criteria:
|
||||
- The system parses the top N (configurable via env var) Hacker News posts to identify URLs.
|
||||
- The system filters out URLs that are not relevant to article scraping (e.g., links to images or videos).
|
||||
- **Story 2.2:** As a system, I want to scrape the content of the identified article URLs using Cheerio, so that I can provide summaries in the newsletter.
|
||||
- Acceptance Criteria:
|
||||
- The system scrapes the content from the identified article URLs using Cheerio.
|
||||
- The system extracts relevant content such as the article title, author, publication date, and main text.
|
||||
- The system handles potential issues during scraping, such as website errors or changes in website structure, logging errors for review.
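
_Illustrative sketch (not part of the acceptance criteria):_ a minimal Cheerio-based extraction for Story 2.2. The selectors and fallbacks are illustrative; real-world articles will need more robust handling.

```typescript
// supabase/functions/_shared/article-scraper.ts (illustrative sketch)
import * as cheerio from "cheerio";

export interface ScrapedArticle {
  title: string | null;
  author: string | null;
  publishedAt: string | null;
  text: string;
}

export async function scrapeArticle(url: string): Promise<ScrapedArticle> {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Fetch failed with status ${res.status}`);
  const $ = cheerio.load(await res.text());

  return {
    title: $("meta[property='og:title']").attr("content") || $("title").text() || null,
    author: $("meta[name='author']").attr("content") || null,
    publishedAt: $("meta[property='article:published_time']").attr("content") || null,
    // Naive main-text extraction; a production implementation would be more selective.
    text: ($("article").text() || $("body").text()).trim(),
  };
}
```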
|
||||
- **Story 2.3:** As a system, I want to store the scraped article content in the Supabase database, associated with the corresponding Hacker News post and workflow run, so that it can be used for summarization and newsletter generation.
|
||||
- Acceptance Criteria:
|
||||
- Scraped article content is stored in the `scraped_articles` table, linked to the `hn_post_id` and the current `workflow_run_id`.
|
||||
- The system ensures that the stored data includes all extracted information (title, author, date, text).
|
||||
- The `scraping_status` and any `error_message` are recorded in the `scraped_articles` table.
|
||||
- Upon completion of scraping an article (success or failure), the service updates the `workflow_runs.details` (e.g., incrementing scraped counts) via `WorkflowTrackerService`.
|
||||
- A Supabase migration for the `scraped_articles` table (as defined in `architecture.txt`) is created and applied before data operations.
|
||||
- **Story 2.4:** As a developer, I want to trigger the article scraping process via the API and CLI, so that I can manually initiate it for testing and debugging.
|
||||
- _Architect's Note: This story might become redundant if the main workflow trigger (Story 1.3) handles the entire pipeline initiation and individual service testing is done via direct function invocation or unit/integration tests._
|
||||
- Acceptance Criteria:
|
||||
- The API endpoint can trigger the article scraping process.
|
||||
- The CLI command can trigger the article scraping process locally.
|
||||
- The system logs the start and completion of the scraping process, including any errors encountered.
|
||||
- All API requests and CLI command executions are logged, including timestamps and any relevant data.
|
||||
- The system handles partial execution gracefully (i.e., if triggered before Epic 1 components like `WorkflowTrackerService` are available, it logs a message and exits).
|
||||
- If retained for isolated testing, all scraping operations initiated via this trigger must be associated with a valid `workflow_run_id` and update the `workflow_runs` table accordingly via `WorkflowTrackerService`.
|
||||
- **Story 2.5 (New): Implement Database Event/Webhook: `scraped_articles` Success to Summarization Service**
|
||||
- Goal: To ensure that the successful scraping and storage of an article in `scraped_articles` automatically triggers the `SummarizationService`.
|
||||
- Acceptance Criteria:
|
||||
- A Supabase database trigger or webhook mechanism is implemented on the `scraped_articles` table (e.g., on INSERT or UPDATE where `scraping_status` is 'success').
|
||||
- The trigger successfully invokes the `SummarizationService` (Supabase Function).
|
||||
- The invocation passes necessary parameters like `scraped_article_id` and `workflow_run_id` to the `SummarizationService`.
|
||||
- The mechanism is robust and includes error handling/logging for the trigger/webhook itself.
|
||||
- Unit/integration tests are created to verify the trigger fires correctly and the service is invoked with correct parameters.
|
||||
@@ -0,0 +1,41 @@
|
||||
# Epic 3: AI-Powered Content Summarization
|
||||
|
||||
> This document is a granulated shard from the main "BETA-V3/v3-demos/full-stack-app-demo/8-prd-po-updated.md" focusing on "Epic 3: AI-Powered Content Summarization".
|
||||
|
||||
- Goal: Integrate AI summarization capabilities by implementing and using a configurable and testable `LLMFacade` to generate concise summaries of articles and comments from prompts stored in the database. This will enrich the newsletter content, be triggerable via API/CLI, be triggered by database events, and track progress via `WorkflowTrackerService`.
|
||||
|
||||
- **Story 3.1:** As a system, I want to integrate an AI summarization capability by implementing and using an `LLMFacade`, so that I can generate concise summaries of articles and comments using various configurable LLM providers.
|
||||
- Acceptance Criteria:
|
||||
- An `LLMFacade` interface and concrete implementations (e.g., `OllamaAdapter`, `RemoteLLMApiAdapter`) are created in `supabase/functions/_shared/llm-facade.ts`.
|
||||
- A factory function is implemented within or alongside the facade to select the appropriate LLM adapter based on environment variables (e.g., `LLM_PROVIDER_TYPE`, `OLLAMA_API_URL`, `REMOTE_LLM_API_KEY`, `REMOTE_LLM_API_URL`, `LLM_MODEL_NAME`).
|
||||
- The `LLMFacade` handles making requests to the respective LLM APIs (as configured) and parsing their responses to extract the summary.
|
||||
- Robust error handling and retry logic for transient API errors are implemented within the facade.
|
||||
- Unit tests for the `LLMFacade` and its adapters (mocking actual HTTP calls) achieve >80% coverage.
|
||||
- The system utilizes this `LLMFacade` for all summarization tasks (articles and comments).
|
||||
- The integration is configurable via environment variables to switch between local and remote LLMs and specify model names.
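
_Illustrative sketch (not part of the acceptance criteria):_ one possible shape for the facade and factory in Story 3.1. The environment-variable names follow the acceptance criteria; the Ollama request/response fields and default model name are assumptions to be verified against the provider's API, and `Deno.env` is used because the facade lives in `supabase/functions/_shared/`.

```typescript
// supabase/functions/_shared/llm-facade.ts (illustrative sketch)
export interface LLMFacade {
  summarize(prompt: string, content: string): Promise<string>;
}

class OllamaAdapter implements LLMFacade {
  constructor(private apiUrl: string, private model: string) {}

  async summarize(prompt: string, content: string): Promise<string> {
    // Request/response field names assumed for a typical Ollama /api/generate call.
    const res = await fetch(`${this.apiUrl}/api/generate`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: this.model, prompt: `${prompt}\n\n${content}`, stream: false }),
    });
    if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
    const body = await res.json();
    return body.response;
  }
}

class RemoteLLMApiAdapter implements LLMFacade {
  constructor(private apiUrl: string, private apiKey: string, private model: string) {}
  async summarize(_prompt: string, _content: string): Promise<string> {
    // Provider-specific request/response handling omitted in this sketch.
    throw new Error("Not implemented in this sketch");
  }
}

export function createLLMFacade(): LLMFacade {
  const model = Deno.env.get("LLM_MODEL_NAME") ?? "llama3"; // default model assumed
  switch (Deno.env.get("LLM_PROVIDER_TYPE")) {
    case "remote":
      return new RemoteLLMApiAdapter(
        Deno.env.get("REMOTE_LLM_API_URL")!,
        Deno.env.get("REMOTE_LLM_API_KEY")!,
        model
      );
    case "ollama":
    default:
      return new OllamaAdapter(Deno.env.get("OLLAMA_API_URL")!, model);
  }
}
```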
|
||||
- **Story 3.2:** As a system, I want to retrieve summarization prompts from the database, and then use them via the `LLMFacade` to generate 2-paragraph summaries of the scraped articles, so that users can quickly grasp the main content and the prompts can be easily updated.
|
||||
- Acceptance Criteria:
|
||||
- The service retrieves the appropriate summarization prompt from the `summarization_prompts` table.
|
||||
- The system generates a 2-paragraph summary for each scraped article using the retrieved prompt via the `LLMFacade`.
|
||||
- Generated summaries are stored in the `article_summaries` table, linked to the `scraped_article_id` and the current `workflow_run_id`.
|
||||
- The summaries are accurate and capture the key information from the article.
|
||||
- Upon completion of each article summarization task, the service updates `workflow_runs.details` (e.g., incrementing article summaries generated counts) via `WorkflowTrackerService`.
|
||||
- (System Note: The `CheckWorkflowCompletionService` monitors the `article_summaries` table as part of determining overall summarization completion for a `workflow_run_id`).
|
||||
- A Supabase migration for the `article_summaries` table (as defined in `architecture.txt`) is created and applied before data operations.
|
||||
- **Story 3.3:** As a system, I want to retrieve summarization prompts from the database, and then use them via the `LLMFacade` to generate 2-paragraph summaries of the comments for the selected HN posts, so that users can understand the main discussions and the prompts can be easily updated.
|
||||
- Acceptance Criteria:
|
||||
- The service retrieves the appropriate summarization prompt from the `summarization_prompts` table.
|
||||
- The system generates a 2-paragraph summary of the comments for each selected HN post using the retrieved prompt via the `LLMFacade`.
|
||||
- Generated summaries are stored in the `comment_summaries` table, linked to the `hn_post_id` and the current `workflow_run_id`.
|
||||
- The summaries highlight interesting interactions and key points from the discussion.
|
||||
- Upon completion of each comment summarization task, the service updates `workflow_runs.details` (e.g., incrementing comment summaries generated counts) via `WorkflowTrackerService`.
|
||||
- (System Note: The `CheckWorkflowCompletionService` monitors the `comment_summaries` table as part of determining overall summarization completion for a `workflow_run_id`).
|
||||
- A Supabase migration for the `comment_summaries` table (as defined in `architecture.txt`) is created and applied before data operations.
|
||||
- **Story 3.4:** As a developer, I want to trigger the AI summarization process via the API and CLI, so that I can manually initiate it for testing and debugging.
|
||||
- Acceptance Criteria:
|
||||
- The API endpoint can trigger the AI summarization process.
|
||||
- The CLI command can trigger the AI summarization process locally.
|
||||
- The system logs the input and output of the summarization process, including the summarization prompt used and any errors.
|
||||
- All API requests and CLI command executions are logged, including timestamps and any relevant data.
|
||||
- The system handles partial execution gracefully (i.e., if triggered before Epic 2 is complete, it logs a message and exits).
|
||||
- All summarization operations initiated via this trigger must be associated with a valid `workflow_run_id` and update the `workflow_runs` table accordingly via `WorkflowTrackerService`.
|
||||
@@ -0,0 +1,43 @@
|
||||
# Epic 4: Automated Newsletter Creation and Distribution
|
||||
|
||||
> This document is a granulated shard from the main "BETA-V3/v3-demos/full-stack-app-demo/8-prd-po-updated.md" focusing on "Epic 4: Automated Newsletter Creation and Distribution".
|
||||
|
||||
- Goal: Automate the generation and delivery of the daily newsletter by implementing and using a configurable `EmailDispatchFacade`. This includes handling podcast link availability, being triggerable via API/CLI, orchestration by `CheckWorkflowCompletionService`, and status tracking via `WorkflowTrackerService`.
|
||||
|
||||
- **Story 4.1:** As a system, I want to retrieve the newsletter template from the database, so that the newsletter's design and structure can be updated without code changes.
|
||||
- Acceptance Criteria:
|
||||
- The system retrieves the newsletter template from the `newsletter_templates` database table.
|
||||
- **Story 4.2:** As a system, I want to generate a daily newsletter in HTML format using the retrieved template, so that users can receive a concise summary of Hacker News content.
|
||||
- Acceptance Criteria:
|
||||
- The `NewsletterGenerationService` is triggered by the `CheckWorkflowCompletionService` when all summaries for a `workflow_run_id` are ready.
|
||||
- The service retrieves the newsletter template (from Story 4.1 output) from `newsletter_templates` table and summaries associated with the `workflow_run_id`.
|
||||
- The system generates a newsletter in HTML format using the template retrieved from the database.
|
||||
- The newsletter includes summaries of selected articles and comments.
|
||||
- The newsletter includes links to the original HN posts and articles.
|
||||
- The newsletter includes the original post dates/times.
|
||||
- Generated newsletter is stored in the `newsletters` table, linked to the `workflow_run_id`.
|
||||
- The service updates `workflow_runs.status` to 'generating_podcast' (or a similar appropriate status indicating handoff to podcast generation) after initiating podcast generation (as part of Epic 5 logic that will be invoked by this service or by `CheckWorkflowCompletionService` after this story's core task).
|
||||
- A Supabase migration for the `newsletters` table (as defined in `architecture.txt`) is created and applied before data operations.
|
||||
- **Story 4.3:** As a system, I want to send the generated newsletter to a list of subscribers by implementing and using an `EmailDispatchFacade`, with credentials securely provided, so that users receive the daily summary in their inbox.
|
||||
- Acceptance Criteria:
|
||||
- An `EmailDispatchFacade` is implemented in `supabase/functions/_shared/` to abstract interaction with the email sending service (initially Nodemailer via SMTP).
|
||||
- The facade handles configuration (e.g., SMTP settings from environment variables), email construction (From, To, Subject, HTML content), and sending the email.
|
||||
- The facade includes error handling for email dispatch and logs relevant status information.
|
||||
- Unit tests for the `EmailDispatchFacade` (mocking the actual Nodemailer library calls) achieve >80% coverage.
|
||||
- The `NewsletterGenerationService` (specifically, its delivery part, utilizing the `EmailDispatchFacade`) is triggered by `CheckWorkflowCompletionService` once the podcast link is available in the `newsletters` table for the `workflow_run_id` (or a configured timeout/failure condition for the podcast step has been met).
|
||||
- The system retrieves the list of subscriber email addresses from the Supabase database.
|
||||
- The system sends the HTML newsletter (with podcast link conditionally included) to all active subscribers using the `EmailDispatchFacade`.
|
||||
- Credentials for the email service (e.g., SMTP server details) are securely accessed via environment variables and used by the facade.
|
||||
- The system logs the delivery status for each subscriber (potentially via the facade).
|
||||
- The system implements conditional logic for podcast link inclusion (from `newsletters` table) and handles delay/retry as per PRD, coordinated by `CheckWorkflowCompletionService`.
|
||||
- Updates `newsletters.delivery_status` (e.g., 'sent', 'failed') and `workflow_runs.status` to 'completed' or 'failed' via `WorkflowTrackerService` upon completion or failure of delivery.
|
||||
- The initial email template includes a placeholder for the podcast URL.
|
||||
- The end-to-end generation time for a typical daily newsletter (from workflow trigger to successful email dispatch initiation, for a small set of content) is measured and logged during testing to ensure it's within a reasonable operational timeframe (target < 30 minutes).
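
_Illustrative sketch (not part of the acceptance criteria):_ a possible `EmailDispatchFacade` over Nodemailer/SMTP for Story 4.3. The environment-variable names and the `npm:` import are assumptions, and whether Nodemailer runs cleanly inside a Deno-based Supabase Function should be verified.

```typescript
// supabase/functions/_shared/email-dispatch-facade.ts (illustrative sketch)
import nodemailer from "npm:nodemailer";

export class EmailDispatchFacade {
  private transporter = nodemailer.createTransport({
    host: Deno.env.get("SMTP_HOST"),
    port: Number(Deno.env.get("SMTP_PORT") ?? 587),
    secure: Deno.env.get("SMTP_SECURE") === "true",
    auth: {
      user: Deno.env.get("SMTP_USER"),
      pass: Deno.env.get("SMTP_PASS"),
    },
  });

  async sendNewsletter(to: string, subject: string, html: string): Promise<void> {
    try {
      const info = await this.transporter.sendMail({
        from: Deno.env.get("SMTP_FROM"),
        to,
        subject,
        html,
      });
      console.log("Newsletter dispatched", { to, messageId: info.messageId });
    } catch (err) {
      console.error("Email dispatch failed", { to, err });
      throw err;
    }
  }
}
```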
|
||||
- **Story 4.4:** As a developer, I want to trigger the newsletter generation and distribution process via the API and CLI, so that I can manually initiate it for testing and debugging.
|
||||
- Acceptance Criteria:
|
||||
- The API endpoint can trigger the newsletter generation and distribution process.
|
||||
- The CLI command can trigger the newsletter generation and distribution process locally.
|
||||
- The system logs the start and completion of the process, including any errors.
|
||||
- All API requests and CLI command executions are logged, including timestamps and any relevant data.
|
||||
- The system handles partial execution gracefully (i.e., if triggered before Epic 3 is complete, it logs a message and exits).
|
||||
- All newsletter operations initiated via this trigger must be associated with a valid `workflow_run_id` and update the `workflow_runs` table accordingly via `WorkflowTrackerService`.
|
||||
@@ -0,0 +1,36 @@
|
||||
# Epic 5: Podcast Generation Integration
|
||||
|
||||
> This document is a granulated shard from the main "BETA-V3/v3-demos/full-stack-app-demo/8-prd-po-updated.md" focusing on "Epic 5: Podcast Generation Integration".
|
||||
|
||||
- Goal: Integrate with an audio generation API (initially Play.ht) by implementing and using a configurable `AudioGenerationFacade` to create podcast versions of the newsletter. This includes handling webhooks to update newsletter data and workflow status. Ensure this is triggerable via API/CLI, orchestrated appropriately, and uses `WorkflowTrackerService`.
|
||||
|
||||
- **Story 5.1:** As a system, I want to integrate with an audio generation API (e.g., Play.ht's PlayNote API) by implementing and using an `AudioGenerationFacade`, so that I can generate AI-powered podcast versions of the newsletter content.
|
||||
- Acceptance Criteria:
|
||||
- An `AudioGenerationFacade` is implemented in `supabase/functions/_shared/` to abstract interaction with the audio generation service (initially Play.ht).
|
||||
- The facade handles API authentication, request formation (e.g., sending content for synthesis, providing webhook URL), and response parsing for the specific audio generation service.
|
||||
- The facade is configurable via environment variables (e.g., API key, user ID, service endpoint, webhook URL base).
|
||||
- Robust error handling and retry logic for transient API errors are implemented within the facade.
|
||||
- Unit tests for the `AudioGenerationFacade` (mocking actual HTTP calls to the Play.ht API) achieve >80% coverage.
|
||||
- The system uses this `AudioGenerationFacade` for all podcast generation tasks.
|
||||
- The integration employs webhooks for asynchronous status updates from the audio generation service.
|
||||
- (Context: The `PodcastGenerationService` containing this logic is invoked by `NewsletterGenerationService` or `CheckWorkflowCompletionService` for a specific `workflow_run_id` and `newsletter_id`.)
|
||||
- **Story 5.2:** As a system, I want to send the newsletter content to the audio generation service via the `AudioGenerationFacade` to initiate podcast creation, and receive a job ID or initial response, so that I can track the podcast creation process.
|
||||
- Acceptance Criteria:
|
||||
- The system sends the newsletter content (identified by `newsletter_id` for a given `workflow_run_id`) to the configured audio generation service via the `AudioGenerationFacade`.
|
||||
- The system receives a job ID or initial response from the service via the facade.
|
||||
- The `podcast_playht_job_id` (or a generic `podcast_job_id`) and `podcast_status` (e.g., 'generating', 'submitted') are stored in the `newsletters` table, linked to the `workflow_run_id`.
|
||||
- **Story 5.3:** As a system, I want to implement a webhook handler to receive the podcast URL from the audio generation service, and update the newsletter data and workflow status, so that the podcast link can be included in the newsletter and web interface, and the overall workflow can proceed.
|
||||
- Acceptance Criteria:
|
||||
- The system implements a webhook handler (`PlayHTWebhookHandlerAPI` at `/api/webhooks/playht` or a more generic path like `/api/webhooks/audio-generation`) to receive the podcast URL and status from the audio generation service.
|
||||
- The webhook handler extracts the podcast URL and status (e.g., 'completed', 'failed') from the webhook payload.
|
||||
- The webhook handler updates the `newsletters` table with the podcast URL and status for the corresponding job.
|
||||
- The `PlayHTWebhookHandlerAPI` also updates the `workflow_runs.details` with the podcast status (e.g., `podcast_status: 'completed'`) via `WorkflowTrackerService` for the relevant `workflow_run_id` (which may need to be looked up from the `newsletter_id` or job ID present in the webhook or associated with the service job).
|
||||
- If supported by the audio generation service (e.g., Play.ht), implement security verification for the incoming webhook (such as shared secret or signature validation) to ensure authenticity. If direct verification mechanisms are not supported by the provider, this specific AC is N/A, and alternative measures (like IP whitelisting, if applicable and secure) should be considered and documented.
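
_Illustrative sketch (not part of the acceptance criteria):_ one possible shape for the webhook Route Handler in Story 5.3. The payload field names (`id`, `audioUrl`, `status`) are assumptions and must be confirmed against the audio provider's webhook documentation.

```typescript
// app/api/webhooks/playht/route.ts (illustrative sketch)
import { NextResponse } from "next/server";
import { createClient } from "@/utils/supabase/server";

export async function POST(request: Request) {
  const payload = await request.json();
  const { id: jobId, audioUrl, status } = payload; // field names assumed

  const supabase = createClient();
  const { error } = await supabase
    .from("newsletters")
    .update({ podcast_url: audioUrl ?? null, podcast_status: status })
    .eq("podcast_playht_job_id", jobId);

  if (error) {
    console.error("Failed to record podcast webhook", { jobId, error });
    return NextResponse.json({ error: "Update failed" }, { status: 500 });
  }
  // A WorkflowTrackerService update for the owning workflow_run_id would follow here.
  return NextResponse.json({ received: true });
}
```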
|
||||
- **Story 5.4:** As a developer, I want to trigger the podcast generation process via the API and CLI, so that I can manually initiate it for testing and debugging.
|
||||
- Acceptance Criteria:
|
||||
- The API endpoint can trigger the podcast generation process.
|
||||
- The CLI command can trigger the podcast generation process locally.
|
||||
- The system logs the start and completion of the process, including any intermediate steps, responses from the audio generation service, and webhook interactions.
|
||||
- All API requests and CLI command executions are logged, including timestamps and any relevant data.
|
||||
- The system handles partial execution gracefully (i.e., if triggered before Epic 4 components are ready, it logs a message and exits).
|
||||
- All podcast generation operations initiated via this trigger must be associated with a valid `workflow_run_id` and `newsletter_id`, and update the `workflow_runs` and `newsletters` tables accordingly via `WorkflowTrackerService` and direct table updates as necessary.
|
||||
@@ -0,0 +1,44 @@
|
||||
# Epic 6: Web Interface for Initial Structure and Content Access
|
||||
|
||||
> This document is a granulated shard from the main "BETA-V3/v3-demos/full-stack-app-demo/8-prd-po-updated.md" focusing on "Epic 6: Web Interface for Initial Structure and Content Access".
|
||||
|
||||
- Goal: Develop a user-friendly, responsive, and accessible web interface, based on the `frontend-architecture.md`, to display newsletters and provide access to podcast content, aligning with the project's visual and technical guidelines. All UI development within this epic must adhere to the "synthwave technical glowing purple vibes" aesthetic using Tailwind CSS and Shadcn UI, ensure basic mobile responsiveness, meet WCAG 2.1 Level A accessibility guidelines (including semantic HTML, keyboard navigation, alt text, color contrast), and optimize images using `next/image`, as detailed in the `frontend-architecture.txt` and `ui-ux-spec.txt`.
|
||||
|
||||
- **Story 6.1:** As a developer, I want to establish the initial Next.js App Router structure for the web interface, including core layouts and routing, using `frontend-architecture.md` as a guide, so that I have a foundational frontend structure.
|
||||
- Acceptance Criteria:
|
||||
- Initial HTML/CSS mockups (e.g., from Vercel v0, if used) serve as a visual guide, but implementation uses Next.js and Shadcn UI components as per `frontend-architecture.md`.
|
||||
- Next.js App Router routes are set up for `/newsletters` (listing page) and `/newsletters/[newsletterId]` (detail page) within an `app/(web)/` route group.
|
||||
- Root layout (`app/(web)/layout.tsx`) and any necessary feature-specific layouts (e.g., `app/(web)/newsletters/layout.tsx`) are implemented using Next.js App Router conventions and Tailwind CSS.
|
||||
- A `PageWrapper.tsx` component (as defined in `frontend-architecture.txt`) is implemented and used for consistent page styling (e.g., padding, max-width).
|
||||
- Basic page structure renders correctly in development environment.
|
||||
- **Story 6.2:** As a user, I want to see a list of current and past newsletters on the `/newsletters` page, so that I can easily browse available content.
|
||||
- Acceptance Criteria:
|
||||
- The `app/(web)/newsletters/page.tsx` route displays a list of newsletters.
|
||||
- Newsletter items are displayed using a `NewsletterCard.tsx` component.
|
||||
- The `NewsletterCard.tsx` component is developed (e.g., using Shadcn UI `Card` as a base), displaying at least the newsletter title, target date, and a link/navigation to its detail page.
|
||||
- `NewsletterCard.tsx` is styled using Tailwind CSS to fit the "synthwave" theme.
|
||||
- Data for the newsletter list (e.g., ID, title, date) is fetched server-side on `app/(web)/newsletters/page.tsx` using the Supabase server client.
|
||||
- The newsletter list page is responsive across common device sizes (mobile, desktop).
|
||||
- The list includes relevant information such as the newsletter title and date.
|
||||
- The list is paginated or provides scrolling functionality to handle a large number of newsletters.
|
||||
- Key page load performance (e.g., Largest Contentful Paint) for the newsletter list page is benchmarked (e.g., using browser developer tools or Lighthouse) during development testing to ensure it aligns with the target of fast load times (target < 2 seconds).
|
||||
- **Story 6.3:** As a user, I want to be able to select a newsletter from the list and read its full content within the web page on the `/newsletters/[newsletterId]` page.
|
||||
- Acceptance Criteria:
|
||||
- Clicking on a `NewsletterCard` navigates to the corresponding `app/(web)/newsletters/[newsletterId]/page.tsx` route.
|
||||
- The full HTML content of the selected newsletter is retrieved server-side using the Supabase server client and displayed in a readable format.
|
||||
- A `BackButton.tsx` component is developed (e.g., using Shadcn UI `Button` as a base) and integrated on the newsletter detail page, allowing users to navigate back to the newsletter list.
|
||||
- The newsletter detail page content area is responsive across common device sizes.
|
||||
- Key page load performance (e.g., Largest Contentful Paint) for the newsletter detail page is benchmarked (e.g., using browser developer tools or Lighthouse) during development testing to ensure it aligns with the target of fast load times (target < 2 seconds).
|
||||
- **Story 6.4:** As a user, I want to have the option to download the currently viewed newsletter from its detail page, so that I can access it offline.
|
||||
- Acceptance Criteria:
|
||||
- A `DownloadButton.tsx` component is developed (e.g., using Shadcn UI `Button` as a base).
|
||||
- The `DownloadButton.tsx` is integrated and visible on the newsletter detail page (`/newsletters/[newsletterId]`).
|
||||
- Clicking the button initiates a download of the newsletter content (e.g., HTML format for MVP).
|
||||
- **Story 6.5:** As a user, I want to listen to the generated podcast associated with a newsletter within the web interface on its detail page, if a podcast is available.
|
||||
- Acceptance Criteria:
|
||||
- A `PodcastPlayer.tsx` React component with standard playback controls (play, pause, seek bar, volume control) is developed.
|
||||
- A `podcastPlayerSlice.ts` Zustand store is implemented to manage podcast player state (e.g., current track URL, playback status, current time, volume).
|
||||
- The `PodcastPlayer.tsx` component integrates with the `podcastPlayerSlice.ts` Zustand store for its state management.
|
||||
- If a podcast URL is available for the displayed newsletter (fetched from Supabase), the `PodcastPlayer.tsx` component is displayed on the newsletter detail page.
|
||||
- The `PodcastPlayer.tsx` can load and play the podcast audio from the provided URL.
|
||||
- The `PodcastPlayer.tsx` is styled using Tailwind CSS to fit the "synthwave" theme and is responsive.
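
_Illustrative sketch (not part of the acceptance criteria):_ a minimal `podcastPlayerSlice.ts` store for Story 6.5. State fields mirror the acceptance criteria; the exact shape and action names are illustrative.

```typescript
// store/podcastPlayerSlice.ts (illustrative sketch)
import { create } from "zustand";

interface PodcastPlayerState {
  trackUrl: string | null;
  isPlaying: boolean;
  currentTime: number;
  volume: number;
  loadTrack: (url: string) => void;
  togglePlay: () => void;
  setCurrentTime: (t: number) => void;
  setVolume: (v: number) => void;
}

export const usePodcastPlayerStore = create<PodcastPlayerState>((set) => ({
  trackUrl: null,
  isPlaying: false,
  currentTime: 0,
  volume: 1,
  loadTrack: (url) => set({ trackUrl: url, isPlaying: false, currentTime: 0 }),
  togglePlay: () => set((s) => ({ isPlaying: !s.isPlaying })),
  setCurrentTime: (t) => set({ currentTime: t }),
  setVolume: (v) => set({ volume: v }),
}));
```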
|
||||
@@ -0,0 +1,73 @@
|
||||
# API Interaction Layer
|
||||
|
||||
> This document is a granulated shard from the main "5-front-end-architecture.md" focusing on "API Interaction Layer".
|
||||
|
||||
The frontend will interact with Supabase for data. Server Components fetch data directly using the server-side Supabase client. Client Components that need to mutate data or trigger backend logic use Next.js Server Actions or, if necessary, dedicated Next.js API Route Handlers, which in turn interact with Supabase.
|
||||
|
||||
### Client/Service Structure
|
||||
|
||||
- **HTTP Client Setup (for Next.js API Route Handlers, if used extensively):**
|
||||
|
||||
- While Server Components and Server Actions are preferred for Supabase interactions, if direct calls from client to custom Next.js API routes are needed, a simple `fetch` wrapper or a lightweight client like `ky` could be used.
|
||||
- The Vercel/Supabase template provides `utils/supabase/client.ts` (for client-side components) and `utils/supabase/server.ts` (for Server Components, Route Handlers, Server Actions). These will be the primary interfaces to Supabase.
|
||||
- **Base URL:** Not applicable for direct Supabase client usage. For custom API routes: relative paths (e.g., `/api/my-route`).
|
||||
- **Authentication:** The Supabase clients handle auth token management. For custom API routes, Next.js middleware (`middleware.ts`) would handle session verification.
|
||||
|
||||
- **Service Definitions (Conceptual for Supabase Data Access):**
|
||||
|
||||
- No separate "service" files like `userService.ts` are strictly necessary for data fetching with Server Components. Data fetching logic will be co-located with the Server Components or within Server Actions.
|
||||
- **Example (Data fetching in a Server Component):**

```typescript
// app/(web)/newsletters/page.tsx
import { createClient } from "@/utils/supabase/server";
import NewsletterCard from "@/app/components/core/NewsletterCard";

export default async function NewsletterListPage() {
  const supabase = createClient();
  const { data: newsletters, error } = await supabase
    .from("newsletters")
    .select("id, title, target_date, podcast_url")
    .order("target_date", { ascending: false });

  if (error) console.error("Error fetching newsletters:", error);
  // Render newsletters (e.g., map to <NewsletterCard />) or an error state.
}
```

- **Example (Server Action for a hypothetical "subscribe" feature - future scope):**

```typescript
// app/actions/subscribeActions.ts
"use server";
import { createClient } from "@/utils/supabase/server";
import { z } from "zod";
import { revalidatePath } from "next/cache";

const EmailSchema = z.string().email();

export async function subscribeToNewsletter(email: string) {
  const validation = EmailSchema.safeParse(email);
  if (!validation.success) {
    return { error: "Invalid email format." };
  }
  const supabase = createClient();
  const { error } = await supabase
    .from("subscribers")
    .insert({ email: validation.data });
  if (error) {
    return { error: "Subscription failed." };
  }
  revalidatePath("/"); // Example path revalidation
  return { success: true };
}
```
|
||||
|
||||
### Error Handling & Retries (Frontend)
|
||||
|
||||
- **Server Component Data Fetching Errors:** Errors from Supabase in Server Components should be caught. The component can then render an appropriate error UI or pass error information as props. Next.js error handling (e.g. `error.tsx` files) can also be used for unrecoverable errors.
|
||||
- **Client Component / Server Action Errors:**
|
||||
- Server Actions should return structured responses (e.g., `{ success: boolean, data?: any, error?: string }`). Client Components calling Server Actions will handle these responses to update UI (e.g., display error messages, toast notifications).
|
||||
- Shadcn UI includes a `Toast` component which can be used for non-modal error notifications.
|
||||
- **UI Error Boundaries:** React Error Boundaries can be implemented at key points in the component tree (e.g., around major layout sections or complex components) to catch rendering errors in Client Components and display a fallback UI, preventing a full app crash. A root `global-error.tsx` can serve as a global boundary.
|
||||
- **Retry Logic:** Generally, retry logic for data fetching should be handled by the user (e.g., a "Try Again" button) rather than automatic client-side retries for MVP, unless dealing with specific, known transient issues. Supabase client libraries might have their own internal retry mechanisms for certain types of network errors.
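
For illustration, a Client Component consuming the `subscribeToNewsletter` Server Action above and surfacing its structured `{ error }` / `{ success }` response. The form markup is illustrative, and a Shadcn UI `Toast` could replace the inline message; async transitions assume React 19 as specified elsewhere in this architecture.

```typescript
// app/components/core/SubscribeForm.tsx (illustrative sketch)
"use client";

import { useState, useTransition } from "react";
import { subscribeToNewsletter } from "@/app/actions/subscribeActions";

export default function SubscribeForm() {
  const [email, setEmail] = useState("");
  const [message, setMessage] = useState<string | null>(null);
  const [isPending, startTransition] = useTransition();

  return (
    <form
      onSubmit={(e) => {
        e.preventDefault();
        startTransition(async () => {
          const result = await subscribeToNewsletter(email);
          // Structured response from the Server Action: show error or success feedback.
          setMessage(result.error ?? "Subscribed!");
        });
      }}
    >
      <input value={email} onChange={(e) => setEmail(e.target.value)} type="email" required />
      <button type="submit" disabled={isPending}>Subscribe</button>
      {message && <p role="status">{message}</p>}
    </form>
  );
}
```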
|
||||
@@ -0,0 +1,137 @@
|
||||
# BMad DiCaster Frontend Architecture Document
|
||||
|
||||
## Table of Contents
|
||||
|
||||
- [Introduction](#introduction)
|
||||
- [Overall Frontend Philosophy & Patterns](#overall-frontend-philosophy--patterns)
|
||||
- [Detailed Frontend Directory Structure](#detailed-frontend-directory-structure)
|
||||
- [Component Breakdown & Implementation Details](#component-breakdown--implementation-details)
|
||||
- [Component Naming & Organization](#component-naming--organization)
|
||||
- [Template for Component Specification](#template-for-component-specification)
|
||||
- [State Management In-Depth](#state-management-in-depth)
|
||||
- [Chosen Solution](#chosen-solution)
|
||||
- [Rationale](#rationale)
|
||||
- [Store Structure / Slices](#store-structure--slices)
|
||||
- [Key Selectors](#key-selectors)
|
||||
- [Key Actions / Reducers / Thunks](#key-actions--reducers--thunks)
|
||||
- [API Interaction Layer](#api-interaction-layer)
|
||||
- [Client/Service Structure](#clientservice-structure)
|
||||
- [Error Handling & Retries (Frontend)](#error-handling--retries-frontend)
|
||||
- [Routing Strategy](#routing-strategy)
|
||||
- [Routing Library](#routing-library)
|
||||
- [Route Definitions](#route-definitions)
|
||||
- [Route Guards / Protection](#route-guards--protection)
|
||||
- [Build, Bundling, and Deployment](#build-bundling-and-deployment)
|
||||
- [Build Process & Scripts](#build-process--scripts)
|
||||
- [Key Bundling Optimizations](#key-bundling-optimizations)
|
||||
- [Deployment to CDN/Hosting](#deployment-to-cdnhosting)
|
||||
- [Frontend Testing Strategy](#frontend-testing-strategy)
|
||||
- [Link to Main Testing Strategy](#link-to-main-testing-strategy)
|
||||
- [Component Testing](#component-testing)
|
||||
- [UI Integration/Flow Testing](#ui-integrationflow-testing)
|
||||
- [End-to-End UI Testing Tools & Scope](#end-to-end-ui-testing-tools--scope)
|
||||
- [Accessibility (AX) Implementation Details](#accessibility-ax-implementation-details)
|
||||
- [Performance Considerations](#performance-considerations)
|
||||
- [Change Log](#change-log)
|
||||
|
||||
## Introduction
|
||||
|
||||
This document details the technical architecture specifically for the frontend of BMad DiCaster. It complements the main BMad DiCaster Architecture Document and the UI/UX Specification. The goal is to provide a clear blueprint for frontend development, ensuring consistency, maintainability, and alignment with the overall system design and user experience goals.
|
||||
|
||||
- **Link to Main Architecture Document:** `docs/architecture.md` (Note: The overall system architecture, including Monorepo/Polyrepo decisions and backend structure, will influence frontend choices, especially around shared code or API interaction patterns.)
|
||||
- **Link to UI/UX Specification:** `docs/ui-ux-spec.txt`
|
||||
- **Link to Primary Design Files (Figma, Sketch, etc.):** N/A (Low-fidelity wireframes described in `docs/ui-ux-spec.txt`; detailed mockups to be created during development)
|
||||
- **Link to Deployed Storybook / Component Showcase (if applicable):** N/A (To be developed)
|
||||
|
||||
## Overall Frontend Philosophy & Patterns
|
||||
|
||||
> Key aspects of this section have been moved to dedicated documents:
|
||||
>
|
||||
> - For styling approach, theme customization, and visual design: See [Frontend Style Guide](./front-end-style-guide.md)
|
||||
> - For core framework choices, component architecture, data flow, and general coding standards: See [Frontend Coding Standards & Accessibility](./front-end-coding-standards.md#general-coding-standards-from-overall-philosophy--patterns)
|
||||
|
||||
## Detailed Frontend Directory Structure
|
||||
|
||||
> This section has been moved to a dedicated document: [Detailed Frontend Directory Structure](./front-end-project-structure.md)
|
||||
|
||||
## Component Breakdown & Implementation Details
|
||||
|
||||
> This section has been moved to a dedicated document: [Component Breakdown & Implementation Details](./front-end-component-guide.md)
|
||||
|
||||
## State Management In-Depth
|
||||
|
||||
> This section has been moved to a dedicated document: [State Management In-Depth](./front-end-state-management.md)
|
||||
|
||||
## API Interaction Layer
|
||||
|
||||
> This section has been moved to a dedicated document: [API Interaction Layer](./front-end-api-interaction.md)
|
||||
|
||||
## Routing Strategy
|
||||
|
||||
> This section has been moved to a dedicated document: [Routing Strategy](./front-end-routing-strategy.md)
|
||||
|
||||
## Build, Bundling, and Deployment
|
||||
|
||||
Details align with the Vercel platform and Next.js capabilities.
|
||||
|
||||
### Build Process & Scripts
|
||||
|
||||
- **Key Build Scripts:**
|
||||
- `npm run dev`: Starts Next.js local development server.
|
||||
- `npm run build`: Generates an optimized production build of the Next.js application. (Script from `package.json`)
|
||||
- `npm run start`: Starts the Next.js production server after a build.
|
||||
- **Environment Variables Handling during Build:**
|
||||
- Client-side variables must be prefixed with `NEXT_PUBLIC_` (e.g., `NEXT_PUBLIC_SUPABASE_URL`, `NEXT_PUBLIC_SUPABASE_ANON_KEY`).
|
||||
- Server-side variables (used in Server Components, Server Actions, Route Handlers) are accessed directly via `process.env`.
|
||||
- Environment variables are managed in Vercel project settings for different environments (Production, Preview, Development). Local development uses `.env.local`.
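
A small sketch of the resulting access pattern (the non-public variable name below is a placeholder):

```typescript
// Client Components: only NEXT_PUBLIC_-prefixed variables are inlined at build time.
const supabaseUrl = process.env.NEXT_PUBLIC_SUPABASE_URL;

// Server Components / Route Handlers / Server Actions: any variable is available.
const smtpHost = process.env.SMTP_HOST; // never exposed to the browser
```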
|
||||
|
||||
### Key Bundling Optimizations
|
||||
|
||||
- **Code Splitting:** Next.js App Router automatically performs route-based code splitting. Dynamic imports (`next/dynamic`) can be used for further component-level code splitting if needed.
|
||||
- **Tree Shaking:** Ensured by Next.js's Webpack configuration during the production build process.
|
||||
- **Lazy Loading:** Next.js handles lazy loading of route segments by default. Images (`next/image`) are optimized and can be lazy-loaded.
|
||||
- **Minification & Compression:** Handled automatically by Next.js during `npm run build` (JavaScript, CSS minification; Gzip/Brotli compression often handled by Vercel).
|
||||
|
||||
### Deployment to CDN/Hosting
|
||||
|
||||
- **Target Platform:** **Vercel** (as per `architecture.txt`)
|
||||
- **Deployment Trigger:** Automatic deployments via Vercel's Git integration (GitHub) on pushes/merges to specified branches (e.g., `main` for production, PR branches for previews). (Aligned with `architecture.txt`)
|
||||
- **Asset Caching Strategy:** Vercel's Edge Network handles CDN caching for static assets and Server Component payloads. Cache-control headers will be configured according to Next.js defaults and can be customized if necessary (e.g., for `public/` assets).
|
||||
|
||||
## Frontend Testing Strategy
|
||||
|
||||
> This section has been moved to a dedicated document: [Frontend Testing Strategy](./front-end-testing-strategy.md)
|
||||
|
||||
## Accessibility (AX) Implementation Details
|
||||
|
||||
> This section has been moved to a dedicated document: [Frontend Coding Standards & Accessibility](./front-end-coding-standards.md#accessibility-ax-implementation-details)
|
||||
|
||||
## Performance Considerations
|
||||
|
||||
The goal is a fast-loading and responsive user experience.
|
||||
|
||||
- **Image Optimization:**
|
||||
- Use `next/image` for automatic image optimization (resizing, WebP format where supported, lazy loading by default).
|
||||
- **Code Splitting & Lazy Loading:**
|
||||
- Next.js App Router handles route-based code splitting.
|
||||
- `next/dynamic` for client-side lazy loading of components that are not immediately visible or are heavy.
|
||||
- **Minimizing Re-renders (React):**
|
||||
- Judicious use of `React.memo` for components that render frequently with the same props.
|
||||
- Optimizing Zustand selectors if complex derived state is introduced (though direct access is often sufficient).
|
||||
- Ensuring stable prop references where possible.
|
||||
- **Debouncing/Throttling:** Not anticipated for MVP features, but will be considered for future interactive elements like search inputs.
|
||||
- **Virtualization:** Not anticipated for MVP given the limited number of items (e.g., 30 newsletters per day). If lists become very long in the future, virtualization libraries like TanStack Virtual will be considered.
|
||||
- **Caching Strategies (Client-Side):**
|
||||
- Leverage Next.js's built-in caching for Server Component payloads and static assets via Vercel's Edge Network.
|
||||
- Browser caching for static assets (`public/` folder) will use default optimal headers set by Vercel.
|
||||
- **Performance Monitoring Tools:**
|
||||
- Browser DevTools (Performance tab, Lighthouse).
|
||||
- Vercel Analytics (if enabled) for real-user monitoring.
|
||||
- WebPageTest for detailed performance analysis.
|
||||
- **Bundle Size Analysis:** Use tools like `@next/bundle-analyzer` to inspect production bundles and identify opportunities for optimization if bundle sizes become a concern.
|
||||
|
||||
## Change Log
|
||||
|
||||
| Date       | Version | Description                                      | Author             |
| :--------- | :------ | :----------------------------------------------- | :----------------- |
| 2025-05-13 | 0.1     | Initial draft of frontend architecture document. | 4-design-arch (AI) |
|
||||
@@ -0,0 +1,54 @@
|
||||
# Frontend Coding Standards & Accessibility
|
||||
|
||||
> This document is a granulated shard from the main "5-front-end-architecture.md" focusing on "Front-End Coding Standards and Accessibility Best Practices".
|
||||
|
||||
## General Coding Standards (from Overall Philosophy & Patterns)
|
||||
|
||||
- **Framework & Core Libraries:**
|
||||
- Next.js (Latest, App Router)
|
||||
- React (19.0.0)
|
||||
- TypeScript (5.7.2)
|
||||
- **Component Architecture Approach:**
|
||||
- Shadcn UI for foundational elements.
|
||||
- Application-Specific Components in `app/components/core/`.
|
||||
- Prefer Server Components; use Client Components (`"use client"`) only when necessary for interactivity or browser APIs.
|
||||
- **Data Flow:**
|
||||
- Unidirectional: Server Components (data fetching) -> Client Components (props).
|
||||
- Mutations/Actions: Next.js Server Actions or API Route Handlers, with data revalidation.
|
||||
- Supabase Client for DB interaction.
|
||||
- **Key Design Patterns Used:**
|
||||
- Server Components & Client Components.
|
||||
- React Hooks (and custom hooks).
|
||||
- Provider Pattern (React Context API) when necessary.
|
||||
- Facade Pattern (conceptual for Supabase client).
|
||||
|
||||
## Naming & Organization Conventions (from Component Breakdown & Detailed Structure)
|
||||
|
||||
- **Component File Naming:**
|
||||
- React component files: `PascalCase.tsx` (e.g., `NewsletterCard.tsx`).
|
||||
- Next.js special files (`page.tsx`, `layout.tsx`, etc.): conventional lowercase/kebab-case.
|
||||
- **Directory Naming:** `kebab-case`.
|
||||
- **Non-Component TypeScript Files (.ts):** Primarily `camelCase.ts` (e.g., `utils.ts`, `uiSlice.ts`). Config files (`tailwind.config.ts`) and shared type definitions (`api-schemas.ts`) may use `kebab-case`.
|
||||
- **Component Organization:**
|
||||
- Core application components: `app/components/core/`.
|
||||
- Layout components: `app/components/layout/`.
|
||||
- Shadcn UI components: `components/ui/`.
|
||||
- Page-specific components (if complex and not reusable) can be co-located within the page's route directory.
|
||||
|
||||
## Accessibility (AX) Implementation Details
|
||||
|
||||
> This section is directly from "Accessibility (AX) Implementation Details" in `5-front-end-architecture.md`.
|
||||
|
||||
The frontend will adhere to **WCAG 2.1 Level A** as a minimum target, as specified in `docs/ui-ux-spec.txt`.
|
||||
|
||||
- **Semantic HTML:** Emphasis on using correct HTML5 elements (`<nav>`, `<main>`, `<article>`, `<aside>`, `<button>`, etc.) to provide inherent meaning and structure.
|
||||
- **ARIA Implementation:**
|
||||
- Shadcn UI components are built with accessibility in mind, often including appropriate ARIA attributes.
|
||||
- For custom components, relevant ARIA roles (e.g., `role="region"`, `role="alert"`) and attributes (e.g., `aria-label`, `aria-describedby`, `aria-live`, `aria-expanded`) will be used for dynamic content, interactive elements, and custom widgets to ensure assistive technologies can interpret them correctly.
|
||||
- **Keyboard Navigation:** All interactive elements (links, buttons, inputs, custom controls) must be focusable and operable using only the keyboard in a logical order. Focus indicators will be clear and visible.
- **Focus Management:** For dynamic UI elements like modals or non-native dropdowns (if any are built custom beyond Shadcn capabilities), focus will be managed programmatically to ensure it moves to and is trapped within the element as appropriate, and returns to the trigger element upon dismissal.
- **Alternative Text:** All meaningful images will have descriptive `alt` text. Decorative images will have empty `alt=""`.
- **Color Contrast:** Adherence to WCAG 2.1 Level A color contrast ratios for text and interactive elements against their backgrounds. The "synthwave" theme's purple accents will be chosen carefully to meet these requirements. Tools will be used to verify contrast.
- **Testing Tools for AX:**
  - Automated: Axe DevTools browser extension, Lighthouse accessibility audits.
  - Manual: Keyboard-only navigation testing, screen reader testing (e.g., NVDA, VoiceOver) for key user flows.
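To make the ARIA points above concrete, here is a minimal sketch of how a custom player status area might announce playback changes. The component name, props, and markup are hypothetical illustrations, not the final `PodcastPlayer` implementation.

```tsx
// Illustrative only: a status region that announces playback changes politely.
"use client";

interface PlayerStatusProps {
  trackTitle: string;
  isPlaying: boolean;
}

export function PlayerStatus({ trackTitle, isPlaying }: PlayerStatusProps) {
  return (
    <div role="region" aria-label="Podcast player">
      {/* aria-live lets screen readers announce state changes without moving focus. */}
      <p aria-live="polite">
        {isPlaying ? `Playing: ${trackTitle}` : `Paused: ${trackTitle}`}
      </p>
    </div>
  );
}
```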
@@ -0,0 +1,77 @@

# Component Breakdown & Implementation Details

> This document is a granulated shard from the main "5-front-end-architecture.md" focusing on "Component Library, Reusable UI Components Guide, Atomic Design Elements, or Component Breakdown & Implementation Details".

This section outlines the conventions and templates for defining UI components. While a few globally shared or foundational components (e.g., main layout structures) might be specified here upfront to ensure consistency, the detailed specification for most feature-specific components will emerge as user stories are implemented. The key is for the development team (or AI agent) to follow the "Template for Component Specification" below whenever a new component is identified for development.

### Component Naming & Organization

- **Component File Naming:**

  - React component files will use `PascalCase.tsx`. For example, `NewsletterCard.tsx`, `PodcastPlayer.tsx`.
  - Next.js special files like `page.tsx`, `layout.tsx`, `loading.tsx`, `error.tsx`, `global-error.tsx`, and `not-found.tsx` will use their conventional lowercase or kebab-case names.

- **Component Organization (Reiteration from Directory Structure):**

  - **Application-Specific Core Components:** Reusable components specific to BMad DiCaster (e.g., `NewsletterCard`, `PodcastPlayer`) will reside in `app/components/core/`.
  - **Application-Specific Layout Components:** Components used for structuring page layouts (e.g., `PageWrapper.tsx`) will reside in `app/components/layout/`.
  - **Shadcn UI Components:** Components added via the Shadcn UI CLI will reside in `components/ui/` (e.g., `Button.tsx`, `Card.tsx`).
  - **Page-Specific Components:** If a component is complex but _only_ used on a single page, it can be co-located with that page's route file, for instance, in a `components` subfolder within that route's directory. However, the preference is to place reusable components in `app/components/core/` or `app/components/layout/`.

### Template for Component Specification

This template should be used to define and document each significant UI component identified from the UI/UX Specification (`docs/ui-ux-spec.txt`) and any subsequent design iterations. The goal is to provide sufficient detail for a developer or an AI agent to implement the component with minimal ambiguity. Most feature-specific components will be detailed emergently during development, following this template.

---

#### Component: `{ComponentName}` (e.g., `NewsletterCard`, `PodcastPlayerControls`)

- **Purpose:** {Briefly describe what this component does and its primary role in the user interface. What user need does it address?}
- **Source File(s):** {e.g., `app/components/core/NewsletterCard.tsx`}
- **Visual Reference:** {Link to specific Figma frame/component if available, or a detailed description/sketch if not. If based on a Shadcn UI component, note that and any key customizations.}
- **Props (Properties):**

  {List each prop the component accepts. Specify its name, TypeScript type, whether it's required, any default value, and a clear description.}

  | Prop Name     | Type                                | Required? | Default Value | Description                          |
  | :------------ | :---------------------------------- | :-------- | :------------ | :----------------------------------- |
  | `exampleProp` | `string`                            | Yes       | N/A           | Example string prop.                 |
  | `items`       | `Array<{id: string, name: string}>` | Yes       | N/A           | An array of item objects to display. |
  | `variant`     | `'primary' \| 'secondary'`          | No        | `'primary'`   | Visual variant of the component.     |
  | `onClick`     | `(event: React.MouseEvent) => void` | No        | N/A           | Optional click handler.              |

- **Internal State (if any):**

  {Describe any significant internal state the component manages using React hooks (e.g., `useState`).}

  | State Variable | Type             | Initial Value | Description                                   |
  | :------------- | :--------------- | :------------ | :-------------------------------------------- |
  | `isLoading`    | `boolean`        | `false`       | Tracks if data for the component is loading.  |
  | `selectedItem` | `string \| null` | `null`        | Stores the ID of the currently selected item. |

- **Key UI Elements / Structure (Conceptual):**

  {Describe the main visual parts of the component and their general layout. Reference Shadcn UI components if used as building blocks.}

  ```jsx
  // Example for a Card component
  <Card>
    <CardHeader>
      <CardTitle>{"{{titleProp}}"}</CardTitle>
      <CardDescription>{"{{descriptionProp}}"}</CardDescription>
    </CardHeader>
    <CardContent>{/* {list of items or main content} */}</CardContent>
    <CardFooter>{/* {action buttons or footer content} */}</CardFooter>
  </Card>
  ```

- **Events Handled / Emitted:**
  - **Handles:** {List significant DOM events the component handles directly.}
  - **Emits (Callbacks):** {If the component uses props to emit events (callbacks) to its parent, list them here.}
- **Actions Triggered (Side Effects):**
  - **State Management (Zustand):** {If the component interacts with a Zustand store, specify which store and actions.}
  - **API Calls / Data Fetching:** {Specify how Client Components trigger mutations or re-fetches (e.g., Server Actions).}
- **Styling Notes:**
  - {Reference to specific Shadcn UI components used.}
  - {Key Tailwind CSS classes or custom styles for the "synthwave" theme.}
  - {Specific responsiveness behavior.}
- **Accessibility (AX) Notes:**
  - {Specific ARIA attributes needed.}
  - {Keyboard navigation considerations.}
  - {Focus management details.}
  - {Notes on color contrast.}

---

_This template will be applied to each new significant component during the development process._
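As a purely hypothetical illustration of how the example rows in the template's props table might translate into code, the snippet below types them as a TypeScript interface; the names are the placeholders from the template, not a real component contract.

```typescript
// Hypothetical props interface for the example rows in the template's props table.
import type * as React from "react";

interface ExampleComponentProps {
  /** Example string prop. */
  exampleProp: string;
  /** An array of item objects to display. */
  items: Array<{ id: string; name: string }>;
  /** Visual variant of the component. Defaults to 'primary'. */
  variant?: "primary" | "secondary";
  /** Optional click handler. */
  onClick?: (event: React.MouseEvent) => void;
}
```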
@@ -0,0 +1,81 @@

# Detailed Frontend Directory Structure

> This document is a granulated shard from the main "5-front-end-architecture.md" focusing on "Detailed Frontend Directory Structure".

The BMad DiCaster frontend will adhere to the Next.js App Router conventions and build upon the structure provided by the Vercel/Supabase Next.js App Router template. The monorepo structure defined in the main Architecture Document (`docs/architecture.md`) already outlines the top-level directories. This section details the frontend-specific organization.

**Naming Conventions Adopted:**

- **Directories:** `kebab-case` (e.g., `app/(web)/newsletter-list/`, `app/components/core/`)
- **React Component Files (.tsx):** `PascalCase.tsx` (e.g., `NewsletterCard.tsx`, `PodcastPlayer.tsx`). Next.js App Router special files (e.g., `page.tsx`, `layout.tsx`, `loading.tsx`, `global-error.tsx`, `not-found.tsx`) retain their conventional lowercase or kebab-case names.
- **Non-Component TypeScript Files (.ts):** Primarily `camelCase.ts` (e.g., `utils.ts`, `uiSlice.ts`). Configuration files (e.g., `tailwind.config.ts`) and shared type definition files (e.g., `api-schemas.ts`, `domain-models.ts`) may retain `kebab-case` as per common practice or previous agreement.

```plaintext
{project-root}/
├── app/                      # Next.js App Router (Frontend Pages, Layouts, API Routes)
│ ├── (web)/                  # Group for user-facing web pages
│ │ ├── newsletters/          # Route group for newsletter features
│ │ │ ├── [newsletterId]/     # Dynamic route for individual newsletter detail
│ │ │ │ ├── page.tsx          # Newsletter Detail Page component
│ │ │ │ └── loading.tsx       # Optional: Loading UI for this route
│ │ │ ├── page.tsx            # Newsletter List Page component
│ │ │ └── layout.tsx          # Optional: Layout specific to /newsletters routes
│ │ ├── layout.tsx            # Root layout for all (web) pages
│ │ └── page.tsx              # Homepage (displays newsletter list)
│ ├── (api)/                  # API route handlers (as defined in main architecture [cite: 82, 127, 130, 133])
│ │ ├── system/
│ │ │ └── ...
│ │ └── webhooks/
│ │   └── ...
│ ├── components/             # Application-specific UI React components (Core Logic)
│ │ ├── core/                 # Core, reusable application components
│ │ │ ├── NewsletterCard.tsx
│ │ │ ├── PodcastPlayer.tsx
│ │ │ ├── DownloadButton.tsx
│ │ │ └── BackButton.tsx
│ │ └── layout/               # General layout components
│ │   └── PageWrapper.tsx     # Consistent padding/max-width for pages
│ ├── auth/                   # Auth-related pages and components (from template, MVP frontend is public)
│ ├── login/page.tsx          # Login page (from template, MVP frontend is public)
│ ├── global-error.tsx        # Optional: Custom global error UI (Next.js special file)
│ └── not-found.tsx           # Optional: Custom 404 page UI (Next.js special file)
├── components/               # Shadcn UI components root (as configured by components.json [cite: 92])
│ └── ui/                     # Base UI elements from Shadcn (e.g., Button.tsx, Card.tsx)
├── lib/                      # General utility functions for frontend [cite: 86, 309]
│ ├── utils.ts                # General utility functions (date formatting, etc.)
│ └── hooks/                  # Custom global React hooks
│   └── useScreenWidth.ts     # Example custom hook
├── store/                    # Zustand state management
│ ├── index.ts                # Main store setup/export (can be store.ts or index.ts)
│ └── slices/                 # Individual state slices
│   └── podcastPlayerSlice.ts # State for the podcast player
├── public/                   # Static assets (images, favicon, etc.) [cite: 89]
│ └── logo.svg                # Application logo (to be provided [cite: 379])
├── shared/                   # Shared code/types between frontend and Supabase functions [cite: 89, 97]
│ └── types/
│   ├── api-schemas.ts        # Zod schemas for API req/res
│   └── domain-models.ts      # Core entity types (HNPost, Newsletter, etc. from main arch)
├── styles/                   # Global styles [cite: 90]
│ └── globals.css             # Tailwind base styles, custom global styles
├── utils/                    # Root utilities (from template [cite: 91])
│ └── supabase/               # Supabase helper functions FOR FRONTEND (from template [cite: 92, 309])
│   ├── client.ts             # Client-side Supabase client
│   ├── middleware.ts         # Logic for Next.js middleware (Supabase auth [cite: 92, 311])
│   └── server.ts             # Server-side Supabase client
├── tailwind.config.ts        # Tailwind CSS configuration [cite: 93]
└── tsconfig.json             # TypeScript configuration (includes path aliases like @/* [cite: 101])
```

### Notes on Frontend Structure:

- **`app/(web)/`**: Route group for user-facing pages.
  - **`newsletters/page.tsx`**: Server Component for listing newsletters. [cite: 375, 573]
  - **`newsletters/[newsletterId]/page.tsx`**: Server Component for displaying a single newsletter. [cite: 376, 576]
- **`app/components/core/`**: Houses application-specific React components like `NewsletterCard.tsx`, `PodcastPlayer.tsx`, `DownloadButton.tsx`, `BackButton.tsx` (identified in `ui-ux-spec.txt`). Components follow `PascalCase.tsx`.
- **`app/components/layout/`**: For structural layout components, e.g., `PageWrapper.tsx`. Components follow `PascalCase.tsx`.
- **`components/ui/`**: Standard directory for Shadcn UI components (e.g., `Button.tsx`, `Card.tsx`).
- **`lib/hooks/`**: Custom React hooks (e.g., `useScreenWidth.ts`), files follow `camelCase.ts`.
- **`store/slices/`**: Zustand state slices. `podcastPlayerSlice.ts` for podcast player state. Files follow `camelCase.ts`.
- **`shared/types/`**: Type definitions. `api-schemas.ts` and `domain-models.ts` use `kebab-case.ts`.
- **`utils/supabase/`**: Template-provided Supabase clients. Files follow `camelCase.ts`.
- **Path Aliases**: `tsconfig.json` uses `@/*` aliases. [cite: 98, 101]
@@ -0,0 +1,24 @@

# Routing Strategy

> This document is a granulated shard from the main "5-front-end-architecture.md" focusing on "Routing Strategy".

Navigation and routing will be handled by the Next.js App Router.

- **Routing Library:** **Next.js App Router** (as per `architecture.txt`)

### Route Definitions

Based on `ui-ux-spec.txt` and the PRD.

| Path Pattern                  | Component/Page (`app/(web)/...`)      | Protection | Notes                                                                          |
| :---------------------------- | :------------------------------------ | :--------- | :----------------------------------------------------------------------------- |
| `/`                           | `newsletters/page.tsx` (effectively)  | Public     | Homepage displays the newsletter list.                                          |
| `/newsletters`                | `newsletters/page.tsx`                | Public     | Displays a list of current and past newsletters.                                |
| `/newsletters/[newsletterId]` | `newsletters/[newsletterId]/page.tsx` | Public     | Displays the detail page for a selected newsletter. `newsletterId` is a UUID.   |

_(Note: The main architecture document shows an `app/page.tsx` for the homepage. For MVP, this can either redirect to `/newsletters` or directly render the newsletter list content. The table above assumes it effectively serves the newsletter list.)_
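If the redirect option is chosen, the homepage route can stay trivial. The sketch below shows one way to do it with Next.js's `redirect` helper; it is illustrative only, since the MVP may instead render the list content directly.

```tsx
// app/(web)/page.tsx (illustrative sketch): forward the homepage to the newsletter list.
import { redirect } from "next/navigation";

export default function HomePage() {
  // Server Component; redirect() short-circuits rendering and issues the redirect.
  redirect("/newsletters");
}
```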

### Route Guards / Protection

- **Authentication Guard:** The MVP frontend is public-facing, displaying newsletters and podcasts without user login. The Vercel/Supabase template includes middleware (`middleware.ts`) for protecting routes based on Supabase Auth. This will be relevant for any future admin sections but is not actively used to gate content for general users in MVP.
- **Authorization Guard:** Not applicable for MVP.
@@ -0,0 +1,121 @@

# State Management In-Depth

> This document is a granulated shard from the main "5-front-end-architecture.md" focusing on "State Management In-Depth".

This section expands on the State Management strategy chosen (Zustand) and outlined in the "Overall Frontend Philosophy & Patterns".

- **Chosen Solution:** **Zustand** (Latest version, as per `architecture.txt`)
- **Rationale:** Zustand was chosen for its simplicity, small bundle size, and unopinionated nature, suitable for BMad DiCaster's relatively simple frontend state needs (e.g., podcast player status). Server-side data is primarily managed by Next.js Server Components.

### Store Structure / Slices

Global client-side state will be organized into distinct "slices" within `store/slices/`. Components can import and use individual stores directly.

- **Conventions:**
  - Each slice in its own file: `store/slices/camelCaseSlice.ts`.
  - Define state interface, initial state, and action functions.
- **Core Slice: `podcastPlayerSlice.ts`** (for MVP)

  - **Purpose:** Manages the state of the podcast player (current track, playback status, time, volume).
  - **Source File:** `store/slices/podcastPlayerSlice.ts`
  - **State Shape (Example):**

    ```typescript
    interface PodcastTrack {
      id: string; // Could be newsletterId or a specific audio ID
      title: string;
      audioUrl: string;
      duration?: number; // in seconds
    }

    interface PodcastPlayerState {
      currentTrack: PodcastTrack | null;
      isPlaying: boolean;
      currentTime: number; // in seconds
      volume: number; // 0 to 1
      isLoading: boolean;
      error: string | null;
    }

    interface PodcastPlayerActions {
      loadTrack: (track: PodcastTrack) => void;
      play: () => void;
      pause: () => void;
      setCurrentTime: (time: number) => void;
      setVolume: (volume: number) => void;
      setError: (message: string | null) => void;
      resetPlayer: () => void;
    }
    ```

  - **Key Actions:** `loadTrack`, `play`, `pause`, `setCurrentTime`, `setVolume`, `setError`, `resetPlayer`.
  - **Zustand Store Definition:**

    ```typescript
    import { create } from "zustand";

    // Previously defined interfaces: PodcastTrack, PodcastPlayerState, PodcastPlayerActions

    const initialPodcastPlayerState: PodcastPlayerState = {
      currentTrack: null,
      isPlaying: false,
      currentTime: 0,
      volume: 0.75,
      isLoading: false,
      error: null,
    };

    export const usePodcastPlayerStore = create<
      PodcastPlayerState & PodcastPlayerActions
    >((set) => ({
      ...initialPodcastPlayerState,
      loadTrack: (track) =>
        set({
          currentTrack: track,
          isLoading: true, // Assume loading until actual audio element confirms
          error: null,
          isPlaying: false, // Usually don't autoplay on load
          currentTime: 0,
        }),
      play: () =>
        set((state) => {
          if (!state.currentTrack) return {}; // No track loaded
          return { isPlaying: true, isLoading: false, error: null };
        }),
      pause: () => set({ isPlaying: false }),
      setCurrentTime: (time) => set({ currentTime: time }),
      setVolume: (volume) => set({ volume: Math.max(0, Math.min(1, volume)) }),
      setError: (message) =>
        set({ error: message, isLoading: false, isPlaying: false }),
      resetPlayer: () => set({ ...initialPodcastPlayerState }),
    }));
    ```
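Building on the store defined above, a Client Component would consume it roughly as follows. This is a minimal sketch; the real `PodcastPlayer` will also wire up an actual `<audio>` element plus time and volume controls.

```tsx
// Illustrative Client Component consuming usePodcastPlayerStore.
"use client";

import { usePodcastPlayerStore } from "@/store/slices/podcastPlayerSlice";

export function PlayPauseButton() {
  // Select only what this component needs from the store.
  const currentTrack = usePodcastPlayerStore((state) => state.currentTrack);
  const isPlaying = usePodcastPlayerStore((state) => state.isPlaying);
  const play = usePodcastPlayerStore((state) => state.play);
  const pause = usePodcastPlayerStore((state) => state.pause);

  if (!currentTrack) return null; // Nothing to control yet.

  return (
    <button onClick={isPlaying ? pause : play} aria-pressed={isPlaying}>
      {isPlaying ? "Pause" : "Play"}
    </button>
  );
}
```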

### Key Selectors

Selectors are functions that derive data from the store state. With Zustand, state is typically accessed directly from the hook; memoized selectors (e.g., via `reselect`) can be added later if complex derived data is needed, but for simple cases direct access is fine.

- **Convention:** For direct state access, components will use: `const { currentTrack, isPlaying, play } = usePodcastPlayerStore();`
- **Example Selectors (if using `reselect` or similar, for more complex derivations later):**
  - `selectCurrentTrackTitle`: Returns `state.currentTrack?.title || 'No track loaded'`.
  - `selectIsPodcastPlaying`: Returns `state.isPlaying`.

### Key Actions / Reducers / Thunks

Zustand actions are functions defined within the `create` call that use `set` to update state. Asynchronous operations (like fetching data, though less common for Zustand, which is mostly used here for UI state) can be handled by calling async functions within these actions and then calling `set` upon completion.

- **Convention:** Actions are part of the store hook: `const { loadTrack } = usePodcastPlayerStore();`.
- **Asynchronous Example (Conceptual, if a slice needed to fetch data):**

  ```typescript
  // In a hypothetical userSettingsSlice.ts
  // fetchUserSettings: async () => {
  //   set({ isLoading: true });
  //   try {
  //     const settings = await api.fetchUserSettings(); // api is an imported service
  //     set({ userSettings: settings, isLoading: false });
  //   } catch (error) {
  //     set({ error: 'Failed to fetch settings', isLoading: false });
  //   }
  // }
  ```

For the BMad DiCaster MVP, most data fetching is via Server Components. Client-side async actions in Zustand would primarily be for client-specific operations not directly tied to server data fetching.
@@ -0,0 +1,31 @@

# Frontend Style Guide

> This document is a granulated shard from the main "5-front-end-architecture.md" focusing on "UI Style Guide, Brand Guidelines, Visual Design Specifications, or Styling Approach".

The frontend for BMad DiCaster will be built using modern, efficient, and maintainable practices, leveraging the Vercel/Supabase Next.js App Router template as a starting point. The core philosophy is to create a responsive, fast-loading, and accessible user interface that aligns with the "synthwave technical glowing purple vibes" aesthetic.

- **Framework & Core Libraries relevant to Styling:**

  - **Next.js (Latest, e.g., 14.x.x, App Router):** Chosen for its robust full-stack capabilities and seamless integration with Vercel for deployment.
  - **React (19.0.0):** As the underlying UI library for Next.js.
  - **TypeScript (5.7.2):** For strong typing and improved code quality.

- **Component Architecture relevant to Styling:**

  - **Shadcn UI (Latest):** This collection of reusable UI components, built on Radix UI and Tailwind CSS, will be used for foundational elements like buttons, cards, dialogs, etc.
  - **Application-Specific Components:** Custom components will be developed for unique UI parts.

- **Styling Approach:**

  - **Tailwind CSS (3.4.17):** A utility-first CSS framework for rapid UI development and consistent styling. It will be used for all styling, including achieving the "synthwave technical glowing purple vibes."
  - **Shadcn UI:** Leverages Tailwind CSS for its components.
  - **Global Styles:** `app/globals.css` will be used for base Tailwind directives and any genuinely global style definitions.
  - **Theme Customization:** `tailwind.config.ts` will be used to extend Tailwind's default theme with custom colors (e.g., synthwave purples like `#800080` as an accent), fonts, or spacing as needed to achieve the desired aesthetic. The "synthwave technical glowing purple vibes" will be achieved through a dark base theme, with purple accents for interactive elements, highlights, and potentially subtle text shadows or glows on specific headings or decorative elements. Font choices will lean towards modern, clean sans-serifs as specified in `ui-ux-spec.txt`, potentially with a more stylized font for major headings if it fits the theme without compromising readability.
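A minimal sketch of what that theme extension could look like is shown below; the color token names, hex shades beyond `#800080`, and glow shadow values are assumptions for illustration, not final design tokens.

```typescript
// tailwind.config.ts (illustrative sketch of the synthwave theme extension)
import type { Config } from "tailwindcss";

const config: Config = {
  darkMode: "class",
  content: ["./app/**/*.{ts,tsx}", "./components/**/*.{ts,tsx}"],
  theme: {
    extend: {
      colors: {
        // Hypothetical token names; #800080 is the accent mentioned above.
        "synth-purple": "#800080",
        "synth-purple-glow": "#b026ff",
        "synth-bg": "#0d0b1e",
      },
      boxShadow: {
        // Soft purple glow for interactive elements and highlighted headings.
        glow: "0 0 12px rgba(176, 38, 255, 0.6)",
      },
    },
  },
  plugins: [],
};

export default config;
```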

- **Visual Design Specifications (derived from UI/UX Spec and Architecture):**
  - **Aesthetic:** "Synthwave technical glowing purple vibes."
  - **Layout:** Minimalist, clean, focusing on content readability.
  - **Color Palette:** Dark base theme with purple accents (e.g., `#800080`). Ensure high contrast for accessibility.
  - **Typography:** Modern, clean sans-serif fonts. Stylized font for major headings if it fits the theme and maintains readability.
  - **Iconography:** (To be determined, likely to use a standard library like Heroicons or Phosphor Icons, integrated as SVG or via Shadcn UI if applicable).
  - **Responsiveness:** The UI must be responsive and adapt to various screen sizes (desktop, tablet, mobile).
@@ -0,0 +1,44 @@

# Frontend Testing Strategy

> This document is a granulated shard from the main "5-front-end-architecture.md" focusing on "Frontend Testing Strategy".

This section elaborates on the overall testing strategy defined in `architecture.txt`, focusing on frontend specifics.

- **Link to Main Testing Strategy:** `docs/architecture.md#overall-testing-strategy` (and `docs/architecture.md#coding-standards` for test file colocation).

### Component Testing

- **Scope:** Testing individual React components in isolation, primarily focusing on UI rendering based on props and basic interactions.
- **Tools:** **Jest** (test runner, assertion library, mocking) and **React Testing Library (RTL)** (for user-centric component querying and interaction).
- **Focus:**
  - Correct rendering based on props.
  - User interactions (e.g., button clicks triggering callbacks).
  - Conditional rendering logic.
  - Accessibility attributes.
- **Location:** Test files (`*.test.tsx` or `*.spec.tsx`) will be co-located with the component files (e.g., `app/components/core/NewsletterCard.test.tsx`).
- **Example Guideline:** "A `NewsletterCard` component should render the title and date passed as props. Clicking the card should navigate (mocked) or call an `onClick` prop."
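A sketch of that guideline as a Jest + RTL test might look like the following. It assumes `@testing-library/jest-dom` matchers are configured, and the `NewsletterCard` props used here (`title`, `date`, `onClick`) are assumptions that may differ from the final component spec.

```tsx
// app/components/core/NewsletterCard.test.tsx (illustrative sketch)
import { render, screen, fireEvent } from "@testing-library/react";
import { NewsletterCard } from "./NewsletterCard";

describe("NewsletterCard", () => {
  it("renders the title and date passed as props", () => {
    render(
      <NewsletterCard title="Daily HN Digest" date="2025-05-14" onClick={jest.fn()} />
    );
    expect(screen.getByText("Daily HN Digest")).toBeInTheDocument();
    expect(screen.getByText("2025-05-14")).toBeInTheDocument();
  });

  it("calls the onClick handler when the card is clicked", () => {
    const handleClick = jest.fn();
    render(
      <NewsletterCard title="Daily HN Digest" date="2025-05-14" onClick={handleClick} />
    );
    fireEvent.click(screen.getByText("Daily HN Digest"));
    expect(handleClick).toHaveBeenCalledTimes(1);
  });
});
```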

### UI Integration/Flow Testing

- **Scope:** Testing interactions between multiple components that compose a piece of UI or a small user flow, potentially with mocked Supabase client responses or Zustand store states.
- **Tools:** Jest and React Testing Library.
- **Focus:**
  - Data flow between a parent and its child components.
  - State updates in a Zustand store affecting multiple components.
  - Rendering of a sequence of UI elements in a simple flow (e.g., selecting an item from a list and seeing details appear).
- **Example Guideline:** "The `NewsletterListPage` should correctly render multiple `NewsletterCard` components when provided with mock newsletter data. Clicking a card should correctly invoke navigation logic."

### End-to-End UI Testing Tools & Scope

- **Tools:** **Playwright**.
- **Scope (Frontend Focus):**
  - Verify the "Viewing a Newsletter" user flow:
    1. Navigate to the newsletter list page.
    2. Verify newsletters are listed.
    3. Click on a newsletter.
    4. Verify the newsletter detail page loads with content.
    5. Verify the podcast player is present if a podcast URL exists.
    6. Verify the download button is present.
    7. Verify the "Back to List" button works.
  - Basic mobile responsiveness checks for key pages (list and detail).
- **Test Data Management for UI:** E2E tests will rely on data populated in the development Supabase instance, or on mocked API responses when targeting isolated frontend tests with Playwright's network interception. For true E2E runs against a live dev environment, pre-seeded data in the Supabase dev instance will be used.
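The flow above could be expressed as a Playwright test roughly like this; the selectors, accessible names, and route paths are assumptions based on this document, not final test code.

```typescript
// tests/e2e/view-newsletter.spec.ts (illustrative sketch)
import { test, expect } from "@playwright/test";

test("user can view a newsletter from the list", async ({ page }) => {
  // 1-2. Navigate to the list page and verify newsletters are listed.
  await page.goto("/newsletters");
  const cards = page.getByRole("link", { name: /newsletter/i });
  await expect(cards.first()).toBeVisible();

  // 3-4. Open the first newsletter and verify the detail page loads.
  await cards.first().click();
  await expect(page).toHaveURL(/\/newsletters\/.+/);
  await expect(page.getByRole("article")).toBeVisible();

  // 5. The podcast player assertion would be conditional on a podcast URL existing.

  // 6-7. Download button is present and "Back to List" navigates back.
  await expect(page.getByRole("button", { name: /download/i })).toBeVisible();
  await page.getByRole("button", { name: /back to list/i }).click();
  await expect(page).toHaveURL(/\/newsletters$/);
});
```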
@@ -0,0 +1,37 @@

# Index

## PRD Epics and Stories

- [Product Requirements Document (PRD)](./prd.md) - The main PRD document, linking to individual epics.
- [Epic 1: Project Initialization, Setup, and HN Content Acquisition](./epic-1.md)
- [Epic 2: Article Scraping](./epic-2.md)
- [Epic 3: AI-Powered Content Summarization](./epic-3.md)
- [Epic 4: Automated Newsletter Creation and Distribution](./epic-4.md)
- [Epic 5: Podcast Generation Integration](./epic-5.md)
- [Epic 6: Web Interface for Initial Structure and Content Access](./epic-6.md)

## Architecture Documents

- [System Architecture Document](./architecture.md) - The main system architecture document, linking to detailed shards.
- [API Reference](./api-reference.md) - Details on external and internal APIs.
- [Component View](./component-view.md) - Logical components and architectural patterns.
- [Data Models](./data-models.md) - Core application entities and database schemas.
- [Environment Variables Documentation](./environment-vars.md) - Placeholder for consolidated environment variable information.
- [Infrastructure and Deployment Overview](./infra-deployment.md) - Cloud providers, core services, and deployment strategy.
- [Key Reference Documents](./key-references.md) - List of key documents referenced in the architecture.
- [Operational Guidelines](./operational-guidelines.md) - Consolidated guidelines for error handling, coding standards, testing, and security.
- [Project Structure](./project-structure.md) - Monorepo organization and key directory descriptions.
- [Sequence Diagrams](./sequence-diagrams.md) - Core workflow and sequence diagrams.
- [Technology Stack](./tech-stack.md) - Definitive technology selections for the project.

### Frontend Specific Architecture Documents

- [Frontend Architecture Document](./front-end-architecture.md) - The main frontend architecture document, linking to detailed shards.
- [Frontend Project Structure](./front-end-project-structure.md) - Detailed frontend directory structure and naming conventions.
- [Frontend Style Guide](./front-end-style-guide.md) - Styling approach, theme customization, and visual design specifications.
- [Frontend Component Guide](./front-end-component-guide.md) - Component naming, organization, and template for component specification.
- [Frontend Coding Standards & Accessibility](./front-end-coding-standards.md) - Frontend-specific coding standards and accessibility (AX) implementation details.
- [Frontend State Management](./front-end-state-management.md) - In-depth details of Zustand store structure, slices, selectors, and actions.
- [Frontend API Interaction Layer](./front-end-api-interaction.md) - Client/service structure for API interactions and frontend error handling.
- [Frontend Routing Strategy](./front-end-routing-strategy.md) - Route definitions and protection mechanisms.
- [Frontend Testing Strategy](./front-end-testing-strategy.md) - Component, UI integration, and end-to-end testing strategies for the frontend.
@@ -0,0 +1 @@

# Infrastructure and Deployment Overview

> This document is a granulated shard from the main "3-architecture.md" focusing on "Infrastructure and Deployment Overview".

- **Cloud Provider(s):**
  - **Vercel:** For hosting the Next.js frontend application, Next.js API routes (including the Play.ht webhook receiver and the workflow trigger API), and Supabase Functions (Edge/Serverless Functions deployed via Supabase CLI and Vercel integration).
  - **Supabase:** Provides the managed PostgreSQL database, authentication, storage, and an environment for deploying backend functions. Supabase itself runs on underlying cloud infrastructure (e.g., AWS).
- **Core Services Used:**
  - **Vercel:** Next.js Hosting (SSR, SSG, ISR, Edge runtime), Serverless Functions (for Next.js API routes), Edge Functions (for Next.js middleware and potentially some API routes), Global CDN, CI/CD (via GitHub integration), Environment Variables Management, Vercel Cron Jobs (for scheduled triggering of the `/api/system/trigger-workflow` endpoint).
  - **Supabase:** PostgreSQL Database, Supabase Auth, Supabase Storage (for temporary file hosting if needed for Play.ht, or other static assets), Supabase Functions (backend logic for the event-driven pipeline, deployed via Supabase CLI, runs on Vercel infrastructure), Database Webhooks (using `pg_net` or built-in functionality to trigger Supabase/Vercel functions), Supabase CLI (for local development, migrations, function deployment).
- **Infrastructure as Code (IaC):**
  - **Supabase Migrations:** SQL migration files in `supabase/migrations/` define the database schema and are managed by the Supabase CLI. This is the primary IaC for the database.
  - **Vercel Configuration:** `vercel.json` (if needed for custom configurations beyond what the Vercel dashboard and Next.js provide) and project settings via the Vercel dashboard.
  - No explicit IaC for Vercel services beyond its declarative nature and Next.js conventions is anticipated for MVP.
- **Deployment Strategy:**
  - **Source Control:** GitHub will be used for version control.
  - **CI/CD Tool:** GitHub Actions (as defined in `/.github/workflows/main.yml`).
  - **Frontend (Next.js app on Vercel):** Continuous deployment triggered by pushes/merges to the main branch. Preview deployments automatically created for pull requests.
  - **Backend (Supabase Functions):** Deployed via Supabase CLI commands (e.g., `supabase functions deploy <function_name> --project-ref <your-project-ref>`), run as part of the GitHub Actions workflow.
  - **Database Migrations (Supabase):** Applied via a CI/CD step using `supabase migration up --linked` or the Supabase CLI against the remote DB.
- **Environments:**
  - **Local Development:** Next.js local dev server (`next dev`), local Supabase stack (`supabase start`), local `.env.local`.
  - **Development/Preview (on Vercel):** Auto-deployed per PR/dev branch push, connected to a **Development Supabase instance**.
  - **Production (on Vercel):** Deployed from the main branch, connected to a **Production Supabase instance**.
- **Environment Promotion:** Local -> Dev/Preview (PR) -> Production (merge to main).
- **Rollback Strategy:** Vercel dashboard/CLI for app/function rollbacks; Supabase migrations (revert migration) or Point-in-Time Recovery for the database.
@@ -0,0 +1,17 @@

# Key Reference Documents

> This document is a granulated shard from the main "3-architecture.md" focusing on "Key Reference Documents".

1. **Product Requirements Document (PRD):** `docs/prd-incremental-full-agile-mode.txt`
2. **UI/UX Specification:** `docs/ui-ux-spec.txt`
3. **Technical Preferences:** `docs/technical-preferences copy.txt`
4. **Environment Variables Documentation:** `docs/environment-vars.md` (To be created)
5. **(Optional) Frontend Architecture Document:** `docs/frontend-architecture.md` (To be created by Design Architect)
6. **Play.ht API Documentation:** [https://docs.play.ai/api-reference/playnote/post](https://docs.play.ai/api-reference/playnote/post)
7. **Hacker News Algolia API:** [https://hn.algolia.com/api](https://hn.algolia.com/api)
8. **Ollama API Documentation:** [https://github.com/ollama/ollama/blob/main/docs/api.md](https://github.com/ollama/ollama/blob/main/docs/api.md)
9. **Supabase Documentation:** [https://supabase.com/docs](https://supabase.com/docs)
10. **Next.js Documentation:** [https://nextjs.org/docs](https://nextjs.org/docs)
11. **Vercel Documentation:** [https://vercel.com/docs](https://vercel.com/docs)
12. **Pino Logging Documentation:** [https://getpino.io/](https://getpino.io/)
13. **Zod Documentation:** [https://zod.dev/](https://zod.dev/)
@@ -0,0 +1,122 @@

# Operational Guidelines

> This document is a granulated shard from the main "3-architecture.md" focusing on "Operational Guidelines (Coding Standards, Testing, Error Handling, Security)".

### Error Handling Strategy

A robust error handling strategy is essential for the reliability of the BMad DiCaster pipeline. This involves consistent error logging, appropriate retry mechanisms, and clear error propagation. The `workflow_runs` table will be a central piece in tracking errors for entire workflow executions.

- **General Approach:**
  - Standard JavaScript `Error` objects (or custom extensions of `Error`) will be used for exceptions within TypeScript code.
  - Each Supabase Function in the pipeline will catch its own errors, log them using Pino, update the `workflow_runs` table with an error status/message (via `WorkflowTrackerService`), and prevent unhandled promise rejections.
  - Next.js API routes will catch errors, log them, and return appropriate HTTP error responses (e.g., 4xx, 500) with a JSON error payload.
- **Logging (Pino):**
  - **Library/Method:** Pino (`pino`) is the standard logging library for Supabase Functions and Next.js API routes.
  - **Configuration:** A shared Pino logger instance (e.g., `supabase/functions/_shared/logger.ts`) will be configured for JSON output, ISO timestamps, and environment-aware pretty-printing for development.

    ```typescript
    // Example: supabase/functions/_shared/logger.ts
    import pino from "pino";

    export const logger = pino({
      level: process.env.LOG_LEVEL || "info",
      formatters: { level: (label) => ({ level: label }) },
      timestamp: pino.stdTimeFunctions.isoTime,
      ...(process.env.NODE_ENV === "development" && {
        transport: {
          target: "pino-pretty",
          options: {
            colorize: true,
            translateTime: "SYS:standard",
            ignore: "pid,hostname",
          },
        },
      }),
    });
    ```

  - **Format:** Structured JSON.
  - **Levels:** `trace`, `debug`, `info`, `warn`, `error`, `fatal`.
  - **Context:** Logs must include `timestamp`, `severity`, `workflowRunId` (where applicable), `service` or `functionName`, a clear `message`, and relevant `details` (sanitized). **Sensitive data must NEVER be logged.** Pass error objects directly to Pino: `logger.error({ err: errorInstance, workflowRunId }, "Operation failed");`.
- **Specific Handling Patterns:**
  - **External API Calls (HN Algolia, Play.ht, LLM Provider):**
    - **Facades:** Calls made through dedicated facades in `supabase/functions/_shared/`.
    - **Timeouts:** Implement reasonable connect and read timeouts.
    - **Retries:** Facades implement limited retries (2-3) with exponential backoff for transient errors (network issues, 5xx errors); a minimal retry sketch appears after this list.
    - **Error Propagation:** Facades catch, log, and throw standardized custom errors (e.g., `ExternalApiError`) containing contextual information.
  - **Internal Errors / Business Logic Exceptions (Supabase Functions):**
    - Use `try...catch`. Critical errors preventing task completion for a `workflow_run_id` must: 1. Log detailed error (Pino). 2. Call `WorkflowTrackerService.failWorkflow(...)`.
    - Next.js API routes return generic JSON errors (e.g., `{"error": "Internal server error"}`) and appropriate HTTP status codes.
  - **Database Operations (Supabase):** Critical errors treated as internal errors (log, update `workflow_runs` to 'failed').
  - **Scraping/Summarization/Podcast/Delivery Failures:** Individual item failures are logged and status updated (e.g., `scraped_articles.scraping_status`). The overall workflow may continue with available data, with partial success noted in `workflow_runs.details`. Systemic failures lead to `workflow_runs.status = 'failed'`.
  - **`CheckWorkflowCompletionService`:** Must be resilient. Errors processing one `workflow_run_id` should be logged but not prevent processing of other runs or subsequent scheduled invocations.
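To illustrate the retry/backoff pattern described for the external API facades, here is a minimal sketch. The function and error names (`fetchWithRetry`, `ExternalApiError`) and the backoff values are illustrative assumptions; the real facades may structure this differently.

```typescript
// Illustrative sketch: limited retries with exponential backoff inside a shared facade.
class ExternalApiError extends Error {
  constructor(message: string, readonly originalError?: unknown) {
    super(message);
    this.name = "ExternalApiError";
  }
}

async function fetchWithRetry(url: string, init?: RequestInit, maxRetries = 2): Promise<Response> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      const response = await fetch(url, init);
      // Retry only transient failures (5xx); 4xx responses are returned to the caller as-is.
      if (response.status >= 500) {
        throw new Error(`Upstream returned ${response.status}`);
      }
      return response;
    } catch (error) {
      lastError = error;
      if (attempt === maxRetries) break;
      // Exponential backoff: 500ms, 1000ms, ...
      await new Promise((resolve) => setTimeout(resolve, 500 * 2 ** attempt));
    }
  }
  throw new ExternalApiError(
    `Request to ${url} failed after ${maxRetries + 1} attempts`,
    lastError
  );
}
```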

### Coding Standards

These standards are mandatory for all code generation by AI agents and human developers.

- **Primary Language & Runtime:** TypeScript `5.7.2`, Node.js `22.10.2`.
- **Style Guide & Linter:** ESLint (configured with Next.js defaults, TypeScript support) and Prettier (`3.3.3`). Configurations in root. Linting/formatting are mandatory.
- **Naming Conventions:**
  - Variables & Functions/Methods: `camelCase`
  - Classes/Types/Interfaces: `PascalCase`
  - Constants: `UPPER_SNAKE_CASE`
  - Files (.ts, .tsx): `kebab-case` (e.g., `newsletter-card.tsx`)
  - Supabase function directories: `kebab-case` (e.g., `hn-content-service`)
- **File Structure:** Adhere to "Project Structure." Unit tests (`*.test.ts(x)`/`*.spec.ts(x)`) co-located with source files.
- **Asynchronous Operations:** Always use `async`/`await` for Promises; ensure proper handling.
- **Type Safety (TypeScript):** Adhere to `tsconfig.json` (`"strict": true`). Avoid `any`; use `unknown` with type narrowing. Shared types in `shared/types/`.
- **Comments & Documentation:** Explain _why_, not _what_. Use TSDoc for exported members. READMEs for modules/services.
- **Dependency Management:** Use `npm`. Vet new dependencies. Pin versions or use `^` for non-breaking updates. Resolve `latest` tags to specific versions upon setup.
- **Environment Variables:** Manage via environment variables (`.env.example` provided). Use Zod for runtime parsing/validation (see the sketch after this list).
- **Modularity & Reusability:** Break down complexity. Use shared utilities/facades.
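As an example of the Zod-based environment validation mentioned above, a shared config module might look roughly like this; the exact schema and variable names (beyond the Play.ht and log-level variables already referenced in these documents) are assumptions.

```typescript
// Illustrative sketch: validate required environment variables at startup with Zod.
import { z } from "zod";

const envSchema = z.object({
  LOG_LEVEL: z
    .enum(["trace", "debug", "info", "warn", "error", "fatal"])
    .default("info"),
  PLAYHT_USER_ID: z.string().min(1),
  PLAYHT_API_KEY: z.string().min(1),
  SUPABASE_URL: z.string().url(),
  SUPABASE_SERVICE_ROLE_KEY: z.string().min(1),
});

// Throws with a descriptive message if any variable is missing or malformed.
export const env = envSchema.parse(process.env);
```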

#### Detailed Language & Framework Conventions

##### TypeScript/Node.js (Next.js & Supabase Functions) Specifics:

- **Immutability:** Prefer immutable data structures (e.g., `Readonly<T>`, `as const`). Follow Zustand patterns for immutable state updates in React.
- **Functional vs. OOP:** Favor functional constructs for data transformation/utilities. Use classes for services/facades managing state or as per framework (e.g., React functional components with Hooks preferred).
- **Error Handling Specifics:** `throw new Error('...')` or custom error classes. Ensure `Promise` rejections are `Error` objects.
- **Null/Undefined Handling:** With `strictNullChecks`, handle explicitly. Avoid `!` non-null assertion; prefer explicit checks, `?.`, `??`.
- **Module System:** Use ES Modules (`import`/`export`) exclusively.
- **Logging Specifics (Pino):** Use shared Pino logger. Include context object (`logger.info({ context }, "message")`), especially `workflowRunId`.
- **Next.js Conventions:** Follow App Router conventions. Use Server Components for data fetching where appropriate. Route Handlers for API endpoints.
- **Supabase Function Conventions:** `index.ts` as entry. Self-contained or use `_shared/` utilities. Secure client initialization (admin vs. user).
- **Code Generation Anti-Patterns to Avoid:** Overly nested logic, single-letter variables (except trivial loops), disabling linter/TS errors without cause, bypassing framework security, monolithic functions.

### Overall Testing Strategy

- **Tools:** Jest (unit/integration), React Testing Library (RTL) (React components), Playwright (E2E). Supabase CLI for local DB/function testing.
- **Unit Tests:**
  - **Scope:** Isolate individual functions, methods, classes, React components. Focus on logic, transformations, component rendering.
  - **Location & Naming:** Co-located with source files (`*.test.ts`, `*.spec.ts`, `*.test.tsx`, `*.spec.tsx`).
  - **Mocking/Stubbing:** Jest mocks for dependencies. External API Facades are mocked when testing services that use them. Facades themselves are tested by mocking the underlying HTTP client or library's network calls.
  - **AI Agent Responsibility:** Generate unit tests covering logic paths, props, events, edge cases, error conditions for new/modified code.
- **Integration Tests:**
  - **Scope:** Interactions between components/services (e.g., API route -> service -> DB).
  - **Location:** `tests/integration/`.
  - **Environment:** Local Supabase dev environment. Consider `msw` for mocking HTTP services called by frontend/backend.
  - **AI Agent Responsibility:** Generate tests for key service interactions or API contracts.
- **End-to-End (E2E) Tests:**
  - **Scope:** Validate complete user flows via UI.
  - **Tool:** Playwright. Location: `tests/e2e/`.
  - **Key Scenarios (MVP):** View newsletter list, view detail, play podcast, download newsletter.
  - **AI Agent Responsibility:** Generate E2E test stubs/scripts for critical paths.
- **Test Coverage:**
  - **Target:** Aim for **80% unit test coverage** for new business logic and critical components. Quality over quantity.
  - **Measurement:** Jest coverage reports.
- **Mocking/Stubbing Strategy (General):** Test one unit at a time. Mock external dependencies for unit tests. For facade unit tests: use the real library but mock its external calls at the library's boundary.
- **Test Data Management:** Inline mock data for unit tests. Factories/fixtures or `seed.sql` for integration/E2E tests.

### Security Best Practices

- **Input Sanitization/Validation:** Zod for all external inputs (API requests, function payloads, external API responses). Validate at component boundaries.
- **Output Encoding:** Rely on React JSX auto-escaping for frontend. Ensure HTML for newsletters is sanitized if dynamic data is injected outside of a secure templating engine.
- **Secrets Management:** Via environment variables (Vercel UI, `.env.local`). Never hardcode or log secrets. Access via `process.env`. Use Supabase service role key only in backend functions.
- **Dependency Security:** Regular `npm audit`. Vet new dependencies.
- **Authentication/Authorization:**
  - Workflow Trigger/Status APIs: API Key (`X-API-KEY`); a minimal check sketch follows this list.
  - Play.ht Webhook: Shared secret or signature verification.
  - Supabase RLS: Enable on tables, define policies (especially for `subscribers` and any data directly queried by frontend).
- **Principle of Least Privilege:** Scope API keys and database roles narrowly.
- **API Security (General):** HTTPS (Vercel default). Consider rate limiting for public APIs. Standard HTTP security headers.
- **Error Handling & Information Disclosure:** Log detailed errors server-side; return generic messages/error IDs to clients.
- **Regular Security Audits/Testing (Post-MVP):** Consider for future enhancements.
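A minimal sketch of the `X-API-KEY` check in a Next.js Route Handler is shown below. It assumes the expected key is supplied via a `WORKFLOW_TRIGGER_API_KEY` environment variable; that variable name, and the response shape, are assumptions for illustration.

```typescript
// app/(api)/system/trigger-workflow/route.ts (illustrative sketch of the API key guard)
import { NextResponse } from "next/server";

export async function POST(request: Request) {
  const providedKey = request.headers.get("x-api-key");

  // Reject requests without the expected API key; details are logged server-side only.
  if (!providedKey || providedKey !== process.env.WORKFLOW_TRIGGER_API_KEY) {
    return NextResponse.json({ error: "Unauthorized" }, { status: 401 });
  }

  // ...trigger the daily workflow here...
  return NextResponse.json({ status: "accepted" }, { status: 202 });
}
```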
@@ -0,0 +1,172 @@

# BMad DiCaster Product Requirements Document (PRD)

## Goal, Objective and Context

**Goal:** To develop a web application that provides a daily, concise summary of top Hacker News (HN) posts, delivered as a newsletter and accessible via a web interface.

**Objective:** To streamline the consumption of HN content by curating the top stories, providing AI-powered summaries, and offering an optional AI-generated podcast version.

**Context:** Busy professionals and enthusiasts want to stay updated on HN but lack the time to sift through numerous posts and discussions. This application will address this problem by automating the delivery of summarized content.

## Functional Requirements (MVP)

- **HN Content Retrieval & Storage:**
  - Daily retrieval of the top 30 Hacker News posts and associated comments using the HN Algolia API.
  - Scraping and storage of up to 10 linked articles per day.
  - Storage of all retrieved data (posts, comments, articles) with date association.
- **AI-Powered Summarization:**
  - AI-powered summarization of the 10 selected articles (2-paragraph summaries).
  - AI-powered summarization of comments for the 10 selected posts (2-paragraph summaries highlighting interesting interactions).
  - Configuration for local or remote LLM usage via environment variables.
- **Newsletter Generation & Delivery:**
  - Generation of a daily newsletter in HTML format, including summaries, links to HN posts and articles, and original post dates/times.
  - Automated delivery of the newsletter to a manually configured list of subscribers in Supabase. The list of emails will be manually populated in the database. Account information for the Nodemailer service will be provided via environment variables.
- **Podcast Generation & Integration:**
  - Integration with Play.ht's PlayNote API for AI-generated podcast creation from the newsletter content.
  - Webhook handler to update the newsletter with the generated podcast link.
- **Web Interface (MVP):**
  - Display of current and past newsletters.
  - Functionality to read the newsletter content within the web page.
  - Download option for newsletters.
  - Web player for listening to generated podcasts.
  - Basic mobile responsiveness for displaying newsletters and podcasts.
- **API & Triggering:**
  - Secure API endpoint to manually trigger the daily workflow, secured with API keys.
  - CLI command to manually trigger the daily workflow locally.

## Non-Functional Requirements (MVP)

- **Performance:**
  - The system should retrieve HN posts and generate the newsletter within a reasonable timeframe (e.g., under 30 minutes) to ensure timely delivery.
  - The web interface should load quickly (e.g., within 2 seconds) to provide a smooth user experience.
- **Scalability:**
  - The system is designed for an initial MVP delivery to 3-5 email subscribers. Scalability beyond this will be considered post-MVP.
- **Security:**
  - The API endpoint for triggering the daily workflow must be secure, using API keys.
  - User data (email addresses) should be stored securely. No other security measures are required for the MVP.
- **Reliability:**
  - No specific uptime or availability requirements are defined for the MVP.
  - The newsletter generation and delivery process should be robust and handle potential errors gracefully.
  - The system must be executable from a local development environment.
- **Maintainability:**
  - The codebase should adhere to good quality coding standards, including separation of concerns.
  - The system should employ facades and factories to facilitate future expansion.
  - The system should be built as an event-driven pipeline, leveraging Supabase to capture data at each stage and trigger subsequent functions asynchronously. This approach aims to mitigate potential timeout issues with Vercel hosting.

## User Interaction and Design Goals

This section captures the high-level vision and goals for the User Experience (UX) to guide the Design Architect.

- **Overall Vision & Experience:**
  - The desired look and feel is modern and minimalist, with synthwave technical glowing purple vibes.
  - Users should have a clean and efficient experience when accessing and consuming newsletter content and podcasts.
- **Key Interaction Paradigms:**
  - Interaction paradigms will be determined by the Design Architect.
- **Core Screens/Views (Conceptual):**
  - The MVP will consist of two pages:
    - A list page to display current and past newsletters.
    - A detail page to display the selected newsletter content, including:
      - Download option for the newsletter.
      - Web player for listening to the generated podcast.
      - The article laid out for viewing.
- **Accessibility Aspirations:**
  - The web interface (Epic 6) will adhere to WCAG 2.1 Level A guidelines as detailed in `frontend-architecture.md`. (Updated during checklist review)
- **Branding Considerations (High-Level):**
  - A logo for the application will be provided.
  - The application will use the name "BMad DiCaster".
- **Target Devices/Platforms:**
  - The application will be designed as a mobile-first responsive web app, ensuring it looks good on both mobile and desktop devices.

## Technical Assumptions

This section captures any existing technical information that will guide the Architect in the technical design.

- The application will be developed using the Next.js/Supabase template and hosted entirely on Vercel.
- This implies a monorepo structure, as the frontend (Next.js) and backend (Supabase functions) will reside within the same repository.
- The backend will primarily leverage serverless functions provided by Vercel and Supabase.
- Frontend development will be in Next.js with React.
- Data storage will be handled by Supabase's PostgreSQL database.
- Separate Supabase instances will be used for development and production environments to ensure data isolation and stability.
- For local development, developers can utilize the Supabase CLI and Vercel CLI to emulate the production environment, primarily for testing functions and deployments, but the development Supabase instance will be the primary source of dev data.
- Testing will include unit tests, integration tests (especially for interactions with Supabase), and end-to-end tests.
- The system should be built as an event-driven pipeline, leveraging Supabase to capture data at each stage and trigger subsequent functions asynchronously to mitigate potential timeout issues with Vercel.

## Epic Overview

_(Note: Epics will be developed sequentially. Development will start with Epic 1 and proceed to the next epic only after the previous one is fully completed and verified. Per the BMAD method, every story must be self-contained and done before the next one is started.)_

_(Note: All UI development across all epics must adhere to mobile responsiveness and Tailwind CSS/theming principles to ensure a consistent and maintainable user experience.)_

**(General Note on Service Implementation for All Epics):** All backend services (Supabase Functions) developed as part of any epic must implement robust error handling. They should log extensively using Pino, ensuring that all log entries include the relevant `workflow_run_id` for traceability. Furthermore, services must interact with the `WorkflowTrackerService` to update the `workflow_runs` table appropriately on both successful completion of their tasks and in case of any failures, recording status and error messages as applicable.

- **Epic 1: Project Initialization, Setup, and HN Content Acquisition**
  - Goal: Establish the foundational project structure, including the Next.js application, Supabase integration, deployment pipeline, API/CLI triggers, core workflow orchestration, and implement functionality to retrieve, process, and store Hacker News posts/comments via a `ContentAcquisitionFacade`, providing data for newsletter generation. Implement the database event mechanism to trigger subsequent processing. Define core configuration tables, seed data, and set up testing frameworks.
- **Epic 2: Article Scraping**
  - Goal: Implement the functionality to scrape and store linked articles from HN posts, enriching the data available for summarization and the newsletter. Ensure this functionality is triggered by database events and can be tested via API/CLI (if retained). Implement the database event mechanism to trigger subsequent processing.
- **Epic 3: AI-Powered Content Summarization**
  - Goal: Integrate AI summarization capabilities, by implementing and using a configurable and testable `LLMFacade`, to generate concise summaries of articles and comments from prompts stored in the database. This will enrich the newsletter content, be triggerable via API/CLI, be triggered by database events, and track progress via `WorkflowTrackerService`.
- **Epic 4: Automated Newsletter Creation and Distribution**
  - Goal: Automate the generation and delivery of the daily newsletter by implementing and using a configurable `EmailDispatchFacade`. This includes handling podcast link availability, being triggerable via API/CLI, orchestration by `CheckWorkflowCompletionService`, and status tracking via `WorkflowTrackerService`.
- **Epic 5: Podcast Generation Integration**
  - Goal: Integrate with an audio generation API (initially Play.ht) by implementing and using a configurable `AudioGenerationFacade` to create podcast versions of the newsletter. This includes handling webhooks to update newsletter data and workflow status. Ensure this is triggerable via API/CLI, orchestrated appropriately, and uses `WorkflowTrackerService`.
- **Epic 6: Web Interface for Initial Structure and Content Access**
  - Goal: Develop a user-friendly, responsive, and accessible web interface, based on the `frontend-architecture.md`, to display newsletters and provide access to podcast content, aligning with the project's visual and technical guidelines. All UI development within this epic must adhere to the "synthwave technical glowing purple vibes" aesthetic using Tailwind CSS and Shadcn UI, ensure basic mobile responsiveness, meet WCAG 2.1 Level A accessibility guidelines (including semantic HTML, keyboard navigation, alt text, color contrast), and optimize images using `next/image`, as detailed in the `frontend-architecture.md` and `ui-ux-spec.txt`.
|
||||
|
||||
---
|
||||
|
||||
**Epic 1: Project Initialization, Setup, and HN Content Acquisition**

> This section has been moved to a dedicated document: [Epic 1: Project Initialization, Setup, and HN Content Acquisition](./epic-1.md)

---

**Epic 2: Article Scraping**

> This section has been moved to a dedicated document: [Epic 2: Article Scraping](./epic-2.md)

---

**Epic 3: AI-Powered Content Summarization**

> This section has been moved to a dedicated document: [Epic 3: AI-Powered Content Summarization](./epic-3.md)

---

**Epic 4: Automated Newsletter Creation and Distribution**

> This section has been moved to a dedicated document: [Epic 4: Automated Newsletter Creation and Distribution](./epic-4.md)

---

**Epic 5: Podcast Generation Integration**

> This section has been moved to a dedicated document: [Epic 5: Podcast Generation Integration](./epic-5.md)

---

**Epic 6: Web Interface for Initial Structure and Content Access**

> This section has been moved to a dedicated document: [Epic 6: Web Interface for Initial Structure and Content Access](./epic-6.md)

---

## Out of Scope Ideas Post MVP

- User Authentication and Management
- Subscription Management
- Admin Dashboard
  - Viewing and updating daily podcast settings
  - Prompt management for summarization
  - UI for template modification
- Enhanced Newsletter Customization
- Additional Content Digests
  - Configuration and creation of different digests
  - Support for content sources beyond Hacker News
- Advanced scraping techniques (e.g., Playwright)

## Change Log

| Change                                           | Date       | Version | Description                                                                                                                                               | Author |
| :----------------------------------------------- | :--------- | :------ | :-------------------------------------------------------------------------------------------------------------------------------------------------------- | :----- |
| Initial Draft                                    | 2025-05-13 | 0.1     | Initial draft of the Product Requirements Document                                                                                                        | 2-pm   |
| Updates from Arch suggestions & Checklist Review | 2025-05-14 | 0.3     | Incorporated changes from `arch-suggested-changes.txt`, `fea-suggested-changes.txt`, and Master Checklist review, including new stories & AC refinements. | 5-posm |
@@ -0,0 +1,110 @@

# Project Structure

> This document is a granulated shard from the main "3-architecture.md" focusing on "Project Structure".

The BMad DiCaster project is organized as a monorepo, leveraging the Vercel/Supabase Next.js App Router template as its foundation.

```plaintext
{project-root}/
├── app/ # Next.js App Router
│   ├── (api)/ # API route handlers
│   │   ├── system/
│   │   │   ├── trigger-workflow/route.ts
│   │   │   └── workflow-status/[jobId]/route.ts
│   │   └── webhooks/
│   │       └── playht/route.ts
│   ├── components/ # Application-specific UI react components
│   │   └── core/ # e.g., NewsletterCard, PodcastPlayer
│   ├── newsletters/
│   │   ├── [newsletterId]/page.tsx
│   │   └── page.tsx
│   ├── auth/ # Auth-related pages and components (from template)
│   ├── login/page.tsx # Login page (from template)
│   ├── layout.tsx
│   └── page.tsx # Homepage
├── components/ # Shadcn UI components root (as configured by components.json)
│   ├── tutorial/ # Example/template components (can be removed)
│   ├── typography/ # Example/template components (can be removed)
│   └── ui/ # Base UI elements (button.tsx, card.tsx etc.)
├── docs/ # Project documentation
│   ├── prd.md # Or prd-incremental-full-agile-mode.txt
│   ├── architecture.md # This document
│   ├── ui-ux-spec.md # Or ui-ux-spec.txt
│   ├── technical-preferences.md # Or technical-preferences copy.txt
│   ├── ADR/ # Architecture Decision Records (to be created as needed)
│   └── environment-vars.md # (To be created)
├── lib/ # General utility functions for frontend (e.g., utils.ts from template)
│   └── utils.ts
├── supabase/ # Supabase specific project files (backend logic)
│   ├── functions/ # Supabase Edge Functions (for event-driven pipeline)
│   │   ├── hn-content-service/index.ts
│   │   ├── article-scraper-service/index.ts
│   │   ├── summarization-service/index.ts
│   │   ├── podcast-generation-service/index.ts
│   │   ├── newsletter-generation-service/index.ts
│   │   ├── check-workflow-completion-service/index.ts # Cron-triggered orchestrator
│   │   └── _shared/ # Shared utilities/facades FOR Supabase backend functions
│   │       ├── supabase-admin-client.ts
│   │       ├── llm-facade.ts
│   │       ├── playht-facade.ts
│   │       ├── nodemailer-facade.ts
│   │       └── workflow-tracker-service.ts # For updating workflow_runs table
│   ├── migrations/ # Database schema migrations
│   │   └── YYYYMMDDHHMMSS_initial_schema.sql
│   └── config.toml # Supabase project configuration (for CLI)
├── public/ # Static assets (images, favicon, etc.)
├── shared/ # Shared code/types between frontend and Supabase functions
│   └── types/
│       ├── api-schemas.ts # Request/response types for app/(api) routes
│       ├── domain-models.ts # Core entity types (HNPost, ArticleSummary etc.)
│       └── index.ts # Barrel file for shared types
├── styles/ # Global styles (e.g., globals.css for Tailwind base)
├── tests/ # Automated tests
│   ├── e2e/ # Playwright E2E tests
│   │   ├── newsletter-view.spec.ts
│   │   └── playwright.config.ts
│   └── integration/ # Integration tests
│       └── api-trigger-workflow.integration.test.ts
│   # Unit tests are co-located with source files, e.g., app/components/core/MyComponent.test.tsx
├── utils/ # Root utilities (from template)
│   └── supabase/ # Supabase helper functions FOR FRONTEND (from template)
│       ├── client.ts # Client-side Supabase client
│       ├── middleware.ts # Logic for Next.js middleware
│       └── server.ts # Server-side Supabase client
├── .env.example
├── .gitignore
├── components.json # Shadcn UI configuration
├── middleware.ts # Next.js middleware (root, uses utils/supabase/middleware.ts)
├── next-env.d.ts
├── next.config.mjs
├── package.json
├── postcss.config.js
├── README.md
├── tailwind.config.ts
└── tsconfig.json
```

### Key Directory Descriptions:

- **`app/`**: Next.js frontend (pages, UI components, Next.js API routes).
- **`app/(api)/`**: Backend API routes hosted on Vercel, including webhook receivers and system triggers.
- **`app/components/core/`**: Application-specific reusable React components.
- **`components/`**: Root for Shadcn UI components.
- **`docs/`**: All project documentation.
- **`lib/`**: Frontend-specific utility functions.
- **`supabase/functions/`**: Backend serverless functions (event-driven pipeline steps).
- **`supabase/functions/_shared/`**: Utilities and facades for these backend functions, including `WorkflowTrackerService`.
- **`supabase/migrations/`**: Database migrations managed by Supabase CLI.
- **`shared/types/`**: TypeScript types/interfaces shared between frontend and `supabase/functions/`. Path alias `@shared/*` to be configured in `tsconfig.json`.
- **`tests/`**: Contains E2E and integration tests. Unit tests are co-located with source files.
- **`utils/supabase/`**: Frontend-focused Supabase client helpers provided by the starter template.

### Monorepo Management:

- Standard `npm` (or `pnpm`/`yarn` workspaces if adopted later) for managing dependencies.
- The root `tsconfig.json` includes path aliases (`@/*`, `@shared/*`).
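
A minimal sketch of the corresponding `tsconfig.json` entries is shown below; the rest of the compiler options come from the starter template, and the exact target globs are illustrative assumptions.

```json
{
  "compilerOptions": {
    "baseUrl": ".",
    "paths": {
      "@/*": ["./*"],
      "@shared/*": ["./shared/*"]
    }
  }
}
```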

### Notes:

- Supabase functions in `supabase/functions/` are deployed with the Supabase CLI; the Vercel integration handles deployment of the Next.js application.
- The `CheckWorkflowCompletionService` might be invoked via a Vercel Cron Job calling a simple HTTP trigger endpoint for that function, or via `pg_cron` if direct database scheduling is preferred.
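
If the Vercel Cron option is chosen, the `vercel.json` configuration could look roughly like the sketch below; the endpoint path and schedule are illustrative assumptions, not settled decisions (the `pg_cron` alternative would instead schedule the invocation inside Postgres).

```json
{
  "crons": [
    {
      "path": "/api/system/check-workflow-completion",
      "schedule": "*/15 * * * *"
    }
  ]
}
```
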
@@ -0,0 +1,216 @@

# Core Workflow / Sequence Diagrams

> This document is a granulated shard from the main "3-architecture.md" focusing on "Core Workflow / Sequence Diagrams".

These diagrams illustrate the key sequences of operations in the BMad DiCaster system.

### 1\. Daily Workflow Initiation & HN Content Acquisition

This diagram shows the manual/API trigger initiating a new workflow run, followed by the fetching of Hacker News posts and comments.

```mermaid
sequenceDiagram
    actor Caller as Manual/API/CLI/Cron
    participant TriggerAPI as POST /api/system/trigger-workflow
    participant WorkflowRunsDB as workflow_runs (DB Table)
    participant WorkflowTracker as WorkflowTrackerService
    participant HNContentService as HNContentService (Supabase Fn)
    participant HNAlgoliaAPI as HN Algolia API
    participant HNPostsDB as hn_posts (DB Table)
    participant HNCommentsDB as hn_comments (DB Table)
    participant EventTrigger1 as DB Event/Webhook (on hn_posts insert)

    Caller->>+TriggerAPI: Request to start daily workflow
    TriggerAPI->>+WorkflowTracker: initiateNewWorkflow()
    WorkflowTracker->>+WorkflowRunsDB: INSERT new run (status='pending', details={})
    WorkflowRunsDB-->>-WorkflowTracker: new_workflow_run_id
    WorkflowTracker-->>TriggerAPI: { jobId: new_workflow_run_id }
    TriggerAPI-->>-Caller: HTTP 202 Accepted { jobId }

    alt Initial Trigger for HN Content Fetch
        WorkflowTracker->>+HNContentService: triggerFetch(workflow_run_id)
        Note over WorkflowTracker,HNContentService: This could be a direct call or an event insertion that HNContentService picks up.
    else Alternative: Event from WorkflowRunsDB insert
        WorkflowRunsDB-->>EventTrigger1: New workflow_run record
        EventTrigger1->>+HNContentService: Invoke(workflow_run_id, event_payload)
    end

    HNContentService->>+WorkflowTracker: updateWorkflowStep(workflow_run_id, 'fetching_hn_posts', 'fetching_hn')
    WorkflowTracker->>+WorkflowRunsDB: UPDATE workflow_runs (status, current_step_details)
    WorkflowRunsDB-->>-WorkflowTracker: ack

    HNContentService->>+HNAlgoliaAPI: GET /search?tags=front_page
    HNAlgoliaAPI-->>-HNContentService: Front page story items

    loop For each story item (up to 30 after sorting by points)
        HNContentService->>+HNPostsDB: INSERT story (hn_post_id, title, url, points, created_at, workflow_run_id)
        HNPostsDB-->>EventTrigger1: Notifies: New hn_post inserted
        EventTrigger1-->>ArticleScrapingService: (Async) Trigger ArticleScrapingService(hn_post_id, workflow_run_id)
        Note right of EventTrigger1: Triggers article scraping (next diagram)

        HNContentService->>+HNAlgoliaAPI: GET /items/{story_objectID} (to fetch comments)
        HNAlgoliaAPI-->>-HNContentService: Story details with comments
        loop For each comment
            HNContentService->>+HNCommentsDB: INSERT comment (comment_id, hn_post_id, text, author, created_at)
            HNCommentsDB-->>-HNContentService: ack
        end
    end
    HNContentService->>+WorkflowTracker: updateWorkflowDetails(workflow_run_id, {posts_fetched: X, comments_fetched: Y})
    WorkflowTracker->>+WorkflowRunsDB: UPDATE workflow_runs (details)
    WorkflowRunsDB-->>-WorkflowTracker: ack
    Note over HNContentService: HN Content Service might mark its part for the workflow as 'hn_data_fetched'. The overall workflow status will be managed by CheckWorkflowCompletionService.
```
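
For reference, the "up to 30 after sorting by points" selection inside the loop above could be implemented along these lines. This is a sketch only; the `AlgoliaHit` shape is abbreviated to the fields used here.

```typescript
// Sketch of the post-selection step: sort front-page hits by points
// (descending) and keep at most the top 30 before inserting into hn_posts.
interface AlgoliaHit {
  objectID: string;
  title: string;
  url: string | null;
  points: number;
  created_at: string;
}

export function selectTopStories(hits: AlgoliaHit[], limit = 30): AlgoliaHit[] {
  return [...hits].sort((a, b) => b.points - a.points).slice(0, limit);
}
```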

### 2\. Article Scraping & Summarization Flow

This diagram shows the flow starting from a new HN post being available, leading to article scraping, and then summarization of the article content and HN comments.

```mermaid
sequenceDiagram
    participant EventTrigger1 as DB Event/Webhook (on hn_posts insert)
    participant ArticleScrapingService as ArticleScrapingService (Supabase Fn)
    participant ScrapedArticlesDB as scraped_articles (DB Table)
    participant WorkflowTracker as WorkflowTrackerService
    participant WorkflowRunsDB as workflow_runs (DB Table)
    participant EventTrigger2 as DB Event/Webhook (on scraped_articles insert/update)
    participant SummarizationService as SummarizationService (Supabase Fn)
    participant LLMFacade as LLMFacade (shared function)
    participant LLMProvider as LLM Provider (Ollama/Remote)
    participant SummariesDB as article_summaries / comment_summaries (DB Tables)
    participant PromptsDB as summarization_prompts (DB Table)

    EventTrigger1->>+ArticleScrapingService: Invoke(hn_post_id, workflow_run_id, article_url)
    ArticleScrapingService->>+WorkflowTracker: updateWorkflowStep(workflow_run_id, 'scraping_article_for_post_' + hn_post_id, 'scraping_articles')
    WorkflowTracker->>WorkflowRunsDB: UPDATE workflow_runs (current_step_details)

    ArticleScrapingService->>ArticleScrapingService: Identify relevant URL from hn_post (if multiple)
    ArticleScrapingService->>+ScrapedArticlesDB: INSERT new article (hn_post_id, original_url, status='pending', workflow_run_id)
    ScrapedArticlesDB-->>-ArticleScrapingService: new_scraped_article_id

    alt Article URL is valid and scrapeable
        ArticleScrapingService->>ArticleScrapingService: Fetch HTML content from article_url (using Cheerio compatible fetch)
        ArticleScrapingService->>ArticleScrapingService: Parse HTML with Cheerio, extract title, author, date, main_text
        ArticleScrapingService->>+ScrapedArticlesDB: UPDATE scraped_articles SET main_text_content, title, author, status='success' WHERE id=new_scraped_article_id
    else Scraping fails or URL invalid
        ArticleScrapingService->>+ScrapedArticlesDB: UPDATE scraped_articles SET status='failed_parsing/unreachable', error_message='...' WHERE id=new_scraped_article_id
    end
    ScrapedArticlesDB-->>EventTrigger2: Notifies: New/Updated scraped_article (status='success')
    EventTrigger2-->>SummarizationService: (Async) Trigger SummarizationService(scraped_article_id, workflow_run_id, 'article')
    Note right of EventTrigger2: Triggers article summarization

    ArticleScrapingService->>+WorkflowTracker: updateWorkflowDetails(workflow_run_id, {articles_attempted_increment: 1, articles_scraped_successfully_increment: (success ? 1:0) })
    WorkflowTracker->>WorkflowRunsDB: UPDATE workflow_runs (details)

    %% Comment data is read from the hn_posts / hn_comments tables (not shown as participants here).
    Note right of SummarizationService: HN Comments are also summarized for the hn_post_id associated with this workflow_run_id. This might be a separate invocation or part of a broader summarization task for the post.
    SummarizationService->>+WorkflowTracker: updateWorkflowStep(workflow_run_id, 'summarizing_content_for_post_' + hn_post_id, 'summarizing_content')
    WorkflowTracker->>WorkflowRunsDB: UPDATE workflow_runs (current_step_details)

    alt Summarize Article
        SummarizationService->>SummarizationService: Get text_content from scraped_articles WHERE id=scraped_article_id
        SummarizationService->>+PromptsDB: SELECT prompt_text WHERE is_default_article_prompt=TRUE
        PromptsDB-->>-SummarizationService: article_prompt_text
        SummarizationService->>+LLMFacade: generateSummary(text_content, {prompt: article_prompt_text})
        LLMFacade->>+LLMProvider: Request summary (Ollama or Remote API call)
        LLMProvider-->>-LLMFacade: summary_response
        LLMFacade-->>-SummarizationService: article_summary_text
        SummarizationService->>+SummariesDB: INSERT into article_summaries (scraped_article_id, summary_text, workflow_run_id, llm_model_used)
        SummariesDB-->>-SummarizationService: ack
    end

    alt Summarize Comments (for each relevant hn_post_id in the workflow_run)
        SummarizationService->>SummarizationService: Get all comments for hn_post_id from hn_comments table
        SummarizationService->>SummarizationService: Concatenate/prepare comment text
        SummarizationService->>+PromptsDB: SELECT prompt_text WHERE is_default_comment_prompt=TRUE
        PromptsDB-->>-SummarizationService: comment_prompt_text
        SummarizationService->>+LLMFacade: generateSummary(all_comments_text, {prompt: comment_prompt_text})
        LLMFacade->>+LLMProvider: Request summary
        LLMProvider-->>-LLMFacade: summary_response
        LLMFacade-->>-SummarizationService: comment_summary_text
        SummarizationService->>+SummariesDB: INSERT into comment_summaries (hn_post_id, summary_text, workflow_run_id, llm_model_used)
        SummariesDB-->>-SummarizationService: ack
    end
    SummarizationService->>+WorkflowTracker: updateWorkflowDetails(workflow_run_id, {summaries_generated_increment: 1_or_2})
    WorkflowTracker->>WorkflowRunsDB: UPDATE workflow_runs (details)
    Note over SummarizationService: After all expected summaries for the workflow_run are done, the CheckWorkflowCompletionService will eventually pick this up.
```
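
The `generateSummary` calls above assume a facade roughly shaped like the sketch below. The option names are illustrative assumptions; the real interface lives in `supabase/functions/_shared/llm-facade.ts`.

```typescript
// Sketch of the facade implied by the diagram: a single generateSummary call
// that hides whether Ollama or a remote LLM API serves the request.
export interface SummaryOptions {
  prompt: string; // prompt_text loaded from summarization_prompts
  maxTokens?: number; // assumed tuning knob, not specified in the architecture
}

export interface LLMFacade {
  generateSummary(textContent: string, options: SummaryOptions): Promise<string>;
}

// Illustrative usage inside SummarizationService:
// const summary = await llmFacade.generateSummary(articleText, { prompt: articlePromptText });
```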

### 3\. Newsletter, Podcast, and Delivery Flow

This diagram shows the steps from completed summarization to newsletter generation, podcast creation, webhook handling, and final email delivery. It assumes the `CheckWorkflowCompletionService` has determined that all summaries for a given `workflow_run_id` are ready.

```mermaid
sequenceDiagram
    participant CheckWorkflowService as CheckWorkflowCompletionService (Supabase Cron Fn)
    participant WorkflowRunsDB as workflow_runs (DB Table)
    participant WorkflowTracker as WorkflowTrackerService
    participant NewsletterGenService as NewsletterGenerationService (Supabase Fn)
    participant PodcastGenService as PodcastGenerationService (Supabase Fn)
    participant PlayHTAPI as Play.ht API
    participant NewsletterTemplatesDB as newsletter_templates (DB Table)
    participant SummariesDB as article_summaries / comment_summaries (DB Tables)
    participant NewslettersDB as newsletters (DB Table)
    participant PlayHTWebhook as POST /api/webhooks/playht (Next.js API Route)
    participant NodemailerService as NodemailerFacade (shared function)
    participant SubscribersDB as subscribers (DB Table)
    participant ExternalEmailService as Email Service (e.g., Gmail SMTP)

    CheckWorkflowService->>+WorkflowRunsDB: Query for runs with status 'summarizing_content' and all summaries complete
    WorkflowRunsDB-->>-CheckWorkflowService: workflow_run_id (ready for newsletter)

    CheckWorkflowService->>+WorkflowTracker: updateWorkflowStep(workflow_run_id, 'starting_newsletter_generation', 'generating_newsletter')
    WorkflowTracker->>+WorkflowRunsDB: UPDATE workflow_runs (status, current_step_details)
    WorkflowRunsDB-->>-WorkflowTracker: ack
    CheckWorkflowService->>+NewsletterGenService: Invoke(workflow_run_id)

    NewsletterGenService->>+NewsletterTemplatesDB: SELECT html_content, version WHERE is_default=TRUE
    NewsletterTemplatesDB-->>-NewsletterGenService: template_html, template_version
    NewsletterGenService->>+SummariesDB: SELECT article_summaries, comment_summaries WHERE workflow_run_id=...
    SummariesDB-->>-NewsletterGenService: summaries_data
    NewsletterGenService->>NewsletterGenService: Compile HTML newsletter using template and summaries_data
    NewsletterGenService->>+NewslettersDB: INSERT newsletter (workflow_run_id, title, html_content, podcast_status='pending', delivery_status='pending', target_date)
    NewslettersDB-->>-NewsletterGenService: new_newsletter_id

    NewsletterGenService->>+PodcastGenService: initiatePodcast(new_newsletter_id, html_content_for_podcast, workflow_run_id)
    PodcastGenService->>WorkflowTracker: updateWorkflowStep(workflow_run_id, 'podcast_generation_initiated', 'generating_podcast')
    WorkflowTracker->>WorkflowRunsDB: UPDATE workflow_runs
    PodcastGenService->>+PlayHTAPI: POST /playnotes (sourceFile=html_content, webHookUrl=...)
    PlayHTAPI-->>-PodcastGenService: { playht_job_id, status: 'generating' }
    PodcastGenService->>+NewslettersDB: UPDATE newsletters SET podcast_playht_job_id, podcast_status='generating' WHERE id=new_newsletter_id
    NewslettersDB-->>-PodcastGenService: ack
    Note over NewsletterGenService, PodcastGenService: Newsletter is now generated; podcast is being generated by Play.ht. Email delivery will wait for podcast completion or timeout.

    PlayHTAPI-->>+PlayHTWebhook: POST (status='completed', audioUrl='...', id=playht_job_id)
    PlayHTWebhook->>+NewslettersDB: UPDATE newsletters SET podcast_url, podcast_status='completed' WHERE podcast_playht_job_id=...
    NewslettersDB-->>-PlayHTWebhook: ack
    PlayHTWebhook->>+WorkflowTracker: updateWorkflowDetails(workflow_run_id_from_newsletter, {podcast_status: 'completed'})
    WorkflowTracker->>WorkflowRunsDB: UPDATE workflow_runs (details)
    PlayHTWebhook-->>-PlayHTAPI: HTTP 200 OK

    CheckWorkflowService->>+WorkflowRunsDB: Query for runs with status 'generating_podcast' AND newsletters.podcast_status IN ('completed', 'failed') OR timeout reached
    WorkflowRunsDB-->>-CheckWorkflowService: workflow_run_id (ready for delivery)

    CheckWorkflowService->>+WorkflowTracker: updateWorkflowStep(workflow_run_id, 'starting_newsletter_delivery', 'delivering_newsletter')
    WorkflowTracker->>+WorkflowRunsDB: UPDATE workflow_runs (status, current_step_details)
    WorkflowRunsDB-->>-WorkflowTracker: ack
    CheckWorkflowService->>+NewsletterGenService: triggerDelivery(newsletter_id_for_workflow_run)

    NewsletterGenService->>+NewslettersDB: SELECT html_content, podcast_url WHERE id=newsletter_id
    NewslettersDB-->>-NewsletterGenService: newsletter_data
    NewsletterGenService->>NewsletterGenService: (If podcast_url available, embed it in html_content)
    NewsletterGenService->>+SubscribersDB: SELECT email WHERE is_active=TRUE
    SubscribersDB-->>-NewsletterGenService: subscriber_emails[]

    loop For each subscriber_email
        NewsletterGenService->>+NodemailerService: sendEmail(to=subscriber_email, subject=newsletter_title, html=final_html_content)
        NodemailerService->>+ExternalEmailService: SMTP send
        ExternalEmailService-->>-NodemailerService: delivery_success/failure
        NodemailerService-->>-NewsletterGenService: status
    end
    NewsletterGenService->>+NewslettersDB: UPDATE newsletters SET delivery_status='sent' (or 'partially_failed'), sent_at=now()
    NewslettersDB-->>-NewsletterGenService: ack
    NewsletterGenService->>+WorkflowTracker: completeWorkflow(workflow_run_id, {delivery_status: 'sent', subscribers_notified: X})
    WorkflowTracker->>+WorkflowRunsDB: UPDATE workflow_runs (status='completed', details)
    WorkflowRunsDB-->>-WorkflowTracker: ack
```
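
A condensed sketch of the webhook step in the diagram above is shown below. The payload fields follow the diagram; the Supabase client setup and environment variable names are assumptions for illustration, not the finalized handler.

```typescript
// app/(api)/webhooks/playht/route.ts (condensed sketch)
// Records the podcast URL and status on the matching newsletter row when
// Play.ht calls back, as shown in the diagram above.
import { NextResponse } from "next/server";
import { createClient } from "@supabase/supabase-js";

export async function POST(request: Request) {
  const payload = await request.json(); // { id, status, audioUrl } per the diagram

  const supabase = createClient(
    process.env.SUPABASE_URL!, // assumed env var names
    process.env.SUPABASE_SERVICE_ROLE_KEY!
  );

  if (payload.status === "completed") {
    await supabase
      .from("newsletters")
      .update({ podcast_url: payload.audioUrl, podcast_status: "completed" })
      .eq("podcast_playht_job_id", payload.id);
  }

  return NextResponse.json({ received: true }, { status: 200 });
}
```
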
@@ -0,0 +1,37 @@

# Definitive Tech Stack Selections

> This document is a granulated shard from the main "3-architecture.md" focusing on "Definitive Tech Stack Selections".

This section outlines the definitive technology choices for the BMad DiCaster project. These selections are the single source of truth for all technology choices. "Latest" implies the latest stable version available at the time of project setup (2025-05-13); the specific version chosen should be pinned in `package.json` and this document updated accordingly.

- **Preferred Starter Template Frontend & Backend:** Vercel/Supabase Next.js App Router Template ([https://vercel.com/templates/next.js/supabase](https://vercel.com/templates/next.js/supabase))

| Category | Technology | Version / Details | Description / Purpose | Justification (Optional, from PRD/User) |
| :------------------- | :-------------------------- | :----------------------------------------- | :------------------------------------------------------------------------ | :----------------------------------------------------------- |
| **Languages** | TypeScript | `5.7.2` | Primary language for backend/frontend | Strong typing, community support, aligns with Next.js/React |
| **Runtime** | Node.js | `22.10.2` | Server-side execution environment for Next.js & Supabase Functions | Compatible with Next.js, Vercel environment |
| **Frameworks** | Next.js | `latest` (e.g., 14.2.3 at time of writing) | Full-stack React framework | App Router, SSR, API routes, Vercel synergy |
| | React | `19.0.0` | Frontend UI library | Component-based, declarative |
| **UI Libraries** | Tailwind CSS | `3.4.17` | Utility-first CSS framework | Rapid UI development, consistent styling |
| | Shadcn UI | `latest` (CLI based) | React component library (via CLI) | Pre-styled, accessible components, built on Radix & Tailwind |
| **Databases** | PostgreSQL | (via Supabase) | Primary relational data store | Provided by Supabase, robust, scalable |
| **Cloud Platform** | Vercel | N/A | Hosting platform for Next.js app & Supabase Functions | Seamless Next.js/Supabase deployment, Edge Network |
| **Cloud Services** | Supabase Functions | N/A (via Vercel deploy) | Serverless compute for backend pipeline & APIs | Integrated with Supabase DB, event-driven capabilities |
| | Supabase Auth | N/A | User authentication and management | Integrated with Supabase, RLS |
| | Supabase Storage | N/A | File storage (e.g., for temporary newsletter files if needed for Play.ht) | Integrated with Supabase |
| **Infrastructure** | Supabase CLI | `latest` | Local development, migrations, function deployment | Official tool for Supabase development |
| | Docker | `latest` (via Supabase CLI) | Containerization for local Supabase services | Local development consistency |
| **State Management** | Zustand | `latest` | Frontend state management | Simple, unopinionated, performant for React |
| **Testing** | React Testing Library (RTL) | `latest` | Testing React components | User-centric testing, works well with Jest |
| | Jest | `latest` | Unit/Integration testing framework for JS/TS | Widely used, good support for Next.js/React |
| | Playwright | `latest` | End-to-end testing framework | Modern, reliable, cross-browser |
| **CI/CD** | GitHub Actions | N/A | Continuous Integration/Deployment | Integration with GitHub, automation of build/deploy/test |
| **Other Tools** | Cheerio | `latest` | HTML parsing/scraping for articles | Server-side HTML manipulation |
| | Nodemailer | `latest` | Email sending library for newsletters | Robust email sending from Node.js |
| | Zod | `latest` | TypeScript-first schema declaration and validation | Data validation for API inputs, environment variables etc. |
| | `tsx` / `ts-node` | `latest` (for scripts) | TypeScript execution for Node.js scripts (e.g. `scripts/`) | Running TS scripts directly |
| | Prettier | `3.3.3` | Code formatter | Consistent code style |
| | ESLint | `latest` | Linter for TypeScript/JavaScript | Code quality and error prevention |
| | Pino | `latest` | High-performance JSON logger for Node.js | Structured and efficient logging |
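
As an example of the Zod usage called out above, environment variables can be validated once at startup. This is a minimal sketch; the variable names below are illustrative, not a definitive list.

```typescript
// Sketch of environment variable validation with Zod at startup.
import { z } from "zod";

const envSchema = z.object({
  SUPABASE_URL: z.string().url(),
  SUPABASE_SERVICE_ROLE_KEY: z.string().min(1),
  SMTP_CONNECTION_URL: z.string().min(1), // illustrative name for the Nodemailer transport
});

// Throws with a readable error if anything is missing or malformed.
export const env = envSchema.parse(process.env);
```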