massive v2 update, and v1 moved to legacy folder
0
LEGACY-V1/ai/stories/readme.md
Normal file
187
LEGACY-V1/ai/templates/architecture-template.md
Normal file
@@ -0,0 +1,187 @@
# Architecture for {PRD Title}

Status: { Draft | Approved }

## Technical Summary

{ Short 1-2 paragraph }

## Technology Table

Table listing choices for languages, libraries, infra, cloud resources, etc... may add more detail or refinement than what was in the PRD

<example>
| Technology | Version | Description |
| ---------- | ------- | ----------- |
| Kubernetes | x.y.z | Container orchestration platform for microservices deployment |
| Apache Kafka | x.y.z | Event streaming platform for real-time data ingestion |
| TimescaleDB | x.y.z | Time-series database for sensor data storage |
| Go | x.y.z | Primary language for data processing services |
| Gorilla Mux | x.y.z | REST API Framework |
| Python | x.y.z | Used for data analysis and ML services |
| DeepSeek LLM | R3 | Locally hosted via Ollama and remote hosted API, used for customer chat engagement |

</example>

## **High-Level Overview**

Define the architectural style (e.g., Monolith, Microservices, Serverless) and justify the choice based on the PRD. Include a high-level diagram (e.g., C4 Context or Container level using Mermaid syntax).
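
For instance, a minimal container-level sketch in Mermaid, assuming the sensor-platform stack from the example Technology Table above (service names are illustrative assumptions, not prescriptions):

```mermaid
%% Illustrative sketch only - services and names are assumed, not mandated by any PRD
flowchart TD
    Sensors[IoT Sensors] -->|readings| Gateway[Gateway Service]
    Gateway -->|events| Kafka[(Apache Kafka)]
    Kafka --> Processor[Processor Service]
    Processor --> TSDB[(TimescaleDB)]
    Processor --> Notifier[Notifier Service]
    Notifier -->|alerts| Ops[Operations Team]
    Analytics[Analytics and ML Service] --> TSDB
```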
### **Component View**

Identify major logical components/modules/services, outline their responsibilities, and describe key interactions/APIs between them. Include diagrams if helpful (e.g., C4 Container/Component or class diagrams using Mermaid syntax).
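
As a hedged illustration of the component level (again assuming the example sensor platform, with method names invented for the sketch), a Mermaid class diagram might look like:

```mermaid
%% Illustrative sketch only - components and methods are assumptions for this example
classDiagram
    class GatewayService {
        +ingestReading(reading)
    }
    class ProcessorService {
        +validate(reading)
        +analyze(reading)
    }
    class NotifierService {
        +sendAlert(alert)
    }
    GatewayService --> ProcessorService : forwards readings
    ProcessorService --> NotifierService : triggers alerts
```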
## Architectural Diagrams, Data Models, Schemas

{ Mermaid Diagrams for architecture }
{ Data Models, API Specs, Schemas }

<example>

### Dynamo One Table Design for App Table

```json
{
  "TableName": "AppTable",
  "KeySchema": [
    { "AttributeName": "PK", "KeyType": "HASH" },
    { "AttributeName": "SK", "KeyType": "RANGE" }
  ],
  "AttributeDefinitions": [
    { "AttributeName": "PK", "AttributeType": "S" },
    { "AttributeName": "SK", "AttributeType": "S" },
    { "AttributeName": "GSI1PK", "AttributeType": "S" },
    { "AttributeName": "GSI1SK", "AttributeType": "S" }
  ],
  "GlobalSecondaryIndexes": [
    {
      "IndexName": "GSI1",
      "KeySchema": [
        { "AttributeName": "GSI1PK", "KeyType": "HASH" },
        { "AttributeName": "GSI1SK", "KeyType": "RANGE" }
      ],
      "Projection": { "ProjectionType": "ALL" }
    }
  ],
  "EntityExamples": [
    {
      "PK": "USER#123",
      "SK": "PROFILE",
      "GSI1PK": "USER",
      "GSI1SK": "John Doe",
      "email": "john@example.com",
      "createdAt": "2023-05-01T12:00:00Z"
    },
    {
      "PK": "USER#123",
      "SK": "ORDER#456",
      "GSI1PK": "ORDER",
      "GSI1SK": "2023-05-15T09:30:00Z",
      "total": 129.99,
      "status": "shipped"
    },
    {
      "PK": "PRODUCT#789",
      "SK": "DETAILS",
      "GSI1PK": "PRODUCT",
      "GSI1SK": "Wireless Headphones",
      "price": 79.99,
      "inventory": 42
    }
  ]
}
```

### Sequence Diagram for Recording Alerts

```mermaid
sequenceDiagram
    participant Sensor
    participant API
    participant ProcessingService
    participant Database
    participant NotificationService

    Sensor->>API: Send sensor reading
    API->>ProcessingService: Forward reading data
    ProcessingService->>ProcessingService: Validate & analyze data
    alt Is threshold exceeded
        ProcessingService->>Database: Store alert
        ProcessingService->>NotificationService: Trigger notification
        NotificationService->>NotificationService: Format alert message
        NotificationService-->>API: Send notification status
    else Normal reading
        ProcessingService->>Database: Store reading only
    end
    Database-->>ProcessingService: Confirm storage
    ProcessingService-->>API: Return processing result
    API-->>Sensor: Send acknowledgement
```

### Sensor Reading Schema

```json
{
  "sensor_id": "string",
  "timestamp": "datetime",
  "readings": {
    "temperature": "float",
    "pressure": "float",
    "humidity": "float"
  },
  "metadata": {
    "location": "string",
    "calibration_date": "datetime"
  }
}
```

</example>
## Project Structure

{ Diagram the folder and file organization structure along with descriptions }

```
/src
├── /services
│   ├── /gateway        # Sensor data ingestion
│   ├── /processor      # Data processing and validation
│   ├── /analytics      # Data analysis and ML
│   └── /notifier       # Alert and notification system
├── /deploy
│   ├── /kubernetes     # K8s manifests
│   └── /terraform      # Infrastructure as Code
└── /docs
    ├── /api            # API documentation
    └── /schemas        # Data schemas
```
## Testing Requirements and Framework

### Patterns and Standards (Opinionated & Specific)

- **Architectural/Design Patterns:** Mandate specific patterns to be used (e.g., Repository Pattern for data access, MVC/MVVM for structure, CQRS if applicable).

- **API Design Standards:** Define the API style (e.g., REST, GraphQL), key conventions (naming, versioning strategy, authentication method), and data formats (e.g., JSON).

- **Coding Standards:** Specify the mandatory style guide (e.g., Airbnb JavaScript Style Guide, PEP 8), code formatter (e.g., Prettier), and linter (e.g., ESLint with specific config). Define mandatory naming conventions (files, variables, classes). Define test file location conventions.

- **Error Handling Strategy:** Outline the standard approach for logging errors, propagating exceptions, and formatting error responses, as in the sketch below.
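
A minimal sketch of what a standard error response could look like, assuming a JSON API; the field names and values here are illustrative assumptions, not a mandated contract:

```json
{
  "error": {
    "code": "SENSOR_READING_INVALID",
    "message": "Reading rejected: temperature outside calibrated range",
    "requestId": "req-123e4567",
    "timestamp": "2023-05-01T12:00:00Z"
  }
}
```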
### Initial Project Setup (Manual Steps)

Define Story 0: Explicitly state initial setup tasks for the user. Expand on what was in the PRD if it was already present but not sufficient; otherwise just repeat it. Examples:

- Framework CLI Generation: Specify exact command (e.g., `npx create-next-app@latest...`, `ng new...`). Justify why manual is preferred.
- Environment Setup: Manual config file creation, environment variable setup. Register for Cloud DB Account.
- LLM: Set up a local LLM, or register for an API key if using a remote one.
## Infrastructure and Deployment

{ cloud accounts and resources we will need to provision and for what purpose }
{ Specify the target deployment environment (e.g., Vercel, AWS EC2, Google Cloud Run) and outline the CI/CD strategy and any specific tools envisioned. }
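
For instance (an illustrative sketch assuming the sample sensor-platform stack used elsewhere in this template, not a recommendation):

<example>

- Cloud resources: one cloud account with a managed Kubernetes cluster for the services, a managed Kafka cluster for ingestion, and a TimescaleDB instance for sensor data
- Deployment target: the Kubernetes cluster, with manifests kept under `/deploy/kubernetes` and infrastructure defined in `/deploy/terraform`
- CI/CD: a pipeline that builds and tests container images on each merge, then applies the Terraform and Kubernetes manifests

</example>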
## Change Log

{ table of changes }
118
LEGACY-V1/ai/templates/prd-template.md
Normal file
@@ -0,0 +1,118 @@
# {Project Name} PRD

## Status: { Draft | Approved }

## Intro

{ Short 1-2 paragraph describing the what and why of what the PRD will achieve, as outlined in the project brief or through user questioning }

## Goals and Context

{
A short summarization of the project brief, with highlights on:

- Clear project objectives
- Measurable outcomes
- Success criteria
- Key performance indicators (KPIs)
}

## Features and Requirements

{

- Functional requirements
- Non-functional requirements
- User experience requirements
- Integration requirements
- Testing requirements
}

## Epic Story List

{ We will test fully before each story is complete, so there will be no dedicated Epic and stories at the end for testing }
### Epic 0: Initial Manual Set Up or Provisioning

- Stories or tasks the user might need to perform, such as registering or setting up an account, providing API keys, manually configuring some local resources like an LLM, etc...

### Epic-1: Current PRD Epic (for example backend epic)

#### Story 1: Title

Requirements:

- Do X
- Create Y
- Etc...

### Epic-2: Second Current PRD Epic (for example front end epic)

### Epic-N: Future Epic Enhancements (Beyond Scope of current PRD)

<example>

### Epic-1: My Cool App Can Retrieve Data

#### Story 1: Project and NestJS Set Up

Requirements:

- Install NestJS CLI Globally
- Create a new NestJS project with the NestJS CLI generator
- Test Start App Boilerplate Functionality
- Init Git Repo and commit initial project set up

#### Story 2: News Retrieval API Route

Requirements:

- Create API Route that returns a list of News and comments from the news source foo
- Route post body specifies the number of posts, articles, and comments to return
- Create a command in package.json that I can use to call the API Route (route configured in env.local)

</example>
## Technology Stack

{ Table listing choices for languages, libraries, infra, etc...}

<example>
| Technology | Version | Description |
| ---------- | ------- | ----------- |
| Kubernetes | x.y.z | Container orchestration platform for microservices deployment |
| Apache Kafka | x.y.z | Event streaming platform for real-time data ingestion |
| TimescaleDB | x.y.z | Time-series database for sensor data storage |
| Go | x.y.z | Primary language for data processing services |
| Gorilla Mux | x.y.z | REST API Framework |
| Python | x.y.z | Used for data analysis and ML services |
</example>

## Project Structure

{ Diagram the folder and file organization structure along with descriptions }

<example>

{ folder tree diagram }

</example>
### POST MVP / PRD Features

- Idea 1
- Idea 2
- ...
- Idea N

## Change Log

{ Markdown table of key changes after the document is no longer in draft and is updated; the table includes the change title, the story ID that the change happened during, and a description if the title is not clear enough }

<example>
| Change | Story ID | Description |
| -------------------- | -------- | ------------------------------------------------------------- |
| Initial draft | N/A | Initial draft PRD |
| Add ML Pipeline | story-4 | Integration of machine learning prediction service story |
| Kafka Upgrade | story-6 | Upgraded from Kafka 2.0 to Kafka 3.0 for improved performance |
</example>
53
LEGACY-V1/ai/templates/story-template.md
Normal file
@@ -0,0 +1,53 @@
# Story {N}: {Title}

## Story

**As a** {role}
**I want** {action}
**so that** {benefit}.

## Status

Draft OR In-Progress OR Complete

## Context

{A paragraph explaining the background, current state, and why this story is needed. Include any relevant technical context or business drivers.}

## Estimation

Story Points: {Story Points (1 SP=1 day of Human Development, or 10 minutes of AI development)}

## Acceptance Criteria

1. - [ ] {First criterion - ordered by logical progression}
2. - [ ] {Second criterion}
3. - [ ] {Third criterion}
{Use - [x] for completed items}
## Subtasks

1. - [ ] {Major Task Group 1}
   1. - [ ] {Subtask}
   2. - [ ] {Subtask}
   3. - [ ] {Subtask}
2. - [ ] {Major Task Group 2}
   1. - [ ] {Subtask}
   2. - [ ] {Subtask}
   3. - [ ] {Subtask}
{Use - [x] for completed items, - [-] for skipped/cancelled items}

## Testing Requirements

- Reiterate the required code coverage percentage (e.g., >= 85%).

## Story Wrap Up (To be filled in AFTER agent execution)

- **Agent Model Used:** `<Agent Model Name/Version>`
- **Agent Credit or Cost:** `<Cost/Credits Consumed>`
- **Date/Time Completed:** `<Timestamp>`
- **Commit Hash:** `<Git Commit Hash of resulting code>`
- **Change Log**
  - change X
  - change Y
  ...