mirror of
https://github.com/eyaltoledano/claude-task-master.git
synced 2026-01-30 06:12:05 +00:00
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: Cedric Hurst <cedric@spantree.net>
Closes #1555
This commit is contained in:
40  .changeset/task-metadata-field.md  Normal file
@@ -0,0 +1,40 @@
---
"task-master-ai": minor
---

Add optional `metadata` field to tasks for storing user-defined custom data

Tasks and subtasks now support an optional `metadata` field that allows storing arbitrary JSON data such as:

- External IDs (GitHub issues, Jira tickets, Linear issues)
- Workflow data (sprints, story points, custom statuses)
- Integration data (sync timestamps, external system references)
- Custom tracking (UUIDs, version numbers, audit information)

Key features:

- **AI-Safe**: Metadata is preserved through all AI operations (update-task, expand, etc.) because AI schemas intentionally exclude this field
- **Flexible Schema**: Store any JSON-serializable data without schema changes
- **Backward Compatible**: The field is optional; existing tasks work without modification
- **Subtask Support**: Both tasks and subtasks can have their own metadata
- **MCP Tool Support**: Use `update_task` and `update_subtask` with the `metadata` parameter to update metadata (requires `TASK_MASTER_ALLOW_METADATA_UPDATES=true` in MCP server environment)

Example usage:

```json
{
  "id": 1,
  "title": "Implement authentication",
  "metadata": {
    "githubIssue": 42,
    "sprint": "Q1-S3",
    "storyPoints": 5
  }
}
```

MCP metadata update example:

```javascript
// With TASK_MASTER_ALLOW_METADATA_UPDATES=true set in MCP env
update_task({
  id: "1",
  metadata: '{"githubIssue": 42, "sprint": "Q1-S3"}'
})
```
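The backward-compatibility guarantee above is easy to picture in code. A minimal sketch (the `getTaskMetadata` helper is hypothetical, not part of the task-master-ai API):

```javascript
// Hypothetical helper: given an array of tasks parsed from tasks.json,
// return a task's metadata, or an empty object when the field is absent.
function getTaskMetadata(tasks, taskId) {
  const task = tasks.find((t) => t.id === taskId);
  return (task && task.metadata) || {};
}

const tasks = [
  { id: 1, title: "Implement authentication", metadata: { githubIssue: 42 } },
  { id: 2, title: "Refactor API endpoints" } // no metadata field; still valid
];

getTaskMetadata(tasks, 1).githubIssue; // 42
getTaskMetadata(tasks, 2); // {}
```

Tasks written before this release simply omit the field, so readers that default to an empty object keep working unchanged.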
@@ -9,7 +9,7 @@ description: "Tasks in Task Master follow a specific format designed to provide
Tasks in tasks.json have the following structure:

| Field          | Description                                     | Example                                                 |
| -------------- | ----------------------------------------------- | ------------------------------------------------------- |
| `id`           | Unique identifier for the task.                 | `1`                                                     |
| `title`        | Brief, descriptive title.                       | `"Initialize Repo"`                                     |
| `description`  | What the task involves.                         | `"Create a new repository, set up initial structure."`  |
@@ -19,6 +19,7 @@ Tasks in tasks.json have the following structure:
| `details`      | Implementation instructions.                    | `"Use GitHub client ID/secret, handle callback..."`     |
| `testStrategy` | How to verify success.                          | `"Deploy and confirm 'Hello World' response."`          |
| `subtasks`     | Nested subtasks related to the main task.       | `[{"id": 1, "title": "Configure OAuth", ...}]`          |
| `metadata`     | Optional user-defined data (see below).         | `{"githubIssue": 42, "sprint": "Q1-S3"}`                |

## Task File Format
@@ -38,6 +39,158 @@ Individual task files follow this format:
<verification approach>
```

## User-Defined Metadata Field

The `metadata` field allows you to store arbitrary custom data on tasks without requiring schema changes. This is useful for:

- **External IDs**: Link tasks to GitHub issues, Jira tickets, Linear issues, etc.
- **Workflow data**: Track sprints, story points, custom statuses
- **Integration data**: Store sync timestamps, external system references
- **Custom tracking**: UUIDs, version numbers, audit information

### Key Characteristics

<CardGroup cols={2}>
<Card title="Fully Optional" icon="toggle-off">
The field is optional. Existing tasks work without it.
</Card>

<Card title="AI-Safe" icon="shield">
AI operations preserve your metadata - it's never overwritten by AI.
</Card>

<Card title="Flexible Schema" icon="shapes">
Store any JSON-serializable data: strings, numbers, objects, arrays.
</Card>

<Card title="Subtask Support" icon="list-tree">
Both tasks and subtasks can have their own metadata.
</Card>
</CardGroup>

### Usage Examples

**GitHub Issue Linking**

```json
{
  "id": 1,
  "title": "Implement authentication",
  "metadata": {
    "githubIssue": 42,
    "githubIssueUrl": "https://github.com/org/repo/issues/42"
  }
}
```

**Sprint & Project Management**

```json
{
  "id": 2,
  "title": "Refactor API endpoints",
  "metadata": {
    "sprint": "Q1-S3",
    "storyPoints": 5,
    "epic": "API Modernization"
  }
}
```

**External System Integration**

```json
{
  "id": 3,
  "title": "Fix login bug",
  "metadata": {
    "jira": {
      "key": "PROJ-123",
      "type": "bug",
      "priority": "P1"
    },
    "importedAt": "2024-01-15T10:30:00Z",
    "lastSyncedAt": "2024-01-20T14:00:00Z"
  }
}
```

**Stable UUID Tracking**

```json
{
  "id": 4,
  "title": "Add user preferences",
  "metadata": {
    "uuid": "550e8400-e29b-41d4-a716-446655440000",
    "version": 2,
    "createdBy": "import-script"
  }
}
```

<Warning>
**Security Note**: Do not store secrets, API keys, or sensitive credentials in the metadata field. Task data may be visible in logs, exports, or shared with AI providers.
</Warning>

### Metadata Behavior

| Operation        | Metadata Behavior                                            |
| ---------------- | ------------------------------------------------------------ |
| `parse-prd`      | New tasks are created without metadata                       |
| `update-task`    | Existing metadata is preserved unless explicitly changed     |
| `expand`         | Parent task metadata is preserved; subtasks don't inherit it |
| `update-subtask` | Subtask metadata is preserved                                |
| Manual edit      | You can add/modify metadata directly in tasks.json           |
| MCP (with flag)  | Use the `metadata` parameter to explicitly update metadata   |

### Updating Metadata via MCP

The `update_task` and `update_subtask` MCP tools support a `metadata` parameter for updating task metadata. This feature is disabled by default for safety.

**To enable MCP metadata updates:**

Add `TASK_MASTER_ALLOW_METADATA_UPDATES=true` to your MCP server environment configuration in `.mcp.json`:

```json
{
  "mcpServers": {
    "task-master-ai": {
      "command": "npx",
      "args": ["-y", "task-master-ai"],
      "env": {
        "TASK_MASTER_ALLOW_METADATA_UPDATES": "true",
        "ANTHROPIC_API_KEY": "your_key_here"
      }
    }
  }
}
```

**Usage example:**

```javascript
// Update task metadata (merges with existing)
update_task({
  id: "1",
  projectRoot: "/path/to/project",
  metadata: '{"githubIssue": 42, "sprint": "Q1-S3"}'
})

// Update only metadata (no prompt required)
update_task({
  id: "1",
  projectRoot: "/path/to/project",
  metadata: '{"status": "reviewed"}'
})
```

<Note>
The `metadata` parameter accepts a JSON string. The new metadata is merged with existing metadata, allowing you to update specific fields without losing others.
</Note>
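The merge semantics described in the note can be pictured as a shallow object merge (an illustration of the documented behavior, not the actual implementation):

```javascript
// Shallow merge: top-level keys in `incoming` overwrite matching keys in
// `existing`; all other existing keys are preserved.
function mergeMetadata(existing, incoming) {
  return { ...(existing || {}), ...(incoming || {}) };
}

const current = { githubIssue: 42, sprint: "Q1-S3" };
const updated = mergeMetadata(current, { status: "reviewed" });
// updated: { githubIssue: 42, sprint: "Q1-S3", status: "reviewed" }
```

This is why sending `{"status": "reviewed"}` does not wipe out `githubIssue` or `sprint`.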
## Features in Detail

<AccordionGroup>
@@ -56,7 +209,7 @@ The generated report contains:
- Recommended number of subtasks based on complexity
- AI-generated expansion prompts customized for each task
- Ready-to-run expansion commands directly within each task analysis
</Accordion>

<Accordion title="Viewing Complexity Report">
The `complexity-report` command:
@@ -67,7 +220,7 @@ The `complexity-report` command:
- Highlights tasks recommended for expansion based on threshold score
- Includes ready-to-use expansion commands for each complex task
- If no report exists, offers to generate one on the spot
</Accordion>

<Accordion title="Smart Task Expansion">
The `expand` command automatically checks for and uses the complexity report:
@@ -93,6 +246,7 @@ task-master expand --id=8
# or expand all tasks
task-master expand --all
```

</Accordion>

<Accordion title="Finding the Next Task">
@@ -108,7 +262,7 @@ The `next` command:
- Command to mark the task as in-progress
- Command to mark the task as done
- Commands for working with subtasks
</Accordion>

<Accordion title="Viewing Specific Task Details">
The `show` command:
@@ -119,7 +273,7 @@ The `show` command:
- For subtasks, shows parent task relationship
- Provides contextual action suggestions based on the task's state
- Works with both regular tasks and subtasks (using the format taskId.subtaskId)
</Accordion>
</AccordionGroup>

## Best Practices for AI-Driven Development
@@ -130,11 +284,13 @@ The `show` command:
</Card>

<Card title="👀 Review Tasks" icon="magnifying-glass">
After parsing the PRD, review the tasks to ensure they make sense and have appropriate dependencies.
</Card>

<Card title="📊 Analyze Complexity" icon="chart-line">
Use the complexity analysis feature to identify which tasks should be broken down further.
</Card>

<Card title="⛓️ Follow Dependencies" icon="link">
@@ -142,7 +298,8 @@ The `show` command:
</Card>

<Card title="🔄 Update As You Go" icon="arrows-rotate">
If your implementation diverges from the plan, use the update command to keep future tasks aligned.
</Card>

<Card title="📦 Break Down Tasks" icon="boxes-stacked">
@@ -150,14 +307,17 @@ The `show` command:
</Card>

<Card title="🔄 Regenerate Files" icon="file-arrow-up">
After any updates to tasks.json, regenerate the task files to keep them in sync.
</Card>

<Card title="💬 Provide Context" icon="comment">
When asking the Cursor agent to help with a task, provide context about what you're trying to achieve.
</Card>

<Card title="✅ Validate Dependencies" icon="circle-check">
Periodically run the validate-dependencies command to check for invalid or circular dependencies.
</Card>
</CardGroup>
@@ -442,3 +442,63 @@ export function withToolContext<TArgs extends { projectRoot?: string }>(
	}
	);
}

/**
 * Validates and parses metadata string for MCP tools.
 * Checks environment flag, validates JSON format, and ensures metadata is a plain object.
 *
 * @param metadataString - JSON string to parse and validate
 * @param errorResponseFn - Function to create error response
 * @returns Object with parsed metadata or error
 */
export function validateMcpMetadata(
	metadataString: string | null | undefined,
	errorResponseFn: (message: string) => ContentResult
): { parsedMetadata: Record<string, unknown> | null; error?: ContentResult } {
	// Return null if no metadata provided
	if (!metadataString) {
		return { parsedMetadata: null };
	}

	// Check if metadata updates are allowed via environment variable
	const allowMetadataUpdates =
		process.env.TASK_MASTER_ALLOW_METADATA_UPDATES === 'true';
	if (!allowMetadataUpdates) {
		return {
			parsedMetadata: null,
			error: errorResponseFn(
				'Metadata updates are disabled. Set TASK_MASTER_ALLOW_METADATA_UPDATES=true in your MCP server environment to enable metadata modifications.'
			)
		};
	}

	// Parse and validate JSON
	try {
		const parsedMetadata = JSON.parse(metadataString);

		// Ensure it's a plain object (not null, not array)
		if (
			typeof parsedMetadata !== 'object' ||
			parsedMetadata === null ||
			Array.isArray(parsedMetadata)
		) {
			return {
				parsedMetadata: null,
				error: errorResponseFn(
					'Invalid metadata: must be a JSON object (not null or array)'
				)
			};
		}

		return { parsedMetadata };
	} catch (parseError: unknown) {
		const message =
			parseError instanceof Error ? parseError.message : 'Unknown parse error';
		return {
			parsedMetadata: null,
			error: errorResponseFn(
				`Invalid metadata JSON: ${message}. Provide a valid JSON object string.`
			)
		};
	}
}
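Stripped of the MCP error-response plumbing, the validation rules the helper enforces reduce to a small predicate (a standalone sketch for illustration, not the exported function itself):

```javascript
// The metadata string must parse as JSON and the result must be a plain
// object: null, arrays, and scalars are all rejected.
function isValidMetadataString(metadataString) {
  try {
    const parsed = JSON.parse(metadataString);
    return (
      typeof parsed === "object" && parsed !== null && !Array.isArray(parsed)
    );
  } catch {
    return false;
  }
}

isValidMetadataString('{"githubIssue": 42}'); // true
isValidMetadataString("[1, 2, 3]"); // false (array)
isValidMetadataString("null"); // false (null)
isValidMetadataString("42"); // false (scalar)
```

The explicit `parsed === null` check matters because `typeof null` is `"object"` in JavaScript.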
@@ -17,8 +17,9 @@ import { createLogWrapper } from '../../tools/utils.js';
 * @param {Object} args - Command arguments containing id, prompt, useResearch, tasksJsonPath, and projectRoot.
 * @param {string} args.tasksJsonPath - Explicit path to the tasks.json file.
 * @param {string} args.id - Subtask ID in format "parent.sub".
 * @param {string} [args.prompt] - Information to append to the subtask. Required unless only updating metadata.
 * @param {boolean} [args.research] - Whether to use research role.
 * @param {Object} [args.metadata] - Parsed metadata object to merge into subtask metadata.
 * @param {string} [args.projectRoot] - Project root path.
 * @param {string} [args.tag] - Tag for the task (optional)
 * @param {Object} log - Logger object.
@@ -27,8 +28,9 @@ import { createLogWrapper } from '../../tools/utils.js';
 */
export async function updateSubtaskByIdDirect(args, log, context = {}) {
	const { session } = context;
	// Destructure expected args, including projectRoot and metadata
	const { tasksJsonPath, id, prompt, research, metadata, projectRoot, tag } =
		args;

	const logWrapper = createLogWrapper(log);

@@ -60,9 +62,10 @@ export async function updateSubtaskByIdDirect(args, log, context = {}) {
		};
	}

	// At least prompt or metadata is required (validated in MCP tool layer)
	if (!prompt && !metadata) {
		const errorMessage =
			'No prompt or metadata specified. Please provide information to append or metadata to update.';
		logWrapper.error(errorMessage);
		return {
			success: false,
@@ -77,7 +80,7 @@ export async function updateSubtaskByIdDirect(args, log, context = {}) {
	const useResearch = research === true;

	log.info(
		`Updating subtask with ID ${subtaskIdStr} with prompt "${prompt || '(metadata-only)'}" and research: ${useResearch}`
	);

	const wasSilent = isSilentMode();
@@ -98,7 +101,8 @@ export async function updateSubtaskByIdDirect(args, log, context = {}) {
			projectRoot,
			tag,
			commandName: 'update-subtask',
			outputType: 'mcp',
			metadata
		},
		'json'
	);
@@ -18,9 +18,10 @@ import { findTasksPath } from '../utils/path-utils.js';
 * @param {Object} args - Command arguments containing id, prompt, useResearch, tasksJsonPath, and projectRoot.
 * @param {string} args.tasksJsonPath - Explicit path to the tasks.json file.
 * @param {string} args.id - Task ID (or subtask ID like "1.2").
 * @param {string} [args.prompt] - New information/context prompt. Required unless only updating metadata.
 * @param {boolean} [args.research] - Whether to use research role.
 * @param {boolean} [args.append] - Whether to append timestamped information instead of full update.
 * @param {Object} [args.metadata] - Parsed metadata object to merge into task metadata.
 * @param {string} [args.projectRoot] - Project root path.
 * @param {string} [args.tag] - Tag for the task (optional)
 * @param {Object} log - Logger object.
@@ -29,9 +30,17 @@ import { findTasksPath } from '../utils/path-utils.js';
 */
export async function updateTaskByIdDirect(args, log, context = {}) {
	const { session } = context;
	// Destructure expected args, including projectRoot and metadata
	const {
		tasksJsonPath,
		id,
		prompt,
		research,
		append,
		metadata,
		projectRoot,
		tag
	} = args;

	const logWrapper = createLogWrapper(log);

@@ -51,9 +60,10 @@ export async function updateTaskByIdDirect(args, log, context = {}) {
		};
	}

	// At least prompt or metadata is required (validated in MCP tool layer)
	if (!prompt && !metadata) {
		const errorMessage =
			'No prompt or metadata specified. Please provide a prompt with new information or metadata for the task update.';
		logWrapper.error(errorMessage);
		return {
			success: false,
@@ -95,7 +105,7 @@ export async function updateTaskByIdDirect(args, log, context = {}) {
	const useResearch = research === true;

	logWrapper.info(
		`Updating task with ID ${taskId} with prompt "${prompt || '(metadata-only)'}" and research: ${useResearch}`
	);

	const wasSilent = isSilentMode();
@@ -116,7 +126,8 @@ export async function updateTaskByIdDirect(args, log, context = {}) {
			projectRoot,
			tag,
			commandName: 'update-task',
			outputType: 'mcp',
			metadata
		},
		'json',
		append || false
@@ -7,7 +7,8 @@ import { TaskIdSchemaForMcp } from '@tm/core';
import {
	createErrorResponse,
	handleApiResult,
	withNormalizedProjectRoot,
	validateMcpMetadata
} from '@tm/mcp';
import { z } from 'zod';
import { resolveTag } from '../../../scripts/modules/utils.js';
@@ -27,11 +28,22 @@ export function registerUpdateSubtaskTool(server) {
			id: TaskIdSchemaForMcp.describe(
				'ID of the subtask to update in format "parentId.subtaskId" (e.g., "5.2"). Parent ID is the ID of the task that contains the subtask.'
			),
			prompt: z
				.string()
				.optional()
				.describe(
					'Information to add to the subtask. Required unless only updating metadata.'
				),
			research: z
				.boolean()
				.optional()
				.describe('Use Perplexity AI for research-backed updates'),
			metadata: z
				.string()
				.optional()
				.describe(
					'JSON string of metadata to merge into subtask metadata. Example: \'{"ticketId": "JIRA-456", "reviewed": true}\'. Requires TASK_MASTER_ALLOW_METADATA_UPDATES=true in MCP environment.'
				),
			file: z.string().optional().describe('Absolute path to the tasks file'),
			projectRoot: z
				.string()
@@ -65,12 +77,29 @@ export function registerUpdateSubtaskTool(server) {
				);
			}

			// Validate metadata if provided
			const validationResult = validateMcpMetadata(
				args.metadata,
				createErrorResponse
			);
			if (validationResult.error) {
				return validationResult.error;
			}
			const parsedMetadata = validationResult.parsedMetadata;

			// Validate that at least prompt or metadata is provided
			if (!args.prompt && !parsedMetadata) {
				return createErrorResponse(
					'Either prompt or metadata must be provided for update-subtask'
				);
			}

			const result = await updateSubtaskByIdDirect(
				{
					tasksJsonPath: tasksJsonPath,
					id: args.id,
					prompt: args.prompt,
					research: args.research,
					metadata: parsedMetadata,
					projectRoot: args.projectRoot,
					tag: resolvedTag
				},
@@ -6,7 +6,8 @@
import {
	createErrorResponse,
	handleApiResult,
	withNormalizedProjectRoot,
	validateMcpMetadata
} from '@tm/mcp';
import { z } from 'zod';
import { resolveTag } from '../../../scripts/modules/utils.js';
@@ -30,7 +31,10 @@ export function registerUpdateTaskTool(server) {
			),
			prompt: z
				.string()
				.optional()
				.describe(
					'New information or context to incorporate into the task. Required unless only updating metadata.'
				),
			research: z
				.boolean()
				.optional()
@@ -41,6 +45,12 @@ export function registerUpdateTaskTool(server) {
				.describe(
					'Append timestamped information to task details instead of full update'
				),
			metadata: z
				.string()
				.optional()
				.describe(
					'JSON string of metadata to merge into task metadata. Example: \'{"githubIssue": 42, "sprint": "Q1-S3"}\'. Requires TASK_MASTER_ALLOW_METADATA_UPDATES=true in MCP environment.'
				),
			file: z.string().optional().describe('Absolute path to the tasks file'),
			projectRoot: z
				.string()
@@ -76,7 +86,23 @@ export function registerUpdateTaskTool(server) {
				);
			}

			// Validate metadata if provided
			const validationResult = validateMcpMetadata(
				args.metadata,
				createErrorResponse
			);
			if (validationResult.error) {
				return validationResult.error;
			}
			const parsedMetadata = validationResult.parsedMetadata;

			// Validate that at least prompt or metadata is provided
			if (!args.prompt && !parsedMetadata) {
				return createErrorResponse(
					'Either prompt or metadata must be provided for update-task'
				);
			}

			// Call Direct Function - Include projectRoot and metadata
			const result = await updateTaskByIdDirect(
				{
					tasksJsonPath: tasksJsonPath,
@@ -84,6 +110,7 @@ export function registerUpdateTaskTool(server) {
					prompt: args.prompt,
					research: args.research,
					append: args.append,
					metadata: parsedMetadata,
					projectRoot: args.projectRoot,
					tag: resolvedTag
				},
@@ -12,6 +12,10 @@ export interface UpdateBridgeParams extends BaseBridgeParams {
 	prompt: string;
 	/** Whether to append or full update (default: false) */
 	appendMode?: boolean;
+	/** Whether to use research mode (default: false) */
+	useResearch?: boolean;
+	/** Metadata to merge into task (for metadata-only updates or alongside prompt) */
+	metadata?: Record<string, unknown>;
 }

 /**
@@ -45,6 +49,8 @@ export async function tryUpdateViaRemote(
 	projectRoot,
 	tag,
 	appendMode = false,
+	useResearch = false,
+	metadata,
 	isMCP = false,
 	outputFormat = 'text',
 	report
@@ -76,7 +82,9 @@ export async function tryUpdateViaRemote(
 	try {
 		// Call the API storage method which handles the remote update
 		await tmCore.tasks.updateWithPrompt(String(taskId), prompt, tag, {
-			mode
+			mode,
+			useResearch,
+			...(metadata && { metadata })
 		});

 		if (spinner) {
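The `...(metadata && { metadata })` idiom in the hunk above adds the `metadata` key only when a value is actually present; spreading `undefined` or `false` is a no-op, so an absent value never reaches the remote call. A minimal standalone demonstration of that semantics (function name is illustrative, not from the codebase):

```javascript
// Demonstrates the conditional-spread idiom used in tryUpdateViaRemote:
// the metadata key is only present on the options object when metadata
// is truthy, because spreading a falsy value adds nothing.
function buildOptions(mode, useResearch, metadata) {
	return {
		mode,
		useResearch,
		...(metadata && { metadata })
	};
}
```

With metadata omitted, the resulting object has no `metadata` key at all, which is distinct from `metadata: undefined` and matters for JSON serialization and `in` checks.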
@@ -156,6 +156,13 @@ export interface Task extends TaskImplementationMetadata {
 	recommendedSubtasks?: number;
 	expansionPrompt?: string;
 	complexityReasoning?: string;
+
+	/**
+	 * User-defined metadata that survives all task operations.
+	 * Use for external IDs, custom workflow data, integrations, etc.
+	 * This field is preserved through AI operations, updates, and serialization.
+	 */
+	metadata?: Record<string, unknown>;
 }

 /**
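The reason this optional field survives updates without any special handling is plain object-spread semantics: spreading a task copies every property, including keys the updating code never mentions. A minimal sketch of that mechanism (the helper name is illustrative):

```javascript
// Object spread copies all own enumerable properties, so an update that
// doesn't mention metadata leaves the existing value in place, while an
// update that does mention it wins (later spread overrides earlier).
function applyUpdate(task, updates) {
	return { ...task, ...updates };
}
```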
@@ -283,6 +283,7 @@ export class FileStorage implements IStorage {

 	/**
 	 * Normalize task IDs - keep Task IDs as strings, Subtask IDs as numbers
+	 * Note: Uses spread operator to preserve all task properties including user-defined metadata
 	 */
 	private normalizeTaskIds(tasks: Task[]): Task[] {
 		return tasks.map((task) => ({
@@ -372,9 +373,37 @@ export class FileStorage implements IStorage {
 			throw new Error(`Task ${taskId} not found`);
 		}

+		const existingTask = tasks[taskIndex];
+
+		// Preserve subtask metadata when subtasks are updated
+		// AI operations don't include metadata in returned subtasks
+		let mergedSubtasks = updates.subtasks;
+		if (updates.subtasks && existingTask.subtasks) {
+			mergedSubtasks = updates.subtasks.map((updatedSubtask) => {
+				// Type-coerce IDs for comparison; fall back to title match if IDs don't match
+				const originalSubtask = existingTask.subtasks?.find(
+					(st) =>
+						String(st.id) === String(updatedSubtask.id) ||
+						(updatedSubtask.title && st.title === updatedSubtask.title)
+				);
+				// Merge metadata: preserve original and add/override with new
+				if (originalSubtask?.metadata || updatedSubtask.metadata) {
+					return {
+						...updatedSubtask,
+						metadata: {
+							...(originalSubtask?.metadata || {}),
+							...(updatedSubtask.metadata || {})
+						}
+					};
+				}
+				return updatedSubtask;
+			});
+		}
+
 		tasks[taskIndex] = {
-			...tasks[taskIndex],
+			...existingTask,
 			...updates,
+			...(mergedSubtasks && { subtasks: mergedSubtasks }),
 			id: String(taskId) // Keep consistent with normalizeTaskIds
 		};
 		await this.saveTasks(tasks, tag);
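The subtask merge added above can be isolated into a small standalone function, which makes the matching and override semantics easier to see: updated subtasks are matched to originals by string-coerced id (with title as a fallback), and original metadata keys survive unless the update explicitly overrides them. This re-implementation mirrors the hunk for illustration; it is not the exported API:

```javascript
// Standalone sketch of the subtask metadata merge performed inside
// FileStorage.updateTask above. Matching is id-based with string
// coercion (1 matches '1'), falling back to title equality.
function mergeSubtaskMetadata(originalSubtasks, updatedSubtasks) {
	return updatedSubtasks.map((updated) => {
		const original = originalSubtasks.find(
			(st) =>
				String(st.id) === String(updated.id) ||
				(updated.title && st.title === updated.title)
		);
		// Keep original keys, let updated keys override
		if (original?.metadata || updated.metadata) {
			return {
				...updated,
				metadata: {
					...(original?.metadata || {}),
					...(updated.metadata || {})
				}
			};
		}
		return updated;
	});
}
```

This is why AI-generated subtasks, which never carry a `metadata` field, do not wipe out metadata that an integration previously attached.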
packages/tm-core/src/modules/tasks/entities/task.entity.spec.ts (new file, 345 lines)
@@ -0,0 +1,345 @@
+/**
+ * @fileoverview Unit tests for TaskEntity metadata handling
+ *
+ * Tests the preservation of user-defined metadata through all TaskEntity operations
+ * including construction, serialization, and deserialization.
+ */
+
+import { describe, expect, it } from 'vitest';
+import { TaskEntity } from './task.entity.js';
+import type { Task } from '../../../common/types/index.js';
+
+/**
+ * Creates a minimal valid task for testing
+ */
+function createMinimalTask(overrides: Partial<Task> = {}): Task {
+	return {
+		id: '1',
+		title: 'Test Task',
+		description: 'Test description',
+		status: 'pending',
+		priority: 'medium',
+		dependencies: [],
+		details: 'Task details',
+		testStrategy: 'Test strategy',
+		subtasks: [],
+		...overrides
+	};
+}
+
+describe('TaskEntity', () => {
+	describe('metadata property', () => {
+		it('should preserve metadata through constructor', () => {
+			const metadata = { uuid: '123', custom: 'value' };
+			const task = createMinimalTask({ metadata });
+
+			const entity = new TaskEntity(task);
+
+			expect(entity.metadata).toEqual(metadata);
+		});
+
+		it('should handle undefined metadata', () => {
+			const task = createMinimalTask();
+			// Explicitly not setting metadata
+
+			const entity = new TaskEntity(task);
+
+			expect(entity.metadata).toBeUndefined();
+		});
+
+		it('should handle empty metadata object', () => {
+			const task = createMinimalTask({ metadata: {} });
+
+			const entity = new TaskEntity(task);
+
+			expect(entity.metadata).toEqual({});
+		});
+
+		it('should preserve metadata with string values', () => {
+			const metadata = { externalId: 'EXT-123', source: 'jira' };
+			const task = createMinimalTask({ metadata });
+
+			const entity = new TaskEntity(task);
+
+			expect(entity.metadata).toEqual(metadata);
+		});
+
+		it('should preserve metadata with number values', () => {
+			const metadata = { priority: 5, score: 100 };
+			const task = createMinimalTask({ metadata });
+
+			const entity = new TaskEntity(task);
+
+			expect(entity.metadata).toEqual(metadata);
+		});
+
+		it('should preserve metadata with boolean values', () => {
+			const metadata = { isBlocking: true, reviewed: false };
+			const task = createMinimalTask({ metadata });
+
+			const entity = new TaskEntity(task);
+
+			expect(entity.metadata).toEqual(metadata);
+		});
+
+		it('should preserve metadata with nested objects', () => {
+			const metadata = {
+				jira: {
+					key: 'PROJ-123',
+					sprint: {
+						id: 5,
+						name: 'Sprint 5'
+					}
+				}
+			};
+			const task = createMinimalTask({ metadata });
+
+			const entity = new TaskEntity(task);
+
+			expect(entity.metadata).toEqual(metadata);
+		});
+
+		it('should preserve metadata with arrays', () => {
+			const metadata = {
+				labels: ['bug', 'high-priority'],
+				relatedIds: [1, 2, 3]
+			};
+			const task = createMinimalTask({ metadata });
+
+			const entity = new TaskEntity(task);
+
+			expect(entity.metadata).toEqual(metadata);
+		});
+
+		it('should preserve metadata with null values', () => {
+			const metadata = { deletedAt: null, archivedBy: null };
+			const task = createMinimalTask({ metadata });
+
+			const entity = new TaskEntity(task);
+
+			expect(entity.metadata).toEqual(metadata);
+		});
+
+		it('should preserve complex mixed metadata', () => {
+			const metadata = {
+				externalId: 'EXT-456',
+				score: 85,
+				isUrgent: true,
+				tags: ['frontend', 'refactor'],
+				integration: {
+					source: 'github',
+					issueNumber: 123,
+					labels: ['enhancement']
+				},
+				timestamps: {
+					importedAt: '2024-01-15T10:00:00Z',
+					lastSynced: null
+				}
+			};
+			const task = createMinimalTask({ metadata });
+
+			const entity = new TaskEntity(task);
+
+			expect(entity.metadata).toEqual(metadata);
+		});
+	});
+
+	describe('toJSON() with metadata', () => {
+		it('should include metadata in toJSON output', () => {
+			const metadata = { uuid: '123', custom: 'value' };
+			const task = createMinimalTask({ metadata });
+			const entity = new TaskEntity(task);
+
+			const json = entity.toJSON();
+
+			expect(json.metadata).toEqual(metadata);
+		});
+
+		it('should include undefined metadata in toJSON output', () => {
+			const task = createMinimalTask();
+			const entity = new TaskEntity(task);
+
+			const json = entity.toJSON();
+
+			expect(json.metadata).toBeUndefined();
+		});
+
+		it('should include empty metadata object in toJSON output', () => {
+			const task = createMinimalTask({ metadata: {} });
+			const entity = new TaskEntity(task);
+
+			const json = entity.toJSON();
+
+			expect(json.metadata).toEqual({});
+		});
+
+		it('should preserve nested metadata through toJSON', () => {
+			const metadata = {
+				integration: {
+					source: 'linear',
+					config: {
+						apiKey: 'redacted',
+						projectId: 'proj_123'
+					}
+				}
+			};
+			const task = createMinimalTask({ metadata });
+			const entity = new TaskEntity(task);
+
+			const json = entity.toJSON();
+
+			expect(json.metadata).toEqual(metadata);
+		});
+	});
+
+	describe('round-trip preservation', () => {
+		it('should preserve metadata through full round-trip', () => {
+			const originalMetadata = {
+				uuid: '550e8400-e29b-41d4-a716-446655440000',
+				externalSystem: 'jira',
+				customField: { nested: 'value' }
+			};
+			const originalTask = createMinimalTask({ metadata: originalMetadata });
+
+			// Task -> TaskEntity -> toJSON() -> TaskEntity -> toJSON()
+			const entity1 = new TaskEntity(originalTask);
+			const json1 = entity1.toJSON();
+			const entity2 = new TaskEntity(json1);
+			const json2 = entity2.toJSON();
+
+			expect(json2.metadata).toEqual(originalMetadata);
+		});
+
+		it('should preserve all task fields alongside metadata', () => {
+			const metadata = { custom: 'data' };
+			const task = createMinimalTask({
+				id: '42',
+				title: 'Important Task',
+				description: 'Do the thing',
+				status: 'in-progress',
+				priority: 'high',
+				dependencies: ['1', '2'],
+				details: 'Detailed info',
+				testStrategy: 'Unit tests',
+				tags: ['urgent'],
+				metadata
+			});
+
+			const entity = new TaskEntity(task);
+			const json = entity.toJSON();
+
+			expect(json.id).toBe('42');
+			expect(json.title).toBe('Important Task');
+			expect(json.description).toBe('Do the thing');
+			expect(json.status).toBe('in-progress');
+			expect(json.priority).toBe('high');
+			expect(json.dependencies).toEqual(['1', '2']);
+			expect(json.details).toBe('Detailed info');
+			expect(json.testStrategy).toBe('Unit tests');
+			expect(json.tags).toEqual(['urgent']);
+			expect(json.metadata).toEqual(metadata);
+		});
+	});
+
+	describe('fromObject() with metadata', () => {
+		it('should preserve metadata through fromObject', () => {
+			const metadata = { externalId: 'EXT-789' };
+			const task = createMinimalTask({ metadata });
+
+			const entity = TaskEntity.fromObject(task);
+
+			expect(entity.metadata).toEqual(metadata);
+		});
+
+		it('should handle undefined metadata in fromObject', () => {
+			const task = createMinimalTask();
+
+			const entity = TaskEntity.fromObject(task);
+
+			expect(entity.metadata).toBeUndefined();
+		});
+	});
+
+	describe('fromArray() with metadata', () => {
+		it('should preserve metadata on all tasks through fromArray', () => {
+			const task1 = createMinimalTask({
+				id: '1',
+				metadata: { source: 'import1' }
+			});
+			const task2 = createMinimalTask({
+				id: '2',
+				metadata: { source: 'import2' }
+			});
+			const task3 = createMinimalTask({ id: '3' }); // No metadata
+
+			const entities = TaskEntity.fromArray([task1, task2, task3]);
+
+			expect(entities).toHaveLength(3);
+			expect(entities[0].metadata).toEqual({ source: 'import1' });
+			expect(entities[1].metadata).toEqual({ source: 'import2' });
+			expect(entities[2].metadata).toBeUndefined();
+		});
+
+		it('should preserve different metadata structures across tasks', () => {
+			const tasks = [
+				createMinimalTask({ id: '1', metadata: { simple: 'value' } }),
+				createMinimalTask({
+					id: '2',
+					metadata: { nested: { deep: { value: 123 } } }
+				}),
+				createMinimalTask({ id: '3', metadata: { array: [1, 2, 3] } }),
+				createMinimalTask({ id: '4', metadata: {} })
+			];
+
+			const entities = TaskEntity.fromArray(tasks);
+			const jsons = entities.map((e) => e.toJSON());
+
+			expect(jsons[0].metadata).toEqual({ simple: 'value' });
+			expect(jsons[1].metadata).toEqual({ nested: { deep: { value: 123 } } });
+			expect(jsons[2].metadata).toEqual({ array: [1, 2, 3] });
+			expect(jsons[3].metadata).toEqual({});
+		});
+	});
+
+	describe('no corruption of other fields', () => {
+		it('should not affect other task fields when metadata is present', () => {
+			const taskWithMetadata = createMinimalTask({
+				id: '99',
+				title: 'Original Title',
+				metadata: { someKey: 'someValue' }
+			});
+
+			const entity = new TaskEntity(taskWithMetadata);
+
+			expect(entity.id).toBe('99');
+			expect(entity.title).toBe('Original Title');
+			expect(entity.status).toBe('pending');
+			expect(entity.priority).toBe('medium');
+		});
+
+		it('should not affect subtasks when metadata is present', () => {
+			const taskWithSubtasks = createMinimalTask({
+				metadata: { tracked: true },
+				subtasks: [
+					{
+						id: 1,
+						parentId: '1',
+						title: 'Subtask 1',
+						description: 'Subtask desc',
+						status: 'pending',
+						priority: 'low',
+						dependencies: [],
+						details: '',
+						testStrategy: ''
+					}
+				]
+			});
+
+			const entity = new TaskEntity(taskWithSubtasks);
+
+			expect(entity.subtasks).toHaveLength(1);
+			expect(entity.subtasks[0].title).toBe('Subtask 1');
+			expect(entity.metadata).toEqual({ tracked: true });
+		});
+	});
+});
@@ -36,6 +36,7 @@ export class TaskEntity implements Task {
 	recommendedSubtasks?: number;
 	expansionPrompt?: string;
 	complexityReasoning?: string;
+	metadata?: Record<string, unknown>;

 	constructor(data: Task | (Omit<Task, 'id'> & { id: number | string })) {
 		this.validate(data);
@@ -68,6 +69,7 @@ export class TaskEntity implements Task {
 		this.recommendedSubtasks = data.recommendedSubtasks;
 		this.expansionPrompt = data.expansionPrompt;
 		this.complexityReasoning = data.complexityReasoning;
+		this.metadata = data.metadata;
 	}

 	/**
@@ -255,7 +257,8 @@ export class TaskEntity implements Task {
 			complexity: this.complexity,
 			recommendedSubtasks: this.recommendedSubtasks,
 			expansionPrompt: this.expansionPrompt,
-			complexityReasoning: this.complexityReasoning
+			complexityReasoning: this.complexityReasoning,
+			metadata: this.metadata
 		};
 	}
@@ -0,0 +1,481 @@
+/**
+ * @fileoverview Integration tests for metadata preservation across AI operations
+ *
+ * Tests that user-defined metadata survives all AI operations including:
+ * - update-task: AI updates task fields but doesn't include metadata in response
+ * - expand-task: AI generates subtasks but parent task metadata is preserved
+ * - parse-prd: AI generates new tasks without metadata field
+ *
+ * Key insight: AI schemas (base-schemas.js) intentionally EXCLUDE the metadata field.
+ * This means AI responses never include metadata, and the spread operator in
+ * storage/service layers preserves existing metadata during updates.
+ *
+ * These tests simulate what happens when AI operations update tasks - the AI
+ * returns a task object without a metadata field, and we verify that the
+ * existing metadata is preserved through the storage layer.
+ */
+
+import { afterEach, beforeEach, describe, expect, it } from 'vitest';
+import * as fs from 'fs';
+import * as path from 'path';
+import * as os from 'os';
+import { FileStorage } from '../../../src/modules/storage/adapters/file-storage/file-storage.js';
+import type { Task, Subtask } from '../../../src/common/types/index.js';
+
+/**
+ * Creates a minimal valid task for testing
+ */
+function createTask(id: string, overrides: Partial<Task> = {}): Task {
+	return {
+		id,
+		title: `Task ${id}`,
+		description: `Description for task ${id}`,
+		status: 'pending',
+		priority: 'medium',
+		dependencies: [],
+		details: '',
+		testStrategy: '',
+		subtasks: [],
+		...overrides
+	};
+}
+
+/**
+ * Creates a realistic metadata object like external integrations would produce
+ */
+function createRealisticMetadata(): Record<string, unknown> {
+	return {
+		uuid: '550e8400-e29b-41d4-a716-446655440000',
+		githubIssue: 42,
+		sprint: 'Q1-S3',
+		jira: {
+			key: 'PROJ-123',
+			type: 'story',
+			epic: 'EPIC-45'
+		},
+		importedAt: '2024-01-15T10:30:00Z',
+		source: 'github-sync',
+		labels: ['frontend', 'refactor', 'high-priority']
+	};
+}
+
+describe('AI Operation Metadata Preservation - Integration Tests', () => {
+	let tempDir: string;
+	let storage: FileStorage;
+
+	beforeEach(() => {
+		// Create a temp directory for each test
+		tempDir = fs.mkdtempSync(path.join(os.tmpdir(), 'taskmaster-ai-test-'));
+		// Create .taskmaster/tasks directory structure
+		const taskmasterDir = path.join(tempDir, '.taskmaster', 'tasks');
+		fs.mkdirSync(taskmasterDir, { recursive: true });
+		storage = new FileStorage(tempDir);
+	});
+
+	afterEach(() => {
+		// Clean up temp directory
+		fs.rmSync(tempDir, { recursive: true, force: true });
+	});
+
+	describe('update-task operation simulation', () => {
+		it('should preserve metadata when AI returns task without metadata field', async () => {
+			// Setup: Task with user metadata
+			const originalMetadata = createRealisticMetadata();
+			const tasks: Task[] = [
+				createTask('1', {
+					title: 'Original Title',
+					description: 'Original description',
+					metadata: originalMetadata
+				})
+			];
+			await storage.saveTasks(tasks);
+
+			// Simulate AI response: AI updates title/description but doesn't include metadata
+			// This is the exact pattern from update-task-by-id.js
+			const aiGeneratedUpdate: Partial<Task> = {
+				title: 'AI Updated Title',
+				description: 'AI refined description with more detail',
+				details: 'AI generated implementation details',
+				testStrategy: 'AI suggested test approach'
+				// Note: NO metadata field - AI schemas don't include it
+			};
+
+			// Apply update through FileStorage (simulating what AI operations do)
+			await storage.updateTask('1', aiGeneratedUpdate);
+
+			// Verify: AI fields updated, metadata preserved
+			const loadedTasks = await storage.loadTasks();
+			expect(loadedTasks[0].title).toBe('AI Updated Title');
+			expect(loadedTasks[0].description).toBe(
+				'AI refined description with more detail'
+			);
+			expect(loadedTasks[0].details).toBe(
+				'AI generated implementation details'
+			);
+			expect(loadedTasks[0].testStrategy).toBe('AI suggested test approach');
+			// Critical: metadata must be preserved
+			expect(loadedTasks[0].metadata).toEqual(originalMetadata);
+		});
+
+		it('should preserve metadata through multiple sequential AI updates', async () => {
+			const metadata = { externalId: 'EXT-999', version: 1 };
+			const tasks: Task[] = [createTask('1', { metadata })];
+			await storage.saveTasks(tasks);
+
+			// First AI update
+			await storage.updateTask('1', { title: 'First AI Update' });
+
+			// Second AI update
+			await storage.updateTask('1', {
+				description: 'Second AI Update adds details'
+			});
+
+			// Third AI update
+			await storage.updateTask('1', { priority: 'high' });
+
+			// Verify metadata survived all updates
+			const loadedTasks = await storage.loadTasks();
+			expect(loadedTasks[0].title).toBe('First AI Update');
+			expect(loadedTasks[0].description).toBe('Second AI Update adds details');
+			expect(loadedTasks[0].priority).toBe('high');
+			expect(loadedTasks[0].metadata).toEqual(metadata);
+		});
+
+		it('should preserve realistic integration metadata during AI operations', async () => {
+			const realisticMetadata = createRealisticMetadata();
+			const tasks: Task[] = [
+				createTask('1', {
+					title: 'Sync from GitHub',
+					metadata: realisticMetadata
+				})
+			];
+			await storage.saveTasks(tasks);
+
+			// AI enriches the task
+			await storage.updateTask('1', {
+				title: 'Implement user authentication',
+				description: 'Set up JWT-based authentication system',
+				details: `
+## Implementation Plan
+1. Create auth middleware
+2. Implement JWT token generation
+3. Add refresh token logic
+4. Set up protected routes
+`.trim(),
+				testStrategy:
+					'Unit tests for JWT functions, integration tests for auth flow'
+			});
+
+			const loadedTasks = await storage.loadTasks();
+			// All AI updates applied
+			expect(loadedTasks[0].title).toBe('Implement user authentication');
+			expect(loadedTasks[0].details).toContain('Implementation Plan');
+			// Realistic metadata preserved with all its nested structure
+			expect(loadedTasks[0].metadata).toEqual(realisticMetadata);
+			expect(
+				(loadedTasks[0].metadata as Record<string, unknown>).githubIssue
+			).toBe(42);
+			expect(
+				(
+					(loadedTasks[0].metadata as Record<string, unknown>).jira as Record<
+						string,
+						unknown
+					>
+				).key
+			).toBe('PROJ-123');
+		});
+	});
+
+	describe('expand-task operation simulation', () => {
+		it('should preserve parent task metadata when adding AI-generated subtasks', async () => {
+			const parentMetadata = { tracked: true, source: 'import' };
+			const tasks: Task[] = [
+				createTask('1', {
+					metadata: parentMetadata,
+					subtasks: []
+				})
+			];
+			await storage.saveTasks(tasks);
+
+			// Simulate expand-task: AI generates subtasks (without metadata)
+			const aiGeneratedSubtasks: Subtask[] = [
+				{
+					id: 1,
+					parentId: '1',
+					title: 'AI Subtask 1',
+					description: 'First step generated by AI',
+					status: 'pending',
+					priority: 'medium',
+					dependencies: [],
+					details: 'Implementation details',
+					testStrategy: 'Test approach'
+					// No metadata - AI doesn't generate it
+				},
+				{
+					id: 2,
+					parentId: '1',
+					title: 'AI Subtask 2',
+					description: 'Second step generated by AI',
+					status: 'pending',
+					priority: 'medium',
+					dependencies: ['1'],
+					details: 'More details',
+					testStrategy: 'More tests'
+				}
+			];
+
+			// Apply subtasks update
+			await storage.updateTask('1', { subtasks: aiGeneratedSubtasks });
+
+			// Verify parent metadata preserved
+			const loadedTasks = await storage.loadTasks();
+			expect(loadedTasks[0].metadata).toEqual(parentMetadata);
+			expect(loadedTasks[0].subtasks).toHaveLength(2);
+			// Subtasks don't inherit parent metadata
+			expect(loadedTasks[0].subtasks[0].metadata).toBeUndefined();
+			expect(loadedTasks[0].subtasks[1].metadata).toBeUndefined();
+		});
+
+		it('should preserve subtask metadata when parent is updated', async () => {
+			const tasks: Task[] = [
+				createTask('1', {
+					metadata: { parentMeta: 'parent-value' },
+					subtasks: [
+						{
+							id: 1,
+							parentId: '1',
+							title: 'Subtask with metadata',
+							description: 'Has its own metadata',
+							status: 'pending',
+							priority: 'medium',
+							dependencies: [],
+							details: '',
+							testStrategy: '',
+							metadata: { subtaskMeta: 'subtask-value' }
+						}
+					]
+				})
+			];
+			await storage.saveTasks(tasks);
+
+			// AI updates parent task (not subtasks)
+			await storage.updateTask('1', {
+				title: 'Parent Updated by AI',
+				description: 'New description'
+			});
+
+			const loadedTasks = await storage.loadTasks();
+			// Parent metadata preserved
+			expect(loadedTasks[0].metadata).toEqual({ parentMeta: 'parent-value' });
+			// Subtask and its metadata preserved
+			expect(loadedTasks[0].subtasks[0].metadata).toEqual({
+				subtaskMeta: 'subtask-value'
+			});
+		});
+	});
+
+	describe('parse-prd operation simulation', () => {
+		it('should generate tasks without metadata field (as AI would)', async () => {
+			// Simulate parse-prd output: AI generates tasks without metadata
+			const aiGeneratedTasks: Task[] = [
+				{
+					id: '1',
+					title: 'Set up project structure',
+					description: 'Initialize the project with proper folder structure',
+					status: 'pending',
+					priority: 'high',
+					dependencies: [],
+					details: 'Create src/, tests/, docs/ directories',
+					testStrategy: 'Verify directories exist',
+					subtasks: []
+					// No metadata - AI doesn't generate it
+				},
+				{
+					id: '2',
+					title: 'Implement core functionality',
+					description: 'Build the main features',
+					status: 'pending',
+					priority: 'high',
+					dependencies: ['1'],
+					details: 'Implement main modules',
+					testStrategy: 'Unit tests for each module',
+					subtasks: []
+				}
+			];
+
+			await storage.saveTasks(aiGeneratedTasks);
+
+			// Verify tasks saved correctly without metadata
+			const loadedTasks = await storage.loadTasks();
+			expect(loadedTasks).toHaveLength(2);
+			expect(loadedTasks[0].metadata).toBeUndefined();
+			expect(loadedTasks[1].metadata).toBeUndefined();
+			// Later, user can add metadata
+			await storage.updateTask('1', {
+				metadata: { externalId: 'USER-ADDED-123' }
+			});
+			const updatedTasks = await storage.loadTasks();
+			expect(updatedTasks[0].metadata).toEqual({
+				externalId: 'USER-ADDED-123'
+			});
+		});
+	});
+
+	describe('update-subtask operation simulation', () => {
+		it('should preserve subtask metadata when appending info', async () => {
+			const tasks: Task[] = [
+				createTask('1', {
+					subtasks: [
+						{
+							id: 1,
+							parentId: '1',
+							title: 'Tracked subtask',
+							description: 'Has metadata from import',
+							status: 'pending',
+							priority: 'medium',
+							dependencies: [],
+							details: 'Initial details',
+							testStrategy: '',
+							metadata: { importedFrom: 'jira', ticketId: 'JIRA-456' }
+						}
+					]
+				})
+			];
+			await storage.saveTasks(tasks);
+
+			// Update subtask details (like update-subtask command does)
+			const updatedSubtask: Subtask = {
+				id: 1,
+				parentId: '1',
+				title: 'Tracked subtask',
+				description: 'Has metadata from import',
+				status: 'in-progress',
+				priority: 'medium',
+				dependencies: [],
+				details:
+					'Initial details\n\n<info added on 2024-01-20T10:00:00Z>\nImplementation notes from AI\n</info added on 2024-01-20T10:00:00Z>',
+				testStrategy: 'AI suggested tests',
+				metadata: { importedFrom: 'jira', ticketId: 'JIRA-456' }
+			};
+
+			await storage.updateTask('1', { subtasks: [updatedSubtask] });
+
+			const loadedTasks = await storage.loadTasks();
+			expect(loadedTasks[0].subtasks[0].metadata).toEqual({
+				importedFrom: 'jira',
+				ticketId: 'JIRA-456'
+			});
+			expect(loadedTasks[0].subtasks[0].details).toContain(
+				'Implementation notes from AI'
+			);
+		});
+	});
+
+	describe('mixed AI and storage metadata coexistence', () => {
+		it('should preserve user metadata alongside AI-generated task fields', async () => {
+			const tasks: Task[] = [
+				createTask('1', {
+					// AI-generated fields
+					relevantFiles: [
+						{
+							path: 'src/auth.ts',
+							description: 'Auth module',
+							action: 'modify'
+						}
+					],
+					category: 'development',
|
||||||
|
skills: ['TypeScript', 'Security'],
|
||||||
|
acceptanceCriteria: ['Tests pass', 'Code reviewed'],
|
||||||
|
// User-defined metadata (from import)
|
||||||
|
metadata: {
|
||||||
|
externalId: 'JIRA-789',
|
||||||
|
storyPoints: 5,
|
||||||
|
sprint: 'Sprint 10'
|
||||||
|
}
|
||||||
|
})
|
||||||
|
];
|
||||||
|
await storage.saveTasks(tasks);
|
||||||
|
|
||||||
|
// AI updates the task (doesn't touch metadata)
|
||||||
|
await storage.updateTask('1', {
|
||||||
|
relevantFiles: [
|
||||||
|
{ path: 'src/auth.ts', description: 'Auth module', action: 'modify' },
|
||||||
|
{
|
||||||
|
path: 'src/middleware.ts',
|
||||||
|
description: 'Added middleware',
|
||||||
|
action: 'create'
|
||||||
|
}
|
||||||
|
],
|
||||||
|
skills: ['TypeScript', 'Security', 'JWT']
|
||||||
|
});
|
||||||
|
|
||||||
|
const loadedTasks = await storage.loadTasks();
|
||||||
|
// AI fields updated
|
||||||
|
expect(loadedTasks[0].relevantFiles).toHaveLength(2);
|
||||||
|
expect(loadedTasks[0].skills).toContain('JWT');
|
||||||
|
// User metadata preserved
|
||||||
|
expect(loadedTasks[0].metadata).toEqual({
|
||||||
|
externalId: 'JIRA-789',
|
||||||
|
storyPoints: 5,
|
||||||
|
sprint: 'Sprint 10'
|
||||||
|
});
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('edge cases for AI operations', () => {
|
||||||
|
it('should handle task with only metadata being updated by AI', async () => {
|
||||||
|
// Task has ONLY metadata set (sparse task)
|
||||||
|
const tasks: Task[] = [
|
||||||
|
createTask('1', {
|
||||||
|
metadata: { sparse: true, tracking: 'minimal' }
|
||||||
|
})
|
||||||
|
];
|
||||||
|
await storage.saveTasks(tasks);
|
||||||
|
|
||||||
|
// AI fills in all the other fields
|
||||||
|
await storage.updateTask('1', {
|
||||||
|
title: 'AI Generated Title',
|
||||||
|
description: 'AI Generated Description',
|
||||||
|
details: 'AI Generated Details',
|
||||||
|
testStrategy: 'AI Generated Test Strategy',
|
||||||
|
priority: 'high'
|
||||||
|
});
|
||||||
|
|
||||||
|
const loadedTasks = await storage.loadTasks();
|
||||||
|
expect(loadedTasks[0].title).toBe('AI Generated Title');
|
||||||
|
expect(loadedTasks[0].priority).toBe('high');
|
||||||
|
expect(loadedTasks[0].metadata).toEqual({
|
||||||
|
sparse: true,
|
||||||
|
tracking: 'minimal'
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should preserve deeply nested metadata through AI operations', async () => {
|
||||||
|
const deepMetadata = {
|
||||||
|
integration: {
|
||||||
|
source: {
|
||||||
|
type: 'github',
|
||||||
|
repo: {
|
||||||
|
owner: 'org',
|
||||||
|
name: 'repo',
|
||||||
|
issue: {
|
||||||
|
number: 123,
|
||||||
|
labels: ['bug', 'priority-1']
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
};
|
||||||
|
const tasks: Task[] = [createTask('1', { metadata: deepMetadata })];
|
||||||
|
await storage.saveTasks(tasks);
|
||||||
|
|
||||||
|
// Multiple AI operations
|
||||||
|
await storage.updateTask('1', { title: 'Update 1' });
|
||||||
|
await storage.updateTask('1', { description: 'Update 2' });
|
||||||
|
await storage.updateTask('1', { status: 'in-progress' });
|
||||||
|
|
||||||
|
const loadedTasks = await storage.loadTasks();
|
||||||
|
expect(loadedTasks[0].metadata).toEqual(deepMetadata);
|
||||||
|
});
|
||||||
|
});
|
||||||
|
});
|
||||||
@@ -0,0 +1,540 @@
/**
 * @fileoverview Integration tests for MCP tool metadata updates
 *
 * Tests that metadata updates via update-task and update-subtask MCP tools
 * work correctly with the TASK_MASTER_ALLOW_METADATA_UPDATES flag.
 *
 * These tests validate the metadata flow from the MCP tool layer through
 * direct functions to the legacy scripts and storage layer.
 *
 * NOTE: These tests focus on validation logic (JSON parsing, env flags, merge behavior)
 * rather than full end-to-end MCP tool calls. End-to-end behavior is covered by:
 * - FileStorage metadata tests (storage layer)
 * - AI operation metadata preservation tests (full workflow)
 * - Direct function integration (covered by the validation tests here)
 */

import { afterEach, beforeEach, describe, expect, it } from 'vitest';
import * as fs from 'fs';
import * as path from 'path';
import * as os from 'os';
import { validateMcpMetadata } from '@tm/mcp';

describe('MCP Tool Metadata Updates - Integration Tests', () => {
	let tempDir: string;
	let tasksJsonPath: string;

	beforeEach(() => {
		// Create a temp directory for each test
		tempDir = fs.mkdtempSync(path.join(os.tmpdir(), 'taskmaster-mcp-test-'));
		// Create .taskmaster/tasks directory structure
		const taskmasterDir = path.join(tempDir, '.taskmaster', 'tasks');
		fs.mkdirSync(taskmasterDir, { recursive: true });
		tasksJsonPath = path.join(taskmasterDir, 'tasks.json');
	});

	afterEach(() => {
		// Clean up temp directory
		fs.rmSync(tempDir, { recursive: true, force: true });
		// Reset env vars
		delete process.env.TASK_MASTER_ALLOW_METADATA_UPDATES;
	});

	describe('metadata JSON validation', () => {
		it('should validate metadata is a valid JSON object', () => {
			// Test valid JSON objects
			const validMetadata = [
				'{"key": "value"}',
				'{"githubIssue": 42, "sprint": "Q1"}',
				'{"nested": {"deep": true}}'
			];

			for (const meta of validMetadata) {
				const parsed = JSON.parse(meta);
				expect(typeof parsed).toBe('object');
				expect(parsed).not.toBeNull();
				expect(Array.isArray(parsed)).toBe(false);
			}
		});

		it('should reject invalid metadata formats', () => {
			const invalidMetadata = [
				'"string"', // Just a string
				'123', // Just a number
				'true', // Just a boolean
				'null', // Null
				'[1, 2, 3]' // Array
			];

			for (const meta of invalidMetadata) {
				const parsed = JSON.parse(meta);
				const isValidObject =
					typeof parsed === 'object' &&
					parsed !== null &&
					!Array.isArray(parsed);
				expect(isValidObject).toBe(false);
			}
		});

		it('should reject invalid JSON strings', () => {
			const invalidJson = [
				'{key: "value"}', // Missing quotes
				"{'key': 'value'}", // Single quotes
				'{"key": }' // Incomplete
			];

			for (const json of invalidJson) {
				expect(() => JSON.parse(json)).toThrow();
			}
		});
	});

	describe('TASK_MASTER_ALLOW_METADATA_UPDATES flag', () => {
		it('should block metadata updates when flag is not set', () => {
			delete process.env.TASK_MASTER_ALLOW_METADATA_UPDATES;
			const allowMetadataUpdates =
				process.env.TASK_MASTER_ALLOW_METADATA_UPDATES === 'true';
			expect(allowMetadataUpdates).toBe(false);
		});

		it('should block metadata updates when flag is set to false', () => {
			process.env.TASK_MASTER_ALLOW_METADATA_UPDATES = 'false';
			const allowMetadataUpdates =
				process.env.TASK_MASTER_ALLOW_METADATA_UPDATES === 'true';
			expect(allowMetadataUpdates).toBe(false);
		});

		it('should allow metadata updates when flag is set to true', () => {
			process.env.TASK_MASTER_ALLOW_METADATA_UPDATES = 'true';
			const allowMetadataUpdates =
				process.env.TASK_MASTER_ALLOW_METADATA_UPDATES === 'true';
			expect(allowMetadataUpdates).toBe(true);
		});

		it('should be case-sensitive (TRUE should not work)', () => {
			process.env.TASK_MASTER_ALLOW_METADATA_UPDATES = 'TRUE';
			const allowMetadataUpdates =
				process.env.TASK_MASTER_ALLOW_METADATA_UPDATES === 'true';
			expect(allowMetadataUpdates).toBe(false);
		});
	});

	describe('metadata merge logic', () => {
		it('should merge new metadata with existing metadata', () => {
			const existingMetadata = { githubIssue: 42, sprint: 'Q1' };
			const newMetadata = { storyPoints: 5, reviewed: true };

			const merged = {
				...(existingMetadata || {}),
				...(newMetadata || {})
			};

			expect(merged).toEqual({
				githubIssue: 42,
				sprint: 'Q1',
				storyPoints: 5,
				reviewed: true
			});
		});

		it('should override existing keys with new values', () => {
			const existingMetadata = { githubIssue: 42, sprint: 'Q1' };
			const newMetadata = { sprint: 'Q2' }; // Override sprint

			const merged = {
				...(existingMetadata || {}),
				...(newMetadata || {})
			};

			expect(merged).toEqual({
				githubIssue: 42,
				sprint: 'Q2' // Overridden
			});
		});

		it('should handle empty existing metadata', () => {
			const existingMetadata = undefined;
			const newMetadata = { key: 'value' };

			const merged = {
				...(existingMetadata || {}),
				...(newMetadata || {})
			};

			expect(merged).toEqual({ key: 'value' });
		});

		it('should handle empty new metadata', () => {
			const existingMetadata = { key: 'value' };
			const newMetadata = undefined;

			const merged = {
				...(existingMetadata || {}),
				...(newMetadata || {})
			};

			expect(merged).toEqual({ key: 'value' });
		});

		it('should preserve nested objects in metadata', () => {
			const existingMetadata = {
				jira: { key: 'PROJ-123' },
				other: 'data'
			};
			const newMetadata = {
				jira: { key: 'PROJ-456', type: 'bug' } // Replace entire jira object
			};

			const merged = {
				...(existingMetadata || {}),
				...(newMetadata || {})
			};

			expect(merged).toEqual({
				jira: { key: 'PROJ-456', type: 'bug' }, // Entire jira object replaced
				other: 'data'
			});
		});
	});

	describe('metadata-only update detection', () => {
		it('should detect metadata-only update when prompt is empty', () => {
			const prompt: string = '';
			const metadata = { key: 'value' };

			const isMetadataOnly = metadata && (!prompt || prompt.trim() === '');
			expect(isMetadataOnly).toBe(true);
		});

		it('should detect metadata-only update when prompt is whitespace', () => {
			const prompt: string = ' ';
			const metadata = { key: 'value' };

			const isMetadataOnly = metadata && (!prompt || prompt.trim() === '');
			expect(isMetadataOnly).toBe(true);
		});

		it('should not be metadata-only when prompt is provided', () => {
			const prompt: string = 'Update task details';
			const metadata = { key: 'value' };

			const isMetadataOnly = metadata && (!prompt || prompt.trim() === '');
			expect(isMetadataOnly).toBe(false);
		});

		it('should not be metadata-only when neither is provided', () => {
			const prompt: string = '';
			const metadata = null;

			const isMetadataOnly = metadata && (!prompt || prompt.trim() === '');
			expect(isMetadataOnly).toBeFalsy(); // metadata is null, so falsy
		});
	});

	describe('tasks.json file format with metadata', () => {
		it('should write and read metadata correctly in tasks.json', () => {
			const tasksData = {
				tasks: [
					{
						id: 1,
						title: 'Test Task',
						description: 'Description',
						status: 'pending',
						priority: 'medium',
						dependencies: [],
						details: '',
						testStrategy: '',
						subtasks: [],
						metadata: {
							githubIssue: 42,
							sprint: 'Q1-S3',
							storyPoints: 5
						}
					}
				],
				metadata: {
					version: '1.0.0',
					lastModified: new Date().toISOString(),
					taskCount: 1,
					completedCount: 0
				}
			};

			// Write
			fs.writeFileSync(tasksJsonPath, JSON.stringify(tasksData, null, 2));

			// Read and verify
			const rawContent = fs.readFileSync(tasksJsonPath, 'utf-8');
			const parsed = JSON.parse(rawContent);

			expect(parsed.tasks[0].metadata).toEqual({
				githubIssue: 42,
				sprint: 'Q1-S3',
				storyPoints: 5
			});
		});

		it('should write and read subtask metadata correctly', () => {
			const tasksData = {
				tasks: [
					{
						id: 1,
						title: 'Parent Task',
						description: 'Description',
						status: 'pending',
						priority: 'medium',
						dependencies: [],
						details: '',
						testStrategy: '',
						subtasks: [
							{
								id: 1,
								parentId: 1,
								title: 'Subtask',
								description: 'Subtask description',
								status: 'pending',
								priority: 'medium',
								dependencies: [],
								details: '',
								testStrategy: '',
								metadata: {
									linkedTicket: 'JIRA-456',
									reviewed: true
								}
							}
						]
					}
				],
				metadata: {
					version: '1.0.0',
					lastModified: new Date().toISOString(),
					taskCount: 1,
					completedCount: 0
				}
			};

			// Write
			fs.writeFileSync(tasksJsonPath, JSON.stringify(tasksData, null, 2));

			// Read and verify
			const rawContent = fs.readFileSync(tasksJsonPath, 'utf-8');
			const parsed = JSON.parse(rawContent);

			expect(parsed.tasks[0].subtasks[0].metadata).toEqual({
				linkedTicket: 'JIRA-456',
				reviewed: true
			});
		});
	});

	describe('error message formatting', () => {
		it('should provide clear error for disabled metadata updates', () => {
			const errorMessage =
				'Metadata updates are disabled. Set TASK_MASTER_ALLOW_METADATA_UPDATES=true in your MCP server environment to enable metadata modifications.';

			expect(errorMessage).toContain('TASK_MASTER_ALLOW_METADATA_UPDATES');
			expect(errorMessage).toContain('true');
			expect(errorMessage).toContain('MCP server environment');
		});

		it('should provide clear error for invalid JSON', () => {
			const invalidJson = '{key: value}';
			const errorMessage = `Invalid metadata JSON: ${invalidJson}. Provide a valid JSON object string.`;

			expect(errorMessage).toContain(invalidJson);
			expect(errorMessage).toContain('valid JSON object');
		});

		it('should provide clear error for non-object JSON', () => {
			const errorMessage =
				'Invalid metadata: must be a JSON object (not null or array)';

			expect(errorMessage).toContain('JSON object');
			expect(errorMessage).toContain('not null or array');
		});
	});
});

/**
 * Unit tests for the actual validateMcpMetadata function from @tm/mcp
 * These tests verify the security gate behavior for MCP metadata updates.
 */
describe('validateMcpMetadata function', () => {
	// Mock error response creator that matches the MCP ContentResult format
	const mockCreateErrorResponse = (message: string) => ({
		content: [{ type: 'text' as const, text: `Error: ${message}` }],
		isError: true
	});

	// Helper to safely extract text from content
	const getErrorText = (
		error: { content: Array<{ type: string; text?: string }> } | undefined
	): string => {
		if (!error?.content?.[0]) return '';
		const content = error.content[0];
		return 'text' in content ? (content.text ?? '') : '';
	};

	afterEach(() => {
		delete process.env.TASK_MASTER_ALLOW_METADATA_UPDATES;
	});

	describe('when metadataString is null/undefined', () => {
		it('should return null parsedMetadata for undefined input', () => {
			const result = validateMcpMetadata(undefined, mockCreateErrorResponse);
			expect(result.parsedMetadata).toBeNull();
			expect(result.error).toBeUndefined();
		});

		it('should return null parsedMetadata for null input', () => {
			const result = validateMcpMetadata(null, mockCreateErrorResponse);
			expect(result.parsedMetadata).toBeNull();
			expect(result.error).toBeUndefined();
		});

		it('should return null parsedMetadata for empty string', () => {
			const result = validateMcpMetadata('', mockCreateErrorResponse);
			expect(result.parsedMetadata).toBeNull();
			expect(result.error).toBeUndefined();
		});
	});

	describe('when TASK_MASTER_ALLOW_METADATA_UPDATES is not set', () => {
		beforeEach(() => {
			delete process.env.TASK_MASTER_ALLOW_METADATA_UPDATES;
		});

		it('should return error when flag is not set', () => {
			const result = validateMcpMetadata(
				'{"key": "value"}',
				mockCreateErrorResponse
			);
			expect(result.error).toBeDefined();
			expect(result.error?.isError).toBe(true);
			expect(getErrorText(result.error)).toContain(
				'TASK_MASTER_ALLOW_METADATA_UPDATES'
			);
		});

		it('should return error when flag is set to "false"', () => {
			process.env.TASK_MASTER_ALLOW_METADATA_UPDATES = 'false';
			const result = validateMcpMetadata(
				'{"key": "value"}',
				mockCreateErrorResponse
			);
			expect(result.error).toBeDefined();
			expect(result.error?.isError).toBe(true);
		});

		it('should return error when flag is "TRUE" (case sensitive)', () => {
			process.env.TASK_MASTER_ALLOW_METADATA_UPDATES = 'TRUE';
			const result = validateMcpMetadata(
				'{"key": "value"}',
				mockCreateErrorResponse
			);
			expect(result.error).toBeDefined();
			expect(result.error?.isError).toBe(true);
		});

		it('should return error when flag is "True" (case sensitive)', () => {
			process.env.TASK_MASTER_ALLOW_METADATA_UPDATES = 'True';
			const result = validateMcpMetadata(
				'{"key": "value"}',
				mockCreateErrorResponse
			);
			expect(result.error).toBeDefined();
			expect(result.error?.isError).toBe(true);
		});
	});

	describe('when TASK_MASTER_ALLOW_METADATA_UPDATES is "true"', () => {
		beforeEach(() => {
			process.env.TASK_MASTER_ALLOW_METADATA_UPDATES = 'true';
		});

		it('should return parsed metadata for valid JSON object', () => {
			const result = validateMcpMetadata(
				'{"key": "value"}',
				mockCreateErrorResponse
			);
			expect(result.parsedMetadata).toEqual({ key: 'value' });
			expect(result.error).toBeUndefined();
		});

		it('should return parsed metadata for complex nested object', () => {
			const complexMeta = {
				githubIssue: 42,
				sprint: 'Q1-S3',
				nested: { deep: { value: true } },
				array: [1, 2, 3]
			};
			const result = validateMcpMetadata(
				JSON.stringify(complexMeta),
				mockCreateErrorResponse
			);
			expect(result.parsedMetadata).toEqual(complexMeta);
			expect(result.error).toBeUndefined();
		});

		it('should return parsed metadata for empty object', () => {
			const result = validateMcpMetadata('{}', mockCreateErrorResponse);
			expect(result.parsedMetadata).toEqual({});
			expect(result.error).toBeUndefined();
		});

		it('should return error for invalid JSON string', () => {
			const result = validateMcpMetadata(
				'{key: "value"}',
				mockCreateErrorResponse
			);
			expect(result.error).toBeDefined();
			expect(result.error?.isError).toBe(true);
			expect(getErrorText(result.error)).toContain('Invalid metadata JSON');
		});

		it('should return error for JSON array', () => {
			const result = validateMcpMetadata('[1, 2, 3]', mockCreateErrorResponse);
			expect(result.error).toBeDefined();
			expect(result.error?.isError).toBe(true);
			expect(getErrorText(result.error)).toContain(
				'must be a JSON object (not null or array)'
			);
		});

		it('should return error for JSON null', () => {
			const result = validateMcpMetadata('null', mockCreateErrorResponse);
			expect(result.error).toBeDefined();
			expect(result.error?.isError).toBe(true);
			expect(getErrorText(result.error)).toContain(
				'must be a JSON object (not null or array)'
			);
		});

		it('should return error for JSON string primitive', () => {
			const result = validateMcpMetadata('"string"', mockCreateErrorResponse);
			expect(result.error).toBeDefined();
			expect(result.error?.isError).toBe(true);
			expect(getErrorText(result.error)).toContain(
				'must be a JSON object (not null or array)'
			);
		});

		it('should return error for JSON number primitive', () => {
			const result = validateMcpMetadata('123', mockCreateErrorResponse);
			expect(result.error).toBeDefined();
			expect(result.error?.isError).toBe(true);
			expect(getErrorText(result.error)).toContain(
				'must be a JSON object (not null or array)'
			);
		});

		it('should return error for JSON boolean primitive', () => {
			const result = validateMcpMetadata('true', mockCreateErrorResponse);
			expect(result.error).toBeDefined();
			expect(result.error?.isError).toBe(true);
			expect(getErrorText(result.error)).toContain(
				'must be a JSON object (not null or array)'
			);
		});
	});
});
@@ -0,0 +1,472 @@
/**
 * @fileoverview Integration tests for FileStorage metadata preservation
 *
 * Tests that user-defined metadata survives all FileStorage CRUD operations
 * including load, save, update, and append.
 */

import { afterEach, beforeEach, describe, expect, it } from 'vitest';
import * as fs from 'fs';
import * as path from 'path';
import * as os from 'os';
import { FileStorage } from '../../../src/modules/storage/adapters/file-storage/file-storage.js';
import type { Task } from '../../../src/common/types/index.js';

/**
 * Creates a minimal valid task for testing
 */
function createTask(id: string, overrides: Partial<Task> = {}): Task {
	return {
		id,
		title: `Task ${id}`,
		description: `Description for task ${id}`,
		status: 'pending',
		priority: 'medium',
		dependencies: [],
		details: '',
		testStrategy: '',
		subtasks: [],
		...overrides
	};
}

describe('FileStorage Metadata Preservation - Integration Tests', () => {
	let tempDir: string;
	let storage: FileStorage;

	beforeEach(() => {
		// Create a temp directory for each test
		tempDir = fs.mkdtempSync(path.join(os.tmpdir(), 'taskmaster-test-'));
		// Create .taskmaster/tasks directory structure
		const taskmasterDir = path.join(tempDir, '.taskmaster', 'tasks');
		fs.mkdirSync(taskmasterDir, { recursive: true });
		storage = new FileStorage(tempDir);
	});

	afterEach(() => {
		// Clean up temp directory
		fs.rmSync(tempDir, { recursive: true, force: true });
	});

	describe('saveTasks() and loadTasks() round-trip', () => {
		it('should preserve metadata through save and load cycle', async () => {
			const tasks: Task[] = [
				createTask('1', {
					metadata: {
						externalId: 'JIRA-123',
						source: 'import',
						customField: { nested: 'value' }
					}
				}),
				createTask('2', {
					metadata: {
						score: 85,
						isUrgent: true
					}
				})
			];

			await storage.saveTasks(tasks);
			const loadedTasks = await storage.loadTasks();

			expect(loadedTasks).toHaveLength(2);
			expect(loadedTasks[0].metadata).toEqual({
				externalId: 'JIRA-123',
				source: 'import',
				customField: { nested: 'value' }
			});
			expect(loadedTasks[1].metadata).toEqual({
				score: 85,
				isUrgent: true
			});
		});

		it('should preserve empty metadata object', async () => {
			const tasks: Task[] = [createTask('1', { metadata: {} })];

			await storage.saveTasks(tasks);
			const loadedTasks = await storage.loadTasks();

			expect(loadedTasks[0].metadata).toEqual({});
		});

		it('should handle tasks without metadata', async () => {
			const tasks: Task[] = [createTask('1')]; // No metadata

			await storage.saveTasks(tasks);
			const loadedTasks = await storage.loadTasks();

			expect(loadedTasks[0].metadata).toBeUndefined();
		});

		it('should preserve complex metadata with various types', async () => {
			const complexMetadata = {
				string: 'value',
				number: 42,
				float: 3.14,
				boolean: true,
				nullValue: null,
				array: [1, 'two', { three: 3 }],
				nested: {
					deep: {
						deeper: {
							value: 'found'
						}
					}
				}
			};

			const tasks: Task[] = [createTask('1', { metadata: complexMetadata })];

			await storage.saveTasks(tasks);
			const loadedTasks = await storage.loadTasks();

			expect(loadedTasks[0].metadata).toEqual(complexMetadata);
		});

		it('should preserve metadata on subtasks', async () => {
			const tasks: Task[] = [
				createTask('1', {
					metadata: { parentMeta: 'value' },
					subtasks: [
						{
							id: 1,
							parentId: '1',
							title: 'Subtask 1',
							description: 'Description',
							status: 'pending',
							priority: 'medium',
							dependencies: [],
							details: '',
							testStrategy: '',
							metadata: { subtaskMeta: 'subtask-value' }
						}
					]
				})
			];

			await storage.saveTasks(tasks);
			const loadedTasks = await storage.loadTasks();

			expect(loadedTasks[0].metadata).toEqual({ parentMeta: 'value' });
			expect(loadedTasks[0].subtasks[0].metadata).toEqual({
				subtaskMeta: 'subtask-value'
			});
		});
	});

	describe('updateTask() metadata preservation', () => {
		it('should preserve existing metadata when updating other fields', async () => {
			const originalMetadata = { externalId: 'EXT-123', version: 1 };
			const tasks: Task[] = [createTask('1', { metadata: originalMetadata })];

			await storage.saveTasks(tasks);

			// Update title only, not metadata
			await storage.updateTask('1', { title: 'Updated Title' });

			const loadedTasks = await storage.loadTasks();
			expect(loadedTasks[0].title).toBe('Updated Title');
			expect(loadedTasks[0].metadata).toEqual(originalMetadata);
		});

		it('should allow updating metadata field directly', async () => {
			const tasks: Task[] = [createTask('1', { metadata: { original: true } })];

			await storage.saveTasks(tasks);

			// Update metadata
			await storage.updateTask('1', {
				metadata: { original: true, updated: true, newField: 'value' }
			});

			const loadedTasks = await storage.loadTasks();
			expect(loadedTasks[0].metadata).toEqual({
				original: true,
				updated: true,
				newField: 'value'
			});
		});

		it('should allow replacing metadata entirely', async () => {
			const tasks: Task[] = [
				createTask('1', { metadata: { oldField: 'old' } })
			];

			await storage.saveTasks(tasks);

			// Replace metadata entirely
			await storage.updateTask('1', { metadata: { newField: 'new' } });

			const loadedTasks = await storage.loadTasks();
			expect(loadedTasks[0].metadata).toEqual({ newField: 'new' });
		});

		it('should preserve metadata when updating status', async () => {
			const tasks: Task[] = [createTask('1', { metadata: { tracked: true } })];

			await storage.saveTasks(tasks);
			await storage.updateTask('1', { status: 'in-progress' });

			const loadedTasks = await storage.loadTasks();
			expect(loadedTasks[0].status).toBe('in-progress');
			expect(loadedTasks[0].metadata).toEqual({ tracked: true });
		});
	});

	describe('appendTasks() metadata preservation', () => {
		it('should preserve metadata on existing tasks when appending', async () => {
			const existingTasks: Task[] = [
				createTask('1', { metadata: { existing: true } })
			];
|
|
||||||
|
await storage.saveTasks(existingTasks);
|
||||||
|
|
||||||
|
// Append new tasks
|
||||||
|
const newTasks: Task[] = [
|
||||||
|
createTask('2', { metadata: { newTask: true } })
|
||||||
|
];
|
||||||
|
|
||||||
|
await storage.appendTasks(newTasks);
|
||||||
|
|
||||||
|
const loadedTasks = await storage.loadTasks();
|
||||||
|
expect(loadedTasks).toHaveLength(2);
|
||||||
|
expect(loadedTasks.find((t) => t.id === '1')?.metadata).toEqual({
|
||||||
|
existing: true
|
||||||
|
});
|
||||||
|
expect(loadedTasks.find((t) => t.id === '2')?.metadata).toEqual({
|
||||||
|
newTask: true
|
||||||
|
});
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('loadTask() single task metadata', () => {
|
||||||
|
it('should preserve metadata when loading single task', async () => {
|
||||||
|
const tasks: Task[] = [
|
||||||
|
createTask('1', { metadata: { specific: 'metadata' } }),
|
||||||
|
createTask('2', { metadata: { other: 'data' } })
|
||||||
|
];
|
||||||
|
|
||||||
|
await storage.saveTasks(tasks);
|
||||||
|
const task = await storage.loadTask('1');
|
||||||
|
|
||||||
|
expect(task).toBeDefined();
|
||||||
|
expect(task?.metadata).toEqual({ specific: 'metadata' });
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('metadata alongside AI implementation metadata', () => {
|
||||||
|
it('should preserve both user metadata and AI metadata', async () => {
|
||||||
|
const tasks: Task[] = [
|
||||||
|
createTask('1', {
|
||||||
|
// AI implementation metadata
|
||||||
|
relevantFiles: [
|
||||||
|
{
|
||||||
|
path: 'src/test.ts',
|
||||||
|
description: 'Test file',
|
||||||
|
action: 'modify'
|
||||||
|
}
|
||||||
|
],
|
||||||
|
category: 'development',
|
||||||
|
skills: ['TypeScript'],
|
||||||
|
acceptanceCriteria: ['Tests pass'],
|
||||||
|
// User-defined metadata
|
||||||
|
metadata: {
|
||||||
|
externalId: 'JIRA-456',
|
||||||
|
importedAt: '2024-01-15T10:00:00Z'
|
||||||
|
}
|
||||||
|
})
|
||||||
|
];
|
||||||
|
|
||||||
|
await storage.saveTasks(tasks);
|
||||||
|
const loadedTasks = await storage.loadTasks();
|
||||||
|
|
||||||
|
// AI metadata preserved
|
||||||
|
expect(loadedTasks[0].relevantFiles).toHaveLength(1);
|
||||||
|
expect(loadedTasks[0].category).toBe('development');
|
||||||
|
expect(loadedTasks[0].skills).toEqual(['TypeScript']);
|
||||||
|
|
||||||
|
// User metadata preserved
|
||||||
|
expect(loadedTasks[0].metadata).toEqual({
|
||||||
|
externalId: 'JIRA-456',
|
||||||
|
importedAt: '2024-01-15T10:00:00Z'
|
||||||
|
});
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('AI operation metadata preservation', () => {
|
||||||
|
it('should preserve metadata when updating task with AI-like partial update', async () => {
|
||||||
|
// Simulate existing task with user metadata
|
||||||
|
const tasks: Task[] = [
|
||||||
|
createTask('1', {
|
||||||
|
title: 'Original Title',
|
||||||
|
metadata: { externalId: 'JIRA-123', version: 1 }
|
||||||
|
})
|
||||||
|
];
|
||||||
|
|
||||||
|
await storage.saveTasks(tasks);
|
||||||
|
|
||||||
|
// Simulate AI update - only updates specific fields, no metadata field
|
||||||
|
// This mimics what happens when AI processes update-task
|
||||||
|
const aiUpdate: Partial<Task> = {
|
||||||
|
title: 'AI Updated Title',
|
||||||
|
description: 'AI generated description',
|
||||||
|
details: 'AI generated details'
|
||||||
|
// Note: no metadata field - AI schemas don't include it
|
||||||
|
};
|
||||||
|
|
||||||
|
await storage.updateTask('1', aiUpdate);
|
||||||
|
|
||||||
|
const loadedTasks = await storage.loadTasks();
|
||||||
|
expect(loadedTasks[0].title).toBe('AI Updated Title');
|
||||||
|
expect(loadedTasks[0].description).toBe('AI generated description');
|
||||||
|
// User metadata must be preserved
|
||||||
|
expect(loadedTasks[0].metadata).toEqual({
|
||||||
|
externalId: 'JIRA-123',
|
||||||
|
version: 1
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should preserve metadata when adding AI-generated subtasks', async () => {
|
||||||
|
const tasks: Task[] = [
|
||||||
|
createTask('1', {
|
||||||
|
metadata: { tracked: true, source: 'import' },
|
||||||
|
subtasks: []
|
||||||
|
})
|
||||||
|
];
|
||||||
|
|
||||||
|
await storage.saveTasks(tasks);
|
||||||
|
|
||||||
|
// Simulate expand-task adding subtasks
|
||||||
|
// Subtasks from AI don't have metadata field
|
||||||
|
const updatedTask: Partial<Task> = {
|
||||||
|
subtasks: [
|
||||||
|
{
|
||||||
|
id: 1,
|
||||||
|
parentId: '1',
|
||||||
|
title: 'AI Generated Subtask',
|
||||||
|
description: 'Description',
|
||||||
|
status: 'pending',
|
||||||
|
priority: 'medium',
|
||||||
|
dependencies: [],
|
||||||
|
details: 'Details',
|
||||||
|
testStrategy: 'Tests'
|
||||||
|
// No metadata - AI doesn't generate it
|
||||||
|
}
|
||||||
|
]
|
||||||
|
};
|
||||||
|
|
||||||
|
await storage.updateTask('1', updatedTask);
|
||||||
|
|
||||||
|
const loadedTasks = await storage.loadTasks();
|
||||||
|
// Parent task metadata preserved
|
||||||
|
expect(loadedTasks[0].metadata).toEqual({
|
||||||
|
tracked: true,
|
||||||
|
source: 'import'
|
||||||
|
});
|
||||||
|
// Subtask has no metadata (as expected from AI)
|
||||||
|
expect(loadedTasks[0].subtasks[0].metadata).toBeUndefined();
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should handle multiple sequential AI updates preserving metadata', async () => {
|
||||||
|
const tasks: Task[] = [
|
||||||
|
createTask('1', {
|
||||||
|
metadata: { originalField: 'preserved' }
|
||||||
|
})
|
||||||
|
];
|
||||||
|
|
||||||
|
await storage.saveTasks(tasks);
|
||||||
|
|
||||||
|
// First AI update
|
||||||
|
await storage.updateTask('1', { title: 'First Update' });
|
||||||
|
// Second AI update
|
||||||
|
await storage.updateTask('1', { description: 'Second Update' });
|
||||||
|
// Third AI update
|
||||||
|
await storage.updateTask('1', { priority: 'high' });
|
||||||
|
|
||||||
|
const loadedTasks = await storage.loadTasks();
|
||||||
|
expect(loadedTasks[0].title).toBe('First Update');
|
||||||
|
expect(loadedTasks[0].description).toBe('Second Update');
|
||||||
|
expect(loadedTasks[0].priority).toBe('high');
|
||||||
|
// Metadata preserved through all updates
|
||||||
|
expect(loadedTasks[0].metadata).toEqual({ originalField: 'preserved' });
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should preserve metadata when update object omits metadata field entirely', async () => {
|
||||||
|
// This is how AI operations work - they simply don't include metadata
|
||||||
|
const tasks: Task[] = [
|
||||||
|
createTask('1', {
|
||||||
|
metadata: { important: 'data' }
|
||||||
|
})
|
||||||
|
];
|
||||||
|
|
||||||
|
await storage.saveTasks(tasks);
|
||||||
|
|
||||||
|
// Update WITHOUT metadata field (AI schemas don't include it)
|
||||||
|
const updateWithoutMetadata: Partial<Task> = { title: 'Updated' };
|
||||||
|
await storage.updateTask('1', updateWithoutMetadata);
|
||||||
|
|
||||||
|
const loadedTasks = await storage.loadTasks();
|
||||||
|
// When metadata field is absent from updates, existing metadata is preserved
|
||||||
|
expect(loadedTasks[0].metadata).toEqual({ important: 'data' });
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
describe('file format verification', () => {
|
||||||
|
it('should write metadata to JSON file correctly', async () => {
|
||||||
|
const tasks: Task[] = [createTask('1', { metadata: { written: true } })];
|
||||||
|
|
||||||
|
await storage.saveTasks(tasks);
|
||||||
|
|
||||||
|
// Read raw file to verify format
|
||||||
|
const filePath = path.join(tempDir, '.taskmaster', 'tasks', 'tasks.json');
|
||||||
|
const rawContent = fs.readFileSync(filePath, 'utf-8');
|
||||||
|
const parsed = JSON.parse(rawContent);
|
||||||
|
|
||||||
|
expect(parsed.tasks[0].metadata).toEqual({ written: true });
|
||||||
|
});
|
||||||
|
|
||||||
|
it('should load metadata from pre-existing JSON file', async () => {
|
||||||
|
// Write a tasks.json file manually
|
||||||
|
const tasksDir = path.join(tempDir, '.taskmaster', 'tasks');
|
||||||
|
const filePath = path.join(tasksDir, 'tasks.json');
|
||||||
|
|
||||||
|
const fileContent = {
|
||||||
|
tasks: [
|
||||||
|
{
|
||||||
|
id: '1',
|
||||||
|
title: 'Pre-existing task',
|
||||||
|
description: 'Description',
|
||||||
|
status: 'pending',
|
||||||
|
priority: 'medium',
|
||||||
|
dependencies: [],
|
||||||
|
details: '',
|
||||||
|
testStrategy: '',
|
||||||
|
subtasks: [],
|
||||||
|
metadata: {
|
||||||
|
preExisting: true,
|
||||||
|
importedFrom: 'external-system'
|
||||||
|
}
|
||||||
|
}
|
||||||
|
],
|
||||||
|
metadata: {
|
||||||
|
version: '1.0.0',
|
||||||
|
lastModified: new Date().toISOString(),
|
||||||
|
taskCount: 1,
|
||||||
|
completedCount: 0
|
||||||
|
}
|
||||||
|
};
|
||||||
|
|
||||||
|
fs.writeFileSync(filePath, JSON.stringify(fileContent, null, 2));
|
||||||
|
|
||||||
|
// Load through FileStorage
|
||||||
|
const loadedTasks = await storage.loadTasks();
|
||||||
|
|
||||||
|
expect(loadedTasks).toHaveLength(1);
|
||||||
|
expect(loadedTasks[0].metadata).toEqual({
|
||||||
|
preExisting: true,
|
||||||
|
importedFrom: 'external-system'
|
||||||
|
});
|
||||||
|
});
|
||||||
|
});
|
||||||
|
});
|
||||||
@@ -492,4 +492,157 @@ describe('Task Metadata Extraction - Integration Tests', () => {
 			expect(validCategories).toContain(task.category);
 		});
 	});
+
+	describe('User-Defined Metadata Field', () => {
+		it('should preserve user-defined metadata through JSON serialization', () => {
+			const taskWithMetadata: Task = {
+				id: '1',
+				title: 'Task with custom metadata',
+				description: 'Test description',
+				status: 'pending',
+				priority: 'high',
+				dependencies: [],
+				details: '',
+				testStrategy: '',
+				subtasks: [],
+				metadata: {
+					externalId: 'JIRA-123',
+					source: 'import',
+					customField: { nested: 'value' }
+				}
+			};
+
+			const serialized = JSON.stringify(taskWithMetadata);
+			const deserialized: Task = JSON.parse(serialized);
+
+			expect(deserialized.metadata).toEqual(taskWithMetadata.metadata);
+			expect(deserialized.metadata?.externalId).toBe('JIRA-123');
+			expect(deserialized.metadata?.customField).toEqual({ nested: 'value' });
+		});
+
+		it('should preserve metadata on subtasks through JSON serialization', () => {
+			const taskWithSubtasks: Task = {
+				id: '1',
+				title: 'Parent task',
+				description: 'Test',
+				status: 'pending',
+				priority: 'medium',
+				dependencies: [],
+				details: '',
+				testStrategy: '',
+				metadata: { parentMeta: true },
+				subtasks: [
+					{
+						id: 1,
+						parentId: '1',
+						title: 'Subtask 1',
+						description: 'Test',
+						status: 'pending',
+						priority: 'medium',
+						dependencies: [],
+						details: '',
+						testStrategy: '',
+						metadata: { subtaskMeta: 'value1' }
+					}
+				]
+			};
+
+			const serialized = JSON.stringify(taskWithSubtasks);
+			const deserialized: Task = JSON.parse(serialized);
+
+			expect(deserialized.metadata).toEqual({ parentMeta: true });
+			expect(deserialized.subtasks[0].metadata).toEqual({
+				subtaskMeta: 'value1'
+			});
+		});
+
+		it('should handle empty metadata object', () => {
+			const task: Task = {
+				id: '1',
+				title: 'Task',
+				description: 'Test',
+				status: 'pending',
+				priority: 'medium',
+				dependencies: [],
+				details: '',
+				testStrategy: '',
+				subtasks: [],
+				metadata: {}
+			};
+
+			const serialized = JSON.stringify(task);
+			const deserialized: Task = JSON.parse(serialized);
+
+			expect(deserialized.metadata).toEqual({});
+		});
+
+		it('should handle complex metadata with various types', () => {
+			const task: Task = {
+				id: '1',
+				title: 'Task',
+				description: 'Test',
+				status: 'pending',
+				priority: 'medium',
+				dependencies: [],
+				details: '',
+				testStrategy: '',
+				subtasks: [],
+				metadata: {
+					string: 'value',
+					number: 42,
+					boolean: true,
+					nullValue: null,
+					array: [1, 2, 3],
+					nested: {
+						deep: {
+							value: 'found'
+						}
+					}
+				}
+			};
+
+			const serialized = JSON.stringify(task);
+			const deserialized: Task = JSON.parse(serialized);
+
+			expect(deserialized.metadata?.string).toBe('value');
+			expect(deserialized.metadata?.number).toBe(42);
+			expect(deserialized.metadata?.boolean).toBe(true);
+			expect(deserialized.metadata?.nullValue).toBeNull();
+			expect(deserialized.metadata?.array).toEqual([1, 2, 3]);
+			expect((deserialized.metadata?.nested as any)?.deep?.value).toBe('found');
+		});
+
+		it('should preserve metadata alongside AI implementation metadata', () => {
+			const task: Task = {
+				id: '1',
+				title: 'Task',
+				description: 'Test',
+				status: 'pending',
+				priority: 'medium',
+				dependencies: [],
+				details: 'Some details',
+				testStrategy: 'Unit tests',
+				subtasks: [],
+				// AI implementation metadata
+				relevantFiles: [
+					{ path: 'src/test.ts', description: 'Test file', action: 'modify' }
+				],
+				category: 'development',
+				skills: ['TypeScript'],
+				// User-defined metadata
+				metadata: {
+					externalId: 'EXT-456',
+					importedAt: '2024-01-15T10:00:00Z'
+				}
+			};
+
+			const serialized = JSON.stringify(task);
+			const deserialized: Task = JSON.parse(serialized);
+
+			// Both types of metadata should be preserved
+			expect(deserialized.relevantFiles).toHaveLength(1);
+			expect(deserialized.category).toBe('development');
+			expect(deserialized.metadata?.externalId).toBe('EXT-456');
+		});
+	});
 });
@@ -48,7 +48,13 @@ async function updateSubtaskById(
 	context = {},
 	outputFormat = context.mcpLog ? 'json' : 'text'
 ) {
-	const { session, mcpLog, projectRoot: providedProjectRoot, tag } = context;
+	const {
+		session,
+		mcpLog,
+		projectRoot: providedProjectRoot,
+		tag,
+		metadata
+	} = context;
 	const logFn = mcpLog || consoleLog;
 	const isMCP = !!mcpLog;
 
@@ -71,10 +77,13 @@ async function updateSubtaskById(
 	if (!subtaskId || typeof subtaskId !== 'string') {
 		throw new Error('Subtask ID cannot be empty.');
 	}
-	if (!prompt || typeof prompt !== 'string' || prompt.trim() === '') {
+	// Allow metadata-only updates (no prompt required if metadata is provided)
+	if (
+		(!prompt || typeof prompt !== 'string' || prompt.trim() === '') &&
+		!metadata
+	) {
 		throw new Error(
-			'Prompt cannot be empty. Please provide context for the subtask update.'
+			'Prompt cannot be empty unless metadata is provided. Please provide context for the subtask update or metadata to merge.'
 		);
 	}
 
@@ -93,6 +102,7 @@ async function updateSubtaskById(
 		tag,
 		appendMode: true, // Subtask updates are always append mode
 		useResearch,
+		metadata,
 		isMCP,
 		outputFormat,
 		report
@@ -164,6 +174,30 @@ async function updateSubtaskById(
 
 		const subtask = parentTask.subtasks[subtaskIndex];
 
+		// --- Metadata-Only Update (Fast Path) ---
+		// If only metadata is provided (no prompt), skip AI and just update metadata
+		if (metadata && (!prompt || prompt.trim() === '')) {
+			report('info', `Metadata-only update for subtask ${subtaskId}`);
+			// Merge new metadata with existing
+			subtask.metadata = {
+				...(subtask.metadata || {}),
+				...metadata
+			};
+			parentTask.subtasks[subtaskIndex] = subtask;
+			writeJSON(tasksPath, data, projectRoot, tag);
+			report(
+				'success',
+				`Successfully updated metadata for subtask ${subtaskId}`
+			);
+
+			return {
+				updatedSubtask: subtask,
+				telemetryData: null,
+				tagInfo: { tag }
+			};
+		}
+		// --- End Metadata-Only Update ---
+
 		// --- Context Gathering ---
 		let gatheredContext = '';
 		try {
@@ -334,6 +368,14 @@ async function updateSubtaskById(
 
 	const updatedSubtask = parentTask.subtasks[subtaskIndex];
 
+	// Merge metadata if provided (preserve existing metadata)
+	if (metadata) {
+		updatedSubtask.metadata = {
+			...(updatedSubtask.metadata || {}),
+			...metadata
+		};
+	}
+
 	if (outputFormat === 'text' && getDebugFlag(session)) {
 		console.log(
 			'>>> DEBUG: Subtask details AFTER AI update:',
@@ -58,7 +58,13 @@ async function updateTaskById(
 	outputFormat = 'text',
 	appendMode = false
 ) {
-	const { session, mcpLog, projectRoot: providedProjectRoot, tag } = context;
+	const {
+		session,
+		mcpLog,
+		projectRoot: providedProjectRoot,
+		tag,
+		metadata
+	} = context;
 	const { report, isMCP } = createBridgeLogger(mcpLog, session);
 
 	try {
@@ -70,8 +76,15 @@ async function updateTaskById(
 	if (taskId === null || taskId === undefined || String(taskId).trim() === '')
 		throw new Error('Task ID cannot be empty.');
 
-	if (!prompt || typeof prompt !== 'string' || prompt.trim() === '')
-		throw new Error('Prompt cannot be empty.');
+	// Allow metadata-only updates (prompt can be empty if metadata is provided)
+	if (
+		(!prompt || typeof prompt !== 'string' || prompt.trim() === '') &&
+		!metadata
+	) {
+		throw new Error(
+			'Prompt cannot be empty unless metadata is provided for update.'
+		);
+	}
 
 	// Determine project root first (needed for API key checks)
 	const projectRoot = providedProjectRoot || findProjectRoot();
@@ -99,6 +112,7 @@ async function updateTaskById(
 			tag,
 			appendMode,
 			useResearch,
+			metadata,
 			isMCP,
 			outputFormat,
 			report
@@ -166,6 +180,27 @@ async function updateTaskById(
 		}
 		// --- End Task Loading ---
 
+		// --- Metadata-Only Update (Fast Path) ---
+		// If only metadata is provided (no prompt), skip AI and just update metadata
+		if (metadata && (!prompt || prompt.trim() === '')) {
+			report('info', `Metadata-only update for task ${taskId}`);
+			// Merge new metadata with existing
+			taskToUpdate.metadata = {
+				...(taskToUpdate.metadata || {}),
+				...metadata
+			};
+			data.tasks[taskIndex] = taskToUpdate;
+			writeJSON(tasksPath, data, projectRoot, tag);
+			report('success', `Successfully updated metadata for task ${taskId}`);
+
+			return {
+				updatedTask: taskToUpdate,
+				telemetryData: null,
+				tagInfo: { tag }
+			};
+		}
+		// --- End Metadata-Only Update ---
+
 		// --- Context Gathering ---
 		let gatheredContext = '';
 		try {
@@ -385,6 +420,14 @@ async function updateTaskById(
 			}
 		}
 
+		// Merge metadata if provided
+		if (metadata) {
+			taskToUpdate.metadata = {
+				...(taskToUpdate.metadata || {}),
+				...metadata
+			};
+		}
+
 		// Write the updated task back to file
 		data.tasks[taskIndex] = taskToUpdate;
 		writeJSON(tasksPath, data, projectRoot, tag);
@@ -455,6 +498,14 @@ async function updateTaskById(
 		if (updatedTask.subtasks && Array.isArray(updatedTask.subtasks)) {
 			let currentSubtaskId = 1;
 			updatedTask.subtasks = updatedTask.subtasks.map((subtask) => {
+				// Find original subtask to preserve its metadata
+				// Use type-coerced ID matching (AI may return string IDs vs numeric)
+				// Also match by title as fallback (subtask titles are typically unique)
+				const originalSubtask = taskToUpdate.subtasks?.find(
+					(st) =>
+						String(st.id) === String(subtask.id) ||
+						(subtask.title && st.title === subtask.title)
+				);
 				// Fix AI-generated subtask IDs that might be strings or use parent ID as prefix
 				const correctedSubtask = {
 					...subtask,
@@ -472,7 +523,11 @@ async function updateTaskById(
 						)
 						: [],
 					status: subtask.status || 'pending',
-					testStrategy: subtask.testStrategy ?? null
+					testStrategy: subtask.testStrategy ?? null,
+					// Preserve subtask metadata from original (AI schema excludes metadata)
+					...(originalSubtask?.metadata && {
+						metadata: originalSubtask.metadata
+					})
 				};
 				currentSubtaskId++;
 				return correctedSubtask;
@@ -529,6 +584,17 @@ async function updateTaskById(
 		}
 		// --- End Task Validation/Correction ---
 
+		// --- Preserve and Merge Metadata ---
+		// AI responses don't include metadata (AI schema excludes it)
+		// Preserve existing metadata from original task and merge new metadata if provided
+		if (taskToUpdate.metadata || metadata) {
+			updatedTask.metadata = {
+				...(taskToUpdate.metadata || {}),
+				...(metadata || {})
+			};
+		}
+		// --- End Preserve and Merge Metadata ---
+
 		// --- Update Task Data (Keep existing) ---
 		data.tasks[taskIndex] = updatedTask;
 		// --- End Update Task Data ---
@@ -10,6 +10,11 @@ import { z } from 'zod';
  *
  * Other providers (Anthropic, Google, etc.) safely ignore this constraint.
  * See: https://platform.openai.com/docs/guides/structured-outputs
+ *
+ * NOTE: The `metadata` field (user-defined task metadata) is intentionally EXCLUDED
+ * from all AI schemas. This ensures AI operations cannot overwrite user metadata.
+ * When tasks are updated via AI, the spread operator preserves existing metadata
+ * since AI responses won't include a metadata field.
  */
 export const TaskStatusSchema = z.enum([
 	'pending',
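The merge semantics used throughout these hunks can be sketched in isolation. This is a standalone illustration (the variable names are hypothetical, not from the patch): existing metadata keys survive the merge unless the incoming update explicitly overwrites them, which is why AI updates that omit `metadata` leave it untouched.

```javascript
// Sketch of the spread-based metadata merge: existing keys are kept,
// incoming keys win on conflict, and `undefined`/missing metadata is
// handled by falling back to an empty object.
const existing = { externalId: 'JIRA-123', version: 1 };
const incoming = { version: 2, sprint: 'Q1-S3' };

const merged = {
	...(existing || {}),
	...incoming
};

console.log(merged); // { externalId: 'JIRA-123', version: 2, sprint: 'Q1-S3' }
```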