mirror of
https://github.com/github/spec-kit.git
synced 2026-04-01 10:13:08 +00:00
Stage 2: Copilot integration — proof of concept with shared template primitives (#2035)
* feat: Stage 2a — CopilotIntegration with shared template primitives
  - base.py: added granular primitives (shared_commands_dir, shared_templates_dir, list_command_templates, command_filename, commands_dest, copy_command_to_directory, record_file_in_manifest, write_file_and_record, process_template)
  - CopilotIntegration: uses primitives to produce .agent.md commands, companion .prompt.md files, and .vscode/settings.json
  - Verified byte-for-byte parity with old release script output
  - Copilot auto-registered in INTEGRATION_REGISTRY
  - 70 tests (22 new: base primitives + copilot integration)
  Part of #1924

* feat: Stage 2b — --integration flag, routing, agent.json, shared infra
  - Added --integration flag to init() (mutually exclusive with --ai)
  - --ai copilot auto-promotes to integration path with migration nudge
  - Integration setup writes .specify/agent.json with integration key
  - _install_shared_infra() copies scripts and templates to .specify/
  - init-options.json records 'integration' key when used
  - 4 new CLI tests: mutual exclusivity, unknown rejection, copilot end-to-end, auto-promote (74 total integration tests)
  Part of #1924

* feat: Stage 2 completion — integration scripts, integration.json, shared manifest
  - Added copilot/scripts/update-context.sh and .ps1 (thin wrappers that delegate to the shared update-agent-context script)
  - CopilotIntegration.setup() installs integration scripts to .specify/integrations/copilot/scripts/
  - Renamed agent.json → integration.json with script paths
  - _install_shared_infra() now tracks files in integration-shared.manifest.json
  - Updated tests: scripts installed, integration.json has script paths, shared manifest recorded (74 tests)
  Part of #1924

* refactor: rename shared manifest to speckit.manifest.json
  Cleaner naming — the shared infrastructure (scripts, templates) belongs to spec-kit itself, not to any specific integration.

* fix: copilot update-context scripts reflect target architecture
  Scripts now source shared functions (via SPECKIT_SOURCE_ONLY=1) and call update_agent_file directly with .github/copilot-instructions.md, rather than delegating back to the shared case statement.

* fix: simplify copilot scripts — dispatcher sources common functions
  Integration scripts now contain only copilot-specific logic (target path + agent name). The dispatcher is responsible for sourcing shared functions before calling the integration script.

* fix: copilot update-context scripts are self-contained implementations
  These scripts ARE the implementation — the dispatcher calls them. They source common.sh + update-agent-context functions, gather feature/plan data, then call update_agent_file with the copilot target path (.github/copilot-instructions.md).

* docs: add Stage 7 activation note to copilot update-context scripts

* test: add complete file inventory test for copilot integration
  Validates every single file (37 total) produced by specify init --integration copilot --script sh --no-git.

* test: add PowerShell file inventory test for copilot integration
  Validates all 37 files produced by the --script ps variant, including .specify/scripts/powershell/ instead of bash.

* refactor: split test_integrations.py into tests/integrations/ directory
  - test_base.py: IntegrationOption, IntegrationBase, MarkdownIntegration, primitives
  - test_manifest.py: IntegrationManifest, path traversal, persistence, validation
  - test_registry.py: INTEGRATION_REGISTRY
  - test_copilot.py: CopilotIntegration unit tests
  - test_cli.py: --integration flag, auto-promote, file inventories (sh + ps)
  - conftest.py: shared StubIntegration helper
  76 integration tests + 48 consistency tests = 124 total, all passing.

* refactor: move file inventory tests from test_cli to test_copilot
  File inventories are copilot-specific. test_cli.py now only tests CLI flag mechanics (mutual exclusivity, unknown rejection, auto-promote).

* fix: skip JSONC merge to preserve user settings, fix docstring
  - _merge_vscode_settings() now returns early (skips merge) when the existing settings.json can't be parsed (e.g. JSONC with comments), instead of overwriting with empty settings
  - Updated _install_shared_infra() docstring to match implementation (scripts + templates, speckit.manifest.json)

* fix: warn user when JSONC settings merge is skipped

* fix: show template content when JSONC merge is skipped
  User now sees the exact settings they should add manually.

* fix: document process_template requirement, merge scripts without rmtree
  - base.py setup() docstring now explicitly states raw copy behavior and directs to CopilotIntegration for a process_template example
  - _install_shared_infra() uses merge/overwrite instead of rmtree to preserve user-added files under .specify/scripts/

* fix: don't overwrite pre-existing shared scripts or templates
  Only write files that don't already exist — preserves any user modifications to shared scripts (common.sh etc.) and templates.

* fix: warn user about skipped pre-existing shared files
  Lists all shared scripts and templates that were not copied because they already existed in the project.

* test: add test for shared infra skip behavior on pre-existing files
  Verifies that _install_shared_infra() preserves user-modified scripts and templates while still installing missing ones.

* fix: address review — containment check, deterministic prompts, manifest accuracy
  - CopilotIntegration.setup() adds dest containment check (relative_to)
  - Companion prompts generated from templates list, not directory glob
  - _install_shared_infra() only records files actually copied (not pre-existing)
  - VS Code settings tests made unconditional (assert template exists)
  - Inventory tests use .as_posix() for cross-platform paths

* fix: correct PS1 function names, document SPECKIT_SOURCE_ONLY prerequisite
  - Fixed Get-FeaturePaths → Get-FeaturePathsEnv, Read-PlanData → Parse-PlanData
  - Documented that shared scripts must guard Main with SPECKIT_SOURCE_ONLY before these integration scripts can be activated (Stage 7)

* fix: add dict type check for settings merge, simplify PS1 to subprocess
  - _merge_vscode_settings() skips merge with warning if parsed JSON is not a dict (array, null, etc.)
  - PS1 update-context.ps1 uses & invocation instead of dot-sourcing since the shared script runs Main unconditionally

* fix: skip-write on no-op merge, bash subprocess, dynamic integration list
  - _merge_vscode_settings() only writes when keys were actually added
  - update-context.sh uses exec subprocess like PS1 version
  - Unknown integration error lists available integrations dynamically

* fix: align path rewriting with release script, add .specify/.specify/ fix
  Path rewrite regex matches the release script's rewrite_paths() exactly (verified byte-identical output). Added .specify/.specify/ double-prefix fix for additional safety.
@@ -1197,6 +1197,84 @@ def _locate_release_script() -> tuple[Path, str]:
     raise FileNotFoundError(f"Release script '{name}' not found in core_pack or source checkout")
 
 
+def _install_shared_infra(
+    project_path: Path,
+    script_type: str,
+    tracker: StepTracker | None = None,
+) -> bool:
+    """Install shared infrastructure files into *project_path*.
+
+    Copies ``.specify/scripts/`` and ``.specify/templates/`` from the
+    bundled core_pack or source checkout. Tracks all installed files
+    in ``speckit.manifest.json``.
+
+    Returns ``True`` on success.
+    """
+    from .integrations.manifest import IntegrationManifest
+
+    core = _locate_core_pack()
+    manifest = IntegrationManifest("speckit", project_path, version=get_speckit_version())
+
+    # Scripts
+    if core and (core / "scripts").is_dir():
+        scripts_src = core / "scripts"
+    else:
+        repo_root = Path(__file__).parent.parent.parent
+        scripts_src = repo_root / "scripts"
+
+    skipped_files: list[str] = []
+
+    if scripts_src.is_dir():
+        dest_scripts = project_path / ".specify" / "scripts"
+        dest_scripts.mkdir(parents=True, exist_ok=True)
+        variant_dir = "bash" if script_type == "sh" else "powershell"
+        variant_src = scripts_src / variant_dir
+        if variant_src.is_dir():
+            dest_variant = dest_scripts / variant_dir
+            dest_variant.mkdir(parents=True, exist_ok=True)
+            # Merge without overwriting — only add files that don't exist yet
+            for src_path in variant_src.rglob("*"):
+                if src_path.is_file():
+                    rel_path = src_path.relative_to(variant_src)
+                    dst_path = dest_variant / rel_path
+                    if dst_path.exists():
+                        skipped_files.append(str(dst_path.relative_to(project_path)))
+                    else:
+                        dst_path.parent.mkdir(parents=True, exist_ok=True)
+                        shutil.copy2(src_path, dst_path)
+                        rel = dst_path.relative_to(project_path).as_posix()
+                        manifest.record_existing(rel)
+
+    # Page templates (not command templates, not vscode-settings.json)
+    if core and (core / "templates").is_dir():
+        templates_src = core / "templates"
+    else:
+        repo_root = Path(__file__).parent.parent.parent
+        templates_src = repo_root / "templates"
+
+    if templates_src.is_dir():
+        dest_templates = project_path / ".specify" / "templates"
+        dest_templates.mkdir(parents=True, exist_ok=True)
+        for f in templates_src.iterdir():
+            if f.is_file() and f.name != "vscode-settings.json" and not f.name.startswith("."):
+                dst = dest_templates / f.name
+                if dst.exists():
+                    skipped_files.append(str(dst.relative_to(project_path)))
+                else:
+                    shutil.copy2(f, dst)
+                    rel = dst.relative_to(project_path).as_posix()
+                    manifest.record_existing(rel)
+
+    if skipped_files:
+        import logging
+        logging.getLogger(__name__).warning(
+            "The following shared files already exist and were not overwritten:\n%s",
+            "\n".join(f"  {f}" for f in skipped_files),
+        )
+
+    manifest.save()
+    return True
+
+
 def scaffold_from_core_pack(
     project_path: Path,
     ai_assistant: str,
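The merge-without-overwrite copy in `_install_shared_infra()` above can be exercised in isolation. This is a minimal standalone sketch (the `merge_copy` helper is hypothetical, not project code) of the same rule: files already present in the destination are skipped and reported, everything else is copied.

```python
import shutil
import tempfile
from pathlib import Path

def merge_copy(src_dir: Path, dst_dir: Path) -> tuple[list[str], list[str]]:
    """Copy files from src_dir into dst_dir without overwriting; return (copied, skipped)."""
    copied, skipped = [], []
    for src in src_dir.rglob("*"):
        if src.is_file():
            dst = dst_dir / src.relative_to(src_dir)
            if dst.exists():
                skipped.append(dst.name)   # pre-existing: preserve user's version
            else:
                dst.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(src, dst)
                copied.append(dst.name)
    return copied, skipped

with tempfile.TemporaryDirectory() as tmp:
    src, dst = Path(tmp, "src"), Path(tmp, "dst")
    src.mkdir(); dst.mkdir()
    (src / "common.sh").write_text("new")
    (src / "plan.sh").write_text("new")
    (dst / "common.sh").write_text("user-modified")   # must survive the merge
    copied, skipped = merge_copy(src, dst)
    assert (dst / "common.sh").read_text() == "user-modified"
    print(copied, skipped)  # → ['plan.sh'] ['common.sh']
```

This is the behavior the "don't overwrite pre-existing shared scripts or templates" fix in the commit message describes; the real function additionally records each copied file in the manifest.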
@@ -1828,6 +1906,7 @@ def init(
     offline: bool = typer.Option(False, "--offline", help="Use assets bundled in the specify-cli package instead of downloading from GitHub (no network access required). Bundled assets will become the default in v0.6.0 and this flag will be removed."),
     preset: str = typer.Option(None, "--preset", help="Install a preset during initialization (by preset ID)"),
     branch_numbering: str = typer.Option(None, "--branch-numbering", help="Branch numbering strategy: 'sequential' (001, 002, ...) or 'timestamp' (YYYYMMDD-HHMMSS)"),
+    integration: str = typer.Option(None, "--integration", help="Use the new integration system (e.g. --integration copilot). Mutually exclusive with --ai."),
 ):
     """
     Initialize a new Specify project.
@@ -1889,6 +1968,35 @@ def init(
     if ai_assistant:
         ai_assistant = AI_ASSISTANT_ALIASES.get(ai_assistant, ai_assistant)
 
+    # --integration and --ai are mutually exclusive
+    if integration and ai_assistant:
+        console.print("[red]Error:[/red] --integration and --ai are mutually exclusive")
+        console.print("[yellow]Use:[/yellow] --integration for the new integration system, or --ai for the legacy path")
+        raise typer.Exit(1)
+
+    # Auto-promote: --ai copilot → integration path with a nudge
+    use_integration = False
+    if integration:
+        from .integrations import INTEGRATION_REGISTRY, get_integration
+        resolved_integration = get_integration(integration)
+        if not resolved_integration:
+            console.print(f"[red]Error:[/red] Unknown integration: '{integration}'")
+            available = ", ".join(sorted(INTEGRATION_REGISTRY))
+            console.print(f"[yellow]Available integrations:[/yellow] {available}")
+            raise typer.Exit(1)
+        use_integration = True
+        # Map integration key to the ai_assistant variable for downstream compatibility
+        ai_assistant = integration
+    elif ai_assistant == "copilot":
+        from .integrations import get_integration
+        resolved_integration = get_integration("copilot")
+        if resolved_integration:
+            use_integration = True
+            console.print(
+                "[dim]Tip: Use [bold]--integration copilot[/bold] instead of "
+                "--ai copilot. The --ai flag will be deprecated in a future release.[/dim]"
+            )
+
     if project_name == ".":
         here = True
         project_name = None  # Clear project_name to use existing validation logic
@@ -2057,7 +2165,10 @@ def init(
         "This will become the default in v0.6.0."
     )
 
-    if use_github:
+    if use_integration:
+        tracker.add("integration", "Install integration")
+        tracker.add("shared-infra", "Install shared infrastructure")
+    elif use_github:
         for key, label in [
             ("fetch", "Fetch latest release"),
             ("download", "Download template"),
@@ -2092,7 +2203,39 @@ def init(
     verify = not skip_tls
     local_ssl_context = ssl_context if verify else False
 
-    if use_github:
+    if use_integration:
+        # Integration-based scaffolding (new path)
+        from .integrations.manifest import IntegrationManifest
+        tracker.start("integration")
+        manifest = IntegrationManifest(
+            resolved_integration.key, project_path, version=get_speckit_version()
+        )
+        resolved_integration.setup(
+            project_path, manifest,
+            script_type=selected_script,
+        )
+        manifest.save()
+
+        # Write .specify/integration.json
+        script_ext = "sh" if selected_script == "sh" else "ps1"
+        integration_json = project_path / ".specify" / "integration.json"
+        integration_json.parent.mkdir(parents=True, exist_ok=True)
+        integration_json.write_text(json.dumps({
+            "integration": resolved_integration.key,
+            "version": get_speckit_version(),
+            "scripts": {
+                "update-context": f".specify/integrations/{resolved_integration.key}/scripts/update-context.{script_ext}",
+            },
+        }, indent=2) + "\n", encoding="utf-8")
+
+        tracker.complete("integration", resolved_integration.config.get("name", resolved_integration.key))
+
+        # Install shared infrastructure (scripts, templates)
+        tracker.start("shared-infra")
+        _install_shared_infra(project_path, selected_script, tracker=tracker)
+        tracker.complete("shared-infra", f"scripts ({selected_script}) + templates")
+
+    elif use_github:
         with httpx.Client(verify=local_ssl_context) as local_client:
             download_and_extract_template(
                 project_path,
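The `.specify/integration.json` payload written above is a small, fixed shape. A standalone sketch that mirrors the `json.dumps` call from the diff (the `integration_payload` helper and the "0.5.0" version string are illustrative, not project code):

```python
import json

def integration_payload(key: str, version: str, script_type: str) -> str:
    """Build the integration.json text for an integration, mirroring the diff above."""
    ext = "sh" if script_type == "sh" else "ps1"
    return json.dumps({
        "integration": key,
        "version": version,
        "scripts": {
            # Path where the integration's update-context script was installed
            "update-context": f".specify/integrations/{key}/scripts/update-context.{ext}",
        },
    }, indent=2) + "\n"

print(integration_payload("copilot", "0.5.0", "sh"))
```

For a copilot project initialized with `--script sh`, this yields an `update-context` entry pointing at `.specify/integrations/copilot/scripts/update-context.sh`.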
@@ -2227,7 +2370,7 @@ def init(
     # Persist the CLI options so later operations (e.g. preset add)
     # can adapt their behaviour without re-scanning the filesystem.
     # Must be saved BEFORE preset install so _get_skills_dir() works.
-    save_init_options(project_path, {
+    init_opts = {
         "ai": selected_ai,
         "ai_skills": ai_skills,
         "ai_commands_dir": ai_commands_dir,
@@ -2237,7 +2380,10 @@ def init(
         "offline": offline,
         "script": selected_script,
         "speckit_version": get_speckit_version(),
-    })
+    }
+    if use_integration:
+        init_opts["integration"] = resolved_integration.key
+    save_init_options(project_path, init_opts)
 
     # Install preset if specified
     if preset:
@@ -32,3 +32,15 @@ def _register(integration: IntegrationBase) -> None:
 def get_integration(key: str) -> IntegrationBase | None:
     """Return the integration for *key*, or ``None`` if not registered."""
     return INTEGRATION_REGISTRY.get(key)
+
+
+# -- Register built-in integrations --------------------------------------
+
+def _register_builtins() -> None:
+    """Register all built-in integrations."""
+    from .copilot import CopilotIntegration
+
+    _register(CopilotIntegration())
+
+
+_register_builtins()
@@ -9,6 +9,7 @@ Provides:
 
 from __future__ import annotations
 
+import re
 import shutil
 from abc import ABC
 from dataclasses import dataclass
@@ -84,35 +85,65 @@ class IntegrationBase(ABC):
         """Return options this integration accepts. Default: none."""
         return []
 
-    def templates_dir(self) -> Path:
-        """Return the path to this integration's bundled templates.
-
-        By convention, templates live in a ``templates/`` subdirectory
-        next to the file where the integration class is defined.
-        """
-        import inspect
-
-        module_file = inspect.getfile(type(self))
-        return Path(module_file).resolve().parent / "templates"
-
-    def setup(
-        self,
-        project_root: Path,
-        manifest: IntegrationManifest,
-        parsed_options: dict[str, Any] | None = None,
-        **opts: Any,
-    ) -> list[Path]:
-        """Install integration files into *project_root*.
-
-        Returns the list of files created. The default implementation
-        copies every file from ``templates_dir()`` into the commands
-        directory derived from ``config``, recording each in *manifest*.
-        """
-        created: list[Path] = []
-        tpl_dir = self.templates_dir()
-        if not tpl_dir.is_dir():
-            return created
+    # -- Primitives — building blocks for setup() -------------------------
+
+    def shared_commands_dir(self) -> Path | None:
+        """Return path to the shared command templates directory.
+
+        Checks ``core_pack/commands/`` (wheel install) first, then
+        ``templates/commands/`` (source checkout). Returns ``None``
+        if neither exists.
+        """
+        import inspect
+
+        pkg_dir = Path(inspect.getfile(IntegrationBase)).resolve().parent.parent
+        for candidate in [
+            pkg_dir / "core_pack" / "commands",
+            pkg_dir.parent.parent / "templates" / "commands",
+        ]:
+            if candidate.is_dir():
+                return candidate
+        return None
+
+    def shared_templates_dir(self) -> Path | None:
+        """Return path to the shared page templates directory.
+
+        Contains ``vscode-settings.json``, ``spec-template.md``, etc.
+        Checks ``core_pack/templates/`` then ``templates/``.
+        """
+        import inspect
+
+        pkg_dir = Path(inspect.getfile(IntegrationBase)).resolve().parent.parent
+        for candidate in [
+            pkg_dir / "core_pack" / "templates",
+            pkg_dir.parent.parent / "templates",
+        ]:
+            if candidate.is_dir():
+                return candidate
+        return None
+
+    def list_command_templates(self) -> list[Path]:
+        """Return sorted list of command template files from the shared directory."""
+        cmd_dir = self.shared_commands_dir()
+        if not cmd_dir or not cmd_dir.is_dir():
+            return []
+        return sorted(f for f in cmd_dir.iterdir() if f.is_file() and f.suffix == ".md")
+
+    def command_filename(self, template_name: str) -> str:
+        """Return the destination filename for a command template.
+
+        *template_name* is the stem of the source file (e.g. ``"plan"``).
+        Default: ``speckit.{template_name}.md``. Subclasses override
+        to change the extension or naming convention.
+        """
+        return f"speckit.{template_name}.md"
+
+    def commands_dest(self, project_root: Path) -> Path:
+        """Return the absolute path to the commands output directory.
+
+        Derived from ``config["folder"]`` and ``config["commands_subdir"]``.
+        Raises ``ValueError`` if ``config`` or ``folder`` is missing.
+        """
         if not self.config:
             raise ValueError(
                 f"{type(self).__name__}.config is not set; integration "
@@ -123,6 +154,179 @@ class IntegrationBase(ABC):
             raise ValueError(
                 f"{type(self).__name__}.config is missing required 'folder' entry."
             )
+        subdir = self.config.get("commands_subdir", "commands")
+        return project_root / folder / subdir
+
+    # -- File operations — granular primitives for setup() ----------------
+
+    @staticmethod
+    def copy_command_to_directory(
+        src: Path,
+        dest_dir: Path,
+        filename: str,
+    ) -> Path:
+        """Copy a command template to *dest_dir* with the given *filename*.
+
+        Creates *dest_dir* if needed. Returns the absolute path of the
+        written file. The caller can post-process the file before
+        recording it in the manifest.
+        """
+        dest_dir.mkdir(parents=True, exist_ok=True)
+        dst = dest_dir / filename
+        shutil.copy2(src, dst)
+        return dst
+
+    @staticmethod
+    def record_file_in_manifest(
+        file_path: Path,
+        project_root: Path,
+        manifest: IntegrationManifest,
+    ) -> None:
+        """Hash *file_path* and record it in *manifest*.
+
+        *file_path* must be inside *project_root*.
+        """
+        rel = file_path.resolve().relative_to(project_root.resolve())
+        manifest.record_existing(rel)
+
+    @staticmethod
+    def write_file_and_record(
+        content: str,
+        dest: Path,
+        project_root: Path,
+        manifest: IntegrationManifest,
+    ) -> Path:
+        """Write *content* to *dest*, hash it, and record in *manifest*.
+
+        Creates parent directories as needed. Returns *dest*.
+        """
+        dest.parent.mkdir(parents=True, exist_ok=True)
+        dest.write_text(content, encoding="utf-8")
+        rel = dest.resolve().relative_to(project_root.resolve())
+        manifest.record_existing(rel)
+        return dest
+
+    @staticmethod
+    def process_template(
+        content: str,
+        agent_name: str,
+        script_type: str,
+        arg_placeholder: str = "$ARGUMENTS",
+    ) -> str:
+        """Process a raw command template into agent-ready content.
+
+        Performs the same transformations as the release script:
+        1. Extract ``scripts.<script_type>`` value from YAML frontmatter
+        2. Replace ``{SCRIPT}`` with the extracted script command
+        3. Extract ``agent_scripts.<script_type>`` and replace ``{AGENT_SCRIPT}``
+        4. Strip ``scripts:`` and ``agent_scripts:`` sections from frontmatter
+        5. Replace ``{ARGS}`` with *arg_placeholder*
+        6. Replace ``__AGENT__`` with *agent_name*
+        7. Rewrite paths: ``scripts/`` → ``.specify/scripts/`` etc.
+        """
+        # 1. Extract script command from frontmatter
+        script_command = ""
+        script_pattern = re.compile(
+            rf"^\s*{re.escape(script_type)}:\s*(.+)$", re.MULTILINE
+        )
+        # Find the scripts: block
+        in_scripts = False
+        for line in content.splitlines():
+            if line.strip() == "scripts:":
+                in_scripts = True
+                continue
+            if in_scripts and line and not line[0].isspace():
+                in_scripts = False
+            if in_scripts:
+                m = script_pattern.match(line)
+                if m:
+                    script_command = m.group(1).strip()
+                    break
+
+        # 2. Replace {SCRIPT}
+        if script_command:
+            content = content.replace("{SCRIPT}", script_command)
+
+        # 3. Extract agent_script command
+        agent_script_command = ""
+        in_agent_scripts = False
+        for line in content.splitlines():
+            if line.strip() == "agent_scripts:":
+                in_agent_scripts = True
+                continue
+            if in_agent_scripts and line and not line[0].isspace():
+                in_agent_scripts = False
+            if in_agent_scripts:
+                m = script_pattern.match(line)
+                if m:
+                    agent_script_command = m.group(1).strip()
+                    break
+
+        if agent_script_command:
+            content = content.replace("{AGENT_SCRIPT}", agent_script_command)
+
+        # 4. Strip scripts: and agent_scripts: sections from frontmatter
+        lines = content.splitlines(keepends=True)
+        output_lines: list[str] = []
+        in_frontmatter = False
+        skip_section = False
+        dash_count = 0
+        for line in lines:
+            stripped = line.rstrip("\n\r")
+            if stripped == "---":
+                dash_count += 1
+                if dash_count == 1:
+                    in_frontmatter = True
+                else:
+                    in_frontmatter = False
+                    skip_section = False
+                output_lines.append(line)
+                continue
+            if in_frontmatter:
+                if stripped in ("scripts:", "agent_scripts:"):
+                    skip_section = True
+                    continue
+                if skip_section:
+                    if line[0:1].isspace():
+                        continue  # skip indented content under scripts/agent_scripts
+                    skip_section = False
+            output_lines.append(line)
+        content = "".join(output_lines)
+
+        # 5. Replace {ARGS}
+        content = content.replace("{ARGS}", arg_placeholder)
+
+        # 6. Replace __AGENT__
+        content = content.replace("__AGENT__", agent_name)
+
+        # 7. Rewrite paths (matches release script's rewrite_paths())
+        content = re.sub(r"(/?)memory/", r".specify/memory/", content)
+        content = re.sub(r"(/?)scripts/", r".specify/scripts/", content)
+        content = re.sub(r"(/?)templates/", r".specify/templates/", content)
+        # Fix double-prefix (same as release script's .specify.specify/ fix)
+        content = content.replace(".specify.specify/", ".specify/")
+        content = content.replace(".specify/.specify/", ".specify/")
+
+        return content
+
+    def setup(
+        self,
+        project_root: Path,
+        manifest: IntegrationManifest,
+        parsed_options: dict[str, Any] | None = None,
+        **opts: Any,
+    ) -> list[Path]:
+        """Install integration command files into *project_root*.
+
+        Returns the list of files created. Copies raw templates without
+        processing. Integrations that need placeholder replacement
+        (e.g. ``{SCRIPT}``, ``__AGENT__``) should override ``setup()``
+        and call ``process_template()`` in their own loop — see
+        ``CopilotIntegration`` for an example.
+        """
+        templates = self.list_command_templates()
+        if not templates:
+            return []
+
         project_root_resolved = project_root.resolve()
         if manifest.project_root != project_root_resolved:
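Step 7 of `process_template()` above is plain regex substitution; extracted into a standalone sketch (using the same regexes as the diff, outside the class — `rewrite_paths` itself is just an illustrative name borrowed from the release script):

```python
import re

def rewrite_paths(content: str) -> str:
    # Same substitutions as process_template step 7
    content = re.sub(r"(/?)memory/", r".specify/memory/", content)
    content = re.sub(r"(/?)scripts/", r".specify/scripts/", content)
    content = re.sub(r"(/?)templates/", r".specify/templates/", content)
    # Double-prefix cleanup, matching the release script's safety fix
    content = content.replace(".specify.specify/", ".specify/")
    content = content.replace(".specify/.specify/", ".specify/")
    return content

print(rewrite_paths("Run scripts/bash/check-prerequisites.sh against templates/plan-template.md"))
# → Run .specify/scripts/bash/check-prerequisites.sh against .specify/templates/plan-template.md
```

Each substitution makes one pass over the string, so a freshly inserted `.specify/scripts/` prefix is not itself re-matched; the two trailing `replace()` calls mop up any double prefix that does slip through.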
@@ -130,9 +334,8 @@ class IntegrationBase(ABC):
                 f"manifest.project_root ({manifest.project_root}) does not match "
                 f"project_root ({project_root_resolved})"
             )
-        subdir = self.config.get("commands_subdir", "commands")
-        dest = (project_root / folder / subdir).resolve()
         # Ensure destination stays within the project root
+        dest = self.commands_dest(project_root).resolve()
         try:
             dest.relative_to(project_root_resolved)
         except ValueError as exc:
@@ -141,16 +344,13 @@
                 f"project root {project_root_resolved}"
             ) from exc
 
         dest.mkdir(parents=True, exist_ok=True)
         created: list[Path] = []
 
-        for src_file in sorted(tpl_dir.iterdir()):
-            if src_file.is_file():
-                dst_file = dest / src_file.name
-                dst_resolved = dst_file.resolve()
-                rel = dst_resolved.relative_to(project_root_resolved)
-                shutil.copy2(src_file, dst_file)
-                manifest.record_existing(rel)
-                created.append(dst_file)
+        for src_file in templates:
+            dst_name = self.command_filename(src_file.stem)
+            dst_file = self.copy_command_to_directory(src_file, dest, dst_name)
+            self.record_file_in_manifest(dst_file, project_root, manifest)
+            created.append(dst_file)
 
         return created
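The naming hook used in the loop above is the only difference between the base class and Copilot's command files. A side-by-side sketch of the two `command_filename` behaviors, as plain functions (standalone illustrations of the methods shown in this diff, not the project classes themselves):

```python
def default_command_filename(stem: str) -> str:
    # IntegrationBase default: plain markdown commands
    return f"speckit.{stem}.md"

def copilot_command_filename(stem: str) -> str:
    # CopilotIntegration override: VS Code Copilot expects .agent.md
    return f"speckit.{stem}.agent.md"

print(default_command_filename("plan"))   # → speckit.plan.md
print(copilot_command_filename("plan"))   # → speckit.plan.agent.md
```

Because `setup()` asks `self.command_filename(src_file.stem)` for every template, an integration changes its on-disk naming convention by overriding this one method rather than reimplementing the copy loop.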
197  src/specify_cli/integrations/copilot/__init__.py  Normal file
@@ -0,0 +1,197 @@
"""Copilot integration — GitHub Copilot in VS Code.
|
||||
|
||||
Copilot has several unique behaviors compared to standard markdown agents:
|
||||
- Commands use ``.agent.md`` extension (not ``.md``)
|
||||
- Each command gets a companion ``.prompt.md`` file in ``.github/prompts/``
|
||||
- Installs ``.vscode/settings.json`` with prompt file recommendations
|
||||
- Context file lives at ``.github/copilot-instructions.md``
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import json
|
||||
import shutil
|
||||
from pathlib import Path
|
||||
from typing import Any
|
||||
|
||||
from ..base import IntegrationBase
|
||||
from ..manifest import IntegrationManifest
|
||||
|
||||
|
||||
class CopilotIntegration(IntegrationBase):
|
||||
"""Integration for GitHub Copilot in VS Code."""
|
||||
|
||||
key = "copilot"
|
||||
config = {
|
||||
"name": "GitHub Copilot",
|
||||
"folder": ".github/",
|
||||
"commands_subdir": "agents",
|
||||
"install_url": None,
|
||||
"requires_cli": False,
|
||||
}
|
||||
registrar_config = {
|
||||
"dir": ".github/agents",
|
||||
"format": "markdown",
|
||||
"args": "$ARGUMENTS",
|
||||
"extension": ".agent.md",
|
||||
}
|
||||
context_file = ".github/copilot-instructions.md"
|
||||
|
||||
def command_filename(self, template_name: str) -> str:
|
||||
"""Copilot commands use ``.agent.md`` extension."""
|
||||
return f"speckit.{template_name}.agent.md"
|
||||
|
||||
def setup(
|
||||
self,
|
||||
project_root: Path,
|
||||
manifest: IntegrationManifest,
|
||||
parsed_options: dict[str, Any] | None = None,
|
||||
**opts: Any,
|
||||
) -> list[Path]:
|
||||
"""Install copilot commands, companion prompts, and VS Code settings.
|
||||
|
||||
Uses base class primitives to: read templates, process them
|
||||
(replace placeholders, strip script blocks, rewrite paths),
|
||||
write as ``.agent.md``, then add companion prompts and VS Code settings.
|
||||
"""
|
||||
project_root_resolved = project_root.resolve()
|
||||
if manifest.project_root != project_root_resolved:
|
||||
raise ValueError(
|
||||
                f"manifest.project_root ({manifest.project_root}) does not match "
                f"project_root ({project_root_resolved})"
            )

        templates = self.list_command_templates()
        if not templates:
            return []

        dest = self.commands_dest(project_root)
        dest_resolved = dest.resolve()
        try:
            dest_resolved.relative_to(project_root_resolved)
        except ValueError as exc:
            raise ValueError(
                f"Integration destination {dest_resolved} escapes "
                f"project root {project_root_resolved}"
            ) from exc
        dest.mkdir(parents=True, exist_ok=True)
        created: list[Path] = []

        script_type = opts.get("script_type", "sh")
        arg_placeholder = self.registrar_config.get("args", "$ARGUMENTS")

        # 1. Process and write command files as .agent.md
        for src_file in templates:
            raw = src_file.read_text(encoding="utf-8")
            processed = self.process_template(raw, self.key, script_type, arg_placeholder)
            dst_name = self.command_filename(src_file.stem)
            dst_file = self.write_file_and_record(
                processed, dest / dst_name, project_root, manifest
            )
            created.append(dst_file)

        # 2. Generate companion .prompt.md files from the templates we just wrote
        prompts_dir = project_root / ".github" / "prompts"
        for src_file in templates:
            cmd_name = f"speckit.{src_file.stem}"
            prompt_content = f"---\nagent: {cmd_name}\n---\n"
            prompt_file = self.write_file_and_record(
                prompt_content,
                prompts_dir / f"{cmd_name}.prompt.md",
                project_root,
                manifest,
            )
            created.append(prompt_file)

        # 3. Write .vscode/settings.json
        settings_src = self._vscode_settings_path()
        if settings_src and settings_src.is_file():
            dst_settings = project_root / ".vscode" / "settings.json"
            dst_settings.parent.mkdir(parents=True, exist_ok=True)
            if dst_settings.exists():
                # Merge into existing — don't track since we can't safely
                # remove the user's settings file on uninstall.
                self._merge_vscode_settings(settings_src, dst_settings)
            else:
                shutil.copy2(settings_src, dst_settings)
                self.record_file_in_manifest(dst_settings, project_root, manifest)
                created.append(dst_settings)

        # 4. Install integration-specific update-context scripts
        scripts_src = Path(__file__).resolve().parent / "scripts"
        if scripts_src.is_dir():
            scripts_dest = project_root / ".specify" / "integrations" / "copilot" / "scripts"
            scripts_dest.mkdir(parents=True, exist_ok=True)
            for src_script in sorted(scripts_src.iterdir()):
                if src_script.is_file():
                    dst_script = scripts_dest / src_script.name
                    shutil.copy2(src_script, dst_script)
                    # Make shell scripts executable
                    if dst_script.suffix == ".sh":
                        dst_script.chmod(dst_script.stat().st_mode | 0o111)
                    self.record_file_in_manifest(dst_script, project_root, manifest)
                    created.append(dst_script)

        return created

    def _vscode_settings_path(self) -> Path | None:
        """Return the path to the bundled vscode-settings.json template."""
        tpl_dir = self.shared_templates_dir()
        if tpl_dir:
            candidate = tpl_dir / "vscode-settings.json"
            if candidate.is_file():
                return candidate
        return None

    @staticmethod
    def _merge_vscode_settings(src: Path, dst: Path) -> None:
        """Merge settings from *src* into the existing *dst* JSON file.

        Top-level keys from *src* are added only if missing in *dst*.
        For dict-valued keys, sub-keys are merged the same way.

        If *dst* cannot be parsed (e.g. JSONC with comments), the merge
        is skipped to avoid overwriting user settings.
        """
        try:
            existing = json.loads(dst.read_text(encoding="utf-8"))
        except (json.JSONDecodeError, OSError):
            # Cannot parse the existing file (likely JSONC with comments).
            # Skip the merge to preserve the user's settings, but show
            # what they should add manually.
            import logging

            template_content = src.read_text(encoding="utf-8")
            logging.getLogger(__name__).warning(
                "Could not parse %s (may contain JSONC comments). "
                "Skipping settings merge to preserve existing file.\n"
                "Please add the following settings manually:\n%s",
                dst,
                template_content,
            )
            return

        new_settings = json.loads(src.read_text(encoding="utf-8"))

        if not isinstance(existing, dict) or not isinstance(new_settings, dict):
            import logging

            logging.getLogger(__name__).warning(
                "Skipping settings merge: %s or template is not a JSON object.", dst
            )
            return

        changed = False
        for key, value in new_settings.items():
            if key not in existing:
                existing[key] = value
                changed = True
            elif isinstance(existing[key], dict) and isinstance(value, dict):
                for sub_key, sub_value in value.items():
                    if sub_key not in existing[key]:
                        existing[key][sub_key] = sub_value
                        changed = True

        if not changed:
            return

        dst.write_text(json.dumps(existing, indent=4) + "\n", encoding="utf-8")
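The merge rule implemented by `_merge_vscode_settings` — template keys are added only where the user has no value, one level deep for dict-valued settings — can be shown with a standalone sketch. `merge_settings` and the example keys are hypothetical illustrations, not part of the integration API:

```python
import json
import tempfile
from pathlib import Path


def merge_settings(src: Path, dst: Path) -> None:
    # Add-if-missing at the top level; one level deeper for dict values.
    # Values already in dst always win over template values from src.
    existing = json.loads(dst.read_text(encoding="utf-8"))
    template = json.loads(src.read_text(encoding="utf-8"))
    for key, value in template.items():
        if key not in existing:
            existing[key] = value
        elif isinstance(existing[key], dict) and isinstance(value, dict):
            for sub_key, sub_value in value.items():
                existing[key].setdefault(sub_key, sub_value)
    dst.write_text(json.dumps(existing, indent=4) + "\n", encoding="utf-8")


with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "template.json"
    dst = Path(tmp) / "settings.json"
    # Illustrative keys only, not real VS Code settings.
    src.write_text(json.dumps({"a.topLevel": 1, "b.nested": {"x": 1, "y": 2}}))
    dst.write_text(json.dumps({"b.nested": {"x": 99}}))
    merge_settings(src, dst)
    merged = json.loads(dst.read_text(encoding="utf-8"))

print(merged)  # user's "x" preserved, template's "y" and "a.topLevel" added
```

This is also why the merge branch skips manifest tracking: once user values and template values live in the same file, uninstall can no longer safely delete it.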
@@ -0,0 +1,22 @@
# update-context.ps1 — Copilot integration: create/update .github/copilot-instructions.md
#
# This is the copilot-specific implementation that produces the GitHub
# Copilot instructions file. The shared dispatcher reads
# .specify/integration.json and calls this script.
#
# NOTE: This script is not yet active. It will be activated in Stage 7
# when the shared update-agent-context.ps1 replaces its switch statement
# with integration.json-based dispatch. The shared script must also be
# refactored to support SPECKIT_SOURCE_ONLY (guard the Main call) before
# dot-sourcing will work.
#
# Until then, this delegates to the shared script as a subprocess.

$ErrorActionPreference = 'Stop'

$repoRoot = git rev-parse --show-toplevel 2>$null
if (-not $repoRoot) { $repoRoot = $PWD.Path }

# Invoke the shared update-agent-context script as a separate process.
# Dot-sourcing is unsafe until that script guards its Main call.
& "$repoRoot/.specify/scripts/powershell/update-agent-context.ps1" -AgentType copilot
@@ -0,0 +1,22 @@
#!/usr/bin/env bash
# update-context.sh — Copilot integration: create/update .github/copilot-instructions.md
#
# This is the copilot-specific implementation that produces the GitHub
# Copilot instructions file. The shared dispatcher reads
# .specify/integration.json and calls this script.
#
# NOTE: This script is not yet active. It will be activated in Stage 7
# when the shared update-agent-context.sh replaces its case statement
# with integration.json-based dispatch. The shared script must also be
# refactored to support SPECKIT_SOURCE_ONLY (guard the main logic)
# before sourcing will work.
#
# Until then, this delegates to the shared script as a subprocess.

set -euo pipefail

REPO_ROOT="${REPO_ROOT:-$(git rev-parse --show-toplevel 2>/dev/null || pwd)}"

# Invoke the shared update-agent-context script as a separate process.
# Sourcing is unsafe until that script guards its main logic.
exec "$REPO_ROOT/.specify/scripts/bash/update-agent-context.sh" copilot
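The comments in both wrapper scripts describe a planned Stage 7 dispatcher that reads `.specify/integration.json` and invokes the recorded script. A hypothetical Python sketch of that dispatch — the function name and the exact JSON layout are assumptions inferred from the comments and the tests below, not shipped code:

```python
import json
import os
import stat
import subprocess
import tempfile
from pathlib import Path


def dispatch_update_context(repo_root: Path) -> subprocess.CompletedProcess:
    # Look up the active integration's update-context script in
    # integration.json and run it as a subprocess, passing REPO_ROOT
    # so the wrapper doesn't need a git checkout to find the repo.
    cfg = json.loads((repo_root / ".specify" / "integration.json").read_text(encoding="utf-8"))
    script = repo_root / cfg["scripts"]["update-context"]
    return subprocess.run(
        [str(script)], check=True, env={**os.environ, "REPO_ROOT": str(repo_root)}
    )


# Usage against a throwaway layout with a stub script:
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    scripts = root / ".specify" / "integrations" / "copilot" / "scripts"
    scripts.mkdir(parents=True)
    stub = scripts / "update-context.sh"
    stub.write_text('#!/usr/bin/env bash\necho ran > "$REPO_ROOT/ran.txt"\n')
    stub.chmod(stub.stat().st_mode | stat.S_IXUSR)
    (root / ".specify" / "integration.json").write_text(json.dumps({
        "integration": "copilot",
        "scripts": {"update-context": ".specify/integrations/copilot/scripts/update-context.sh"},
    }))
    dispatch_update_context(root)
    output = (root / "ran.txt").read_text().strip()

print(output)
```

Recording script paths in `integration.json` is what lets the shared dispatcher replace its per-agent switch/case statement with a single lookup.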
0	tests/integrations/__init__.py	Normal file
23	tests/integrations/conftest.py	Normal file
@@ -0,0 +1,23 @@
"""Shared test helpers for integration tests."""

from specify_cli.integrations.base import MarkdownIntegration


class StubIntegration(MarkdownIntegration):
    """Minimal concrete integration for testing."""

    key = "stub"
    config = {
        "name": "Stub Agent",
        "folder": ".stub/",
        "commands_subdir": "commands",
        "install_url": None,
        "requires_cli": False,
    }
    registrar_config = {
        "dir": ".stub/commands",
        "format": "markdown",
        "args": "$ARGUMENTS",
        "extension": ".md",
    }
    context_file = "STUB.md"
169	tests/integrations/test_base.py	Normal file
@@ -0,0 +1,169 @@
"""Tests for IntegrationOption, IntegrationBase, MarkdownIntegration, and primitives."""

import pytest

from specify_cli.integrations.base import (
    IntegrationBase,
    IntegrationOption,
    MarkdownIntegration,
)
from specify_cli.integrations.manifest import IntegrationManifest

from .conftest import StubIntegration


class TestIntegrationOption:
    def test_defaults(self):
        opt = IntegrationOption(name="--flag")
        assert opt.name == "--flag"
        assert opt.is_flag is False
        assert opt.required is False
        assert opt.default is None
        assert opt.help == ""

    def test_flag_option(self):
        opt = IntegrationOption(name="--skills", is_flag=True, default=True, help="Enable skills")
        assert opt.is_flag is True
        assert opt.default is True
        assert opt.help == "Enable skills"

    def test_required_option(self):
        opt = IntegrationOption(name="--commands-dir", required=True, help="Dir path")
        assert opt.required is True

    def test_frozen(self):
        opt = IntegrationOption(name="--x")
        with pytest.raises(AttributeError):
            opt.name = "--y"  # type: ignore[misc]


class TestIntegrationBase:
    def test_key_and_config(self):
        i = StubIntegration()
        assert i.key == "stub"
        assert i.config["name"] == "Stub Agent"
        assert i.registrar_config["format"] == "markdown"
        assert i.context_file == "STUB.md"

    def test_options_default_empty(self):
        assert StubIntegration.options() == []

    def test_shared_commands_dir(self):
        i = StubIntegration()
        cmd_dir = i.shared_commands_dir()
        assert cmd_dir is not None
        assert cmd_dir.is_dir()

    def test_setup_uses_shared_templates(self, tmp_path):
        i = StubIntegration()
        manifest = IntegrationManifest("stub", tmp_path)
        created = i.setup(tmp_path, manifest)
        assert len(created) > 0
        for f in created:
            assert f.parent == tmp_path / ".stub" / "commands"
            assert f.name.startswith("speckit.")
            assert f.name.endswith(".md")

    def test_setup_copies_templates(self, tmp_path, monkeypatch):
        tpl = tmp_path / "_templates"
        tpl.mkdir()
        (tpl / "plan.md").write_text("plan content", encoding="utf-8")
        (tpl / "specify.md").write_text("spec content", encoding="utf-8")

        i = StubIntegration()
        monkeypatch.setattr(type(i), "list_command_templates", lambda self: sorted(tpl.glob("*.md")))

        project = tmp_path / "project"
        project.mkdir()
        created = i.setup(project, IntegrationManifest("stub", project))
        assert len(created) == 2
        assert (project / ".stub" / "commands" / "speckit.plan.md").exists()
        assert (project / ".stub" / "commands" / "speckit.specify.md").exists()

    def test_install_delegates_to_setup(self, tmp_path):
        i = StubIntegration()
        manifest = IntegrationManifest("stub", tmp_path)
        result = i.install(tmp_path, manifest)
        assert len(result) > 0

    def test_uninstall_delegates_to_teardown(self, tmp_path):
        i = StubIntegration()
        manifest = IntegrationManifest("stub", tmp_path)
        removed, skipped = i.uninstall(tmp_path, manifest)
        assert removed == []
        assert skipped == []


class TestMarkdownIntegration:
    def test_is_subclass_of_base(self):
        assert issubclass(MarkdownIntegration, IntegrationBase)

    def test_stub_is_markdown(self):
        assert isinstance(StubIntegration(), MarkdownIntegration)


class TestBasePrimitives:
    def test_shared_commands_dir_returns_path(self):
        i = StubIntegration()
        cmd_dir = i.shared_commands_dir()
        assert cmd_dir is not None
        assert cmd_dir.is_dir()

    def test_shared_templates_dir_returns_path(self):
        i = StubIntegration()
        tpl_dir = i.shared_templates_dir()
        assert tpl_dir is not None
        assert tpl_dir.is_dir()

    def test_list_command_templates_returns_md_files(self):
        i = StubIntegration()
        templates = i.list_command_templates()
        assert len(templates) > 0
        assert all(t.suffix == ".md" for t in templates)

    def test_command_filename_default(self):
        i = StubIntegration()
        assert i.command_filename("plan") == "speckit.plan.md"

    def test_commands_dest(self, tmp_path):
        i = StubIntegration()
        dest = i.commands_dest(tmp_path)
        assert dest == tmp_path / ".stub" / "commands"

    def test_commands_dest_no_config_raises(self, tmp_path):
        class NoConfig(MarkdownIntegration):
            key = "noconfig"

        with pytest.raises(ValueError, match="config is not set"):
            NoConfig().commands_dest(tmp_path)

    def test_copy_command_to_directory(self, tmp_path):
        src = tmp_path / "source.md"
        src.write_text("content", encoding="utf-8")
        dest_dir = tmp_path / "output"
        result = IntegrationBase.copy_command_to_directory(src, dest_dir, "speckit.plan.md")
        assert result == dest_dir / "speckit.plan.md"
        assert result.read_text(encoding="utf-8") == "content"

    def test_record_file_in_manifest(self, tmp_path):
        f = tmp_path / "f.txt"
        f.write_text("hello", encoding="utf-8")
        m = IntegrationManifest("test", tmp_path)
        IntegrationBase.record_file_in_manifest(f, tmp_path, m)
        assert "f.txt" in m.files

    def test_write_file_and_record(self, tmp_path):
        m = IntegrationManifest("test", tmp_path)
        dest = tmp_path / "sub" / "f.txt"
        result = IntegrationBase.write_file_and_record("content", dest, tmp_path, m)
        assert result == dest
        assert dest.read_text(encoding="utf-8") == "content"
        assert "sub/f.txt" in m.files

    def test_setup_copies_shared_templates(self, tmp_path):
        i = StubIntegration()
        m = IntegrationManifest("stub", tmp_path)
        created = i.setup(tmp_path, m)
        assert len(created) > 0
        for f in created:
            assert f.parent.name == "commands"
            assert f.name.startswith("speckit.")
            assert f.name.endswith(".md")
122	tests/integrations/test_cli.py	Normal file
@@ -0,0 +1,122 @@
"""Tests for --integration flag on specify init (CLI-level)."""

import json
import os

import pytest


class TestInitIntegrationFlag:
    def test_integration_and_ai_mutually_exclusive(self):
        from typer.testing import CliRunner

        from specify_cli import app

        runner = CliRunner()
        result = runner.invoke(app, [
            "init", "test-project", "--ai", "claude", "--integration", "copilot",
        ])
        assert result.exit_code != 0
        assert "mutually exclusive" in result.output

    def test_unknown_integration_rejected(self):
        from typer.testing import CliRunner

        from specify_cli import app

        runner = CliRunner()
        result = runner.invoke(app, [
            "init", "test-project", "--integration", "nonexistent",
        ])
        assert result.exit_code != 0
        assert "Unknown integration" in result.output

    def test_integration_copilot_creates_files(self, tmp_path):
        from typer.testing import CliRunner

        from specify_cli import app

        runner = CliRunner()
        project = tmp_path / "int-test"
        project.mkdir()
        old_cwd = os.getcwd()
        try:
            os.chdir(project)
            result = runner.invoke(app, [
                "init", "--here", "--integration", "copilot", "--script", "sh", "--no-git",
            ], catch_exceptions=False)
        finally:
            os.chdir(old_cwd)
        assert result.exit_code == 0, f"init failed: {result.output}"
        assert (project / ".github" / "agents" / "speckit.plan.agent.md").exists()
        assert (project / ".github" / "prompts" / "speckit.plan.prompt.md").exists()
        assert (project / ".specify" / "scripts" / "bash" / "common.sh").exists()

        data = json.loads((project / ".specify" / "integration.json").read_text(encoding="utf-8"))
        assert data["integration"] == "copilot"
        assert "scripts" in data
        assert "update-context" in data["scripts"]

        opts = json.loads((project / ".specify" / "init-options.json").read_text(encoding="utf-8"))
        assert opts["integration"] == "copilot"

        assert (project / ".specify" / "integrations" / "copilot.manifest.json").exists()
        assert (project / ".specify" / "integrations" / "copilot" / "scripts" / "update-context.sh").exists()

        shared_manifest = project / ".specify" / "integrations" / "speckit.manifest.json"
        assert shared_manifest.exists()

    def test_ai_copilot_auto_promotes(self, tmp_path):
        from typer.testing import CliRunner

        from specify_cli import app

        project = tmp_path / "promote-test"
        project.mkdir()
        old_cwd = os.getcwd()
        try:
            os.chdir(project)
            runner = CliRunner()
            result = runner.invoke(app, [
                "init", "--here", "--ai", "copilot", "--script", "sh", "--no-git",
            ], catch_exceptions=False)
        finally:
            os.chdir(old_cwd)
        assert result.exit_code == 0
        assert "--integration copilot" in result.output
        assert (project / ".github" / "agents" / "speckit.plan.agent.md").exists()

    def test_shared_infra_skips_existing_files(self, tmp_path):
        """Pre-existing shared files are not overwritten by _install_shared_infra."""
        from typer.testing import CliRunner

        from specify_cli import app

        project = tmp_path / "skip-test"
        project.mkdir()

        # Pre-create a shared script with custom content
        scripts_dir = project / ".specify" / "scripts" / "bash"
        scripts_dir.mkdir(parents=True)
        custom_content = "# user-modified common.sh\n"
        (scripts_dir / "common.sh").write_text(custom_content, encoding="utf-8")

        # Pre-create a shared template with custom content
        templates_dir = project / ".specify" / "templates"
        templates_dir.mkdir(parents=True)
        custom_template = "# user-modified spec-template\n"
        (templates_dir / "spec-template.md").write_text(custom_template, encoding="utf-8")

        old_cwd = os.getcwd()
        try:
            os.chdir(project)
            runner = CliRunner()
            result = runner.invoke(app, [
                "init", "--here", "--force",
                "--integration", "copilot",
                "--script", "sh",
                "--no-git",
            ], catch_exceptions=False)
        finally:
            os.chdir(old_cwd)

        assert result.exit_code == 0

        # User's files should be preserved
        assert (scripts_dir / "common.sh").read_text(encoding="utf-8") == custom_content
        assert (templates_dir / "spec-template.md").read_text(encoding="utf-8") == custom_template

        # Other shared files should still be installed
        assert (scripts_dir / "setup-plan.sh").exists()
        assert (templates_dir / "plan-template.md").exists()
266	tests/integrations/test_copilot.py	Normal file
@@ -0,0 +1,266 @@
"""Tests for CopilotIntegration."""

import json
import os

from specify_cli.integrations import get_integration
from specify_cli.integrations.manifest import IntegrationManifest


class TestCopilotIntegration:
    def test_copilot_key_and_config(self):
        copilot = get_integration("copilot")
        assert copilot is not None
        assert copilot.key == "copilot"
        assert copilot.config["folder"] == ".github/"
        assert copilot.config["commands_subdir"] == "agents"
        assert copilot.registrar_config["extension"] == ".agent.md"
        assert copilot.context_file == ".github/copilot-instructions.md"

    def test_command_filename_agent_md(self):
        copilot = get_integration("copilot")
        assert copilot.command_filename("plan") == "speckit.plan.agent.md"

    def test_setup_creates_agent_md_files(self, tmp_path):
        from specify_cli.integrations.copilot import CopilotIntegration

        copilot = CopilotIntegration()
        m = IntegrationManifest("copilot", tmp_path)
        created = copilot.setup(tmp_path, m)
        assert len(created) > 0
        agent_files = [f for f in created if ".agent." in f.name]
        assert len(agent_files) > 0
        for f in agent_files:
            assert f.parent == tmp_path / ".github" / "agents"
            assert f.name.endswith(".agent.md")

    def test_setup_creates_companion_prompts(self, tmp_path):
        from specify_cli.integrations.copilot import CopilotIntegration

        copilot = CopilotIntegration()
        m = IntegrationManifest("copilot", tmp_path)
        created = copilot.setup(tmp_path, m)
        prompt_files = [f for f in created if f.parent.name == "prompts"]
        assert len(prompt_files) > 0
        for f in prompt_files:
            assert f.name.endswith(".prompt.md")
            content = f.read_text(encoding="utf-8")
            assert content.startswith("---\nagent: speckit.")

    def test_agent_and_prompt_counts_match(self, tmp_path):
        from specify_cli.integrations.copilot import CopilotIntegration

        copilot = CopilotIntegration()
        m = IntegrationManifest("copilot", tmp_path)
        created = copilot.setup(tmp_path, m)
        agents = [f for f in created if ".agent.md" in f.name]
        prompts = [f for f in created if ".prompt.md" in f.name]
        assert len(agents) == len(prompts)

    def test_setup_creates_vscode_settings_new(self, tmp_path):
        from specify_cli.integrations.copilot import CopilotIntegration

        copilot = CopilotIntegration()
        assert copilot._vscode_settings_path() is not None
        m = IntegrationManifest("copilot", tmp_path)
        created = copilot.setup(tmp_path, m)
        settings = tmp_path / ".vscode" / "settings.json"
        assert settings.exists()
        assert settings in created
        assert any("settings.json" in k for k in m.files)

    def test_setup_merges_existing_vscode_settings(self, tmp_path):
        from specify_cli.integrations.copilot import CopilotIntegration

        copilot = CopilotIntegration()
        vscode_dir = tmp_path / ".vscode"
        vscode_dir.mkdir(parents=True)
        existing = {"editor.fontSize": 14, "custom.setting": True}
        (vscode_dir / "settings.json").write_text(json.dumps(existing, indent=4), encoding="utf-8")
        m = IntegrationManifest("copilot", tmp_path)
        created = copilot.setup(tmp_path, m)
        settings = tmp_path / ".vscode" / "settings.json"
        data = json.loads(settings.read_text(encoding="utf-8"))
        assert data["editor.fontSize"] == 14
        assert data["custom.setting"] is True
        assert settings not in created
        assert not any("settings.json" in k for k in m.files)

    def test_all_created_files_tracked_in_manifest(self, tmp_path):
        from specify_cli.integrations.copilot import CopilotIntegration

        copilot = CopilotIntegration()
        m = IntegrationManifest("copilot", tmp_path)
        created = copilot.setup(tmp_path, m)
        for f in created:
            rel = f.resolve().relative_to(tmp_path.resolve()).as_posix()
            assert rel in m.files, f"Created file {rel} not tracked in manifest"

    def test_install_uninstall_roundtrip(self, tmp_path):
        from specify_cli.integrations.copilot import CopilotIntegration

        copilot = CopilotIntegration()
        m = IntegrationManifest("copilot", tmp_path)
        created = copilot.install(tmp_path, m)
        assert len(created) > 0
        m.save()
        for f in created:
            assert f.exists()
        removed, skipped = copilot.uninstall(tmp_path, m)
        assert len(removed) == len(created)
        assert skipped == []

    def test_modified_file_survives_uninstall(self, tmp_path):
        from specify_cli.integrations.copilot import CopilotIntegration

        copilot = CopilotIntegration()
        m = IntegrationManifest("copilot", tmp_path)
        created = copilot.install(tmp_path, m)
        m.save()
        modified_file = created[0]
        modified_file.write_text("user modified this", encoding="utf-8")
        removed, skipped = copilot.uninstall(tmp_path, m)
        assert modified_file.exists()
        assert modified_file in skipped

    def test_directory_structure(self, tmp_path):
        from specify_cli.integrations.copilot import CopilotIntegration

        copilot = CopilotIntegration()
        m = IntegrationManifest("copilot", tmp_path)
        copilot.setup(tmp_path, m)
        agents_dir = tmp_path / ".github" / "agents"
        assert agents_dir.is_dir()
        agent_files = sorted(agents_dir.glob("speckit.*.agent.md"))
        assert len(agent_files) == 9
        expected_commands = {
            "analyze", "checklist", "clarify", "constitution",
            "implement", "plan", "specify", "tasks", "taskstoissues",
        }
        actual_commands = {f.name.removeprefix("speckit.").removesuffix(".agent.md") for f in agent_files}
        assert actual_commands == expected_commands

    def test_templates_are_processed(self, tmp_path):
        from specify_cli.integrations.copilot import CopilotIntegration

        copilot = CopilotIntegration()
        m = IntegrationManifest("copilot", tmp_path)
        copilot.setup(tmp_path, m)
        agents_dir = tmp_path / ".github" / "agents"
        for agent_file in agents_dir.glob("speckit.*.agent.md"):
            content = agent_file.read_text(encoding="utf-8")
            assert "{SCRIPT}" not in content, f"{agent_file.name} has unprocessed {{SCRIPT}}"
            assert "__AGENT__" not in content, f"{agent_file.name} has unprocessed __AGENT__"
            assert "{ARGS}" not in content, f"{agent_file.name} has unprocessed {{ARGS}}"
            assert "\nscripts:\n" not in content
            assert "\nagent_scripts:\n" not in content

    def test_complete_file_inventory_sh(self, tmp_path):
        """Every file produced by specify init --integration copilot --script sh."""
        from typer.testing import CliRunner

        from specify_cli import app

        project = tmp_path / "inventory-sh"
        project.mkdir()
        old_cwd = os.getcwd()
        try:
            os.chdir(project)
            result = CliRunner().invoke(app, [
                "init", "--here", "--integration", "copilot", "--script", "sh", "--no-git",
            ], catch_exceptions=False)
        finally:
            os.chdir(old_cwd)
        assert result.exit_code == 0
        actual = sorted(p.relative_to(project).as_posix() for p in project.rglob("*") if p.is_file())
        expected = sorted([
            ".github/agents/speckit.analyze.agent.md",
            ".github/agents/speckit.checklist.agent.md",
            ".github/agents/speckit.clarify.agent.md",
            ".github/agents/speckit.constitution.agent.md",
            ".github/agents/speckit.implement.agent.md",
            ".github/agents/speckit.plan.agent.md",
            ".github/agents/speckit.specify.agent.md",
            ".github/agents/speckit.tasks.agent.md",
            ".github/agents/speckit.taskstoissues.agent.md",
            ".github/prompts/speckit.analyze.prompt.md",
            ".github/prompts/speckit.checklist.prompt.md",
            ".github/prompts/speckit.clarify.prompt.md",
            ".github/prompts/speckit.constitution.prompt.md",
            ".github/prompts/speckit.implement.prompt.md",
            ".github/prompts/speckit.plan.prompt.md",
            ".github/prompts/speckit.specify.prompt.md",
            ".github/prompts/speckit.tasks.prompt.md",
            ".github/prompts/speckit.taskstoissues.prompt.md",
            ".vscode/settings.json",
            ".specify/integration.json",
            ".specify/init-options.json",
            ".specify/integrations/copilot.manifest.json",
            ".specify/integrations/speckit.manifest.json",
            ".specify/integrations/copilot/scripts/update-context.ps1",
            ".specify/integrations/copilot/scripts/update-context.sh",
            ".specify/scripts/bash/check-prerequisites.sh",
            ".specify/scripts/bash/common.sh",
            ".specify/scripts/bash/create-new-feature.sh",
            ".specify/scripts/bash/setup-plan.sh",
            ".specify/scripts/bash/update-agent-context.sh",
            ".specify/templates/agent-file-template.md",
            ".specify/templates/checklist-template.md",
            ".specify/templates/constitution-template.md",
            ".specify/templates/plan-template.md",
            ".specify/templates/spec-template.md",
            ".specify/templates/tasks-template.md",
            ".specify/memory/constitution.md",
        ])
        assert actual == expected, (
            f"Missing: {sorted(set(expected) - set(actual))}\n"
            f"Extra: {sorted(set(actual) - set(expected))}"
        )

    def test_complete_file_inventory_ps(self, tmp_path):
        """Every file produced by specify init --integration copilot --script ps."""
        from typer.testing import CliRunner

        from specify_cli import app

        project = tmp_path / "inventory-ps"
        project.mkdir()
        old_cwd = os.getcwd()
        try:
            os.chdir(project)
            result = CliRunner().invoke(app, [
                "init", "--here", "--integration", "copilot", "--script", "ps", "--no-git",
            ], catch_exceptions=False)
        finally:
            os.chdir(old_cwd)
        assert result.exit_code == 0
        actual = sorted(p.relative_to(project).as_posix() for p in project.rglob("*") if p.is_file())
        expected = sorted([
            ".github/agents/speckit.analyze.agent.md",
            ".github/agents/speckit.checklist.agent.md",
            ".github/agents/speckit.clarify.agent.md",
            ".github/agents/speckit.constitution.agent.md",
            ".github/agents/speckit.implement.agent.md",
            ".github/agents/speckit.plan.agent.md",
            ".github/agents/speckit.specify.agent.md",
            ".github/agents/speckit.tasks.agent.md",
            ".github/agents/speckit.taskstoissues.agent.md",
            ".github/prompts/speckit.analyze.prompt.md",
            ".github/prompts/speckit.checklist.prompt.md",
            ".github/prompts/speckit.clarify.prompt.md",
            ".github/prompts/speckit.constitution.prompt.md",
            ".github/prompts/speckit.implement.prompt.md",
            ".github/prompts/speckit.plan.prompt.md",
            ".github/prompts/speckit.specify.prompt.md",
            ".github/prompts/speckit.tasks.prompt.md",
            ".github/prompts/speckit.taskstoissues.prompt.md",
            ".vscode/settings.json",
            ".specify/integration.json",
            ".specify/init-options.json",
            ".specify/integrations/copilot.manifest.json",
            ".specify/integrations/speckit.manifest.json",
            ".specify/integrations/copilot/scripts/update-context.ps1",
            ".specify/integrations/copilot/scripts/update-context.sh",
            ".specify/scripts/powershell/check-prerequisites.ps1",
            ".specify/scripts/powershell/common.ps1",
            ".specify/scripts/powershell/create-new-feature.ps1",
            ".specify/scripts/powershell/setup-plan.ps1",
            ".specify/scripts/powershell/update-agent-context.ps1",
            ".specify/templates/agent-file-template.md",
            ".specify/templates/checklist-template.md",
            ".specify/templates/constitution-template.md",
            ".specify/templates/plan-template.md",
            ".specify/templates/spec-template.md",
            ".specify/templates/tasks-template.md",
            ".specify/memory/constitution.md",
        ])
        assert actual == expected, (
            f"Missing: {sorted(set(expected) - set(actual))}\n"
            f"Extra: {sorted(set(actual) - set(expected))}"
        )
@@ -1,164 +1,18 @@
|
||||
"""Tests for the integrations foundation (Stage 1).
|
||||
|
||||
Covers:
|
||||
- IntegrationOption dataclass
|
||||
- IntegrationBase ABC and MarkdownIntegration base class
|
||||
- IntegrationManifest — record, hash, save, load, uninstall, modified detection
|
||||
- INTEGRATION_REGISTRY basics
|
||||
"""
|
||||
"""Tests for IntegrationManifest — record, hash, save, load, uninstall, modified detection."""
|
||||
|
||||
import hashlib
|
||||
import json
|
||||
|
||||
import pytest
|
||||
|
||||
from specify_cli.integrations import (
|
||||
INTEGRATION_REGISTRY,
|
||||
_register,
|
||||
get_integration,
|
||||
)
|
||||
from specify_cli.integrations.base import (
|
||||
IntegrationBase,
|
||||
IntegrationOption,
|
||||
MarkdownIntegration,
|
||||
)
|
||||
from specify_cli.integrations.manifest import IntegrationManifest, _sha256
|
||||
|
||||
|
||||
# ── helpers ──────────────────────────────────────────────────────────────────
|
||||
|
||||
|
||||
class _StubIntegration(MarkdownIntegration):
    """Minimal concrete integration for testing."""

    key = "stub"
    config = {
        "name": "Stub Agent",
        "folder": ".stub/",
        "commands_subdir": "commands",
        "install_url": None,
        "requires_cli": False,
    }
    registrar_config = {
        "dir": ".stub/commands",
        "format": "markdown",
        "args": "$ARGUMENTS",
        "extension": ".md",
    }
    context_file = "STUB.md"


# ═══════════════════════════════════════════════════════════════════════════
# IntegrationOption
# ═══════════════════════════════════════════════════════════════════════════
class TestIntegrationOption:
    def test_defaults(self):
        opt = IntegrationOption(name="--flag")
        assert opt.name == "--flag"
        assert opt.is_flag is False
        assert opt.required is False
        assert opt.default is None
        assert opt.help == ""

    def test_flag_option(self):
        opt = IntegrationOption(name="--skills", is_flag=True, default=True, help="Enable skills")
        assert opt.is_flag is True
        assert opt.default is True
        assert opt.help == "Enable skills"

    def test_required_option(self):
        opt = IntegrationOption(name="--commands-dir", required=True, help="Dir path")
        assert opt.required is True

    def test_frozen(self):
        opt = IntegrationOption(name="--x")
        with pytest.raises(AttributeError):
            opt.name = "--y"  # type: ignore[misc]
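The immutability contract exercised by `test_frozen` can be reproduced with a frozen dataclass. A minimal sketch, assuming `IntegrationOption` is a frozen dataclass with the field set inferred from the assertions above (not copied from the spec-kit source):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class IntegrationOption:
    """CLI option descriptor; frozen, so attribute assignment raises.

    Field names and defaults are inferred from the tests, not the real class.
    """
    name: str
    is_flag: bool = False
    required: bool = False
    default: object = None
    help: str = ""
```

`dataclasses.FrozenInstanceError` subclasses `AttributeError`, which is why the test above can catch the mutation attempt with `pytest.raises(AttributeError)`.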
# ═══════════════════════════════════════════════════════════════════════════
# IntegrationBase / MarkdownIntegration
# ═══════════════════════════════════════════════════════════════════════════


class TestIntegrationBase:
    def test_key_and_config(self):
        i = _StubIntegration()
        assert i.key == "stub"
        assert i.config["name"] == "Stub Agent"
        assert i.registrar_config["format"] == "markdown"
        assert i.context_file == "STUB.md"

    def test_options_default_empty(self):
        assert _StubIntegration.options() == []

    def test_templates_dir(self):
        i = _StubIntegration()
        td = i.templates_dir()
        # Should point to a templates/ dir next to this test module.
        # It won't exist, but the path should be well-formed.
        assert td.name == "templates"

    def test_setup_no_templates_returns_empty(self, tmp_path):
        """setup() gracefully returns an empty list when the templates dir is missing."""
        i = _StubIntegration()
        manifest = IntegrationManifest("stub", tmp_path)
        created = i.setup(tmp_path, manifest)
        assert created == []

    def test_setup_copies_templates(self, tmp_path, monkeypatch):
        """setup() copies template files and records them in the manifest."""
        # Create templates under tmp_path so we don't mutate the source tree
        tpl = tmp_path / "_templates"
        tpl.mkdir()
        (tpl / "speckit.plan.md").write_text("plan content", encoding="utf-8")
        (tpl / "speckit.specify.md").write_text("spec content", encoding="utf-8")

        i = _StubIntegration()
        monkeypatch.setattr(type(i), "templates_dir", lambda self: tpl)

        project = tmp_path / "project"
        project.mkdir()
        created = i.setup(project, IntegrationManifest("stub", project))
        assert len(created) == 2
        assert (project / ".stub" / "commands" / "speckit.plan.md").exists()
        assert (project / ".stub" / "commands" / "speckit.specify.md").exists()

    def test_install_delegates_to_setup(self, tmp_path):
        i = _StubIntegration()
        manifest = IntegrationManifest("stub", tmp_path)
        result = i.install(tmp_path, manifest)
        assert result == []  # no templates dir → empty

    def test_uninstall_delegates_to_teardown(self, tmp_path):
        i = _StubIntegration()
        manifest = IntegrationManifest("stub", tmp_path)
        removed, skipped = i.uninstall(tmp_path, manifest)
        assert removed == []
        assert skipped == []


class TestMarkdownIntegration:
    def test_is_subclass_of_base(self):
        assert issubclass(MarkdownIntegration, IntegrationBase)

    def test_stub_is_markdown(self):
        assert isinstance(_StubIntegration(), MarkdownIntegration)


# ═══════════════════════════════════════════════════════════════════════════
# IntegrationManifest
# ═══════════════════════════════════════════════════════════════════════════


class TestManifestRecordFile:
    def test_record_file_writes_and_hashes(self, tmp_path):
        m = IntegrationManifest("test", tmp_path)
        content = "hello world"
        abs_path = m.record_file("a/b.txt", content)

        assert abs_path == tmp_path / "a" / "b.txt"
        assert abs_path.read_text(encoding="utf-8") == content
        expected_hash = hashlib.sha256(content.encode()).hexdigest()
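From the assertions in `test_record_file_writes_and_hashes`, `record_file` appears to write the content under the project root and keep a SHA-256 digest of it for later integrity checks. A hypothetical stand-alone version of that behavior (an illustration, not the spec-kit implementation):

```python
import hashlib
from pathlib import Path


def record_file(root: Path, rel_path: str, content: str) -> tuple[Path, str]:
    """Write `content` at root/rel_path (creating parents) and return
    the absolute path plus the sha256 hex digest that a manifest would store."""
    abs_path = root / rel_path
    abs_path.parent.mkdir(parents=True, exist_ok=True)
    abs_path.write_text(content, encoding="utf-8")
    return abs_path, hashlib.sha256(content.encode("utf-8")).hexdigest()
```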
@@ -191,7 +45,6 @@ class TestManifestPathTraversal:
            m.record_file("/tmp/escape.txt", "bad")

    def test_record_existing_rejects_parent_traversal(self, tmp_path):
        # Create a file outside the project root
        escape = tmp_path.parent / "escape.txt"
        escape.write_text("evil", encoding="utf-8")
        try:
@@ -202,15 +55,11 @@ class TestManifestPathTraversal:
            escape.unlink(missing_ok=True)

    def test_uninstall_skips_traversal_paths(self, tmp_path):
        """If a manifest is corrupted with traversal paths, uninstall ignores them."""
        m = IntegrationManifest("test", tmp_path)
        m.record_file("safe.txt", "good")
        # Manually inject a traversal path into the manifest
        m._files["../outside.txt"] = "fakehash"
        m.save()

        removed, skipped = m.uninstall()
        # Only the safe file should have been removed
        assert len(removed) == 1
        assert removed[0].name == "safe.txt"
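The traversal tests above imply a guard that rejects absolute paths and `..` escapes before touching the filesystem. One way to sketch such a check with `pathlib` (illustrative; `safe_relative` is a made-up helper name, and `Path.is_relative_to` requires Python 3.9+):

```python
from pathlib import Path


def safe_relative(root: Path, candidate: str) -> Path:
    """Resolve `candidate` under `root`, rejecting absolute paths and `..` escapes."""
    # Joining an absolute path replaces `root` entirely, so resolving and
    # re-checking containment catches both escape styles at once.
    resolved = (root / candidate).resolve()
    if not resolved.is_relative_to(root.resolve()):
        raise ValueError(f"path escapes project root: {candidate!r}")
    return resolved
```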
@@ -234,7 +83,6 @@ class TestManifestCheckModified:
        assert m.check_modified() == []

    def test_symlink_treated_as_modified(self, tmp_path):
        """A tracked file replaced with a symlink is reported as modified."""
        m = IntegrationManifest("test", tmp_path)
        m.record_file("f.txt", "original")
        target = tmp_path / "target.txt"
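The behavior these `check_modified` tests pin down (content hash mismatch or symlink substitution counts as modified) could be sketched as below. The treatment of deleted files is a guess on my part; the visible tests do not cover that case:

```python
import hashlib
from pathlib import Path


def check_modified(root: Path, files: dict[str, str]) -> list[str]:
    """Return tracked relative paths whose on-disk state no longer matches
    the recorded sha256 digest. Symlink substitution is always 'modified'."""
    changed = []
    for rel, digest in files.items():
        p = root / rel
        if p.is_symlink():
            changed.append(rel)  # tracked file replaced with a symlink
        elif p.is_file() and hashlib.sha256(p.read_bytes()).hexdigest() != digest:
            changed.append(rel)  # content drifted from the recorded hash
    return changed
```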
@@ -249,11 +97,9 @@ class TestManifestUninstall:
        m = IntegrationManifest("test", tmp_path)
        m.record_file("d/f.txt", "content")
        m.save()

        removed, skipped = m.uninstall()
        assert len(removed) == 1
        assert not (tmp_path / "d" / "f.txt").exists()
        # Parent dir cleaned up because empty
        assert not (tmp_path / "d").exists()
        assert skipped == []

@@ -262,7 +108,6 @@ class TestManifestUninstall:
        m.record_file("f.txt", "original")
        m.save()
        (tmp_path / "f.txt").write_text("modified", encoding="utf-8")

        removed, skipped = m.uninstall()
        assert removed == []
        assert len(skipped) == 1
@@ -273,18 +118,15 @@ class TestManifestUninstall:
        m.record_file("f.txt", "original")
        m.save()
        (tmp_path / "f.txt").write_text("modified", encoding="utf-8")

        removed, skipped = m.uninstall(force=True)
        assert len(removed) == 1
        assert skipped == []
        assert not (tmp_path / "f.txt").exists()

    def test_already_deleted_file(self, tmp_path):
        m = IntegrationManifest("test", tmp_path)
        m.record_file("f.txt", "content")
        m.save()
        (tmp_path / "f.txt").unlink()

        removed, skipped = m.uninstall()
        assert removed == []
        assert skipped == []

@@ -294,7 +136,6 @@ class TestManifestUninstall:
        m.record_file("f.txt", "content")
        m.save()
        assert m.manifest_path.exists()

        m.uninstall()
        assert not m.manifest_path.exists()
@@ -302,26 +143,19 @@ class TestManifestUninstall:
        m = IntegrationManifest("test", tmp_path)
        m.record_file("a/b/c/f.txt", "content")
        m.save()

        m.uninstall()
        assert not (tmp_path / "a" / "b" / "c").exists()
        assert not (tmp_path / "a" / "b").exists()
        assert not (tmp_path / "a").exists()

    def test_preserves_nonempty_parent_dirs(self, tmp_path):
        m = IntegrationManifest("test", tmp_path)
        m.record_file("a/b/tracked.txt", "content")
        # Create an untracked sibling
        (tmp_path / "a" / "b" / "other.txt").write_text("keep", encoding="utf-8")
        m.save()

        m.uninstall()
        assert not (tmp_path / "a" / "b" / "tracked.txt").exists()
        assert (tmp_path / "a" / "b" / "other.txt").exists()
        assert (tmp_path / "a" / "b").is_dir()

    def test_symlink_skipped_without_force(self, tmp_path):
        """A tracked file replaced with a symlink is skipped unless force."""
        m = IntegrationManifest("test", tmp_path)
        m.record_file("f.txt", "original")
        m.save()
@@ -329,14 +163,11 @@ class TestManifestUninstall:
        target.write_text("target", encoding="utf-8")
        (tmp_path / "f.txt").unlink()
        (tmp_path / "f.txt").symlink_to(target)

        removed, skipped = m.uninstall()
        assert removed == []
        assert len(skipped) == 1
        assert (tmp_path / "f.txt").is_symlink()  # still there

    def test_symlink_removed_with_force(self, tmp_path):
        """A tracked file replaced with a symlink is removed with force."""
        m = IntegrationManifest("test", tmp_path)
        m.record_file("f.txt", "original")
        m.save()
@@ -344,11 +175,9 @@ class TestManifestUninstall:
        target.write_text("target", encoding="utf-8")
        (tmp_path / "f.txt").unlink()
        (tmp_path / "f.txt").symlink_to(target)

        removed, skipped = m.uninstall(force=True)
        assert len(removed) == 1
        assert not (tmp_path / "f.txt").exists()
        assert target.exists()  # target not deleted
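Taken together, the uninstall tests in this class specify: pristine tracked files are deleted and their empty parent directories pruned; modified files and symlinks are skipped unless `force=True`; already-deleted files are neither removed nor skipped. A self-contained sketch of those semantics (illustrative, not the spec-kit code):

```python
import hashlib
from pathlib import Path


def uninstall(root: Path, files: dict[str, str], force: bool = False):
    """Remove tracked files whose sha256 still matches; return (removed, skipped)."""
    removed, skipped = [], []
    for rel, digest in files.items():
        p = root / rel
        if not p.exists() and not p.is_symlink():
            continue  # already deleted → neither removed nor skipped
        pristine = (
            p.is_file() and not p.is_symlink()
            and hashlib.sha256(p.read_bytes()).hexdigest() == digest
        )
        if pristine or force:
            p.unlink()
            removed.append(p)
            parent = p.parent  # prune now-empty parent dirs up to root
            while parent != root and not any(parent.iterdir()):
                parent.rmdir()
                parent = parent.parent
        else:
            skipped.append(p)  # modified or symlinked → left in place
    return removed, skipped
```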


class TestManifestPersistence:
@@ -356,12 +185,10 @@ class TestManifestPersistence:
        m = IntegrationManifest("myagent", tmp_path, version="2.0.1")
        m.record_file("dir/file.md", "# Hello")
        m.save()

        loaded = IntegrationManifest.load("myagent", tmp_path)
        assert loaded.key == "myagent"
        assert loaded.version == "2.0.1"
        assert loaded.files == m.files
        assert loaded._installed_at == m._installed_at

    def test_manifest_path(self, tmp_path):
        m = IntegrationManifest("copilot", tmp_path)
@@ -378,58 +205,16 @@ class TestManifestPersistence:
        assert path.exists()
        data = json.loads(path.read_text(encoding="utf-8"))
        assert data["integration"] == "test"
        assert "installed_at" in data
        assert "f.txt" in data["files"]

    def test_save_preserves_installed_at(self, tmp_path):
        m = IntegrationManifest("test", tmp_path)
        m.record_file("f.txt", "content")
        m.save()
        first_ts = m._installed_at

        # Save again — timestamp should not change
        m.save()
        assert m._installed_at == first_ts


# ═══════════════════════════════════════════════════════════════════════════
# Registry
# ═══════════════════════════════════════════════════════════════════════════


class TestRegistry:
    def test_registry_starts_empty(self):
        # Registry may have been populated by other tests; at minimum
        # it should be a dict.
        assert isinstance(INTEGRATION_REGISTRY, dict)

    def test_register_and_get(self):
        stub = _StubIntegration()
        _register(stub)
        try:
            assert get_integration("stub") is stub
        finally:
            INTEGRATION_REGISTRY.pop("stub", None)

    def test_get_missing_returns_none(self):
        assert get_integration("nonexistent-xyz") is None

    def test_register_empty_key_raises(self):
        class EmptyKey(MarkdownIntegration):
            key = ""
        with pytest.raises(ValueError, match="empty key"):
            _register(EmptyKey())

    def test_register_duplicate_raises(self):
        stub = _StubIntegration()
        _register(stub)
        try:
            with pytest.raises(KeyError, match="already registered"):
                _register(_StubIntegration())
        finally:
            INTEGRATION_REGISTRY.pop("stub", None)


class TestManifestLoadValidation:
    def test_load_non_dict_raises(self, tmp_path):
        path = tmp_path / ".specify" / "integrations" / "bad.manifest.json"

tests/integrations/test_registry.py (new file, 45 lines)
@@ -0,0 +1,45 @@
"""Tests for INTEGRATION_REGISTRY."""

import pytest

from specify_cli.integrations import (
    INTEGRATION_REGISTRY,
    _register,
    get_integration,
)
from specify_cli.integrations.base import MarkdownIntegration
from .conftest import StubIntegration


class TestRegistry:
    def test_registry_is_dict(self):
        assert isinstance(INTEGRATION_REGISTRY, dict)

    def test_register_and_get(self):
        stub = StubIntegration()
        _register(stub)
        try:
            assert get_integration("stub") is stub
        finally:
            INTEGRATION_REGISTRY.pop("stub", None)

    def test_get_missing_returns_none(self):
        assert get_integration("nonexistent-xyz") is None

    def test_register_empty_key_raises(self):
        class EmptyKey(MarkdownIntegration):
            key = ""
        with pytest.raises(ValueError, match="empty key"):
            _register(EmptyKey())

    def test_register_duplicate_raises(self):
        stub = StubIntegration()
        _register(stub)
        try:
            with pytest.raises(KeyError, match="already registered"):
                _register(StubIntegration())
        finally:
            INTEGRATION_REGISTRY.pop("stub", None)

    def test_copilot_registered(self):
        assert "copilot" in INTEGRATION_REGISTRY