mirror of
https://github.com/github/spec-kit.git
synced 2026-04-02 18:53:09 +00:00
Stage 2: Copilot integration — proof of concept with shared template primitives (#2035)
* feat: Stage 2a — CopilotIntegration with shared template primitives
  - base.py: added granular primitives (shared_commands_dir, shared_templates_dir, list_command_templates, command_filename, commands_dest, copy_command_to_directory, record_file_in_manifest, write_file_and_record, process_template)
  - CopilotIntegration: uses primitives to produce .agent.md commands, companion .prompt.md files, and .vscode/settings.json
  - Verified byte-for-byte parity with old release script output
  - Copilot auto-registered in INTEGRATION_REGISTRY
  - 70 tests (22 new: base primitives + copilot integration)
  Part of #1924

* feat: Stage 2b — --integration flag, routing, agent.json, shared infra
  - Added --integration flag to init() (mutually exclusive with --ai)
  - --ai copilot auto-promotes to integration path with migration nudge
  - Integration setup writes .specify/agent.json with integration key
  - _install_shared_infra() copies scripts and templates to .specify/
  - init-options.json records 'integration' key when used
  - 4 new CLI tests: mutual exclusivity, unknown rejection, copilot end-to-end, auto-promote (74 total integration tests)
  Part of #1924

* feat: Stage 2 completion — integration scripts, integration.json, shared manifest
  - Added copilot/scripts/update-context.sh and .ps1 (thin wrappers that delegate to the shared update-agent-context script)
  - CopilotIntegration.setup() installs integration scripts to .specify/integrations/copilot/scripts/
  - Renamed agent.json → integration.json with script paths
  - _install_shared_infra() now tracks files in integration-shared.manifest.json
  - Updated tests: scripts installed, integration.json has script paths, shared manifest recorded (74 tests)
  Part of #1924

* refactor: rename shared manifest to speckit.manifest.json
  Cleaner naming — the shared infrastructure (scripts, templates) belongs to spec-kit itself, not to any specific integration.
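The auto-registration described above (Copilot registered in INTEGRATION_REGISTRY, looked up by key) can be sketched as follows. This is a minimal stand-in, not the actual module: the real code lives in src/specify_cli/integrations/__init__.py, and CopilotIntegration here is reduced to its key attribute.

```python
# Minimal sketch of the registry pattern from this commit. The real
# module is src/specify_cli/integrations/__init__.py; CopilotIntegration
# below is a bare stand-in, not the actual class.
INTEGRATION_REGISTRY: dict[str, object] = {}


def _register(integration) -> None:
    """Register *integration* under its key."""
    INTEGRATION_REGISTRY[integration.key] = integration


def get_integration(key: str):
    """Return the integration for *key*, or None if not registered."""
    return INTEGRATION_REGISTRY.get(key)


class CopilotIntegration:
    key = "copilot"


# Built-in integrations register themselves at import time
_register(CopilotIntegration())
```

Because registration happens at import time, the CLI can reject an unknown `--integration` value simply by checking `get_integration(key)` for `None` and listing `sorted(INTEGRATION_REGISTRY)` in the error message.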
* fix: copilot update-context scripts reflect target architecture
  Scripts now source shared functions (via SPECKIT_SOURCE_ONLY=1) and call update_agent_file directly with .github/copilot-instructions.md, rather than delegating back to the shared case statement.

* fix: simplify copilot scripts — dispatcher sources common functions
  Integration scripts now contain only copilot-specific logic (target path + agent name). The dispatcher is responsible for sourcing shared functions before calling the integration script.

* fix: copilot update-context scripts are self-contained implementations
  These scripts ARE the implementation — the dispatcher calls them. They source common.sh + update-agent-context functions, gather feature/plan data, then call update_agent_file with the copilot target path (.github/copilot-instructions.md).

* docs: add Stage 7 activation note to copilot update-context scripts

* test: add complete file inventory test for copilot integration
  Validates every single file (37 total) produced by specify init --integration copilot --script sh --no-git.

* test: add PowerShell file inventory test for copilot integration
  Validates all 37 files produced by --script ps variant, including .specify/scripts/powershell/ instead of bash.

* refactor: split test_integrations.py into tests/integrations/ directory
  - test_base.py: IntegrationOption, IntegrationBase, MarkdownIntegration, primitives
  - test_manifest.py: IntegrationManifest, path traversal, persistence, validation
  - test_registry.py: INTEGRATION_REGISTRY
  - test_copilot.py: CopilotIntegration unit tests
  - test_cli.py: --integration flag, auto-promote, file inventories (sh + ps)
  - conftest.py: shared StubIntegration helper
  76 integration tests + 48 consistency tests = 124 total, all passing.

* refactor: move file inventory tests from test_cli to test_copilot
  File inventories are copilot-specific. test_cli.py now only tests CLI flag mechanics (mutual exclusivity, unknown rejection, auto-promote).
* fix: skip JSONC merge to preserve user settings, fix docstring
  - _merge_vscode_settings() now returns early (skips merge) when existing settings.json can't be parsed (e.g. JSONC with comments), instead of overwriting with empty settings
  - Updated _install_shared_infra() docstring to match implementation (scripts + templates, speckit.manifest.json)

* fix: warn user when JSONC settings merge is skipped

* fix: show template content when JSONC merge is skipped
  User now sees the exact settings they should add manually.

* fix: document process_template requirement, merge scripts without rmtree
  - base.py setup() docstring now explicitly states raw copy behavior and directs to CopilotIntegration for process_template example
  - _install_shared_infra() uses merge/overwrite instead of rmtree to preserve user-added files under .specify/scripts/

* fix: don't overwrite pre-existing shared scripts or templates
  Only write files that don't already exist — preserves any user modifications to shared scripts (common.sh etc.) and templates.

* fix: warn user about skipped pre-existing shared files
  Lists all shared scripts and templates that were not copied because they already existed in the project.

* test: add test for shared infra skip behavior on pre-existing files
  Verifies that _install_shared_infra() preserves user-modified scripts and templates while still installing missing ones.
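The merge-skip behavior described above (never overwrite a settings.json that cannot be parsed, skip no-op writes, reject non-dict JSON) can be illustrated with a self-contained sketch. `merge_vscode_settings` is a hypothetical simplified stand-in for the `_merge_vscode_settings()` in the diff, returning a bool instead of emitting warnings:

```python
import json
import tempfile
from pathlib import Path


def merge_vscode_settings(src: Path, dst: Path) -> bool:
    """Add missing top-level keys from *src* into *dst*.

    Simplified stand-in for _merge_vscode_settings(). Returns True if
    a merge was written, False if it was skipped.
    """
    try:
        existing = json.loads(dst.read_text(encoding="utf-8"))
    except (json.JSONDecodeError, OSError):
        return False  # likely JSONC with comments — never overwrite
    if not isinstance(existing, dict):
        return False  # array/null/etc. — nothing safe to merge into
    template = json.loads(src.read_text(encoding="utf-8"))
    added = {k: v for k, v in template.items() if k not in existing}
    if not added:
        return False  # no-op merge: skip the write entirely
    existing.update(added)
    dst.write_text(json.dumps(existing, indent=2) + "\n", encoding="utf-8")
    return True


with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "template.json"
    dst = Path(tmp) / "settings.json"
    src.write_text('{"chat.promptFiles": true}', encoding="utf-8")
    # JSONC with a comment: json.loads() fails, so the merge is skipped
    dst.write_text('// user comment\n{"editor.fontSize": 14}', encoding="utf-8")
    assert merge_vscode_settings(src, dst) is False
```

The key design choice is that an unparseable file is treated as user-owned and left untouched; the real implementation additionally warns and prints the template content so the user can merge by hand.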
* fix: address review — containment check, deterministic prompts, manifest accuracy
  - CopilotIntegration.setup() adds dest containment check (relative_to)
  - Companion prompts generated from templates list, not directory glob
  - _install_shared_infra() only records files actually copied (not pre-existing)
  - VS Code settings tests made unconditional (assert template exists)
  - Inventory tests use .as_posix() for cross-platform paths

* fix: correct PS1 function names, document SPECKIT_SOURCE_ONLY prerequisite
  - Fixed Get-FeaturePaths → Get-FeaturePathsEnv, Read-PlanData → Parse-PlanData
  - Documented that shared scripts must guard Main with SPECKIT_SOURCE_ONLY before these integration scripts can be activated (Stage 7)

* fix: add dict type check for settings merge, simplify PS1 to subprocess
  - _merge_vscode_settings() skips merge with warning if parsed JSON is not a dict (array, null, etc.)
  - PS1 update-context.ps1 uses & invocation instead of dot-sourcing since the shared script runs Main unconditionally

* fix: skip-write on no-op merge, bash subprocess, dynamic integration list
  - _merge_vscode_settings() only writes when keys were actually added
  - update-context.sh uses exec subprocess like PS1 version
  - Unknown integration error lists available integrations dynamically

* fix: align path rewriting with release script, add .specify/.specify/ fix
  Path rewrite regex matches the release script's rewrite_paths() exactly (verified byte-identical output). Added .specify/.specify/ double-prefix fix for additional safety.
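The path rewriting in the last item can be sketched as a standalone function. `rewrite_paths` is a hypothetical name used here for illustration; the same substitutions appear inside `process_template()` in the diff, including the double-prefix cleanup for paths that were already under .specify/:

```python
import re


def rewrite_paths(content: str) -> str:
    """Rewrite repo-relative paths to live under .specify/.

    Illustrative sketch of the substitutions in process_template();
    mirrors the release script's rewrite_paths().
    """
    content = re.sub(r"(/?)memory/", r".specify/memory/", content)
    content = re.sub(r"(/?)scripts/", r".specify/scripts/", content)
    content = re.sub(r"(/?)templates/", r".specify/templates/", content)
    # Paths already under .specify/ pick up a double prefix from the
    # substitutions above; collapse it back.
    content = content.replace(".specify.specify/", ".specify/")
    content = content.replace(".specify/.specify/", ".specify/")
    return content
```

For example, `scripts/bash/common.sh` becomes `.specify/scripts/bash/common.sh`, while `.specify/templates/spec-template.md` round-trips unchanged because the `.specify.specify/` fixup undoes the extra prefix.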
@@ -1197,6 +1197,84 @@ def _locate_release_script() -> tuple[Path, str]:
     raise FileNotFoundError(f"Release script '{name}' not found in core_pack or source checkout")
 
 
+def _install_shared_infra(
+    project_path: Path,
+    script_type: str,
+    tracker: StepTracker | None = None,
+) -> bool:
+    """Install shared infrastructure files into *project_path*.
+
+    Copies ``.specify/scripts/`` and ``.specify/templates/`` from the
+    bundled core_pack or source checkout. Tracks all installed files
+    in ``speckit.manifest.json``.
+
+    Returns ``True`` on success.
+    """
+    from .integrations.manifest import IntegrationManifest
+
+    core = _locate_core_pack()
+    manifest = IntegrationManifest("speckit", project_path, version=get_speckit_version())
+
+    # Scripts
+    if core and (core / "scripts").is_dir():
+        scripts_src = core / "scripts"
+    else:
+        repo_root = Path(__file__).parent.parent.parent
+        scripts_src = repo_root / "scripts"
+
+    skipped_files: list[str] = []
+
+    if scripts_src.is_dir():
+        dest_scripts = project_path / ".specify" / "scripts"
+        dest_scripts.mkdir(parents=True, exist_ok=True)
+        variant_dir = "bash" if script_type == "sh" else "powershell"
+        variant_src = scripts_src / variant_dir
+        if variant_src.is_dir():
+            dest_variant = dest_scripts / variant_dir
+            dest_variant.mkdir(parents=True, exist_ok=True)
+            # Merge without overwriting — only add files that don't exist yet
+            for src_path in variant_src.rglob("*"):
+                if src_path.is_file():
+                    rel_path = src_path.relative_to(variant_src)
+                    dst_path = dest_variant / rel_path
+                    if dst_path.exists():
+                        skipped_files.append(str(dst_path.relative_to(project_path)))
+                    else:
+                        dst_path.parent.mkdir(parents=True, exist_ok=True)
+                        shutil.copy2(src_path, dst_path)
+                        rel = dst_path.relative_to(project_path).as_posix()
+                        manifest.record_existing(rel)
+
+    # Page templates (not command templates, not vscode-settings.json)
+    if core and (core / "templates").is_dir():
+        templates_src = core / "templates"
+    else:
+        repo_root = Path(__file__).parent.parent.parent
+        templates_src = repo_root / "templates"
+
+    if templates_src.is_dir():
+        dest_templates = project_path / ".specify" / "templates"
+        dest_templates.mkdir(parents=True, exist_ok=True)
+        for f in templates_src.iterdir():
+            if f.is_file() and f.name != "vscode-settings.json" and not f.name.startswith("."):
+                dst = dest_templates / f.name
+                if dst.exists():
+                    skipped_files.append(str(dst.relative_to(project_path)))
+                else:
+                    shutil.copy2(f, dst)
+                    rel = dst.relative_to(project_path).as_posix()
+                    manifest.record_existing(rel)
+
+    if skipped_files:
+        import logging
+        logging.getLogger(__name__).warning(
+            "The following shared files already exist and were not overwritten:\n%s",
+            "\n".join(f"  {f}" for f in skipped_files),
+        )
+
+    manifest.save()
+    return True
+
+
 def scaffold_from_core_pack(
     project_path: Path,
     ai_assistant: str,
@@ -1828,6 +1906,7 @@ def init(
     offline: bool = typer.Option(False, "--offline", help="Use assets bundled in the specify-cli package instead of downloading from GitHub (no network access required). Bundled assets will become the default in v0.6.0 and this flag will be removed."),
     preset: str = typer.Option(None, "--preset", help="Install a preset during initialization (by preset ID)"),
     branch_numbering: str = typer.Option(None, "--branch-numbering", help="Branch numbering strategy: 'sequential' (001, 002, ...) or 'timestamp' (YYYYMMDD-HHMMSS)"),
+    integration: str = typer.Option(None, "--integration", help="Use the new integration system (e.g. --integration copilot). Mutually exclusive with --ai."),
 ):
     """
     Initialize a new Specify project.
@@ -1889,6 +1968,35 @@ def init(
     if ai_assistant:
         ai_assistant = AI_ASSISTANT_ALIASES.get(ai_assistant, ai_assistant)
 
+    # --integration and --ai are mutually exclusive
+    if integration and ai_assistant:
+        console.print("[red]Error:[/red] --integration and --ai are mutually exclusive")
+        console.print("[yellow]Use:[/yellow] --integration for the new integration system, or --ai for the legacy path")
+        raise typer.Exit(1)
+
+    # Auto-promote: --ai copilot → integration path with a nudge
+    use_integration = False
+    if integration:
+        from .integrations import INTEGRATION_REGISTRY, get_integration
+        resolved_integration = get_integration(integration)
+        if not resolved_integration:
+            console.print(f"[red]Error:[/red] Unknown integration: '{integration}'")
+            available = ", ".join(sorted(INTEGRATION_REGISTRY))
+            console.print(f"[yellow]Available integrations:[/yellow] {available}")
+            raise typer.Exit(1)
+        use_integration = True
+        # Map integration key to the ai_assistant variable for downstream compatibility
+        ai_assistant = integration
+    elif ai_assistant == "copilot":
+        from .integrations import get_integration
+        resolved_integration = get_integration("copilot")
+        if resolved_integration:
+            use_integration = True
+            console.print(
+                "[dim]Tip: Use [bold]--integration copilot[/bold] instead of "
+                "--ai copilot. The --ai flag will be deprecated in a future release.[/dim]"
+            )
+
     if project_name == ".":
         here = True
         project_name = None  # Clear project_name to use existing validation logic
@@ -2057,7 +2165,10 @@ def init(
             "This will become the default in v0.6.0."
         )
 
-    if use_github:
+    if use_integration:
+        tracker.add("integration", "Install integration")
+        tracker.add("shared-infra", "Install shared infrastructure")
+    elif use_github:
         for key, label in [
             ("fetch", "Fetch latest release"),
             ("download", "Download template"),
@@ -2092,7 +2203,39 @@ def init(
         verify = not skip_tls
         local_ssl_context = ssl_context if verify else False
 
-        if use_github:
+        if use_integration:
+            # Integration-based scaffolding (new path)
+            from .integrations.manifest import IntegrationManifest
+            tracker.start("integration")
+            manifest = IntegrationManifest(
+                resolved_integration.key, project_path, version=get_speckit_version()
+            )
+            resolved_integration.setup(
+                project_path, manifest,
+                script_type=selected_script,
+            )
+            manifest.save()
+
+            # Write .specify/integration.json
+            script_ext = "sh" if selected_script == "sh" else "ps1"
+            integration_json = project_path / ".specify" / "integration.json"
+            integration_json.parent.mkdir(parents=True, exist_ok=True)
+            integration_json.write_text(json.dumps({
+                "integration": resolved_integration.key,
+                "version": get_speckit_version(),
+                "scripts": {
+                    "update-context": f".specify/integrations/{resolved_integration.key}/scripts/update-context.{script_ext}",
+                },
+            }, indent=2) + "\n", encoding="utf-8")
+
+            tracker.complete("integration", resolved_integration.config.get("name", resolved_integration.key))
+
+            # Install shared infrastructure (scripts, templates)
+            tracker.start("shared-infra")
+            _install_shared_infra(project_path, selected_script, tracker=tracker)
+            tracker.complete("shared-infra", f"scripts ({selected_script}) + templates")
+
+        elif use_github:
             with httpx.Client(verify=local_ssl_context) as local_client:
                 download_and_extract_template(
                     project_path,
@@ -2227,7 +2370,7 @@ def init(
     # Persist the CLI options so later operations (e.g. preset add)
     # can adapt their behaviour without re-scanning the filesystem.
     # Must be saved BEFORE preset install so _get_skills_dir() works.
-    save_init_options(project_path, {
+    init_opts = {
         "ai": selected_ai,
         "ai_skills": ai_skills,
         "ai_commands_dir": ai_commands_dir,
@@ -2237,7 +2380,10 @@ def init(
         "offline": offline,
         "script": selected_script,
         "speckit_version": get_speckit_version(),
-    })
+    }
+    if use_integration:
+        init_opts["integration"] = resolved_integration.key
+    save_init_options(project_path, init_opts)
 
     # Install preset if specified
     if preset:
@@ -32,3 +32,15 @@ def _register(integration: IntegrationBase) -> None:
 def get_integration(key: str) -> IntegrationBase | None:
     """Return the integration for *key*, or ``None`` if not registered."""
     return INTEGRATION_REGISTRY.get(key)
+
+
+# -- Register built-in integrations --------------------------------------
+
+def _register_builtins() -> None:
+    """Register all built-in integrations."""
+    from .copilot import CopilotIntegration
+
+    _register(CopilotIntegration())
+
+
+_register_builtins()
@@ -9,6 +9,7 @@ Provides:
 
 from __future__ import annotations
 
+import re
 import shutil
 from abc import ABC
 from dataclasses import dataclass
@@ -84,35 +85,65 @@ class IntegrationBase(ABC):
         """Return options this integration accepts. Default: none."""
         return []
 
-    def templates_dir(self) -> Path:
-        """Return the path to this integration's bundled templates.
-
-        By convention, templates live in a ``templates/`` subdirectory
-        next to the file where the integration class is defined.
-        """
-        import inspect
-
-        module_file = inspect.getfile(type(self))
-        return Path(module_file).resolve().parent / "templates"
-
-    def setup(
-        self,
-        project_root: Path,
-        manifest: IntegrationManifest,
-        parsed_options: dict[str, Any] | None = None,
-        **opts: Any,
-    ) -> list[Path]:
-        """Install integration files into *project_root*.
-
-        Returns the list of files created. The default implementation
-        copies every file from ``templates_dir()`` into the commands
-        directory derived from ``config``, recording each in *manifest*.
-        """
-        created: list[Path] = []
-        tpl_dir = self.templates_dir()
-        if not tpl_dir.is_dir():
-            return created
-
+    # -- Primitives — building blocks for setup() -------------------------
+
+    def shared_commands_dir(self) -> Path | None:
+        """Return path to the shared command templates directory.
+
+        Checks ``core_pack/commands/`` (wheel install) first, then
+        ``templates/commands/`` (source checkout). Returns ``None``
+        if neither exists.
+        """
+        import inspect
+
+        pkg_dir = Path(inspect.getfile(IntegrationBase)).resolve().parent.parent
+        for candidate in [
+            pkg_dir / "core_pack" / "commands",
+            pkg_dir.parent.parent / "templates" / "commands",
+        ]:
+            if candidate.is_dir():
+                return candidate
+        return None
+
+    def shared_templates_dir(self) -> Path | None:
+        """Return path to the shared page templates directory.
+
+        Contains ``vscode-settings.json``, ``spec-template.md``, etc.
+        Checks ``core_pack/templates/`` then ``templates/``.
+        """
+        import inspect
+
+        pkg_dir = Path(inspect.getfile(IntegrationBase)).resolve().parent.parent
+        for candidate in [
+            pkg_dir / "core_pack" / "templates",
+            pkg_dir.parent.parent / "templates",
+        ]:
+            if candidate.is_dir():
+                return candidate
+        return None
+
+    def list_command_templates(self) -> list[Path]:
+        """Return sorted list of command template files from the shared directory."""
+        cmd_dir = self.shared_commands_dir()
+        if not cmd_dir or not cmd_dir.is_dir():
+            return []
+        return sorted(f for f in cmd_dir.iterdir() if f.is_file() and f.suffix == ".md")
+
+    def command_filename(self, template_name: str) -> str:
+        """Return the destination filename for a command template.
+
+        *template_name* is the stem of the source file (e.g. ``"plan"``).
+        Default: ``speckit.{template_name}.md``. Subclasses override
+        to change the extension or naming convention.
+        """
+        return f"speckit.{template_name}.md"
+
+    def commands_dest(self, project_root: Path) -> Path:
+        """Return the absolute path to the commands output directory.
+
+        Derived from ``config["folder"]`` and ``config["commands_subdir"]``.
+        Raises ``ValueError`` if ``config`` or ``folder`` is missing.
+        """
         if not self.config:
             raise ValueError(
                 f"{type(self).__name__}.config is not set; integration "
@@ -123,6 +154,179 @@ class IntegrationBase(ABC):
             raise ValueError(
                 f"{type(self).__name__}.config is missing required 'folder' entry."
             )
+        subdir = self.config.get("commands_subdir", "commands")
+        return project_root / folder / subdir
+
+    # -- File operations — granular primitives for setup() ----------------
+
+    @staticmethod
+    def copy_command_to_directory(
+        src: Path,
+        dest_dir: Path,
+        filename: str,
+    ) -> Path:
+        """Copy a command template to *dest_dir* with the given *filename*.
+
+        Creates *dest_dir* if needed. Returns the absolute path of the
+        written file. The caller can post-process the file before
+        recording it in the manifest.
+        """
+        dest_dir.mkdir(parents=True, exist_ok=True)
+        dst = dest_dir / filename
+        shutil.copy2(src, dst)
+        return dst
+
+    @staticmethod
+    def record_file_in_manifest(
+        file_path: Path,
+        project_root: Path,
+        manifest: IntegrationManifest,
+    ) -> None:
+        """Hash *file_path* and record it in *manifest*.
+
+        *file_path* must be inside *project_root*.
+        """
+        rel = file_path.resolve().relative_to(project_root.resolve())
+        manifest.record_existing(rel)
+
+    @staticmethod
+    def write_file_and_record(
+        content: str,
+        dest: Path,
+        project_root: Path,
+        manifest: IntegrationManifest,
+    ) -> Path:
+        """Write *content* to *dest*, hash it, and record in *manifest*.
+
+        Creates parent directories as needed. Returns *dest*.
+        """
+        dest.parent.mkdir(parents=True, exist_ok=True)
+        dest.write_text(content, encoding="utf-8")
+        rel = dest.resolve().relative_to(project_root.resolve())
+        manifest.record_existing(rel)
+        return dest
+
+    @staticmethod
+    def process_template(
+        content: str,
+        agent_name: str,
+        script_type: str,
+        arg_placeholder: str = "$ARGUMENTS",
+    ) -> str:
+        """Process a raw command template into agent-ready content.
+
+        Performs the same transformations as the release script:
+        1. Extract ``scripts.<script_type>`` value from YAML frontmatter
+        2. Replace ``{SCRIPT}`` with the extracted script command
+        3. Extract ``agent_scripts.<script_type>`` and replace ``{AGENT_SCRIPT}``
+        4. Strip ``scripts:`` and ``agent_scripts:`` sections from frontmatter
+        5. Replace ``{ARGS}`` with *arg_placeholder*
+        6. Replace ``__AGENT__`` with *agent_name*
+        7. Rewrite paths: ``scripts/`` → ``.specify/scripts/`` etc.
+        """
+        # 1. Extract script command from frontmatter
+        script_command = ""
+        script_pattern = re.compile(
+            rf"^\s*{re.escape(script_type)}:\s*(.+)$", re.MULTILINE
+        )
+        # Find the scripts: block
+        in_scripts = False
+        for line in content.splitlines():
+            if line.strip() == "scripts:":
+                in_scripts = True
+                continue
+            if in_scripts and line and not line[0].isspace():
+                in_scripts = False
+            if in_scripts:
+                m = script_pattern.match(line)
+                if m:
+                    script_command = m.group(1).strip()
+                    break
+
+        # 2. Replace {SCRIPT}
+        if script_command:
+            content = content.replace("{SCRIPT}", script_command)
+
+        # 3. Extract agent_script command
+        agent_script_command = ""
+        in_agent_scripts = False
+        for line in content.splitlines():
+            if line.strip() == "agent_scripts:":
+                in_agent_scripts = True
+                continue
+            if in_agent_scripts and line and not line[0].isspace():
+                in_agent_scripts = False
+            if in_agent_scripts:
+                m = script_pattern.match(line)
+                if m:
+                    agent_script_command = m.group(1).strip()
+                    break
+
+        if agent_script_command:
+            content = content.replace("{AGENT_SCRIPT}", agent_script_command)
+
+        # 4. Strip scripts: and agent_scripts: sections from frontmatter
+        lines = content.splitlines(keepends=True)
+        output_lines: list[str] = []
+        in_frontmatter = False
+        skip_section = False
+        dash_count = 0
+        for line in lines:
+            stripped = line.rstrip("\n\r")
+            if stripped == "---":
+                dash_count += 1
+                if dash_count == 1:
+                    in_frontmatter = True
+                else:
+                    in_frontmatter = False
+                    skip_section = False
+                output_lines.append(line)
+                continue
+            if in_frontmatter:
+                if stripped in ("scripts:", "agent_scripts:"):
+                    skip_section = True
+                    continue
+                if skip_section:
+                    if line[0:1].isspace():
+                        continue  # skip indented content under scripts/agent_scripts
+                    skip_section = False
+            output_lines.append(line)
+        content = "".join(output_lines)
+
+        # 5. Replace {ARGS}
+        content = content.replace("{ARGS}", arg_placeholder)
+
+        # 6. Replace __AGENT__
+        content = content.replace("__AGENT__", agent_name)
+
+        # 7. Rewrite paths (matches release script's rewrite_paths())
+        content = re.sub(r"(/?)memory/", r".specify/memory/", content)
+        content = re.sub(r"(/?)scripts/", r".specify/scripts/", content)
+        content = re.sub(r"(/?)templates/", r".specify/templates/", content)
+        # Fix double-prefix (same as release script's .specify.specify/ fix)
+        content = content.replace(".specify.specify/", ".specify/")
+        content = content.replace(".specify/.specify/", ".specify/")
+
+        return content
+
+    def setup(
+        self,
+        project_root: Path,
+        manifest: IntegrationManifest,
+        parsed_options: dict[str, Any] | None = None,
+        **opts: Any,
+    ) -> list[Path]:
+        """Install integration command files into *project_root*.
+
+        Returns the list of files created. Copies raw templates without
+        processing. Integrations that need placeholder replacement
+        (e.g. ``{SCRIPT}``, ``__AGENT__``) should override ``setup()``
+        and call ``process_template()`` in their own loop — see
+        ``CopilotIntegration`` for an example.
+        """
+        templates = self.list_command_templates()
+        if not templates:
+            return []
+
+        project_root_resolved = project_root.resolve()
+        if manifest.project_root != project_root_resolved:
@@ -130,9 +334,8 @@ class IntegrationBase(ABC):
                 f"manifest.project_root ({manifest.project_root}) does not match "
                 f"project_root ({project_root_resolved})"
             )
-        subdir = self.config.get("commands_subdir", "commands")
-        dest = (project_root / folder / subdir).resolve()
+        dest = self.commands_dest(project_root).resolve()
         # Ensure destination stays within the project root
         try:
             dest.relative_to(project_root_resolved)
         except ValueError as exc:
@@ -141,16 +344,13 @@ class IntegrationBase(ABC):
                 f"project root {project_root_resolved}"
             ) from exc
 
         dest.mkdir(parents=True, exist_ok=True)
        created: list[Path] = []
 
-        for src_file in sorted(tpl_dir.iterdir()):
-            if src_file.is_file():
-                dst_file = dest / src_file.name
-                dst_resolved = dst_file.resolve()
-                rel = dst_resolved.relative_to(project_root_resolved)
-                shutil.copy2(src_file, dst_file)
-                manifest.record_existing(rel)
-                created.append(dst_file)
+        for src_file in templates:
+            dst_name = self.command_filename(src_file.stem)
+            dst_file = self.copy_command_to_directory(src_file, dest, dst_name)
+            self.record_file_in_manifest(dst_file, project_root, manifest)
+            created.append(dst_file)
 
         return created
src/specify_cli/integrations/copilot/__init__.py (new file, 197 lines)
@@ -0,0 +1,197 @@
|
||||
"""Copilot integration — GitHub Copilot in VS Code.
|
||||
|
||||
Copilot has several unique behaviors compared to standard markdown agents:
|
||||
- Commands use ``.agent.md`` extension (not ``.md``)
|
||||
- Each command gets a companion ``.prompt.md`` file in ``.github/prompts/``
|
||||
- Installs ``.vscode/settings.json`` with prompt file recommendations
|
||||
- Context file lives at ``.github/copilot-instructions.md``
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
import json
|
||||
import shutil
|
||||
from pathlib import Path
|
||||
from typing import Any
|
||||
|
||||
from ..base import IntegrationBase
|
||||
from ..manifest import IntegrationManifest
|
||||
|
||||
|
||||
class CopilotIntegration(IntegrationBase):
|
||||
"""Integration for GitHub Copilot in VS Code."""
|
||||
|
||||
key = "copilot"
|
||||
config = {
|
||||
"name": "GitHub Copilot",
|
||||
"folder": ".github/",
|
||||
"commands_subdir": "agents",
|
||||
"install_url": None,
|
||||
"requires_cli": False,
|
||||
}
|
||||
registrar_config = {
|
||||
"dir": ".github/agents",
|
||||
"format": "markdown",
|
||||
"args": "$ARGUMENTS",
|
||||
"extension": ".agent.md",
|
||||
}
|
||||
context_file = ".github/copilot-instructions.md"
|
||||
|
||||
def command_filename(self, template_name: str) -> str:
|
||||
"""Copilot commands use ``.agent.md`` extension."""
|
||||
return f"speckit.{template_name}.agent.md"
|
||||
|
||||
def setup(
|
||||
self,
|
||||
project_root: Path,
|
||||
manifest: IntegrationManifest,
|
||||
parsed_options: dict[str, Any] | None = None,
|
||||
**opts: Any,
|
||||
) -> list[Path]:
|
||||
"""Install copilot commands, companion prompts, and VS Code settings.
|
||||
|
||||
Uses base class primitives to: read templates, process them
|
||||
(replace placeholders, strip script blocks, rewrite paths),
|
||||
write as ``.agent.md``, then add companion prompts and VS Code settings.
|
||||
"""
|
||||
project_root_resolved = project_root.resolve()
|
||||
if manifest.project_root != project_root_resolved:
|
||||
raise ValueError(
|
||||
f"manifest.project_root ({manifest.project_root}) does not match "
|
||||
f"project_root ({project_root_resolved})"
|
||||
)
|
||||
|
||||
templates = self.list_command_templates()
|
||||
if not templates:
|
||||
return []
|
||||
|
||||
dest = self.commands_dest(project_root)
|
||||
dest_resolved = dest.resolve()
|
||||
try:
|
||||
dest_resolved.relative_to(project_root_resolved)
|
||||
except ValueError as exc:
|
||||
raise ValueError(
|
||||
f"Integration destination {dest_resolved} escapes "
|
||||
f"project root {project_root_resolved}"
|
||||
) from exc
|
||||
dest.mkdir(parents=True, exist_ok=True)
|
||||
created: list[Path] = []
|
||||
|
||||
script_type = opts.get("script_type", "sh")
|
||||
arg_placeholder = self.registrar_config.get("args", "$ARGUMENTS")
|
||||
|
||||
# 1. Process and write command files as .agent.md
|
||||
for src_file in templates:
|
||||
raw = src_file.read_text(encoding="utf-8")
|
||||
processed = self.process_template(raw, self.key, script_type, arg_placeholder)
|
||||
dst_name = self.command_filename(src_file.stem)
|
||||
dst_file = self.write_file_and_record(
|
||||
processed, dest / dst_name, project_root, manifest
|
||||
)
|
||||
created.append(dst_file)
|
||||
|
||||
# 2. Generate companion .prompt.md files from the templates we just wrote
|
||||
prompts_dir = project_root / ".github" / "prompts"
|
||||
for src_file in templates:
|
||||
cmd_name = f"speckit.{src_file.stem}"
|
||||
prompt_content = f"---\nagent: {cmd_name}\n---\n"
|
||||
prompt_file = self.write_file_and_record(
|
||||
prompt_content,
|
||||
prompts_dir / f"{cmd_name}.prompt.md",
|
||||
project_root,
|
||||
manifest,
|
||||
)
|
||||
created.append(prompt_file)
|
||||
|
||||
# Write .vscode/settings.json
|
||||
        settings_src = self._vscode_settings_path()
        if settings_src and settings_src.is_file():
            dst_settings = project_root / ".vscode" / "settings.json"
            dst_settings.parent.mkdir(parents=True, exist_ok=True)
            if dst_settings.exists():
                # Merge into existing — don't track since we can't safely
                # remove the user's settings file on uninstall.
                self._merge_vscode_settings(settings_src, dst_settings)
            else:
                shutil.copy2(settings_src, dst_settings)
                self.record_file_in_manifest(dst_settings, project_root, manifest)
                created.append(dst_settings)

        # 4. Install integration-specific update-context scripts
        scripts_src = Path(__file__).resolve().parent / "scripts"
        if scripts_src.is_dir():
            scripts_dest = project_root / ".specify" / "integrations" / "copilot" / "scripts"
            scripts_dest.mkdir(parents=True, exist_ok=True)
            for src_script in sorted(scripts_src.iterdir()):
                if src_script.is_file():
                    dst_script = scripts_dest / src_script.name
                    shutil.copy2(src_script, dst_script)
                    # Make shell scripts executable
                    if dst_script.suffix == ".sh":
                        dst_script.chmod(dst_script.stat().st_mode | 0o111)
                    self.record_file_in_manifest(dst_script, project_root, manifest)
                    created.append(dst_script)

        return created

    def _vscode_settings_path(self) -> Path | None:
        """Return path to the bundled vscode-settings.json template."""
        tpl_dir = self.shared_templates_dir()
        if tpl_dir:
            candidate = tpl_dir / "vscode-settings.json"
            if candidate.is_file():
                return candidate
        return None

    @staticmethod
    def _merge_vscode_settings(src: Path, dst: Path) -> None:
        """Merge settings from *src* into existing *dst* JSON file.

        Top-level keys from *src* are added only if missing in *dst*.
        For dict-valued keys, sub-keys are merged the same way.

        If *dst* cannot be parsed (e.g. JSONC with comments), the merge
        is skipped to avoid overwriting user settings.
        """
        try:
            existing = json.loads(dst.read_text(encoding="utf-8"))
        except (json.JSONDecodeError, OSError):
            # Cannot parse existing file (likely JSONC with comments).
            # Skip merge to preserve the user's settings, but show
            # what they should add manually.
            import logging
            template_content = src.read_text(encoding="utf-8")
            logging.getLogger(__name__).warning(
                "Could not parse %s (may contain JSONC comments). "
                "Skipping settings merge to preserve existing file.\n"
                "Please add the following settings manually:\n%s",
                dst, template_content,
            )
            return

        new_settings = json.loads(src.read_text(encoding="utf-8"))

        if not isinstance(existing, dict) or not isinstance(new_settings, dict):
            import logging
            logging.getLogger(__name__).warning(
                "Skipping settings merge: %s or template is not a JSON object.", dst
            )
            return

        changed = False
        for key, value in new_settings.items():
            if key not in existing:
                existing[key] = value
                changed = True
            elif isinstance(existing[key], dict) and isinstance(value, dict):
                for sub_key, sub_value in value.items():
                    if sub_key not in existing[key]:
                        existing[key][sub_key] = sub_value
                        changed = True

        if not changed:
            return

        dst.write_text(
            json.dumps(existing, indent=4) + "\n", encoding="utf-8"
        )
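The merge rules in the docstring can be restated as a standalone sketch (the function name and sample keys below are illustrative, not the integration's API):

```python
import json

def merge_settings(existing: dict, new: dict) -> dict:
    """Illustrative sketch: template keys are added only when missing;
    dict-valued keys get one level of sub-key merging. User values win."""
    for key, value in new.items():
        if key not in existing:
            existing[key] = value
        elif isinstance(existing[key], dict) and isinstance(value, dict):
            for sub_key, sub_value in value.items():
                # setdefault leaves an existing sub-key untouched
                existing[key].setdefault(sub_key, sub_value)
    return existing

user = {"a": 1, "nested": {"x": 1}}
template = {"a": 2, "b": 3, "nested": {"x": 9, "y": 2}}
merged = merge_settings(user, template)
print(json.dumps(merged, sort_keys=True))
# -> {"a": 1, "b": 3, "nested": {"x": 1, "y": 2}}
```

Note that user values always win on conflict; only missing keys (and missing sub-keys of dict values) are filled in from the template.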
@@ -0,0 +1,22 @@
# update-context.ps1 — Copilot integration: create/update .github/copilot-instructions.md
#
# This is the copilot-specific implementation that produces the GitHub
# Copilot instructions file. The shared dispatcher reads
# .specify/integration.json and calls this script.
#
# NOTE: This script is not yet active. It will be activated in Stage 7
# when the shared update-agent-context.ps1 replaces its switch statement
# with integration.json-based dispatch. The shared script must also be
# refactored to support SPECKIT_SOURCE_ONLY (guard the Main call) before
# dot-sourcing will work.
#
# Until then, this delegates to the shared script as a subprocess.

$ErrorActionPreference = 'Stop'

$repoRoot = git rev-parse --show-toplevel 2>$null
if (-not $repoRoot) { $repoRoot = $PWD.Path }

# Invoke shared update-agent-context script as a separate process.
# Dot-sourcing is unsafe until that script guards its Main call.
& "$repoRoot/.specify/scripts/powershell/update-agent-context.ps1" -AgentType copilot
@@ -0,0 +1,22 @@
#!/usr/bin/env bash
# update-context.sh — Copilot integration: create/update .github/copilot-instructions.md
#
# This is the copilot-specific implementation that produces the GitHub
# Copilot instructions file. The shared dispatcher reads
# .specify/integration.json and calls this script.
#
# NOTE: This script is not yet active. It will be activated in Stage 7
# when the shared update-agent-context.sh replaces its case statement
# with integration.json-based dispatch. The shared script must also be
# refactored to support SPECKIT_SOURCE_ONLY (guard the main logic)
# before sourcing will work.
#
# Until then, this delegates to the shared script as a subprocess.

set -euo pipefail

REPO_ROOT="${REPO_ROOT:-$(git rev-parse --show-toplevel 2>/dev/null || pwd)}"

# Invoke shared update-agent-context script as a separate process.
# Sourcing is unsafe until that script guards its main logic.
exec "$REPO_ROOT/.specify/scripts/bash/update-agent-context.sh" copilot
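A minimal sketch of the SPECKIT_SOURCE_ONLY guard the NOTE above anticipates; the variable name comes from the comments, while the `main` body here is a placeholder, not the shipped script:

```shell
#!/usr/bin/env bash
# Sketch of the source-only guard described in the NOTE above.
# The `main` body is a placeholder; only the guard pattern matters.
set -euo pipefail

main() {
    # Placeholder for the real update-agent-context logic.
    echo "updating context for agent: ${1:-copilot}"
}

# Run main only when executed directly. A caller that wants to reuse
# the functions sets SPECKIT_SOURCE_ONLY=1 and sources this file,
# getting the definitions without triggering any side effects.
if [[ "${SPECKIT_SOURCE_ONLY:-}" != "1" ]]; then
    main "$@"
fi
```

With a guard like this in place, the wrapper above could switch from spawning a subprocess to sourcing the shared script directly.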