mirror of
https://github.com/github/spec-kit.git
synced 2026-04-01 02:03:09 +00:00
* feat: Stage 2a — CopilotIntegration with shared template primitives
  - base.py: added granular primitives (shared_commands_dir, shared_templates_dir, list_command_templates, command_filename, commands_dest, copy_command_to_directory, record_file_in_manifest, write_file_and_record, process_template)
  - CopilotIntegration: uses primitives to produce .agent.md commands, companion .prompt.md files, and .vscode/settings.json
  - Verified byte-for-byte parity with old release script output
  - Copilot auto-registered in INTEGRATION_REGISTRY
  - 70 tests (22 new: base primitives + copilot integration)
  Part of #1924
* feat: Stage 2b — --integration flag, routing, agent.json, shared infra
  - Added --integration flag to init() (mutually exclusive with --ai)
  - --ai copilot auto-promotes to integration path with migration nudge
  - Integration setup writes .specify/agent.json with integration key
  - _install_shared_infra() copies scripts and templates to .specify/
  - init-options.json records 'integration' key when used
  - 4 new CLI tests: mutual exclusivity, unknown rejection, copilot end-to-end, auto-promote (74 total integration tests)
  Part of #1924
* feat: Stage 2 completion — integration scripts, integration.json, shared manifest
  - Added copilot/scripts/update-context.sh and .ps1 (thin wrappers that delegate to the shared update-agent-context script)
  - CopilotIntegration.setup() installs integration scripts to .specify/integrations/copilot/scripts/
  - Renamed agent.json → integration.json with script paths
  - _install_shared_infra() now tracks files in integration-shared.manifest.json
  - Updated tests: scripts installed, integration.json has script paths, shared manifest recorded (74 tests)
  Part of #1924
* refactor: rename shared manifest to speckit.manifest.json
  Cleaner naming — the shared infrastructure (scripts, templates) belongs to spec-kit itself, not to any specific integration.
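The record-and-hash idea behind the manifest primitives can be sketched as follows. This is a minimal illustration, not the actual `IntegrationManifest` or `write_file_and_record` implementation; the function name and signature here are hypothetical:

```python
import hashlib
from pathlib import Path


def record_file(root: Path, rel: str, content: str, files: dict) -> Path:
    """Write content under root and remember its SHA-256 in a manifest mapping.

    A later uninstall can compare the stored hash against the file on disk
    and skip anything the user has modified since installation.
    """
    dest = root / rel
    dest.parent.mkdir(parents=True, exist_ok=True)
    dest.write_text(content, encoding="utf-8")
    files[rel] = hashlib.sha256(content.encode("utf-8")).hexdigest()
    return dest
```

The tests further down exercise the real class's version of this behavior, including bytes payloads and path-traversal rejection.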
* fix: copilot update-context scripts reflect target architecture
  Scripts now source shared functions (via SPECKIT_SOURCE_ONLY=1) and call update_agent_file directly with .github/copilot-instructions.md, rather than delegating back to the shared case statement.
* fix: simplify copilot scripts — dispatcher sources common functions
  Integration scripts now contain only copilot-specific logic (target path + agent name). The dispatcher is responsible for sourcing shared functions before calling the integration script.
* fix: copilot update-context scripts are self-contained implementations
  These scripts ARE the implementation — the dispatcher calls them. They source common.sh + update-agent-context functions, gather feature/plan data, then call update_agent_file with the copilot target path (.github/copilot-instructions.md).
* docs: add Stage 7 activation note to copilot update-context scripts
* test: add complete file inventory test for copilot integration
  Validates every single file (37 total) produced by specify init --integration copilot --script sh --no-git.
* test: add PowerShell file inventory test for copilot integration
  Validates all 37 files produced by --script ps variant, including .specify/scripts/powershell/ instead of bash.
* refactor: split test_integrations.py into tests/integrations/ directory
  - test_base.py: IntegrationOption, IntegrationBase, MarkdownIntegration, primitives
  - test_manifest.py: IntegrationManifest, path traversal, persistence, validation
  - test_registry.py: INTEGRATION_REGISTRY
  - test_copilot.py: CopilotIntegration unit tests
  - test_cli.py: --integration flag, auto-promote, file inventories (sh + ps)
  - conftest.py: shared StubIntegration helper
  76 integration tests + 48 consistency tests = 124 total, all passing.
* refactor: move file inventory tests from test_cli to test_copilot
  File inventories are copilot-specific. test_cli.py now only tests CLI flag mechanics (mutual exclusivity, unknown rejection, auto-promote).
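A file inventory test of the kind described above typically collects every file under the generated project as sorted, POSIX-style relative paths so the expected list is identical across platforms. A minimal sketch (the helper name is hypothetical; the real tests live in test_copilot.py):

```python
from pathlib import Path


def inventory(root: Path) -> list[str]:
    """Collect every file under root as sorted, POSIX-style relative paths.

    Using .as_posix() keeps the expected inventory identical on Windows
    and Unix, regardless of the native path separator.
    """
    return sorted(
        p.relative_to(root).as_posix()
        for p in root.rglob("*")
        if p.is_file()
    )
```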
* fix: skip JSONC merge to preserve user settings, fix docstring
  - _merge_vscode_settings() now returns early (skips merge) when existing settings.json can't be parsed (e.g. JSONC with comments), instead of overwriting with empty settings
  - Updated _install_shared_infra() docstring to match implementation (scripts + templates, speckit.manifest.json)
* fix: warn user when JSONC settings merge is skipped
* fix: show template content when JSONC merge is skipped
  User now sees the exact settings they should add manually.
* fix: document process_template requirement, merge scripts without rmtree
  - base.py setup() docstring now explicitly states raw copy behavior and directs to CopilotIntegration for process_template example
  - _install_shared_infra() uses merge/overwrite instead of rmtree to preserve user-added files under .specify/scripts/
* fix: don't overwrite pre-existing shared scripts or templates
  Only write files that don't already exist — preserves any user modifications to shared scripts (common.sh etc.) and templates.
* fix: warn user about skipped pre-existing shared files
  Lists all shared scripts and templates that were not copied because they already existed in the project.
* test: add test for shared infra skip behavior on pre-existing files
  Verifies that _install_shared_infra() preserves user-modified scripts and templates while still installing missing ones.
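The skip-on-unparseable behavior can be sketched as below. This is a hypothetical simplification of the real _merge_vscode_settings(), which additionally warns the user and prints the template content to add manually:

```python
import json
from pathlib import Path


def merge_vscode_settings(settings_path: Path, additions: dict) -> bool:
    """Merge new keys into settings.json; return True only if the file was written.

    Skips (returns False) when the existing file is unparseable, e.g. JSONC
    with comments, which json.loads rejects but VS Code accepts: rewriting it
    would silently destroy the user's comments and trailing commas.
    """
    if settings_path.exists():
        try:
            existing = json.loads(settings_path.read_text(encoding="utf-8"))
        except json.JSONDecodeError:
            return False  # JSONC or corrupt: leave the user's file untouched
        if not isinstance(existing, dict):
            return False  # array, null, etc. cannot be merged into
    else:
        existing = {}
    # Only add keys that are missing; never clobber user values.
    merged = {**existing, **{k: v for k, v in additions.items() if k not in existing}}
    if merged == existing:
        return False  # no-op merge: don't rewrite the file
    settings_path.write_text(json.dumps(merged, indent=2), encoding="utf-8")
    return True
```

Returning False for a no-op merge mirrors the skip-write fix in the next commit group: an unchanged file should not get a fresh mtime.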
* fix: address review — containment check, deterministic prompts, manifest accuracy
  - CopilotIntegration.setup() adds dest containment check (relative_to)
  - Companion prompts generated from templates list, not directory glob
  - _install_shared_infra() only records files actually copied (not pre-existing)
  - VS Code settings tests made unconditional (assert template exists)
  - Inventory tests use .as_posix() for cross-platform paths
* fix: correct PS1 function names, document SPECKIT_SOURCE_ONLY prerequisite
  - Fixed Get-FeaturePaths → Get-FeaturePathsEnv, Read-PlanData → Parse-PlanData
  - Documented that shared scripts must guard Main with SPECKIT_SOURCE_ONLY before these integration scripts can be activated (Stage 7)
* fix: add dict type check for settings merge, simplify PS1 to subprocess
  - _merge_vscode_settings() skips merge with warning if parsed JSON is not a dict (array, null, etc.)
  - PS1 update-context.ps1 uses & invocation instead of dot-sourcing since the shared script runs Main unconditionally
* fix: skip-write on no-op merge, bash subprocess, dynamic integration list
  - _merge_vscode_settings() only writes when keys were actually added
  - update-context.sh uses exec subprocess like PS1 version
  - Unknown integration error lists available integrations dynamically
* fix: align path rewriting with release script, add .specify/.specify/ fix
  Path rewrite regex matches the release script's rewrite_paths() exactly (verified byte-identical output). Added .specify/.specify/ double-prefix fix for additional safety.
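A containment check built on pathlib's `relative_to` can be sketched as follows. This is a standalone illustration of the pattern, not the code inside CopilotIntegration.setup(); resolving both paths first is an assumption that symlinks should be followed before checking:

```python
from pathlib import Path


def is_contained(base: Path, candidate: Path) -> bool:
    """True if candidate resolves to a location inside base.

    Path.relative_to raises ValueError when the candidate is not under
    base, which is exactly the escape condition a dest containment check
    (and the manifest's path-traversal tests below) must reject.
    """
    try:
        candidate.resolve().relative_to(base.resolve())
        return True
    except ValueError:
        return False
```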
246 lines
9.4 KiB
Python
"""Tests for IntegrationManifest — record, hash, save, load, uninstall, modified detection."""
|
|
|
|
import hashlib
|
|
import json
|
|
|
|
import pytest
|
|
|
|
from specify_cli.integrations.manifest import IntegrationManifest, _sha256
|
|
|
|
|
|
class TestManifestRecordFile:
|
|
def test_record_file_writes_and_hashes(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
content = "hello world"
|
|
abs_path = m.record_file("a/b.txt", content)
|
|
assert abs_path == tmp_path / "a" / "b.txt"
|
|
assert abs_path.read_text(encoding="utf-8") == content
|
|
expected_hash = hashlib.sha256(content.encode()).hexdigest()
|
|
assert m.files["a/b.txt"] == expected_hash
|
|
|
|
def test_record_file_bytes(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
data = b"\x00\x01\x02"
|
|
abs_path = m.record_file("bin.dat", data)
|
|
assert abs_path.read_bytes() == data
|
|
assert m.files["bin.dat"] == hashlib.sha256(data).hexdigest()
|
|
|
|
def test_record_existing(self, tmp_path):
|
|
f = tmp_path / "existing.txt"
|
|
f.write_text("content", encoding="utf-8")
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_existing("existing.txt")
|
|
assert m.files["existing.txt"] == _sha256(f)
|
|
|
|
|
|
class TestManifestPathTraversal:
|
|
def test_record_file_rejects_parent_traversal(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
with pytest.raises(ValueError, match="outside"):
|
|
m.record_file("../escape.txt", "bad")
|
|
|
|
def test_record_file_rejects_absolute_path(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
with pytest.raises(ValueError, match="Absolute paths"):
|
|
m.record_file("/tmp/escape.txt", "bad")
|
|
|
|
def test_record_existing_rejects_parent_traversal(self, tmp_path):
|
|
escape = tmp_path.parent / "escape.txt"
|
|
escape.write_text("evil", encoding="utf-8")
|
|
try:
|
|
m = IntegrationManifest("test", tmp_path)
|
|
with pytest.raises(ValueError, match="outside"):
|
|
m.record_existing("../escape.txt")
|
|
finally:
|
|
escape.unlink(missing_ok=True)
|
|
|
|
def test_uninstall_skips_traversal_paths(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_file("safe.txt", "good")
|
|
m._files["../outside.txt"] = "fakehash"
|
|
m.save()
|
|
removed, skipped = m.uninstall()
|
|
assert len(removed) == 1
|
|
assert removed[0].name == "safe.txt"
|
|
|
|
|
|
class TestManifestCheckModified:
|
|
def test_unmodified_file(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_file("f.txt", "original")
|
|
assert m.check_modified() == []
|
|
|
|
def test_modified_file(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_file("f.txt", "original")
|
|
(tmp_path / "f.txt").write_text("changed", encoding="utf-8")
|
|
assert m.check_modified() == ["f.txt"]
|
|
|
|
def test_deleted_file_not_reported(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_file("f.txt", "original")
|
|
(tmp_path / "f.txt").unlink()
|
|
assert m.check_modified() == []
|
|
|
|
def test_symlink_treated_as_modified(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_file("f.txt", "original")
|
|
target = tmp_path / "target.txt"
|
|
target.write_text("target", encoding="utf-8")
|
|
(tmp_path / "f.txt").unlink()
|
|
(tmp_path / "f.txt").symlink_to(target)
|
|
assert m.check_modified() == ["f.txt"]
|
|
|
|
|
|
class TestManifestUninstall:
|
|
def test_removes_unmodified(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_file("d/f.txt", "content")
|
|
m.save()
|
|
removed, skipped = m.uninstall()
|
|
assert len(removed) == 1
|
|
assert not (tmp_path / "d" / "f.txt").exists()
|
|
assert not (tmp_path / "d").exists()
|
|
assert skipped == []
|
|
|
|
def test_skips_modified(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_file("f.txt", "original")
|
|
m.save()
|
|
(tmp_path / "f.txt").write_text("modified", encoding="utf-8")
|
|
removed, skipped = m.uninstall()
|
|
assert removed == []
|
|
assert len(skipped) == 1
|
|
assert (tmp_path / "f.txt").exists()
|
|
|
|
def test_force_removes_modified(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_file("f.txt", "original")
|
|
m.save()
|
|
(tmp_path / "f.txt").write_text("modified", encoding="utf-8")
|
|
removed, skipped = m.uninstall(force=True)
|
|
assert len(removed) == 1
|
|
assert skipped == []
|
|
|
|
def test_already_deleted_file(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_file("f.txt", "content")
|
|
m.save()
|
|
(tmp_path / "f.txt").unlink()
|
|
removed, skipped = m.uninstall()
|
|
assert removed == []
|
|
assert skipped == []
|
|
|
|
def test_removes_manifest_file(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path, version="1.0")
|
|
m.record_file("f.txt", "content")
|
|
m.save()
|
|
assert m.manifest_path.exists()
|
|
m.uninstall()
|
|
assert not m.manifest_path.exists()
|
|
|
|
def test_cleans_empty_parent_dirs(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_file("a/b/c/f.txt", "content")
|
|
m.save()
|
|
m.uninstall()
|
|
assert not (tmp_path / "a").exists()
|
|
|
|
def test_preserves_nonempty_parent_dirs(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_file("a/b/tracked.txt", "content")
|
|
(tmp_path / "a" / "b" / "other.txt").write_text("keep", encoding="utf-8")
|
|
m.save()
|
|
m.uninstall()
|
|
assert not (tmp_path / "a" / "b" / "tracked.txt").exists()
|
|
assert (tmp_path / "a" / "b" / "other.txt").exists()
|
|
|
|
def test_symlink_skipped_without_force(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_file("f.txt", "original")
|
|
m.save()
|
|
target = tmp_path / "target.txt"
|
|
target.write_text("target", encoding="utf-8")
|
|
(tmp_path / "f.txt").unlink()
|
|
(tmp_path / "f.txt").symlink_to(target)
|
|
removed, skipped = m.uninstall()
|
|
assert removed == []
|
|
assert len(skipped) == 1
|
|
|
|
def test_symlink_removed_with_force(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_file("f.txt", "original")
|
|
m.save()
|
|
target = tmp_path / "target.txt"
|
|
target.write_text("target", encoding="utf-8")
|
|
(tmp_path / "f.txt").unlink()
|
|
(tmp_path / "f.txt").symlink_to(target)
|
|
removed, skipped = m.uninstall(force=True)
|
|
assert len(removed) == 1
|
|
assert target.exists()
|
|
|
|
|
|
class TestManifestPersistence:
|
|
def test_save_and_load_roundtrip(self, tmp_path):
|
|
m = IntegrationManifest("myagent", tmp_path, version="2.0.1")
|
|
m.record_file("dir/file.md", "# Hello")
|
|
m.save()
|
|
loaded = IntegrationManifest.load("myagent", tmp_path)
|
|
assert loaded.key == "myagent"
|
|
assert loaded.version == "2.0.1"
|
|
assert loaded.files == m.files
|
|
|
|
def test_manifest_path(self, tmp_path):
|
|
m = IntegrationManifest("copilot", tmp_path)
|
|
assert m.manifest_path == tmp_path / ".specify" / "integrations" / "copilot.manifest.json"
|
|
|
|
def test_load_missing_raises(self, tmp_path):
|
|
with pytest.raises(FileNotFoundError):
|
|
IntegrationManifest.load("nonexistent", tmp_path)
|
|
|
|
def test_save_creates_directories(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_file("f.txt", "content")
|
|
path = m.save()
|
|
assert path.exists()
|
|
data = json.loads(path.read_text(encoding="utf-8"))
|
|
assert data["integration"] == "test"
|
|
|
|
def test_save_preserves_installed_at(self, tmp_path):
|
|
m = IntegrationManifest("test", tmp_path)
|
|
m.record_file("f.txt", "content")
|
|
m.save()
|
|
first_ts = m._installed_at
|
|
m.save()
|
|
assert m._installed_at == first_ts
|
|
|
|
|
|
class TestManifestLoadValidation:
|
|
def test_load_non_dict_raises(self, tmp_path):
|
|
path = tmp_path / ".specify" / "integrations" / "bad.manifest.json"
|
|
path.parent.mkdir(parents=True)
|
|
path.write_text('"just a string"', encoding="utf-8")
|
|
with pytest.raises(ValueError, match="JSON object"):
|
|
IntegrationManifest.load("bad", tmp_path)
|
|
|
|
def test_load_bad_files_type_raises(self, tmp_path):
|
|
path = tmp_path / ".specify" / "integrations" / "bad.manifest.json"
|
|
path.parent.mkdir(parents=True)
|
|
path.write_text(json.dumps({"files": ["not", "a", "dict"]}), encoding="utf-8")
|
|
with pytest.raises(ValueError, match="mapping"):
|
|
IntegrationManifest.load("bad", tmp_path)
|
|
|
|
def test_load_bad_files_values_raises(self, tmp_path):
|
|
path = tmp_path / ".specify" / "integrations" / "bad.manifest.json"
|
|
path.parent.mkdir(parents=True)
|
|
path.write_text(json.dumps({"files": {"a.txt": 123}}), encoding="utf-8")
|
|
with pytest.raises(ValueError, match="mapping"):
|
|
IntegrationManifest.load("bad", tmp_path)
|
|
|
|
def test_load_invalid_json_raises(self, tmp_path):
|
|
path = tmp_path / ".specify" / "integrations" / "bad.manifest.json"
|
|
path.parent.mkdir(parents=True)
|
|
path.write_text("{not valid json", encoding="utf-8")
|
|
with pytest.raises(ValueError, match="invalid JSON"):
|
|
IntegrationManifest.load("bad", tmp_path)
|