feat(scaffold): ADR-022 scaffold profiles (all phases)

Add profile-aware scaffold system so SF does not lay down irrelevant
templates in infra/ops/docs repos.

## What ships

Phase 1 — data model
- scaffold-versioning.js: add 'disabled' to VALID_STATES; readScaffoldManifest
  returns profile field; recordScaffoldApply preserves manifest.profile (fixes
  roundtrip bug where profile was stripped on every write).
- scaffold-constants.js: PROFILES (app/library/infra/docs/minimal as Set<string>)
  and PROFILE_NAMES exports.

Phase 2 — profile-aware drift detection
- scaffold-drift.js: disabled bucket in emptyCounts, resolveActiveProfileSet
  integration, profile param on detectScaffoldDrift/migrateLegacyScaffold.
- doc-checker.js: filter to active profile, skip disabled-state files.

Phase 3 — auto-detection on first run
- scaffold-profiles.js: detectRepoProfile() heuristics (nix→infra,
  terraform→infra, react→app, node-no-ui→library, docs-only→docs, else→app).
- agentic-docs-scaffold.js: reads profile from manifest, auto-detects on first
  run, persists to manifest, filters SCAFFOLD_FILES to active profile.

Phase 4 — migrate command
- commands-scaffold-migrate.js: sf scaffold migrate --profile <name>
  Re-enables pending files entering the new profile; stamps state=disabled
  (or prunes with --prune) files leaving it; warns on editing/completed files.
- commands/handlers/ops.js, commands/catalog.js: registered and tab-completed.

Phase 5 — custom profiles + PREFERENCES.md frontmatter
- scaffold-profiles.js: readPreferencesProfile(), loadCustomProfileSet()
  (~/.sf/profiles/<name>.yaml with extends/add/remove), resolveActiveProfileSet()
  implementing full ADR-022 §6 precedence.
- All callers updated to use resolveActiveProfileSet as the single source of truth.

Tests: 28 new tests in adr-022-scaffold-profiles.test.mjs — all passing.
Pre-existing node:test stubs (3 files) unaffected.

ADR: docs/dev/ADR-022-scaffold-profiles.md

Misc: triage TODO.md dump into BACKLOG.md (phases-helpers export error T1,
/todo triage typed-handler gap T1, structured triage tiers T2, sha-track
markdown files T2, cross-repo triage T3). Reset TODO.md to empty template.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Mikael Hugo 2026-05-12 15:28:03 +02:00
parent ad53b792fb
commit 2bb9cdbeef
16 changed files with 1380 additions and 360 deletions


@ -1,50 +0,0 @@
id: default-safe
description: >-
  Conservative default. Confirmations required for destructive
  filesystem and git operations; network and exec allowed but logged.
capabilities:
  filesystem:
    read: allow
    write: confirm
    delete: confirm
  exec:
    enabled: confirm
  network:
    enabled: allow
    allow_hosts:
      - "*"
    deny_hosts: []
  mcp:
    enabled: allow
paths:
  allow:
    - "**"
  deny:
    - "~/.ssh/**"
    - "**/.env"
    - "**/.env.*"
    - "**/secrets/**"
    - ".sf/sf.db"
    - ".sf/sf.db-*"
    - ".sf/backups/**"
  redact:
    - "**/*api_key*"
    - "**/*token*"
    - "**/*password*"
    - "**/.env*"
confirmations:
  requiredFor:
    - rm -rf
    - git push --force
    - git push -f
    - git reset --hard
    - git clean -fdx
    - drop_table
    - drop_database
limits:
  max_files_per_op: 100
  max_command_runtime_sec: 600


@ -1,46 +0,0 @@
id: yolo
description: >-
  Confirmation-free policy applied when the YOLO flag is active
  (Ctrl+Y / /mode yolo). YOLO is a flag layered on top of Build or
  Autonomous — it is NOT a mode and does not appear as a Shift+Tab
  stop. Destructive operations execute without prompting. Path denies
  and redactions still apply.
capabilities:
  filesystem:
    read: allow
    write: allow
    delete: allow
  exec:
    enabled: allow
  network:
    enabled: allow
    allow_hosts:
      - "*"
    deny_hosts: []
  mcp:
    enabled: allow
paths:
  allow:
    - "**"
  deny:
    - "~/.ssh/**"
    - "**/.env"
    - "**/.env.*"
    - "**/secrets/**"
    - ".sf/sf.db"
    - ".sf/sf.db-*"
    - ".sf/backups/**"
  redact:
    - "**/*api_key*"
    - "**/*token*"
    - "**/*password*"
    - "**/.env*"
confirmations:
  requiredFor: []
limits:
  max_files_per_op: 1000
  max_command_runtime_sec: 3600


@ -1,12 +0,0 @@
# Base Prompt
You are an AI agent working in this repository. Before changing code:
1. Read the file you're editing in full.
2. Read related files (callers, callees, tests).
3. Match existing patterns and style.
4. Add or update tests for behavior changes.
Default to the smallest change that solves the problem. Prefer fixing
the root cause over patching the symptom. Surface uncertainties to the
operator rather than guessing.


@ -4,6 +4,63 @@ Items gated on future milestones or external dependencies.
---
## Phases-helpers extension-load error (pre-triage, T1)
- **Source:** TODO.md triage 2025-06
- **Symptom:** Every `sf …` invocation prints `Extension load error: './phases-helpers.js' does not provide an export named 'closeoutAndStop'`
- **Root cause:** Recent rename in `phases-helpers.js` not propagated to its importer(s); or `npm run copy-resources` shipped a partial state.
- **Fix:** Locate callers of `closeoutAndStop` in the extension source, update the import to the new symbol name. Add a test that imports every symbol from the extension entry point and asserts they all resolve.
- **Priority:** T1 — noisy on every run, degrades operator confidence.
---
## Slash command `/todo triage` must route through typed backend (pre-triage, T1)
- **Source:** TODO.md triage 2025-06
- **Symptom:** `sf --print "/todo triage"` triggers the agent, which reads TODO.md and emits triage-shaped markdown, but never calls `handleTodo → triageTodoDump`. DB records never written; patched backend bypassed.
- **Fix:**
1. In the slash-command dispatch prompt, enumerate handlers and forbid the LLM from doing the work itself when a typed handler exists.
2. Add integration test: run `sf --print "/todo triage"` against a fixture TODO.md, assert `triage_runs` rows appear in `sf.db`.
- **Priority:** T1 — core correctness issue, not a UX polish.
---
## Triage result needs structured tier/priority per item (pre-triage, T2)
- **Source:** TODO.md triage 2025-06
- **Problem:** Tiers (T1/T2/T3) appear only in LLM prose appended to `BUILD_PLAN.md`, not as structured fields per item. Blocks downstream automation that needs to escalate Tier-1 items to milestones.
- **Fix:** Extend triage JSON schema:
```ts
{ title: string, tier: "T1" | "T2" | "T3", rationale: string }
```
Update `appendBacklogItems` + future milestone-escalator to consume the structured tier.
- **Priority:** T2 — enables milestone automation; blocks `sf plan promote` from triage.
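  A hedged sketch of what a validator for the extended item shape could look like — names and shape are assumptions from the schema above, not SF's actual schema code:

```js
// Hypothetical validator for the extended triage item shape (illustrative only).
const TIERS = new Set(["T1", "T2", "T3"]);

function isTriageItem(x) {
  return (
    typeof x?.title === "string" &&
    TIERS.has(x?.tier) && // rejects missing or unknown tiers like "T4"
    typeof x?.rationale === "string"
  );
}
```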
---
## Sha-track source-of-truth markdown files, diff on change (pre-triage, T2)
- **Source:** TODO.md triage 2025-06
- **Want:** On session start + autonomous-cycle entry, hash `AGENTS.md`, `README.md`, `.sf/wiki/**/*.md`, `.sf/milestones/**/*.md`, `docs/adr/**/*.md`, `docs/plans/**/*.md`. Diff against last-seen hash in `sf.db`. Surface changed files for review/accept.
- **Schema:**
```sql
CREATE TABLE tracked_md_files (
relpath TEXT PRIMARY KEY, sha256 TEXT NOT NULL, size_bytes INTEGER NOT NULL,
last_seen_at TEXT NOT NULL, last_seen_commit TEXT, category TEXT
);
```
- **Out of scope:** `TODO.md`, `CHANGELOG.md`, `BUILD_PLAN.md`, `node_modules`, `dist`.
- **Priority:** T2 — high value for cross-agent coordination; deferred behind T1 fixes.
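  The per-file hash step might be sketched like this — field names mirror the `tracked_md_files` schema above, but nothing here is SF's actual implementation:

```js
import { createHash } from "node:crypto";
import { readFileSync, statSync } from "node:fs";

// Hypothetical sketch of one hash observation; names mirror tracked_md_files.
function hashTrackedFile(relpath) {
  const body = readFileSync(relpath);
  return {
    relpath,
    sha256: createHash("sha256").update(body).digest("hex"),
    size_bytes: statSync(relpath).size,
    last_seen_at: new Date().toISOString(),
  };
}

// A file "drifted" if it is new or its content hash no longer matches the DB row.
function hasDrifted(current, lastSeen) {
  return !lastSeen || lastSeen.sha256 !== current.sha256;
}
```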
---
## Cross-repo triage / unified backlog view (pre-triage, T3)
- **Source:** TODO.md triage 2025-06
- **Want:** `sf headless triage-all-repos --config ~/.sf/repos.yaml` — walk N repo paths, run `triageTodoDump` per repo in its own SF db, emit a unified read-only aggregated report sorted by priority/tier.
- **Constraints:** Per-repo SF dbs stay separate; cross-repo view is read-only aggregation into `~/.sf/cross-repo-view.md`.
- **Priority:** T3 — useful for multi-repo operators; deferred until T1/T2 items land.
## M009 Promote-Only Adoption Review
- **Gate:** M010 (schedule system) must ship first

TODO.md

@ -3,198 +3,3 @@
Dump anything here.
---
## Cross-repo triage / unified backlog view
Today's dogfood: a scan across active repos found **~40 TODO.md files**
totalling **~10,000+ lines** across `/srv/infra`, `/srv/operations-memory`,
`/home/mhugo/code/singularity-engine` (27 subdir TODOs, 9,000+ lines),
`/home/mhugo/code/inference-fabric` (8 crate TODOs), plus per-repo
singletons in ace-coder, dks-web, vectordrive, centralcloud, etc.
The per-subdir files are **not noise** — most are substantive design
specs scoped to their domain/crate/service. Collapsing them into a
single root file would destroy useful structure.
The actual gap: **no single way to see "what's queued across all the
repos" at once.** Today this requires walking N repos by hand.
Wanted:
```
sf headless triage-all-repos --config ~/.sf/repos.yaml
```
Where `~/.sf/repos.yaml` is a list of repo paths and (optional) per-repo
priority. For each repo:
1. If `TODO.md` has non-template content, run `triageTodoDump` in that
repo's SF db.
2. After all repos triaged, emit a unified report: one row per backlog
item across all repos, sortable by priority / tier / inserted_at.
3. Optionally produce a single `~/.sf/cross-repo-view.md` for quick
human reading.
Per-repo SF dbs stay separate (each repo owns its work); the cross-repo
view is read-only aggregation.
## Slash command `/todo triage` should actually invoke the typed backend
Observed today: `sf --print "/todo triage"` ran the agent, which read
TODO.md and emitted a triage-shaped markdown response, but the agent
**did not call `handleTodo` → `triageTodoDump`** — it re-implemented the
flow in natural language via Read/Write tools. Side effect: a patched
backend in `commands-todo.js` was bypassed entirely.
Wanted: when a slash command has a registered typed handler in the
extension surface (i.e. `handleTodo`, `handleNewMilestone`, …), the
agent's prompt should *require* the call go through that handler rather
than letting the LLM improvise. The handler can be invoked as a tool
call so the LLM still has narrative space, but the side effects (DB
writes, file scaffolds, etc.) come from the typed path, not from raw
Write/Edit on TODO.md.
Concretely:
- In `slash-commands.md` (or wherever the slash dispatch prompt lives),
enumerate handlers and forbid the LLM from "doing the work" itself
when a typed handler exists.
- Add an integration test that runs `sf --print "/todo triage"` against
a fixture TODO.md and asserts that `triage_runs` rows appear in
`sf.db` (i.e. the backend ran, not just the LLM).
## Triage result needs structured tier/priority per item
Current shape:
```ts
result.implementation_tasks: string[] // titles only
result.memory_requirements: string[]
result.harness_suggestions: string[]
result.docs_or_tests: string[]
result.unclear_notes: string[]
result.eval_candidates: { id, task_input, expected_behavior, … }[]
```
Tiers (T1 / T2 / T3) appear only in the LLM-prose tier list it appends
to `BUILD_PLAN.md`. They are **not** present as a structured field per
item. That blocks any downstream "for each Tier-1 item, scaffold a
milestone" automation — the tier info is locked in prose.
Wanted: extend the triage JSON schema so each implementation task is
```ts
{ title: string, tier: "T1" | "T2" | "T3", rationale: string }
```
and update `appendBacklogItems` + a future milestone-escalator to read
the structured tier rather than re-parsing markdown.
## Sha-track every source-of-truth markdown file, diff on change
Generalised from the milestone-files case: **any markdown file that is
a source of truth for SF or for humans navigating the repo** should
be sha-tracked, and any change since SF last saw it should surface as
a diff for review (or auto-accept under a configured policy).
In scope (per repo):
- **Repo-level meta:** `AGENTS.md`, `README.md`, `STATUS.md`,
`BACKLOG.md`, `STANDALONE.md`, `MIGRATION.md`, etc. (any uppercase
root-level `.md`)
- **Pointer:** `.github/copilot-instructions.md`
- **Wiki:** `.sf/wiki/**/*.md`
- **Planning:** `.sf/milestones/**/*.md` (`CONTEXT`, `MILESTONE-SUMMARY`,
`ROADMAP`, `SUMMARY` per milestone; `PLAN` / `SUMMARY` per slice; same
per task)
- **ADRs:** `docs/adr/**/*.md` (these should rarely change, so any
edit is loud and worth surfacing)
- **Triage outputs:** `docs/plans/**/*.md`
Explicit out of scope:
- `TODO.md` — gets reset to empty template by `/todo triage` on every
cycle; tracking churn here is just noise.
- `CHANGELOG.md` / `BUILD_PLAN.md` — append-only by design; sha churn
is expected, no signal in tracking.
- `node_modules`, `dist`, vendored copies — irrelevant.
Storage in `sf.db` — sha + git ref, no content snapshots. Git is the
version store; the DB is just a pointer:
```sql
CREATE TABLE tracked_md_files (
relpath TEXT PRIMARY KEY, -- repo-relative path
sha256 TEXT NOT NULL, -- hash of last-seen content
size_bytes INTEGER NOT NULL,
last_seen_at TEXT NOT NULL,
last_seen_commit TEXT, -- git SHA1 of HEAD when observed
category TEXT -- 'meta'|'wiki'|'milestone'|'adr'|'plan'
);
```
Diff source priority:
1. **Tracked + committed at observation** (the common case):
`git diff <last_seen_commit> -- <path>` shows everything since.
Cheap, no blob, perfect history via `git log <path>` if needed.
2. **Tracked + uncommitted at observation** (mid-edit corner): no git
ref points at that exact content. Diff shows "changed since
`<last_seen_commit>`" but the prior intermediate working-tree state
isn't reconstructable. Acceptable trade-off — the main signal is
"changed", and the operator can commit before letting SF observe
if intermediate fidelity matters.
3. **Untracked / gitignored**: not tracked in this table. SF-generated
transient files don't belong in version control or in this audit.
History per file = `git log <relpath>` (already there, free). SF's DB
just records "where I left off." No `md_observation_log` history
table unless someone has a concrete need for an SF-side timeline.
On session start + each autonomous-cycle entry, walk the configured
glob set, hash each file, diff against `tracked_md_files.sha256`.
For each changed file:
1. Surface to operator: "**N** files changed since SF last saw — review
or accept?" with per-file diff (computed from git, not from a DB
blob).
2. On accept → update sha + last_seen_at. No content stored.
3. New files (sha not in DB) → classify by glob category, store sha,
continue.
4. Deleted files → archive the DB row (mark inactive); don't purge
until operator confirms.
Useful for:
- hand-edits / cross-agent edits / git pulls (the original
milestone-files motivation)
- catching when an AGENTS.md drifted because someone edited it during
a code review and nobody told SF
- ADR drift detection — ADRs should almost never change; if one does,
surface it loudly
- treating `.sf/wiki/*` as living docs that need review when they
drift from what `sf` has internalised
Storage cost: ~40 bytes per file (sha + meta) + optional gzipped
snapshot (typically 30-70 % of original size). Negligible vs. the
rest of `sf.db`.
## Phases-helpers extension-load error on every SF run
Every `sf …` invocation today prints:
```
[sf] Extension load error Error: Failed to load extension
"/home/mhugo/.sf/agent/extensions/sf/index.js": The requested module
'./phases-helpers.js' does not provide an export named 'closeoutAndStop'
```
Non-fatal (SF continues), but noisy and a sign of stale state. Either:
- A recent rename of `closeoutAndStop` in `phases-helpers.js` wasn't
propagated to its caller, and `npm run copy-resources` quietly shipped
the partial state, or
- A test gap doesn't catch missing exports from `phases-helpers.js`.
Add an import-time sanity check (or a test that imports every entry
in the extension index and asserts all required symbols resolve).


@ -0,0 +1,245 @@
# ADR-022: Scaffold Profiles
**Status:** Proposed
**Date:** 2026-05-12
**Deciders:** Mikael Hugo
**Extends:** [ADR-021 — Versioned Documents and Upgrade Path](ADR-021-versioned-documents-and-upgrade-path.md)
## Context
ADR-021 introduced a per-file state machine (`pending` / `editing` / `completed`),
version markers, drift detection, and an automatic upgrade pipeline for the
`SCAFFOLD_FILES` template set. That set is a single flat list tuned for a
code-app shaped repo.
When SF is bootstrapped into a non-code-shaped repo — infra, ops, GitOps,
documentation-only — it writes ~6070 % of templates that don't apply. Files
like `docs/FRONTEND.md`, `docs/DESIGN.md`, and `docs/PRODUCT_SENSE.md` have no
meaning in a Kubernetes config repo. They sit at `state=pending` forever,
polluting the agent's scaffold context and producing spurious `upgradable`
findings in `/sf doctor` on every run.
SF is being adopted in non-code repos. Each new repo type that doesn't fit the
`app` profile creates the same friction without a general solution.
## Decision
Introduce a **profile** system for scaffold file sets. A profile is a named
subset of `SCAFFOLD_FILES` describing which templates apply to a given repo
shape. Add a `disabled` state to the ADR-021 state machine for templates that
are explicitly out-of-scope for the active profile.
### 1. Built-in profiles
Defined in `scaffold-constants.js`:
| Profile | Included templates | Typical repo shape |
|---------|-------------------|-------------------|
| `app` | Full `SCAFFOLD_FILES` list | Product or CLI with UI, tests, frontend |
| `library` | `app` minus `docs/FRONTEND.md`, `docs/PRODUCT_SENSE.md`, `docs/DESIGN.md` | Library, SDK, CLI tool without UI |
| `infra` | AGENTS.md, ARCHITECTURE.md, .siftignore, docs/SECURITY.md, docs/RELIABILITY.md, docs/records/**, .sf/PRINCIPLES.md, .sf/STYLE.md, .sf/NON-GOALS.md, .sf/harness/** | GitOps, Kubernetes, Flux, Helm, Terraform |
| `docs` | AGENTS.md, .sf/STYLE.md | Pure documentation repo |
| `minimal` | .siftignore, AGENTS.md | Any repo; smallest footprint |
`app` is the default (preserves existing behaviour for code repos).
Profiles are expressed as sets of template paths from `SCAFFOLD_FILES`. The
`PROFILES` object maps profile names to `Set<string>`. A file not in the active
profile is treated as if it were absent from `SCAFFOLD_FILES` for all scaffold
operations.
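In code, the mapping can be sketched as follows — only a handful of paths are shown, and the `activeSet` helper is illustrative, not part of the shipped API:

```js
// Sketch of the PROFILES shape in scaffold-constants.js (paths abbreviated).
const PROFILES = {
  app: new Set(["AGENTS.md", ".siftignore", "docs/FRONTEND.md", "docs/DESIGN.md"]),
  minimal: new Set([".siftignore", "AGENTS.md"]),
  docs: new Set(["AGENTS.md", ".sf/STYLE.md"]),
};
const PROFILE_NAMES = Object.keys(PROFILES);

// Unknown profile names fall back to `app` (see Failure modes below).
// `activeSet` is a hypothetical helper for illustration only.
function activeSet(name) {
  return PROFILES[name] ?? PROFILES.app;
}
```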
### 2. `disabled` state (4th ADR-021 state)
A new state alongside `pending` / `editing` / `completed`:
| State | Definition | SF action on drift |
|-------|------------|--------------------|
| `disabled` | Template does not apply to this repo. Written by migrate command or manually. | Never created. Never modified. Treated as out-of-scope. |
The marker format is unchanged:
```
<!-- sf-doc: version=2.75.x template=docs/FRONTEND.md state=disabled hash=sha256:… -->
```
`disabled` is valid in `VALID_STATES`. Any existing code path that calls
`parseMarker` on a disabled file will parse it correctly (not fall through to
`untracked`).
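As a sketch (the real `parseMarker` lives in `scaffold-versioning.js`; the regex here is an assumption about the marker grammar, not its actual implementation):

```js
// All four states, including the new `disabled`, are valid marker states.
const VALID_STATES = new Set(["pending", "editing", "completed", "disabled"]);

// Illustrative parser for the marker line shown above.
function parseMarker(line) {
  const m = /^<!-- sf-doc: version=(\S+) template=(\S+) state=(\S+) hash=(\S+) -->$/.exec(line.trim());
  if (!m) return null;
  const [, version, template, state, hash] = m;
  if (!VALID_STATES.has(state)) return null; // unknown state → file counts as untracked
  return { version, template, state, hash };
}
```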
#### State derivation (extends ADR-021 §1 table)
| Marker? | state field | Body hash = marker.hash? | → State |
|---------|-------------|--------------------------|---------|
| yes | `disabled` | any | `disabled` |
| (existing ADR-021 rows) | … | … | … |
#### What `disabled` means for drift detection
- Files with `state=disabled``disabled` drift bucket. Never `missing`,
`upgradable`, or `editing-drift`.
- Files not in the active profile that don't exist on disk → skipped entirely.
Not reported as `missing`. SF does not write them.
- `/sf doctor` does not count `disabled` files as actionable. The doctor
finding counts only `missing + upgradable + editing-drift`.
### 3. Profile storage
Profile is stored in `.sf/scaffold-manifest.json` under a new `profile` field.
Added **additively** — no `schemaVersion` bump (a bump would wipe existing
`applied` records on first upgrade).
```json
{
"schemaVersion": 1,
"profile": "infra",
"applied": [ … ]
}
```
`readScaffoldManifest` returns `profile: parsed.profile ?? null`.
`recordScaffoldApply` preserves the `profile` field on every write (prior
implementation stripped all fields beyond `schemaVersion` + `applied`).
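A minimal sketch of the preservation fix, reduced to a pure function (the shipped `recordScaffoldApply` also takes a base path and writes the manifest to disk):

```js
// Spread the incoming manifest so extra fields like `profile` survive every
// write; the prior implementation rebuilt the object from
// schemaVersion + applied only, stripping `profile` on each call.
function recordScaffoldApply(manifest, entry) {
  return {
    ...manifest,
    schemaVersion: manifest.schemaVersion ?? 1,
    applied: [...(manifest.applied ?? []), entry],
  };
}
```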
### 4. Profile auto-detection
`detectRepoProfile(basePath)` returns the best-fit built-in profile for a repo
it has never seen before. Runs once on first scaffold; result stored in manifest.
Detection logic (uses `repo-profiler.js` / `detectStacks()` as signal source):
| Signal | → Profile |
|--------|-----------|
| `kustomization.yaml`, `flux-system/`, `Chart.yaml`, or `helmrelease.yaml` at root | `infra` |
| `flake.nix` / `shell.nix` (nix stack) with no `package.json` | `infra` |
| `package.json` with UI framework dep (`next`, `vite`, `remix`, `sveltekit`, `@sveltejs/kit`, `nuxt`) | `app` |
| `package.json` without UI framework dep | `library` |
| `Cargo.toml`, `go.mod`, or `pyproject.toml` | `library` |
| No source files, only `.md` docs | `docs` |
| No matching signals | `app` (default) |
Signals are evaluated top-to-bottom; first match wins. The manifest `profile`
field overrides auto-detection on all subsequent runs (explicit beats inferred).
`ensureAgenticDocsScaffold` reads the active profile from the manifest (or
auto-detects it on first run) and filters `SCAFFOLD_FILES` to only those paths
in the active profile before writing any files.
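A sketch of the first-match-wins table, using plain filesystem probes as a stand-in for the real `detectStacks()` signals — the docs-only check here is deliberately crude, and this is not the shipped `repo-profiler.js` integration:

```js
import { existsSync, readFileSync, readdirSync } from "node:fs";
import { join } from "node:path";

const UI_DEPS = ["next", "vite", "remix", "sveltekit", "@sveltejs/kit", "nuxt"];

// Illustrative detection table; rows evaluated top-to-bottom, first match wins.
function detectRepoProfile(base) {
  const has = (f) => existsSync(join(base, f));
  if (["kustomization.yaml", "flux-system", "Chart.yaml", "helmrelease.yaml"].some(has)) return "infra";
  if ((has("flake.nix") || has("shell.nix")) && !has("package.json")) return "infra";
  if (has("package.json")) {
    const pkg = JSON.parse(readFileSync(join(base, "package.json"), "utf8"));
    const deps = { ...pkg.dependencies, ...pkg.devDependencies };
    return UI_DEPS.some((d) => d in deps) ? "app" : "library";
  }
  if (["Cargo.toml", "go.mod", "pyproject.toml"].some(has)) return "library";
  const entries = readdirSync(base);
  if (entries.length > 0 && entries.every((e) => e.endsWith(".md"))) return "docs";
  return "app"; // default
}
```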
### 5. `sf scaffold migrate --profile <name> [--prune]`
Manual command for repos already bootstrapped under the wrong profile.
```
sf scaffold migrate --profile infra [--prune]
```
Algorithm:
1. Read existing markers + manifest. Validate target profile name.
2. **Re-enable step** — for each file IN the target profile with `state=disabled`
on disk:
- If `bodyHash(body) === marker.hash` (no user edits since disable):
re-stamp `state=pending`.
- If hash diverged (user edited the disabled file): warn, leave `disabled`.
3. **Disable step** — for each file NOT in the target profile:
- Compute `bodyHash(body)` from on-disk content.
- If `state=pending` AND `bodyHash(body) === marker.hash` (no user edits):
re-stamp `state=disabled`.
- If `state=pending` AND hash diverged: treat as editing-drift — warn, leave
alone. (User has edits; SF will not silently disable them.)
- If `state=editing` or `state=completed`: warn, leave alone.
- If `--prune` AND `state=pending` AND `bodyHash(body) === marker.hash`:
delete the file. The same hash guard applies — prune never deletes
user-edited files.
4. Update `scaffold-manifest.json` `profile` field to target profile.
5. Run drift-aware sync for files in the new profile (applies `missing` and
`upgradable` items).
Migrate is idempotent: running it twice produces the same result.
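The hash-guard logic of the disable step can be sketched as a pure decision function — the `file` shape and action names here are illustrative, not the shipped code:

```js
// ADR-022 §5 step 3, reduced to a decision table.
// `file` is { state, markerHash, bodyHash, inTargetProfile }.
function disableAction(file, prune) {
  if (file.inTargetProfile) return "keep";
  if (file.state === "editing" || file.state === "completed") return "warn";
  if (file.state === "pending" && file.bodyHash === file.markerHash) {
    return prune ? "prune" : "disable"; // hash guard: only untouched files
  }
  if (file.state === "pending") return "warn"; // user edits → never silently disable
  return "keep"; // already disabled → no-op, which makes a second run idempotent
}
```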
### 6. Profile precedence
Multiple sources may specify a profile. Precedence (highest first):
1. **`PREFERENCES.md` frontmatter** — `sf_profile: infra` (Phase 5; explicit
user override, per-repo, committed to git)
2. **Manifest `profile` field** — set by migrate or first-run auto-detection
(per-repo runtime, gitignored)
3. **Auto-detection**`detectRepoProfile(basePath)` result (fallback)
This rule is defined now so Phase 5 (custom profiles) has a clear contract.
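The precedence chain collapses to nullish coalescing — each argument below is the profile name that source produced, or `null` when absent (a simplification of `resolveActiveProfileSet`, which also returns the resolved set and any warning):

```js
// ADR-022 §6 precedence: preferences beat manifest, manifest beats detection.
function resolveProfileName(preferencesProfile, manifestProfile, detectedProfile) {
  return preferencesProfile ?? manifestProfile ?? detectedProfile ?? "app";
}
```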
### 7. Custom profiles (Phase 5)
Future work. Custom profiles extend built-ins:
```yaml
# ~/.sf/profiles/monorepo.yaml
extends: library
add:
- docs/DESIGN.md
remove:
- docs/exec-plans/active/index.md
```
Resolved at profile load time. Phase 5 must document how `PREFERENCES.md`
frontmatter references a custom profile name (precedence rule §6 already
establishes where they fit).
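Resolution against the built-ins could look like this sketch — it assumes `extends` falls back to `app` when the named base is unknown, which Phase 5 would need to confirm:

```js
// Resolve an extends/add/remove custom profile against built-in Set<string>s.
function resolveCustomProfile(builtins, custom) {
  const resolved = new Set(builtins[custom.extends] ?? builtins.app);
  for (const p of custom.add ?? []) resolved.add(p);
  for (const p of custom.remove ?? []) resolved.delete(p);
  return resolved;
}
```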
## Implementation Phases
| Phase | Scope | Constraint |
|-------|-------|------------|
| **1** | `PROFILES` constant; `disabled` in `VALID_STATES`; manifest `profile` field round-trip fix | Must ship atomically with Phases 2 and 4 |
| **2** | Profile-aware `detectScaffoldDrift`; `disabled` bucket; profile-filtered `migrateLegacyScaffold`; update 3 unlisted callers; `doc-checker.js` fix | Must ship atomically with Phases 1 and 4 |
| **3** | `detectRepoProfile`; profile-filtered `ensureAgenticDocsScaffold` | Depends on Phases 1 + 2 |
| **4** | `sf scaffold migrate --profile <name> [--prune]` command | Must ship atomically with Phases 1 and 2 |
| **5** | Custom profiles (`~/.sf/profiles/*.yaml`); PREFERENCES.md frontmatter | Depends on Phase 4 |
Phases 1 + 2 + 4 must ship in the same release. The `disabled` state is not
parseable by `parseMarker` until Phase 1 lands; any code that stamps `disabled`
before Phase 1 ships causes affected files to fall into the `untracked` bucket
and be silently re-rendered on the next `ensureAgenticDocsScaffold` call.
## Consequences
### Becomes possible
- SF can be bootstrapped into any repo shape without laying down irrelevant
templates.
- Existing over-scaffolded repos can be cleaned up with a single migrate
command.
- Profile auto-detection removes the need for explicit `sf new-project --profile
infra` flags in the common case.
- `disabled` state is a clean per-file escape hatch without forking the profile.
### Becomes harder
- Profile set and `SCAFFOLD_FILES` must stay in sync; a template added to
`SCAFFOLD_FILES` needs explicit placement in every affected built-in profile.
- `detectRepoProfile` heuristics will misclassify hybrid repos (a repo with
both `package.json` and `kustomization.yaml`). Fallback is `app`; migrate
exists for correction.
### Failure modes
| Failure | Behaviour |
|---------|-----------|
| Unknown profile name in manifest | Fall back to `app`; log warning. |
| `detectRepoProfile` misclassifies repo | User runs `sf scaffold migrate --profile <correct>`. One-time correction. |
| Hash divergence on a `disabled` file during migrate | Warn; leave alone. User must manually resolve or mark `completed`. |
| Manifest `profile` field missing (pre-ADR-022 manifest) | `null` → fall back to auto-detection on next scaffold run. |
## Alternatives Considered
| Alternative | Why rejected |
|-------------|--------------|
| Per-file `applicable: false` flag in `SCAFFOLD_FILES` | Static; can't adapt to different repo shapes without forking the array. |
| Delete out-of-profile files on migrate | Destructive for `editing`/`completed` files; `disabled` with `--prune` for `pending` is safer and reversible. |
| Auto-migrate on every startup | Surprising; migrate is an explicit user action that changes profile context. Auto-detection for first scaffold only. |
| Single `minimal` profile as the new default | Breaks existing repos; `app` as default preserves current behaviour. |
## References
- ADR-021 — Versioned Documents and Upgrade Path (§1 state machine, §3 manifest,
§4 drift detection, §7 legacy migration — all extended here)
- `src/resources/extensions/sf/repo-profiler.js``detectStacks()` signal
source for `detectRepoProfile`


@ -10,11 +10,14 @@ import {
import { dirname, join } from "node:path";
import { SCAFFOLD_FILES } from "./scaffold-constants.js";
import { migrateLegacyScaffold } from "./scaffold-drift.js";
import { resolveActiveProfileSet, readPreferencesProfile, detectRepoProfile } from "./scaffold-profiles.js";
import {
bodyHash,
extractMarker,
readScaffoldManifest,
recordScaffoldApply,
stampScaffoldFile,
writeScaffoldManifest,
} from "./scaffold-versioning.js";
import { logWarning } from "./workflow-logger.js";
@ -26,6 +29,9 @@ export { SCAFFOLD_FILES };
* wrote them, so legacy-hash migration in Phase C can identify them.
*/
const NO_MARKER_PATHS = new Set([".siftignore"]);
export { detectRepoProfile };
const LEGACY_ROOT_HARNESS_PATHS = [
"harness/AGENTS.md",
"harness/specs/AGENTS.md",
@ -97,19 +103,21 @@ function removeLegacyRootHarnessScaffold(basePath) {
}
/**
* Drift-aware scaffold sync (ADR-021 Phase C).
* Drift-aware scaffold sync (ADR-021 Phase C, extended by ADR-022).
*
* Behavior:
* 1. Run legacy migration first — unmarked files whose body hash matches a
* 1. Read the active profile from the manifest. On first run, auto-detect
* the profile and write it to the manifest so future runs are stable.
* 2. Run legacy migration first — unmarked files whose body hash matches a
* known prior version in SCAFFOLD_VERSION_ARCHIVE get promoted to pending
* and stamped. Handles projects that pre-date the marker system.
* 2. For each scaffold template:
* 3. For each scaffold template IN the active profile:
* - Missing on disk → write template, stamp marker, record manifest entry.
* - Present, marker, state=pending, version drifted, hash matches stamp →
* silent re-render with current template, restamp.
* - Present, marker says editing or completed → leave alone (Phase D
* handles editing-drift via the scaffold-keeper background agent).
* - Present, marker says editing, completed, or disabled → leave alone.
* - Present without marker after migration → user-customised, leave alone.
* Files not in the active profile are skipped entirely.
*
* Silent contract: no stdout/stderr in normal paths. Only logWarning("scaffold")
* for unexpected I/O failures. Failure modes are non-fatal.
@ -117,9 +125,24 @@ function removeLegacyRootHarnessScaffold(basePath) {
export function ensureAgenticDocsScaffold(basePath) {
const sfVersion = process.env.SF_VERSION || "0.0.0";
const appliedAt = new Date().toISOString();
const manifest = readScaffoldManifest(basePath);
// PREFERENCES.md frontmatter takes highest precedence (ADR-022 §6).
// If no profile is set anywhere, auto-detect and persist to manifest.
let { profileName: activeProfile, profileSet, warning } = resolveActiveProfileSet(basePath, manifest, null);
if (warning) {
logWarning("scaffold", warning, {});
}
if (!manifest.profile && !readPreferencesProfile(basePath)) {
// First run — persist auto-detected profile to manifest.
try {
writeScaffoldManifest(basePath, { ...manifest, profile: activeProfile });
} catch (err) {
logWarning("scaffold", "failed to write profile to manifest", { error: err.message });
}
}
// Step 1: legacy migration — promote unmarked-but-recognised files.
try {
migrateLegacyScaffold(basePath);
migrateLegacyScaffold(basePath, activeProfile);
} catch (err) {
logWarning("scaffold", "legacy migration failed", {
error: err.message,
@ -128,6 +151,8 @@ export function ensureAgenticDocsScaffold(basePath) {
removeLegacyRootHarnessScaffold(basePath);
// Step 2: missing-file creation + pending-state silent upgrade.
for (const file of SCAFFOLD_FILES) {
// ADR-022: skip files that are not in the active profile.
if (!profileSet.has(file.path)) continue;
const target = join(basePath, file.path);
const skipMarker = NO_MARKER_PATHS.has(file.path);
if (!existsSync(target)) {
@ -161,7 +186,7 @@ export function ensureAgenticDocsScaffold(basePath) {
try {
const { marker, body } = extractMarker(target);
if (!marker) continue; // untracked / customised after migration — leave alone
if (marker.state !== "pending") continue; // editing or completed — Phase D territory
if (marker.state !== "pending") continue; // editing, completed, or disabled — leave alone
if (marker.version === sfVersion) continue; // already current
// Confirm on-disk hash matches the stamped hash. If diverged, the
// file was edited without removing the marker — treat as editing-drift


@ -0,0 +1,225 @@
/**
* commands-scaffold-migrate.js — `/scaffold migrate --profile <name>` (ADR-022).
*
* Purpose: transition an existing repo to a different scaffold profile, safely.
* Files leaving the active profile are stamped `state=disabled` (or pruned with
* `--prune`). Files re-entering the profile have their `state=disabled` reversed
* to `state=pending`. Files the user has edited or completed are left alone with
* a warning.
*
* Consumer: user-triggered via `/scaffold migrate --profile infra` after
* onboarding SF into a non-code-shaped repo.
*/
import { existsSync, rmSync, writeFileSync } from "node:fs";
import { join } from "node:path";
import { detectRepoProfile } from "./agentic-docs-scaffold.js";
import { projectRoot } from "./commands/context.js";
import { PROFILE_NAMES, SCAFFOLD_FILES } from "./scaffold-constants.js";
import { detectScaffoldDrift } from "./scaffold-drift.js";
import { resolveActiveProfileSet } from "./scaffold-profiles.js";
import {
bodyHash,
extractMarker,
readScaffoldManifest,
stampScaffoldFile,
writeScaffoldManifest,
} from "./scaffold-versioning.js";
/** Parse args for `/scaffold migrate`. */
export function parseScaffoldMigrateArgs(args) {
const trimmed = (args || "").trim();
const tokens = trimmed.length > 0 ? trimmed.split(/\s+/) : [];
const opts = {
profile: null,
prune: false,
dryRun: false,
};
for (const tok of tokens) {
if (tok.startsWith("--profile=")) {
opts.profile = tok.slice("--profile=".length).trim() || null;
} else if (tok === "--prune") {
opts.prune = true;
} else if (tok === "--dry-run") {
opts.dryRun = true;
}
}
return opts;
}
/**
* Core migrate algorithm (ADR-022 §5).
*
* Returns a result object describing what was changed and what needs review.
* Does not mutate the filesystem when `dryRun` is true.
*
* @param {string} basePath
* @param {string} targetProfile
* @param {{ prune?: boolean, dryRun?: boolean }} opts
*/
export function runScaffoldMigrate(basePath, targetProfile, opts = {}) {
const { prune = false, dryRun = false } = opts;
const sfVersion = process.env.SF_VERSION || "0.0.0";
const manifest = readScaffoldManifest(basePath);
const { profileSet: targetSet, warning } = resolveActiveProfileSet(basePath, manifest, targetProfile);
const result = {
reEnabled: [], // state=disabled → state=pending (re-entered profile)
disabled: [], // state=pending → state=disabled (left profile)
pruned: [], // deleted with --prune
warnings: [], // editing/completed/hash-diverged — left alone
};
if (warning) {
result.warnings.push({ path: "(profile)", reason: warning });
}
// Step 1: re-enable files that are IN the target profile but currently disabled.
for (const file of SCAFFOLD_FILES) {
if (!targetSet.has(file.path)) continue;
const target = join(basePath, file.path);
if (!existsSync(target)) continue;
try {
const { marker, body } = extractMarker(target);
if (!marker || marker.state !== "disabled") continue;
if (bodyHash(body) === marker.hash) {
// Clean hash → safe to re-enable.
result.reEnabled.push(file.path);
if (!dryRun) {
stampScaffoldFile(target, file.path, sfVersion, "pending");
}
} else {
result.warnings.push({
path: file.path,
reason: "disabled, hash diverged — left as-is (manual edit detected)",
});
}
} catch (err) {
result.warnings.push({ path: file.path, reason: `read error: ${err.message}` });
}
}
// Step 2: disable (or prune) files NOT in the target profile.
for (const file of SCAFFOLD_FILES) {
if (targetSet.has(file.path)) continue;
const target = join(basePath, file.path);
if (!existsSync(target)) continue;
try {
const { marker, body } = extractMarker(target);
if (!marker) continue; // untracked/customised — leave alone silently
if (marker.state === "disabled") continue; // already disabled
if (marker.state === "editing" || marker.state === "completed") {
result.warnings.push({
path: file.path,
reason: `state=${marker.state} — not auto-disabled (review manually)`,
});
continue;
}
// state=pending — check hash safety using extracted body (without marker).
const currentHash = bodyHash(body);
if (currentHash !== marker.hash) {
// User edited the file but marker still says pending — editing-drift.
result.warnings.push({
path: file.path,
reason: "state=pending but hash diverged — not auto-disabled (editing-drift)",
});
continue;
}
// Safe: pending + clean hash.
if (prune) {
result.pruned.push(file.path);
if (!dryRun) {
rmSync(target);
}
} else {
result.disabled.push(file.path);
if (!dryRun) {
stampScaffoldFile(target, file.path, sfVersion, "disabled");
}
}
} catch (err) {
result.warnings.push({ path: file.path, reason: `read error: ${err.message}` });
}
}
// Step 3: update manifest profile field.
if (!dryRun) {
try {
writeScaffoldManifest(basePath, { ...manifest, profile: targetProfile });
} catch (err) {
result.warnings.push({ path: "manifest", reason: `manifest write failed: ${err.message}` });
}
}
return result;
}
function formatMigrateResult(result, targetProfile, dryRun) {
const prefix = dryRun ? "[dry-run] " : "";
const lines = [`${prefix}Migrate to profile '${targetProfile}':`];
if (result.reEnabled.length > 0) {
lines.push(` Re-enabled (${result.reEnabled.length}):`);
for (const p of result.reEnabled) lines.push(` + ${p}`);
}
if (result.disabled.length > 0) {
lines.push(` Disabled (${result.disabled.length}):`);
for (const p of result.disabled) lines.push(` - ${p}`);
}
if (result.pruned.length > 0) {
lines.push(` Pruned/deleted (${result.pruned.length}):`);
for (const p of result.pruned) lines.push(` x ${p}`);
}
if (result.warnings.length > 0) {
lines.push(` Needs review (${result.warnings.length}):`);
for (const w of result.warnings) lines.push(` ! ${w.path}: ${w.reason}`);
}
if (
result.reEnabled.length === 0 &&
result.disabled.length === 0 &&
result.pruned.length === 0 &&
result.warnings.length === 0
) {
lines.push(" Nothing to do — repo is already on this profile.");
}
return lines.join("\n");
}
/**
* Top-level handler for `/scaffold migrate [args]`.
*
* Purpose: apply a profile change to an existing repo, stamping out-of-profile
* files as `disabled` and restoring in-profile files from `disabled` to
* `pending`. Safe: never touches user-edited or completed files without warning.
*
* Consumer: user, via `/scaffold migrate --profile infra` in the SF TUI.
*/
export async function handleScaffoldMigrate(args, ctx) {
const opts = parseScaffoldMigrateArgs(args);
const basePath = projectRoot();
// If no --profile, auto-detect and report what we'd do.
let targetProfile = opts.profile;
if (!targetProfile) {
targetProfile = detectRepoProfile(basePath);
ctx.ui.notify(
`No profile specified — auto-detected: '${targetProfile}'\nRe-run with --profile=${targetProfile} to apply, or --profile=<name> to override.\nBuilt-in profiles: ${PROFILE_NAMES.join(", ")}\nCustom profiles: create ~/.sf/profiles/<name>.yaml (extends + add/remove)`,
"info",
);
return;
}
const result = runScaffoldMigrate(basePath, targetProfile, {
prune: opts.prune,
dryRun: opts.dryRun,
});
ctx.ui.notify(formatMigrateResult(result, targetProfile, opts.dryRun), "info");
if (!opts.dryRun) {
// Run a drift sync to ensure in-profile files that are now pending get written.
const drift = detectScaffoldDrift(basePath, targetProfile);
const missingCount = drift.countsByBucket.missing ?? 0;
if (missingCount > 0) {
ctx.ui.notify(
`${missingCount} in-profile file(s) missing — run \`/scaffold sync\` to write them.`,
"info",
);
}
}
}
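Since `parseScaffoldMigrateArgs` is a pure tokenizer, its contract is easy to pin down in isolation. A standalone sketch, re-implementing the function above so the snippet runs on its own (the real export lives in commands-scaffold-migrate.js):

```javascript
// Standalone re-implementation of the parser above, for illustration only.
function parseScaffoldMigrateArgs(args) {
  const tokens = (args || "").trim().split(/\s+/).filter(Boolean);
  const opts = { profile: null, prune: false, dryRun: false };
  for (const tok of tokens) {
    if (tok.startsWith("--profile=")) {
      // A bare "--profile=" with no value normalises to null.
      opts.profile = tok.slice("--profile=".length).trim() || null;
    } else if (tok === "--prune") opts.prune = true;
    else if (tok === "--dry-run") opts.dryRun = true;
  }
  return opts;
}

console.log(parseScaffoldMigrateArgs("--profile=infra --dry-run"));
// { profile: 'infra', prune: false, dryRun: true }
```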


@@ -587,6 +587,22 @@ const NESTED_COMPLETIONS = {
cmd: "sync --only=",
desc: "Restrict the operation to a path glob (e.g. --only=harness/**)",
},
{
cmd: "migrate",
desc: "Auto-detect repo profile and show what would change",
},
{
cmd: "migrate --profile=",
desc: "Switch active profile (app/library/infra/docs/minimal); disables out-of-profile files",
},
{
cmd: "migrate --profile= --dry-run",
desc: "Preview what migrate would change without modifying files",
},
{
cmd: "migrate --profile= --prune",
desc: "Delete (not just disable) pending out-of-profile files with clean hash",
},
],
plan: [
{ cmd: "promote", desc: "Copy a planning artifact from ~/.sf/ into docs/" },


@@ -437,9 +437,19 @@ Examples:
);
return true;
}
if (trimmed === "scaffold migrate" || trimmed.startsWith("scaffold migrate ")) {
const { handleScaffoldMigrate } = await import(
"../../commands-scaffold-migrate.js"
);
await handleScaffoldMigrate(
trimmed.replace(/^scaffold migrate\s*/, "").trim(),
ctx,
);
return true;
}
if (trimmed === "scaffold") {
ctx.ui.notify(
"Usage: /scaffold sync [--dry-run] [--include-editing] [--only=<glob>]\n /scaffold migrate [--profile=<name>] [--dry-run] [--prune]",
"warning",
);
return true;


@@ -6,49 +6,29 @@
* beyond the template stubs. Reports findings so the agent knows what needs
* attention; never blocks, only surfaces.
*
* ADR-022: Only checks files in the active profile. Files with `state=disabled`
* are skipped; they are intentionally out of scope for this repo.
*
* Consumer: bootstrapProject (after scaffold init), milestone close workflows.
*/
import { existsSync, readFileSync, statSync } from "node:fs";
import { join } from "node:path";
import { SCAFFOLD_FILES } from "./scaffold-constants.js";
import { resolveActiveProfileSet } from "./scaffold-profiles.js";
import { extractMarker, readScaffoldManifest } from "./scaffold-versioning.js";
/** Files created by ensureAgenticDocsScaffold that should contain real content.
* Kept for compatibility; checkDocsScaffold now derives its list from the
* active profile at runtime (ADR-022). */
const STUB_ALLOWED_PATHS = new Set([
"docs/product-specs/index.md",
"docs/design-docs/index.md",
"docs/exec-plans/active/index.md",
"docs/exec-plans/completed/index.md",
"docs/records/index.md",
]);
// Minimum lines considered "real content" vs stub. Template stubs are ~3-8 lines.
const STUB_LINE_COUNT = 10;
function countContentLines(content) {
// Count non-empty, non-comment lines
return content.split("\n").filter((line) => {
@@ -96,12 +76,12 @@ function checkFile(repoRoot, relPath) {
return { file: relPath, status: "empty", lines: 0, note: "File is empty" };
}
if (contentLines < STUB_LINE_COUNT) {
const note = STUB_ALLOWED_PATHS.has(relPath)
? `Stub file (${lines} lines) — acceptable for index/placeholder`
: `Stub file (${lines} lines) — needs real content beyond template`;
return {
file: relPath,
status: STUB_ALLOWED_PATHS.has(relPath) ? "ok" : "stub",
lines,
note,
};
@@ -114,13 +94,29 @@ function checkFile(repoRoot, relPath) {
};
}
/**
* Check scaffold files in a repo. Returns a structured report.
* Never throws; all errors are caught and reported as stub/missing.
*
* ADR-022: filters to the active profile and skips `state=disabled` files.
* Files not in the active profile are not checked (they don't apply to this
* repo shape). Files with `state=disabled` are skipped as intentionally
* out-of-scope.
*/
export function checkDocsScaffold(repoRoot) {
const manifest = readScaffoldManifest(repoRoot);
const { profileSet } = resolveActiveProfileSet(repoRoot, manifest, null);
const checks = [];
for (const file of SCAFFOLD_FILES) {
// Skip files not in the active profile.
if (!profileSet.has(file.path)) continue;
// Skip files with state=disabled — intentionally out of scope.
try {
const { marker } = extractMarker(join(repoRoot, file.path));
if (marker?.state === "disabled") continue;
} catch {
// File unreadable — fall through to normal check.
}
checks.push(checkFile(repoRoot, file.path));
}
const summary = {
total: checks.length,


@@ -460,3 +460,72 @@ This is gold — most wrong agent calls come from not knowing what to avoid. Eac
`,
},
];
/**
* Built-in scaffold profiles: each profile is the set of SCAFFOLD_FILES
* paths that apply to a given repo shape (ADR-022).
*
* A file not in the active profile is treated as absent from SCAFFOLD_FILES
* for all scaffold operations: it is never created, never reported as
* `missing`, and is stamped `state=disabled` by the migrate command.
*
* Profile names are intentionally lowercase strings. The `app` profile is
* the default and equals the full SCAFFOLD_FILES list.
*
* Consumer: agentic-docs-scaffold.js, scaffold-drift.js,
* commands-scaffold-migrate.js.
*/
const APP_PROFILE_PATHS = new Set(SCAFFOLD_FILES.map((f) => f.path));
const LIBRARY_EXCLUDED = new Set([
"docs/FRONTEND.md",
"docs/PRODUCT_SENSE.md",
"docs/DESIGN.md",
]);
const INFRA_INCLUDED = new Set([
".siftignore",
"AGENTS.md",
"ARCHITECTURE.md",
"docs/AGENTS.md",
"docs/SECURITY.md",
"docs/RELIABILITY.md",
"docs/records/AGENTS.md",
"docs/records/index.md",
"docs/RECORDS_KEEPER.md",
".sf/PRINCIPLES.md",
".sf/STYLE.md",
".sf/NON-GOALS.md",
".sf/harness/AGENTS.md",
".sf/harness/specs/AGENTS.md",
".sf/harness/specs/bootstrap.md",
".sf/harness/evals/AGENTS.md",
".sf/harness/graders/AGENTS.md",
]);
const DOCS_INCLUDED = new Set([
"AGENTS.md",
".sf/STYLE.md",
]);
const MINIMAL_INCLUDED = new Set([
".siftignore",
"AGENTS.md",
]);
export const PROFILES = {
/** Full template set. Default for product and CLI repos with UI and tests. */
app: APP_PROFILE_PATHS,
/** App minus frontend/design/product-sense files. For libraries, SDKs, CLI tools. */
library: new Set([...APP_PROFILE_PATHS].filter((p) => !LIBRARY_EXCLUDED.has(p))),
/** Infrastructure/GitOps/Kubernetes repos. Ops-focused subset. */
infra: INFRA_INCLUDED,
/** Documentation-only repos. Minimal footprint. */
docs: DOCS_INCLUDED,
/** Smallest possible footprint. Useful as a starting point or for scratch repos. */
minimal: MINIMAL_INCLUDED,
};
/** Names of all built-in profiles. */
export const PROFILE_NAMES = /** @type {const} */ (Object.keys(PROFILES));
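How a consumer narrows the scaffold list to one of these profiles can be sketched in a few lines (shapes only; the real `SCAFFOLD_FILES` entries carry more fields than `path`):

```javascript
// Minimal stand-ins for the exports above — illustrative shapes only.
const SCAFFOLD_FILES = [
  { path: ".siftignore" },
  { path: "AGENTS.md" },
  { path: "docs/FRONTEND.md" },
];
const PROFILES = { minimal: new Set([".siftignore", "AGENTS.md"]) };

// A file outside the active profile is treated as absent from SCAFFOLD_FILES.
function filesForProfile(profileSet) {
  return SCAFFOLD_FILES.filter((f) => profileSet.has(f.path));
}

console.log(filesForProfile(PROFILES.minimal).map((f) => f.path));
// [ '.siftignore', 'AGENTS.md' ]
```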


@@ -1,14 +1,20 @@
/**
* Scaffold drift detection (ADR-021 Phase B, extended by ADR-022).
*
* Reads the on-disk state of every entry in `SCAFFOLD_FILES`, parses the
* first-line marker (if any), and classifies each file into one of six
* buckets. The result is structured and side-effect-free; Phase C wires
* the report into the scaffold sync pipeline; Phase B is data-plane only.
*
* ADR-022 adds profile-aware filtering: files not in the active profile are
* skipped entirely (not reported as `missing`), and files with
* `state=disabled` go to the `disabled` bucket rather than any actionable
* bucket.
*/
import { existsSync, readFileSync } from "node:fs";
import { join } from "node:path";
import { SCAFFOLD_FILES } from "./scaffold-constants.js";
import { resolveActiveProfileSet } from "./scaffold-profiles.js";
import {
bodyHash,
extractMarker,
@@ -49,9 +55,21 @@ function emptyCounts() {
untracked: 0,
current: 0,
customized: 0,
disabled: 0,
};
}
/**
* Resolve the active profile set for a project.
*
* Delegates to resolveActiveProfileSet but only needs the Set; kept as a
* local shim so migrateLegacyScaffold has a consistent call site.
*/
function resolveProfileSet(profile, manifest, basePath) {
const { profileSet } = resolveActiveProfileSet(basePath ?? null, manifest, profile);
return profileSet;
}
/**
* Classify every `SCAFFOLD_FILES` entry against its on-disk state.
*
@@ -59,14 +77,23 @@ function emptyCounts() {
* tests can rely on stable iteration. Failure modes are non-fatal: if a
* file cannot be read, it is reported as `untracked` rather than aborting
* the scan.
*
* @param {string} basePath - Absolute path to the repo root.
* @param {string|null} [profile] - Active profile name override. Falls back to
* PREFERENCES.md frontmatter, then the manifest profile field, then auto-detection. Files not in
* the active profile are skipped entirely (not reported as `missing`). Files
* with `state=disabled` go to the `disabled` bucket.
*/
export function detectScaffoldDrift(basePath, profile) {
const shipVersion = process.env.SF_VERSION || "0.0.0";
const items = [];
const counts = emptyCounts();
const manifest = readScaffoldManifest(basePath);
const manifestPresent = manifest.applied.length > 0;
const { profileSet } = resolveActiveProfileSet(basePath, manifest, profile);
for (const file of SCAFFOLD_FILES) {
// Skip files not in the active profile entirely — don't report as missing.
if (!profileSet.has(file.path)) continue;
const target = join(basePath, file.path);
// Files in SKIP_MARKER_PATHS use the manifest as their versioning
// source instead of an inline marker. For Phase B we treat them as
@@ -151,6 +178,19 @@ export function detectScaffoldDrift(basePath) {
counts.current += 1;
continue;
}
// Marker says disabled → file is out-of-scope; never touch it.
if (marker.state === "disabled") {
items.push({
path: file.path,
template: file.path,
bucket: "disabled",
currentVersion: marker.version,
shipVersion,
hashDrifted: false,
});
counts.disabled += 1;
continue;
}
const currentHash = bodyHash(body);
const hashMatches = currentHash === marker.hash;
if (!hashMatches) {
@@ -248,9 +288,9 @@ function seedArchiveWithCurrentShipVersion() {
}
}
/**
* Walk every `SCAFFOLD_FILES` entry (filtered to the active profile) and look
* for **unmarked** files whose body hash matches a known prior version recorded
* in `SCAFFOLD_VERSION_ARCHIVE`. Matching files are promoted to `pending` by
* stamping them with the matched version and recording a manifest entry.
*
* Behaviour:
@@ -259,6 +299,8 @@ function seedArchiveWithCurrentShipVersion() {
* - Files missing on disk are skipped (the missing-file flow handles those).
* - Files in `SKIP_MARKER_PATHS` (e.g. `.siftignore`) are skipped here; the
* manifest is the versioning source for those.
* - Files not in the active profile are skipped; only profile-relevant files
* are candidates for legacy promotion.
* - Files whose body hash matches an archive entry are stamped with the
* matched version, get a manifest entry recorded, and are returned in `migrated`.
* - Files with no archive match are returned in `skipped`. Treated as
@@ -267,14 +309,21 @@ function seedArchiveWithCurrentShipVersion() {
* Idempotent: a second invocation finds the markers it just wrote and
* skips them. Failure modes (read error, write error) are swallowed and
* logged via `logWarning("scaffold", ...)`.
*
* @param {string} basePath
* @param {string|null} [profile] - Active profile; falls back to manifest then `app`.
*/
export function migrateLegacyScaffold(basePath, profile) {
seedArchiveWithCurrentShipVersion();
const manifest = readScaffoldManifest(basePath);
const profileSet = resolveProfileSet(profile, manifest, basePath);
const migrated = [];
const skipped = [];
const appliedAt = new Date().toISOString();
for (const file of SCAFFOLD_FILES) {
if (SKIP_MARKER_PATHS.has(file.path)) continue;
// Only migrate files in the active profile.
if (!profileSet.has(file.path)) continue;
const target = join(basePath, file.path);
if (!existsSync(target)) continue;
let body;


@@ -0,0 +1,250 @@
/**
* scaffold-profiles.js: profile resolution for ADR-022 scaffold profiles.
*
* Purpose: single entry point for resolving the active profile Set<string> from
* all precedence sources: PREFERENCES.md frontmatter, then manifest, then auto-detection.
* Also handles custom profile YAML files under ~/.sf/profiles/.
*
* Consumer: detectScaffoldDrift, ensureAgenticDocsScaffold, doc-checker, migrate.
*/
import { existsSync, readFileSync } from "node:fs";
import { join } from "node:path";
import { PROFILE_NAMES, PROFILES, SCAFFOLD_FILES } from "./scaffold-constants.js";
import { sfHome } from "./sf-home.js";
const SCAFFOLD_FILE_PATHS = new Set(SCAFFOLD_FILES.map((f) => f.path));
/**
* Read the `sf_profile:` field from PREFERENCES.md frontmatter.
*
* Purpose: allow per-repo, git-committed profile override that takes precedence
* over the runtime manifest field. Returns null when the file is absent, has no
* frontmatter, or has no `sf_profile` key.
*
* Consumer: resolveActiveProfileSet.
*/
export function readPreferencesProfile(basePath) {
const prefsPath = join(basePath, "PREFERENCES.md");
let content;
try {
content = readFileSync(prefsPath, "utf-8");
} catch {
return null;
}
if (!content.startsWith("---")) return null;
const end = content.indexOf("\n---", 3);
if (end === -1) return null;
const block = content.slice(3, end);
for (const line of block.split("\n")) {
const colon = line.indexOf(":");
if (colon === -1) continue;
const key = line.slice(0, colon).trim();
if (key !== "sf_profile") continue;
const val = line
.slice(colon + 1)
.replace(/#.*$/, "")
.trim()
.replace(/^["']|["']$/g, "");
return val || null;
}
return null;
}
/**
* Load a custom profile YAML file from ~/.sf/profiles/<name>.yaml.
*
* Purpose: allow power-users to define repo-type profiles beyond the five
* built-ins without forking SF. Returns null when the file does not exist.
* Returns an error string (not a Set) when the file is malformed so callers
* can surface a diagnostic instead of silently falling back.
*
* Schema:
* extends: <built-in profile name> # required
* add: # optional paths relative to repo root
* - docs/DESIGN.md
* remove: # optional
* - docs/exec-plans/active/index.md
*
* Consumer: resolveActiveProfileSet.
*/
export function loadCustomProfileSet(name) {
const profilePath = join(sfHome(), "profiles", `${name}.yaml`);
if (!existsSync(profilePath)) return null;
let content;
try {
content = readFileSync(profilePath, "utf-8");
} catch (err) {
return `failed to read ${profilePath}: ${err.message}`;
}
// Minimal YAML parser: handles scalar "extends" and sequence "add"/"remove".
const fields = { extends: null, add: [], remove: [] };
let currentList = null;
for (const rawLine of content.split("\n")) {
const line = rawLine.replace(/#.*$/, ""); // strip comments
if (!line.trim()) continue;
// Sequence item
if (/^\s+-\s/.test(line)) {
const value = line.replace(/^\s+-\s*/, "").trim().replace(/^["']|["']$/g, "");
if (currentList === "add") fields.add.push(value);
else if (currentList === "remove") fields.remove.push(value);
continue;
}
// Key: value
const colon = line.indexOf(":");
if (colon <= 0) continue;
const key = line.slice(0, colon).trim();
const val = line
.slice(colon + 1)
.trim()
.replace(/^["']|["']$/g, "");
if (key === "extends") {
fields.extends = val || null;
currentList = null;
} else if (key === "add") {
currentList = "add";
} else if (key === "remove") {
currentList = "remove";
} else {
currentList = null;
}
}
if (!fields.extends) {
return `custom profile '${name}' is missing required 'extends' field`;
}
if (!PROFILE_NAMES.includes(fields.extends)) {
return `custom profile '${name}': extends '${fields.extends}' is not a built-in profile (${PROFILE_NAMES.join(", ")})`;
}
const base = new Set(PROFILES[fields.extends]);
// Validate add paths.
for (const p of fields.add) {
if (!SCAFFOLD_FILE_PATHS.has(p)) {
return `custom profile '${name}': add path '${p}' is not a known scaffold file`;
}
base.add(p);
}
// Validate remove paths.
for (const p of fields.remove) {
if (!SCAFFOLD_FILE_PATHS.has(p)) {
return `custom profile '${name}': remove path '${p}' is not a known scaffold file`;
}
base.delete(p);
}
return base;
}
/**
* Detect the most appropriate built-in profile for a repo.
*
* Purpose: prevent SF from laying down ~60-70% irrelevant scaffold files when
* bootstrapping into infra/ops/docs repos that don't have a code-app shape.
* Returns the profile name to store in the manifest and filter scaffold files.
*
* Heuristic (ADR-022 §4): checks for language/infra marker files in order of
* specificity. Falls back to "app" to preserve existing behaviour for repos
* that look like applications.
*
* Consumer: resolveActiveProfileSet (first-run path), ensureAgenticDocsScaffold.
*/
export function detectRepoProfile(basePath) {
// Nix repos are almost always infrastructure.
if (existsSync(join(basePath, "flake.nix")) || existsSync(join(basePath, "shell.nix"))) {
return "infra";
}
// Repos with a Terraform or Pulumi entry point.
if (
existsSync(join(basePath, "main.tf")) ||
existsSync(join(basePath, "Pulumi.yaml")) ||
existsSync(join(basePath, "Pulumi.yml"))
) {
return "infra";
}
// Pure docs repo: no source manifests, only markdown.
const hasSource =
existsSync(join(basePath, "package.json")) ||
existsSync(join(basePath, "go.mod")) ||
existsSync(join(basePath, "Cargo.toml")) ||
existsSync(join(basePath, "pyproject.toml")) ||
existsSync(join(basePath, "setup.py")) ||
existsSync(join(basePath, "CMakeLists.txt"));
if (!hasSource && existsSync(join(basePath, "docs"))) {
return "docs";
}
// Node repos: check for UI framework dependency (→ app) vs library.
if (existsSync(join(basePath, "package.json"))) {
try {
const pkg = JSON.parse(readFileSync(join(basePath, "package.json"), "utf-8"));
const allDeps = {
...(pkg.dependencies ?? {}),
...(pkg.devDependencies ?? {}),
};
const uiFrameworks = ["react", "vue", "svelte", "next", "@angular/core"];
if (uiFrameworks.some((f) => allDeps[f])) return "app";
return "library";
} catch {
return "app";
}
}
// Go, Rust, Python → library by default (no frontend implied).
if (
existsSync(join(basePath, "go.mod")) ||
existsSync(join(basePath, "Cargo.toml")) ||
existsSync(join(basePath, "pyproject.toml"))
) {
return "library";
}
// Fallback: preserve existing behaviour.
return "app";
}
/**
* Resolve the active profile Set<string> for a given repo path.
*
* Purpose: single authoritative implementation of ADR-022 §6 precedence rules
* so all callers (drift, scaffold, migrate, doctor) agree on the active profile.
*
* Precedence (highest first):
* 1. PREFERENCES.md frontmatter `sf_profile:` (committed, explicit)
* 2. `explicitProfile` argument (caller-supplied override, e.g. migrate --profile=)
* 3. Manifest `profile` field (runtime, gitignored)
* 4. Auto-detection via detectRepoProfile (first-run fallback)
*
* Returns `{ profileName, profileSet, warning? }`. `warning` is set when a
* custom profile file is malformed or unknown; callers surface it as a notice,
* never as a hard error.
*
* Consumer: detectScaffoldDrift, ensureAgenticDocsScaffold, doc-checker,
* runScaffoldMigrate.
*/
export function resolveActiveProfileSet(basePath, manifest, explicitProfile) {
// 1. PREFERENCES.md frontmatter — highest precedence.
const prefsProfile = readPreferencesProfile(basePath);
const resolved = prefsProfile ?? explicitProfile ?? manifest?.profile ?? null;
if (!resolved) {
// Auto-detect from filesystem heuristics (first-run fallback).
const detected = basePath ? detectRepoProfile(basePath) : "app";
return { profileName: detected, profileSet: PROFILES[detected] };
}
// Built-in?
if (PROFILE_NAMES.includes(resolved)) {
return { profileName: resolved, profileSet: PROFILES[resolved] };
}
// Custom profile?
const custom = loadCustomProfileSet(resolved);
if (custom === null) {
// File doesn't exist — warn and fall back.
return {
profileName: "app",
profileSet: PROFILES.app,
warning: `custom profile '${resolved}' not found at ~/.sf/profiles/${resolved}.yaml — using 'app'`,
};
}
if (typeof custom === "string") {
// Malformed — warn and fall back.
return {
profileName: "app",
profileSet: PROFILES.app,
warning: `${custom} — using 'app'`,
};
}
return { profileName: resolved, profileSet: custom };
}
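Under the schema documented on `loadCustomProfileSet`, a custom profile file might look like this (the file name and chosen paths are illustrative; `add`/`remove` entries must be known scaffold paths, and `extends` must name a built-in profile):

```yaml
# Hypothetical custom profile at ~/.sf/profiles/ops.yaml
extends: infra
add:
  - docs/QUALITY_SCORE.md
remove:
  - .sf/harness/specs/bootstrap.md
```

Malformed files (missing `extends`, unknown paths) produce a warning string rather than a Set, so callers fall back to `app` with a diagnostic instead of failing hard.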


@@ -21,7 +21,7 @@ export const SF_DOC_MARKER_SUFFIX = "-->";
/** Manifest path under `.sf/`. Single source of truth for filename/location. */
export const SCAFFOLD_MANIFEST_RELPATH = ".sf/scaffold-manifest.json";
// ─── Marker parsing & formatting ─────────────────────────────────────────
const VALID_STATES = ["pending", "editing", "completed", "disabled"];
function isScaffoldDocState(s) {
return VALID_STATES.includes(s);
}
@@ -170,19 +170,19 @@ function isScaffoldManifestEntry(e) {
export function readScaffoldManifest(basePath) {
const fp = manifestPath(basePath);
if (!existsSync(fp)) {
return { schemaVersion: 1, profile: null, applied: [] };
}
let content;
try {
content = readFileSync(fp, "utf-8");
} catch {
return { schemaVersion: 1, profile: null, applied: [] };
}
let parsed;
try {
parsed = JSON.parse(content);
} catch {
return { schemaVersion: 1, profile: null, applied: [] };
}
if (
!parsed ||
@@ -190,10 +190,12 @@ export function readScaffoldManifest(basePath) {
parsed.schemaVersion !== 1 ||
!Array.isArray(parsed.applied)
) {
return { schemaVersion: 1, profile: null, applied: [] };
}
const applied = parsed.applied.filter(isScaffoldManifestEntry);
const profile =
typeof parsed.profile === "string" ? parsed.profile : null;
return { schemaVersion: 1, profile, applied };
}
/**
* Write the manifest. Never throws; write failures are logged via
@@ -213,6 +215,7 @@ export function writeScaffoldManifest(basePath, manifest) {
/**
* Record a scaffold application. Reads the manifest, removes any prior entry
* with the same `path`, appends the new one, writes back. Idempotent.
* Preserves the `profile` field from the existing manifest.
*/
export function recordScaffoldApply(basePath, entry) {
const manifest = readScaffoldManifest(basePath);
@@ -220,6 +223,7 @@ export function recordScaffoldApply(basePath, entry) {
filtered.push(entry);
writeScaffoldManifest(basePath, {
schemaVersion: 1,
profile: manifest.profile,
applied: filtered,
});
}
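With `recordScaffoldApply` now preserving the `profile` field on every write, a manifest on disk might look like this (a sketch; the fields of `applied` entries beyond `path` are illustrative, not the exact schema):

```json
{
  "schemaVersion": 1,
  "profile": "infra",
  "applied": [
    { "path": "AGENTS.md", "version": "1.0.0", "appliedAt": "2026-01-01T00:00:00.000Z" }
  ]
}
```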


@@ -0,0 +1,377 @@
/**
* adr-022-scaffold-profiles.test.mjs: ADR-022 scaffold profiles test suite.
*
* Purpose: verify behaviour contracts for the profile system: PROFILES shape,
* disabled state round-trip, profile detection heuristics, drift skipping for
* disabled/out-of-profile files, and the migrate algorithm.
*
* Consumer: CI (npm run test:unit).
*/
import assert from "node:assert/strict";
import {
existsSync,
mkdirSync,
mkdtempSync,
readFileSync,
rmSync,
writeFileSync,
} from "node:fs";
import { tmpdir } from "node:os";
import { dirname, join } from "node:path";
import { afterEach, describe, test } from "vitest";
import {
detectRepoProfile,
ensureAgenticDocsScaffold,
} from "../agentic-docs-scaffold.js";
import { PROFILE_NAMES, PROFILES, SCAFFOLD_FILES } from "../scaffold-constants.js";
import { detectScaffoldDrift } from "../scaffold-drift.js";
import {
parseScaffoldMigrateArgs,
runScaffoldMigrate,
} from "../commands-scaffold-migrate.js";
import {
bodyHash,
extractMarker,
parseMarker,
readScaffoldManifest,
stampScaffoldFile,
} from "../scaffold-versioning.js";
const tmpRoots = [];
afterEach(() => {
for (const dir of tmpRoots.splice(0)) {
rmSync(dir, { recursive: true, force: true });
}
});
function makeProject() {
const root = mkdtempSync(join(tmpdir(), "sf-adr022-"));
tmpRoots.push(root);
return root;
}
function writePendingFile(root, relPath, content = `# ${relPath}\n`) {
const target = join(root, relPath);
mkdirSync(dirname(target), { recursive: true });
writeFileSync(target, content, "utf-8");
stampScaffoldFile(target, relPath, "1.0.0", "pending");
return target;
}
// ─── PROFILES shape ──────────────────────────────────────────────────────────
describe("PROFILES shape", () => {
test("PROFILE_NAMES_contains_five_built_in_profiles", () => {
assert.deepEqual([...PROFILE_NAMES].sort(), ["app", "docs", "infra", "library", "minimal"]);
});
test("app_profile_contains_all_scaffold_files", () => {
const allPaths = new Set(SCAFFOLD_FILES.map((f) => f.path));
for (const p of PROFILES.app) {
assert.ok(allPaths.has(p), `app profile includes unknown path: ${p}`);
}
// app must be the superset — every file belongs to at least app.
for (const path of allPaths) {
assert.ok(PROFILES.app.has(path), `SCAFFOLD_FILES path not in app profile: ${path}`);
}
});
test("minimal_is_strict_subset_of_app", () => {
for (const p of PROFILES.minimal) {
assert.ok(PROFILES.app.has(p), `minimal has path not in app: ${p}`);
}
// minimal must be strictly smaller.
assert.ok(PROFILES.minimal.size < PROFILES.app.size);
});
test("infra_is_strict_subset_of_app", () => {
for (const p of PROFILES.infra) {
assert.ok(PROFILES.app.has(p), `infra has path not in app: ${p}`);
}
// "Strict" also requires infra to omit at least one app file.
assert.ok(PROFILES.infra.size < PROFILES.app.size);
});
test("docs_is_strict_subset_of_app", () => {
for (const p of PROFILES.docs) {
assert.ok(PROFILES.app.has(p), `docs has path not in app: ${p}`);
}
// "Strict" also requires docs to omit at least one app file.
assert.ok(PROFILES.docs.size < PROFILES.app.size);
});
});
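The per-element loops in the tests above all check one relation: strict subset over `Set`s. A minimal standalone sketch of that relation (`isSubset`/`isStrictSubset` are hypothetical helpers, not part of the codebase):

```javascript
// Hypothetical helpers, for illustration only.
function isSubset(a, b) {
  for (const x of a) {
    if (!b.has(x)) return false; // an element of `a` missing from `b` breaks the relation
  }
  return true;
}

function isStrictSubset(a, b) {
  // Strictness additionally requires `a` to omit at least one element of `b`.
  return isSubset(a, b) && a.size < b.size;
}
```

Under this definition, `minimal`, `infra`, and `docs` would each satisfy `isStrictSubset(profile, PROFILES.app)`, which is exactly what the assertions above spell out element by element.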
// ─── disabled state round-trip ───────────────────────────────────────────────
describe("disabled state", () => {
test("parseMarker_accepts_disabled_state", () => {
// The disabled state must be parseable — otherwise files stamped disabled
// before Phase 1 ships would be read as null → untracked → re-rendered.
const line = "<!-- sf-doc: version=1.0.0 template=AGENTS.md state=disabled hash=abc123 -->";
const marker = parseMarker(line);
assert.ok(marker, "parseMarker should not return null for state=disabled");
assert.equal(marker.state, "disabled");
});
test("stampScaffoldFile_roundtrip_preserves_disabled_state", () => {
const root = makeProject();
const target = join(root, "AGENTS.md");
writeFileSync(target, "# AGENTS\n", "utf-8");
stampScaffoldFile(target, "AGENTS.md", "1.0.0", "disabled");
const { marker } = extractMarker(target);
assert.ok(marker, "marker should be present after stamping");
assert.equal(marker.state, "disabled");
});
test("detectScaffoldDrift_puts_disabled_files_in_disabled_bucket", () => {
const root = makeProject();
const target = join(root, "AGENTS.md");
writeFileSync(target, "# AGENTS\n", "utf-8");
stampScaffoldFile(target, "AGENTS.md", "1.0.0", "disabled");
const report = detectScaffoldDrift(root);
const item = report.items.find((i) => i.path === "AGENTS.md");
assert.ok(item, "AGENTS.md should appear in drift report");
assert.equal(item.bucket, "disabled");
assert.equal(report.countsByBucket.disabled, 1);
});
test("detectScaffoldDrift_disabled_bucket_is_not_actionable", () => {
const root = makeProject();
const target = join(root, "AGENTS.md");
writeFileSync(target, "# AGENTS\n", "utf-8");
stampScaffoldFile(target, "AGENTS.md", "1.0.0", "disabled");
const report = detectScaffoldDrift(root);
// Disabled files should not appear in missing/upgradable/editing-drift.
const actionable = report.items.filter(
(i) => i.path === "AGENTS.md" && i.bucket !== "disabled",
);
assert.equal(actionable.length, 0);
});
});
// ─── profile-aware drift detection ───────────────────────────────────────────
describe("profile-aware drift", () => {
test("detectScaffoldDrift_skips_missing_files_not_in_profile", () => {
const root = makeProject();
// Use "minimal" profile — only .siftignore and AGENTS.md.
// A missing file from "app" profile (e.g. ARCHITECTURE.md) should NOT appear as missing.
const archFile = "ARCHITECTURE.md";
assert.ok(!PROFILES.minimal.has(archFile), "ARCHITECTURE.md should not be in minimal");
const report = detectScaffoldDrift(root, "minimal");
const missing = report.items.find((i) => i.path === archFile && i.bucket === "missing");
assert.equal(missing, undefined, "out-of-profile file must not be reported as missing");
});
test("detectScaffoldDrift_reports_missing_files_in_profile", () => {
const root = makeProject();
// AGENTS.md is in every profile — should appear as missing.
const report = detectScaffoldDrift(root, "minimal");
const item = report.items.find((i) => i.path === "AGENTS.md");
assert.ok(item, "AGENTS.md should be in drift report for minimal profile");
assert.equal(item.bucket, "missing");
});
});
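The filtering rule these two tests pin down can be sketched as a pure function (hypothetical names; the real logic lives inside `detectScaffoldDrift`): a path lands in the `missing` bucket only when it is both absent on disk and a member of the active profile set.

```javascript
// Hypothetical sketch, not the real drift implementation.
function missingBucketPaths(templatePaths, presentPaths, activeProfile) {
  return templatePaths.filter(
    (p) => activeProfile.has(p) && !presentPaths.has(p),
  );
}
```

So with `minimal` active, an absent ARCHITECTURE.md is silently skipped, while an absent AGENTS.md is still reported.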
// ─── profile auto-detection ───────────────────────────────────────────────────
describe("detectRepoProfile", () => {
test("detects_nix_repo_as_infra", () => {
const root = makeProject();
writeFileSync(join(root, "flake.nix"), "# nix\n", "utf-8");
assert.equal(detectRepoProfile(root), "infra");
});
test("detects_terraform_repo_as_infra", () => {
const root = makeProject();
writeFileSync(join(root, "main.tf"), "# terraform\n", "utf-8");
assert.equal(detectRepoProfile(root), "infra");
});
test("detects_node_react_repo_as_app", () => {
const root = makeProject();
writeFileSync(
join(root, "package.json"),
JSON.stringify({ dependencies: { react: "^18.0.0" } }),
"utf-8",
);
assert.equal(detectRepoProfile(root), "app");
});
test("detects_node_non_ui_repo_as_library", () => {
const root = makeProject();
writeFileSync(
join(root, "package.json"),
JSON.stringify({ dependencies: { express: "^4.0.0" } }),
"utf-8",
);
assert.equal(detectRepoProfile(root), "library");
});
test("detects_go_repo_as_library", () => {
const root = makeProject();
writeFileSync(join(root, "go.mod"), "module example.com\n", "utf-8");
assert.equal(detectRepoProfile(root), "library");
});
test("detects_no_source_with_docs_dir_as_docs", () => {
const root = makeProject();
mkdirSync(join(root, "docs"));
assert.equal(detectRepoProfile(root), "docs");
});
test("falls_back_to_app_for_unknown_shape", () => {
const root = makeProject();
assert.equal(detectRepoProfile(root), "app");
});
});
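The detection order these seven tests exercise can be summarized as a pure function over the repo root's top-level entries plus a parsed package.json (or null). This is a rough sketch under that assumption; `detectProfileFromEntries` is a hypothetical name, and the real `detectRepoProfile` inspects the filesystem directly and may weigh more signals.

```javascript
// Hypothetical sketch of the heuristic order: nix → infra, terraform → infra,
// react → app, node without UI deps → library, go → library, docs-only → docs,
// anything else → app.
function detectProfileFromEntries(entries, pkg) {
  if (entries.some((f) => f.endsWith(".nix"))) return "infra";
  if (entries.some((f) => f.endsWith(".tf"))) return "infra";
  if (pkg) {
    const deps = { ...pkg.dependencies, ...pkg.devDependencies };
    return "react" in deps ? "app" : "library";
  }
  if (entries.includes("go.mod")) return "library";
  if (entries.includes("docs")) return "docs";
  return "app"; // unknown shape falls back to app
}
```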
// ─── ensureAgenticDocsScaffold — profile-aware ────────────────────────────────
describe("ensureAgenticDocsScaffold profile-aware", () => {
test("writes_profile_to_manifest_on_first_run_when_nix_detected", () => {
const root = makeProject();
writeFileSync(join(root, "flake.nix"), "# nix\n", "utf-8");
ensureAgenticDocsScaffold(root);
const manifest = readScaffoldManifest(root);
assert.equal(manifest.profile, "infra");
});
test("does_not_write_out_of_profile_files", () => {
const root = makeProject();
writeFileSync(join(root, "flake.nix"), "# nix\n", "utf-8");
ensureAgenticDocsScaffold(root);
// FRONTEND.md is in "app" but not in "infra" — must not be written.
assert.ok(!PROFILES.infra.has("docs/FRONTEND.md"), "precondition: FRONTEND.md not in infra");
assert.equal(existsSync(join(root, "docs/FRONTEND.md")), false);
});
});
// ─── migrate algorithm ───────────────────────────────────────────────────────
describe("scaffold migrate", () => {
test("parseScaffoldMigrateArgs_parses_profile_prune_dry_run", () => {
const opts = parseScaffoldMigrateArgs("--profile=infra --prune --dry-run");
assert.equal(opts.profile, "infra");
assert.equal(opts.prune, true);
assert.equal(opts.dryRun, true);
});
test("runScaffoldMigrate_disables_pending_clean_hash_files_outside_target_profile", () => {
const root = makeProject();
// Find a file that is in app but not in infra.
const appOnly = SCAFFOLD_FILES.find(
(f) => PROFILES.app.has(f.path) && !PROFILES.infra.has(f.path),
);
assert.ok(appOnly, "precondition: at least one file in app not in infra");
writePendingFile(root, appOnly.path);
const result = runScaffoldMigrate(root, "infra");
assert.ok(
result.disabled.includes(appOnly.path),
`expected ${appOnly.path} to be disabled`,
);
});
test("runScaffoldMigrate_does_not_disable_editing_or_completed_files", () => {
const root = makeProject();
const appOnly = SCAFFOLD_FILES.find(
(f) => PROFILES.app.has(f.path) && !PROFILES.infra.has(f.path),
);
assert.ok(appOnly, "precondition: at least one file in app not in infra");
const target = join(root, appOnly.path);
mkdirSync(dirname(target), { recursive: true });
writeFileSync(target, `# ${appOnly.path}\n`, "utf-8");
stampScaffoldFile(target, appOnly.path, "1.0.0", "editing");
const result = runScaffoldMigrate(root, "infra");
assert.ok(!result.disabled.includes(appOnly.path));
assert.ok(result.warnings.some((w) => w.path === appOnly.path));
});
test("runScaffoldMigrate_does_not_disable_hash_diverged_pending_files", () => {
const root = makeProject();
const appOnly = SCAFFOLD_FILES.find(
(f) => PROFILES.app.has(f.path) && !PROFILES.infra.has(f.path),
);
assert.ok(appOnly, "precondition");
// Write file, stamp pending, then modify content so hash diverges.
const target = join(root, appOnly.path);
mkdirSync(dirname(target), { recursive: true });
writeFileSync(target, "original content\n", "utf-8");
stampScaffoldFile(target, appOnly.path, "1.0.0", "pending");
// Now modify content (simulating user edit without changing the marker).
const line1 = readFileSync(target, "utf-8").split("\n")[0];
assert.ok(parseMarker(line1), "precondition: first line is the stamped marker");
writeFileSync(target, `${line1}\nuser edited content\n`, "utf-8");
const result = runScaffoldMigrate(root, "infra");
assert.ok(!result.disabled.includes(appOnly.path));
assert.ok(result.warnings.some((w) => w.path === appOnly.path));
});
test("runScaffoldMigrate_reenables_disabled_file_in_target_profile_with_clean_hash", () => {
const root = makeProject();
// Find a file in infra profile.
const infraPath = [...PROFILES.infra][0];
assert.ok(infraPath, "precondition: infra has at least one file");
// Stamp it as disabled.
const target = join(root, infraPath);
mkdirSync(dirname(target), { recursive: true });
writeFileSync(target, `# ${infraPath}\n`, "utf-8");
stampScaffoldFile(target, infraPath, "1.0.0", "disabled");
const result = runScaffoldMigrate(root, "infra");
assert.ok(
result.reEnabled.includes(infraPath),
`expected ${infraPath} to be re-enabled`,
);
});
test("runScaffoldMigrate_dry_run_does_not_modify_files", () => {
const root = makeProject();
const appOnly = SCAFFOLD_FILES.find(
(f) => PROFILES.app.has(f.path) && !PROFILES.infra.has(f.path),
);
assert.ok(appOnly, "precondition");
writePendingFile(root, appOnly.path);
const result = runScaffoldMigrate(root, "infra", { dryRun: true });
assert.ok(result.disabled.includes(appOnly.path));
// File should still be pending on disk.
const { marker } = extractMarker(join(root, appOnly.path));
assert.equal(marker?.state, "pending");
});
test("runScaffoldMigrate_prune_deletes_pending_clean_hash_files", () => {
const root = makeProject();
const appOnly = SCAFFOLD_FILES.find(
(f) => PROFILES.app.has(f.path) && !PROFILES.infra.has(f.path),
);
assert.ok(appOnly, "precondition");
writePendingFile(root, appOnly.path);
const result = runScaffoldMigrate(root, "infra", { prune: true });
assert.ok(result.pruned.includes(appOnly.path));
assert.equal(existsSync(join(root, appOnly.path)), false);
});
test("runScaffoldMigrate_updates_manifest_profile", () => {
const root = makeProject();
runScaffoldMigrate(root, "infra");
const manifest = readScaffoldManifest(root);
assert.equal(manifest.profile, "infra");
});
});
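Taken together, the migrate tests above describe a small decision table. A hedged sketch of it as a pure function (`migrateAction` is hypothetical; the real logic lives in commands-scaffold-migrate.js and also performs the stamping, pruning, and manifest update):

```javascript
// Hypothetical decision helper summarizing the rules the tests pin down:
//   entering the profile + disabled + clean hash        → re-enable
//   leaving the profile + pending + clean hash          → disable (or prune)
//   leaving the profile + editing/completed or diverged → warn only
function migrateAction({ inTargetProfile, state, hashClean }, { prune = false } = {}) {
  if (inTargetProfile) {
    return state === "disabled" && hashClean ? "re-enable" : "keep";
  }
  if (state === "pending" && hashClean) {
    return prune ? "prune" : "disable";
  }
  return "warn";
}
```

The key invariant is that only clean, never-touched files are moved between states; anything a user has edited is left alone and surfaced as a warning.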