perf(test): compile unit tests with esbuild, reclassify integration tests, fix node_modules symlink (#2809)
* fix(test): wire src/resources/extensions/shared/tests/ into test:unit runner

  The test:unit glob excluded src/resources/extensions/shared/tests/ entirely, leaving format-utils.test.ts (and any future tests there) silently unfired.

  - Add shared/tests/*.test.ts to the test:unit glob in package.json
  - Export newestSrcMtime from ensure-workspace-builds.cjs (a require.main guard prevents side effects on require) so the staleness logic can be tested
  - Add src/tests/ensure-workspace-builds.test.ts covering newestSrcMtime: non-existent dir, no .ts files, single file, max of multiple, recursion, node_modules skip

  Closes #2808

* perf(test): compile unit tests with esbuild and fix dist-test/node_modules

  Replace per-file --experimental-strip-types with a single esbuild compilation step (scripts/compile-tests.mjs) that compiles all src/ TypeScript to dist-test/ in ~3 s, then runs the pre-compiled JS. This eliminates ~1.7 s of Node startup overhead per test file.

  - scripts/compile-tests.mjs: esbuild compilation, asset copy, .ts → .js import rewrite, stale file cleanup; creates a dist-test/node_modules symlink so resource-loader.ts resolves gsdNodeModules to a real path (fixes the node-modules-symlink test failure)
  - scripts/dist-test-resolve.mjs: ESM loader hook for @gsd/* bare specifiers and .ts → .js fallback rewriting at runtime
  - .gitignore: exclude dist-test/ from version control
  - package.json: add a test:compile script; update test:unit to compile-then-run; update test:integration globs to cover the new integration/ subdirectories
  - worker-registry.ts: unref() the cleanup timer so it does not keep the Node process alive after tests complete

  Closes #2858

* fix(test): update relative imports in tests/integration/ after directory move

  When tests were moved from tests/ to tests/integration/ in the previous commit, relative imports weren't updated, so ../foo now resolves one level too shallow.

  Fix all 117 import paths across 43 test files:
  - ../foo → ../../foo (source files at gsd/ level)
  - ../../get-secrets-from-user.ts → ../../../ (at extensions/ level)
  - ../../subagent/worker-registry.ts → ../../../ (at extensions/ level)
  - ./marketplace-test-fixtures.js → ../marketplace-test-fixtures.ts
  - ./test-helpers.ts → ../test-helpers.ts

  typecheck:extensions now passes with zero errors.

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* test(integration): set 10-minute timeout for integration test runner

  The build job takes ~7 min on main. Without a global timeout, hanging tests block the suite indefinitely. --test-timeout=600000 caps each test at 10 min.

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* Revert "test(integration): set 10-minute timeout for integration test runner"

  This reverts commit be77ead77d369ad8569292ae6b69ba56435f5433.

* fix(test): correct formatDuration(0) edge case and docker test root path

  - formatDuration(0) now returns '0s' instead of '0ms' by guarding the sub-second branch with ms > 0
  - docker-template.test.ts root path now goes ../../.. from dist-test/src/tests/ to reach the project root instead of landing in dist-test/
  - Replace require() calls in skill-health.ts and visualizer-overlay.ts with proper ES module imports

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix(test): correct relative import paths in integration tests

  All affected tests were one directory level off, importing from ../web/ and ../resources/ when the correct paths are ../../web/ and ../../resources/. The tests live at src/tests/integration/, not src/tests/.

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix(test): add esbuild to root devDeps and wire dist-test-resolve hook

  P1: esbuild was only in web/package.json; compile-tests.mjs requires it from the root node_modules path, so CI failed on clean installs.

  P2: dist-test-resolve.mjs existed but was never loaded; @gsd/* imports in compiled tests resolved to the installed workspace packages instead of the freshly compiled dist-test output. Add --import to test:unit.

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix(deps): align esbuild version with lock file (0.25.12)

  ^0.27.4 didn't satisfy the existing lock file entry. Use the version already present so npm ci passes without regenerating the lock file.

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix(test): correct all relative import depths in src/tests/integration/

  Tests in src/tests/integration/ need three levels up (../../..) to reach project-root dirs (web/, packages/) and two levels up (../..) to reach src-level dirs (src/web/, src/cli-web-branch.ts). Fixes:
  - ../../web/lib/ → ../../../web/lib/ (the Next.js app, not src/web/)
  - ../../web/app/ → ../../../web/app/
  - ../../packages/ → ../../../packages/
  - ../cli-web-branch.ts → ../../cli-web-branch.ts
  - ../web-mode.ts → ../../web-mode.ts
  - ../resources/extensions/ → ../../resources/extensions/
  - ci_monitor ROOT path: two levels up → three levels up
  - web-responsive WEB_ROOT: two levels up → three levels up

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* chore(test): use dot reporter for test:unit to reduce noise
* chore(test): switch test:unit reporter to tap
* chore(test): compact test reporter — silent on pass, failures + summary only
* chore(test): include shared/tests in test:coverage

* fix(test): correct path depths in tests moved to integration/

  Tests moved from tests/ to tests/integration/ need one extra ../ to reach the same source files. Also fix web component paths — those files live at web/, not src/web/.

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix(test): fix web component paths in web-session-parity-contract

* fix(test): use process.cwd() for project root in docker-template test

  Resolving relative to __dirname breaks under test:coverage, which runs source files directly from src/tests/ and needs ../.. rather than ../../.. (the extra level only exists in the compiled dist-test/ output).

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* ci: retrigger CI

---------

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
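The formatDuration(0) guard described above can be sketched as follows. This is a hypothetical reduced version, not the repo's actual helper (which presumably also formats minutes and hours); only the sub-second guard is taken from the commit message.

```javascript
// Hypothetical sketch of the guarded sub-second branch. The real helper
// lives in the repo's shared format utilities; names here are illustrative.
function formatDuration(ms) {
  if (ms > 0 && ms < 1000) return `${ms}ms`; // the new ms > 0 guard
  return `${Math.floor(ms / 1000)}s`;
}

console.log(formatDuration(0));    // "0s"  (previously "0ms")
console.log(formatDuration(250));  // "250ms"
console.log(formatDuration(3000)); // "3s"
```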
This commit is contained in: parent 48279ae5a4 · commit b6e105b058
84 changed files with 657 additions and 280 deletions
.gitignore (vendored, 3 lines changed)

@@ -1,4 +1,7 @@
+# ── Compiled test output ──
+dist-test/
+
 # ── GSD project state (development-only, lives in worktree branches) ──
 package-lock.json
 .claude/
package.json

@@ -53,11 +53,12 @@
     "copy-resources": "node scripts/copy-resources.cjs",
     "copy-themes": "node scripts/copy-themes.cjs",
     "copy-export-html": "node scripts/copy-export-html.cjs",
-    "test:unit": "node --import ./src/resources/extensions/gsd/tests/resolve-ts.mjs --experimental-strip-types --experimental-test-isolation=process --test src/resources/extensions/gsd/tests/*.test.ts src/resources/extensions/gsd/tests/*.test.mjs src/tests/*.test.ts",
+    "test:compile": "node scripts/compile-tests.mjs",
+    "test:unit": "npm run test:compile && node --import ./scripts/dist-test-resolve.mjs --experimental-test-isolation=process --test-reporter=./scripts/test-reporter-compact.mjs --test 'dist-test/src/tests/*.test.js' 'dist-test/src/resources/extensions/gsd/tests/*.test.js' 'dist-test/src/resources/extensions/gsd/tests/*.test.mjs' 'dist-test/src/resources/extensions/shared/tests/*.test.js' 'dist-test/src/resources/extensions/claude-code-cli/tests/*.test.js' 'dist-test/src/resources/extensions/github-sync/tests/*.test.js' 'dist-test/src/resources/extensions/universal-config/tests/*.test.js' 'dist-test/src/resources/extensions/voice/tests/*.test.js'",
     "test:packages": "node --test packages/pi-coding-agent/dist/core/*.test.js",
     "test:marketplace": "GSD_TEST_CLONE_MARKETPLACES=1 node --import ./src/resources/extensions/gsd/tests/resolve-ts.mjs --experimental-strip-types --test src/resources/extensions/gsd/tests/claude-import-tui.test.ts src/resources/extensions/gsd/tests/plugin-importer-live.test.ts src/tests/marketplace-discovery.test.ts",
-    "test:coverage": "c8 --reporter=text --reporter=lcov --exclude='src/resources/extensions/gsd/tests/**' --exclude='src/tests/**' --exclude='scripts/**' --exclude='native/**' --exclude='node_modules/**' --check-coverage --statements=40 --lines=40 --branches=20 --functions=20 node --import ./src/resources/extensions/gsd/tests/resolve-ts.mjs --experimental-strip-types --experimental-test-isolation=process --test src/resources/extensions/gsd/tests/*.test.ts src/resources/extensions/gsd/tests/*.test.mjs src/tests/*.test.ts",
-    "test:integration": "node --import ./src/resources/extensions/gsd/tests/resolve-ts.mjs --experimental-strip-types --experimental-test-isolation=process --test src/resources/extensions/gsd/tests/*integration*.test.ts src/tests/integration/*.test.ts",
+    "test:coverage": "c8 --reporter=text --reporter=lcov --exclude='src/resources/extensions/gsd/tests/**' --exclude='src/tests/**' --exclude='scripts/**' --exclude='native/**' --exclude='node_modules/**' --check-coverage --statements=40 --lines=40 --branches=20 --functions=20 node --import ./src/resources/extensions/gsd/tests/resolve-ts.mjs --experimental-strip-types --experimental-test-isolation=process --test src/resources/extensions/gsd/tests/*.test.ts src/resources/extensions/gsd/tests/*.test.mjs src/tests/*.test.ts src/resources/extensions/shared/tests/*.test.ts",
+    "test:integration": "node --import ./src/resources/extensions/gsd/tests/resolve-ts.mjs --experimental-strip-types --test 'src/tests/integration/*.test.ts' 'src/resources/extensions/gsd/tests/integration/*.test.ts' 'src/resources/extensions/async-jobs/*.test.ts' 'src/resources/extensions/browser-tools/tests/*.test.mjs'",
     "pretest": "npm run typecheck:extensions",
     "test": "npm run test:unit && npm run test:integration",
     "test:smoke": "node --experimental-strip-types tests/smoke/run.ts",
@@ -136,6 +137,7 @@
     "@types/node": "^24.12.0",
     "@types/picomatch": "^4.0.2",
     "c8": "^11.0.0",
+    "esbuild": "^0.25.12",
     "jiti": "^2.6.1",
     "typescript": "^5.4.0"
   },
scripts/compile-tests.mjs (new file, 214 lines)

@@ -0,0 +1,214 @@
#!/usr/bin/env node
/**
 * Compile all TypeScript source + test files to dist-test/ using esbuild.
 * Run compiled JS directly with node --test (no per-file TS overhead).
 *
 * Usage: node scripts/compile-tests.mjs
 */

import { cp, mkdir, readdir, readFile, rm, writeFile } from 'node:fs/promises';
import { existsSync, symlinkSync } from 'node:fs';
import { createRequire } from 'node:module';
import { join } from 'node:path';
import { fileURLToPath } from 'node:url';

const __dirname = fileURLToPath(new URL('.', import.meta.url));
const ROOT = join(__dirname, '..');

const require = createRequire(import.meta.url);
const esbuild = require(join(ROOT, 'node_modules/esbuild'));

// Directories to skip while recursively collecting files by extension
const SKIP_DIRS = new Set(['node_modules', 'templates', '__tests__', 'integration']);

async function collectFiles(dir, exts = ['.ts', '.mjs']) {
  const results = [];
  let entries;
  try {
    entries = await readdir(dir, { withFileTypes: true });
  } catch {
    return results;
  }
  for (const entry of entries) {
    if (SKIP_DIRS.has(entry.name)) continue;
    const full = join(dir, entry.name);
    if (entry.isDirectory()) {
      results.push(...await collectFiles(full, exts));
    } else if (
      exts.some(ext => entry.name.endsWith(ext)) &&
      !entry.name.endsWith('.d.ts')
    ) {
      results.push(full);
    }
  }
  return results;
}

// Dirs to skip when copying assets (node_modules are never useful in dist-test)
const ASSET_SKIP_DIRS = new Set(['node_modules', '__tests__', 'integration']);

/**
 * Recursively copy files from srcDir to destDir.
 * Skips node_modules, __tests__, and integration/. Copies everything else:
 * .ts/.tsx originals (for jiti), .mjs helpers, .md/.yaml/.json assets, etc.
 * esbuild's compiled .js output already lands in dist-test, so we just
 * overlay the asset files on top.
 */
async function copyAssets(srcDir, destDir) {
  let entries;
  try {
    entries = await readdir(srcDir, { withFileTypes: true });
  } catch {
    return; // directory doesn't exist, nothing to copy
  }
  for (const entry of entries) {
    if (ASSET_SKIP_DIRS.has(entry.name)) continue;
    const srcPath = join(srcDir, entry.name);
    const destPath = join(destDir, entry.name);
    if (entry.isDirectory()) {
      await copyAssets(srcPath, destPath);
    } else {
      await mkdir(destDir, { recursive: true });
      await cp(srcPath, destPath, { force: true });
    }
  }
}

async function main() {
  const start = Date.now();

  // Collect entry points from src/ and packages/*/src/
  const srcFiles = await collectFiles(join(ROOT, 'src'));

  const packagesDir = join(ROOT, 'packages');
  const pkgEntries = await readdir(packagesDir, { withFileTypes: true });
  const packageFiles = [];
  for (const entry of pkgEntries) {
    if (!entry.isDirectory()) continue;
    const pkgSrc = join(packagesDir, entry.name, 'src');
    packageFiles.push(...await collectFiles(pkgSrc));
  }

  // Also compile web/lib/ — some tests import from ../../web/lib/
  const webLibFiles = await collectFiles(join(ROOT, 'web', 'lib'));

  const entryPoints = [...srcFiles, ...packageFiles, ...webLibFiles];
  console.log(`Compiling ${entryPoints.length} files to dist-test/...`);

  // bundle:false transforms TypeScript but keeps import specifiers verbatim.
  // We post-process the output to rewrite .ts → .js in import strings.
  await esbuild.build({
    entryPoints,
    outdir: join(ROOT, 'dist-test'),
    outbase: ROOT,
    bundle: false,
    format: 'esm',
    platform: 'node',
    target: 'node22',
    sourcemap: 'inline',
    packages: 'external',
    logLevel: 'warning',
  });

  // Copy non-compiled assets from src/ to dist-test/src/, maintaining structure.
  // Tests use import.meta.url to resolve sibling .md, .yaml, .json, .ts etc.
  // Also copy original .ts files — jiti-based imports load .ts source directly.
  const srcDir = join(ROOT, 'src');
  const distSrcDir = join(ROOT, 'dist-test', 'src');
  await copyAssets(srcDir, distSrcDir);
  console.log('Copied non-TS assets and .ts source files to dist-test/src/');

  // Copy packages/*/src/ assets as well
  for (const entry of pkgEntries) {
    if (!entry.isDirectory()) continue;
    const pkgSrc = join(packagesDir, entry.name, 'src');
    const pkgDistSrc = join(ROOT, 'dist-test', 'packages', entry.name, 'src');
    await copyAssets(pkgSrc, pkgDistSrc);
  }

  // Copy web/lib/ assets (tests import from ../../web/lib/ relative to dist-test/src/tests/)
  await copyAssets(join(ROOT, 'web', 'lib'), join(ROOT, 'dist-test', 'web', 'lib'));

  // Copy scripts/ non-TS files (.cjs etc) — some tests require() scripts directly
  await copyAssets(join(ROOT, 'scripts'), join(ROOT, 'dist-test', 'scripts'));

  // Copy root package.json — some tests read it to check version/engines fields
  await cp(join(ROOT, 'package.json'), join(ROOT, 'dist-test', 'package.json'), { force: true });

  // Copy root dist/ into dist-test/dist/ — some tests compute projectRoot as
  // 3 levels up from dist-test/src/tests/ which lands at dist-test/, then
  // import from dist/mcp-server.js etc.
  const rootDistDir = join(ROOT, 'dist');
  const distTestDistDir = join(ROOT, 'dist-test', 'dist');
  await copyAssets(rootDistDir, distTestDistDir);

  // Post-process: rewrite .ts import specifiers to .js in all compiled JS files.
  // esbuild with bundle:false preserves original specifiers; Node can't load .ts.
  const compiledJsFiles = await collectFiles(join(ROOT, 'dist-test'), ['.js']);
  // Regexes match .ts in from/import() strings but not sourceMappingURL comments
  const tsImportRe = /(from\s+["'])(\.\.?\/[^"']*?)\.ts(["'])/g;
  const tsDynImportRe = /(import\(["'])(\.\.?\/[^"']*?)\.ts(["'])\)/g;

  let rewritten = 0;
  await Promise.all(compiledJsFiles.map(async (file) => {
    const src = await readFile(file, 'utf-8');
    const out = src
      .replace(tsImportRe, (_, a, b, c) => `${a}${b}.js${c}`)
      .replace(tsDynImportRe, (_, a, b, c) => `${a}${b}.js${c})`);
    if (out !== src) {
      await writeFile(file, out, 'utf-8');
      rewritten++;
    }
  }));
  if (rewritten > 0) {
    console.log(`Rewrote .ts → .js imports in ${rewritten} files`);
  }

  // Remove stale compiled test files: dist-test entries whose source no longer
  // exists in a non-integration source directory (e.g. a test moved to
  // integration/). Only cleans *.test.js and *.test.ts files to avoid touching
  // non-test outputs.
  const testDirsToClean = [
    [join(ROOT, 'dist-test', 'src', 'tests'), join(ROOT, 'src', 'tests')],
    [join(ROOT, 'dist-test', 'src', 'resources', 'extensions', 'gsd', 'tests'),
     join(ROOT, 'src', 'resources', 'extensions', 'gsd', 'tests')],
  ];
  let staleCleaned = 0;
  for (const [distDir, srcTestDir] of testDirsToClean) {
    let distEntries;
    try { distEntries = await readdir(distDir, { withFileTypes: true }); } catch { continue; }
    for (const entry of distEntries) {
      if (!entry.isFile()) continue;
      if (!entry.name.match(/\.test\.(js|ts)$/)) continue;
      const stem = entry.name.replace(/\.(js|ts)$/, '');
      // Source could be .ts or .mjs (esbuild compiles both to .js)
      const hasTsSrc = existsSync(join(srcTestDir, stem + '.ts'));
      const hasMjsSrc = existsSync(join(srcTestDir, stem + '.mjs'));
      if (!hasTsSrc && !hasMjsSrc) {
        await rm(join(distDir, entry.name));
        staleCleaned++;
      }
    }
  }
  if (staleCleaned > 0) {
    console.log(`Removed ${staleCleaned} stale compiled test files from dist-test/`);
  }

  // Ensure dist-test/node_modules exists so resource-loader.ts (which computes
  // packageRoot from import.meta.url) resolves gsdNodeModules to a real path.
  // Without this, initResources creates dangling symlinks in test environments.
  const distNodeModules = join(ROOT, 'dist-test', 'node_modules');
  if (!existsSync(distNodeModules)) {
    symlinkSync(join(ROOT, 'node_modules'), distNodeModules);
  }

  const elapsed = ((Date.now() - start) / 1000).toFixed(2);
  console.log(`Done in ${elapsed}s`);
}

main().catch(err => {
  console.error(err);
  process.exit(1);
});
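The .ts → .js specifier rewrite above can be checked standalone. This sketch reuses the script's two regexes on a synthetic source string; the file contents are made up for illustration.

```javascript
// The two rewrite regexes, copied from compile-tests.mjs.
const tsImportRe = /(from\s+["'])(\.\.?\/[^"']*?)\.ts(["'])/g;
const tsDynImportRe = /(import\(["'])(\.\.?\/[^"']*?)\.ts(["'])\)/g;

// Synthetic compiled output: one static and one dynamic .ts import.
const src = [
  'import { foo } from "../lib/foo.ts";',
  'const bar = await import("./bar.ts");',
].join('\n');

const out = src
  .replace(tsImportRe, (_, a, b, c) => `${a}${b}.js${c}`)
  .replace(tsDynImportRe, (_, a, b, c) => `${a}${b}.js${c})`);

console.log(out);
// import { foo } from "../lib/foo.js";
// const bar = await import("./bar.js");
```

Note that only relative specifiers (`./`, `../`) are touched, so bare package imports and `sourceMappingURL` comments pass through unchanged.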
scripts/dist-test-resolve.mjs (new file, 46 lines)

@@ -0,0 +1,46 @@
/**
 * Minimal Node.js import hook for running tests from dist-test/.
 *
 * esbuild with bundle:false preserves import specifiers verbatim, so compiled
 * .js files still import '../foo.ts'. This hook redirects those to '.js' so
 * Node can find the compiled output.
 *
 * Also redirects @gsd bare imports to their compiled counterparts in dist-test.
 */

import { fileURLToPath, pathToFileURL } from 'node:url';
import { existsSync } from 'node:fs';
import { join } from 'node:path';

// dist-test root — everything compiled lands here
const DIST_TEST = new URL('../dist-test/', import.meta.url).href;

// Absolute paths to compiled @gsd/* entry points
const GSD_ALIASES = {
  '@gsd/pi-coding-agent': new URL('../dist-test/packages/pi-coding-agent/src/index.js', import.meta.url).href,
  '@gsd/pi-ai/oauth': new URL('../dist-test/packages/pi-ai/src/utils/oauth/index.js', import.meta.url).href,
  '@gsd/pi-ai': new URL('../dist-test/packages/pi-ai/src/index.js', import.meta.url).href,
  '@gsd/pi-agent-core': new URL('../dist-test/packages/pi-agent-core/src/index.js', import.meta.url).href,
  '@gsd/pi-tui': new URL('../dist-test/packages/pi-tui/src/index.js', import.meta.url).href,
  '@gsd/native': new URL('../dist-test/packages/native/src/index.js', import.meta.url).href,
};

export function resolve(specifier, context, nextResolve) {
  // 1. @gsd/* bare imports → compiled dist-test counterpart
  if (specifier in GSD_ALIASES) {
    return nextResolve(GSD_ALIASES[specifier], context);
  }

  // 2. .ts relative imports inside dist-test → .js
  if (
    specifier.endsWith('.ts') &&
    (specifier.startsWith('./') || specifier.startsWith('../')) &&
    context.parentURL &&
    context.parentURL.startsWith(DIST_TEST)
  ) {
    const jsSpecifier = specifier.slice(0, -3) + '.js';
    return nextResolve(jsSpecifier, context);
  }

  return nextResolve(specifier, context);
}
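The hook's resolve() contract can be exercised without Node's loader machinery by stubbing nextResolve. The DIST_TEST URL and parent URLs below are illustrative, not the repo's real paths.

```javascript
// Stand-alone harness for the .ts → .js branch of the resolve hook
// (a sketch: DIST_TEST and the parent URLs are made-up paths).
const DIST_TEST = 'file:///repo/dist-test/';

function resolve(specifier, context, nextResolve) {
  if (
    specifier.endsWith('.ts') &&
    (specifier.startsWith('./') || specifier.startsWith('../')) &&
    context.parentURL &&
    context.parentURL.startsWith(DIST_TEST)
  ) {
    return nextResolve(specifier.slice(0, -3) + '.js', context);
  }
  return nextResolve(specifier, context);
}

// Stubbed nextResolve records whatever specifier reaches it.
const seen = [];
const next = (spec) => (seen.push(spec), { url: spec });

// Inside dist-test/: the .ts specifier is rewritten.
resolve('../foo.ts', { parentURL: 'file:///repo/dist-test/src/tests/a.test.js' }, next);
// Outside dist-test/: the specifier passes through untouched.
resolve('../foo.ts', { parentURL: 'file:///repo/src/a.ts' }, next);

console.log(seen); // [ '../foo.js', '../foo.ts' ]
```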
scripts/ensure-workspace-builds.cjs

@@ -18,25 +18,6 @@ const { existsSync, statSync, readdirSync } = require('fs')
 const { resolve, join } = require('path')
 const { execSync } = require('child_process')
 
-const root = resolve(__dirname, '..')
-const packagesDir = join(root, 'packages')
-
-// Skip if packages/ doesn't exist (published tarball / end-user install)
-if (!existsSync(packagesDir)) process.exit(0)
-
-// Skip in CI — the pipeline runs `npm run build` explicitly
-if (process.env.CI === 'true' || process.env.CI === '1') process.exit(0)
-
-// Workspace packages that need dist/index.js at runtime.
-// Order matters: dependencies must build before dependents.
-const WORKSPACE_PACKAGES = [
-  'native',
-  'pi-tui',
-  'pi-ai',
-  'pi-agent-core',
-  'pi-coding-agent',
-]
-
 /**
  * Returns the most recent mtime (ms) of any .ts file under dir, recursively.
  * Returns 0 if no .ts files found.
@@ -56,31 +37,54 @@ function newestSrcMtime(dir) {
   return newest
 }
 
-const stale = []
-for (const pkg of WORKSPACE_PACKAGES) {
-  const distIndex = join(packagesDir, pkg, 'dist', 'index.js')
-  if (!existsSync(distIndex)) {
-    stale.push(pkg)
-    continue
-  }
-  const distMtime = statSync(distIndex).mtimeMs
-  const srcMtime = newestSrcMtime(join(packagesDir, pkg, 'src'))
-  if (srcMtime > distMtime) {
-    stale.push(pkg)
-  }
-}
-
-if (stale.length === 0) process.exit(0)
-
-process.stderr.write(` Building ${stale.length} workspace package(s) with stale or missing dist/: ${stale.join(', ')}\n`)
-
-for (const pkg of stale) {
-  const pkgDir = join(packagesDir, pkg)
-  try {
-    execSync('npm run build', { cwd: pkgDir, stdio: 'pipe' })
-    process.stderr.write(`  ✓ ${pkg}\n`)
-  } catch (err) {
-    process.stderr.write(`  ✗ ${pkg} build failed: ${err.message}\n`)
-    // Non-fatal — the user can run `npm run build` manually
-  }
-}
+if (require.main === module) {
+  const root = resolve(__dirname, '..')
+  const packagesDir = join(root, 'packages')
+
+  // Skip if packages/ doesn't exist (published tarball / end-user install)
+  if (!existsSync(packagesDir)) process.exit(0)
+
+  // Skip in CI — the pipeline runs `npm run build` explicitly
+  if (process.env.CI === 'true' || process.env.CI === '1') process.exit(0)
+
+  // Workspace packages that need dist/index.js at runtime.
+  // Order matters: dependencies must build before dependents.
+  const WORKSPACE_PACKAGES = [
+    'native',
+    'pi-tui',
+    'pi-ai',
+    'pi-agent-core',
+    'pi-coding-agent',
+  ]
+
+  const stale = []
+  for (const pkg of WORKSPACE_PACKAGES) {
+    const distIndex = join(packagesDir, pkg, 'dist', 'index.js')
+    if (!existsSync(distIndex)) {
+      stale.push(pkg)
+      continue
+    }
+    const distMtime = statSync(distIndex).mtimeMs
+    const srcMtime = newestSrcMtime(join(packagesDir, pkg, 'src'))
+    if (srcMtime > distMtime) {
+      stale.push(pkg)
+    }
+  }
+
+  if (stale.length === 0) process.exit(0)
+
+  process.stderr.write(` Building ${stale.length} workspace package(s) with stale or missing dist/: ${stale.join(', ')}\n`)
+
+  for (const pkg of stale) {
+    const pkgDir = join(packagesDir, pkg)
+    try {
+      execSync('npm run build', { cwd: pkgDir, stdio: 'pipe' })
+      process.stderr.write(`  ✓ ${pkg}\n`)
+    } catch (err) {
+      process.stderr.write(`  ✗ ${pkg} build failed: ${err.message}\n`)
+      // Non-fatal — the user can run `npm run build` manually
+    }
+  }
+}
+
+module.exports = { newestSrcMtime }
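The newestSrcMtime semantics that the new unit tests cover (missing dir returns 0, recursion into subdirectories, node_modules skipped) can be sketched from scratch. This is not the repo's implementation verbatim, just an illustration of the same contract.

```javascript
import { mkdtempSync, mkdirSync, writeFileSync, utimesSync, readdirSync, statSync } from 'node:fs';
import { join } from 'node:path';
import { tmpdir } from 'node:os';

// Returns the max mtime (ms) of .ts files under dir, recursing into
// subdirectories, skipping node_modules, and returning 0 for a missing
// dir or a dir with no .ts files.
function newestSrcMtime(dir) {
  let newest = 0;
  let entries;
  try { entries = readdirSync(dir, { withFileTypes: true }); } catch { return 0; }
  for (const e of entries) {
    if (e.name === 'node_modules') continue;
    const full = join(dir, e.name);
    if (e.isDirectory()) newest = Math.max(newest, newestSrcMtime(full));
    else if (e.name.endsWith('.ts')) newest = Math.max(newest, statSync(full).mtimeMs);
  }
  return newest;
}

// Quick check in a throwaway temp tree: one real .ts file, plus a .ts file
// under node_modules whose mtime is pushed 60 s into the future and must
// therefore be ignored by the scan.
const root = mkdtempSync(join(tmpdir(), 'mtime-demo-'));
writeFileSync(join(root, 'a.ts'), '');
mkdirSync(join(root, 'node_modules'));
writeFileSync(join(root, 'node_modules', 'b.ts'), '');
utimesSync(join(root, 'node_modules', 'b.ts'), new Date(), new Date(Date.now() + 60_000));

console.log(newestSrcMtime(join(root, 'missing'))); // 0
console.log(newestSrcMtime(root) < Date.now() + 1000); // true (node_modules skipped)
```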
scripts/test-reporter-compact.mjs (new file, 44 lines)

@@ -0,0 +1,44 @@
|||
/**
|
||||
* Compact test reporter: silent on pass, prints failures + final summary.
|
||||
* Usage: --test-reporter=./scripts/test-reporter-compact.mjs
|
||||
*/
|
||||
import { Transform } from 'node:stream';
|
||||
|
||||
export default class CompactReporter extends Transform {
|
||||
#pass = 0;
|
||||
#fail = 0;
|
||||
#skip = 0;
|
||||
#failures = [];
|
||||
|
||||
constructor() {
|
||||
super({ objectMode: true });
|
||||
}
|
||||
|
||||
_transform(event, _enc, cb) {
|
||||
switch (event.type) {
|
||||
case 'test:pass':
|
||||
if (!event.data.skip) this.#pass++;
|
||||
else this.#skip++;
|
||||
break;
|
||||
case 'test:fail': {
|
||||
this.#fail++;
|
||||
const { name, details } = event.data;
|
||||
const err = details?.error;
|
||||
const msg = err?.message ?? String(err ?? 'unknown');
|
||||
const loc = err?.cause?.stack?.split('\n')[1]?.trim() ?? '';
|
||||
this.#failures.push(` ✖ ${name}\n ${msg}${loc ? `\n ${loc}` : ''}`);
|
||||
break;
|
||||
}
|
||||
}
|
||||
cb();
|
||||
}
|
||||
|
||||
_flush(cb) {
|
||||
if (this.#failures.length) {
|
||||
this.push(`\n✖ failing tests:\n${this.#failures.join('\n\n')}\n`);
|
||||
}
|
||||
const status = this.#fail === 0 ? '✔' : '✖';
|
||||
this.push(`\n${status} ${this.#pass} passed, ${this.#fail} failed, ${this.#skip} skipped\n`);
|
||||
cb();
|
||||
}
|
||||
}
|
||||
|
|
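A trimmed re-implementation of the reporter's counting logic, fed synthetic node:test events. The real reporter above also tracks skips and formats failure details; this sketch only demonstrates the objectMode Transform pattern it is built on.

```javascript
import { Transform } from 'node:stream';

class MiniReporter extends Transform {
  #pass = 0;
  #fail = 0;
  constructor() { super({ objectMode: true }); }
  _transform(event, _enc, cb) {
    // Count pass/fail events from the node:test event stream
    if (event.type === 'test:pass') this.#pass++;
    else if (event.type === 'test:fail') this.#fail++;
    cb();
  }
  _flush(cb) {
    // Emit a single summary line once the stream ends
    this.push(`${this.#fail === 0 ? '✔' : '✖'} ${this.#pass} passed, ${this.#fail} failed\n`);
    cb();
  }
}

const r = new MiniReporter();
r.on('data', (line) => process.stdout.write(String(line)));
r.write({ type: 'test:pass', data: {} });
r.write({ type: 'test:pass', data: {} });
r.write({ type: 'test:fail', data: { name: 'x' } });
r.end();
// prints "✖ 2 passed, 1 failed"
```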
skill-health.ts

@@ -13,7 +13,7 @@
  * research identified as critical for skill quality.
  */
 
-import { existsSync, readFileSync, readdirSync } from "node:fs";
+import { existsSync, readFileSync, readdirSync, statSync } from "node:fs";
 import { join } from "node:path";
 import { homedir } from "node:os";
 import type { UnitMetrics, MetricsLedger } from "./metrics.js";
@@ -210,7 +210,7 @@ export function formatSkillDetail(basePath: string, skillName: string): string {
   // Check for SKILL.md existence
   const skillPath = join(homedir(), ".agents", "skills", skillName, "SKILL.md");
   if (existsSync(skillPath)) {
-    const stat = require("node:fs").statSync(skillPath);
+    const stat = statSync(skillPath);
     lines.push("");
     lines.push(`SKILL.md: ${skillPath}`);
     lines.push(`Last modified: ${stat.mtime.toISOString().slice(0, 10)}`);
@@ -31,7 +31,7 @@ import {
   isInAutoWorktree,
   getAutoWorktreeOriginalBase,
   mergeMilestoneToMain,
-} from "../auto-worktree.ts";
+} from "../../auto-worktree.ts";
 
 const __dirname = dirname(fileURLToPath(import.meta.url));
 
@@ -78,9 +78,9 @@ function createMilestoneArtifacts(dir: string, mid: string): void {
 // ─── Source-level: verify the merge code exists in the "all complete" path ────
 
 test("auto-loop 'all milestones complete' path merges before stopping (#962)", () => {
-  const loopSrc = readFileSync(join(__dirname, "..", "auto", "phases.ts"), "utf-8");
+  const loopSrc = readFileSync(join(__dirname, "../..", "auto", "phases.ts"), "utf-8");
   const resolverSrc = readFileSync(
-    join(__dirname, "..", "worktree-resolver.ts"),
+    join(__dirname, "../..", "worktree-resolver.ts"),
     "utf-8",
   );
@@ -9,7 +9,7 @@ import { join } from "node:path";
 import { tmpdir } from "node:os";
 import test from "node:test";
 import assert from "node:assert/strict";
-import { runGSDDoctor } from "../doctor.ts";
+import { runGSDDoctor } from "../../doctor.ts";
 
 function makeTmp(name: string): string {
   const dir = join(tmpdir(), `atomic-closeout-${name}-${Date.now()}-${Math.random().toString(36).slice(2)}`);
@@ -4,7 +4,7 @@ import { mkdtempSync, mkdirSync, rmSync, writeFileSync } from "node:fs";
 import { join } from "node:path";
 import { tmpdir } from "node:os";
 
-import { runGSDDoctor, selectDoctorScope, filterDoctorIssues } from "../doctor.js";
+import { runGSDDoctor, selectDoctorScope, filterDoctorIssues } from "../../doctor.js";
 
 test("auto-preflight scopes to active milestone, ignoring historical", async (t) => {
   const tmpBase = mkdtempSync(join(tmpdir(), "gsd-auto-preflight-test-"));
@@ -11,19 +11,19 @@ import {
   diagnoseExpectedArtifact,
   buildLoopRemediationSteps,
   hasImplementationArtifacts,
-} from "../auto-recovery.ts";
-import { parseRoadmap, parsePlan } from "../parsers-legacy.ts";
-import { parseTaskPlanFile, clearParseCache } from "../files.ts";
-import { invalidateAllCaches } from "../cache.ts";
-import { deriveState, invalidateStateCache } from "../state.ts";
+} from "../../auto-recovery.ts";
+import { parseRoadmap, parsePlan } from "../../parsers-legacy.ts";
+import { parseTaskPlanFile, clearParseCache } from "../../files.ts";
+import { invalidateAllCaches } from "../../cache.ts";
+import { deriveState, invalidateStateCache } from "../../state.ts";
 import {
   openDatabase,
   closeDatabase,
   insertMilestone,
   insertSlice,
   insertTask,
-} from "../gsd-db.ts";
-import { renderPlanFromDb } from "../markdown-renderer.ts";
+} from "../../gsd-db.ts";
+import { renderPlanFromDb } from "../../markdown-renderer.ts";
 
 function makeTmpBase(): string {
   const base = join(tmpdir(), `gsd-test-${randomUUID()}`);
@@ -16,8 +16,8 @@ import assert from 'node:assert/strict';
 import { mkdirSync, writeFileSync, readFileSync, rmSync } from 'node:fs';
 import { join } from 'node:path';
 import { tmpdir } from 'node:os';
-import { getManifestStatus } from '../files.ts';
-import { collectSecretsFromManifest } from '../../get-secrets-from-user.ts';
+import { getManifestStatus } from '../../files.ts';
+import { collectSecretsFromManifest } from '../../../get-secrets-from-user.ts';
 
 function makeTempDir(prefix: string): string {
   const dir = join(tmpdir(), `${prefix}-${Date.now()}-${Math.random().toString(36).slice(2)}`);
@ -12,8 +12,8 @@ import { join } from "node:path";
|
|||
import { tmpdir } from "node:os";
|
||||
import { execSync } from "node:child_process";
|
||||
|
||||
import { createAutoWorktree, mergeMilestoneToMain } from "../auto-worktree.ts";
|
||||
import { nativeMergeSquash } from "../native-git-bridge.ts";
|
||||
import { createAutoWorktree, mergeMilestoneToMain } from "../../auto-worktree.ts";
|
||||
import { nativeMergeSquash } from "../../native-git-bridge.ts";
|
||||
|
||||
function run(cmd: string, cwd: string): string {
|
||||
return execSync(cmd, { cwd, stdio: ["ignore", "pipe", "pipe"], encoding: "utf-8" }).trim();
|
||||
|
|
@ -88,7 +88,7 @@ test("#2151 bug 1: auto-stash unblocks merge when unrelated files are dirty", ()
|
|||
});
|
||||
|
||||
test("#2151 bug 2: nativeMergeSquash returns dirty filenames", async () => {
|
||||
const { nativeMergeSquash } = await import("../native-git-bridge.ts");
|
||||
const { nativeMergeSquash } = await import("../../native-git-bridge.ts");
|
||||
const repo = createTempRepo();
|
||||
try {
|
||||
run("git checkout -b milestone/M210", repo);
|
||||
|
|
@ -21,9 +21,9 @@ import {
|
|||
createAutoWorktree,
|
||||
mergeMilestoneToMain,
|
||||
getAutoWorktreeOriginalBase,
|
||||
} from "../auto-worktree.ts";
|
||||
import { getSliceBranchName } from "../worktree.ts";
|
||||
import { nativeMergeSquash } from "../native-git-bridge.ts";
|
||||
} from "../../auto-worktree.ts";
|
||||
import { getSliceBranchName } from "../../worktree.ts";
|
||||
import { nativeMergeSquash } from "../../native-git-bridge.ts";
|
||||
|
||||
function run(cmd: string, cwd: string): string {
|
||||
// Safe: all inputs are hardcoded test strings, not user input
|
||||
|
|
@@ -329,7 +329,7 @@ describe("auto-worktree-milestone-merge", { timeout: 300_000 }, () => {
 });
 
 test("#1738 bug 1: nativeMergeSquash detects dirty working tree", async () => {
-  const { nativeMergeSquash } = await import("../native-git-bridge.ts");
+  const { nativeMergeSquash } = await import("../../native-git-bridge.ts");
   const repo = freshRepo();
 
   run("git checkout -b milestone/M070", repo);
@@ -20,7 +20,7 @@ import {
   enterAutoWorktree,
   getAutoWorktreeOriginalBase,
   getActiveAutoWorktreeContext,
-} from "../auto-worktree.ts";
+} from "../../auto-worktree.ts";
 
 // Note: execSync is used intentionally in tests for git operations with
 // controlled, hardcoded inputs (no user input). This is safe and matches
@@ -150,7 +150,7 @@ describe("auto-worktree lifecycle", () => {
     run("git commit -m \"add milestone\"", tempDir);
 
     // Import createWorktree directly for manual worktree
-    const { createWorktree } = await import("../worktree-manager.ts");
+    const { createWorktree } = await import("../../worktree-manager.ts");
 
     // Create manual worktree (uses worktree/<name> branch)
     const manualWt = createWorktree(tempDir, "feature-x");
@@ -164,7 +164,7 @@ describe("auto-worktree lifecycle", () => {
 
     // Cleanup both
     teardownAutoWorktree(tempDir, "M003");
-    const { removeWorktree } = await import("../worktree-manager.ts");
+    const { removeWorktree } = await import("../../worktree-manager.ts");
     removeWorktree(tempDir, "feature-x");
   });
 
@@ -190,7 +190,7 @@ describe("auto-worktree lifecycle", () => {
     run("git add .", tempDir);
     run("git commit -m \"add milestone\"", tempDir);
 
-    const { GitServiceImpl } = await import("../git-service.ts");
+    const { GitServiceImpl } = await import("../../git-service.ts");
 
     // Create worktree
     const wtPath = createAutoWorktree(tempDir, "M005");
@@ -215,7 +215,7 @@ describe("auto-worktree lifecycle", () => {
     run("git commit -m \"add milestone\"", tempDir);
 
     // Simulate a crash leaving a stale directory with no .git file.
-    const { worktreePath } = await import("../worktree-manager.ts");
+    const { worktreePath } = await import("../../worktree-manager.ts");
     const staleDir = worktreePath(tempDir, "M010");
     mkdirSync(staleDir, { recursive: true });
     writeFileSync(join(staleDir, "orphan.txt"), "stale leftover\n");
@@ -12,7 +12,7 @@
 import { describe, it } from "node:test";
 import assert from "node:assert/strict";
 
-import { computeBudgets } from "../context-budget.js";
+import { computeBudgets } from "../../context-budget.js";
 
 // ─── Pure threshold / pipeline tests ──────────────────────────────────────────
 // These test the budget engine outputs that the continue-here monitor relies on.
@@ -164,7 +164,7 @@ describe("continue-here", () => {
   describe("continueHereFired runtime record field", () => {
     it("AutoUnitRuntimeRecord includes continueHereFired with default false", async (t) => {
       // Import writeUnitRuntimeRecord to verify the field is present and defaults
-      const { writeUnitRuntimeRecord, readUnitRuntimeRecord, clearUnitRuntimeRecord } = await import("../unit-runtime.js");
+      const { writeUnitRuntimeRecord, readUnitRuntimeRecord, clearUnitRuntimeRecord } = await import("../../unit-runtime.js");
       const fs = await import("node:fs");
       const path = await import("node:path");
       const os = await import("node:os");
@@ -202,7 +202,7 @@ describe("continue-here", () => {
 
   describe("context-pressure monitor integration", () => {
     it("should fire wrap-up when context >= threshold and mark continueHereFired", async (t) => {
-      const { writeUnitRuntimeRecord, readUnitRuntimeRecord, clearUnitRuntimeRecord } = await import("../unit-runtime.js");
+      const { writeUnitRuntimeRecord, readUnitRuntimeRecord, clearUnitRuntimeRecord } = await import("../../unit-runtime.js");
       const fs = await import("node:fs");
       const path = await import("node:path");
       const os = await import("node:os");
@@ -10,7 +10,7 @@ import { join } from "node:path";
 import { tmpdir } from "node:os";
 import test from "node:test";
 import assert from "node:assert/strict";
-import { runGSDDoctor } from "../doctor.ts";
+import { runGSDDoctor } from "../../doctor.ts";
 
 function makeTmp(name: string): string {
   const dir = join(tmpdir(), `doctor-deferral-${name}-${Date.now()}-${Math.random().toString(36).slice(2)}`);
@@ -10,7 +10,7 @@ import assert from "node:assert/strict";
 import { mkdtempSync, mkdirSync, readFileSync, rmSync, writeFileSync } from "node:fs";
 import { join } from "node:path";
 import { tmpdir } from "node:os";
-import { runGSDDoctor } from "../doctor.js";
+import { runGSDDoctor } from "../../doctor.js";
 
 test("doctor fix=true sanitizes em-dash in milestone title", async (t) => {
   const tmpBase = mkdtempSync(join(tmpdir(), "gsd-doctor-delim-"));
@@ -4,8 +4,8 @@ import { mkdtempSync, mkdirSync, rmSync, writeFileSync, existsSync } from "node:
 import { join } from "node:path";
 import { tmpdir } from "node:os";
 
-import { runGSDDoctor } from "../doctor.js";
-import { formatDoctorReportJson } from "../doctor-format.js";
+import { runGSDDoctor } from "../../doctor.js";
+import { formatDoctorReportJson } from "../../doctor-format.js";
 // ── Helpers ─────────────────────────────────────────────────────────────────
 
 function makeBase(): { base: string; gsd: string; mDir: string } {
@@ -230,7 +230,7 @@ describe('doctor-enhancements', async () => {
     const historyPath = join(gsd, "doctor-history.jsonl");
     assert.ok(existsSync(historyPath), "doctor-history.jsonl is created after run");
 
-    const { readDoctorHistory } = await import("../doctor.js");
+    const { readDoctorHistory } = await import("../../doctor.js");
     const history = await readDoctorHistory(base);
     assert.ok(history.length >= 1, "history has at least one entry");
     assert.ok(typeof history[0]?.ts === "string", "history entry has ts field");
@@ -20,7 +20,7 @@ import {
   runEnvironmentChecks,
   environmentResultsToDoctorIssues,
   checkEnvironmentHealth,
-} from "../doctor-environment.ts";
+} from "../../doctor-environment.ts";
 /** Create a directory tree with files. */
 function createDir(files: Record<string, string> = {}): string {
   const dir = mkdtempSync(join(tmpdir(), "gsd-wt-env-"));
@@ -26,7 +26,7 @@ import {
   formatEnvironmentReport,
   checkEnvironmentHealth,
   type EnvironmentCheckResult,
-} from "../doctor-environment.ts";
+} from "../../doctor-environment.ts";
 function createProjectDir(files: Record<string, string> = {}): string {
   const dir = mkdtempSync(join(tmpdir(), "gsd-env-test-"));
   for (const [name, content] of Object.entries(files)) {
@@ -14,8 +14,8 @@ import { join } from "node:path";
 import { tmpdir } from "node:os";
 import test from "node:test";
 import assert from "node:assert/strict";
-import { runGSDDoctor } from "../doctor.ts";
-import { closeDatabase } from "../gsd-db.ts";
+import { runGSDDoctor } from "../../doctor.ts";
+import { closeDatabase } from "../../gsd-db.ts";
 
 function makeTmp(name: string): string {
   const dir = join(tmpdir(), `doctor-fixlevel-${name}-${Date.now()}-${Math.random().toString(36).slice(2)}`);
@@ -15,7 +15,7 @@ import { join } from "node:path";
 import { tmpdir } from "node:os";
 import { execSync } from "node:child_process";
 
-import { runGSDDoctor } from "../doctor.ts";
+import { runGSDDoctor } from "../../doctor.ts";
 function run(cmd: string, cwd: string): string {
   return execSync(cmd, { cwd, stdio: ["ignore", "pipe", "pipe"], encoding: "utf-8" }).trim();
 }
@@ -23,7 +23,7 @@ import {
   checkHealEscalation,
   resetProactiveHealing,
   formatHealthSummary,
-} from "../doctor-proactive.ts";
+} from "../../doctor-proactive.ts";
 function run(cmd: string, cwd: string): string {
   return execSync(cmd, { cwd, stdio: ["ignore", "pipe", "pipe"], encoding: "utf-8" }).trim();
 }
@@ -12,7 +12,7 @@ import { join } from "node:path";
 import { tmpdir } from "node:os";
 import test from "node:test";
 import assert from "node:assert/strict";
-import { runGSDDoctor } from "../doctor.ts";
+import { runGSDDoctor } from "../../doctor.ts";
 
 function makeTmp(name: string): string {
   const dir = join(tmpdir(), `doctor-roadmap-summary-${name}-${Date.now()}-${Math.random().toString(36).slice(2)}`);
@@ -14,7 +14,7 @@ import { join } from "node:path";
 import { tmpdir } from "node:os";
 import { execSync } from "node:child_process";
 
-import { runGSDDoctor } from "../doctor.ts";
+import { runGSDDoctor } from "../../doctor.ts";
 function run(cmd: string, cwd: string): string {
   return execSync(cmd, { cwd, stdio: ["ignore", "pipe", "pipe"], encoding: "utf-8" }).trim();
 }
@@ -4,7 +4,7 @@ import { mkdtempSync, mkdirSync, readFileSync, rmSync, writeFileSync, existsSync
 import { join } from "node:path";
 import { tmpdir } from "node:os";
 
-import { formatDoctorReport, runGSDDoctor, summarizeDoctorIssues, filterDoctorIssues, selectDoctorScope, validateTitle } from "../doctor.js";
+import { formatDoctorReport, runGSDDoctor, summarizeDoctorIssues, filterDoctorIssues, selectDoctorScope, validateTitle } from "../../doctor.js";
 const tmpBase = mkdtempSync(join(tmpdir(), "gsd-doctor-test-"));
 const gsd = join(tmpBase, ".gsd");
 const mDir = join(gsd, "milestones", "M001");
@@ -34,11 +34,11 @@ import { join } from "node:path";
 import { tmpdir } from "node:os";
 import { stringify, parse } from "yaml";
 
-import { CustomWorkflowEngine } from "../custom-workflow-engine.ts";
-import { CustomExecutionPolicy } from "../custom-execution-policy.ts";
-import { createRun, listRuns } from "../run-manager.ts";
-import { readGraph, writeGraph } from "../graph.ts";
-import { validateDefinition } from "../definition-loader.ts";
+import { CustomWorkflowEngine } from "../../custom-workflow-engine.ts";
+import { CustomExecutionPolicy } from "../../custom-execution-policy.ts";
+import { createRun, listRuns } from "../../run-manager.ts";
+import { readGraph, writeGraph } from "../../graph.ts";
+import { validateDefinition } from "../../definition-loader.ts";
 
 // ─── Helpers ─────────────────────────────────────────────────────────────
 
@@ -26,10 +26,10 @@ import {
   createAutoWorktree,
   mergeMilestoneToMain,
   autoWorktreeBranch,
-} from "../auto-worktree.ts";
-import { captureIntegrationBranch, getSliceBranchName } from "../worktree.ts";
-import { writeIntegrationBranch, readIntegrationBranch } from "../git-service.ts";
-import { nextMilestoneId, generateMilestoneSuffix } from "../guided-flow.ts";
+} from "../../auto-worktree.ts";
+import { captureIntegrationBranch, getSliceBranchName } from "../../worktree.ts";
+import { writeIntegrationBranch, readIntegrationBranch } from "../../git-service.ts";
+import { nextMilestoneId, generateMilestoneSuffix } from "../../guided-flow.ts";
 
 // ─── Helpers ────────────────────────────────────────────────────────────────
 
@@ -12,9 +12,9 @@ import { join } from "node:path";
 import { tmpdir } from "node:os";
 import { execFileSync } from "node:child_process";
 
-import { GIT_NO_PROMPT_ENV } from "../git-constants.ts";
-import { nativeAddAllWithExclusions } from "../native-git-bridge.ts";
-import { RUNTIME_EXCLUSION_PATHS } from "../git-service.ts";
+import { GIT_NO_PROMPT_ENV } from "../../git-constants.ts";
+import { nativeAddAllWithExclusions } from "../../native-git-bridge.ts";
+import { RUNTIME_EXCLUSION_PATHS } from "../../git-service.ts";
 function git(cwd: string, ...args: string[]): string {
   return execFileSync("git", args, { cwd, stdio: ["ignore", "pipe", "pipe"], encoding: "utf-8" }).trim();
 }
@@ -101,7 +101,7 @@ describe('git-locale', async () => {
     // We verify indirectly: the source code must pass env: GIT_NO_PROMPT_ENV.
     // Read the source and check for the pattern. This is a static check.
     const src = readFileSync(
-      join(import.meta.dirname, "..", "native-git-bridge.ts"),
+      join(import.meta.dirname, "../..", "native-git-bridge.ts"),
       "utf-8"
     );
 
@@ -14,7 +14,7 @@ import assert from "node:assert/strict";
 import {
   abortAndReset,
   formatGitError,
-} from "../git-self-heal.js";
+} from "../../git-self-heal.js";
 
 // ─── Helpers ─────────────────────────────────────────────────────
 
@@ -20,8 +20,8 @@ import {
   type CommitOptions,
   type PreMergeCheckResult,
   type TaskCommitContext,
-} from "../git-service.ts";
-import { nativeAddAllWithExclusions } from "../native-git-bridge.ts";
+} from "../../git-service.ts";
+import { nativeAddAllWithExclusions } from "../../native-git-bridge.ts";
 function run(command: string, cwd: string): string {
   return execSync(command, { cwd, stdio: ["ignore", "pipe", "pipe"], encoding: "utf-8" }).trim();
 }
@@ -1113,7 +1113,7 @@ describe('git-service', async () => {
   // ─── untrackRuntimeFiles: removes tracked runtime files from index ───
 
   test('untrackRuntimeFiles', async () => {
-    const { untrackRuntimeFiles } = await import("../gitignore.ts");
+    const { untrackRuntimeFiles } = await import("../../gitignore.ts");
     const repo = mkdtempSync(join(tmpdir(), "gsd-untrack-"));
     runGit(repo, ["init", "-b", "main"]);
     runGit(repo, ["config", "user.email", "test@test.com"]);
@@ -1222,7 +1222,7 @@ describe('git-service', async () => {
   // ─── ensureGitignore: always adds .gsd to gitignore ──────────────────
 
   test('ensureGitignore: adds .gsd entry', async () => {
-    const { ensureGitignore } = await import("../gitignore.ts");
+    const { ensureGitignore } = await import("../../gitignore.ts");
     const repo = mkdtempSync(join(tmpdir(), "gsd-gitignore-external-state-"));
 
     // Should add .gsd to gitignore (external state dir is a symlink)
@@ -22,8 +22,8 @@ import {
 import { join } from "node:path";
 import { tmpdir } from "node:os";
 
-import { ensureGitignore, hasGitTrackedGsdFiles } from "../gitignore.ts";
-import { migrateToExternalState } from "../migrate-external.ts";
+import { ensureGitignore, hasGitTrackedGsdFiles } from "../../gitignore.ts";
+import { migrateToExternalState } from "../../migrate-external.ts";
 
 // ─── Helpers ─────────────────────────────────────────────────────
 
@@ -7,7 +7,7 @@ import {
   writeBlockerPlaceholder,
   verifyExpectedArtifact,
   buildLoopRemediationSteps,
-} from "../auto-recovery.ts";
+} from "../../auto-recovery.ts";
 import { describe, test, beforeEach, afterEach } from 'node:test';
 import assert from 'node:assert/strict';
 
@@ -299,7 +299,7 @@ test('writeBlockerPlaceholder: updates DB task status for execute-task (#2531)',
   const base = createFixtureBase();
   try {
     const { openDatabase, closeDatabase, insertMilestone, insertSlice, insertTask, getTask, isDbAvailable } =
-      await import("../gsd-db.ts");
+      await import("../../gsd-db.ts");
 
     const dbPath = join(base, ".gsd", "gsd.db");
     // Create the tasks directory (required for artifact path resolution)
@@ -334,7 +334,7 @@ test('writeBlockerPlaceholder: does NOT update DB for non-execute-task types', a
   const base = createFixtureBase();
   try {
     const { openDatabase, closeDatabase, insertMilestone, insertSlice, getSlice, isDbAvailable } =
-      await import("../gsd-db.ts");
+      await import("../../gsd-db.ts");
 
     const dbPath = join(base, ".gsd", "gsd.db");
     mkdirSync(join(base, ".gsd", "milestones", "M001", "slices", "S01"), { recursive: true });
@@ -24,7 +24,7 @@ import { join } from "node:path";
 import { tmpdir } from "node:os";
 import { execFileSync } from "node:child_process";
 
-import { isInheritedRepo } from "../repo-identity.ts";
+import { isInheritedRepo } from "../../repo-identity.ts";
 
 function run(cmd: string, args: string[], cwd: string): string {
   return execFileSync(cmd, args, {
@@ -12,15 +12,15 @@ import { mkdtempSync, mkdirSync, rmSync, writeFileSync, readFileSync, appendFile
 import { join } from 'node:path';
 import { tmpdir } from 'node:os';
 
-import { openDatabase, closeDatabase, isDbAvailable, _getAdapter } from '../gsd-db.ts';
-import { migrateFromMarkdown, parseDecisionsTable } from '../md-importer.ts';
+import { openDatabase, closeDatabase, isDbAvailable, _getAdapter } from '../../gsd-db.ts';
+import { migrateFromMarkdown, parseDecisionsTable } from '../../md-importer.ts';
 import {
   queryDecisions,
   queryRequirements,
   formatDecisionsForPrompt,
   formatRequirementsForPrompt,
-} from '../context-store.ts';
-import { saveDecisionToDb, generateDecisionsMd } from '../db-writer.ts';
+} from '../../context-store.ts';
+import { saveDecisionToDb, generateDecisionsMd } from '../../db-writer.ts';
 import { describe, test, beforeEach, afterEach } from 'node:test';
 import assert from 'node:assert/strict';
 
@@ -11,15 +11,15 @@ import { execSync } from 'node:child_process';
 import { join } from 'node:path';
 import { tmpdir } from 'node:os';
 
-import { deriveState } from '../state.ts';
-import { indexWorkspace } from '../workspace-index.ts';
-import { inlinePriorMilestoneSummary } from '../files.ts';
-import { getPriorSliceCompletionBlocker } from '../dispatch-guard.ts';
+import { deriveState } from '../../state.ts';
+import { indexWorkspace } from '../../workspace-index.ts';
+import { inlinePriorMilestoneSummary } from '../../files.ts';
+import { getPriorSliceCompletionBlocker } from '../../dispatch-guard.ts';
 import {
   getSliceBranchName,
   parseSliceBranch,
-} from '../worktree.ts';
-import { clearPathCache } from '../paths.ts';
+} from '../../worktree.ts';
+import { clearPathCache } from '../../paths.ts';
 import { describe, test, beforeEach, afterEach } from 'node:test';
 import assert from 'node:assert/strict';
 
@@ -50,11 +50,11 @@ import {
   transaction,
   isDbAvailable,
   _getAdapter,
-} from "../gsd-db.ts";
+} from "../../gsd-db.ts";
 
 // ── Tool handlers ─────────────────────────────────────────────────────────
-import { handleCompleteTask } from "../tools/complete-task.ts";
-import { handleCompleteSlice } from "../tools/complete-slice.ts";
+import { handleCompleteTask } from "../../tools/complete-task.ts";
+import { handleCompleteSlice } from "../../tools/complete-slice.ts";
 
 // ── Markdown renderer ─────────────────────────────────────────────────────
 import {
@@ -63,32 +63,32 @@ import {
   renderAllFromDb,
   detectStaleRenders,
   repairStaleRenders,
-} from "../markdown-renderer.ts";
+} from "../../markdown-renderer.ts";
 
 // ── State derivation ──────────────────────────────────────────────────────
 import {
   deriveStateFromDb,
   _deriveStateImpl,
   invalidateStateCache,
-} from "../state.ts";
+} from "../../state.ts";
 
 // ── Auto-migration ───────────────────────────────────────────────────────
 import {
   migrateHierarchyToDb,
   migrateFromMarkdown,
-} from "../md-importer.ts";
+} from "../../md-importer.ts";
 
 // ── Post-unit diagnostics ─────────────────────────────────────────────────
-import { detectRogueFileWrites } from "../auto-post-unit.ts";
+import { detectRogueFileWrites } from "../../auto-post-unit.ts";
 
 // ── Doctor ────────────────────────────────────────────────────────────────
-import { runGSDDoctor } from "../doctor.ts";
+import { runGSDDoctor } from "../../doctor.ts";
 
 // ── Undo/reset ────────────────────────────────────────────────────────────
-import { handleUndoTask, handleResetSlice } from "../undo.ts";
+import { handleUndoTask, handleResetSlice } from "../../undo.ts";
 
 // ── Cache invalidation ───────────────────────────────────────────────────
-import { invalidateAllCaches } from "../cache.ts";
+import { invalidateAllCaches } from "../../cache.ts";
 
 // ═══════════════════════════════════════════════════════════════════════════
 // Helpers
@@ -400,7 +400,7 @@ test("full lifecycle: migration through completion through doctor", async (t) =>
   writeFileSync(join(rogueDir, "T99-SUMMARY.md"), "# Rogue Summary\n", "utf-8");
 
   // Clear path cache so resolveTaskFile sees the newly written file
-  const { clearPathCache } = await import("../paths.ts");
+  const { clearPathCache } = await import("../../paths.ts");
   clearPathCache();
 
   const rogues = detectRogueFileWrites("execute-task", "M001/S01/T99", base);
@@ -458,7 +458,7 @@ test("recovery: DB loss → migrateFromMarkdown restores state, stale render det
   assert.equal(existsSync(dbPath), false, "DB file should be deleted");
 
   // Clear path caches so gsdRoot re-probes after DB deletion
-  const { clearPathCache: clearPaths } = await import("../paths.ts");
+  const { clearPathCache: clearPaths } = await import("../../paths.ts");
   clearPaths();
   invalidateAllCaches();
 
@@ -13,8 +13,8 @@ import {
   transformToGSD,
   generatePreview,
   writeGSDDirectory,
-} from '../migrate/index.ts';
-import { deriveState } from '../state.ts';
+} from '../../migrate/index.ts';
+import { deriveState } from '../../state.ts';
 import { describe, test, beforeEach, afterEach } from 'node:test';
 import assert from 'node:assert/strict';
 
@@ -24,7 +24,7 @@ import {
   isInAutoWorktree,
   getAutoWorktreeOriginalBase,
   mergeMilestoneToMain,
-} from "../auto-worktree.ts";
+} from "../../auto-worktree.ts";
 
 const __dirname = dirname(fileURLToPath(import.meta.url));
 
@@ -124,7 +124,7 @@ test("worktree swap on milestone transition: merge old, create new", () => {
 
 test("auto/phases.ts milestone transition block contains worktree lifecycle", () => {
   const phasesSrc = readFileSync(
-    join(__dirname, "..", "auto", "phases.ts"),
+    join(__dirname, "../..", "auto", "phases.ts"),
     "utf-8",
   );
 
@@ -147,7 +147,7 @@ test("auto/phases.ts milestone transition block contains worktree lifecycle", ()
 
 test("worktree-resolver mergeAndExit preserves branch when roadmap is missing (#1573)", () => {
   const resolverSrc = readFileSync(
-    join(__dirname, "..", "worktree-resolver.ts"),
+    join(__dirname, "../..", "worktree-resolver.ts"),
     "utf-8",
   );
 
@@ -32,12 +32,12 @@ import {
   mergeAllCompleted,
   formatMergeResults,
   type MergeResult,
-} from "../parallel-merge.ts";
-import type { WorkerInfo } from "../parallel-orchestrator.ts";
+} from "../../parallel-merge.ts";
+import type { WorkerInfo } from "../../parallel-orchestrator.ts";
 import {
   writeSessionStatus,
   readSessionStatus,
-} from "../session-status-io.ts";
+} from "../../session-status-io.ts";
 
 // ─── Helpers ──────────────────────────────────────────────────────────────────
 
@@ -26,12 +26,12 @@ import {
   getWorkerBatches,
   hasActiveWorkers,
   resetWorkerRegistry,
-} from '../../subagent/worker-registry.ts';
+} from '../../../subagent/worker-registry.ts';
 import {
   getBudgetAlertLevel,
   getNewBudgetAlertLevel,
   getBudgetEnforcementAction,
-} from '../auto-budget.ts';
+} from '../../auto-budget.ts';
 import {
   type UnitMetrics,
   type MetricsLedger,
@@ -42,7 +42,7 @@ import {
   formatCostProjection,
   getAverageCostPerUnitType,
   predictRemainingCost,
-} from '../metrics.ts';
+} from '../../metrics.ts';
 
 // ─── Fixture helpers ──────────────────────────────────────────────────────────
 
@@ -5,7 +5,7 @@ import { join } from "node:path";
 import { tmpdir } from "node:os";
 import { spawnSync } from "node:child_process";
 
-import { gsdRoot, _clearGsdRootCache } from "../paths.ts";
+import { gsdRoot, _clearGsdRootCache } from "../../paths.ts";
 /** Create a tmp dir and resolve symlinks + 8.3 short names (macOS /var→/private/var, Windows RUNNER~1→runneradmin). */
 function tmp(): string {
   const p = mkdtempSync(join(tmpdir(), "gsd-paths-test-"));
@@ -11,8 +11,8 @@
 
 import { describe, it, before, after } from 'node:test';
 import assert from 'node:assert';
-import { PluginImporter, type DiscoveryResult, type ImportManifest } from '../plugin-importer.js';
-import { getMarketplaceFixtures } from './marketplace-test-fixtures.js';
+import { PluginImporter, type DiscoveryResult, type ImportManifest } from '../../plugin-importer.js';
+import { getMarketplaceFixtures } from '../marketplace-test-fixtures.ts';
 
 // ============================================================================
 // Live Test Configuration
@@ -15,9 +15,9 @@ import { mkdtempSync, mkdirSync, rmSync, writeFileSync } from "node:fs";
 import { join } from "node:path";
 import { tmpdir } from "node:os";
 
-import { buildExistingMilestonesContext } from "../guided-flow-queue.ts";
-import type { GSDState, MilestoneRegistryEntry } from "../types.ts";
-import { createTestContext } from "./test-helpers.ts";
+import { buildExistingMilestonesContext } from "../../guided-flow-queue.ts";
+import type { GSDState, MilestoneRegistryEntry } from "../../types.ts";
+import { createTestContext } from "../test-helpers.ts";
 
 const { assertTrue, assertEq, report } = createTestContext();
 
@@ -17,10 +17,10 @@ import { mkdtempSync, mkdirSync, rmSync, writeFileSync, readFileSync, existsSync
 import { join } from 'node:path';
 import { tmpdir } from 'node:os';
 
-import { deriveState, invalidateStateCache } from '../state.ts';
-import { findMilestoneIds } from '../guided-flow.ts';
-import { saveQueueOrder, loadQueueOrder } from '../queue-order.ts';
-import { parseContextDependsOn } from '../files.ts';
+import { deriveState, invalidateStateCache } from '../../state.ts';
+import { findMilestoneIds } from '../../guided-flow.ts';
+import { saveQueueOrder, loadQueueOrder } from '../../queue-order.ts';
+import { parseContextDependsOn } from '../../files.ts';
 // ─── Fixture Helpers ───────────────────────────────────────────────────────
 
 function createFixtureBase(): string {
@@ -298,7 +298,7 @@ test('E2E: DB-backed path respects queue order (#2556)', async () => {
   // the dispatch guard (which respects queue order) blocked completion.
   const base = createFixtureBase();
   try {
-    const { openDatabase, closeDatabase, insertMilestone, isDbAvailable } = await import('../gsd-db.ts');
+    const { openDatabase, closeDatabase, insertMilestone, isDbAvailable } = await import('../../gsd-db.ts');
     const dbPath = join(base, '.gsd', 'gsd.db');
 
     // Create milestone directories (required for findMilestoneIds)
@@ -14,8 +14,8 @@ import { join } from "node:path";
 import { tmpdir } from "node:os";
 import { execSync } from "node:child_process";
 
-import { captureIntegrationBranch, getCurrentBranch } from "../worktree.ts";
-import { readIntegrationBranch, QUICK_BRANCH_RE } from "../git-service.ts";
+import { captureIntegrationBranch, getCurrentBranch } from "../../worktree.ts";
+import { readIntegrationBranch, QUICK_BRANCH_RE } from "../../git-service.ts";
 
 function run(command: string, cwd: string): string {
   return execSync(command, { cwd, stdio: ["ignore", "pipe", "pipe"], encoding: "utf-8" }).trim();
@@ -139,7 +139,7 @@ test('cleanupQuickBranch: merges back and cleans up (same session)', async () => {
 // Import and call cleanupQuickBranch
 // Use dynamic import to get a fresh module scope — the in-memory state
 // won't be set, so it will fall through to disk recovery
-const { cleanupQuickBranch } = await import("../quick.ts");
+const { cleanupQuickBranch } = await import("../../quick.ts");
 const result = cleanupQuickBranch();

 assert.ok(result, "cleanupQuickBranch returns true");
@@ -187,7 +187,7 @@ test('cleanupQuickBranch: recovers from disk state (cross-session)', async () => {

 process.chdir(repo);

-const { cleanupQuickBranch } = await import("../quick.ts");
+const { cleanupQuickBranch } = await import("../../quick.ts");
 const result = cleanupQuickBranch();

 assert.ok(result, "cross-session recovery returns true");
@@ -207,7 +207,7 @@ test('cleanupQuickBranch: no-op without pending state', async () => {
 const origCwd = process.cwd();
 process.chdir(repo);

-const { cleanupQuickBranch } = await import("../quick.ts");
+const { cleanupQuickBranch } = await import("../../quick.ts");
 const result = cleanupQuickBranch();

 assert.ok(!result, "returns false when no pending state");
@@ -5,12 +5,12 @@ import { join, dirname } from 'node:path';
 import { tmpdir } from 'node:os';
 import { fileURLToPath } from 'node:url';

-import { extractUatType } from '../files.ts';
-import { resolveSliceFile } from '../paths.ts';
-import { checkNeedsRunUat } from '../auto-prompts.ts';
+import { extractUatType } from '../../files.ts';
+import { resolveSliceFile } from '../../paths.ts';
+import { checkNeedsRunUat } from '../../auto-prompts.ts';

 const __dirname = dirname(fileURLToPath(import.meta.url));
-const worktreePromptsDir = join(__dirname, '..', 'prompts');
+const worktreePromptsDir = join(__dirname, '../..', 'prompts');

 function loadPromptFromWorktree(name: string, vars: Record<string, string> = {}): string {
 const path = join(worktreePromptsDir, `${name}.md`);
@@ -10,14 +10,14 @@ import { mkdtempSync, mkdirSync, rmSync, writeFileSync, readFileSync } from 'nod
 import { join } from 'node:path';
 import { tmpdir } from 'node:os';

-import { openDatabase, closeDatabase } from '../gsd-db.ts';
-import { migrateFromMarkdown } from '../md-importer.ts';
+import { openDatabase, closeDatabase } from '../../gsd-db.ts';
+import { migrateFromMarkdown } from '../../md-importer.ts';
 import {
 queryDecisions,
 queryRequirements,
 formatDecisionsForPrompt,
 formatRequirementsForPrompt,
-} from '../context-store.ts';
+} from '../../context-store.ts';
 import { test } from 'node:test';
 import assert from 'node:assert/strict';
@@ -18,10 +18,10 @@ import { execSync } from "node:child_process";
 import {
 createAutoWorktree,
 mergeMilestoneToMain,
-} from "../auto-worktree.ts";
-import { getSliceBranchName } from "../worktree.ts";
-import { abortAndReset } from "../git-self-heal.ts";
-import { runGSDDoctor } from "../doctor.ts";
+} from "../../auto-worktree.ts";
+import { getSliceBranchName } from "../../worktree.ts";
+import { abortAndReset } from "../../git-self-heal.ts";
+import { runGSDDoctor } from "../../doctor.ts";
 import { describe, test } from 'node:test';
 import assert from 'node:assert/strict';
@@ -14,7 +14,10 @@ import {
 renderHealthView,
 type ProgressFilter,
 } from "./visualizer-views.js";
+import { writeFileSync, mkdirSync } from "node:fs";
+import { join } from "node:path";
 import { writeExportFile } from "./export.js";
+import { gsdRoot } from "./paths.js";
 import { stripAnsi } from "../shared/mod.js";

 const TAB_COUNT = 10;

@@ -350,9 +353,6 @@ export class GSDVisualizerOverlay {
 // Capture current active tab's rendered lines as snapshot
 const snapshotLines = this.renderTabContent(this.activeTab, 80);
 const timestamp = new Date().toISOString().replace(/[:.]/g, "-").slice(0, 19);
-const { writeFileSync, mkdirSync } = require("node:fs");
-const { join } = require("node:path");
-const { gsdRoot } = require("./paths.js");
 const exportDir = gsdRoot(this.basePath);
 mkdirSync(exportDir, { recursive: true });
 const outPath = join(exportDir, `snapshot-${timestamp}.txt`);
@@ -11,7 +11,7 @@

 /** Format a millisecond duration as a compact human-readable string. */
 export function formatDuration(ms: number): string {
-if (ms < 1000) return `${ms}ms`;
+if (ms > 0 && ms < 1000) return `${ms}ms`;
 const s = Math.floor(ms / 1000);
 if (s < 60) return `${s}s`;
 const m = Math.floor(s / 60);
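The guard above is what makes `formatDuration(0)` return `'0s'`: zero no longer matches the sub-second branch, so it falls through to the seconds branch. A standalone sketch of the fixed function — the final return line is an assumed completion, since the hunk cuts off after `const m`:

```typescript
// Sketch of the fixed formatDuration. Everything through `const m` mirrors
// the diff; the last return line is an assumed completion for illustration.
function formatDuration(ms: number): string {
  if (ms > 0 && ms < 1000) return `${ms}ms`; // 0 no longer matches here
  const s = Math.floor(ms / 1000);
  if (s < 60) return `${s}s`; // formatDuration(0) now lands here: '0s'
  const m = Math.floor(s / 60);
  return `${m}m ${s % 60}s`; // assumed tail, not shown in the hunk
}

console.log(formatDuration(0));     // '0s' (previously '0ms')
console.log(formatDuration(999));   // '999ms'
console.log(formatDuration(61000)); // '1m 1s'
```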
@@ -54,9 +54,10 @@ export function updateWorker(id: string, status: "completed" | "failed"): void {
 if (entry) {
 entry.status = status;
 // Remove after a brief display window (5 seconds)
+// unref() so the timer doesn't keep the process alive in test environments
 setTimeout(() => {
 activeWorkers.delete(id);
-}, 5000);
+}, 5000).unref();
 }
 }
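Node's `setTimeout` returns a `Timeout` handle; calling `unref()` on it tells the event loop not to stay alive just for this timer, and `hasRef()` exposes that state. A minimal demonstration (the handle name and message are illustrative, not from the registry):

```typescript
// A referenced timer keeps the process alive until it fires; after unref()
// the process may exit first. hasRef() reports which state the timer is in.
const timer = setTimeout(() => console.log("display window elapsed"), 5000);

console.log(timer.hasRef()); // true — this timer would block process exit

timer.unref();
console.log(timer.hasRef()); // false — tests can finish without waiting 5s

clearTimeout(timer); // cancel so this sketch itself exits immediately
```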
@@ -1,11 +1,9 @@
 import test from "node:test";
 import assert from "node:assert/strict";
 import { readFileSync, existsSync } from "node:fs";
-import { resolve, dirname } from "node:path";
-import { fileURLToPath } from "node:url";
+import { resolve } from "node:path";

-const __dirname = dirname(fileURLToPath(import.meta.url));
-const root = resolve(__dirname, "../..");
+const root = process.cwd();

 function readFile(relativePath: string): string {
 const full = resolve(root, relativePath);
src/tests/ensure-workspace-builds.test.ts (new file, 64 lines)
@@ -0,0 +1,64 @@
+import { describe, it, beforeEach, afterEach } from "node:test";
+import assert from "node:assert/strict";
+import { mkdtempSync, writeFileSync, mkdirSync, rmSync, utimesSync } from "node:fs";
+import { tmpdir } from "node:os";
+import { join } from "node:path";
+import { createRequire } from "node:module";
+
+const require = createRequire(import.meta.url);
+const { newestSrcMtime } = require("../../scripts/ensure-workspace-builds.cjs");
+
+describe("newestSrcMtime", () => {
+let tmp: string;
+
+beforeEach(() => { tmp = mkdtempSync(join(tmpdir(), "gsd-mtime-test-")); });
+afterEach(() => { rmSync(tmp, { recursive: true, force: true }); });
+
+it("returns 0 for a non-existent directory", () => {
+assert.equal(newestSrcMtime(join(tmp, "does-not-exist")), 0);
+});
+
+it("returns 0 when directory has no .ts files", () => {
+writeFileSync(join(tmp, "index.js"), "");
+writeFileSync(join(tmp, "config.json"), "");
+assert.equal(newestSrcMtime(tmp), 0);
+});
+
+it("returns the mtime of a single .ts file", () => {
+const file = join(tmp, "index.ts");
+writeFileSync(file, "");
+const mtime = new Date("2024-01-15T10:00:00Z");
+utimesSync(file, mtime, mtime);
+assert.equal(newestSrcMtime(tmp), mtime.getTime());
+});
+
+it("returns the max mtime across multiple .ts files", () => {
+const older = join(tmp, "a.ts");
+const newer = join(tmp, "b.ts");
+writeFileSync(older, "");
+writeFileSync(newer, "");
+utimesSync(older, new Date("2024-01-01T00:00:00Z"), new Date("2024-01-01T00:00:00Z"));
+utimesSync(newer, new Date("2024-06-01T00:00:00Z"), new Date("2024-06-01T00:00:00Z"));
+assert.equal(newestSrcMtime(tmp), new Date("2024-06-01T00:00:00Z").getTime());
+});
+
+it("recurses into subdirectories", () => {
+const subdir = join(tmp, "nested", "deep");
+mkdirSync(subdir, { recursive: true });
+const file = join(subdir, "util.ts");
+writeFileSync(file, "");
+const mtime = new Date("2024-03-01T00:00:00Z");
+utimesSync(file, mtime, mtime);
+assert.equal(newestSrcMtime(tmp), mtime.getTime());
+});
+
+it("skips node_modules entirely", () => {
+const nm = join(tmp, "node_modules", "some-pkg");
+mkdirSync(nm, { recursive: true });
+const nmFile = join(nm, "index.ts");
+writeFileSync(nmFile, "");
+const future = new Date("2099-01-01T00:00:00Z");
+utimesSync(nmFile, future, future);
+assert.equal(newestSrcMtime(tmp), 0);
+});
+});
@@ -13,7 +13,7 @@ import { join, dirname } from 'node:path';
 import { fileURLToPath } from 'node:url';

 const __dirname = dirname(fileURLToPath(import.meta.url));
-const ROOT = join(__dirname, '..', '..');
+const ROOT = join(__dirname, '..', '..', '..');
 const SCRIPT_PATH = join(ROOT, 'scripts', 'ci_monitor.cjs');

 let passed = 0;
@@ -1,7 +1,7 @@
 import test from "node:test"
 import assert from "node:assert/strict"

-import { resolveTypeStrippingFlag } from "../web/ts-subprocess-flags.ts"
+import { resolveTypeStrippingFlag } from "../../web/ts-subprocess-flags.ts"

 // ---------------------------------------------------------------------------
 // Bug 1 — resolveTypeStrippingFlag selects the correct flag
@@ -8,12 +8,12 @@ import { PassThrough } from "node:stream";
 import { StringDecoder } from "node:string_decoder";

 const repoRoot = process.cwd();
-const bridge = await import("../web/bridge-service.ts");
-const onboarding = await import("../web/onboarding-service.ts");
+const bridge = await import("../../web/bridge-service.ts");
+const onboarding = await import("../../web/onboarding-service.ts");
 const { AuthStorage } = await import("@gsd/pi-coding-agent");
-const bootRoute = await import("../../web/app/api/boot/route.ts");
-const commandRoute = await import("../../web/app/api/session/command/route.ts");
-const eventsRoute = await import("../../web/app/api/session/events/route.ts");
+const bootRoute = await import("../../../web/app/api/boot/route.ts");
+const commandRoute = await import("../../../web/app/api/session/command/route.ts");
+const eventsRoute = await import("../../../web/app/api/session/events/route.ts");

 class FakeRpcChild extends EventEmitter {
 stdin = new PassThrough();
@@ -14,7 +14,7 @@ import test from "node:test";
 import assert from "node:assert/strict";
 import { resolve } from "node:path";

-const bridge = await import("../web/bridge-service.ts");
+const bridge = await import("../../web/bridge-service.ts");

 test("resolveBridgeRuntimeConfig uses GSD_WEB_PACKAGE_ROOT when set", () => {
 const env = {
@@ -8,10 +8,10 @@ import { PassThrough } from "node:stream";
 import { StringDecoder } from "node:string_decoder";

 const repoRoot = process.cwd();
-const bridge = await import("../web/bridge-service.ts");
-const streamRoute = await import("../../web/app/api/bridge-terminal/stream/route.ts");
-const inputRoute = await import("../../web/app/api/bridge-terminal/input/route.ts");
-const resizeRoute = await import("../../web/app/api/bridge-terminal/resize/route.ts");
+const bridge = await import("../../web/bridge-service.ts");
+const streamRoute = await import("../../../web/app/api/bridge-terminal/stream/route.ts");
+const inputRoute = await import("../../../web/app/api/bridge-terminal/input/route.ts");
+const resizeRoute = await import("../../../web/app/api/bridge-terminal/resize/route.ts");

 class FakeRpcChild extends EventEmitter {
 stdin = new PassThrough();
@@ -5,7 +5,7 @@ import { join } from "node:path";
 import { tmpdir } from "node:os";
 import { pathToFileURL } from "node:url";

-const { resolveGsdCliEntry } = await import("../web/cli-entry.ts");
+const { resolveGsdCliEntry } = await import("../../web/cli-entry.ts");

 function makeFixture(paths: string[]): string {
 const root = mkdtempSync(join(tmpdir(), "gsd-cli-entry-"));
@@ -3,19 +3,19 @@ import assert from "node:assert/strict"
 import { readFileSync } from "node:fs"
 import { resolve } from "node:path"

-const { BUILTIN_SLASH_COMMANDS } = await import("../../packages/pi-coding-agent/src/core/slash-commands.ts")
+const { BUILTIN_SLASH_COMMANDS } = await import("../../../packages/pi-coding-agent/src/core/slash-commands.ts")
 const {
 dispatchBrowserSlashCommand,
 getBrowserSlashCommandTerminalNotice,
-} = await import("../../web/lib/browser-slash-command-dispatch.ts")
+} = await import("../../../web/lib/browser-slash-command-dispatch.ts")
 const {
 applyCommandSurfaceActionResult,
 createInitialCommandSurfaceState,
 openCommandSurfaceState,
 setCommandSurfacePending,
 surfaceOutcomeToOpenRequest,
-} = await import("../../web/lib/command-surface-contract.ts")
-const gsdExtension = await import("../resources/extensions/gsd/index.ts")
+} = await import("../../../web/lib/command-surface-contract.ts")
+const gsdExtension = await import("../../resources/extensions/gsd/index.ts")

 const EXPECTED_BUILTIN_OUTCOMES = new Map<string, "rpc" | "surface" | "reject">([
 ["settings", "surface"],
@@ -680,7 +680,7 @@ test("surface action state keeps compaction summaries inspectable", () => {
 })

 test("command-surface session affordances use the shared store action path", () => {
-const commandSurfacePath = resolve(import.meta.dirname, "../../web/components/gsd/command-surface.tsx")
+const commandSurfacePath = resolve(import.meta.dirname, "../../../web/components/gsd/command-surface.tsx")
 const commandSurfaceSource = readFileSync(commandSurfacePath, "utf-8")

 assert.match(
@@ -25,18 +25,18 @@ import type {
 SkillHealthReport,
 SkillHealthEntry,
 SkillHealSuggestion,
-} from "../../web/lib/diagnostics-types.ts"
+} from "../../../web/lib/diagnostics-types.ts"

 const {
 createInitialCommandSurfaceState,
 commandSurfaceSectionForRequest,
-} = await import("../../web/lib/command-surface-contract.ts")
+} = await import("../../../web/lib/command-surface-contract.ts")

 const {
 dispatchBrowserSlashCommand,
-} = await import("../../web/lib/browser-slash-command-dispatch.ts")
+} = await import("../../../web/lib/browser-slash-command-dispatch.ts")

-const { GSDWorkspaceStore } = await import("../../web/lib/gsd-workspace-store.tsx")
+const { GSDWorkspaceStore } = await import("../../../web/lib/gsd-workspace-store.tsx")

 // ─── Block 1: Type exports (R103, R104, R105) ───────────────────────────────
@@ -8,11 +8,11 @@ import { PassThrough } from "node:stream";
 import { StringDecoder } from "node:string_decoder";

 const repoRoot = process.cwd();
-const bridge = await import("../web/bridge-service.ts");
-const onboarding = await import("../web/onboarding-service.ts");
+const bridge = await import("../../web/bridge-service.ts");
+const onboarding = await import("../../web/onboarding-service.ts");
 const { AuthStorage } = await import("@gsd/pi-coding-agent");
-const commandRoute = await import("../../web/app/api/session/command/route.ts");
-const eventsRoute = await import("../../web/app/api/session/events/route.ts");
+const commandRoute = await import("../../../web/app/api/session/command/route.ts");
+const eventsRoute = await import("../../../web/app/api/session/events/route.ts");

 // ---------------------------------------------------------------------------
 // Test infrastructure (reused from web-bridge-contract.test.ts)
@@ -8,13 +8,13 @@ import { PassThrough } from "node:stream";
 import { StringDecoder } from "node:string_decoder";

 const repoRoot = process.cwd();
-const bridge = await import("../web/bridge-service.ts");
-const onboarding = await import("../web/onboarding-service.ts");
+const bridge = await import("../../web/bridge-service.ts");
+const onboarding = await import("../../web/onboarding-service.ts");
 const { AuthStorage } = await import("@gsd/pi-coding-agent");
-const commandRoute = await import("../../web/app/api/session/command/route.ts");
-const manageRoute = await import("../../web/app/api/session/manage/route.ts");
-const eventsRoute = await import("../../web/app/api/session/events/route.ts");
-const liveStateRoute = await import("../../web/app/api/live-state/route.ts");
+const commandRoute = await import("../../../web/app/api/session/command/route.ts");
+const manageRoute = await import("../../../web/app/api/session/manage/route.ts");
+const eventsRoute = await import("../../../web/app/api/session/events/route.ts");
+const liveStateRoute = await import("../../../web/app/api/live-state/route.ts");

 class FakeRpcChild extends EventEmitter {
 stdin = new PassThrough();
@@ -6,8 +6,8 @@ import { tmpdir } from 'node:os'

 const projectRoot = process.cwd()

-const cliWeb = await import('../cli-web-branch.ts')
-const webMode = await import('../web-mode.ts')
+const cliWeb = await import('../../cli-web-branch.ts')
+const webMode = await import('../../web-mode.ts')

 test('parseCliArgs recognizes --web explicitly', () => {
 const flags = cliWeb.parseCliArgs(['node', 'dist/loader.js', '--web'])
@@ -4,8 +4,8 @@ import { mkdirSync, mkdtempSync, rmSync, writeFileSync } from 'node:fs'
 import { join } from 'node:path'
 import { tmpdir } from 'node:os'

-const cliWeb = await import('../cli-web-branch.ts')
-const webMode = await import('../web-mode.ts')
+const cliWeb = await import('../../cli-web-branch.ts')
+const webMode = await import('../../web-mode.ts')

 // ─── CLI flag parsing ────────────────────────────────────────────────
@@ -8,7 +8,7 @@ import { PassThrough } from "node:stream";
 import { StringDecoder } from "node:string_decoder";

 const repoRoot = process.cwd();
-const bridge = await import("../web/bridge-service.ts");
+const bridge = await import("../../web/bridge-service.ts");

 // ---------------------------------------------------------------------------
 // Helpers (same shape as web-bridge-contract.test.ts)
@@ -8,11 +8,11 @@ import { PassThrough } from "node:stream";
 import { StringDecoder } from "node:string_decoder";

 const repoRoot = process.cwd();
-const bridge = await import("../web/bridge-service.ts");
-const onboarding = await import("../web/onboarding-service.ts");
-const bootRoute = await import("../../web/app/api/boot/route.ts");
-const onboardingRoute = await import("../../web/app/api/onboarding/route.ts");
-const commandRoute = await import("../../web/app/api/session/command/route.ts");
+const bridge = await import("../../web/bridge-service.ts");
+const onboarding = await import("../../web/onboarding-service.ts");
+const bootRoute = await import("../../../web/app/api/boot/route.ts");
+const onboardingRoute = await import("../../../web/app/api/onboarding/route.ts");
+const commandRoute = await import("../../../web/app/api/session/command/route.ts");
 const { AuthStorage } = await import("@gsd/pi-coding-agent");

 const ONBOARDING_ENV_KEYS = [
@@ -1,7 +1,7 @@
 import test from "node:test"
 import assert from "node:assert/strict"

-const { getOnboardingPresentation } = await import("../../web/lib/gsd-workspace-store.tsx")
+const { getOnboardingPresentation } = await import("../../../web/lib/gsd-workspace-store.tsx")

 function makeOnboardingState(overrides: Record<string, unknown> = {}) {
 return {
@@ -4,8 +4,9 @@ import { mkdtempSync, mkdirSync, rmSync, writeFileSync } from "node:fs";
 import { tmpdir } from "node:os";
 import { basename, join } from "node:path";

-import { discoverProjects } from "../web/project-discovery-service.ts";
-import { detectMonorepo } from "../web/bridge-service.ts";
+
+import { discoverProjects } from "../../web/project-discovery-service.ts";
+import { detectMonorepo } from "../../web/bridge-service.ts";

 // ---------------------------------------------------------------------------
 // Fixture setup — standard multi-project root
@@ -1,7 +1,7 @@
 import test from "node:test"
 import assert from "node:assert/strict"

-import { buildProjectAbsoluteUrl, buildProjectPath } from "../../web/lib/project-url.ts"
+import { buildProjectAbsoluteUrl, buildProjectPath } from "../../../web/lib/project-url.ts"

 test("buildProjectPath leaves non-project routes unchanged", () => {
 assert.equal(buildProjectPath("/api/terminal/input"), "/api/terminal/input")
@@ -8,8 +8,8 @@ import { PassThrough } from "node:stream"
 import { StringDecoder } from "node:string_decoder"

 const repoRoot = process.cwd()
-const bridge = await import("../web/bridge-service.ts")
-const recoveryRoute = await import("../../web/app/api/recovery/route.ts")
+const bridge = await import("../../web/bridge-service.ts")
+const recoveryRoute = await import("../../../web/app/api/recovery/route.ts")

 class FakeRpcChild extends EventEmitter {
 stdin = new PassThrough()
@@ -10,7 +10,7 @@ import assert from 'node:assert/strict'
 import { readFileSync } from 'node:fs'
 import { resolve } from 'node:path'

-const WEB_ROOT = resolve(import.meta.dirname, '../../web')
+const WEB_ROOT = resolve(import.meta.dirname, '../../../web')

 function readComponent(relativePath: string): string {
 return readFileSync(resolve(WEB_ROOT, relativePath), 'utf-8')
@@ -9,11 +9,11 @@ import { PassThrough } from "node:stream"
 import { StringDecoder } from "node:string_decoder"

 const repoRoot = process.cwd()
-const bridge = await import("../web/bridge-service.ts")
-const onboarding = await import("../web/onboarding-service.ts")
-const browserRoute = await import("../../web/app/api/session/browser/route.ts")
-const manageRoute = await import("../../web/app/api/session/manage/route.ts")
-const gitRoute = await import("../../web/app/api/git/route.ts")
+const bridge = await import("../../web/bridge-service.ts")
+const onboarding = await import("../../web/onboarding-service.ts")
+const browserRoute = await import("../../../web/app/api/session/browser/route.ts")
+const manageRoute = await import("../../../web/app/api/session/manage/route.ts")
+const gitRoute = await import("../../../web/app/api/git/route.ts")
 const { AuthStorage } = await import("@gsd/pi-coding-agent")

 class FakeRpcChild extends EventEmitter {
@@ -635,12 +635,12 @@ test("/api/git exposes an explicit not-a-repo state instead of failing silently"
 })

 test("browser session, settings, and git surfaces keep inspectable browse/manage/state markers on the shared surface", () => {
-const rpcTypesSource = readFileSync(resolve(import.meta.dirname, "../../packages/pi-coding-agent/src/modes/rpc/rpc-types.ts"), "utf8")
-const contractSource = readFileSync(resolve(import.meta.dirname, "../../web/lib/command-surface-contract.ts"), "utf8")
-const storeSource = readFileSync(resolve(import.meta.dirname, "../../web/lib/gsd-workspace-store.tsx"), "utf8")
-const surfaceSource = readFileSync(resolve(import.meta.dirname, "../../web/components/gsd/command-surface.tsx"), "utf8")
-const sidebarSource = readFileSync(resolve(import.meta.dirname, "../../web/components/gsd/sidebar.tsx"), "utf8")
-const gitRouteSource = readFileSync(resolve(import.meta.dirname, "../../web/app/api/git/route.ts"), "utf8")
+const rpcTypesSource = readFileSync(resolve(import.meta.dirname, "../../../packages/pi-coding-agent/src/modes/rpc/rpc-types.ts"), "utf8")
+const contractSource = readFileSync(resolve(import.meta.dirname, "../../../web/lib/command-surface-contract.ts"), "utf8")
+const storeSource = readFileSync(resolve(import.meta.dirname, "../../../web/lib/gsd-workspace-store.tsx"), "utf8")
+const surfaceSource = readFileSync(resolve(import.meta.dirname, "../../../web/components/gsd/command-surface.tsx"), "utf8")
+const sidebarSource = readFileSync(resolve(import.meta.dirname, "../../../web/components/gsd/sidebar.tsx"), "utf8")
+const gitRouteSource = readFileSync(resolve(import.meta.dirname, "../../../web/app/api/git/route.ts"), "utf8")

 assert.match(rpcTypesSource, /autoRetryEnabled: boolean/, "rpc-types.ts must expose retry-enabled state in get_state")
 assert.match(rpcTypesSource, /retryInProgress: boolean/, "rpc-types.ts must expose retry-in-progress state in get_state")
@@ -6,12 +6,12 @@ import { join, resolve } from "node:path";

 // ─── Imports ──────────────────────────────────────────────────────────
 const workspaceIndex = await import(
-"../resources/extensions/gsd/workspace-index.ts"
+"../../resources/extensions/gsd/workspace-index.ts"
 );
-const filesRoute = await import("../../web/app/api/files/route.ts");
+const filesRoute = await import("../../../web/app/api/files/route.ts");

 // Re-import status helpers from the web-side module
-const workspaceStatus = await import("../../web/lib/workspace-status.ts");
+const workspaceStatus = await import("../../../web/lib/workspace-status.ts");

 // ─── Helpers ──────────────────────────────────────────────────────────
 function makeGsdFixture(): { root: string; gsdDir: string; cleanup: () => void } {
@@ -384,11 +384,11 @@ const MOCK_DATA_PATTERNS = [
 /\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}.*Z["'](?:.*,\s*$)/m, // hardcoded ISO timestamps in array literals
 ];

-const webRoot = resolve(import.meta.dirname, "../../web");
+const webRoot = resolve(import.meta.dirname, "../../../web");

 test("view components contain no static mock data arrays", () => {
 for (const filePath of VIEW_FILES) {
-const fullPath = resolve(import.meta.dirname, "../..", filePath);
+const fullPath = resolve(import.meta.dirname, "../../..", filePath);
 const source = readFileSync(fullPath, "utf-8");
 for (const pattern of MOCK_DATA_PATTERNS) {
 const match = source.match(pattern);
@@ -416,7 +416,7 @@ test("view components read from real data sources (store or API)", () => {
 ];

 for (const filePath of STORE_VIEWS) {
-const fullPath = resolve(import.meta.dirname, "../..", filePath);
+const fullPath = resolve(import.meta.dirname, "../../..", filePath);
 const source = readFileSync(fullPath, "utf-8");
 assert.ok(
 source.includes("gsd-workspace-store"),
@@ -425,7 +425,7 @@ test("view components read from real data sources (store or API)", () => {
 }

 for (const { path: filePath, apiPattern } of API_VIEWS) {
-const fullPath = resolve(import.meta.dirname, "../..", filePath);
+const fullPath = resolve(import.meta.dirname, "../../..", filePath);
 const source = readFileSync(fullPath, "utf-8");
 assert.ok(
 source.includes(apiPattern),
@@ -438,7 +438,7 @@ test("view components read from real data sources (store or API)", () => {
 // from the dashboard. Live signals are visible in the terminal/power mode instead.

 test("status bar consumes statusTexts from store", () => {
-const statusBarPath = resolve(import.meta.dirname, "../../web/components/gsd/status-bar.tsx");
+const statusBarPath = resolve(import.meta.dirname, "../../../web/components/gsd/status-bar.tsx");
 const source = readFileSync(statusBarPath, "utf-8");

 assert.ok(
@@ -452,10 +452,10 @@ test("status bar consumes statusTexts from store", () => {
 });

 test("browser shell renders title overrides, widgets, and editor prefills from store-backed state", () => {
-const storePath = resolve(import.meta.dirname, "../../web/lib/gsd-workspace-store.tsx");
-const appShellPath = resolve(import.meta.dirname, "../../web/components/gsd/app-shell.tsx");
-const statusBarPath = resolve(import.meta.dirname, "../../web/components/gsd/status-bar.tsx");
-const terminalPath = resolve(import.meta.dirname, "../../web/components/gsd/terminal.tsx");
+const storePath = resolve(import.meta.dirname, "../../../web/lib/gsd-workspace-store.tsx");
+const appShellPath = resolve(import.meta.dirname, "../../../web/components/gsd/app-shell.tsx");
+const statusBarPath = resolve(import.meta.dirname, "../../../web/components/gsd/status-bar.tsx");
+const terminalPath = resolve(import.meta.dirname, "../../../web/components/gsd/terminal.tsx");

 const storeSource = readFileSync(storePath, "utf-8");
 const appShellSource = readFileSync(appShellPath, "utf-8");
@@ -478,7 +478,7 @@ test("browser shell renders title overrides, widgets, and editor prefills from s
 });

 test("terminal consumes activeToolExecution from store", () => {
-const terminalPath = resolve(import.meta.dirname, "../../web/components/gsd/terminal.tsx");
+const terminalPath = resolve(import.meta.dirname, "../../../web/components/gsd/terminal.tsx");
 const source = readFileSync(terminalPath, "utf-8");

 assert.ok(
@@ -488,12 +488,12 @@ test("terminal consumes activeToolExecution from store", () => {
 });

 test("live browser panels consume live selectors and expose inspectable freshness markers", () => {
-const contractPath = resolve(import.meta.dirname, "../../web/lib/command-surface-contract.ts")
-const storePath = resolve(import.meta.dirname, "../../web/lib/gsd-workspace-store.tsx")
-const dashboardPath = resolve(import.meta.dirname, "../../web/components/gsd/dashboard.tsx")
-const sidebarPath = resolve(import.meta.dirname, "../../web/components/gsd/sidebar.tsx")
-const roadmapPath = resolve(import.meta.dirname, "../../web/components/gsd/roadmap.tsx")
-const statusBarPath = resolve(import.meta.dirname, "../../web/components/gsd/status-bar.tsx")
+const contractPath = resolve(import.meta.dirname, "../../../web/lib/command-surface-contract.ts")
+const storePath = resolve(import.meta.dirname, "../../../web/lib/gsd-workspace-store.tsx")
+const dashboardPath = resolve(import.meta.dirname, "../../../web/components/gsd/dashboard.tsx")
+const sidebarPath = resolve(import.meta.dirname, "../../../web/components/gsd/sidebar.tsx")
+const roadmapPath = resolve(import.meta.dirname, "../../../web/components/gsd/roadmap.tsx")
+const statusBarPath = resolve(import.meta.dirname, "../../../web/components/gsd/status-bar.tsx")

 const contractSource = readFileSync(contractPath, "utf-8")
 const storeSource = readFileSync(storePath, "utf-8")
@@ -528,9 +528,9 @@ test("live browser panels consume live selectors and expose inspectable freshnes
 })

 test("workflow action surfaces route new-milestone CTAs through the shared command path", () => {
-  const dashboardPath = resolve(import.meta.dirname, "../../web/components/gsd/dashboard.tsx")
-  const sidebarPath = resolve(import.meta.dirname, "../../web/components/gsd/sidebar.tsx")
-  const chatPath = resolve(import.meta.dirname, "../../web/components/gsd/chat-mode.tsx")
+  const dashboardPath = resolve(import.meta.dirname, "../../../web/components/gsd/dashboard.tsx")
+  const sidebarPath = resolve(import.meta.dirname, "../../../web/components/gsd/sidebar.tsx")
+  const chatPath = resolve(import.meta.dirname, "../../../web/components/gsd/chat-mode.tsx")

   const dashboardSource = readFileSync(dashboardPath, "utf-8")
   const sidebarSource = readFileSync(sidebarPath, "utf-8")
@@ -549,10 +549,10 @@ test("workflow action surfaces route new-milestone CTAs through the shared comma
 })

 test("sidebar Git affordance opens a real git-summary surface with visible repo/not-repo/error states", () => {
-  const contractPath = resolve(import.meta.dirname, "../../web/lib/command-surface-contract.ts");
-  const storePath = resolve(import.meta.dirname, "../../web/lib/gsd-workspace-store.tsx");
-  const surfacePath = resolve(import.meta.dirname, "../../web/components/gsd/command-surface.tsx");
-  const sidebarPath = resolve(import.meta.dirname, "../../web/components/gsd/sidebar.tsx");
+  const contractPath = resolve(import.meta.dirname, "../../../web/lib/command-surface-contract.ts");
+  const storePath = resolve(import.meta.dirname, "../../../web/lib/gsd-workspace-store.tsx");
+  const surfacePath = resolve(import.meta.dirname, "../../../web/components/gsd/command-surface.tsx");
+  const sidebarPath = resolve(import.meta.dirname, "../../../web/components/gsd/sidebar.tsx");

   const contractSource = readFileSync(contractPath, "utf-8");
   const storeSource = readFileSync(storePath, "utf-8");
@@ -573,11 +573,11 @@ test("sidebar Git affordance opens a real git-summary surface with visible repo/
 });

 test("recovery diagnostics surface stays on a dedicated route with explicit stale and action state", () => {
-  const contractPath = resolve(import.meta.dirname, "../../web/lib/command-surface-contract.ts");
-  const storePath = resolve(import.meta.dirname, "../../web/lib/gsd-workspace-store.tsx");
-  const surfacePath = resolve(import.meta.dirname, "../../web/components/gsd/command-surface.tsx");
-  const dashboardPath = resolve(import.meta.dirname, "../../web/components/gsd/dashboard.tsx");
-  const sidebarPath = resolve(import.meta.dirname, "../../web/components/gsd/sidebar.tsx");
+  const contractPath = resolve(import.meta.dirname, "../../../web/lib/command-surface-contract.ts");
+  const storePath = resolve(import.meta.dirname, "../../../web/lib/gsd-workspace-store.tsx");
+  const surfacePath = resolve(import.meta.dirname, "../../../web/components/gsd/command-surface.tsx");
+  const dashboardPath = resolve(import.meta.dirname, "../../../web/components/gsd/dashboard.tsx");
+  const sidebarPath = resolve(import.meta.dirname, "../../../web/components/gsd/sidebar.tsx");

   const contractSource = readFileSync(contractPath, "utf-8");
   const storeSource = readFileSync(storePath, "utf-8");
@@ -5,7 +5,7 @@ import { join } from "node:path"
 import {
   isUnderNodeModules,
   resolveSubprocessModule,
-} from "../web/ts-subprocess-flags.ts"
+} from "../../web/ts-subprocess-flags.ts"

 // ---------------------------------------------------------------------------
 // isUnderNodeModules — exported utility
@@ -1,8 +1,8 @@
 import test from "node:test";
 import assert from "node:assert/strict";

-const sessionsRoute = await import("../../web/app/api/terminal/sessions/route.ts");
-const streamRoute = await import("../../web/app/api/terminal/stream/route.ts");
+const sessionsRoute = await import("../../../web/app/api/terminal/sessions/route.ts");
+const streamRoute = await import("../../../web/app/api/terminal/stream/route.ts");

 test("terminal session creation rejects disallowed commands", async () => {
   const response = await sessionsRoute.POST(
@@ -5,7 +5,7 @@ const {
   derivePendingWorkflowCommandLabel,
   executeWorkflowActionInPowerMode,
   navigateToGSDView,
-} = await import("../../web/lib/workflow-action-execution.ts")
+} = await import("../../../web/lib/workflow-action-execution.ts")

 test("derivePendingWorkflowCommandLabel prefers the latest input line while a command is in flight", () => {
   const label = derivePendingWorkflowCommandLabel({
@@ -2,7 +2,7 @@ import test from "node:test";
 import assert from "node:assert/strict";

 // ─── Import ──────────────────────────────────────────────────────────
-const { deriveWorkflowAction } = await import("../../web/lib/workflow-actions.ts");
+const { deriveWorkflowAction } = await import("../../../web/lib/workflow-actions.ts");

 // ─── Helpers ──────────────────────────────────────────────────────────
 function baseInput(overrides: Partial<Parameters<typeof deriveWorkflowAction>[0]> = {}) {
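Every hunk above is the same mechanical change: the tests moved one directory deeper (src/tests/ → src/tests/integration/), so each relative specifier gains exactly one `../` segment. A minimal sketch of why the old and new specifiers resolve to the same file from their respective directories — the absolute paths here are illustrative, not the repository's real layout:

```typescript
import { resolve } from "node:path";

// Hypothetical directories standing in for the real repo layout.
const oldTestDir = "/repo/src/tests";             // where the tests used to run from
const newTestDir = "/repo/src/tests/integration"; // where they run from after the move

// From the old location, two "../" hops reach the repo root, then descend into web/:
const fromOld = resolve(oldTestDir, "../../web/lib/workflow-actions.ts");
// From the deeper location, the same target needs three hops:
const fromNew = resolve(newTestDir, "../../../web/lib/workflow-actions.ts");

console.log(fromOld === fromNew); // true: both normalize to the same absolute path
```

Because `path.resolve` normalizes the `..` segments against the base directory, the only fix available is adding one hop per specifier, which is why the change touches every `resolve(import.meta.dirname, …)` and dynamic `import(…)` call individually rather than a shared helper.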