singularity-forge/packages
Mikael Hugo ccdd3027ab perf(read): stream lines when offset/limit provided to avoid loading entire file
When offset or limit is specified, use Node.js readline streaming instead of
loading the entire file into memory. This fixes the issue where, for large
files (>50KB), the read tool would return truncated content even when only a
small slice was requested.

- Add readLinesStreamed() for memory-efficient line reading
- Add countLines() for total line count without full read
- Use streaming path when offset !== undefined || limit !== undefined
- Keep existing full-file read path when no offset/limit specified
- Add tests for streaming behavior with large files

Fixes the long-standing issue where reading large files like src/headless.ts
(~50KB) with offset/limit would still hit truncation limits.
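The streaming path described above can be sketched as follows. This is a minimal illustration, not the actual implementation: the name `readLinesStreamed` comes from the commit message, but its signature and internals here are assumptions, built only on Node's standard `node:fs` and `node:readline` modules.

```typescript
import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

// Hypothetical sketch: read lines [offset, offset + limit) from a file
// without buffering the whole file in memory.
async function readLinesStreamed(
  path: string,
  offset = 0,
  limit = Infinity,
): Promise<string[]> {
  const rl = createInterface({
    input: createReadStream(path, { encoding: "utf8" }),
    crlfDelay: Infinity, // treat \r\n as a single line break
  });
  const lines: string[] = [];
  let index = 0;
  for await (const line of rl) {
    if (index >= offset && lines.length < limit) {
      lines.push(line);
    }
    index++;
    if (lines.length >= limit) {
      rl.close(); // stop consuming the stream once the slice is complete
      break;
    }
  }
  return lines;
}
```

Closing the readline interface as soon as the requested slice is full is what keeps memory and I/O proportional to the slice rather than to the file size.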
2026-05-04 15:20:16 +02:00
daemon fix(sf): add sf-dev batch server command 2026-05-02 22:44:14 +02:00
mcp-server fix(sf): harden auto loops and skill sandbox 2026-05-02 19:46:36 +02:00
native chore: purge bun from internal toolchain 2026-05-02 08:38:20 +02:00
pi-agent-core fix(gemini): keep cli tools in pi harness 2026-05-02 13:32:05 +02:00
pi-ai fix(sf): recover model routes and self-feedback 2026-05-02 22:07:10 +02:00
pi-coding-agent perf(read): stream lines when offset/limit provided to avoid loading entire file 2026-05-04 15:20:16 +02:00
pi-tui sf snapshot: pre-dispatch, uncommitted changes after 32m inactivity 2026-05-02 11:25:51 +02:00
rpc-client fix(sf): recover model routes and self-feedback 2026-05-02 22:07:10 +02:00