feat(types): M8 IRONCLAD — eliminate all wildcard casts #40
flyingrobots wants to merge 4 commits into `main`.
Conversation
…n + infra Remove all TODO(ts-cleanup) tagged @type {*} / @type {any} casts from src/domain/warp/, src/domain/services/, and src/infrastructure/adapters/. Replaced with role-specific persistence types (PersistenceReader, PersistenceWriter, CheckpointPersistence, IndexStorage), error narrowing helpers (isError, hasErrorCode, hasMessage), and properly typed aliases. Adds Zod validation to HttpSyncServer constructor, assertion typing to SyncAuthService._validateKeys, ambient Bun/Deno types in globals.d.ts, consumer smoke test in test/type-check/, and type-surface manifest in contracts/. All gates pass: typecheck, typecheck:policy, typecheck:consumer, lint, 3799/3799 unit tests including no-coordination regression suite.
Tier 1 quick wins (5 files):
- scripts/setup-hooks.js: error narrowing via instanceof
- BunHttpAdapter.js: BunServer typedef for server state
- roaring.js: RoaringModule typedef for lazy singleton
- WarpGraph.js: CheckpointPersistence cast for audit service
- text.js: unknown/Record types for formatDelta/formatOpSummaryPlain

Tier 2 catch clauses (6 files):
- doctor/checks.js: 7 catch clauses + internalError param + buildHookFinding param
- doctor/index.js: 1 catch clause + error narrowing
- verify-audit.js: catch clause + instanceof Error narrowing
- path.js: catch clause + code property narrowing
- view.js: catch clause + typed errObj intermediate
- install-hooks.js: catch clause + ENOENT narrowing
… casts — Zero `TODO(ts-cleanup)` tags, `@type {*}`, `@type {any}`, `@returns {*}`, and `@returns {any}` remain in `src/`, `bin/`, or `test/`. Narrowed CLI types (QueryBuilderLike, WarpGraphInstance, Persistence), made MinHeap generic, typed presenters/json.js, and replaced test mock wildcards with proper vitest Mock types. All 3,799 tests pass.
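The error-narrowing helpers named in the description (`isError`, `hasErrorCode`, `hasMessage`) follow the standard JSDoc type-guard pattern. A minimal sketch of what such guards typically look like — the names come from this PR, but the exact signatures here are assumptions, not the repo's implementation:

```js
/**
 * @param {unknown} err
 * @returns {err is Error}
 */
export function isError(err) {
  return err instanceof Error;
}

/**
 * @param {unknown} err
 * @param {string} code
 * @returns {boolean}
 */
export function hasErrorCode(err, code) {
  // Narrow via `in` checks instead of a wildcard cast.
  return (
    typeof err === 'object' &&
    err !== null &&
    'code' in err &&
    /** @type {Record<string, unknown>} */ (err).code === code
  );
}

/**
 * @param {unknown} err
 * @returns {err is { message: string }}
 */
export function hasMessage(err) {
  return (
    typeof err === 'object' &&
    err !== null &&
    'message' in err &&
    typeof (/** @type {Record<string, unknown>} */ (err).message) === 'string'
  );
}
```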
📝 Walkthrough
Large-scale typing and validation hardening across the codebase: adds consumer type-check, introduces PatchV2/PatchBuilderV2 surface and extensive .d.ts/manifest updates, replaces loose wildcard/any types with precise typedefs, centralizes error guards, adds Zod config validation for HttpSyncServer, and tightens many JSDoc signatures.
Sequence Diagram(s)

```mermaid
sequenceDiagram
participant Client as "Client"
participant Http as "HttpSyncServer\n(options validated via Zod)"
participant Auth as "SyncAuthService"
participant Graph as "WarpGraph"
participant Persistence as "CorePersistence\n(Commit/Blob/Tree/Ref)"
rect rgba(200,200,255,0.5)
Client->>Http: HTTP sync request
Http->>Auth: validate/auth headers
Auth-->>Http: auth verdict
Http->>Graph: processSyncRequest(request)
Graph->>Persistence: read patches / load data
Persistence-->>Graph: patch data / frontier
Graph-->>Http: sync response / status
Http-->>Client: HTTP response
    end
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~50 minutes
🚥 Pre-merge checks: ✅ 3 of 3 passed
Actionable comments posted: 8
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (3)
src/domain/services/SyncAuthService.js (1)
172-185: ⚠️ Potential issue | 🟡 Minor — JSDoc marks `options.keys` optional but runtime requires it.
This can mislead consumers into thinking `keys` is optional. Consider keeping `options` optional but `options.keys` required when provided.
✍️ Suggested JSDoc fix
```diff
- * @param {Record<string, string>} [options.keys] - Key-id to secret mapping
+ * @param {Record<string, string>} options.keys - Key-id to secret mapping
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/SyncAuthService.js` around lines 172 - 185, The JSDoc incorrectly marks options.keys as optional while the constructor calls _validateKeys(keys) and requires it; update the docs and add an explicit runtime check: in SyncAuthService.constructor ensure you validate presence of keys before calling _validateKeys (throw a clear error like "options.keys is required" if missing) and update the `@param` for options.keys to remove the optional brackets (make it required) so the JSDoc matches runtime behavior; reference SyncAuthService, constructor, and _validateKeys when making these changes.
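As a companion to that doc fix, a minimal sketch combining the required `options.keys` JSDoc with an explicit runtime guard — the constructor body is illustrative only, not the actual SyncAuthService implementation:

```js
class SyncAuthService {
  /**
   * @param {object} options
   * @param {Record<string, string>} options.keys - Key-id to secret mapping (required)
   */
  constructor(options) {
    if (!options || !options.keys) {
      throw new Error('SyncAuthService: options.keys is required');
    }
    this._validateKeys(options.keys);
    this._keys = options.keys;
  }

  /**
   * @param {Record<string, string>} keys
   * @returns {void}
   */
  _validateKeys(keys) {
    for (const [id, secret] of Object.entries(keys)) {
      if (typeof secret !== 'string' || secret.length === 0) {
        throw new Error(`SyncAuthService: invalid secret for key "${id}"`);
      }
    }
  }
}
```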
bin/cli/commands/verify-audit.js (1)
49-50: ⚠️ Potential issue | 🟡 Minor — Missed wildcard in
`@returns` JSDoc.
`{payload: *, exitCode: number}` still carries a `*` cast — at odds with this PR's stated goal of zero wildcards in `src`/`bin`. The payload type can be narrowed to `Record<string, unknown>` to match the declared type of `payload` at line 62.
🛠️ Proposed fix
```diff
- * @returns {Promise<{payload: *, exitCode: number}>}
+ * @returns {Promise<{payload: Record<string, unknown>, exitCode: number}>}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/verify-audit.js` around lines 49 - 50, Update the JSDoc for the function in verify-audit.js so the `@returns` payload type is not a wildcard; replace the "*" with "Record<string, unknown>" to match the runtime/declared type of payload (the payload variable used later in the function) and keep exitCode as number so the `@returns` reads {payload: Record<string, unknown>, exitCode: number}.bin/cli/commands/tree.js (1)
83-83: ⚠️ Potential issue | 🟡 Minor — Replace all `Record<string, *>` with `Record<string, unknown>` and fix return type wildcard
Lines 83 and 105 use `Record<string, *>`, which should be `Record<string, unknown>` to align with the codebase cleanup pattern seen throughout the `bin/` directory. Additionally, line 154's return type uses `payload: *`, which also needs the same replacement.
♻️ Suggested replacements
/** - * `@param` {Record<string, *>} nodeProps + * `@param` {Record<string, unknown>} nodeProps- * `@param` {Map<string, Record<string, *>>} params.propsMap + * `@param` {Map<string, Record<string, unknown>>} params.propsMap- * `@returns` {Promise<{payload: *, exitCode: number}>} + * `@returns` {Promise<{payload: unknown, exitCode: number}>}🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/tree.js` at line 83, Replace all occurrences of the JSDoc wildcard type `Record<string, *>` with `Record<string, unknown>` and change any `payload: *` return-type wildcards to `payload: unknown`; specifically update the JSDoc for the `nodeProps` parameter (the `@param {Record<string, *>} nodeProps` entry) and the function return/type annotation that currently uses `payload: *` so they read `Record<string, unknown>` and `payload: unknown` respectively to match the codebase convention.
🧹 Nitpick comments (28)
src/domain/services/MessageCodecInternal.js (2)
69-70: Type annotation is documentary only — effective type is `any`
Because the import on line 16 is guarded by `@ts-expect-error` (no `.d.ts` for `@git-stunts/trailer-codec`), `TrailerCodec` resolves to `any` in the type checker, making `@type {TrailerCodec|null}` equivalent to `any` for enforcement purposes. The annotation is still worthwhile as intent documentation and will gain enforcement value if/when the library ships declarations. No action required now.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/MessageCodecInternal.js` around lines 69 - 70, The JSDoc type on the module-scoped variable _codec (/** `@type` {TrailerCodec|null} */) is only documentary because the import of TrailerCodec is guarded with `@ts-expect-error` and thus resolves to any in the checker; no code change is required—leave the annotation in place for intent documentation and future enforcement once `@git-stunts/trailer-codec` provides declarations.
76-82: `getCodec()` return will become a TS error once the library ships declarations
TypeScript does not narrow module-level mutable variables — after the `if (!_codec)` guard, `_codec` is still typed as `TrailerCodec|null` at the `return` site, conflicting with the `@returns {TrailerCodec}` promise. This is currently masked because the import resolves to `any`, but it will surface the moment `@git-stunts/trailer-codec` ships a `.d.ts`.
♻️ Forward-compatible fix using a local variable
```diff
 export function getCodec() {
   if (!_codec) {
     const service = new TrailerCodecService();
     _codec = new TrailerCodec({ service });
   }
-  return _codec;
+  const codec = _codec;
+  return codec;
```
Or, more tersely, with a non-null assertion JSDoc cast:
```diff
-  return _codec;
+  return /** @type {TrailerCodec} */ (_codec);
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/MessageCodecInternal.js` around lines 76 - 82, getCodec() returns _codec which is typed as TrailerCodec|null at module scope, causing a TS declaration error; fix by assigning the newly created instance to a local non-null-typed variable and return that (or cast `_codec` to non-null via a JSDoc non-null assertion) so the return type is guaranteed TrailerCodec. Specifically, inside getCodec() use a local const (e.g. const codec = new TrailerCodec({ service })) then set _codec = codec and return codec, referencing getCodec, _codec, TrailerCodecService, and TrailerCodec to locate the change.
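For readers less familiar with that narrowing limitation, a standalone toy version of the lazy-singleton pattern (deliberately not the real codec module) showing the local-snapshot workaround the comment proposes:

```js
/** @type {{ encode: (s: string) => string } | null} */
let _codec = null;

export function getCodec() {
  if (!_codec) {
    _codec = { encode: (s) => s.trim() };
  }
  // Returning a local constant gives the checker a single, already-assigned
  // binding to reason about instead of the mutable module-level variable.
  const codec = _codec;
  return codec;
}
```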
src/domain/services/LogicalTraversal.js (1)
155-155: Consider extracting the long inline cast to a `@typedef`.
The inline type assertion on `this._graph` is correct and a clear improvement over a wildcard cast, but at ~200 characters it makes the line hard to read. Extracting it to a named `@typedef` would make the assumed shape of `_materializeGraph`'s return value easier to audit and update if the implementation changes.
♻️ Suggested refactor
Add near the top of the file (after the
TraversalErrorimport):+/** + * `@typedef` {{ adjacency: { outgoing: Map<string, Array<{neighborId: string, label: string}>>, incoming: Map<string, Array<{neighborId: string, label: string}>> } }} MaterializedGraphShape + */Then simplify line 155:
- const materialized = await /** `@type` {{ _materializeGraph: () => Promise<{adjacency: {outgoing: Map<string, Array<{neighborId: string, label: string}>>, incoming: Map<string, Array<{neighborId: string, label: string}>>}}> }} */ (this._graph)._materializeGraph(); + const materialized = await /** `@type` {{ _materializeGraph: () => Promise<MaterializedGraphShape> }} */ (this._graph)._materializeGraph();Note: This cast is the sole source of truth for the return shape TypeScript expects here. Since
_materializeGraph's actual implementation lacks strict typing, keep the cast in sync with its real return structure if it changes.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/LogicalTraversal.js` at line 155, Extract the long inline JSDoc cast into a named typedef and use it where the cast is applied: add a typedef (e.g., typedef MaterializedGraphReturn) after the TraversalError import that describes the Promise-resolved shape returned by _materializeGraph (including adjacency.outgoing and adjacency.incoming Maps of neighborId/label arrays), then replace the inline cast on (this._graph)._materializeGraph() with a concise /** `@type` {MaterializedGraphReturn} */ cast so the const materialized assignment uses the typedef instead of the ~200-char inline assertion.src/infrastructure/adapters/WebCryptoAdapter.js (1)
39-43: Buffer branch is unreachable — dead code
The
Bufferclass is a subclass ofUint8Array, so anyBufferinstance satisfiesdata instanceof Uint8Arrayon line 39 and is returned there, before execution ever reaches theBuffer.isBufferguard on line 41. The same applies to Bun, where the Node.js Buffer class extendsUint8Array.This makes lines 41–43 dead code in every supported runtime. Note that returning the
Bufferdirectly via theinstanceof Uint8Arraypath is functionally equivalent — the Buffer's ownbyteOffsetandbyteLengthare preserved as part of theTypedArrayinterface, so the Web Crypto API receives the correct byte range either way.If the intent is to always produce a plain
Uint8Arrayview (rather than aBuffersubclass), the check order needs to be swapped:♻️ Suggested fix: check `Buffer.isBuffer` before `instanceof Uint8Array`
function toUint8Array(data) { - if (data instanceof Uint8Array) { return data; } - if (typeof data === 'string') { return new TextEncoder().encode(data); } if (typeof Buffer !== 'undefined' && Buffer.isBuffer(data)) { const buf = /** `@type` {Buffer} */ (data); return new Uint8Array(buf.buffer, buf.byteOffset, buf.byteLength); } + if (data instanceof Uint8Array) { return data; } + if (typeof data === 'string') { return new TextEncoder().encode(data); } throw new Error('WebCryptoAdapter: data must be string, Buffer, or Uint8Array'); }Alternatively, if there is no semantic reason to distinguish a plain
Uint8Arrayfrom aBuffersubclass for this use-case, theBuffer.isBufferbranch can simply be removed.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/infrastructure/adapters/WebCryptoAdapter.js` around lines 39 - 43, The Buffer-specific branch (Buffer.isBuffer) is dead because Buffer extends Uint8Array so the instanceof Uint8Array check returns true first; update the conversion logic in the WebCryptoAdapter byte-conversion routine to either (a) remove the Buffer.isBuffer branch entirely if a Buffer subclass can be treated the same as a plain Uint8Array, or (b) if you must produce a plain Uint8Array view, swap the checks so you call Buffer.isBuffer(data) before the instanceof Uint8Array branch and then create a new Uint8Array from the Buffer's buffer/byteOffset/byteLength; keep the existing string handling (TextEncoder) unchanged and ensure Buffer.isBuffer and instanceof Uint8Array are the referenced guards.
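The subclass relationship the finding relies on can be verified directly in Node.js or Bun; a quick standalone check (not part of the adapter itself):

```js
import { Buffer } from 'node:buffer';

const buf = Buffer.from('hello');

// Buffer is a subclass of Uint8Array, so the `instanceof Uint8Array` guard always wins.
console.log(buf instanceof Uint8Array); // true
console.log(Buffer.isBuffer(buf));      // true

// A plain Uint8Array view over the same bytes, if one is explicitly wanted:
const view = new Uint8Array(buf.buffer, buf.byteOffset, buf.byteLength);
console.log(view instanceof Uint8Array); // true
console.log(Buffer.isBuffer(view));      // false
```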
src/domain/utils/roaring.js (1)
37-39: `[default]` property type is missing `isNativelyInstalled?: boolean` for the CJS/1.x surface.
The
[default]property (line 38) models the unwrapped CJS module as{ RoaringBitmap32: ... }. In roaring 1.x the CJS module also exposesisNativelyInstalled: booleanat the top level (the path exercised at line 177). AfterroaringModule = roaringModule.default(line 91), TypeScript widens back toRoaringModule | nullvia the variable's declared type, soisNativelyInstalledremains accessible with the correct type — no runtime impact.Aligning the
[default]type with the full 1.x module surface would make the typedef self-documenting:🔧 Suggested typedef refinement
- * `@property` {{ RoaringBitmap32: typeof import('roaring').RoaringBitmap32 }} [default] + * `@property` {{ RoaringBitmap32: typeof import('roaring').RoaringBitmap32, isNativelyInstalled?: boolean }} [default]🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/utils/roaring.js` around lines 37 - 39, The JSDoc typedef for the unwrapped CJS module is incomplete: update the `[default]` property to include isNativelyInstalled?: boolean so the CJS/1.x surface is fully modeled; modify the `@property` that currently reads `{ RoaringBitmap32: typeof import('roaring').RoaringBitmap32 } [default]` to include `isNativelyInstalled?: boolean` alongside `RoaringBitmap32` (this ensures the typedef aligns with the runtime usage where roaringModule.default may expose isNativelyInstalled, and references like RoaringBitmap32 remain correct).bin/cli/commands/path.js (1)
82-87: Optional: eliminate the remaining JSDoc cast with `'code' in error` narrowing.
After
instanceof Error,'code' in errornarrowserrortoError & Record<"code", unknown>, makingerror['code'] === 'NODE_NOT_FOUND'valid without any cast — staying consistent with the PR's goal of replacing type assertions with type guards.♻️ Proposed refactor
- } catch (error) { - if (error instanceof Error && /** `@type` {{code?: string}} */ (error).code === 'NODE_NOT_FOUND') { + } catch (error) { + if (error instanceof Error && 'code' in error && error['code'] === 'NODE_NOT_FOUND') { throw notFoundError(error.message); } throw error; }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/path.js` around lines 82 - 87, Replace the JSDoc cast in the catch block by using a type guard: after checking "error instanceof Error", check "'code' in error" and then compare error.code === 'NODE_NOT_FOUND' before throwing notFoundError(error.message); update the condition currently using /** `@type` {{code?: string}} */ (error).code to use the 'code' in error narrowing with the same notFoundError(error.message) and falling back to rethrowing error.bin/cli/commands/doctor/types.js (1)
9-9: `unknown` is a solid upgrade over `*`; recursive self-reference would be more precise.
TypeScript supports recursive JSDoc typedefs, so the inner types can reference
JsonValuedirectly, constraining nested arrays/objects to valid JSON values rather thanunknown.♻️ Fully recursive definition
-/** `@typedef` {null | boolean | number | string | Array<unknown> | {[k:string]: unknown}} JsonValue */ +/** `@typedef` {null | boolean | number | string | Array<JsonValue> | {[k:string]: JsonValue}} JsonValue */This ensures that consumers who read nested values get
JsonValue(notunknown) and benefit from the same type guards transitively.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/doctor/types.js` at line 9, Update the JSDoc typedef for JsonValue so nested arrays and objects reference JsonValue recursively instead of using unknown; modify the existing /** `@typedef` {null | boolean | number | string | Array<unknown> | {[k:string]: unknown}} JsonValue */ declaration so Array and object value types point to JsonValue (e.g., Array<JsonValue> and {[k:string]: JsonValue}) to ensure nested values are typed as JsonValue throughout.
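If runtime guarantees are wanted alongside the recursive typedef, it pairs naturally with a recursive guard; a minimal sketch (the `isJsonValue` helper is illustrative, not something the file currently exports):

```js
/** @typedef {null | boolean | number | string | JsonValue[] | {[k: string]: JsonValue}} JsonValue */

/**
 * @param {unknown} value
 * @returns {value is JsonValue}
 */
function isJsonValue(value) {
  if (value === null) { return true; }
  const t = typeof value;
  if (t === 'boolean' || t === 'number' || t === 'string') { return true; }
  if (Array.isArray(value)) { return value.every(isJsonValue); }
  if (t === 'object') {
    return Object.values(/** @type {Record<string, unknown>} */ (value)).every(isJsonValue);
  }
  return false;
}
```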
bin/cli/commands/install-hooks.js (1)
21-22: Remaining `@param {*}` / embedded `*` wildcards still present after this PR's cleanup pass.
@param {*} classification) and Line 95 (@returns {Promise<{payload: *, exitCode: number}>}) weren't addressed.classifyExistingHookalready has a precise return type, soclassificationcan be typed accordingly.♻️ Proposed cleanup
/** - * `@param` {*} classification + * `@param` {{ kind: 'none'|'ours'|'foreign', version?: string, appended?: boolean }} classification * `@param` {{force: boolean}} hookOptions */ async function resolveStrategy(classification, hookOptions) { /** - * `@param` {*} classification + * `@param` {{ kind: 'ours', version?: string }} classification */ async function promptForOursStrategy(classification) {For the return type of
`handleInstallHooks`, a dedicated payload typedef (or a union of the three shapes returned) would replace the embedded `*`.
Also applies to: 40-41, 95-95
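A hedged sketch of what such a typedef could look like — the three payload shapes below are hypothetical placeholders for illustration, not the actual values `handleInstallHooks` returns:

```js
/**
 * @typedef {(
 *   { status: 'installed', hookPath: string } |
 *   { status: 'skipped', reason: string } |
 *   { status: 'error', message: string }
 * )} InstallHooksPayload
 */

/** @typedef {{ payload: InstallHooksPayload, exitCode: number }} InstallHooksResult */
```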
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/install-hooks.js` around lines 21 - 22, Update the JSDoc types to remove wildcard asterisks: replace "@param {*} classification" with the concrete type returned by classifyExistingHook (reference classifyExistingHook's return shape) for both occurrences, and replace the "@returns {Promise<{payload: *, exitCode: number}>}" on handleInstallHooks with a proper typedef (or a union) describing the exact payload shapes returned by handleInstallHooks (create a named typedef like InstallHooksResult or a union of the three payload shapes and use "@returns {Promise<InstallHooksResult>}"). Ensure the JSDoc param name matches the function signature for handleInstallHooks and reference classifyExistingHook and handleInstallHooks in the typedef to keep types consistent.src/domain/services/PatchBuilderV2.js (1)
451-458: Snapshot the version vector in patch context to avoid accidental mutation.
`build()` returns a patch for inspection; if the builder is reused, the shared `_vv` reference can drift. Consider cloning the vector when embedding it in the patch context.
♻️ Suggested adjustment
```diff
- context: /** @type {import('../types/WarpTypesV2.js').VersionVector} */ (/** @type {unknown} */ (this._vv)),
+ context: /** @type {import('../types/WarpTypesV2.js').VersionVector} */ (/** @type {unknown} */ (vvClone(this._vv))),
```
The same change applies to the second occurrence:
```diff
- context: /** @type {import('../types/WarpTypesV2.js').VersionVector} */ (/** @type {unknown} */ (this._vv)),
+ context: /** @type {import('../types/WarpTypesV2.js').VersionVector} */ (/** @type {unknown} */ (vvClone(this._vv))),
```
Also applies to: 543-550
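If no clone helper exists yet, a shallow copy is usually enough for a flat writer-to-counter vector; a minimal sketch assuming the version vector is a plain object of counters (adjust if it is a Map or a nested structure):

```js
/**
 * @param {Record<string, number>} vv - version vector as writerId → counter
 * @returns {Record<string, number>} independent snapshot safe to embed in a patch
 */
function vvClone(vv) {
  return { ...vv };
}

// For a Map-based vector, the equivalent snapshot would be `new Map(vv)`.
```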
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/PatchBuilderV2.js` around lines 451 - 458, The build() method embeds the internal version-vector reference _vv directly into the patch context causing accidental mutations if the builder is reused; when calling createPatchV2 (and the other similar builder method around the 543-550 region), clone the version vector (deep copy) instead of passing this._vv so the patch.context contains an immutable snapshot — update build() and the analogous function to pass a cloned object (e.g., shallow/deep clone depending on VV structure) into the context parameter when calling createPatchV2 to prevent shared-state drift.bin/cli/commands/verify-audit.js (1)
91-100: Static fallback discards thrown value for non-`Error` exceptions.
If `evaluateTrust` (or a dependency) throws a non-`Error` value — a string, a custom status object, etc. — the `'Trust evaluation failed'` fallback completely hides the cause. `String(err)` would preserve it without leaking unexpected structure.
🔧 Suggested improvement
- error: err instanceof Error ? err.message : 'Trust evaluation failed', + error: err instanceof Error ? err.message : String(err),🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/verify-audit.js` around lines 91 - 100, The catch block in the verify-audit flow currently replaces non-Error thrown values with the fixed string "Trust evaluation failed"; update the payload.trustAssessment assignment so the error field preserves the original thrown value by using String(err) for non-Error cases (e.g., set error to err instanceof Error ? err.message : String(err)). Locate the catch handling around evaluateTrust (the catch(err) that builds payload.trustAssessment) and change only the error fallback there to use String(err) so non-Error throws (strings, status objects) are captured.src/infrastructure/adapters/GitGraphAdapter.js (1)
709-714: `this._getExitCode` is inconsistent with `nodeExists` using the module-level `getExitCode` directly.
`_getExitCode` is a one-line delegate (lines 774–776). Both `nodeExists` (line 634) and `isAncestor` (line 710) are performing identical exit-code checks; using the instance method in one place and the module helper in the other adds noise without benefit.
♻️ Suggested alignment
- if (this._getExitCode(/** `@type` {GitError} */ (err)) === 1) { + if (getExitCode(/** `@type` {GitError} */ (err)) === 1) {🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/infrastructure/adapters/GitGraphAdapter.js` around lines 709 - 714, The isAncestor catch block mixes instance and module helpers—replace the call to this._getExitCode with the module-level getExitCode so both isAncestor and nodeExists use the same helper; update the check in isAncestor (currently calling this._getExitCode) to call getExitCode(/** `@type` {GitError} */ (err)) === 1 to match nodeExists and remove the unnecessary one-line delegate usage.bin/presenters/json.js (1)
52-53: `Record<string, unknown> | unknown` simplifies to `unknown`.
Since `unknown` is the top type, `Record<string, unknown>` is already a subtype of `unknown` and the union is redundant. If the intent is to document that callers will commonly pass a record, consider using an overloaded JSDoc or just `unknown` with a prose note.
♻️ Simplification
- * `@param` {Record<string, unknown> | unknown} payload - * `@returns` {Record<string, unknown> | unknown} + * `@param` {unknown} payload + * `@returns` {unknown}🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/presenters/json.js` around lines 52 - 53, The JSDoc type "Record<string, unknown> | unknown" is redundant because it simplifies to "unknown"; update the JSDoc for the affected function in bin/presenters/json.js by replacing the union type with "unknown" (or, if you want to emphasize callers commonly pass an object, use a JSDoc overload or add a prose note) — search for the JSDoc lines containing "Record<string, unknown> | unknown" (the param and `@returns` annotations) and change them to use "unknown" or add an overload documenting Record<string, unknown> explicitly.src/domain/services/HttpSyncServer.js (2)
227-241: Stale duplicate JSDoc block on `initAuth`.
initAuth. Only the second one (lines 232–235) is effective; most tooling will use the JSDoc immediately preceding the function declaration, making the first block (lines 227–231) dead documentation that could confuse readers.♻️ Suggested fix — remove the stale JSDoc
-/** - * `@param` {{ keys: Record<string, string>, mode: 'enforce'|'log-only', crypto?: *, logger?: *, wallClockMs?: () => number }|undefined} auth - * `@param` {string[]} [allowedWriters] - * `@returns` {{ auth: SyncAuthService|null, authMode: string|null }} - * `@private` - */ /** * `@param` {z.infer<typeof authSchema>} [auth] * `@param` {string[]} [allowedWriters] */ function initAuth(auth, allowedWriters) {🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/HttpSyncServer.js` around lines 227 - 241, Remove the stale duplicate JSDoc block that precedes the effective one for the initAuth function; keep only the second JSDoc (the z.infer<typeof authSchema> variant) immediately above function initAuth to avoid confusing tooling and readers, ensuring the function signature and description remain documented by the single correct comment.
21-54: Zod schema validation looks well-structured overall.
The schemas provide clear validation with sensible defaults and the `superRefine` rule correctly enforces that `allowedWriters` requires `auth`. A couple of observations:
`httpPort` and `graph` (lines 33–39) still use `z.any().refine(...)` — this is pragmatic for port interfaces that Zod can't deeply validate, but consider documenting that these are intentional residual `z.any()` usages (e.g., a short comment) so future cleanup sweeps don't re-flag them.
cryptoandloggerinauthSchema(lines 27–28) similarly usez.any().optional().Given the PR's stated goal of eliminating wildcasts, a brief inline comment (e.g.,
// z.any(): port interface, not schema-validatable) would clarify intent.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/HttpSyncServer.js` around lines 21 - 54, Add short inline comments explaining the intentional use of z.any() for non-schema-validatable interfaces: annotate optionsSchema.httpPort and optionsSchema.graph with a comment like "// z.any(): port/graph interface not schema-validatable" and annotate authSchema.crypto and authSchema.logger with "// z.any(): opaque runtime objects (crypto/logger) intentionally left untyped"; keep the existing z.any().optional() and superRefine logic unchanged so future maintainers won't remove these wildcasts accidentally.
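For the port-shaped options, `z.custom()` with a lightweight duck-type check keeps that intent visible without pretending Zod can validate the whole interface; a hedged sketch (the `listen()` shape for `httpPort` is an assumption for illustration):

```js
import { z } from 'zod';

// z.custom(): port interface, not schema-validatable beyond a duck-type check
const httpPortSchema = z.custom(
  (value) =>
    typeof value === 'object' &&
    value !== null &&
    typeof (/** @type {{ listen?: unknown }} */ (value)).listen === 'function',
  { message: 'httpPort must expose a listen() function' }
);

const optionsSchema = z.object({
  httpPort: httpPortSchema,
  // z.any(): opaque runtime object (logger) intentionally left untyped
  logger: z.any().optional(),
});
```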
bin/cli/types.js (1)
86-94: Consider tightening the `where` callback type.
The
`where` method uses `Function`, which is still a loose type. Since this type is consumed by `applyQueryStep` in `query.js`, where the callback receives a `{props?: Record<string, unknown>}` node, a more precise signature would be: `@property {(fn: (node: {id: string, props?: Record<string, unknown>}) => boolean) => QueryBuilderLike} where`. This would provide better type safety at the call sites. That said, if the underlying implementation accepts arbitrary predicates beyond this shape, `Function` is acceptable.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/types.js` around lines 86 - 94, The typedef for QueryBuilderLike uses a loose Function type for the where method; tighten it by changing the where signature to accept a predicate that takes a node shaped like {id: string, props?: Record<string, unknown>} and returns a boolean (matching how applyQueryStep in query.js invokes the predicate). Update the `@property` for where in the QueryBuilderLike JSDoc to use (fn: (node: {id: string, props?: Record<string, unknown>}) => boolean) => QueryBuilderLike so call sites get stronger type information while remaining compatible with applyQueryStep.src/domain/services/SyncProtocol.js (1)
401-408: Divergence detection error narrowing is correct but slightly redundant.
The
err instanceof Errorcheck appears in both sides of the||. A minor readability improvement would be to hoist it:if (err instanceof Error && ( ('code' in err && /** `@type` {{ code: string }} */ (err).code === 'E_SYNC_DIVERGENCE') || err.message?.includes('Divergence detected') )) {The message-based fallback (
'Divergence detected') is inherently fragile, but given it's a safety net for potential legacy paths, it's acceptable.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/SyncProtocol.js` around lines 401 - 408, Refactor the divergence check inside the catch block in SyncProtocol.js to avoid repeating "err instanceof Error": first ensure err is an Error, then test the two divergence conditions (error.code === 'E_SYNC_DIVERGENCE' on the code property and the fallback err.message?.includes('Divergence detected')) as a single combined conditional; update the existing catch block that currently checks both sides of the || to use this hoisted instanceof Error check while preserving the continue behavior for detected divergence and rethrowing otherwise.src/domain/services/ProvenanceIndex.js (1)
279-280: Redundant double-annotation — remove the standalone `@type` line.
Line 279's
/**@type{...} */and line 280's inline cast both set identical types. One is sufficient; the rest of the codebase (e.g.,Frontier.jsline 76,CheckpointSerializerV5.jsline 99) uses only the inline cast pattern.♻️ Proposed simplification
- /** `@type` {{ version?: number, entries?: Array<[string, string[]]> }} */ - const obj = /** `@type` {{ version?: number, entries?: Array<[string, string[]]> }} */ (c.decode(buffer)); + const obj = /** `@type` {{ version?: number, entries?: Array<[string, string[]]> }} */ (c.decode(buffer));🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/ProvenanceIndex.js` around lines 279 - 280, Remove the redundant standalone JSDoc type annotation applied before the declaration of obj; keep only the inline cast used with c.decode(buffer) (the inline /** `@type` {...} */ cast) so the variable obj (result of c.decode) has the type without the duplicate /** `@type` {...} */ line—update the ProvenanceIndex.js declaration that assigns obj from c.decode(buffer) to use only the inline cast form.src/domain/services/ObserverView.js (1)
131-131: Consider extracting the `_materializeGraph` return shape to a typedef.
The inline cast spans the entire line. Extracting the internal type makes the method body scannable.
♻️ Proposed simplification
Add a local typedef (top of file or in a types file):
/** * `@typedef` {{ _materializeGraph: () => Promise<{ * state: import('./JoinReducer.js').WarpStateV5, * stateHash: string, * adjacency: { * outgoing: Map<string, Array<{neighborId: string, label: string}>>, * incoming: Map<string, Array<{neighborId: string, label: string}>> * } * }> }} MaterializableGraph */Then at line 131:
- const materialized = await /** `@type` {{ _materializeGraph: () => Promise<{state: import('./JoinReducer.js').WarpStateV5, stateHash: string, adjacency: {outgoing: Map<string, Array<{neighborId: string, label: string}>>, incoming: Map<string, Array<{neighborId: string, label: string}>>}}> }} */ (this._graph)._materializeGraph(); + const materialized = await /** `@type` {MaterializableGraph} */ (this._graph)._materializeGraph();🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/ObserverView.js` at line 131, Extract the long inline JSDoc cast into a named typedef (e.g., MaterializableGraph) that describes the _materializeGraph return shape (including state: import('./JoinReducer.js').WarpStateV5, stateHash: string, and adjacency with outgoing/incoming maps of {neighborId, label}). Add that typedef near the top of the file (or in a shared types file) and then replace the inline cast on the materialized assignment with a concise cast to MaterializableGraph (e.g., /** `@type` {MaterializableGraph} */ (this._graph)). Ensure you reference the existing _materializeGraph symbol and preserve the exact nested types from the original inline annotation.src/domain/warp/provenance.methods.js (1)
264-265: Residual `any` in `_sortPatchesCausally` parameter type.
The JSDoc
@param {Array<{patch: any, sha: string}>}still usesanyfor thepatchfield. Given this PR's goal of eliminating wildcard/any types, consider narrowing this to a more specific patch shape (e.g.,{lamport?: number, writer?: string}), consistent with the fields actually accessed in the sort comparator (lines 270, 276).♻️ Suggested type narrowing
-/** - * `@param` {Array<{patch: any, sha: string}>} patches - Unsorted patch entries - * `@returns` {Array<{patch: any, sha: string}>} Sorted patch entries - */ +/** + * `@param` {Array<{patch: {lamport?: number, writer?: string}, sha: string}>} patches - Unsorted patch entries + * `@returns` {Array<{patch: {lamport?: number, writer?: string}, sha: string}>} Sorted patch entries + */🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/warp/provenance.methods.js` around lines 264 - 265, Update the JSDoc for _sortPatchesCausally to replace the residual any with a narrowed patch shape that matches what the comparator reads (e.g., use `@param` {Array<{patch: {lamport?: number, writer?: string}, sha: string}>} or similar), so the patch object documents lamport and writer (optional if they can be missing) and aligns with the comparator logic in _sortPatchesCausally; adjust any other references to the previous typedef if present to keep types consistent.src/domain/warp/patch.methods.js (1)
268-270: Outer cast uses `Object` (capital-O) — consider using the precise type.
Lines 268 and 270 cast through
unknown, which is the correct pattern to replace wildcards. However, the outer target type on line 270 uses{patch: Object, sha: string}—Objectis nearly as loose asany. Since the inner lambda already typesoptswith the precisePatchV2import, the outer cast could match:- onCommitSuccess: /** `@type` {(result: {patch: Object, sha: string}) => void} */ (/** `@type` {unknown} */ ((/** `@type` {{patch?: import('../types/WarpTypesV2.js').PatchV2, sha?: string}} */ opts) => this._onPatchCommitted(resolvedWriterId, opts))), + onCommitSuccess: /** `@type` {(result: {patch: import('../types/WarpTypesV2.js').PatchV2, sha: string}) => void} */ (/** `@type` {unknown} */ ((/** `@type` {{patch?: import('../types/WarpTypesV2.js').PatchV2, sha?: string}} */ opts) => this._onPatchCommitted(resolvedWriterId, opts))),Same issue exists on line 328 in
createWriter(). This is a minor leftover from the wildcard-elimination effort.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/warp/patch.methods.js` around lines 268 - 270, The outer cast for onCommitSuccess uses a loose `{patch: Object, sha: string}` type; update the outer cast to match the precise inner signature (use `{patch?: import('../types/WarpTypesV2.js').PatchV2, sha?: string}` or the exact optionality used in the inner lambda) so the cast is not effectively `Object`, and make the same change in the matching cast inside createWriter() (line ~328) so both onCommitSuccess and the createWriter cast use the exact PatchV2 import and correct optional properties.src/domain/services/CommitDagTraversalService.js (1)
57-167: Facade delegation methods still use `@param {*} options`.
The delegating methods (
bfs,dfs,ancestors, etc.) retain@param {*} options— effectively wildcards. These aren't part of this PR's changed lines, but they represent residual loose types in the facade. A follow-up could type them to match the sub-service signatures (or useParameters<...>[0]like thejoinPatchpattern inpatch.methods.js).🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/CommitDagTraversalService.js` around lines 57 - 167, The facade methods (bfs, dfs, ancestors, descendants, isReachable, findPath, shortestPath, weightedShortestPath, aStarSearch, bidirectionalAStar, commonAncestors, topologicalSort) use a loose JSDoc param `@param {*} options`; update each method's JSDoc to reflect the actual parameter type by referencing the delegate signature (e.g. use a TypeScript-style JSDoc type like `@param {Parameters<import('./DagTraversal').default['bfs']>[0]} options` or `Parameters<typeof this._traversal.bfs>[0]` for traversal methods and analogous `Parameters<...>[0]` for _pathFinding and _topology methods) so the facade accurately mirrors the sub-service types (match the patterns used in patch.methods.js's joinPatch).src/domain/services/IndexRebuildService.js (1)
392-392: Intersection cast for `loadIndexFrontier` storage parameter.
The cast
/**@type{IndexStoragePort & BlobPort} */is needed becausethis.storageis typed asIndexStoragePortbutloadIndexFrontierrequires blob-reading capabilities too. This works but is a signal that the constructor's storage type could be widened to includeBlobPort(or use theIndexStoragealias fromWarpPersistence.jswhich already includes it).🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/IndexRebuildService.js` at line 392, The explicit intersection cast on this.storage when calling loadIndexFrontier indicates the class constructor type is too narrow; update the service to accept a storage type that includes both IndexStoragePort and BlobPort (or use the existing IndexStorage alias from WarpPersistence.js) so you can remove the inline cast. Concretely, change the constructor/field type for the service (the this.storage declaration) to IndexStorage or an intersection of IndexStoragePort & BlobPort, then remove the /** `@type` {...} */ cast at the loadIndexFrontier call; keep function name loadIndexFrontier and this.storage so reviewers can find the change.src/domain/services/BitmapIndexReader.js (1)
93-98: Constructor still uses a typed cast on the empty default, unlike other files.
Other constructors in this PR (e.g.,
IndexRebuildService,DagPathFinding) use plain= {}. This one retains= /**@type{{ storage: IndexStoragePort, ... }} */ ({}). Both patterns work, but it's an inconsistency. Minor nit — no functional impact since the runtime guard on line 94 catches missingstorage.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/BitmapIndexReader.js` around lines 93 - 98, The constructor for BitmapIndexReader uses a JSDoc typed cast on its empty default parameter (the "constructor" in class BitmapIndexReader) which is inconsistent with other constructors like IndexRebuildService and DagPathFinding; change the parameter default from the JSDoc-cast form (/** `@type` {…} */ ({})) to a plain = {} default and keep the existing runtime storage guard/throw, so behavior is unchanged but style matches the rest of the codebase.src/domain/services/StreamingBitmapIndexBuilder.js (1)
11-11: Dual type references for storage: `IndexStoragePort` (param) vs `IndexStorage` (property).
The constructor param is typed as
IndexStoragePort(line 86), but the property is typed asIndexStoragefromWarpPersistence.js(line 111–112). These should be structurally identical, but having two separate type aliases for the same concept could cause confusion if they drift apart.Consider using the same type alias consistently — either
IndexStoragePorteverywhere orIndexStorageeverywhere.Also applies to: 111-112
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/StreamingBitmapIndexBuilder.js` at line 11, The code mixes two type aliases for the same concept: the constructor parameter annotated as IndexStoragePort and the instance property annotated as IndexStorage; unify them to a single alias to avoid drift. Pick one alias (either IndexStoragePort or IndexStorage), update the top typedef import and all JSDoc annotations so the constructor parameter (in the constructor signature) and the property (the instance field assigned in StreamingBitmapIndexBuilder) use the same type name, and remove the alternate typedef reference so both the param and the property consistently reference the identical type alias.src/domain/types/WarpPersistence.js (1)
15-33: `PersistenceReader`, `PersistenceWriter`, and `CheckpointPersistence` are structurally identical.
All three are `CommitPort & BlobPort & TreePort & RefPort`. The doc comments imply different access patterns (read-side vs. write-side vs. no-config), but the types don't actually enforce that distinction. If these ports expose both read and write methods, a `PersistenceWriter` reference can still call read methods and vice versa.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/types/WarpPersistence.js` around lines 15 - 33, The typedefs PersistenceReader, PersistenceWriter, and CheckpointPersistence currently all alias the same full-port types (CommitPort, BlobPort, TreePort, RefPort), so they do not enforce read-only vs write-only semantics; to fix this, introduce separate read/write variants for each port (e.g., CommitPortRead/CommitPortWrite, BlobPortRead/BlobPortWrite, TreePortRead/TreePortWrite, RefPortRead/RefPortWrite) that only expose the appropriate methods, then update PersistenceReader to combine the *Read variants, PersistenceWriter to combine the *Write variants, and leave CheckpointPersistence as the precise combination you intend (e.g., reads+writes as needed); update the import typedefs in WarpPersistence.js to reference these new symbol names so the types enforce access patterns.src/domain/warp/_wiredMethods.d.ts (2)
205-213: `serve()` uses `httpPort: unknown` and `auth?: unknown` — weaker than the public surface.
index.d.tstypes these asHttpServerPortandSyncAuthServerOptionsrespectively. The looser typing here means internal callers ofthis.serve(...)won't get compile-time checks on these parameters.This is acceptable if importing those port types would create a circular dependency, but worth documenting with a brief comment.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/warp/_wiredMethods.d.ts` around lines 205 - 213, The serve method currently types httpPort and auth as unknown which weakens compile-time checks; update the signature of serve(httpPort: HttpServerPort, auth?: SyncAuthServerOptions) so internal callers get proper types (use the same HttpServerPort and SyncAuthServerOptions from the public surface), and if importing those types would cause a circular dependency instead add a brief comment above serve explaining that unknown was chosen intentionally due to circular-import risk and reference the canonical types (HttpServerPort, SyncAuthServerOptions) located in the public index.d.ts so future maintainers understand the rationale.
137-143: Align `WormholeEdge.payload` type between internal and public API declarations.
The internal
WormholeEdgeinterface in_wiredMethods.d.tsdeclarespayload: unknown, but the public API inindex.d.tsexports it aspayload: ProvenancePayload(line 2416). This inconsistency means code using the internalcreateWormhole()method cannot safely access payload properties without explicit casts.Import
ProvenancePayloadfrom../services/ProvenancePayload.jsand update the type to match the public declaration.♻️ Suggested alignment
+import type { ProvenancePayload } from '../services/ProvenancePayload.js'; interface WormholeEdge { fromSha: string; toSha: string; writerId: string; - payload: unknown; + payload: ProvenancePayload; patchCount: number; }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/warp/_wiredMethods.d.ts` around lines 137 - 143, The internal WormholeEdge interface in _wiredMethods.d.ts declares payload as unknown but must match the public API; import ProvenancePayload from ../services/ProvenancePayload.js and change the WormholeEdge.payload type to ProvenancePayload so createWormhole and any callers can use the payload without casts; update the import statement and the interface declaration (WormholeEdge, referenced by createWormhole) to use ProvenancePayload.test/unit/cli/doctor.test.js (1)
28-31: Consider `vi.mocked()` over the double-cast pattern.
The
unknown → Mockdouble-cast works but is the JSDoc workaround for whatvi.mocked()solves directly and more readably:♻️ Proposed refactor using `vi.mocked()`
-const _shared = await import('../../../bin/cli/shared.js'); -const createPersistence = /** `@type` {import('vitest').Mock} */ (/** `@type` {unknown} */ (_shared.createPersistence)); -const resolveGraphName = /** `@type` {import('vitest').Mock} */ (/** `@type` {unknown} */ (_shared.resolveGraphName)); -const createHookInstaller = /** `@type` {import('vitest').Mock} */ (/** `@type` {unknown} */ (_shared.createHookInstaller)); +const _shared = await import('../../../bin/cli/shared.js'); +const createPersistence = vi.mocked(_shared.createPersistence); +const resolveGraphName = vi.mocked(_shared.resolveGraphName); +const createHookInstaller = vi.mocked(_shared.createHookInstaller);
`vi.mock` is hoisted and always executed before imports, so the mock is in place when the `await import(...)` resolves — the timing is correct either way. `vi.mocked()` is available in Vitest 2.1.8 and just removes the noise.
Verify each finding against the current code and only fix it if needed. In `@test/unit/cli/doctor.test.js` around lines 28 - 31, Replace the double-cast JSDoc mock conversion for the imported _shared functions with Vitest's vi.mocked helper: after importing _shared (the result of await import('../../../bin/cli/shared.js')), call vi.mocked(_shared.createPersistence), vi.mocked(_shared.resolveGraphName), and vi.mocked(_shared.createHookInstaller) to get typed mocks instead of using the /** `@type` {unknown} */ → /** `@type` {import('vitest').Mock} */ casts; this removes the noisy double-cast and relies on vi.mocked() to produce properly typed mocks for createPersistence, resolveGraphName, and createHookInstaller.
### Changed

- **M8 IRONCLAD Wave 3: cast elimination** — Removed ~107 wildcard casts (`@type {*}` / `@type {any}`) across ~40 files in `src/domain/warp/`, `src/domain/services/`, and `src/infrastructure/adapters/`. All casts replaced with role-specific persistence types (`PersistenceReader`, `PersistenceWriter`, `CheckpointPersistence`, `IndexStorage`), error narrowing helpers (`isError`, `hasErrorCode`, `hasMessage`), and properly typed aliases. Zero `TODO(ts-cleanup)` tags remain in Wave 3 scope.
- **HttpSyncServer Zod validation** — Constructor now uses Zod schema validation; removed manual mode validation and wildcard casts. `initAuth` typed via schema inference. `z.function()` replaced with `z.custom()` for correct `wallClockMs` TypeScript inference.
- **HookInstaller** — Constructor deps parameter changed from optional to required (all callers already provided all fields).
- **SyncAuthService** — `_validateKeys` now typed as assertion function for proper post-validation narrowing.
- **WarpErrors** — `hasErrorCode`/`hasMessage` helpers upgraded from wildcard casts to `Record<string, unknown>` narrowing.
- **WarpPersistence types** — Added `IndexStorage` typedef (`BlobPort & TreePort & RefPort`).
Verify Wave 3 metrics and scope in the changelog.
The entry claims ~107 casts removed in Wave 3, but the PR summary indicates that figure for Wave 1. Please reconcile the counts/paths to avoid misleading release notes.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@CHANGELOG.md` around lines 10 - 18, Update the "M8 IRONCLAD Wave 3: cast
elimination" changelog line to match the actual PR scope: verify whether the
~107 wildcard casts and the listed paths (`src/domain/warp/`,
`src/domain/services/`, `src/infrastructure/adapters/`) belong to Wave 3 or Wave
1, correct the numeric count and/or wave label accordingly, and ensure the
sentence accurately reflects which files and waves were changed (edit the "~107"
and/or "Wave 3" text in the header and the file path list to match the PR
summary).
| "syncWith": { | ||
| "async": true, | ||
| "params": [ | ||
| { "name": "remote", "type": "string | WarpGraph" }, | ||
| { "name": "options", "type": "SyncWithOptions", "optional": true } | ||
| ], | ||
| "returns": "Promise<{ applied: number; attempts: number }>" | ||
| }, |
`syncWith` return type is missing the optional `state` property.
The manifest declares `"returns": "Promise<{ applied: number; attempts: number }>"` but both `index.d.ts` (line 1761) and `_wiredMethods.d.ts` (line 204) declare `Promise<{ applied: number; attempts: number; state?: WarpStateV5 }>`. Since this manifest is described as the "source of truth for T3 (index.d.ts) and T9 (consumer test)", this discrepancy could cause the consumer test to miss validating the `state` return when `materialize: true` is passed.
Suggested fix
"syncWith": {
"async": true,
"params": [
{ "name": "remote", "type": "string | WarpGraph" },
{ "name": "options", "type": "SyncWithOptions", "optional": true }
],
- "returns": "Promise<{ applied: number; attempts: number }>"
+ "returns": "Promise<{ applied: number; attempts: number; state?: WarpStateV5 }>"
},🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@contracts/type-surface.m8.json` around lines 155 - 162, The manifest's
syncWith method return type lacks the optional state property; update the
"returns" for "syncWith" to "Promise<{ applied: number; attempts: number;
state?: WarpStateV5 }>" so it matches index.d.ts and _wiredMethods.d.ts; ensure
the change references the existing params (remote, options: SyncWithOptions) and
the WarpStateV5 type so consumers receive the optional state when materialize:
true.
```json
},
"properties": {
  "graphName": { "type": "string", "readonly": true },
  "writerId": { "type": "string", "readonly": true },
  "seekCache": { "type": "SeekCachePort | null", "readonly": true },
  "provenanceIndex": { "type": "ProvenanceIndex | null", "readonly": true },
  "persistence": { "type": "GraphPersistencePort", "readonly": true, "getter": true },
  "onDeleteWithData": { "type": "'reject' | 'cascade' | 'warn'", "readonly": true, "getter": true },
  "gcPolicy": { "type": "GCPolicyConfig", "readonly": true, "getter": true },
  "temporal": { "type": "TemporalQuery", "readonly": true, "getter": true },
  "traverse": { "type": "LogicalTraversal" }
}
```
🧩 Analysis chain
🏁 Script executed:
rg -nP 'setSeekCache' -g '!node_modules/**'Repository: git-stunts/git-warp
Length of output: 1403
🏁 Script executed:
# Check the index.d.ts file around line 1589
fd -e d.ts -e ts | head -20Repository: git-stunts/git-warp
Length of output: 535
🏁 Script executed:
# Look for the manifest file
fd type-surface.m8.jsonRepository: git-stunts/git-warp
Length of output: 93
🏁 Script executed:
# Examine the manifest file to see the structure and check for setSeekCache
cat contracts/type-surface.m8.json | head -300 | tail -100Repository: git-stunts/git-warp
Length of output: 3506
🏁 Script executed:
# Search for setSeekCache in the manifest file
rg -n 'setSeekCache' contracts/type-surface.m8.jsonRepository: git-stunts/git-warp
Length of output: 45
🏁 Script executed:
# Let's also check what the "instance" section contains - look at the full structure
jq '.instance' contracts/type-surface.m8.json | head -100Repository: git-stunts/git-warp
Length of output: 67
🏁 Script executed:
# Let's look at the overall structure of the manifest
jq 'keys' contracts/type-surface.m8.jsonRepository: git-stunts/git-warp
Length of output: 118
🏁 Script executed:
# Look for the WarpGraph class definition in the manifest
jq '.WarpGraph' contracts/type-surface.m8.json | head -200Repository: git-stunts/git-warp
Length of output: 67
🏁 Script executed:
# Count the number of methods in WarpGraph instance
jq '.WarpGraph.instance | keys | length' contracts/type-surface.m8.jsonRepository: git-stunts/git-warp
Length of output: 137
🏁 Script executed:
# List all method names in WarpGraph instance
jq '.WarpGraph.instance | keys' contracts/type-surface.m8.jsonRepository: git-stunts/git-warp
Length of output: 137
🏁 Script executed:
# Let's examine the structure more carefully
jq '.exports | keys' contracts/type-surface.m8.jsonRepository: git-stunts/git-warp
Length of output: 1532
🏁 Script executed:
# Look for WarpGraph in exports
jq '.exports.WarpGraph' contracts/type-surface.m8.json | head -300Repository: git-stunts/git-warp
Length of output: 7118
🏁 Script executed:
# Look at the entire structure to understand layout
cat contracts/type-surface.m8.json | head -50Repository: git-stunts/git-warp
Length of output: 1983
🏁 Script executed:
# Check if there are any comments or documentation about intentional omissions
rg -n 'setSeekCache' contracts/type-surface.m8.jsonRepository: git-stunts/git-warp
Length of output: 45
🏁 Script executed:
# Search for any comments in the manifest or documentation about missing methods
rg -i 'intentional|omit|exclude|skip' contracts/type-surface.m8.jsonRepository: git-stunts/git-warp
Length of output: 45
🏁 Script executed:
# Check the git history or comments around the WarpGraph section in the manifest
head -n 500 contracts/type-surface.m8.json | tail -n 200Repository: git-stunts/git-warp
Length of output: 10910
🏁 Script executed:
# Let's verify that setSeekCache is actually implemented and tested
cat src/domain/WarpGraph.js | sed -n '185,195p'Repository: git-stunts/git-warp
Length of output: 426
🏁 Script executed:
# Check if there are references in tests to verify it's a public API
rg -B2 -A5 'setSeekCache' test/unit/domain/WarpGraph.prototypeWiring.test.jsRepository: git-stunts/git-warp
Length of output: 332
🏁 Script executed:
# Let's check the test file more broadly to understand what's being tested
rg -B5 -A10 "setSeekCache" test/unit/domain/seekCache.test.js | head -40Repository: git-stunts/git-warp
Length of output: 903
🏁 Script executed:
# Verify the method signature in index.d.ts
sed -n '1585,1595p' index.d.tsRepository: git-stunts/git-warp
Length of output: 405
🏁 Script executed:
# Let's check if there are any comments in the manifest about intentional omissions or notes
head -30 contracts/type-surface.m8.jsonRepository: git-stunts/git-warp
Length of output: 1235
🏁 Script executed:
# Check if there are any other methods that might give us a pattern for what's included/excluded
rg -n 'deprecated.*true' contracts/type-surface.m8.jsonRepository: git-stunts/git-warp
Length of output: 350
Add `setSeekCache` to the manifest's instance methods.
The manifest declares itself as the "source of truth for T3 (index.d.ts)", but `setSeekCache(cache: SeekCachePort | null): void` is missing from the `WarpGraph` instance methods. This method exists in the implementation (WarpGraph.js:190), is documented in `index.d.ts` (lines 1589–1590), and is tested across multiple test files. The manifest includes the `seekCache` property but excludes the setter method that modifies it, creating an inconsistency.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@contracts/type-surface.m8.json` around lines 286 - 297, The manifest is
missing the WarpGraph instance method setSeekCache(cache: SeekCachePort | null):
void which leaves seekCache writable only in implementation; add an instance
method entry named "setSeekCache" with the signature accepting SeekCachePort |
null and returning void to the manifest's properties/methods block (alongside
"seekCache" and other getters) so the JSON surface matches the implementation
(WarpGraph.js setSeekCache) and index.d.ts documentation.
```
B29 ──→ B30 ──→ B41/B42 ──→ [M8 GATE] ──→ B33 ──→ [M9 GATE]
│ B32 ──┘ │
B38 B31(opt) │
B14 ▼
B1 ──→ [M10 GATE] ──→ B2(impl)
B39 ──┘ │ B3
B40 ──┘ │ B11
B2(spec)───┘ ▼
[M11 GATE]
```
Fenced code block missing language specifier (markdownlint MD040).
The ASCII art critical-path diagram at line 805 lacks a language annotation, triggering the markdownlint-cli2 warning. Adding a `text` language tag will satisfy the linter without altering rendering.
🔧 Fix
-```
+```text
B29 ──→ B30 ──→ B41/B42 ──→ [M8 GATE] ──→ B33 ──→ [M9 GATE]
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```
B29 ──→ B30 ──→ B41/B42 ──→ [M8 GATE] ──→ B33 ──→ [M9 GATE]
│ B32 ──┘ │
B38 B31(opt) │
B14 ▼
B1 ──→ [M10 GATE] ──→ B2(impl)
B39 ──┘ │ B3
B40 ──┘ │ B11
B2(spec)───┘ ▼
[M11 GATE]
```
🧰 Tools
🪛 markdownlint-cli2 (0.21.0)
[warning] 805-805: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@ROADMAP.md` around lines 805 - 815, The fenced code block containing the
ASCII art (the block starting with the lines showing nodes like B29, B30,
B41/B42, [M8 GATE], etc.) is missing a language specifier and triggers
markdownlint MD040; fix it by changing the opening fence from ``` to ```text
(leave the ASCII art and closing fence unchanged) so the linter recognizes it as
plain text.
  }

-    return await /** @type {any} */ (builder).finalize({ signal, frontier }); // TODO(ts-cleanup): narrow port type
+    return await builder.finalize({ signal, frontier: /** @type {Map<string, number>|undefined} */ (/** @type {unknown} */ (frontier)) });
🧩 Analysis chain
🏁 Script executed:
# First, let's find and examine IndexRebuildService.js around line 101 and 278
fd -t f IndexRebuildService.js -x cat -n {} | head -300
Length of output: 15498
🏁 Script executed:
# Also check StreamingBitmapIndexBuilder.js for the finalize method signature
fd -t f StreamingBitmapIndexBuilder.js -x cat -n {}
Length of output: 32195
🏁 Script executed:
# Search for all .rebuild( calls to understand how frontier is passed
rg -nP --type=js '\.rebuild\(' -C5 | head -100
Length of output: 8728
🏁 Script executed:
# Search for rebuild calls that pass frontier values
rg -nP --type=js 'frontier\s*:' -C5 | head -80
Length of output: 6021
🏁 Script executed:
# Check the _persistIndex method implementation to see how it uses frontier
sed -n '300,350p' $(fd -t f IndexRebuildService.js)
Length of output: 3116
🏁 Script executed:
# Look for any frontier-related test data or mock values
rg -nP --type=js 'new Map.*frontier|frontier.*new Map' -C3 | head -60
Length of output: 4510
The frontier parameter type declarations are inconsistent across the codebase.
The rebuild() method and all in-memory/streaming code paths receive frontier as Map<string, string> (writer ID → tip SHA), but StreamingBitmapIndexBuilder.finalize() JSDoc declares it as Map<string, number> (writerId → clock). This forces the double cast through unknown at line 278.
The mismatch is purely a documentation error: the finalize implementation (lines 444-449) doesn't enforce numeric values—it just sorts keys and serializes values to JSON, which handles both strings and numbers. Since all actual usage passes SHAs (strings), update StreamingBitmapIndexBuilder.finalize() JSDoc at line 415 to @param {Map<string, string>} [options.frontier] and remove the double cast at line 278.
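A sketch of the documentation-only fix described above, assuming finalize()'s other options keep their current shape (only the frontier annotation and the call site change):

```js
class StreamingBitmapIndexBuilder {
  /**
   * Finalizes the index (sketch of the corrected JSDoc only; the body is unchanged).
   * @param {object} [options]
   * @param {AbortSignal} [options.signal]
   * @param {Map<string, string>} [options.frontier] - writerId → tip SHA, not a lamport clock
   */
  async finalize({ signal, frontier } = {}) {
    // ...existing behaviour: sort frontier keys and JSON-serialize the values,
    // which already handles string SHAs.
  }
}

// IndexRebuildService.rebuild() can then drop the double cast:
// return await builder.finalize({ signal, frontier });
```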
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/domain/services/IndexRebuildService.js` at line 278, The frontier type is
mismatched: update StreamingBitmapIndexBuilder.finalize() JSDoc to declare
options.frontier as Map<string, string> (writerId → tip SHA) instead of
Map<string, number>, and then remove the unnecessary double cast to unknown and
to Map<string, number> where rebuild() calls builder.finalize (the return at the
current call site that does await builder.finalize({... frontier: /** `@type`
{Map<string, number>|undefined} */ (/** `@type` {unknown} */ (frontier)) })).
Ensure the finalize JSDoc change references StreamingBitmapIndexBuilder.finalize
and that rebuild() continues to pass the Map<string, string> frontier directly
to builder.finalize without casting.
/**
 * @typedef {Object} PatchCommitEvent
 * @property {unknown} [patch]
 * @property {string} [sha]
 */
sha should be required in PatchCommitEvent.
Both fields are marked optional, making {} a valid PatchCommitEvent. For a commit event, the sha identifying the commit is semantically mandatory — an event without it has no identity. patch being optional (e.g., streamed/deferred) is plausible, but sha should not be.
🛡️ Proposed fix
/**
* `@typedef` {Object} PatchCommitEvent
* `@property` {unknown} [patch]
- * `@property` {string} [sha]
+ * `@property` {string} sha
*/📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
/**
 * @typedef {Object} PatchCommitEvent
 * @property {unknown} [patch]
 * @property {string} [sha]
 */
/**
 * `@typedef` {Object} PatchCommitEvent
 * `@property` {unknown} [patch]
 * `@property` {string} sha
 */
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/domain/types/WarpOptions.js` around lines 23 - 27, Update the JSDoc
typedef for PatchCommitEvent so the sha property is required: change the
`@property` annotation for sha from optional to required (remove the square
brackets around sha) while leaving patch as optional; this ensures
PatchCommitEvent always contains a commit identifier (refer to the
PatchCommitEvent typedef and the `@property` entries for patch and sha).
    const currentFrontier = /** @type {Map<string, string>} */ (/** @type {unknown} */ (this._cachedState.observedFrontier));
    const result = /** @type {{state: import('../services/JoinReducer.js').WarpStateV5, frontier: Map<string, string>, applied: number}} */ (applySyncResponseImpl(response, this._cachedState, currentFrontier));
🧩 Analysis chain
🏁 Script executed:
rg -nP --type=js -C5 'observedFrontier' -g '!node_modules/**' | head -120
Length of output: 9102
🏁 Script executed:
# Check the type declaration in index.d.ts
cat -n index.d.ts | sed -n '2190,2210p'
Length of output: 767
🏁 Script executed:
# Search for vvSerialize and vvDeserialize to understand what they do
rg -nP 'export.*(?:vvSerialize|vvDeserialize|createVersionVector)' --type=js -A2
Length of output: 979
🏁 Script executed:
# Check applySyncResponseImpl signature
rg -nP 'function applySyncResponseImpl|export.*applySyncResponseImpl' --type=js -A10
Length of output: 45
🏁 Script executed:
rg -nP 'applySyncResponseImpl' --type=js -B2 -A10
Length of output: 2476
🏁 Script executed:
# Also search for where frontier is used in response handling
rg -nP 'frontier.*string|frontier.*number' --type=js -C3
Length of output: 10801
🏁 Script executed:
rg -nP 'export.*function applySyncResponse|function applySyncResponse' src/domain/services/SyncProtocol.js -A25
Length of output: 1383
🏁 Script executed:
# Also check the Frontier.js module to understand what createFrontier and updateFrontier do
rg -nP 'export.*(?:createFrontier|updateFrontier|Frontier)' src/domain/services/Frontier.js -A8
Length of output: 2271
🏁 Script executed:
# Check the full sync response handling to see if there's a frontier being constructed
rg -nP 'const currentFrontier|_lastFrontier' src/domain/warp/sync.methods.js -B3 -A3
Length of output: 1787
🏁 Script executed:
# Check the return type and usage of applySyncResponse more carefully
rg -nP 'result\.frontier|newFrontier' src/domain/services/SyncProtocol.js -B2 -A2
Length of output: 720
🏁 Script executed:
# Check initialization and documentation of observedFrontier in checkpoint/state creation
rg -nP 'observedFrontier.*=' src/domain/services/JoinReducer.js -B2 -A2
Length of output: 426
🏁 Script executed:
# Check what vvSerialize really contains and how observedFrontier is used after being loaded
rg -nP 'vvDeserialize|appliedVV' src/domain/services/CheckpointService.js -B2 -A5
Length of output: 4753
observedFrontier is a version vector (Map<string, number>), not a frontier (Map<string, string>)—passing it to applySyncResponseImpl is incorrect.
observedFrontier is a CRDT version vector (lamport timestamps by writerId) created via createVersionVector(), serialized with vvSerialize(), and merged via vvMerge(). The cast through unknown masks a critical semantic mismatch: you are passing version vector data where applySyncResponseImpl expects writer frontier data (current patch SHAs). This is not just a type bypass—it passes the wrong data structure entirely, which will cause incorrect state merging in the sync protocol.
Separate the concerns: either load or reconstruct the actual frontier (writerId → patchSha) from a source like the checkpoint's frontier.cbor, or track it separately in the instance state, then pass that to applySyncResponseImpl.
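A rough sketch of the separation being asked for; the _writerFrontier field and its source are assumptions rather than existing API, and the only point is that the version vector and the writer frontier stay distinct:

```js
// Version vector: writerId → lamport clock. This is what observedFrontier actually holds,
// and it should keep flowing only through vvSerialize()/vvMerge().
const versionVector = this._cachedState.observedFrontier;

// Writer frontier: writerId → tip patch SHA, tracked separately. Hypothetical source:
// a cached field populated from the checkpoint's frontier.cbor or from prior sync rounds.
/** @type {Map<string, string>} */
const writerFrontier = this._writerFrontier ?? new Map();

// Pass the writer frontier, not the version vector, into the sync protocol.
const result = applySyncResponseImpl(response, this._cachedState, writerFrontier);
```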
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/domain/warp/sync.methods.js` around lines 261 - 262, The code wrongly
casts this._cachedState.observedFrontier (a version vector of Map<string,
number>) to a writer frontier Map<string, string> and passes it into
applySyncResponseImpl; instead, obtain the actual frontier (writerId→patchSha)
and pass that. Replace the unsafe cast of observedFrontier with logic that
reconstructs/loads the real frontier: e.g., parse/deserialize the checkpoint
frontier (frontier.cbor) from this._cachedState.checkpoint or use an explicit
cached writer-frontier field on this instance, and pass that Map<string,string>
(not the VV) into applySyncResponseImpl; keep observedFrontier as the version
vector (created by createVersionVector/vvSerialize/vvMerge) and do not reuse it
for writer SHAs.
/**
 * @typedef {{ hostname: string, port: number, stop: () => void }} BunServer
 */
🧩 Analysis chain
🌐 Web query:
Bun Server stop() method signature return type Promise
💡 Result:
server.stop is typed as:
stop(closeActiveConnections?: boolean): Promise<void>;

So yes—its return type is Promise<void>. [1]
🏁 Script executed:
# Check the actual code in BunHttpAdapter.js
cat -n src/infrastructure/adapters/BunHttpAdapter.js | head -180 | tail -60
Length of output: 1835
🏁 Script executed:
# Search for BunServer typedef and stopServer usage
rg "BunServer|stopServer" --type js -B 2 -A 2Repository: git-stunts/git-warp
Length of output: 2195
stop: () => void mismatches Bun's actual async API — stopServer silently swallows errors and fires its callback early.
Bun's Server.stop() signature is stop(closeActiveConnections?: boolean): Promise<void> — not () => void. The inaccurate typedef has two downstream effects in stopServer (lines 157–171):
- Callback fires before the server is fully stopped — state.server.stop() is not awaited, so callback() on line 164 is called while the stop is still in-flight.
- Rejection is silently swallowed — the try/catch on lines 166–169 won't catch a Promise<void> rejection; the error never reaches callback(err).
🔧 Proposed fix
Update the typedef to reflect the real signature:
-/** `@typedef` {{ hostname: string, port: number, stop: () => void }} BunServer */
+/** `@typedef` {{ hostname: string, port: number, stop: (closeActiveConnections?: boolean) => Promise<void> }} BunServer */

Then update stopServer to await stop() and handle the async error properly:
-function stopServer(state, callback) {
- try {
- if (state.server) {
- state.server.stop();
- state.server = null;
- }
- if (callback) {
- callback();
- }
- } catch (err) {
- if (callback) {
- callback(err);
- }
- }
-}
+async function stopServer(state, callback) {
+ try {
+ if (state.server) {
+ await state.server.stop();
+ state.server = null;
+ }
+ if (callback) {
+ callback();
+ }
+ } catch (err) {
+ if (callback) {
+ callback(err);
+ }
+ }
+}🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/infrastructure/adapters/BunHttpAdapter.js` around lines 128 - 130, The
typedef for BunServer is wrong and stopServer misuses the async stop API: update
the BunServer typedef to declare stop(closeActiveConnections?: boolean):
Promise<void>, then change stopServer to await state.server.stop(...) and wrap
that await in try/catch so any rejection is passed to callback(err) and
callback() is only invoked after the awaited stop completes; refer to the
BunServer typedef and the stopServer function and the call to state.server.stop
in your edits.
…ialize, upgrade policy checker
- Replace 48 embedded wildcards (Array<*>, Map<string, *>, LWWRegister<*>,
Promise<{payload: *}>, etc.) with unknown or proper specific types across
31 files in src/, bin/, and infrastructure/
- Restore vvSerialize() in PatchBuilderV2 (Map→Object serialization bug)
- Collapse PersistenceReader/Writer/CheckpointPersistence → CorePersistence
- Replace z.any() with z.custom() in HttpSyncServer (see the sketch after this list)
- Upgrade ts-policy-check.js: now bans embedded wildcards and z.any()
- Add AuditReceipt and CasStore typedefs for proper return types
- Fix constructor JSDoc: required params no longer marked optional
- Bump to 11.3.1
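Illustrative sketch of the z.any() to z.custom() swap referenced in the list above; the processSyncRequest method name and the option shape are assumptions standing in for whatever HttpSyncServer actually validates:

```js
import { z } from 'zod';

// z.custom() keeps a runtime duck-type check where z.any() accepted anything.
const GraphPortSchema = z.custom(
  (value) =>
    typeof value === 'object' &&
    value !== null &&
    typeof (/** @type {Record<string, unknown>} */ (value)).processSyncRequest === 'function',
  { message: 'graph must expose a processSyncRequest() method' }
);

// Option shape is illustrative, not the real HttpSyncServer schema.
const HttpSyncServerOptionsSchema = z.object({
  graph: GraphPortSchema,
  port: z.number().int().positive().optional(),
});
```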
Actionable comments posted: 8
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (13)
src/domain/types/TickReceipt.js (1)
278-289: ⚠️ Potential issue | 🟡 Minor
Two embedded wildcard annotations in sortedReplacer remain unfixed despite PR intent.
Lines 280 and 282 contain @type {{ [x: string]: * }} — the same embedded wildcard pattern this PR targets for elimination. The ts-policy-check.js script's Rule 3 explicitly scans for and bans this pattern via the regex \{\[[\w:]+\]:\s*\*\}, so these are violations that should have been caught before merge.
♻️ Proposed fix
- /** `@type` {{ [x: string]: * }} */
+ /** `@type` {{ [x: string]: unknown }} */
  const sorted = {};
- const obj = /** `@type` {{ [x: string]: * }} */ (value);
+ const obj = /** `@type` {{ [x: string]: unknown }} */ (value);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/types/TickReceipt.js` around lines 278 - 289, The JSDoc wildcard annotations inside function sortedReplacer still use the banned pattern ({ [x: string]: * }); update both annotations to a non-wildcard type such as Record<string, unknown> (or { [x: string]: unknown }) and adjust the cast for the obj variable accordingly so the lines declaring sorted and obj use /** `@type` {Record<string, unknown>} */ instead of the wildcard form; ensure sortedReplacer, sorted, and obj references are updated consistently.src/domain/services/DagTopology.js (1)
79-85: ⚠️ Potential issue | 🟠 Major
Pre-existing: this._traversal is not guarded before use in the shas.length === 1 branch.
traversal on line 79 is a bare type-cast of this._traversal, which is undefined when the constructor is called without a traversal option. The cast suppresses the type warning but does not prevent a TypeError at line 82 when traversal.ancestors(...) is called. The multi-SHA path at line 96 has the same exposure.
Since traversal is documented as optional, a guard (or an early-throw in the constructor when traversal is needed) would prevent a confusing runtime crash.
🛡️ Proposed fix — guard before use
+ if (!this._traversal) {
+   throw new Error('DagTopology.commonAncestors requires a traversal service');
+ }
  const traversal = /** `@type` {import('./DagTraversal.js').default} */ (this._traversal);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/DagTopology.js` around lines 79 - 85, The code casts this._traversal to traversal and calls traversal.ancestors (and other traversal.* calls in the multi-SHA branch) without ensuring this._traversal is defined, leading to a runtime TypeError; add a guard that checks for this._traversal before use (either throw a clear TypeError in the DagTopology constructor when traversal is required, or check before calling traversal.ancestors/commonAncestor methods and throw a descriptive error like "DagTopology requires a traversal to perform ancestor lookups"), updating the code paths that reference traversal (the shas.length === 1 branch that uses traversal.ancestors and the multi-SHA branch that calls traversal methods) so they never call methods on an undefined this._traversal.src/domain/services/JoinReducer.js (1)
397-399: ⚠️ Potential issue | 🟡 Minor
Remaining wildcard cast value: * missed in the receipt-enabled PropSet path.
Line 398 still uses /** @type {{node: string, key: string, value: *}} */ while every other op.value annotation in this file (lines 84, 297, 350, 533) was updated to unknown as part of this PR.
🔧 Proposed fix
- outcome = propSetOutcome(state.prop, /** `@type` {{node: string, key: string, value: *}} */ (op), eventId);
+ outcome = propSetOutcome(state.prop, /** `@type` {{node: string, key: string, value: unknown}} */ (op), eventId);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/JoinReducer.js` around lines 397 - 399, The JSDoc cast for the PropSet case still uses a wildcard type for op.value; update the cast in the 'PropSet' branch so op is typed like the other updated annotations (use value: unknown) — specifically edit the cast applied when calling propSetOutcome(state.prop, /** `@type` {{node: string, key: string, value: *}} */ (op), eventId) to use value: unknown so it matches the annotations used elsewhere in this file.src/domain/services/IndexRebuildService.js (1)
269-273:⚠️ Potential issue | 🟡 MinorThe type cast is necessary; improve the source method's return type instead.
StreamingBitmapIndexBuilder.getMemoryStats()declares a generic@returns {Object}with properties documented via@propertytags. This provides no type information to type checkers, making the cast at line 269 load-bearing rather than redundant. Instead of casting at the call site, define a proper typedef in the source method:/** * `@typedef` {Object} MemoryStats * `@property` {number} estimatedBitmapBytes * `@property` {number} estimatedMappingBytes * `@property` {number} totalFlushedBytes * `@property` {number} flushCount * `@property` {number} nodeCount * `@property` {number} bitmapCount */ /** * Returns current memory statistics for monitoring and debugging. * `@returns` {MemoryStats} */ getMemoryStats() { ... }This eliminates the need for the cast in
IndexRebuildService.jsand provides proper type information across the codebase.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/IndexRebuildService.js` around lines 269 - 273, Add a proper typedef and update the return JSDoc on StreamingBitmapIndexBuilder.getMemoryStats so it returns a strongly-typed MemoryStats object (including estimatedBitmapBytes, estimatedMappingBytes, totalFlushedBytes, flushCount, nodeCount, bitmapCount) and change the method JSDoc `@returns` to {`@returns` {MemoryStats}}; then remove the inline cast in IndexRebuildService.js around builder.getMemoryStats() and rely on the new return type so onProgress receives currentMemoryBytes from stats.estimatedBitmapBytes without a type assertion.src/domain/errors/QueryError.js (1)
33-33:⚠️ Potential issue | 🟡 MinorClass-level
@property {Object} contextnot updated.Line 33 still uses
Object; line 38 was tightened toRecord<string, unknown>.📝 Proposed fix
- * `@property` {Object} context - Serializable context object with error details + * `@property` {Record<string, unknown>} context - Serializable context object with error details🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/errors/QueryError.js` at line 33, Update the JSDoc for the QueryError class to use the tightened type for the context property: change the `@property` declaration that currently reads "{Object} context" to "{Record<string, unknown>} context" so it matches the tightened type used elsewhere (e.g., the other annotation on line 38) in the QueryError class.src/domain/errors/OperationAbortedError.js (1)
13-13:⚠️ Potential issue | 🟡 MinorClass-level
@property {Object} contextnot updated.Same inconsistency as flagged in
SchemaUnsupportedError— the class-level@property(line 13) still usesObjectwhile the constructor param (line 18) was tightened toRecord<string, unknown>.📝 Proposed fix
- * `@property` {Object} context - Serializable context object for debugging + * `@property` {Record<string, unknown>} context - Serializable context object for debugging🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/errors/OperationAbortedError.js` at line 13, Update the class JSDoc in OperationAbortedError to match the tightened constructor parameter type: change the class-level `@property` for "context" from {Object} to {Record<string, unknown>} so it aligns with the constructor signature in the OperationAbortedError class; ensure the `@property` name remains "context" and no other wording changes are made.src/domain/errors/IndexError.js (1)
11-11:⚠️ Potential issue | 🟡 MinorClass-level
@property {Object} contextnot updated.Line 11 still uses
Object; line 22 was tightened toRecord<string, unknown>.📝 Proposed fix
- * `@property` {Object} context - Serializable context object for debugging + * `@property` {Record<string, unknown>} context - Serializable context object for debugging🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/errors/IndexError.js` at line 11, The JSDoc for the IndexError class still declares "@property {Object} context" but the actual type was tightened to "Record<string, unknown>" elsewhere; update the class-level JSDoc in IndexError (the "@property context" entry) to use "{Record<string, unknown>}" so the comment matches the implementation and the tightened type used later for the context property.src/domain/errors/TraversalError.js (1)
11-11:⚠️ Potential issue | 🟡 MinorClass-level
@property {Object} contextnot updated.Same pattern as other error classes — line 11 still uses the loose
Objecttype while line 22 was tightened toRecord<string, unknown>.📝 Proposed fix
- * `@property` {Object} context - Serializable context object for debugging + * `@property` {Record<string, unknown>} context - Serializable context object for debugging🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/errors/TraversalError.js` at line 11, Update the class JSDoc for TraversalError so the context property uses the tightened type: change the `@property` annotation for "context" from {Object} to {Record<string, unknown>} to match the pattern used elsewhere (e.g., the later annotation at line 22); locate the TraversalError class JSDoc and replace the type token for the "context" property accordingly.src/domain/errors/SchemaUnsupportedError.js (1)
11-11:⚠️ Potential issue | 🟡 MinorClass-level
@property {Object} contextnot updated to match the tightened constructor type.The constructor param was updated to
Record<string, unknown>(line 16), but the class-level@property {Object} context(line 11) still uses the looseObjecttype — contradicting the PR's goal. The same pattern applies toOperationAbortedError,TraversalError,QueryError,IndexError, andTrustError.📝 Proposed fix
- * `@property` {Object} context - Serializable context object for debugging + * `@property` {Record<string, unknown>} context - Serializable context object for debugging🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/errors/SchemaUnsupportedError.js` at line 11, Update the class-level JSDoc `@property` for context to match the tightened constructor type: change the `@property` {Object} context to `@property` {Record<string, unknown>} context in SchemaUnsupportedError (and apply the same change to OperationAbortedError, TraversalError, QueryError, IndexError, TrustError) so the class docblock aligns with the constructor signature that now accepts Record<string, unknown>; locate the class-level JSDoc above each error class (e.g., SchemaUnsupportedError) and replace the type there.src/domain/errors/TrustError.js (1)
19-19:⚠️ Potential issue | 🟡 MinorClass-level
@property {Object} contextdoes not match constructor parameter type.Line 19 has the stale
@property {Object} context, while line 24 correctly declares the parameter asRecord<string, unknown>. The base classWarpErroralready usesRecord<string, unknown>for this property, so the subclass JSDoc should match.📝 Proposed fix
- * `@property` {Object} context - Serializable context object with error details + * `@property` {Record<string, unknown>} context - Serializable context object with error details🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/errors/TrustError.js` at line 19, Update the TrustError JSDoc so its context property type matches the constructor and the base class: change the class-level `@property` for context from {Object} to {Record<string, unknown>} (the same type used in the TrustError constructor and WarpError) so the documented type aligns with the actual constructor parameter and base-class definition.src/domain/warp/query.methods.js (1)
69-89:⚠️ Potential issue | 🟡 MinorLeftover
@type {Record<string, any>}ingetEdgePropsbody contradicts the updated return type.The public return type at line 69 was narrowed to
Record<string, unknown>, but the internalpropsobject at line 88 still carries a@type {Record<string, any>}annotation — a missedanythat the PR set out to eliminate.🛠️ Proposed fix
- /** `@type` {Record<string, any>} */ + /** `@type` {Record<string, unknown>} */ const props = {};🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/warp/query.methods.js` around lines 69 - 89, The local JSDoc type for the internal props object in getEdgeProps is still annotated as `@type` {Record<string, any>} which contradicts the function's public return type; change the annotation on the props variable (in getEdgeProps) from Record<string, any> to Record<string, unknown> (or remove the inline `@type` and let the function return type stand) and ensure any subsequent assignments to props conform to unknown typing (adjust casts or checks if needed), referencing the props variable and getEdgeProps function to locate the change.bin/cli/commands/patch.js (1)
122-141:⚠️ Potential issue | 🟡 Minor
extractNodeIdsreturn typestring[]is unsound given the updated parameter type.With
ops: Array<Record<string, unknown>>,op.node,op.from, andop.toare eachunknown. The untypednew Set()is inferred asSet<unknown>, so[...ids].sort()producesunknown[], notstring[]. The declared return type is therefore incorrect, and any truthy non-string values would silently propagate through.🛠️ Proposed fix — add `typeof` guards before `ids.add()`
function extractNodeIds(ops) { if (!Array.isArray(ops)) { return []; } - const ids = new Set(); + /** `@type` {Set<string>} */ + const ids = new Set(); for (const op of ops) { - if (op.node) { + if (typeof op.node === 'string') { ids.add(op.node); } - if (op.from) { + if (typeof op.from === 'string') { ids.add(op.from); } - if (op.to) { + if (typeof op.to === 'string') { ids.add(op.to); } } return [...ids].sort(); }🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/patch.js` around lines 122 - 141, The function extractNodeIds currently treats op.node/op.from/op.to as unknowns and returns string[], so change it to use a Set<string> and only add values after confirming they're strings: initialize ids as new Set<string>() and for each op check e.g. const v = op.node; if (typeof v === 'string') ids.add(v); (and likewise for op.from and op.to) so the returned [...ids].sort() is actually string[] and no non-string values propagate.src/domain/services/HttpSyncServer.js (1)
108-109: ⚠️ Potential issue | 🟡 Minor
Residual @param {*} wildcards in jsonResponse and isValidSyncRequest
jsonResponse (line 108) and isValidSyncRequest (line 123) still carry @param {*} annotations. These were not touched in this PR, but they are inconsistent with the PR's stated goal of eliminating all wildcard types in src/.
Suggested narrowings:
- jsonResponse: @param {unknown} data
- isValidSyncRequest: @param {unknown} parsed
Also applies to: 122-125
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/HttpSyncServer.js` around lines 108 - 109, Update the JSDoc param types for jsonResponse and isValidSyncRequest by replacing the wildcard {*} with the narrower unknown type: change the `@param` of jsonResponse to "@param {unknown} data" and change the `@param` of isValidSyncRequest to "@param {unknown} parsed"; ensure the corresponding JSDoc blocks for the functions jsonResponse and isValidSyncRequest in HttpSyncServer.js are updated accordingly so other annotations remain unchanged.
🧹 Nitpick comments (18)
src/domain/services/WormholeService.js (1)
324-324: Redundant double-cast ontypedJson.Both a left-side JSDoc annotation and a right-side JSDoc cast are applied — only one is needed. Either form alone is sufficient for TypeScript's type checker in a
.jsfile.♻️ Proposed simplification
- const /** `@type` {Record<string, unknown>} */ typedJson = /** `@type` {Record<string, unknown>} */ (json); + const typedJson = /** `@type` {Record<string, unknown>} */ (json);🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/WormholeService.js` at line 324, The local variable declaration uses a redundant double-cast for typedJson in WormholeService.js; remove one of the annotations so only a single JSDoc type is applied (either the left-side /** `@type` {Record<string, unknown>} */ typedJson or the right-side cast (json) remains). Update the declaration for typedJson to use a single JSDoc/type cast so TypeScript still understands the type but there is no duplicate cast.src/domain/services/DagTopology.js (1)
40-40: Type cast on the default{}is inaccurate and contradicts the PR's accuracy goal.
/**@type{{ indexReader: import('./BitmapIndexReader.js').default }} */ ({})asserts that an empty object satisfies a type with a non-optionalindexReader, which it plainly does not. This is still a lie to the type system, just a narrower one than a wildcard. It also creates an inconsistency: the inline@typeon the parameter itself marksindexReaderas required, yet the default value proves it can be absent at call time.Two cleaner approaches:
Option A — Drop the default entirely (recommended if
indexReaderis always required, which the runtime check on line 41–43 implies):♻️ Proposed refactor — remove the misleading default
- constructor(/** `@type` {{ indexReader: import('./BitmapIndexReader.js').default, logger?: import('../../ports/LoggerPort.js').default, traversal?: import('./DagTraversal.js').default }} */ { indexReader, logger = nullLogger, traversal } = /** `@type` {{ indexReader: import('./BitmapIndexReader.js').default }} */ ({})) { + constructor({ indexReader, logger = nullLogger, traversal }) {The
@paramJSDoc on lines 35–39 already covers the types; the inline@typeon the destructured parameter is entirely redundant.Option B — Keep the default but use an honest optional type:
♻️ Proposed refactor — honest optional type on the default
- constructor(/** `@type` {{ indexReader: import('./BitmapIndexReader.js').default, logger?: import('../../ports/LoggerPort.js').default, traversal?: import('./DagTraversal.js').default }} */ { indexReader, logger = nullLogger, traversal } = /** `@type` {{ indexReader: import('./BitmapIndexReader.js').default }} */ ({})) { + constructor({ indexReader, logger = nullLogger, traversal } = /** `@type` {{ indexReader?: import('./BitmapIndexReader.js').default, logger?: import('../../ports/LoggerPort.js').default, traversal?: import('./DagTraversal.js').default }} */ ({})) {🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/DagTopology.js` at line 40, The inline type cast on the default empty object in the DagTopology constructor is incorrect; remove the misleading default and redundant inline JSDoc cast so the constructor signature requires the parameter (constructor({ indexReader, logger = nullLogger, traversal }) { ... }) and rely on the existing runtime checks around indexReader (lines validating presence) and the existing JSDoc param block for types; alternatively, if you want to keep a default, change the inline cast to an optional shape (e.g., /** `@type` {{ indexReader?: import('./BitmapIndexReader.js').default }} */) — but prefer removing the default per the PR suggestion so indexReader remains required.test/unit/domain/services/PatchBuilderV2.test.js (1)
611-641: Consider adding a negative test for the race-detection failure path.
The expectedParentSha mechanism is the TOCTOU guard for concurrent writers, but only the happy path (SHA matches) is covered. A test where readRef returns a different SHA than expectedParentSha would verify that the guard actually rejects/throws as intended rather than silently proceeding with a stale parent.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/services/PatchBuilderV2.test.js` around lines 611 - 641, Add a negative test for the TOCTOU guard by creating a case using PatchBuilderV2 where persistence.readRef returns a different SHA than the provided expectedParentSha; call builder.commit() and assert it rejects/throws (and that persistence.commitNodeWithTree was not invoked) to verify the race-detection path actually prevents committing with a stale parent. Use the same setup as the existing happy-path test (mock persistence, addNode('x'), etc.) but change the mocked readRef value to differ from expectedParentSha and assert commit() throws and no commitNodeWithTree call was made.src/domain/crdt/ORSet.js (1)
314-318: Internal@type {Array<[any, string[]]>}is inconsistent with the updated public return type.Line 314 now documents the return as
Array<[unknown, string[]]>, but the local variable on line 318 is still cast toArray<[any, string[]]>.anydisables typechecking on the element slot whileunknownenforces narrowing; mixing them undermines the improvement.-/** `@type` {Array<[any, string[]]>} */ +/** `@type` {Array<[unknown, string[]]>} */🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/crdt/ORSet.js` around lines 314 - 318, Update the internal JSDoc type annotation in the orsetSerialize function to match the public return type by replacing the local "@type {Array<[any, string[]]>}" with "@type {Array<[unknown, string[]]>}" so the element slot uses unknown (not any); this keeps the local variable's type consistent with the documented return signature and preserves stricter type narrowing for elements when serializing the Map-to-array conversion in orsetSerialize.src/domain/services/IndexRebuildService.js (2)
392-392: Root-cause fix: widenstoragefield type to eliminate all double-casts.This new intersection cast (
IndexStoragePort & BlobPort) follows the same forced-cast pattern already present at lines 305, 308, and 387, all becausethis.storageis declared only asIndexStoragePort. Widening the constructor'sstorageparameter (and the corresponding field) to the full intersection it must satisfy would remove every cast throughunknownacross the file.♻️ Proposed fix — widen `storage` at declaration
In the constructor JSDoc, change the
storagetype to the full intersection:- * `@param` {import('../../ports/IndexStoragePort.js').default} options.storage - Storage adapter + * `@param` {import('../../ports/IndexStoragePort.js').default & import('../../ports/BlobPort.js').default & import('../../ports/TreePort.js').default} options.storage - Storage adapterThen remove the individual casts in
_persistIndex,load, and_rebuildStreaming:- const oid = await /** `@type` {import('../../ports/BlobPort.js').default} */ (/** `@type` {unknown} */ (this.storage)).writeBlob(buffer); + const oid = await this.storage.writeBlob(buffer);- return await /** `@type` {import('../../ports/TreePort.js').default} */ (/** `@type` {unknown} */ (this.storage)).writeTree(flatEntries); + return await this.storage.writeTree(flatEntries);- const shardOids = await /** `@type` {import('../../ports/TreePort.js').default} */ (/** `@type` {unknown} */ (this.storage)).readTreeOids(treeOid); + const shardOids = await this.storage.readTreeOids(treeOid);- const indexFrontier = await loadIndexFrontier(shardOids, /** `@type` {import('../../ports/IndexStoragePort.js').default & import('../../ports/BlobPort.js').default} */ (this.storage), { codec: this._codec }); + const indexFrontier = await loadIndexFrontier(shardOids, this.storage, { codec: this._codec });🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/IndexRebuildService.js` at line 392, The storage field is typed too narrowly, causing repeated forced intersection casts; update the constructor JSDoc so the constructor parameter and the this.storage field are declared with the full intersection type (IndexStoragePort & BlobPort) and then remove the ad-hoc casts around this.storage (e.g., in usages inside _persistIndex, load, and _rebuildStreaming such as the loadIndexFrontier(...) call) so the code uses this.storage directly with the correct widened type.
53-53: Consider extracting the inline default-parameter type to a@typedef.The inline
@typeannotation in the default parameter is ~240 characters long and only covers two of the five destructured fields, making the constructor signature nearly unreadable. A named@typedefat the top of the file would be cleaner and reusable.♻️ Proposed refactor
Add above the class:
+/** + * `@typedef` {{ graphService: { iterateNodes: (opts: { ref: string, limit: number }) => AsyncIterable<{ sha: string, parents: string[] }> }, storage: import('../../ports/IndexStoragePort.js').default }} IndexRebuildServiceOptions + */Then simplify the constructor signature:
- constructor({ graphService, storage, logger = nullLogger, codec, crypto } = /** `@type` {{ graphService: { iterateNodes: (opts: { ref: string, limit: number }) => AsyncIterable<{ sha: string, parents: string[] }> }, storage: import('../../ports/IndexStoragePort.js').default }} */ ({})) { + constructor({ graphService, storage, logger = nullLogger, codec, crypto } = /** `@type` {IndexRebuildServiceOptions} */ ({})) {🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/IndexRebuildService.js` at line 53, Extract the long inline JSDoc for the constructor default parameter into a named typedef (e.g., IndexRebuildServiceDeps) placed at the top of the file, declaring the five properties: graphService, storage, logger, codec, and crypto (with the appropriate types for graphService.iterateNodes and storage). Then update the IndexRebuildService constructor signature to use the new typedef in its JSDoc (replace the inline `@type`(...) with `@type` {IndexRebuildServiceDeps} or similar) and keep the same destructured defaults (logger = nullLogger) so behavior is unchanged; ensure the typedef name is referenced in the constructor JSDoc and that the class name IndexRebuildService and constructor remain unchanged.scripts/ts-policy-check.js (1)
50-86:check()has no I/O error handling — unreadable files produce an unhandled rejection.
readdir/readFilefailures will crash the process with a raw Node.js stack trace instead of a clear policy-check error message. Wrapping in a top-level try/catch (or adding a.catchat the call site) gives meaningful output in CI.♻️ Proposed fix
-check(); +check().catch((err) => { + console.error(`TS policy check failed unexpectedly: ${err.message}`); + process.exit(2); +});🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@scripts/ts-policy-check.js` around lines 50 - 86, The check() function currently has no I/O error handling so readFile/walkJs errors cause unhandled rejections; wrap the body of check() (or at least the for-await file loop and the readFile call) in a try/catch to catch filesystem errors from readFile/walkJs, then convert them into a clear policy-check error (e.g., push a formatted violation or log a descriptive message including err.message and the affected path using the same violations array) and ensure the process exits non-zero or the caller receives the failure; reference function check(), the readFile call and the for-await loop over walkJs so the catch is placed around those operations.src/domain/errors/ForkError.js (1)
24-29:@property {Object} contextnot updated to match the constructor and parent class.Line 24 still documents
contextas{Object}, while the constructor param (line 29) now usesRecord<string, unknown>and the parentWarpErroris annotated the same way.♻️ Proposed fix
-* `@property` {Object} context - Serializable context object with error details +* `@property` {Record<string, unknown>} context - Serializable context object with error details🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/errors/ForkError.js` around lines 24 - 29, Update the JSDoc for the ForkError class so the `@property` type for "context" matches the constructor and parent WarpError (use Record<string, unknown> instead of Object); locate the ForkError class and its JSDoc block and change the context annotation to "@property {Record<string, unknown>} context" so the documentation and the constructor signature (the constructor param in ForkError) are consistent with WarpError.bin/cli/commands/path.js (1)
82-87: Optional: preserveErrorin the intersection cast.After
error instanceof Error, the subsequent cast/**@type{{code?: string}} */ (error)narrows away theErrortype. Using an intersection type retains both:♻️ Proposed refinement
- if (error instanceof Error && /** `@type` {{code?: string}} */ (error).code === 'NODE_NOT_FOUND') { + if (error instanceof Error && /** `@type` {Error & {code?: string}} */ (error).code === 'NODE_NOT_FOUND') {🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/path.js` around lines 82 - 87, The catch block narrows away the Error type by casting error to {{code?: string}}; preserve the original Error by using an intersection type when checking the code property. Replace the cast /** `@type` {{code?: string}} */ (error) with an intersection cast that keeps Error (for example /** `@type` {Error & {code?: string}} */ (error) or the TS equivalent) so the instance check (error instanceof Error) and the use of notFoundError(error.message) remain type-safe.bin/cli/commands/check.js (1)
140-140:buildCheckPayload(line 98) retains multiple*wildcard annotations.The return type on
handleCheckwas tightened to{payload: unknown, …}, but thebuildCheckPayloadJSDoc parameter still useshealth: *, checkpoint: *, gcMetrics: *, hook: *|null, status: *|null. This leaves several unresolved wildcards in the same call chain, contrary to the PR's stated goal.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/check.js` at line 140, The JSDoc for buildCheckPayload still uses wildcard "*" types which conflicts with the tightened handleCheck return type; update buildCheckPayload's parameter and return annotations to concrete types (e.g., use unknown for health, checkpoint, gcMetrics and use unknown|null for hook and status) and ensure its `@returns` matches {Promise<{payload: unknown, exitCode: number}>} (or the synchronous equivalent) so the call chain between buildCheckPayload and handleCheck has consistent, non-wildcard typing; adjust annotations on the buildCheckPayload function signature and any related param tags to reference buildCheckPayload, handleCheck, and their payload/exitCode shapes.src/domain/services/BoundaryTransitionRecord.js (1)
395-414:deserializeBTRfield validation doesn't check individual field types before casting toBTR.The required-fields loop (lines 400–404) only verifies field presence (
field in obj), not types. For example,versioncould be a string,h_incould be an array, and the cast on line 406 would still succeed silently. This is pre-existing behavior, but now that you've typed the decoded object asRecord<string, unknown>, the gap is more visible.Consider reusing
validateBTRStructure(which already checksbtr.version !== BTR_VERSION) or adding lightweight type guards before the cast to prevent malformed BTRs from propagating.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/BoundaryTransitionRecord.js` around lines 395 - 414, deserializeBTR only checks that REQUIRED_FIELDS exist but doesn't validate their types before casting to BTR, so malformed values can slip through; update deserializeBTR to perform lightweight type checks (or call the existing validateBTRStructure) after decoding and before returning: use the same validation logic as validateBTRStructure (including checking btr.version === BTR_VERSION and validating primitives/arrays for h_in, h_out, U_0, P, t, kappa) or implement small type guards for each field, and ensure you still use codec/defaultCodec as currently done.src/domain/services/CheckpointSerializerV5.js (2)
220-237:valproperty access inelsebranch lacks a type guard.In the
elsebranch (line 232–233),valisunknown(coming from the now-Record<string, unknown>obj), but.lamport,.writerId, etc. are accessed without narrowing. This is pre-existing and works at runtime because CBOR decoded data has the expected shape, but it's inconsistent with the stricter typing introduced elsewhere in this file.A minimal guard like
typeof val === 'object' && val !== nullwould align this path with the PR's goals.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/CheckpointSerializerV5.js` around lines 220 - 237, In deserializeEdgeBirthEvent, the else branch accesses properties on val (val.lamport, val.writerId, val.patchSha, val.opIndex) without narrowing its unknown type; add a runtime type guard (e.g. check typeof val === 'object' && val !== null and that expected keys exist/are of correct primitives) before reading those properties in deserializeEdgeBirthEvent, then shallow-copy from the validated object (or skip/handle the entry if validation fails) so you don't access properties on non-objects and remain consistent with the stricter typing.
109-113: Cast tostringis technically unsound for the error diagnostic.
obj.versionpassed the!== undefined && !== 'full-v5'check, so it could be a number, boolean, or object — not necessarily a string. The/**@type{string} */cast is incorrect in that case. The template literal will still calltoString()so runtime behavior is fine, but the cast is misleading.💡 Suggested alternative
- const ver = /** `@type` {string} */ (obj.version); + const ver = String(obj.version);🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/CheckpointSerializerV5.js` around lines 109 - 113, Remove the incorrect JSDoc string cast on obj.version in the error path of CheckpointSerializerV5 and instead coerce the value to a safe string for the message (e.g., via String(...) or JSON.stringify(...)) so any type (number, boolean, object) is represented reliably; update the throw in the same block that currently checks if (obj.version !== undefined && obj.version !== 'full-v5') to interpolate the coerced value when constructing the Error to replace the /** `@type` {string} */ (obj.version) cast.test/unit/domain/trust/TrustEvaluator.test.js (1)
121-123: Two@param {*}wildcards remain on lines 121–122.These
.mapcallbacks still use/**@param{*} e */, which is inconsistent with the PR's goal of eliminating wildcard types. They could be updated toRecord<string, unknown>(matching lines 171–172) or to the actual explanation shape.🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/unit/domain/trust/TrustEvaluator.test.js` around lines 121 - 123, Replace the JSDoc wildcard param in the map callbacks used on a1.trust.explanations and a2.trust.explanations (the callbacks that extract .writerId) so they no longer use /** `@param` {*} e */; change the type to a non-wildcard such as /** `@param` {Record<string, unknown>} e */ or the actual explanation shape used elsewhere in the test (matching the type used on lines 171–172) so both .map callbacks have a concrete param type instead of {*} and remain consistent with the rest of the file.bin/cli/commands/verify-audit.js (1)
55-57: Double cast through unknown to CorePersistence may mask type drift.
The /** @type {unknown} */ → /** @type {CorePersistence} */ pattern bypasses all structural checks. If createPersistence returns a type that diverges from CorePersistence in the future, this cast will silently hide the mismatch.
Consider adding an adapter or a narrowing assertion function (e.g., assertCorePersistence(persistence)) that validates the required methods exist at runtime, consistent with the duck-typing patterns elsewhere in the PR.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@bin/cli/commands/verify-audit.js` around lines 55 - 57, Replace the unsafe double cast of persistence to CorePersistence by adding a runtime narrowing/assertion or adapter and using that in the AuditVerifierService constructor: implement a function like assertCorePersistence(persistence) (or build an adapter wrapCorePersistence(persistence)) that checks the required methods/properties (duck-typed) and either throws or returns a properly-typed object, then pass the asserted/adapter value to new AuditVerifierService instead of using /** `@type` {unknown} */ casts; reference the persistence variable and the AuditVerifierService constructor call when applying this change.src/domain/trust/TrustStateBuilder.js (1)
50-55: The/**@type{string} */cast makes the??fallback invisible to the type checker.Casting
record.recordIdtostringtells TypeScript it's always a string, so?? '(unknown)'appears unreachable from TS's perspective. At runtime it works correctly, but if the intent is to handle a missingrecordId, a more accurate approach would be:💡 Suggestion
- recordId: /** `@type` {string} */ (record.recordId) ?? '(unknown)', + recordId: typeof record.recordId === 'string' ? record.recordId : '(unknown)',🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/trust/TrustStateBuilder.js` around lines 50 - 55, The inline JSDoc cast /** `@type` {string} */ on record.recordId hides the possibility of null/undefined from the type checker; change to compute a local recordId first (e.g., const recordId = record.recordId ?? '(unknown)') and then use that variable in the errors.push call so the fallback is visible to TypeScript and runtime behavior is preserved (update the code around parsed.success handling in TrustStateBuilder.js where errors.push references record.recordId).src/domain/trust/TrustRecordService.js (1)
262-262: String casts are safe here but rely on upstream schema validation — consider a defensive guard.Both
record.recordTypeandrecord.recordIdare cast tostringvia JSDoc. This is safe because_persistRecordis only called after schema validation (line 55) and recordId integrity checks (line 64) inappendRecord. However, since_persistRecordis aprivatemethod that could theoretically be called from a new code path in the future, you might consider a lightweight assertion or fallback (e.g.,String(record.recordId)).This is a minor observation — the current code is correct for all existing call sites.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/trust/TrustRecordService.js` at line 262, The template literal building the log `message` in TrustRecordService::_persistRecord currently relies on JSDoc casts for record.recordType and record.recordId; add a defensive guard so unexpected types won't break future callers — coerce both values with String(...) or a fallback like (record.recordType ?? 'unknown') and (record.recordId ?? '') before slicing, or assert types at the top of _persistRecord; adjust the `message` construction to use those guarded/coerced values so appendRecord's upstream validation remains sufficient but _persistRecord is resilient to new call paths.src/domain/services/AuditVerifierService.js (1)
319-319: Consider typingprevReceiptasAuditReceipt|nullinstead ofRecord<string, unknown>|null.
prevReceiptis assigned fromreceipt(line 446), which is cast toAuditReceipt. UsingAuditReceipt|nullwould provide stronger type information for_validateChainLinkcallers and is more precise.Suggested change
- /** `@type` {Record<string, unknown>|null} */ let prevReceipt = null; + /** `@type` {AuditReceipt|null} */ let prevReceipt = null;🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/domain/services/AuditVerifierService.js` at line 319, The declaration for prevReceipt uses a loose type; change its JSDoc from Record<string, unknown>|null to AuditReceipt|null so callers of _validateChainLink get stronger typing—update the JSDoc where prevReceipt is declared and ensure any assignment (e.g., assigning receipt cast as AuditReceipt) remains compatible with the new AuditReceipt|null annotation.
// Rule 3: Embedded wildcards in generic JSDoc types
// Catches: Record<string, *>, Array<*>, Map<string, *>, {[k:string]: *}
// Does NOT match inside import() paths or comments unrelated to @type.
const EMBEDDED_WILDCARD_RE = /(?:@type|@param|@returns|@typedef)\s.*(?:<[^>]*\*[^>]*>|\{\[[\w:]+\]:\s*\*\})/;
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Check whether `@return` (without 's') appears in the codebase with parameterized JSDoc types
rg -n --type js '@return\s+\{[^}]*<' src/ bin/ scripts/
Length of output: 45
🏁 Script executed:
#!/bin/bash
# Also check for `@return` usage more broadly (with any types)
rg -n --type js '@return\s+\{' src/ bin/ scripts/
Length of output: 45
🏁 Script executed:
#!/bin/bash
# Check the actual file to understand the regex and context better
head -50 scripts/ts-policy-check.js
Length of output: 1779
🏁 Script executed:
#!/bin/bash
# Check if `@return` (singular) is used anywhere in the codebase
rg -n --type js '@return\b' src/ bin/ scripts/
🏁 Script executed:
#!/bin/bash
# Check for `@property` usage with parameterized types
rg -n --type js '@property\s+\{[^}]*<' src/ bin/ scripts/
🏁 Script executed:
#!/bin/bash
# Verify the comment claim: does the regex actually handle import() paths correctly?
# Test the regex pattern manually with a few examples
node <<'EOF'
const EMBEDDED_WILDCARD_RE = /(?:@type|@param|@returns|@typedef)\s.*(?:<[^>]*\*[^>]*>|\{\[[\w:]+\]:\s*\*\})/;
const testCases = [
'@typedef {import("./foo").Type<string, *>} MyType',
'@type {Record<string, *>}',
'@param {Array<*>} arr',
'@returns {Map<string, *>}',
'@return {Array<*>} value',
];
console.log('Testing EMBEDDED_WILDCARD_RE pattern:\n');
testCases.forEach(test => {
const matches = EMBEDDED_WILDCARD_RE.test(test);
console.log(`"${test}"`);
console.log(` → Matched: ${matches}\n`);
});
EOF
Missing @return (singular) tag and inaccurate comment in EMBEDDED_WILDCARD_RE.
@return {Array<*>} is a false negative — the regex only checks @returns (with 's'), but JSDoc and TypeScript accept both @return and @returns. Additionally, the comment claiming the regex "Does NOT match inside import() paths" is inaccurate; @typedef {import('./foo').Type<string, *>} would still match because the wildcard detection <[^>]*\*[^>]*> applies regardless of context.
The codebase does not currently use @return or wildcard types in @property tags, but this gap in coverage should be closed for forward compatibility.
🔧 Proposed fix
// Rule 3: Embedded wildcards in generic JSDoc types
// Catches: Record<string, *>, Array<*>, Map<string, *>, {[k:string]: *}
-// Does NOT match inside import() paths or comments unrelated to @type.
-const EMBEDDED_WILDCARD_RE = /(?:@type|@param|@returns|@typedef)\s.*(?:<[^>]*\*[^>]*>|\{\[[\w:]+\]:\s*\*\})/;
+const EMBEDDED_WILDCARD_RE = /(?:@type|@param|@returns?|@typedef|@property)\s.*(?:<[^>]*\*[^>]*>|\{\[[\w:]+\]:\s*\*\})/;
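As a quick sanity check of the widened rule, a standalone sketch (the constant name and sample lines are illustrative, not drawn from the repo's test suite):

// Widened rule: optional 's' on @returns plus the @property tag.
const WIDENED_RE =
  /(?:@type|@param|@returns?|@typedef|@property)\s.*(?:<[^>]*\*[^>]*>|\{\[[\w:]+\]:\s*\*\})/;
const samples = [
  '@return {Array<*>} value',                           // previously missed: singular tag
  '@property {Map<string, *>} cache',                   // previously missed: @property
  '@typedef {import("./foo").Type<string, *>} MyType',  // import() generics still match
  '@returns {string[]} names',                          // clean line: no wildcard
];
for (const line of samples) {
  console.log(WIDENED_RE.test(line), line);             // true, true, true, false
}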
Verify each finding against the current code and only fix it if needed.
In `@scripts/ts-policy-check.js` around lines 42 - 45, The EMBEDDED_WILDCARD_RE
currently only matches "@returns" and omits "@return" and "@property" JSDoc tags
and its comment incorrectly claims it won't match inside import() paths; update
the regex in EMBEDDED_WILDCARD_RE to accept both "@return" and "@returns" and to
include "@property" (e.g. allow the tag token to be
`@type`|@param|@return(?:s)?|@returns|@typedef|@property) so wildcard generics
like "@return {Array<*>}" are detected, and revise the surrounding comment to
remove the incorrect "Does NOT match inside import() paths" claim (note that
import() generics can match the wildcard pattern).
 *
 * @param {ORSet} set - The ORSet
-* @returns {Array<*>} Array of present elements
+* @returns {string[]} Array of present elements
🧩 Analysis chain
🏁 Script executed:
# First, let's read the ORSet.js file to see the actual annotations and structure
cat -n src/domain/crdt/ORSet.js | head -200
🏁 Script executed:
# Continue reading the file to see the serialize/deserialize functions around lines 310-355
sed -n '300,360p' src/domain/crdt/ORSet.js | cat -n
🏁 Script executed:
# Now let's search for all call sites of orsetAdd to see what types of elements are actually passed
rg -n --type=js 'orsetAdd\(' -A 2 -B 2
🏁 Script executed:
# Let me check if there are any non-string elements being added. Let me search for what gets passed to orsetAdd more carefully
rg 'orsetAdd\(state\.(nodeAlive|edgeAlive),' -A 1 | head -50
🏁 Script executed:
# Let me check what encodeEdgeKey returns
rg 'export.*encodeEdgeKey|function encodeEdgeKey' -A 3
🏁 Script executed:
# Let me read the exact lines mentioned to verify the inconsistency between line 318 and 314
sed -n '309,325p' src/domain/crdt/ORSet.js | cat -n
string[] return type on line 171 is too narrow for orsetElements.
The ORSet typedef declares entries: Map<*, Set<string>> (line 94), allowing elements of any type. The orsetElements function returns elements from set.entries.keys(), which are typed as *, so the return type should be unknown[] to match the actual element type:
-* @returns {string[]} Array of present elements
+* @returns {unknown[]} Array of present elements
Additionally, there is a type consistency issue in orsetSerialize: line 314's public return type uses Array<[unknown, string[]]>, but line 318's internal variable annotation uses Array<[any, string[]]>. The internal annotation should use unknown instead of any for consistency.
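For context on why unknown[] is the safer return type here, a small standalone sketch; elementsStub stands in for orsetElements and is not repo code:

// Stand-in for orsetElements(set): ORSet entries are keyed by arbitrary values,
// so a non-string element can legitimately appear.
/** @returns {unknown[]} */
function elementsStub() {
  return ['node:a', 'node:b', 42];
}

for (const el of elementsStub()) {
  // With unknown[], checkJs forces narrowing before any string operation;
  // with string[], the 42 above would flow through to toUpperCase() unchecked.
  if (typeof el === 'string') {
    console.log(el.toUpperCase());
  }
}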
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
-* @returns {string[]} Array of present elements
+* @returns {unknown[]} Array of present elements
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/domain/crdt/ORSet.js` at line 171, Update the JSDoc/type annotations to
use unknown for ORSet element types: change the return type of orsetElements
from string[] to unknown[] to match the ORSet typedef entries: Map<*,
Set<string>> and the values returned by set.entries.keys(); and in
orsetSerialize replace the internal variable annotation Array<[any, string[]]>
with Array<[unknown, string[]]> so its internal type matches the declared public
return type Array<[unknown, string[]]> and remains consistent with ORSet's
element typing.
* @property {*} expected - The expected value for the field
* @property {*} actual - The actual value found in the shard
Remaining {*} wildcards contradict the PR goal of eliminating all wildcards.
expected and actual are still annotated as {*} in both the @property declarations (lines 15–16) and the constructor @param inline type (line 35), while the rest of the file has been updated.
🔧 Proposed fix
-* @property {*} expected - The expected value for the field
-* @property {*} actual - The actual value found in the shard
+* @property {unknown} expected - The expected value for the field
+* @property {unknown} actual - The actual value found in the shard
-* @param {{ shardPath?: string, expected?: *, actual?: *, field?: string, context?: Record<string, unknown> }} [options={}] - Error options
+* @param {{ shardPath?: string, expected?: unknown, actual?: unknown, field?: string, context?: Record<string, unknown> }} [options={}] - Error options
Also applies to: 35-35
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/domain/errors/ShardValidationError.js` around lines 15 - 16, Update the
JSDoc types for the ShardValidationError properties and constructor parameters
to remove the {*} wildcards: replace the `@property` annotations for expected and
actual and the constructor inline `@param` type (in the ShardValidationError
constructor) with a concrete non-wildcard type such as {unknown} (or a more
specific union if you know the possible shapes), and ensure the names expected
and actual in the JSDoc match the constructor parameter names used in the
ShardValidationError class.
/**
 * @typedef {{ graph: string, writer: string, dataCommit: string, opsDigest: string, schema: number, prevAuditCommit: string, writerId: string, graphName: string, tick: number }} AuditReceipt
 */
🧩 Analysis chain
🏁 Script executed:
# Get the AuditReceipt typedef and surrounding context
sed -n '15,30p' src/domain/services/AuditVerifierService.js
🏁 Script executed:
# Find validateReceiptSchema definition and its implementation
rg -n "validateReceiptSchema" src/domain/services/AuditVerifierService.js -A 5 -B 2 | head -50Repository: git-stunts/git-warp
Length of output: 641
🏁 Script executed:
# Find all receipt field accesses to see what fields are actually used
rg -n "receipt\.\w+" src/domain/services/AuditVerifierService.js | head -30Repository: git-stunts/git-warp
Length of output: 2284
🏁 Script executed:
# Find the _readReceipt return type and decodedTrailers structure
sed -n '460,475p' src/domain/services/AuditVerifierService.js
🏁 Script executed:
# Find validateReceiptSchema function definition (likely in another file)
fd "." --type f -name "*.js" | xargs rg -l "validateReceiptSchema" | grep -v AuditVerifierServiceRepository: git-stunts/git-warp
Length of output: 338
🏁 Script executed:
# Get the complete validateReceiptSchema function to see all required fields
sed -n '77,125p' src/domain/services/AuditVerifierService.js
🏁 Script executed:
# Find where AuditReceipt is used with type cast to confirm the issue
rg -n "@type.*AuditReceipt" src/domain/services/AuditVerifierService.js -B 2 -A 2Repository: git-stunts/git-warp
Length of output: 479
🏁 Script executed:
# Check the AUDIT_RECEIPT spec mentioned in the comment
head -100 docs/specs/AUDIT_RECEIPT.md
AuditReceipt typedef fields don't match the actual receipt schema.
The typedef declares { graph, writer, dataCommit, opsDigest, schema, prevAuditCommit, writerId, graphName, tick }, but validateReceiptSchema (lines 85-87) requires exactly these 9 fields: dataCommit, graphName, opsDigest, prevAuditCommit, tickEnd, tickStart, timestamp, version, writerId. The typedef is missing tickEnd, tickStart, timestamp, and version, and incorrectly includes graph, writer, schema, and tick. This mismatch is confirmed by the AUDIT_RECEIPT.md spec (Section 3) and by actual field accesses in validation code (e.g., lines 94-122 check receipt.version, receipt.tickStart, receipt.tickEnd, receipt.timestamp).
The type cast at line 509 applies this incorrect typedef to the decoded CBOR receipt, defeating the type-safety goal of this PR.
🛠️ Suggested fix
-/**
- * @typedef {{ graph: string, writer: string, dataCommit: string, opsDigest: string, schema: number, prevAuditCommit: string, writerId: string, graphName: string, tick: number }} AuditReceipt
- */
+/**
+ * @typedef {Object} AuditReceipt
+ * @property {string} dataCommit
+ * @property {string} graphName
+ * @property {string} opsDigest
+ * @property {string} prevAuditCommit
+ * @property {number} tickEnd
+ * @property {number} tickStart
+ * @property {number} timestamp
+ * @property {number} version
+ * @property {string} writerId
+ */
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
-/**
- * @typedef {{ graph: string, writer: string, dataCommit: string, opsDigest: string, schema: number, prevAuditCommit: string, writerId: string, graphName: string, tick: number }} AuditReceipt
- */
+/**
+ * @typedef {Object} AuditReceipt
+ * @property {string} dataCommit
+ * @property {string} graphName
+ * @property {string} opsDigest
+ * @property {string} prevAuditCommit
+ * @property {number} tickEnd
+ * @property {number} tickStart
+ * @property {number} timestamp
+ * @property {number} version
+ * @property {string} writerId
+ */
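Beyond correcting the typedef, a hedged sketch of a runtime guard derived from the same nine-field list; isAuditReceipt is hypothetical and not part of the PR:

/**
 * Hypothetical guard mirroring the field list that validateReceiptSchema enforces.
 * @param {unknown} value
 * @returns {boolean} true only when every required receipt field has the expected primitive type
 */
function isAuditReceipt(value) {
  if (typeof value !== 'object' || value === null) {
    return false;
  }
  const r = /** @type {Record<string, unknown>} */ (value);
  const stringFields = ['dataCommit', 'graphName', 'opsDigest', 'prevAuditCommit', 'writerId'];
  const numberFields = ['tickStart', 'tickEnd', 'timestamp', 'version'];
  return stringFields.every((f) => typeof r[f] === 'string')
    && numberFields.every((f) => typeof r[f] === 'number');
}

console.log(isAuditReceipt({ graph: 'g', tick: 1 })); // false: old field names do not satisfy the guard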
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/domain/services/AuditVerifierService.js` around lines 19 - 21, The
AuditReceipt JSDoc typedef is out-of-sync with the actual receipt schema used by
validateReceiptSchema and other code; update the AuditReceipt typedef to exactly
list dataCommit, graphName, opsDigest, prevAuditCommit, tickStart, tickEnd,
timestamp, version, and writerId (remove graph, writer, schema, tick) so the
type matches validateReceiptSchema and the CBOR decode cast at the bottom of the
file; ensure the typedef name is unchanged and referenced locations
(validateReceiptSchema and the cast of decoded CBOR receipt) now align with the
corrected field names.
httpPort: z.custom(
  (v) => v !== null && v !== undefined && typeof v === 'object',
  'httpPort is required',
),
graph: z.custom(
  (v) => v !== null && v !== undefined && typeof v === 'object',
  'graph is required',
),
🧩 Analysis chain
🌐 Web query:
Zod v3 z.custom generic type parameter TypeScript
💡 Result:
In Zod v3, z.custom is generic: you pass the TypeScript type you want Zod to infer as the schema’s output type:
import { z } from "zod";
type Px = `${number}px`;
const pxSchema = z.custom<Px>((val) => {
return typeof val === "string" && /^\d+px$/.test(val);
});
type Px2 = z.infer<typeof pxSchema>; // `${number}px`
That <Px> generic type parameter is what drives z.infer<typeof pxSchema> (and the output type of .parse). (odocs-zod.vercel.app)
If you want stronger typing inside the validator, you can also write it as a type guard:
const pxSchema = z.custom<Px>((val): val is Px =>
typeof val === "string" && /^\d+px$/.test(val)
);
Also note: if you call z.custom<T>() without a check function, it "validates" anything (so you only get typing, not runtime safety). (odocs-zod.vercel.app)
Citations:
- 1: https://odocs-zod.vercel.app/?utm_source=openai
- 2: https://odocs-zod.vercel.app/?utm_source=openai
🏁 Script executed:
# First, let's check the actual code being reviewed
head -n 60 src/domain/services/HttpSyncServer.js | tail -n 30
🏁 Script executed:
# Also check what version of Zod is being used
cat package.json | grep -A2 -B2 '"zod"'
🏁 Script executed:
# Check the full context around the schema definition
sed -n '1,80p' src/domain/services/HttpSyncServer.js
🏁 Script executed:
# Check the allowedWriters validation logic (lines 46-53)
sed -n '40,55p' src/domain/services/HttpSyncServer.js
🏁 Script executed:
# Check for JSDoc duplication around initAuth (lines 224-235)
sed -n '220,245p' src/domain/services/HttpSyncServer.js
🏁 Script executed:
# Check lines 108-109 and 122-125 for residual `@param` {*}
sed -n '105,130p' src/domain/services/HttpSyncServer.js
🏁 Script executed:
# Let's try to find what the default type is for z.custom() without generic
# Check if there are any imports or type definitions that might clarify this
rg "z\.custom" src/domain/services/HttpSyncServer.js -A 2 -B 2Repository: git-stunts/git-warp
Length of output: 702
🏁 Script executed:
# Look for how httpPort and graph are typed after parsing
# Check the constructor to see how parsed values are used
sed -n '250,280p' src/domain/services/HttpSyncServer.js
🏁 Script executed:
# Check how _httpPort and _graph are actually used downstream (lines 344, 364)
sed -n '340,370p' src/domain/services/HttpSyncServer.js
🏁 Script executed:
# Let's also verify the actual TypeScript behavior by checking if there are type checking errors
# Look for any type annotations on the instance properties
rg "this\._httpPort|this\._graph" src/domain/services/HttpSyncServer.js -B 2 -A 2Repository: git-stunts/git-warp
Length of output: 650
🌐 Web query:
Zod v3 z.custom default type inference when generic omitted unknown
💡 Result:
In Zod v3, z.custom() is intentionally generic, and if you omit the generic type argument, TypeScript has nothing to infer T from (unless you supply a contextual type), so the output type falls back to unknown (TypeScript’s fallback when inference fails and there’s no better default/constraint). [1], [2]
What to do
- Specify the generic explicitly (the normal/expected usage):
const Px = z.custom<`${number}px`>((val) =>
typeof val === "string" && /^\d+px$/.test(val)
);
type Px = z.infer<typeof Px>; // `${number}px`[1]
- If you want a different “default type” when you omit the generic, Zod/TS can’t change that automatically; instead wrap it:
const customAny = (check?: Parameters<typeof z.custom>[0]) => z.custom<any>(check);
(Or pick your preferred default type instead of any.) [2]
Why it can’t infer: your callback usually returns boolean, and TypeScript can’t reverse-infer a type parameter from that; when inference fails, it falls back to unknown. [2]
Sources: Zod docs on z.custom() usage [1]; TypeScript generic inference fallback explanation [2].
z.custom() without a type parameter yields unknown, erasing type safety for httpPort and graph
z.custom() without a generic parameter defaults to ZodType<unknown>, so z.infer<typeof optionsSchema> produces httpPort: unknown and graph: unknown. This means this._httpPort and this._graph are both typed unknown after the constructor assignment, which strips type safety from every downstream call site — notably this._graph.processSyncRequest(parsed) (line 344) and this._httpPort.createServer(...) (line 364).
Fix: supply explicit generic type parameters to preserve the expected shapes.
Additionally, both error messages ('httpPort is required', 'graph is required') are misleading for callers who supply a value of the wrong type (e.g. httpPort: 42) — consider splitting presence and type checks or using a more descriptive message like 'httpPort must be a non-null object'.
Edge case: allowedWriters validation at line 46 — empty array [] is truthy in JavaScript, so the check if (data.allowedWriters && !data.auth) will incorrectly trigger the error for allowedWriters: [] without auth. Consider if (data.allowedWriters?.length > 0 && !data.auth) instead.
Duplicate JSDoc blocks before initAuth (lines 224–235) — the first block (lines 224–231) has stale crypto?: * and logger?: * wildcards and is overridden by the second block just below it. Remove the first block or merge the parameter descriptions.
Residual @param {*} wildcards — jsonResponse (line 108) and isValidSyncRequest (line 122) still use @param {*}, which contradicts the PR goal of eliminating all wildcards. Add explicit types.
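To see the unknown fallback concretely before the fix below, a runnable sketch against a simplified schema (assuming Zod v3, the version the analysis above assumes; this is not the real optionsSchema):

import { z } from 'zod';

// Without a type argument, z.custom() is typed as ZodType<unknown>, so the
// parsed field comes back as unknown and its members are not checked.
const loose = z.object({
  graph: z.custom((v) => v !== null && typeof v === 'object', 'graph must be a non-null object'),
});

const parsed = loose.parse({ graph: { processSyncRequest: async () => ({}) } });
// Under checkJs, parsed.graph is unknown, so a call like
// parsed.graph.processSyncRequest(request) needs a cast or guard first.
console.log(typeof parsed.graph); // 'object'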
♻️ Proposed fix for httpPort/graph
- httpPort: z.custom(
- (v) => v !== null && v !== undefined && typeof v === 'object',
- 'httpPort is required',
- ),
- graph: z.custom(
- (v) => v !== null && v !== undefined && typeof v === 'object',
- 'graph is required',
- ),
+ httpPort: z.custom<import('../../ports/HttpServerPort.js').default>(
+ (v) => v !== null && v !== undefined && typeof v === 'object',
+ 'httpPort must be a non-null object',
+ ),
+ graph: z.custom<{ processSyncRequest: Function }>(
+ (v) => v !== null && v !== undefined && typeof v === 'object',
+ 'graph must be a non-null object',
+ ),
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
- httpPort: z.custom(
-   (v) => v !== null && v !== undefined && typeof v === 'object',
-   'httpPort is required',
- ),
- graph: z.custom(
-   (v) => v !== null && v !== undefined && typeof v === 'object',
-   'graph is required',
- ),
+ httpPort: z.custom<import('../../ports/HttpServerPort.js').default>(
+   (v) => v !== null && v !== undefined && typeof v === 'object',
+   'httpPort must be a non-null object',
+ ),
+ graph: z.custom<{ processSyncRequest: Function }>(
+   (v) => v !== null && v !== undefined && typeof v === 'object',
+   'graph must be a non-null object',
+ ),
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/domain/services/HttpSyncServer.js` around lines 33 - 40, optionsSchema
currently uses z.custom(...) without generic type parameters so z.infer gives
httpPort: unknown and graph: unknown; update those two validators to use
explicit generics (e.g. z.custom<ExpectedHttpPortType>(...) and
z.custom<ExpectedGraphType>(...)) so this._httpPort and this._graph retain
correct types, and change the validation messages to something like "httpPort
must be a non-null object" / "graph must be a non-null object" to reflect type
failures. Also change the allowedWriters guard from if (data.allowedWriters &&
!data.auth) to check length (e.g. data.allowedWriters?.length > 0 && !data.auth)
so an empty array doesn't trigger the error. Remove the duplicate JSDoc block
before initAuth (keep the correct one) and replace all `@param` {*} wildcards in
jsonResponse and isValidSyncRequest JSDoc with their explicit types to eliminate
wildcards.
}).strict().superRefine((data, ctx) => {
  if (data.allowedWriters && !data.auth) {
    ctx.addIssue({
      code: z.ZodIssueCode.custom,
      message: 'allowedWriters requires auth.keys to be configured',
      path: ['allowedWriters'],
    });
  }
allowedWriters: [] (empty array) is truthy and incorrectly triggers the superRefine error
data.allowedWriters && !data.auth evaluates true for an empty array because [] is truthy in JavaScript. A caller passing allowedWriters: [] without auth gets a misleading validation error even though an empty allowedWriters list has no practical effect.
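The truthiness trap in isolation (plain Node, no project code involved):

const data = { allowedWriters: [], auth: undefined };

// [] is truthy, so the current guard fires even though no writers are listed.
console.log(Boolean(data.allowedWriters && !data.auth)); // true -> spurious validation issue

// A length-aware guard only fires when the list actually names writers.
console.log(Boolean(data.allowedWriters?.length > 0 && !data.auth)); // false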
🐛 Proposed fix
- if (data.allowedWriters && !data.auth) {
+ if (data.allowedWriters && data.allowedWriters.length > 0 && !data.auth) {
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
 }).strict().superRefine((data, ctx) => {
-  if (data.allowedWriters && !data.auth) {
+  if (data.allowedWriters && data.allowedWriters.length > 0 && !data.auth) {
     ctx.addIssue({
       code: z.ZodIssueCode.custom,
       message: 'allowedWriters requires auth.keys to be configured',
       path: ['allowedWriters'],
     });
   }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/domain/services/HttpSyncServer.js` around lines 46 - 53, In the
superRefine validator for the schema, the condition `data.allowedWriters &&
!data.auth` treats an empty array as truthy and wrongly triggers an error;
change the check to explicitly verify a non-empty array (e.g.,
`Array.isArray(data.allowedWriters) && data.allowedWriters.length > 0 &&
!data.auth`) before calling `ctx.addIssue` so only a real list of allowedWriters
without auth causes the `z.ZodIssueCode.custom` error; update the code in the
same `superRefine` block that references `allowedWriters`, `auth`, and
`z.ZodIssueCode.custom`.
/**
 * Initializes auth service from config if present.
 *
- * @param {{ keys: Record<string, string>, mode?: 'enforce'|'log-only', crypto?: *, logger?: *, wallClockMs?: () => number }|undefined} auth
+ * @param {{ keys: Record<string, string>, mode: 'enforce'|'log-only', crypto?: *, logger?: *, wallClockMs?: () => number }|undefined} auth
 * @param {string[]} [allowedWriters]
 * @returns {{ auth: SyncAuthService|null, authMode: string|null }}
 * @private
 */
+/**
+ * @param {z.infer<typeof authSchema>} [auth]
+ * @param {string[]} [allowedWriters]
+ */
function initAuth(auth, allowedWriters) {
-  if (auth && auth.keys) {
-    const VALID_MODES = new Set(['enforce', 'log-only']);
-    const mode = auth.mode || 'enforce';
-    if (!VALID_MODES.has(mode)) {
-      throw new Error(`Invalid auth.mode: '${mode}'. Must be 'enforce' or 'log-only'.`);
-    }
-    return { auth: new SyncAuthService({ ...auth, allowedWriters }), authMode: mode };
+  if (auth) {
+    return { auth: new SyncAuthService({ ...auth, allowedWriters }), authMode: auth.mode };
  }
  return { auth: null, authMode: null };
}
Duplicate JSDoc blocks for initAuth — first block retains wildcard * types
There are two consecutive JSDoc blocks immediately before function initAuth. TypeScript associates only the last (closest) block with the function, so the first block (lines 224–231) is dead documentation. More critically, that first block still uses crypto?: * and logger?: * wildcards — exactly the types this PR aims to eliminate. It should be removed and its useful content (description, @returns, @private) merged into the second block.
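A quick way to confirm the "closest block wins" behaviour locally, using a scratch file rather than the repo (the function and call are illustrative):

// @ts-check

/**
 * Stale block with a wildcard, mirroring the situation above.
 * @param {*} n
 */
/**
 * Closest block: the one the comment above says tsc binds to the function.
 * @param {number} n
 */
function double(n) {
  return n * 2;
}

// If the closest block wins, `npx tsc --noEmit --allowJs scratch.js` flags this call
// with "Argument of type 'string' is not assignable to parameter of type 'number'".
double('oops');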
♻️ Proposed fix
-/**
- * Initializes auth service from config if present.
- *
- * @param {{ keys: Record<string, string>, mode: 'enforce'|'log-only', crypto?: *, logger?: *, wallClockMs?: () => number }|undefined} auth
- * @param {string[]} [allowedWriters]
- * @returns {{ auth: SyncAuthService|null, authMode: string|null }}
- * @private
- */
-/**
- * @param {z.infer<typeof authSchema>} [auth]
- * @param {string[]} [allowedWriters]
- */
+/**
+ * Initializes auth service from config if present.
+ *
+ * @param {z.infer<typeof authSchema>} [auth]
+ * @param {string[]} [allowedWriters]
+ * @returns {{ auth: SyncAuthService|null, authMode: string|null }}
+ * @private
+ */
function initAuth(auth, allowedWriters) {
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/domain/services/HttpSyncServer.js` around lines 224 - 241, Remove the
duplicate JSDoc block immediately preceding function initAuth (the one using
wildcard types like crypto?: * and logger?: *), and merge its useful parts (the
function description, `@returns`, and `@private`) into the remaining JSDoc that uses
z.infer<typeof authSchema>; ensure the final JSDoc above function initAuth
documents the same description, includes the `@returns` and `@private` tags, and
does not contain any wildcard `*` types so TypeScript picks up the correct,
cleaned-up comment for initAuth.
fromSha: /** @type {string} */ (typedJson.fromSha),
toSha: /** @type {string} */ (typedJson.toSha),
writerId: /** @type {string} */ (typedJson.writerId),
patchCount: /** @type {number} */ (typedJson.patchCount),
String fields are cast without a typeof runtime guard, unlike patchCount.
patchCount is protected by a typeof … !== 'number' check at lines 335–340 before its cast. fromSha, toSha, and writerId skip that step — they are only absence-checked (=== undefined). Passing a non-string value (e.g., { fromSha: 42 }) satisfies the existence check, and the JSDoc cast has no runtime effect, so the returned WormholeEdge silently carries a number where a string is expected, producing hard-to-diagnose failures downstream.
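If the three guards start to feel repetitive, a hypothetical helper could centralize them (not in the codebase; the error class is simplified to plain Error here, where the real code would throw WormholeError with code E_INVALID_WORMHOLE_JSON):

// Hypothetical helper: narrow one JSON field to a string or fail loudly.
function requireString(json, field) {
  const value = json[field];
  if (typeof value !== 'string') {
    throw new Error(`Invalid wormhole JSON: '${field}' must be a string`);
  }
  return value;
}

// Usage sketch against a malformed payload:
try {
  requireString({ fromSha: 42 }, 'fromSha');
} catch (err) {
  console.log(err instanceof Error && err.message); // Invalid wormhole JSON: 'fromSha' must be a string
}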
🛡️ Proposed fix — add `typeof` guards for the three string fields
for (const field of requiredFields) {
if (typedJson[field] === undefined) {
throw new WormholeError(`Invalid wormhole JSON: missing required field '${field}'`, {
code: 'E_INVALID_WORMHOLE_JSON',
context: { missingField: field },
});
}
}
+ for (const field of ['fromSha', 'toSha', 'writerId']) {
+ if (typeof typedJson[field] !== 'string') {
+ throw new WormholeError(`Invalid wormhole JSON: '${field}' must be a string`, {
+ code: 'E_INVALID_WORMHOLE_JSON',
+ context: { [field]: typedJson[field] },
+ });
+ }
+ }
if (typeof typedJson.patchCount !== 'number' || typedJson.patchCount < 0) {
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/domain/services/WormholeService.js` around lines 343 - 346, The code
constructs a WormholeEdge from typedJson but only runtime-checks patchCount’s
type; add the same typeof guards for fromSha, toSha, and writerId in the
function that builds the WormholeEdge (look for the block around the existing
patchCount typeof check in WormholeService.js) so each of those fields is
validated with if (typeof typedJson.fromSha !== 'string') { /* handle/throw like
patchCount */ } (and likewise for toSha and writerId) before performing the
JSDoc casts, matching the error-handling/throwing behavior used for patchCount.
Summary
- 05fcd4e: Eliminated ~107 wildcard casts across src/domain/ and src/infrastructure/ (role-specific persistence types, error narrowing helpers, typed aliases)
- 631ef96: Eliminated ~20 wildcard casts in CLI commands, setup-hooks, roaring, WarpGraph, text presenter
- 0a4d13e: Eliminated all remaining wildcard casts in CLI types, presenters, domain services, utils, and test files
Final state
- 0 TODO(ts-cleanup) tags in src/bin/test
- 0 @type {*} / @type {any} in src/bin
Notable type improvements
- MinHeap is now generic (@template T)
- QueryBuilderLike duck-type defined in bin/cli/types.js
- WarpGraphInstance fields narrowed from wildcards to structural types
- Persistence.plumbing narrowed from * to unknown
- JsonValue recursive type uses unknown (JSDoc limitation)
Test plan
- npm run test:local: 193 files, 3,799 tests, 0 failures
- npx tsc --noEmit: 0 errors
- npx eslint .: 0 errors
- node scripts/ts-policy-check.js: passes
Summary by CodeRabbit
New Features
Improvements
Bug Fixes
Documentation