Impact: Solves the N+1 problem for expensive API/DB calls inside arrays, and eliminates race conditions caused by global tool state.
Overview
Currently, tools declared at the root level of a bridge (with std.httpCall) share a single global state. If a user wires inputs to this tool from inside an array loop ([] as item { ... }), the loop elements overwrite each other's inputs, causing chaotic race conditions.
This epic solves this by introducing Element-Scoped Tool Declarations (allowing tools to be instantiated locally per-element) combined with Request-Scoped Memoization (preventing duplicate network calls when multiple elements process the same data).
The Target Developer Experience
bridge Query.processCatalog {
with context as ctx
with output as o
o <- ctx.catalog[] as cat {
# 1. LOCAL INSTANCE: Thread-safe, isolated inputs per element
# 2. MEMOIZE: Duplicate inputs hit a shared request-level cache
with std.httpCall as fetchItem memoize
fetchItem.method = "GET"
fetchItem.url <- "https://api.com/items/{cat.id}"
.item <- fetchItem.response.data
}
}
Element-Scoped Tool Declarations
Allow users to declare fresh, isolated tool instances inside array mapping blocks.
- Parser Update: Update the Chevrotain elementWithDecl rule to accept with <tool> as <handle> alongside the existing alias syntax.
- AST/Engine Integration: Inside processElementLines, intercept these local tool declarations. Assign them a unique instance ID (e.g., 100000 + nextForkSeq++), identically to how Pipe Forks are handled.
- Result: Every element in the runtime shadow tree gets its own isolated ExecutionTree.state trunk, completely preventing input collisions.
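The instance-ID assignment above can be sketched as follows. This is a minimal illustration, not the engine's actual code: LocalToolDecl and assignInstanceIds are hypothetical names, and only the 100000 + nextForkSeq++ scheme is taken from the spec.

```typescript
// Hypothetical sketch: per-element instance IDs for locally declared tools,
// mirroring the 100000 + nextForkSeq++ scheme used for Pipe Forks.
interface LocalToolDecl {
  tool: string;    // e.g. "std.httpCall"
  handle: string;  // e.g. "fetchItem"
  memoize?: boolean;
}

let nextForkSeq = 0;

// Each element's local declarations get fresh instance IDs, so inputs
// written through a handle never collide across array elements.
function assignInstanceIds(decls: LocalToolDecl[]): Map<string, number> {
  const ids = new Map<string, number>();
  for (const decl of decls) {
    ids.set(decl.handle, 100000 + nextForkSeq++);
  }
  return ids;
}
```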
The memoize DSL Keyword & Metadata API
Provide both DSL-level (for generic tools) and TypeScript-level (for native tools) controls for opting into memoization.
- TypeScript Metadata (ToolMetadata): Expand the .bridge object on tool functions to accept memoize. This allows tool authors to force memoization by default and, crucially, to provide a custom keyFn to bypass the slow JSON.stringify default.
Implementation Target:
export async function fetchExchangeRate(opts: { base: string, target: string }) {
// ... expensive DB/API call ...
}
fetchExchangeRate.bridge = {
// Force memoization by default with a custom, ultra-fast cache key resolver
memoize: {
keyFn: (input) => `${input.base}:${input.target}`
}
};
- Lexer/Parser: Add a MemoizeKw token to the Chevrotain lexer. Allow it at the end of bridgeWithDecl and inside toolBlock definitions.
- AST Types: Add memoize?: boolean to the HandleBinding and ToolDef interfaces.
- Resolution Logic: A tool is memoized if any of the three layers requests it: TypeScript metadata || ToolDef || Handle Binding.
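The three-layer resolution can be sketched as a simple OR over the optional flags. The interface shapes below are assumptions based on the types named above; isMemoized is an illustrative helper, not an existing function.

```typescript
// Sketch of the three-layer memoization resolution.
interface MemoizeOpts { keyFn?: (input: unknown) => string }

interface ToolMeta { memoize?: MemoizeOpts | boolean }  // TypeScript .bridge metadata
interface ToolDef { memoize?: boolean }                 // DSL tool block
interface HandleBinding { memoize?: boolean }           // `with ... as ... memoize`

// A tool is memoized if ANY of the three layers opts in.
function isMemoized(meta?: ToolMeta, def?: ToolDef, binding?: HandleBinding): boolean {
  return Boolean(meta?.memoize) || Boolean(def?.memoize) || Boolean(binding?.memoize);
}
```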
The Request-Scoped Cache Engine
Implement the actual cache mechanism. Crucial constraint: The cache must live on the request instance, never globally on the Node process, to prevent cross-tenant data leaks.
- Interpreter (ExecutionTree.ts):
  - Add memoCache = new Map<string, Promise<any>>() to the root execution tree.
  - Ensure shadow() explicitly inherits the exact same map reference from the parent.
  - In callTool, generate the cache key. If meta.memoize.keyFn exists, invoke it; otherwise fall back to JSON.stringify(input).
- Compiler (codegen.ts):
  - Inject const __memoCache = new Map(); at the top of the generated async function.
  - Pass the memoization flag/keyFn into the __call wrapper to manage the cache lookup.
- Stampede Protection: The engine must cache the Promise, not the resolved result. This ensures that 100 concurrent array elements requesting the same URL instantly attach .then() to the exact same Promise, firing only one actual HTTP request.
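The key generation and stampede protection described above can be sketched together in one wrapper. memoizedCall is an illustrative name; the cache map, keyFn fallback, and promise caching follow the spec, but the exact signature inside callTool/__call is an assumption.

```typescript
// Sketch of request-scoped, stampede-safe memoization: the Promise itself is
// cached, so concurrent callers with the same key share one in-flight call.
type KeyFn = (input: unknown) => string;

function memoizedCall<T>(
  cache: Map<string, Promise<T>>,  // lives on the request's root ExecutionTree, never globally
  input: unknown,
  run: () => Promise<T>,
  keyFn?: KeyFn,
): Promise<T> {
  // Custom keyFn when provided; otherwise the slower JSON.stringify default.
  const key = keyFn ? keyFn(input) : JSON.stringify(input);
  let pending = cache.get(key);
  if (!pending) {
    pending = run();               // the underlying call fires once per distinct key
    cache.set(key, pending);
  }
  return pending;
}
```

Because the Promise is stored before it resolves, a second caller arriving mid-flight attaches to the same Promise instead of firing a duplicate request.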
Scope rules
bridge Query.processCatalog {
with context as ctx
with output as o
with std.httpCall as other
o <- ctx.catalog1[] as cat {
with std.httpCall as outer
# Assigning inputs is OK as it is in the current scope
outer.value <- cat.val
.inner <- ctx.catalog2[] as cat {
with std.httpCall as fetchItem memoize
# Assigning inputs is OK as it is in the current scope
fetchItem.method = "GET"
fetchItem.url <- "https://api.com/items/{cat.id}"
# Assignment THROWS because the target handle is outside the current scope
other.value = "What"
outer.value = "Cant do this"
# You can read from any enclosing scope
.more <- other.result
.item <- fetchItem.response.data
.alsoOk <- outer.value
}
}
}
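The scope rules above boil down to: writes target only handles declared in the current block, reads resolve through the chain of enclosing scopes. A hedged sketch of that check, with illustrative names (Scope, canWrite, canRead) that are not the engine's actual API:

```typescript
// Hypothetical sketch of the scope rule: writes are current-scope only,
// reads walk up through enclosing scopes.
interface Scope {
  parent?: Scope;
  handles: Set<string>;  // tool handles declared in this block
}

function canWrite(scope: Scope, handle: string): boolean {
  return scope.handles.has(handle);            // current scope only
}

function canRead(scope: Scope, handle: string): boolean {
  for (let s: Scope | undefined = scope; s; s = s.parent) {
    if (s.handles.has(handle)) return true;    // any enclosing scope
  }
  return false;
}
```

Applied to the example: inside the inner block, writing to fetchItem passes canWrite, writing to other or outer fails it, while reading other.result or outer.value passes canRead.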