James Peret a71346c7ad
Add agentId field to ContextRequest for stable agent identity scoping
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-04-03 07:20:15 -03:00

@fractal-synapse/memory-injection-plugin

Coordinator plugin for the multi-plugin memory stack. Not a memory store itself — it sits in front of all memory plugins and compiles their context fragments for injection into the agent.

What it does

  • Registers memory plugins and calls their getContext() at the right injection points
  • Prioritises and trims fragments to stay within a token budget
  • Optionally synthesises fragments into coherent prose via a lightweight LLM call
  • Exposes an onContextInjected callback for real-time memory viewer integration
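The prioritise-and-trim step can be sketched roughly as follows. This is an illustrative sketch of the behaviour described above, not the plugin's actual implementation:

```typescript
// Sketch: keep the highest-priority fragments that fit within the
// token budget; lower-priority fragments are cut first.
interface Fragment {
  content: string;
  tokens: number;
  priority: number;
}

function trimToBudget(fragments: Fragment[], budget: number): Fragment[] {
  const kept: Fragment[] = [];
  let used = 0;
  // Walk fragments from highest to lowest priority.
  for (const f of [...fragments].sort((a, b) => b.priority - a.priority)) {
    if (used + f.tokens <= budget) {
      kept.push(f);
      used += f.tokens;
    }
  }
  return kept;
}
```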

Installation

npm install @fractal-synapse/memory-injection-plugin

Usage

import { MemoryInjectionPlugin } from '@fractal-synapse/memory-injection-plugin';

const injection = new MemoryInjectionPlugin({
  totalTokenBudget: 1250,
  modelRegistry,           // enables LLM synthesis
  onContextInjected: (event) => {
    // send to memory viewer panel
  },
  logger,
});

// Register memory plugins before initializing the agent
injection.register(workspacePlugin);
injection.register(knowledgePlugin, 300);  // custom budget override

// Wire into agent
await injection.initializeAgent(agent);

Config

| Option | Type | Default | Description |
| --- | --- | --- | --- |
| totalTokenBudget | number | 1250 | Token cap per compilation pass. Applied independently to session-start and per-message. |
| modelRegistry | ModelRegistry | | When provided, enables LLM synthesis by default. |
| synthesisModel | string | | Overrides the model used for synthesis. Defaults to getExtractionModelName(), falling back to getDefaultModelName(). |
| enableSynthesis | boolean | true | Explicitly enables or disables synthesis. |
| onContextInjected | (event) => void | | Callback fired after each compilation. Used for the memory viewer. |
| logger | LoggingInterface | | Logger from @fractal-synapse/agent-core. |

IMemoryInjectionPlugin interface

Memory plugins implement this interface:

interface IMemoryInjectionPlugin {
  readonly name: string;
  readonly injectionPoints: ('session-start' | 'per-message')[];
  getContext(request: ContextRequest): Promise<ContextFragment[]>;
}
  • session-start — called once when the agent starts. Compiled blocks go into the system prompt via systemAppend.
  • per-message — called on every user turn. Compiled content returned as userMessageAppend.
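A minimal plugin implementing this interface might look like the following sketch. The interface shapes are copied from this README; the NotesPlugin itself and its in-memory note store are invented for illustration:

```typescript
type InjectionPoint = 'session-start' | 'per-message';

// Shapes as documented in this README.
interface ContextRequest {
  agentId: string;
  injectionPoint: InjectionPoint;
}
interface ContextFragment {
  content: string;
  tokens: number;
  priority: number;
  label?: string;
  synthesize?: boolean;
}
interface IMemoryInjectionPlugin {
  readonly name: string;
  readonly injectionPoints: InjectionPoint[];
  getContext(request: ContextRequest): Promise<ContextFragment[]>;
}

// Hypothetical plugin: returns stored notes at session start,
// keyed by the stable agentId from the request.
class NotesPlugin implements IMemoryInjectionPlugin {
  readonly name = 'notes';
  readonly injectionPoints: InjectionPoint[] = ['session-start'];
  private notes = new Map<string, string[]>();

  add(agentId: string, note: string): void {
    const list = this.notes.get(agentId) ?? [];
    list.push(note);
    this.notes.set(agentId, list);
  }

  async getContext(request: ContextRequest): Promise<ContextFragment[]> {
    const notes = this.notes.get(request.agentId) ?? [];
    return notes.map((content) => ({
      content,
      tokens: Math.ceil(content.length / 4), // rough self-reported estimate
      priority: 50,
      label: 'Notes',
    }));
  }
}
```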

ContextRequest

The request object passed to getContext() in every injection-point call:

interface ContextRequest {
  agentId: string;
  injectionPoint: 'session-start' | 'per-message';
}
  • agentId — The stable agent identity passed to every getContext() call. Derived from agent.getDefinitionName() ?? agent.getId(). Use this to scope per-agent storage — do not use agent.getId() directly, as it is a session-level identifier that changes each conversation.
  • injectionPoint — Which injection point triggered the call, so plugins can return different fragments depending on context.
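For example, a plugin that persists memories to disk might key its storage by agentId rather than by the session id. The path scheme below is purely illustrative:

```typescript
// Sketch: derive a per-agent storage location from the stable agentId.
// (Using agent.getId() here would scatter memories across sessions.)
function memoryPathFor(agentId: string): string {
  // Sanitize so the identity is safe to use as a directory name.
  const safe = agentId.replace(/[^a-zA-Z0-9_-]/g, '_');
  return `memory/${safe}/notes.json`;
}
```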

ContextFragment

interface ContextFragment {
  content: string;
  tokens: number;        // self-reported estimate
  priority: number;      // 0–100; lower = cut first
  label?: string;        // section label in output
  synthesize?: boolean;  // false = inject raw, never synthesised
}

onContextInjected callback

Receives an InjectedContextEvent after each compilation:

interface InjectedContextEvent {
  agentId: string;
  injectionPoint: 'session-start' | 'per-message';
  fragments: Array<{
    pluginName: string;
    content: string;
    tokens: number;
    priority: number;
    included: boolean;   // false if trimmed by budget
  }>;
  synthesized: boolean;
  finalContent: string;
  timestamp: string;
}
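A handler might, for instance, summarise each event before forwarding it to a viewer panel. The event shape below mirrors the interface above; summarizeEvent is a hypothetical helper:

```typescript
interface InjectedFragment {
  pluginName: string;
  content: string;
  tokens: number;
  priority: number;
  included: boolean;
}
interface InjectedContextEvent {
  agentId: string;
  injectionPoint: 'session-start' | 'per-message';
  fragments: InjectedFragment[];
  synthesized: boolean;
  finalContent: string;
  timestamp: string;
}

// Produce a one-line summary of a compilation pass for a viewer panel.
function summarizeEvent(event: InjectedContextEvent): string {
  const included = event.fragments.filter((f) => f.included);
  const trimmed = event.fragments.length - included.length;
  const tokens = included.reduce((sum, f) => sum + f.tokens, 0);
  return `[${event.injectionPoint}] ${included.length} fragment(s), ` +
    `${tokens} tokens, ${trimmed} trimmed by budget`;
}
```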

Synthesis

When modelRegistry is provided and synthesis is not disabled, synthesizable fragments are sent in a single LLM call that produces a coherent prose summary targeting roughly 60% of the token budget. Raw fragments (synthesize: false) are always concatenated verbatim and never touch the LLM. If the synthesis call fails, the plugin silently falls back to concatenation.
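The fallback behaviour can be sketched as follows, with callModel standing in as a placeholder for the real LLM call (the plugin's actual internals may differ):

```typescript
// Sketch: synthesize the synthesizable fragments in one model call,
// fall back to plain concatenation on failure, and always append raw
// fragments verbatim.
async function compile(
  synthesizable: string[],
  raw: string[],
  callModel: (prompt: string) => Promise<string>,
): Promise<string> {
  let synthesized = synthesizable.join('\n\n');
  if (synthesizable.length > 0) {
    try {
      synthesized = await callModel(synthesized);
    } catch {
      // Synthesis failure: keep the concatenated fragments as-is.
    }
  }
  return [synthesized, ...raw].filter(Boolean).join('\n\n');
}
```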