Type Alias: PromptTemplateFunction()

PromptTemplateFunction = (components, modelInfo, selectedContextualElements, config, estimateTokenCountFn) => Promise<FormattedPrompt>

Defined in: packages/agentos/src/core/llm/IPromptEngine.ts:358

Function signature for prompt template implementations. Templates are responsible for taking all processed prompt components and formatting them into the final FormattedPrompt structure required by a specific LLM provider or model type.

Parameters

components

Readonly<PromptComponents>

The core and augmented prompt components.

modelInfo

Readonly<ModelTargetInfo>

Information about the target AI model.

selectedContextualElements

ReadonlyArray<ContextualPromptElement>

Contextual elements chosen for this prompt.

config

Readonly<PromptEngineConfig>

A read-only view of the PromptEngine's current configuration.

estimateTokenCountFn

(content, modelId?) => Promise<number>

A function to estimate token counts, useful within templates.

Returns

Promise<FormattedPrompt>

A promise that resolves to the final, formatted prompt.
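Example

The following is a minimal sketch of a template implementation, not the actual API of this package: the internal field names (systemInstructions, userMessage, content, messages, maxContextTokens) and the simplified interfaces standing in for PromptComponents, ModelTargetInfo, ContextualPromptElement, PromptEngineConfig, and FormattedPrompt are hypothetical, assumed only for illustration. It shows the general shape of a chat-style template that folds contextual elements into the system message and uses estimateTokenCountFn to stay within a rough token budget.

```typescript
// Hypothetical minimal shapes, assumed for this sketch; the real
// PromptComponents, ModelTargetInfo, ContextualPromptElement, and
// FormattedPrompt types may differ.
interface SketchComponents {
  systemInstructions: string;
  userMessage: string;
}

interface SketchContextElement {
  content: string;
}

interface SketchModelInfo {
  modelId: string;
  maxContextTokens: number;
}

interface SketchFormattedPrompt {
  messages: Array<{ role: "system" | "user"; content: string }>;
}

// A chat-style template: appends contextual elements to the system message,
// skipping the rest once they would exceed a rough token budget.
const chatTemplate = async (
  components: Readonly<SketchComponents>,
  modelInfo: Readonly<SketchModelInfo>,
  selectedContextualElements: ReadonlyArray<SketchContextElement>,
  _config: Readonly<Record<string, unknown>>, // stand-in for PromptEngineConfig
  estimateTokenCountFn: (content: string, modelId?: string) => Promise<number>
): Promise<SketchFormattedPrompt> => {
  // Reserve roughly half the context window for the model's response.
  const budget = Math.floor(modelInfo.maxContextTokens * 0.5);
  let systemText = components.systemInstructions;
  let used = await estimateTokenCountFn(systemText, modelInfo.modelId);

  // Add contextual elements until the budget is exhausted.
  for (const element of selectedContextualElements) {
    const cost = await estimateTokenCountFn(element.content, modelInfo.modelId);
    if (used + cost > budget) break;
    systemText += `\n\n${element.content}`;
    used += cost;
  }

  return {
    messages: [
      { role: "system", content: systemText },
      { role: "user", content: components.userMessage },
    ],
  };
};
```

Because the template receives the token estimator and model info directly, provider-specific decisions such as message layout and context trimming can live entirely inside the template rather than in the PromptEngine itself.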