The core and augmented prompt components.
Information about the target AI model.
Contextual elements chosen for this prompt.
A read-only view of the PromptEngine's current configuration.
A function to estimate token counts, useful within templates.
Optional modelId: string
Returns: A promise that resolves to the final, formatted prompt.
Function signature for prompt template implementations. Templates are responsible for taking all processed prompt components and formatting them into the final
FormattedPrompt structure required by a specific LLM provider or model type.
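The signature described above can be sketched as follows. Note that the interface names and field shapes below (PromptComponents, ContextElement, EngineConfig, and so on) are illustrative assumptions, not the library's actual types; only the overall parameter list mirrors the documentation.

```typescript
// Hypothetical shapes standing in for the documented parameters.
interface PromptComponents {
  core: string;              // the core prompt text
  augmentations: string[];   // augmented prompt components
}

interface ModelInfo {
  id: string;                // information about the target AI model
  maxTokens: number;
}

interface ContextElement {
  name: string;              // a contextual element chosen for this prompt
  content: string;
}

interface EngineConfig {
  readonly separator: string; // read-only view of the engine's configuration
}

// Token estimator made available to templates.
type TokenEstimator = (text: string) => number;

interface FormattedPrompt {
  text: string;
  estimatedTokens: number;
}

// A template receives all processed components and formats them into the
// FormattedPrompt structure required by a specific provider or model type.
type PromptTemplate = (
  components: PromptComponents,
  model: ModelInfo,
  context: ContextElement[],
  config: EngineConfig,
  estimateTokens: TokenEstimator,
  modelId?: string,
) => Promise<FormattedPrompt>;

// A minimal template implementation: joins the pieces with the configured
// separator and reports an estimated token count.
const simpleTemplate: PromptTemplate = async (
  components, model, context, config, estimateTokens, modelId,
) => {
  const parts = [
    components.core,
    ...components.augmentations,
    ...context.map((c) => `${c.name}: ${c.content}`),
  ];
  const text =
    `[model: ${modelId ?? model.id}]\n` + parts.join(config.separator);
  return { text, estimatedTokens: estimateTokens(text) };
};
```

A concrete template for a given provider would replace the plain string concatenation with that provider's message format, while keeping the same signature so the engine can call any template interchangeably.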