Interface LLMJudgeConfig

Configuration for LLM Judge

interface LLMJudgeConfig {
    llmProvider: AIModelProviderManager;
    modelId?: string;
    providerId?: string;
    temperature?: number;
    systemPrompt?: string;
}

Properties

llmProvider: AIModelProviderManager

LLM provider manager

modelId?: string

Model to use for judging

providerId?: string

ID of the provider to use for judging

temperature?: number

Temperature for judging (lower = more consistent)

systemPrompt?: string

Custom system prompt for the judge
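
Example

A minimal sketch of constructing this config. It assumes an existing AIModelProviderManager instance (here called providerManager); the model ID, provider ID, and prompt text are illustrative values, not defaults defined by the interface.

const judgeConfig: LLMJudgeConfig = {
    llmProvider: providerManager,   // existing AIModelProviderManager instance (assumed)
    modelId: "gpt-4o",              // illustrative model ID
    providerId: "openai",           // illustrative provider ID
    temperature: 0.1,               // low temperature for more consistent judgments
    systemPrompt: "You are a strict evaluator. Score each answer from 1 to 10.",
};

Only llmProvider is required; the optional fields fall back to whatever defaults the judge implementation applies.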