Class FewShotPromptTemplate

Prompt template that contains few-shot examples.
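The documented behavior — format each example with examplePrompt, then join the prefix, the formatted examples, and the suffix with exampleSeparator — can be sketched without the library. This is a simplified illustration, not the library's implementation; the `fString` helper below is a minimal stand-in for f-string-style substitution.

```typescript
type InputValues = Record<string, string>;

// Minimal stand-in for "f-string" substitution: replaces {name} with a value.
function fString(template: string, values: InputValues): string {
  return template.replace(/\{(\w+)\}/g, (_, name) => values[name] ?? "");
}

// Sketch of how the documented pieces fit together.
function formatFewShot(opts: {
  examplePrompt: string;       // template used to format a single example
  examples: InputValues[];     // examples to format into the prompt
  prefix?: string;             // goes before the examples (default "")
  suffix?: string;             // goes after the examples (default "")
  exampleSeparator?: string;   // joins the pieces (default "\n\n")
  inputValues: InputValues;    // values for variables in prefix/suffix
}): string {
  const sep = opts.exampleSeparator ?? "\n\n";
  const pieces = [
    fString(opts.prefix ?? "", opts.inputValues),
    ...opts.examples.map((e) => fString(opts.examplePrompt, e)),
    fString(opts.suffix ?? "", opts.inputValues),
  ];
  return pieces.filter((p) => p.length > 0).join(sep);
}

const prompt = formatFewShot({
  examplePrompt: "Q: {question}\nA: {answer}",
  examples: [
    { question: "2 + 2?", answer: "4" },
    { question: "3 + 3?", answer: "6" },
  ],
  prefix: "Answer each question.",
  suffix: "Q: {input}\nA:",
  inputValues: { input: "4 + 4?" },
});
// prompt now contains the prefix, the two formatted examples,
// and the suffix, joined by "\n\n".
```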

Hierarchy

Implements

Constructors

Properties

PromptValueReturnType: StringPromptValue
examplePrompt: PromptTemplate<any, any>

A PromptTemplate used to format a single example.

exampleSeparator: string = "\n\n"

String separator used to join the prefix, the examples, and the suffix.

inputVariables: string[]

A list of variable names the prompt template expects.

partialVariables: PartialValues<any>

Partial variables: values bound to the template in advance, so callers do not need to supply them on every format call.

prefix: string = ""

A prompt template string to put before the examples.


suffix: string = ""

A prompt template string to put after the examples.

templateFormat: "f-string" = "f-string"

The format of the prompt template. Currently only 'f-string' is supported.

validateTemplate: boolean = true

Whether or not to try validating the template on initialization.

exampleSelector?: BaseExampleSelector

A BaseExampleSelector used to choose which examples to format into the prompt. Exactly one of this or examples must be provided.

examples?: InputValues[]

Examples to format into the prompt. Exactly one of this or exampleSelector must be provided.
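The exampleSelector alternative can be illustrated with a hypothetical selector (the interface and class names below are illustrative, not the library's exact API): instead of a fixed examples array, a selector chooses examples per input at format time, for instance to keep the prompt within a length budget.

```typescript
type Example = Record<string, string>;

// Illustrative contract: a selector picks examples based on the input.
interface ExampleSelector {
  selectExamples(input: Record<string, string>): Example[];
}

// A trivial selector that keeps the total example text under a
// character budget, taking examples from the pool in order.
class LengthBudgetSelector implements ExampleSelector {
  constructor(private pool: Example[], private maxChars: number) {}

  selectExamples(_input: Record<string, string>): Example[] {
    const chosen: Example[] = [];
    let used = 0;
    for (const ex of this.pool) {
      const len = Object.values(ex).join(" ").length;
      if (used + len > this.maxChars) break;
      chosen.push(ex);
      used += len;
    }
    return chosen;
  }
}

const selector = new LengthBudgetSelector(
  [
    { question: "2 + 2?", answer: "4" },
    { question: "What is the capital of France?", answer: "Paris" },
  ],
  20,
);
// With a 20-character budget, only the first, shorter example fits.
const picked = selector.selectExamples({ input: "3 + 3?" });
```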

outputParser?: BaseOutputParser<unknown>

How to parse the output of calling an LLM on this formatted prompt.

Methods

streamLog

  • Stream all output from a runnable, as reported to the callback system. This includes all inner runs of LLMs, Retrievers, Tools, etc. Output is streamed as Log objects, which include a list of jsonpatch ops that describe how the state of the run has changed in each step, and the final state of the run. The jsonpatch ops can be applied in order to construct state.

    Parameters

    • input: any
    • Optional options: Partial<BaseCallbackConfig>
    • Optional streamOptions: Omit<LogStreamCallbackHandlerInput, "autoClose">

    Returns AsyncGenerator<RunLogPatch, any, unknown>
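How the streamed jsonpatch ops reconstruct the run state can be sketched with a minimal patch applier. This is an illustrative sketch, not the library's implementation: it handles only the "add" and "replace" ops on plain objects, which is enough to show the idea of applying patches in order.

```typescript
// A minimal subset of RFC 6902 JSON Patch operations.
type Op = { op: "add" | "replace"; path: string; value: unknown };

// Apply ops in order, mutating and returning the accumulated state.
function applyPatch(state: Record<string, unknown>, ops: Op[]) {
  for (const { path, value } of ops) {
    // Resolve the JSON Pointer path, creating intermediate objects.
    const keys = path.split("/").slice(1);
    let node: any = state;
    for (const key of keys.slice(0, -1)) {
      node[key] ??= {};
      node = node[key];
    }
    node[keys[keys.length - 1]] = value;
  }
  return state;
}

// Applying a stream of patches in order yields the final run state.
const state: Record<string, unknown> = {};
applyPatch(state, [{ op: "add", path: "/logs", value: {} }]);
applyPatch(state, [
  { op: "add", path: "/logs/step1", value: { status: "running" } },
  { op: "replace", path: "/logs/step1", value: { status: "done" } },
]);
// state now reflects all patches applied in order.
```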

Generated using TypeDoc