Extracts token log probabilities from assistant replies within an LLMMessage object.
Each row represents a token with its log probability and top alternative tokens.
Usage:

get_logprobs(.llm, .index = NULL)
Arguments:

.llm: An LLMMessage object containing the message history.

.index: A positive integer specifying which assistant reply's log probabilities to extract. If NULL (default), log probabilities for all replies are returned.

Value:

A tibble containing log probabilities for the specified assistant reply, or for all replies. An empty tibble is returned if no logprobs were requested. Currently, log probabilities are only available for replies generated with openai_chat().
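A minimal usage sketch, assuming log probabilities must be requested when the reply is generated; the .logprobs and .top_logprobs arguments to openai_chat() are assumptions mirroring the corresponding OpenAI API options:

library(tidyllm)

# Request logprobs (and up to 3 alternatives per token) at generation time
conv <- llm_message("Name one primary colour.") |>
  openai_chat(.logprobs = TRUE, .top_logprobs = 3)

get_logprobs(conv)             # logprobs for every assistant reply
get_logprobs(conv, .index = 1) # logprobs for the first assistant reply only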
Columns include:

reply_index: The index of the assistant reply in the message history.

token: The generated token.

logprob: The log probability of the generated token.

bytes: The byte-level encoding of the token.

top_logprobs: A list column containing the top alternative tokens with their log probabilities.
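Because the result is an ordinary tibble, the documented columns can be post-processed with dplyr and tidyr. A sketch, reusing conv from the example above; the perplexity line is a standard transformation of token log probabilities, not a tidyllm helper:

library(dplyr)
library(tidyr)

lp <- get_logprobs(conv)

# Turn log probabilities into probabilities and expand the alternatives
lp |>
  mutate(prob = exp(logprob)) |>
  unnest_longer(top_logprobs)

# Per-reply perplexity: exp of the negative mean token logprob
lp |>
  group_by(reply_index) |>
  summarise(perplexity = exp(-mean(logprob)))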