
tidyllm (version 0.2.0)

generate_callback_function: Generate API-Specific Callback Function for Streaming Responses

Description

This function generates a callback function that processes streaming responses from different language model APIs. The returned callback is specific to the API given in .api ("claude", "ollama", "mistral", "groq", or "openai"); it parses incoming data chunks, prints the streamed content to the console, and updates the .tidyllm_stream_env environment for further use.

Usage

generate_callback_function(.api)

Value

A function that serves as a callback to handle streaming responses from the specified API. The callback processes the raw data, updates the .tidyllm_stream_env$stream object, and prints the streamed content to the console. The callback returns TRUE if streaming should continue and FALSE when streaming is finished.
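
The following is an illustrative sketch rather than code taken from the package: it assumes that generate_callback_function() is exported, that req is a fully configured httr2 request against a streaming endpoint, and that the accumulated text ends up in .tidyllm_stream_env$stream as described above.

  library(httr2)
  library(tidyllm)

  # Build a callback for the OpenAI-style stream format
  callback <- generate_callback_function("openai")

  # Pass the callback to httr2's streaming interface; it receives raw chunks,
  # prints the streamed text, and returns FALSE once the stream is finished.
  req_perform_stream(req, callback)

  # Inspect the full streamed message accumulated by the callback
  cat(.tidyllm_stream_env$stream)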

Arguments

.api

A character string indicating the API type. Supported values are "claude", "ollama", "mistral", "groq" and "openai".
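
For example (assuming the function is accessible from the package namespace), callbacks for different providers are created by passing the matching string:

  claude_cb <- generate_callback_function(.api = "claude")
  openai_cb <- generate_callback_function(.api = "openai")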

Details

  • For the Claude API: the callback processes event and data lines and handles the message_start and message_stop events to control the streaming flow.

  • For the Ollama API: the callback parses the stream content directly as JSON and extracts the message$content field.

  • For OpenAI, Mistral, and Groq: the callback handles JSON data streams and processes content deltas. It stops processing when the [DONE] message is encountered (a sketch of this branch follows below).
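
As a rough illustration of the OpenAI/Mistral/Groq branch, a callback of this kind could look as follows. This is a simplified sketch, not the package's actual implementation; the chunk format ("data: {...}" server-sent-event lines with a choices$delta$content field) is assumed from the description above.

  library(jsonlite)

  openai_style_callback <- function(.data) {
    # Raw bytes arrive as server-sent-event lines of the form "data: {...}"
    lines <- strsplit(rawToChar(.data), "\r?\n")[[1]]
    lines <- lines[lines != ""]

    for (line in lines) {
      json_part <- sub("^data: ", "", line)

      # "[DONE]" ends the stream; returning FALSE stops streaming
      if (trimws(json_part) == "[DONE]") return(FALSE)

      parsed <- tryCatch(fromJSON(json_part), error = function(e) NULL)
      if (is.null(parsed)) next

      # Print the content delta, if present
      delta <- parsed$choices$delta$content
      if (!is.null(delta) && length(delta) == 1 && !is.na(delta)) {
        cat(delta)
        # The real callback would also append delta to .tidyllm_stream_env$stream here
      }
    }
    TRUE  # keep streaming
  }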