Generate Embeddings Using the OpenAI API on Azure
azure_openai_embedding(
  .input,
  .deployment = "text-embedding-3-small",
  .endpoint_url = Sys.getenv("AZURE_ENDPOINT_URL"),
  .api_version = "2023-05-15",
  .truncate = TRUE,
  .timeout = 120,
  .dry_run = FALSE,
  .max_tries = 3
)
A tibble with two columns: input and embeddings. The input column contains the texts sent to embed, and the embeddings column is a list column where each row holds the embedding vector for the corresponding input.
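A minimal sketch of working with this return value, assuming AZURE_ENDPOINT_URL is set and a text-embedding-3-small deployment exists on that endpoint:

# Embed two short texts with the default deployment
texts <- c("The quick brown fox", "jumps over the lazy dog")
emb <- azure_openai_embedding(texts)

emb$input                    # the original texts
emb$embeddings[[1]]          # numeric embedding vector for the first text
length(emb$embeddings[[1]])  # embedding dimensionality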
.input: A character vector of texts to embed or an LLMMessage object.
.deployment: The embedding model identifier (default: "text-embedding-3-small").
.endpoint_url: Base URL for the API (default: Sys.getenv("AZURE_ENDPOINT_URL")).
.api_version: Which API version of the Azure OpenAI API should be used (default: "2023-05-15").
.truncate: Whether to truncate inputs to fit the model's context length (default: TRUE).
.timeout: Timeout for the API request in seconds (default: 120).
.dry_run: If TRUE, perform a dry run and return the request object.
.max_tries: Maximum retry attempts for requests (default: 3).
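As a hedged illustration of .dry_run, this sketch builds the request object without sending it, so it can be inspected before any API call is made:

# Build but do not send the embedding request
req <- azure_openai_embedding(
  "sample text",
  .dry_run = TRUE
)
req  # inspect the prepared request object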