A character string containing the prompt for the Gemini model.
tokens
A list containing the API URL and key, created by the token.vertex() function.
temperature
The temperature to use. Default is 1; the value should be between 0 and 2.
see https://ai.google.dev/gemini-api/docs/models/generative-models#model-parameters
maxOutputTokens
The maximum number of tokens to generate.
Default is 8192; roughly 100 tokens correspond to 60-80 words.
topK
The top-k value to use. Default is 40; the value should be between 0 and 100.
see https://ai.google.dev/gemini-api/docs/models/generative-models#model-parameters
topP
The top-p value to use. Default is 0.95; the value should be between 0 and 1.
see https://ai.google.dev/gemini-api/docs/models/generative-models#model-parameters
seed
The seed to use. Default is 1234; the value should be an integer.
see https://ai.google.dev/gemini-api/docs/models/generative-models#model-parameters
if (FALSE) {
# The token object should be created beforehand using the token.vertex() function.
prompt <- "What is Sachin's jersey number?"
gemini.vertex(prompt, tokens)
}
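
A fuller sketch that sets every generation parameter explicitly. The parameter names match the arguments documented above; the prompt and the numeric values are only illustrative, and the arguments to token.vertex() depend on your own Google Cloud authentication setup.

if (FALSE) {
# 'tokens' is assumed to have been created beforehand with token.vertex();
# its arguments (service-account key, model, region, ...) depend on your setup.
prompt <- "Explain the temperature parameter in one sentence."

gemini.vertex(
  prompt,
  tokens,
  temperature     = 0.7,    # 0-2; lower values give more deterministic output
  maxOutputTokens = 1024,   # roughly 100 tokens correspond to 60-80 words
  topK            = 40,     # 0-100
  topP            = 0.95,   # 0-1
  seed            = 1234    # integer, for reproducible output
)
}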