2 changes: 2 additions & 0 deletions NEWS.md
@@ -1,5 +1,7 @@
# ellmer (development version)

+* `chat_openai()` (and all other providers that use a ChatGPT model) now default to `gpt-5` (#800).
+* `chat_anthropic()` now defaults to `claude-sonnet-4-5-20250929` (#800).
* `chat_google_gemini()` and `chat_openai_responses()` support image generation (#368).
* `batch_*()` no longer hashes properties of the provider besides the `name`, `model`, and `base_url`. This should provide some protection from accidentally reusing the same `.json` file with different providers, while still allowing you to use the same batch file across ellmer versions.
* `batch_*()` have a new `ignore_hash` argument that allows you to opt out of the check if you're confident the difference only arises because ellmer itself has changed.
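The new argument can be used roughly like this (a sketch, assuming the documented `batch_chat()` signature; the prompts and file path are illustrative):

```r
library(ellmer)

chat <- chat_openai(model = "gpt-5-nano")

# Reuse a batch file created by an older ellmer version; skip the
# provider-hash check when only ellmer itself has changed
answers <- batch_chat(
  chat,
  prompts = list("What is 2 + 2?", "Name a prime number."),
  path = "answers.json",
  ignore_hash = TRUE
)
```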
2 changes: 1 addition & 1 deletion R/batch-chat.R
@@ -51,7 +51,7 @@
#' For any of the above, this will return `NULL` if `wait = FALSE` and the job
#' is not complete.
#' @examplesIf has_credentials("openai")
-#' chat <- chat_openai(model = "gpt-4.1-nano")
+#' chat <- chat_openai(model = "gpt-5-nano")
#'
#' # Chat ----------------------------------------------------------------------
#'
6 changes: 3 additions & 3 deletions R/provider-anthropic.R
@@ -15,7 +15,7 @@ NULL
#'
#' @inheritParams chat_openai
#' @inherit chat_openai return
-#' @param model `r param_model("claude-sonnet-4-20250514", "anthropic")`
+#' @param model `r param_model("claude-sonnet-4-5-20250929", "anthropic")`
#' @param api_key `r api_key_param("ANTHROPIC_API_KEY")`
#' @param max_tokens Maximum number of tokens to generate before stopping.
#' @param beta_headers Optionally, a character vector of beta headers to opt-in
@@ -43,7 +43,7 @@ chat_anthropic <- function(
) {
echo <- check_echo(echo)

-model <- set_default(model, "claude-sonnet-4-20250514")
+model <- set_default(model, "claude-sonnet-4-5-20250929")
**Collaborator:** I really wish they had something like `claude-sonnet-4-5-latest`. I tried `model = "claude-sonnet-4-5"` and it worked, but it's not documented anywhere and might not be reliable.

**Member Author:** Yeah, I had the same reaction. Also the name has to match something in our pricing data (which is getting updated in #790).
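The trade-off discussed in this thread can be sketched as follows (hedged: the undated alias is reported in the comment above, not documented behaviour):

```r
library(ellmer)

# Pinned, documented snapshot: reproducible and matches the pricing data
chat <- chat_anthropic(model = "claude-sonnet-4-5-20250929")

# Undated alias: reportedly resolves today, but is undocumented and
# may silently move to a newer snapshot
# chat <- chat_anthropic(model = "claude-sonnet-4-5")
```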


params <- params %||% params()
if (lifecycle::is_present(max_tokens)) {
@@ -75,7 +75,7 @@ chat_claude <- chat_anthropic

chat_anthropic_test <- function(
...,
-model = "claude-3-5-sonnet-latest",
+model = "claude-haiku-4-5-20251001",
params = NULL,
echo = "none"
) {
4 changes: 2 additions & 2 deletions R/provider-azure.R
@@ -44,7 +44,7 @@ NULL
#' @export
#' @examples
#' \dontrun{
-#' chat <- chat_azure_openai(model = "gpt-4o-mini")
+#' chat <- chat_azure_openai(model = "gpt-5-nano")
#' chat$chat("Tell me three jokes about statisticians")
#' }
chat_azure_openai <- function(
@@ -128,7 +128,7 @@ chat_azure_openai_test <- function(
system_prompt = system_prompt,
api_key = api_key,
endpoint = "https://ai-hwickhamai260967855527.openai.azure.com",
-model = "gpt-4o-mini",
+model = "gpt-5-nano",
params = params,
echo = echo
)
4 changes: 2 additions & 2 deletions R/provider-github.R
@@ -14,7 +14,7 @@
#'
#' @family chatbots
#' @param api_key `r api_key_param("GITHUB_PAT")`
-#' @param model `r param_model("gpt-4o")`
+#' @param model `r param_model("gpt-5")`
#' @param params Common model parameters, usually created by [params()].
#' @export
#' @inheritParams chat_openai
@@ -37,7 +37,7 @@ chat_github <- function(
) {
check_installed("gitcreds")

-model <- set_default(model, "gpt-4.1")
+model <- set_default(model, "gpt-5")
echo <- check_echo(echo)

# https://docs.github.com/en/rest/models/inference?apiVersion=2022-11-28
5 changes: 2 additions & 3 deletions R/provider-openai-responses.R
@@ -39,7 +39,7 @@ chat_openai_responses <- function(
api_headers = character(),
echo = c("none", "output", "all")
) {
-model <- set_default(model, "gpt-4.1")
+model <- set_default(model, "gpt-5")
echo <- check_echo(echo)

provider <- ProviderOpenAIResponses(
@@ -56,12 +56,11 @@
chat_openai_responses_test <- function(
system_prompt = "Be terse.",
...,
-model = "gpt-4.1-nano",
+model = "gpt-5-nano",
params = NULL,
echo = "none"
) {
params <- params %||% params()
-params$temperature <- params$temperature %||% 0

chat_openai_responses(
system_prompt = system_prompt,
7 changes: 3 additions & 4 deletions R/provider-openai.R
@@ -16,7 +16,7 @@ NULL
#' @param system_prompt A system prompt to set the behavior of the assistant.
#' @param base_url The base URL to the endpoint; the default uses OpenAI.
#' @param api_key `r api_key_param("OPENAI_API_KEY")`
-#' @param model `r param_model("gpt-4.1", "openai")`
+#' @param model `r param_model("gpt-5", "openai")`
#' @param params Common model parameters, usually created by [params()].
#' @param seed Optional integer seed that ChatGPT uses to try and make output
#' more reproducible.
@@ -56,7 +56,7 @@ chat_openai <- function(
api_headers = character(),
echo = c("none", "output", "all")
) {
-model <- set_default(model, "gpt-4.1")
+model <- set_default(model, "gpt-5")
echo <- check_echo(echo)

params <- params %||% params()
@@ -83,13 +83,12 @@ chat_openai <- function(
chat_openai_test <- function(
system_prompt = "Be terse.",
...,
-model = "gpt-4.1-nano",
+model = "gpt-5-nano",
params = NULL,
echo = "none"
) {
params <- params %||% params()
params$seed <- params$seed %||% 1014
-params$temperature <- params$temperature %||% 0

chat_openai(
system_prompt = system_prompt,
4 changes: 2 additions & 2 deletions R/provider-openrouter.R
@@ -12,7 +12,7 @@ NULL
#' @export
#' @family chatbots
#' @param api_key `r api_key_param("OPENROUTER_API_KEY")`
-#' @param model `r param_model("gpt-4o")`
+#' @param model `r param_model("gpt-5")`
#' @param params Common model parameters, usually created by [params()].
#' @inheritParams chat_openai
#' @inherit chat_openai return
@@ -31,7 +31,7 @@ chat_openrouter <- function(
echo = c("none", "output", "all"),
api_headers = character()
) {
-model <- set_default(model, "gpt-4o")
+model <- set_default(model, "gpt-5")
echo <- check_echo(echo)

params <- params %||% params()
6 changes: 3 additions & 3 deletions R/provider-portkey.R
@@ -11,7 +11,7 @@
#'
#' @family chatbots
#' @param api_key `r api_key_param("PORTKEY_API_KEY")`
-#' @param model `r param_model("gpt-4o", "openai")`
+#' @param model `r param_model("gpt-5", "openai")`
#' @param virtual_key A virtual identifier storing LLM provider's API key. See
#' [documentation](https://portkey.ai/docs/product/ai-gateway/virtual-keys).
#' Can be read from the `PORTKEY_VIRTUAL_KEY` environment variable.
@@ -34,7 +34,7 @@ chat_portkey <- function(
echo = NULL,
api_headers = character()
) {
-model <- set_default(model, "gpt-4o")
+model <- set_default(model, "gpt-5")
echo <- check_echo(echo)

params <- params %||% params()
@@ -53,7 +53,7 @@

chat_portkey_test <- function(
...,
-model = "gpt-4o-mini",
+model = "gpt-5-nano",
params = NULL,
echo = "none"
) {
6 changes: 3 additions & 3 deletions README.Rmd
@@ -67,7 +67,7 @@ If you're using ellmer inside an organisation, you may have internal policies th

If you're using ellmer for your own exploration, you'll have a lot more freedom, so we have a few recommendations to help you get started:

-- `chat_openai()` or `chat_anthropic()` are good places to start. `chat_openai()` defaults to **GPT-4.1**, but you can use `model = "gpt-4-1-nano"` for a cheaper, faster model, or `model = "o3"` for more complex reasoning. `chat_anthropic()` is also good; it defaults to **Claude 4.0 Sonnet**, which we have found to be particularly good at writing R code.
+- `chat_openai()` or `chat_anthropic()` are good places to start. `chat_openai()` defaults to **GPT-5**, but you can use `model = "gpt-5-mini"` or `model = "gpt-5-nano"` for cheaper, faster models. `chat_anthropic()` is also good; it defaults to **Claude Sonnet 4.5**, which we have found to be particularly good at writing R code.

- `chat_google_gemini()` is a strong model with generous free tier (with the downside that [your data is used](https://ai.google.dev/gemini-api/terms#unpaid-services) to improve the model), making it a great place to start if you don't want to spend any money.

@@ -88,7 +88,7 @@ You can work with ellmer in several different ways, depending on whether you are
```{r}
library(ellmer)

-chat <- chat_openai("Be terse", model = "gpt-4o-mini")
+chat <- chat_openai("Be terse", model = "gpt-5-mini")
```

Chat objects are stateful [R6 objects](https://r6.r-lib.org): they retain the context of the conversation, so each new query builds on the previous ones. You call their methods with `$`.
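For example, a follow-up question can rely on the retained context (a sketch continuing the `chat` object above; replies will vary):

```r
chat$chat("Who created the R language?")
# "it" is resolved from the conversation state kept by the R6 object
chat$chat("When was it first released?")
```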
@@ -141,7 +141,7 @@ In most circumstances, ellmer will stream the output to the console. You can tak

```{r}
my_function <- function() {
-chat <- chat_openai("Be terse", model = "gpt-4o-mini", echo = "none")
+chat <- chat_openai("Be terse", model = "gpt-5-mini", echo = "none")
chat$chat("What is 6 times 7?")
}
str(my_function())
84 changes: 55 additions & 29 deletions inst/_vcr/Chat.yml


64 changes: 34 additions & 30 deletions inst/_vcr/chat_anthropic.yml

