From 745a9b426a6e7217ca69214052e01d6f3f5a3c22 Mon Sep 17 00:00:00 2001
From: Alexander Chumakov
Date: Tue, 12 Aug 2025 10:28:48 +0300
Subject: [PATCH] Fix link to model_providers doc in README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 0c01654d66..40f6a7b520 100644
--- a/README.md
+++ b/README.md
@@ -263,7 +263,7 @@ they'll be committed to your working directory.
 
 Codex also allows you to use other providers that support the OpenAI Chat Completions (or Responses) API.
 
-To do so, you must first define custom [providers](./config.md#model_providers) in `~/.codex/config.toml`. For example, the provider for a standard Ollama setup would be defined as follows:
+To do so, you must first define custom [providers](./codex-rs/config.md#model_providers) in `~/.codex/config.toml`. For example, the provider for a standard Ollama setup would be defined as follows:
 
 ```toml
 [model_providers.ollama]
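For context (not part of the patch itself): the README passage being edited introduces a custom provider definition in `~/.codex/config.toml`. A minimal sketch of what a complete Ollama provider entry might look like is below — the `name` and `base_url` keys and the default Ollama endpoint are assumptions based on a standard local Ollama setup, not taken from this diff:

```toml
# Sketch of a custom model provider entry in ~/.codex/config.toml.
# Assumes Ollama is serving its OpenAI-compatible API on the default
# local port (11434); adjust base_url if your setup differs.
[model_providers.ollama]
name = "Ollama"
base_url = "http://localhost:11434/v1"
```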