docs: update instructions for ollama config command

These tools can now be configured automatically using the new `ollama config` command.
Author: Bruce MacDonald
Date: 2026-01-21 17:03:41 -08:00
parent b5d0f72f16
commit cc3ac5fee3
4 changed files with 132 additions and 31 deletions


@@ -2,22 +2,31 @@
title: Codex
---
Codex is OpenAI's agentic coding tool for the command line.
## Install
Install the [Codex CLI](https://developers.openai.com/codex/cli/):
```shell
npm install -g @openai/codex
```
## Usage with Ollama
<Note>Codex requires a larger context window. It is recommended to use a context window of at least 32K tokens.</Note>
Configure Codex to use Ollama:
```shell
ollama config codex
```
This will prompt you to select a model and automatically configure Codex to use Ollama.
<Accordion title="Manual Configuration">
To use `codex` with Ollama, use the `--oss` flag:
```shell
codex --oss
```
@@ -25,20 +34,22 @@ codex --oss
By default, Codex uses the local `gpt-oss:20b` model. You can specify a different model with the `-m` flag:
```shell
codex --oss -m gpt-oss:120b
```
### Cloud Models
To run a cloud model instead of a local one, pass a model name with the `-cloud` suffix:
```shell
codex --oss -m gpt-oss:120b-cloud
```
</Accordion>
## Connecting to ollama.com
Create an [API key](https://ollama.com/settings/keys) from ollama.com and export it as `OLLAMA_API_KEY`.
To use ollama.com directly, edit your `~/.codex/config.toml` file to point to ollama.com.
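As a sketch, the provider entry in `~/.codex/config.toml` might look like the following. The provider key `ollama` and the model name are illustrative assumptions; confirm the exact field names against the Codex configuration reference:

```toml
# ~/.codex/config.toml — illustrative sketch, not a verified configuration
model = "gpt-oss:120b"            # assumed example model name
model_provider = "ollama"

[model_providers.ollama]
name = "Ollama"
base_url = "https://ollama.com/v1"
env_key = "OLLAMA_API_KEY"        # Codex reads the key from this environment variable
```

With `OLLAMA_API_KEY` exported as described above, Codex requests are sent to ollama.com instead of a local Ollama server.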