diff --git a/README.md b/README.md
index 521c4bc7d..0fc1e8ca9 100644
--- a/README.md
+++ b/README.md
@@ -55,7 +55,7 @@ The official [Ollama Docker image](https://hub.docker.com/r/ollama/ollama) `olla
 ollama
 ```
 
-You'll be prompted to run a model or connect Ollama to your existing agents or applications such as `claude`, `codex`, `copilot`, `openclaw` and more.
+You'll be prompted to run a model or connect Ollama to your existing agents or applications such as `Claude Code`, `OpenClaw`, `OpenCode`, `Codex`, `Copilot`, and more.
 
 ### Coding
 
diff --git a/docs/docs.json b/docs/docs.json
index 3b2e651ff..17884d992 100644
--- a/docs/docs.json
+++ b/docs/docs.json
@@ -120,6 +120,7 @@
       "pages": [
         "/integrations/claude-code",
         "/integrations/codex",
+        "/integrations/copilot-cli",
         "/integrations/opencode",
         "/integrations/droid",
         "/integrations/goose",
diff --git a/docs/integrations/copilot-cli.mdx b/docs/integrations/copilot-cli.mdx
index 0c476bce8..5262234ed 100644
--- a/docs/integrations/copilot-cli.mdx
+++ b/docs/integrations/copilot-cli.mdx
@@ -4,7 +4,7 @@ title: Copilot CLI
 
 GitHub Copilot CLI is GitHub's AI coding agent for the terminal. It can understand your codebase, make edits, run commands, and help you build software faster.
 
-Open models can be used with Copilot CLI through Ollama's OpenAI-compatible API, enabling you to use models such as `qwen3.5`, `glm-5:cloud`, `kimi-k2.5:cloud`.
+Open models can be used with Copilot CLI through Ollama, enabling you to use models such as `qwen3.5`, `glm-5.1:cloud`, `kimi-k2.5:cloud`.
 
 ## Install
 
@@ -90,4 +90,4 @@ Or run with environment variables inline:
 COPILOT_PROVIDER_BASE_URL=http://localhost:11434/v1 COPILOT_PROVIDER_API_KEY= COPILOT_PROVIDER_WIRE_API=responses COPILOT_MODEL=glm-5:cloud copilot
 ```
 
-**Note:** For best results, we recommend models with at least 64k context tokens. See the [context length documentation](/context-length) for how to adjust context length in Ollama.
+**Note:** Copilot requires a large context window. We recommend at least 64k tokens. See the [context length documentation](/context-length) for how to adjust context length in Ollama.
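
The note above recommends a context window of at least 64k tokens. A minimal sketch of wiring this up in a POSIX shell, reusing the provider variables from the patch; the `OLLAMA_CONTEXT_LENGTH` variable and the `65536` value are assumptions to verify against the linked context-length documentation:

```shell
# Assumed: the Ollama server honors OLLAMA_CONTEXT_LENGTH (check the
# context-length docs for your version); 65536 meets the 64k recommendation.
export OLLAMA_CONTEXT_LENGTH=65536

# Provider settings for Copilot CLI, taken from the patch above.
export COPILOT_PROVIDER_BASE_URL=http://localhost:11434/v1
export COPILOT_PROVIDER_API_KEY=      # Ollama does not require an API key
export COPILOT_PROVIDER_WIRE_API=responses
export COPILOT_MODEL=glm-5:cloud
```

With these exported, restarting the Ollama server and then launching `copilot` from the same shell should pick up both the larger context and the provider settings, rather than passing everything inline on one command line.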