---
title: VS Code
---
VS Code includes built-in AI chat through GitHub Copilot Chat. Ollama models can be used directly in the Copilot Chat model picker.

## Prerequisites
- Ollama v0.18.3+
- [VS Code 1.113+](https://code.visualstudio.com/download)
- [GitHub Copilot Chat extension 0.41.0+](https://marketplace.visualstudio.com/items?itemName=GitHub.copilot-chat)

VS Code requires you to be signed in to use the model picker, even for custom models. A paid GitHub Copilot plan is not required: GitHub Copilot Free enables model selection for custom models.

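To confirm the installed Ollama release meets the minimum above, you can compare version strings with `sort -V`. A minimal sketch; the parsing of `ollama --version` output is an assumption and may need adjusting for your shell or release:

```shell
#!/bin/sh
# Compare the installed Ollama version against the documented minimum.
# Assumes `ollama --version` prints a line containing the version number.
required="0.18.3"
installed="$(ollama --version | grep -o '[0-9][0-9.]*' | head -n 1)"

# sort -V orders version strings numerically; if the required version sorts
# first (or equal), the installed version is new enough.
if [ "$(printf '%s\n%s\n' "$required" "$installed" | sort -V | head -n 1)" = "$required" ]; then
  echo "Ollama $installed meets the minimum ($required)"
else
  echo "Ollama $installed is older than $required; please upgrade" >&2
fi
```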
## Quick setup
```shell
ollama launch vscode
```
The command prints a list of recommended models. See the latest models at [ollama.com](https://ollama.com/search?c=tools).
Make sure **Local** is selected at the bottom of the Copilot Chat panel to use your Ollama models.
## Run directly with a model
```shell
ollama launch vscode --model qwen3.5:cloud
```
Cloud models are also available at [ollama.com](https://ollama.com/search?c=cloud).
## Manual setup
To configure Ollama manually without `ollama launch`:
1. Open the **Copilot Chat** side bar from the top right corner
2. Click the **settings gear icon** to open the Language Models window
3. Click **Add Models** and select **Ollama** to load all your Ollama models into VS Code
4. Click the **Unhide** button in the model picker to show your Ollama models
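
If the Ollama provider shows no models in step 3, check that the local Ollama server is reachable. A quick sketch using Ollama's REST API; `11434` is the default port and `/api/tags` lists local models, but adjust the address if you have set `OLLAMA_HOST`:

```shell
# List locally available models via the Ollama REST API.
# Assumes the server is running on the default port (11434).
curl -s http://localhost:11434/api/tags
```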