---
title: VS Code
---

## Install

VS Code includes built-in AI chat through GitHub Copilot Chat. Ollama models can be used directly in the Copilot Chat model picker.

Install [VS Code](https://code.visualstudio.com/download).
## Prerequisites

- Ollama v0.18.3+
- [VS Code 1.113+](https://code.visualstudio.com/download)
- [GitHub Copilot Chat extension 0.41.0+](https://marketplace.visualstudio.com/items?itemName=GitHub.copilot-chat)

<Note>VS Code requires you to be signed in to use its model selector, even for custom models. A paid GitHub Copilot account is not required; GitHub Copilot Free enables model selection for custom models.</Note>
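
To confirm the prerequisites are met, the installed versions can be checked from a terminal. This is a sketch that assumes the `ollama` binary and VS Code's `code` command are on your PATH:

```shell
# Check installed versions against the prerequisites above.
# Assumes the `ollama` binary and VS Code's `code` CLI are on PATH.
ollama -v        # expect 0.18.3 or newer
code --version   # expect 1.113 or newer
```
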

## Quick setup

```shell
ollama launch vscode
```

Recommended models will be shown after running the command. See the latest models at [ollama.com](https://ollama.com/search?c=tools).
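
To have a local model ready before opening the picker, one can be pulled ahead of time. The model name `qwen3` here is only an illustrative choice; any tool-capable model from the link above works:

```shell
# Pull an example tool-capable model so it shows up in Copilot Chat's picker.
# `qwen3` is illustrative; substitute any model from ollama.com/search?c=tools.
ollama pull qwen3

# Confirm it is available locally.
ollama list
```
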
Make sure **Local** is selected at the bottom of the Copilot Chat panel to use your Ollama models.

<div style={{ display: "flex", justifyContent: "center" }}>
  <img
    src="/images/local.png"
    alt="Ollama Local Models"
    width="60%"
    style={{ borderRadius: "4px", marginTop: "10px", marginBottom: "10px" }}
  />
</div>

## Run directly with a model

```shell
ollama launch vscode --model qwen3.5:cloud
```

Cloud models are also available at [ollama.com](https://ollama.com/search?c=cloud).
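
Cloud models execute on ollama.com rather than on your machine, and recent Ollama versions require signing in before they can be used. A rough sketch (the exact commands available depend on your Ollama version):

```shell
# Sign in to ollama.com; cloud models require an account in recent versions.
ollama signin

# Verify cloud access by running the same model from the terminal.
ollama run qwen3.5:cloud
```
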
## Manual setup

To configure Ollama manually without `ollama launch`:

1. Open the **Copilot Chat** side bar from the top right corner

   <div style={{ display: "flex", justifyContent: "center" }}>
     <img
       src="/images/vscode-sidebar.png"
       alt="VS Code chat Sidebar"
       width="75%"
       style={{ borderRadius: "4px" }}
     />
   </div>

2. Click the **settings gear icon** (<Icon icon="gear" />) to open the Language Models window
<div style={{ display: "flex", justifyContent: "center" }}>
|
||||
<img
|
||||
src="/images/vscode-models.png"
|
||||
src="/images/vscode-other-models.png"
|
||||
alt="VS Code model picker"
|
||||
width="75%"
|
||||
style={{ borderRadius: "4px" }}
|
||||
/>
|
||||
</div>
|
||||
3. Click **Add Models** and select **Ollama** to load all your Ollama models into VS Code
<div style={{ display: "flex", justifyContent: "center" }}>
|
||||
<img
|
||||
src="/images/vscode-model-options.png"
|
||||
alt="VS Code model options dropdown"
|
||||
src="/images/vscode-add-ollama.png"
|
||||
alt="VS Code model options dropdown to add ollama models"
|
||||
width="75%"
|
||||
style={{ borderRadius: "4px" }}
|
||||
/>
|
||||
</div>
|
||||
|
||||
4. Click the **Unhide** button in the model picker to show your Ollama models

   <div style={{ display: "flex", justifyContent: "center" }}>
     <img
       src="/images/vscode-unhide.png"
       alt="VS Code unhide models button"
       width="75%"
       style={{ borderRadius: "4px" }}
     />
   </div>
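
If Ollama models still do not appear in the picker, it can help to confirm the local server is reachable. This assumes a default installation listening on port 11434:

```shell
# List the models the local Ollama server exposes (default port 11434).
# Copilot Chat's Ollama provider talks to this same endpoint.
curl http://localhost:11434/api/tags
```
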