From 2dbb000908f046396af38ab38462fdd6551eab35 Mon Sep 17 00:00:00 2001
From: Maternion <98753158+maternion@users.noreply.github.com>
Date: Tue, 10 Feb 2026 10:13:04 +0530
Subject: [PATCH] update context length format.

---
 docs/context-length.mdx | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/context-length.mdx b/docs/context-length.mdx
index 9b1cd8ae8..06ae21a39 100644
--- a/docs/context-length.mdx
+++ b/docs/context-length.mdx
@@ -6,9 +6,9 @@ Context length is the maximum number of tokens that the model has access to in m
 
 Ollama defaults to the following context lengths based on VRAM:
 
- - < 24 GiB VRAM: 4,096 context
- - 24-48 GiB VRAM: 32,768 context
- - >= 48 GiB VRAM: 262,144 context
+ - < 24 GiB VRAM: 4k context
+ - 24-48 GiB VRAM: 32k context
+ - >= 48 GiB VRAM: 256k context
 
 Tasks which require large context like web search, agents, and coding tools should be set to at least 64000 tokens.
 