Initialize public data and docs repository

This commit is contained in:
AltStack Bot
2026-02-25 22:36:27 +05:30
commit 2a0ac1b107
357 changed files with 50685 additions and 0 deletions

---
title: "Deploy Meta Llama 3.1 Self-Hosted (Docker)"
description: "Step-by-step guide to self-hosting Meta Llama 3.1 with Docker Compose."
---
# Deploy Meta Llama 3.1
Meta's flagship open-weight model family with a 128K-token context window, available in 8B, 70B, and 405B parameter sizes.
<div className="deploy-hero">
<span className="deploy-hero-item">⭐ 65.0k stars</span>
<span className="deploy-hero-item">📜 Llama 3.1 Community License</span>
<span className="deploy-hero-item">🔴 Advanced</span>
<span className="deploy-hero-item">⏱ ~20 minutes</span>
</div>
<div className="mt-8 mb-4">
<a
href="https://m.do.co/c/2ed27757a361"
target="_blank"
rel="noopener noreferrer"
className="flex items-center justify-center w-full px-6 py-4 text-lg font-bold text-white transition-all bg-blue-600 rounded-xl hover:bg-blue-700 hover:scale-[1.02] shadow-lg shadow-blue-500/30"
>
🚀 Deploy on DigitalOcean ($200 Free Credit)
</a>
</div>
## What You'll Get
A fully working Meta Llama 3.1 instance running on your server. Your data stays on your hardware — no third-party access, no usage limits, no surprise invoices.
## Prerequisites
- A server with Docker and Docker Compose installed ([setup guide](/quick-start/choosing-a-server))
- Enough RAM for your chosen model size (roughly 8 GB for the quantized 8B model; 70B and 405B need far more and benefit heavily from a GPU)
- A domain name pointed to your server (optional but recommended)
- Basic terminal access (SSH)
## The Config
Create a directory for Meta Llama 3.1 and add this `docker-compose.yml`. It serves the model through [Ollama](https://ollama.com), the most common runtime for self-hosting Llama models:
```yaml
# -------------------------------------------------------------------------
# 🚀 Created and distributed by The AltStack
# 🌍 https://thealtstack.com
# -------------------------------------------------------------------------
services:
ollama-llama:
image: ollama/ollama:latest
container_name: ollama-llama
restart: unless-stopped
command: serve
ports:
- "11434:11434"
volumes:
- ollama:/root/.ollama
volumes:
ollama:
```
## Let's Ship It
```bash
# Create a directory
mkdir -p /opt/llama && cd /opt/llama

# Create the docker-compose.yml (paste the config above)
nano docker-compose.yml

# Pull the image and start the Ollama server
docker compose up -d

# Download the model weights (~5 GB for the default 8B quantization)
docker compose exec ollama-llama ollama pull llama3.1

# Watch the logs
docker compose logs -f
```
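Once the container is up, you can sanity-check Ollama's HTTP API from the server itself. This assumes the `llama3.1` model has been pulled (e.g. with `docker compose exec ollama-llama ollama pull llama3.1`):

```bash
# List the models the server has downloaded
curl -s http://localhost:11434/api/tags

# Ask for a completion (non-streaming)
curl -s http://localhost:11434/api/generate \
  -d '{"model": "llama3.1", "prompt": "Why is the sky blue?", "stream": false}'
```

If the first call returns an empty model list, the pull hasn't finished yet.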
## Post-Deployment Checklist
- [ ] Service is accessible on the configured port
- [ ] Admin account created (if applicable)
- [ ] Reverse proxy configured ([Caddy guide](/concepts/reverse-proxies))
- [ ] SSL/HTTPS working
- [ ] Backup script set up ([backup guide](/concepts/backups))
- [ ] Uptime monitor added ([Uptime Kuma](/deploy/uptime-kuma))
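For the backup item above, a minimal sketch is to tar the named volume from a throwaway container. This assumes the Compose project directory is `llama`, so the volume is named `llama_ollama` (check with `docker volume ls`):

```bash
# Back up the Ollama model volume to a dated tarball.
# Assumes the volume is named "llama_ollama" — adjust to match `docker volume ls`.
BACKUP_DIR=/opt/backups
mkdir -p "$BACKUP_DIR"
docker run --rm \
  -v llama_ollama:/data:ro \
  -v "$BACKUP_DIR":/backup \
  alpine tar czf "/backup/ollama-$(date +%F).tar.gz" -C /data .
```

Note the volume is mounted read-only (`:ro`), so a running backup can't corrupt the model store.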
## The "I Broke It" Section
**Container won't start?**
```bash
docker compose logs ollama-llama | tail -50
```
**Port already in use?**
```bash
# Find what's using port 11434
lsof -i :11434
```
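If something else owns port 11434, you can remap the host side in the compose file; clients then connect to the new host port while the container still listens on 11434 internally:

```yaml
    ports:
      - "11435:11434"  # host:container — clients now use port 11435
```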
**Need to start fresh?**
```bash
docker compose down -v # ⚠️ This deletes volumes/data!
docker compose up -d
```
## Going Further
- [Meta Llama 3.1 on AltStack Directory](https://thealtstack.com/alternative-to/llama)
- [Meta Llama 3.1 Self-Hosted Guide](https://thealtstack.com/self-hosted/llama)
- [Official Documentation](https://llama.meta.com)
- [GitHub Repository](https://github.com/meta-llama/llama3)