---
title: "Deploy Meta Llama 3.1 Self-Hosted (Docker)"
description: "Step-by-step guide to self-hosting Meta Llama 3.1 with Docker Compose."
---

# Deploy Meta Llama 3.1

Meta's flagship open-weight model with a 128K-token context window, available in 8B, 70B, and 405B parameter sizes.
⭐ 65.0k stars 📜 Llama 3.1 Community License 🔴 Advanced ⏱ ~20 minutes
## What You'll Get

A fully working Meta Llama 3.1 instance running on your own server. Your data stays on your hardware: no third-party access, no usage limits, no surprise invoices.

## Prerequisites

- A server with Docker and Docker Compose installed ([setup guide](/quick-start/choosing-a-server))
- A domain name pointed to your server (optional but recommended)
- Basic terminal access (SSH)

## The Config

Create a directory for Meta Llama 3.1 and add this `docker-compose.yml`. It runs [Ollama](https://ollama.com), which serves Llama 3.1 over an HTTP API on port 11434:

```yaml
# -------------------------------------------------------------------------
# 🚀 Created and distributed by The AltStack
# 🌍 https://thealtstack.com
# -------------------------------------------------------------------------
services:
  ollama-llama:
    image: ollama/ollama:latest
    container_name: ollama-llama
    restart: unless-stopped
    command: serve
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama

volumes:
  ollama:
```

## Let's Ship It

```bash
# Create a directory
mkdir -p /opt/llama && cd /opt/llama

# Create the docker-compose.yml (paste the config above)
nano docker-compose.yml

# Pull images and start
docker compose up -d

# Download the Llama 3.1 weights (the compose file only starts the server)
docker exec ollama-llama ollama pull llama3.1

# Watch the logs
docker compose logs -f
```

## Post-Deployment Checklist

- [ ] Service is accessible on the configured port
- [ ] Admin account created (if applicable)
- [ ] Reverse proxy configured ([Caddy guide](/concepts/reverse-proxies))
- [ ] SSL/HTTPS working
- [ ] Backup script set up ([backup guide](/concepts/backups))
- [ ] Uptime monitor added ([Uptime Kuma](/deploy/uptime-kuma))

## The "I Broke It" Section

**Container won't start?**

```bash
docker compose logs ollama-llama | tail -50
```

**Port already in use?**

```bash
# Find what's using port 11434
lsof -i :11434
```

**Need to start fresh?**

```bash
docker compose down -v  # ⚠️ This deletes volumes/data, including downloaded models!
docker compose up -d
```

## Going Further

- [Meta Llama 3.1 on AltStack Directory](https://thealtstack.com/alternative-to/llama)
- [Meta Llama 3.1 Self-Hosted Guide](https://thealtstack.com/self-hosted/llama)
- [Official Documentation](https://llama.meta.com)
- [GitHub Repository](https://github.com/meta-llama/llama3)
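The post-deployment checklist mentions putting the service behind a reverse proxy. A minimal Caddyfile sketch, assuming a hypothetical domain `llama.example.com` pointed at this server; Caddy provisions the HTTPS certificate automatically:

```caddyfile
llama.example.com {
	reverse_proxy localhost:11434
}
```

Note that Ollama ships no authentication of its own, so if this proxy is reachable from the internet, consider adding access control in Caddy (e.g. its `basic_auth` directive) or firewalling the port.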
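One thing the compose file does not do by itself is download any model weights; the container only runs the Ollama server. A minimal end-to-end smoke test, assuming the default `llama3.1` registry tag (the 8B variant) and the port mapping from the config above: pull the model if you haven't already, then request a single completion over the HTTP API.

```shell
# Model tag as published in the Ollama registry (8B variant by default)
MODEL="llama3.1"

# One-time download of the weights into the named volume (several GB)
docker exec ollama-llama ollama pull "$MODEL"

# Ask for one non-streaming completion over the local API
curl -s http://localhost:11434/api/generate \
  -d "{\"model\": \"$MODEL\", \"prompt\": \"Reply with the word OK.\", \"stream\": false}"
```

The reply is a single JSON object; if it contains a `"response"` field with model output, the whole stack (container, volume, weights, API) is healthy.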