---
title: Guide to Install OpenWebUI and Ollama in Proxmox VM
description:
published: true
date: 2025-10-02T13:14:25.148Z
tags: ubuntu server, linux, proxmox, virtual machine
editor: markdown
dateCreated: 2025-10-02T13:14:25.148Z
---

# Guide to Creating a VM for OpenWebUI and Ollama on Proxmox VE

This guide provides a step-by-step process to create a Virtual Machine (VM) on Proxmox VE 8.4.14 (running on an AMD Ryzen 7 5800H with 16 threads, AMD Radeon Vega graphics, and 61 GB RAM) to host OpenWebUI and Ollama for local AI inference. The VM will use Ubuntu Server 24.04 LTS for stability and Docker for containerized deployment.

**System Considerations:**

- Conservative VM resources (8 CPU cores, 16 GB RAM, 50 GB disk) balance guest performance against host load; adjust upward for larger models.
- CPU-based inference is used for simplicity; GPU passthrough (for ROCm) is advanced and not covered here.
- Assumes the Proxmox host is up to date (`apt update && apt full-upgrade` on the host).
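
Before creating the VM, it is worth confirming what the host actually has free. A quick check on the Proxmox host (standard tools; nothing here is specific to this guide):

```bash
# On the Proxmox host: confirm PVE version and available resources.
pveversion                                # Proxmox VE version
lscpu | grep -E 'Model name|^CPU\(s\)'    # CPU model and thread count
free -h                                   # available RAM
df -h /var/lib/vz                         # space backing the default `local` storage
```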

## Step 1: Prepare the ISO Image

1. Download the Ubuntu Server 24.04 LTS ISO (AMD64) from [ubuntu.com/download/server](https://ubuntu.com/download/server).
2. Upload it to Proxmox:
   - Log into the Proxmox web UI (`https://your-host-ip:8006`).
   - Go to Datacenter > your-node > Storage > `local` > ISO Images > Upload.
   - Select the ISO and upload it.
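
Alternatively, the ISO can be fetched directly on the host instead of uploading through the browser. A minimal sketch, assuming the default `local` storage; the exact point-release filename changes over time, so check [releases.ubuntu.com](https://releases.ubuntu.com/24.04/) first:

```bash
# On the Proxmox host: drop the ISO into the directory backing
# the `local` storage's ISO Images content.
cd /var/lib/vz/template/iso/
wget https://releases.ubuntu.com/24.04/ubuntu-24.04.3-live-server-amd64.iso

# Optional: verify against Ubuntu's published SHA256SUMS.
sha256sum ubuntu-24.04.3-live-server-amd64.iso
```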

## Step 2: Create the VM in Proxmox

1. In the Proxmox web UI, click **Create VM**.
2. **General**:
   - Node: your host (default).
   - VM ID: auto-generated, or set your own (e.g., 100).
   - Name: `OpenWebUI-Ollama-VM`.
   - Click *Next*.
3. **OS**:
   - Guest OS: Linux, 6.x - 2.6 Kernel.
   - ISO Image: select the uploaded Ubuntu ISO.
   - Click *Next*.
4. **System**:
   - BIOS: OVMF (UEFI); check *Add EFI Disk* and pick a storage for it when prompted.
   - Machine: q35.
   - SCSI Controller: VirtIO SCSI single.
   - Click *Next*.
5. **Hard Disk**:
   - Bus/Device: VirtIO Block.
   - Storage: `local-lvm` (or your default).
   - Disk Size: 50 GiB (increase for large models).
   - Click *Next*.
6. **CPU**:
   - Sockets: 1.
   - Cores: 8 (adjust 4-12 based on workload).
   - Type: host.
   - Click *Next*.
7. **Memory**:
   - Memory: 16384 MiB (16 GB; raise to 32 GB for large models).
   - Minimum Memory: 8192 MiB (optional; enables ballooning).
   - Click *Next*.
8. **Network**:
   - Model: VirtIO (paravirtualized).
   - Bridge: vmbr0.
   - Firewall: optional.
   - Click *Next*.
9. **Confirm**:
   - Review the settings and optionally check *Start after created*.
   - Click *Finish*.
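
For reference, the same VM can be created from the host shell with the `qm` CLI. A rough equivalent of the wizard settings above, assuming VM ID 100, `local-lvm` storage, and the ISO filename from Step 1 (adjust all three to your environment):

```bash
# Create the VM with the settings chosen in the wizard walkthrough.
qm create 100 \
  --name OpenWebUI-Ollama-VM \
  --ostype l26 \
  --machine q35 \
  --bios ovmf \
  --efidisk0 local-lvm:1,efitype=4m \
  --scsihw virtio-scsi-single \
  --virtio0 local-lvm:50 \
  --sockets 1 --cores 8 --cpu host \
  --memory 16384 --balloon 8192 \
  --net0 virtio,bridge=vmbr0 \
  --cdrom local:iso/ubuntu-24.04.3-live-server-amd64.iso

qm start 100   # then watch the installer in the web UI console
```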

## Step 3: Install Ubuntu Server

1. Select the VM in Proxmox and open the **Console** (noVNC).
2. Boot from the ISO and follow the Ubuntu installer:
   - Language: English.
   - Keyboard: default.
   - Network: DHCP.
   - Proxy: none.
   - Mirror: default.
   - Storage: use the entire disk.
   - Profile: set a username (e.g., `user`), password, and hostname (e.g., `ai-vm`).
   - SSH: enable the OpenSSH server.
   - Snaps: skip.
   - Complete the installation and reboot (remove the ISO if prompted).
3. Log in via the console.
4. Update the system: `sudo apt update && sudo apt upgrade -y`.
5. Install curl: `sudo apt install curl -y`.
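
The remaining steps are more comfortable over SSH than in the noVNC console. Assuming the username from the installer profile above (the IP shown is a placeholder):

```bash
# Inside the VM console: find the address DHCP handed out.
ip -4 addr show

# From your workstation: connect over SSH (substitute the VM's IP).
ssh user@192.168.1.50
```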

## Step 4: Install Ollama

1. Install Ollama: `curl -fsSL https://ollama.com/install.sh | sh`.
2. Verify the install: `ollama --version`.
3. Pull a model: `ollama pull llama3` (or another model).
4. Test it: `ollama run llama3` (exit with `/bye`).
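
Since OpenWebUI talks to Ollama over its HTTP API, it is also worth confirming the endpoint responds. Ollama listens on port 11434 by default:

```bash
# Version endpoint: a quick liveness check.
curl http://localhost:11434/api/version

# One-off generation request (the response streams back as JSON lines).
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Say hello in one sentence."}'
```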

## Step 5: Install OpenWebUI

1. Install Docker: `sudo apt install docker.io -y && sudo systemctl enable --now docker`.
2. Install Docker Compose: `sudo apt install docker-compose -y` (if your release only ships Compose v2, install `docker-compose-v2` and substitute `docker compose` for `docker-compose` below).
3. Create a working directory: `mkdir openwebui && cd openwebui`.
4. Create `docker-compose.yml`: run `nano docker-compose.yml` and add:

   ```yaml
   version: '3'
   services:
     openwebui:
       image: ghcr.io/open-webui/open-webui:main
       container_name: openwebui
       volumes:
         - ./data:/app/backend/data
       depends_on:
         - ollama
       ports:
         - 8080:8080
       environment:
         # Open WebUI reads the Ollama endpoint from OLLAMA_BASE_URL
         - OLLAMA_BASE_URL=http://ollama:11434
         - WEBUI_SECRET_KEY=your_secret_key_here
       restart: unless-stopped
     ollama:
       image: ollama/ollama
       container_name: ollama
       volumes:
         - ./ollama:/root/.ollama
       ports:
         - 11434:11434
       restart: unless-stopped
   ```

5. Start the containers: `docker-compose up -d`.

   Note: this stack runs its own Ollama container with its own model store (`./ollama`), separate from the native install in Step 4. If the native service is holding port 11434, stop it first (`sudo systemctl disable --now ollama`), then pull models inside the container: `docker exec -it ollama ollama pull llama3`.

6. Access OpenWebUI: visit `http://vm-ip-address:8080` in a browser, sign up, and select a model.
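
If the page doesn't load, a few checks to confirm the stack came up cleanly (service names as in the compose file above):

```bash
# Both services should show State: Up.
docker-compose ps

# Follow OpenWebUI's startup logs until it reports it is serving.
docker-compose logs -f openwebui

# The containerized Ollama should answer on its published port.
curl http://localhost:11434/api/version
```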

## Step 6: Optimize and Secure

1. **Monitor**: Check VM usage in Proxmox (VM > Summary). Adjust CPU/RAM if needed (shut down the VM first).
2. **Networking**: Set a static IP in Ubuntu (`sudo nano /etc/netplan/01-netcfg.yaml`; see the sketch after this list) or configure port forwarding.
3. **Security**: Enable the firewall, allowing SSH first so you don't lock yourself out: `sudo ufw allow OpenSSH && sudo ufw allow from your-ip to any port 8080 && sudo ufw enable`. Use HTTPS (e.g., an Nginx reverse proxy) for public access.
4. **Backups**: Enable scheduled backups in Proxmox (VM > Backup).
5. **Optional GPU**: Passthrough for ROCm is complex; research it separately.
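
For the static-IP option in item 2 above, here is a minimal netplan sketch. The file name, interface name, and addresses are illustrative; on 24.04 the existing file may be `50-cloud-init.yaml`, and the NIC is usually `ens18` in a VirtIO VM (confirm with `ip link`):

```yaml
# /etc/netplan/01-netcfg.yaml -- example values only; adjust to your LAN.
network:
  version: 2
  ethernets:
    ens18:
      dhcp4: false
      addresses: [192.168.1.50/24]
      routes:
        - to: default
          via: 192.168.1.1
      nameservers:
        addresses: [192.168.1.1, 1.1.1.1]
```

Apply it with `sudo netplan apply`, then reconnect using the new address.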

## Troubleshooting

- VM fails to boot: check the console for errors; verify the ISO.
- Ollama is slow: use smaller models or increase RAM/CPU.
- Docker issues: check the logs (`docker logs openwebui`).
- Port conflicts: edit the published ports in `docker-compose.yml`.

This setup provides a robust local AI environment. Pull additional models with `ollama pull`. Refer to the Proxmox or Ollama documentation for further assistance.