
Automate Model Updates and Management with Ollama

By Admin · Mar 15, 2026 · Updated Apr 24, 2026 · 2 min read

Keeping your locally-hosted AI models current ensures you benefit from the latest improvements in quality, speed, and capabilities. This guide covers automating Ollama model updates, managing multiple model versions, creating custom Modelfiles, and building a robust model management workflow on your VPS.

Understanding Ollama Model Management

Ollama simplifies LLM management by handling downloads, storage, and serving. Models are stored in layers (similar to Docker images), making updates efficient since only changed layers need downloading.

Model Storage and Organization

# Default model storage location
ls ~/.ollama/models/

# Check disk usage by models
du -sh ~/.ollama/models/

# List all installed models with sizes
ollama list

# Show model details
ollama show llama3.1:8b --modelfile
ollama show llama3.1:8b --parameters
ollama show llama3.1:8b --system
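`ollama list` reports a size per model but no total. A small helper can sum the SIZE column; this is a sketch (the function name is ours, not part of Ollama) that assumes sizes are printed as two fields such as `4.7 GB`, and ignores MB-sized entries:

```shell
# Hypothetical helper: sum the GB sizes in `ollama list` output.
# Assumes each size appears as two fields, e.g. "4.7 GB".
total_model_size() {
    awk 'NR > 1 { for (i = 2; i <= NF; i++) if ($i == "GB") sum += $(i-1) }
         END { printf "%.1f GB\n", sum }'
}

# Usage: ollama list | total_model_size
```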

Automated Model Update Script

#!/bin/bash
# /opt/scripts/ollama-update.sh
# Automatically updates all installed Ollama models

LOG_FILE="/var/log/ollama-updates.log"
MODELS_TO_TRACK=(
    "llama3.1:8b"
    "deepseek-coder-v2:16b"
    "nomic-embed-text"
    "mistral:7b"
    "codellama:13b"
)

log() {
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] $1" | tee -a "$LOG_FILE"
}

update_model() {
    local model=$1
    log "Checking for updates: $model"

    # Pull the latest version
    output=$(ollama pull "$model" 2>&1)

    if echo "$output" | grep -q "up to date"; then
        log "  $model: already up to date"
    else
        log "  $model: UPDATED"
        log "  $output"
    fi
}

# Check Ollama service is running
if ! systemctl is-active --quiet ollama; then
    log "ERROR: Ollama service is not running"
    exit 1
fi

log "=== Starting model update check ==="

for model in "${MODELS_TO_TRACK[@]}"; do
    update_model "$model"
done

# Note: Ollama has no `prune` command; reclaim disk space by removing
# models you no longer need explicitly, e.g. `ollama rm <old-model>`
log "Reminder: remove unneeded models with 'ollama rm <model>' to free space"

# Report disk usage
DISK_USAGE=$(du -sh ~/.ollama/models/ 2>/dev/null | awk '{print $1}')
log "Total model storage: $DISK_USAGE"
log "=== Update check complete ==="
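The script appends to /var/log/ollama-updates.log indefinitely. If logrotate is installed (standard on most distributions), a drop-in rule keeps the file bounded; the rotation thresholds below are illustrative, not prescriptive:

```
# /etc/logrotate.d/ollama-updates
/var/log/ollama-updates.log {
    monthly
    rotate 6
    compress
    missingok
    notifempty
}
```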

Schedule with Cron

# Run model updates weekly at 3 AM Sunday
# (add this line to the root crontab via `sudo crontab -e`)
0 3 * * 0 /opt/scripts/ollama-update.sh

# Or use a systemd timer for better logging
sudo tee /etc/systemd/system/ollama-update.timer <<'EOF'
[Unit]
Description=Weekly Ollama model update check

[Timer]
OnCalendar=Sun *-*-* 03:00:00
Persistent=true

[Install]
WantedBy=timers.target
EOF

sudo tee /etc/systemd/system/ollama-update.service <<'EOF'
[Unit]
Description=Ollama model update check

[Service]
Type=oneshot
ExecStart=/opt/scripts/ollama-update.sh
EOF

sudo systemctl daemon-reload
sudo systemctl enable --now ollama-update.timer

Model Version Snapshots

# Record the currently installed models so you can track changes over time
snapshot_models() {
    local snapshot_file="/var/backups/ollama/models-$(date +%Y%m%d).txt"
    mkdir -p "$(dirname "$snapshot_file")"
    ollama list > "$snapshot_file"

    echo "Snapshot saved: $snapshot_file"
}

# Compare two snapshots to see what changed
compare_snapshots() {
    diff "$1" "$2"
}
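The snapshot helpers can be exercised without touching a live Ollama install by diffing two saved listings; the file names and model entries below are illustrative:

```shell
# Simulate two snapshots and inspect the change set
printf 'llama3.1:8b\nmistral:7b\n' > /tmp/models-before.txt
printf 'llama3.1:8b\nmistral:7b\ncodellama:13b\n' > /tmp/models-after.txt

# Lines prefixed with ">" were added since the first snapshot
diff /tmp/models-before.txt /tmp/models-after.txt || true
```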
