
How to Fix Docker No Space Left on Device Error

By Admin · Mar 1, 2026 · Updated Apr 23, 2026

What Causes This Error?

Docker can consume significant disk space through images, containers, volumes, and build cache. The "no space left on device" error means Docker has filled the partition where its data directory resides (usually /var/lib/docker).

Step 1: Check Disk Usage

# Overall disk usage
df -h /var/lib/docker

# Docker-specific disk usage summary
docker system df

# Detailed breakdown
docker system df -v
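If `docker system df` doesn't account for all the missing space, it can help to rank the contents of Docker's data root directly. A minimal sketch, assuming the default data root (verify yours with `docker info --format '{{.DockerRootDir}}'`); run as root if the directory is not readable:

```shell
# Rank Docker's storage areas by size. DOCKER_ROOT is an assumption;
# override it if your daemon uses a non-default data root.
DOCKER_ROOT="${DOCKER_ROOT:-/var/lib/docker}"
if [ -d "$DOCKER_ROOT" ]; then
  du -sh "$DOCKER_ROOT"/* 2>/dev/null | sort -rh | head -10
else
  echo "No such directory: $DOCKER_ROOT"
fi
```

Large `overlay2` and `containers` entries usually point to image layers and container logs, respectively.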

Step 2: Clean Up Unused Resources

Remove stopped containers, dangling images, unused networks, and build cache:

# Safe cleanup of unused resources
docker system prune -f

# Aggressive cleanup: -a also removes ALL unused images, not just dangling ones
# (WARNING: --volumes deletes data stored in unused volumes)
docker system prune -a --volumes -f
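The aggressive form is easy to run by accident. One pattern is to wrap both variants in a small script that defaults to the safe command; this is a sketch, and the `prune_cmd` helper and `--deep` flag are illustrative, not Docker features:

```shell
#!/bin/sh
# prune_cmd is a hypothetical helper: it prints the prune command to run,
# defaulting to the safe form unless --deep is explicitly requested.
prune_cmd() {
  if [ "$1" = "--deep" ]; then
    # destructive: removes ALL unused images and unused volumes
    echo "docker system prune -a --volumes -f"
  else
    echo "docker system prune -f"
  fi
}

echo "Would run: $(prune_cmd "$@")"
# Once reviewed, execute it with: eval "$(prune_cmd "$@")"
```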

Step 3: Remove Specific Resources

# Remove stopped containers
docker container prune -f

# Remove dangling images (untagged)
docker image prune -f

# Remove ALL unused images (not just dangling)
docker image prune -a -f

# Remove unused volumes
docker volume prune -f

# Remove old build cache
docker builder prune -f --keep-storage=2GB
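Pruning everything at once can be too blunt. The image, container, and builder prune commands also accept `--filter until=<duration>` to remove only resources older than a given age. A sketch that builds such commands for review before running them (the `until_filter` helper and the 168h / 7-day window are illustrative choices, not Docker defaults):

```shell
# until_filter is a hypothetical helper: it prints a time-scoped prune
# command instead of running it, so the commands can be reviewed first.
until_filter() {
  printf 'docker %s prune -f --filter until=%s' "$1" "$2"
}

# Keep only the last 7 days (168h) of each resource type:
until_filter image 168h; echo
until_filter container 168h; echo
until_filter builder 168h; echo
```

Note that `docker volume prune` does not support the `until` filter.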

Step 4: Clean Up Container Logs

Container logs can grow very large over time:

# Find large log files
find /var/lib/docker/containers -name "*.log" -size +100M

# Truncate a specific container log
truncate -s 0 /var/lib/docker/containers/CONTAINER_ID/CONTAINER_ID-json.log
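To truncate every oversized log in one pass rather than one at a time, the `find` above can be extended with an `-exec` action. A sketch, assuming the default json-file log driver and data root:

```shell
# Truncate all container logs over 100 MB. LOG_DIR is an assumption;
# adjust it if your daemon stores container state elsewhere.
LOG_DIR="${LOG_DIR:-/var/lib/docker/containers}"
if [ -d "$LOG_DIR" ]; then
  # -print lists each log that was truncated
  find "$LOG_DIR" -name "*-json.log" -size +100M \
    -exec truncate -s 0 {} \; -print
fi
```

Truncating in place (rather than deleting the file) keeps the running container's log file descriptor valid.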

Prevent log bloat by configuring log rotation in /etc/docker/daemon.json:

{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "10m",
    "max-file": "3"
  }
}
Restart the daemon to apply the change. Note that the new limits only affect containers created after the restart; existing containers keep their previous log settings until they are recreated.

sudo systemctl restart docker

Prevention

  • Set up a cron job to run docker system prune -f weekly
  • Use multi-stage builds to reduce image sizes
  • Always configure log rotation for production containers
  • Monitor disk usage with alerts at 80% capacity
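The last two bullets can be combined into a small script suitable for cron. A sketch, assuming GNU `df` and an 80% threshold (the path and threshold are assumptions; tune them for your host):

```shell
#!/bin/sh
# Warn when the filesystem holding Docker's data root passes a threshold.
# DOCKER_DIR and THRESHOLD are assumptions; adjust for your environment.
DOCKER_DIR="${DOCKER_DIR:-/var/lib/docker}"
THRESHOLD="${THRESHOLD:-80}"

# Fall back to / if the Docker directory doesn't exist on this host
[ -d "$DOCKER_DIR" ] || DOCKER_DIR=/

# df --output=pcent prints e.g. " 73%"; strip everything but the digits
USED_PCT=$(df --output=pcent "$DOCKER_DIR" | tail -n 1 | tr -dc '0-9')
if [ "$USED_PCT" -ge "$THRESHOLD" ]; then
  echo "WARN: $DOCKER_DIR filesystem is ${USED_PCT}% full; consider 'docker system prune -f'"
else
  echo "OK: $DOCKER_DIR filesystem is ${USED_PCT}% full"
fi
```

Dropped into /etc/cron.weekly (or wired to your alerting system), this covers both the scheduled cleanup reminder and the 80% alert.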
