
How to Deploy Flowise AI Workflow Builder

By Admin · Mar 1, 2026 · Updated Apr 23, 2026


Flowise is an open-source drag-and-drop tool for building LLM workflows and chatbots. Deploying it on your Breeze gives you a visual AI application builder with no vendor lock-in.

Requirements

  • A Breeze with at least 2 GB RAM
  • Node.js 18 or newer

Install Node.js

curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash -
sudo apt install nodejs -y
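Before installing Flowise, it is worth confirming the Node.js version meets the requirement. A small sketch that checks the major version; the example version string stands in for the actual output of `node -v`:

```shell
# Sanity check: Flowise needs Node.js 18 or newer. Substitute the real
# output of `node -v` for the example string (format: v<major>.<minor>.<patch>).
VERSION="v20.11.1"       # example value; in practice use VERSION=$(node -v)
MAJOR=${VERSION#v}       # strip the leading "v" -> 20.11.1
MAJOR=${MAJOR%%.*}       # keep only the major version -> 20
if [ "$MAJOR" -ge 18 ]; then
  echo "Node.js $VERSION is new enough for Flowise"
else
  echo "Node.js $VERSION is too old; rerun the install steps above" >&2
fi
```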

Install Flowise

npm install -g flowise
npx flowise start --PORT=3000
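Started this way, Flowise stops when you close your SSH session. One way to keep it running across logouts and reboots is a systemd unit; this is a sketch, assuming a global npm install (so `npx` is on the PATH at `/usr/bin/npx`) and a dedicated `flowise` system user:

```ini
# /etc/systemd/system/flowise.service
[Unit]
Description=Flowise AI workflow builder
After=network.target

[Service]
User=flowise
ExecStart=/usr/bin/npx flowise start
Restart=on-failure
Environment=PORT=3000

[Install]
WantedBy=multi-user.target
```

Enable it with `sudo systemctl enable --now flowise`.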

Docker Alternative

docker run -d -p 3000:3000 \
  -v flowise-data:/root/.flowise \
  --name flowise \
  --restart always \
  flowiseai/flowise
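If you prefer Docker Compose, a minimal compose file equivalent to the command above looks like this (a sketch; pin an image tag instead of the implicit `latest` if you need reproducible deploys):

```yaml
services:
  flowise:
    image: flowiseai/flowise
    container_name: flowise
    restart: always
    ports:
      - "3000:3000"
    volumes:
      - flowise-data:/root/.flowise

volumes:
  flowise-data:
```

Start it with `docker compose up -d` from the directory containing the file.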

Access and Configure

Open http://your-breeze-ip:3000 in your browser. The visual canvas lets you chain together LLM calls, document loaders, vector stores, and tools.

Connect to Local Models

Flowise works with Ollama and LocalAI. Add a ChatOllama or ChatLocalAI node and point it to your local inference server. This keeps all data on your Breeze.
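Ollama's API listens on port 11434 by default, so the value to enter in the ChatOllama node's Base URL field can be sketched like this (the hostname assumes Ollama runs on the same Breeze; after installing Ollama, pull a model first, e.g. `ollama pull llama3`):

```shell
# Build the base URL the ChatOllama node expects. Port 11434 is Ollama's
# default; "localhost" assumes Ollama runs alongside Flowise on this Breeze.
OLLAMA_HOST="localhost"
OLLAMA_PORT="11434"
BASE_URL="http://${OLLAMA_HOST}:${OLLAMA_PORT}"
echo "$BASE_URL"    # paste this into the ChatOllama node's Base URL field
```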

Secure Your Instance

Set environment variables for basic auth:

export FLOWISE_USERNAME=admin
export FLOWISE_PASSWORD=your_secure_password
npx flowise start

For production, place Flowise behind an Nginx reverse proxy with TLS and restrict access by IP.
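A minimal Nginx reverse-proxy sketch for that setup; the server name, certificate paths, and allowed IP range are placeholders, and the certificates are assumed to already exist (for example via Certbot). The `Upgrade`/`Connection` headers matter because the Flowise UI uses WebSockets:

```nginx
server {
    listen 443 ssl;
    server_name flowise.example.com;

    ssl_certificate     /etc/letsencrypt/live/flowise.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/flowise.example.com/privkey.pem;

    # Restrict access to a trusted range; deny everyone else.
    allow 203.0.113.0/24;
    deny  all;

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```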
