
Set Up OpenResty for Dynamic Content with Nginx and Lua

By Admin · Mar 15, 2026 · Updated Apr 24, 2026

OpenResty extends Nginx with LuaJIT scripting, enabling you to build high-performance dynamic web applications, API gateways, and custom request processing pipelines directly in Nginx. Instead of proxying to a separate application server, you handle logic in Lua at the Nginx worker level. This guide covers deploying OpenResty on your VPS.

Install OpenResty

# Add OpenResty repository (Ubuntu/Debian)
sudo apt-get -y install --no-install-recommends wget gnupg ca-certificates lsb-release
wget -O - https://openresty.org/package/pubkey.gpg | sudo gpg --dearmor -o /usr/share/keyrings/openresty.gpg
echo "deb [signed-by=/usr/share/keyrings/openresty.gpg] http://openresty.org/package/ubuntu $(lsb_release -sc) main" | sudo tee /etc/apt/sources.list.d/openresty.list
sudo apt-get update
sudo apt-get -y install openresty

# Verify
openresty -v
# Shows: nginx version: openresty/X.X.X
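The config below uses two third-party Lua libraries, lua-resty-http and lua-resty-jwt, which are not bundled with OpenResty. A minimal setup sketch using opm, OpenResty's package manager; the opm package names reflect their commonly published listings and may differ on your system:

```shell
# opm ships in a separate package on Debian/Ubuntu
sudo apt-get -y install openresty-opm

# Install the Lua libraries used in the config below
sudo opm get ledgetech/lua-resty-http
sudo opm get SkyLothar/lua-resty-jwt

# Start OpenResty via the bundled systemd unit
sudo systemctl enable --now openresty
```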

Lua Scripting in Nginx

# /usr/local/openresty/nginx/conf/nginx.conf
http {
    lua_package_path "/opt/lua/?.lua;;";
    lua_shared_dict cache 100m;    # Shared memory cache
    lua_shared_dict rate_limit 10m;

    # Backend pool referenced by proxy_pass below
    upstream backend {
        server 127.0.0.1:8080;   # replace with your application server
    }

    init_by_lua_block {
        -- Runs once when Nginx starts
        cjson = require "cjson"
        -- lua-resty-http is a third-party library: opm get ledgetech/lua-resty-http
        resty_http = require "resty.http"
    }

    server {
        listen 80;

        # Simple JSON API endpoint
        location /api/hello {
            content_by_lua_block {
                ngx.header.content_type = "application/json"
                local data = {
                    message = "Hello from OpenResty!",
                    timestamp = ngx.now(),
                    server = ngx.var.hostname
                }
                ngx.say(cjson.encode(data))
            }
        }

        # Rate limiting with Lua
        location /api/ {
            access_by_lua_block {
                local limit = ngx.shared.rate_limit
                local key = ngx.var.remote_addr
                local count, err = limit:incr(key, 1, 0, 60)  -- 60-second window
                if not count then
                    ngx.log(ngx.ERR, "rate_limit incr failed: ", err)
                    return  -- fail open rather than blocking traffic
                end
                if count > 100 then  -- 100 requests per minute
                    ngx.status = 429
                    ngx.header.content_type = "application/json"
                    ngx.say(cjson.encode({error = "Rate limit exceeded"}))
                    return ngx.exit(ngx.HTTP_OK)  -- body already sent; 429 was set via ngx.status
                end
            }
            proxy_pass http://backend;
        }

        # Shared memory caching
        location /api/data {
            content_by_lua_block {
                local cache = ngx.shared.cache
                local key = ngx.var.uri .. "?" .. (ngx.var.args or "")
                local cached = cache:get(key)

                if cached then
                    ngx.header.content_type = "application/json"
                    ngx.header["X-Cache"] = "HIT"
                    ngx.say(cached)
                    return
                end

                -- Fetch from backend
                local httpc = resty_http.new()
                local res, err = httpc:request_uri("http://127.0.0.1:8080" .. ngx.var.uri, {
                    query = ngx.var.args,
                    headers = { Host = ngx.var.host }
                })

                if res and res.status == 200 then
                    cache:set(key, res.body, 300)  -- Cache for 5 minutes
                    ngx.header.content_type = "application/json"
                    ngx.header["X-Cache"] = "MISS"
                    ngx.say(res.body)
                else
                    ngx.log(ngx.ERR, "backend request failed: ", err or ("HTTP " .. res.status))
                    ngx.status = 502
                    ngx.header.content_type = "application/json"
                    ngx.say(cjson.encode({error = "Backend unavailable"}))
                end
            }
        }

        # JWT authentication
        location /protected/ {
            access_by_lua_block {
                -- lua-resty-jwt is a third-party library: opm get SkyLothar/lua-resty-jwt
                local jwt = require "resty.jwt"
                local auth = ngx.var.http_authorization
                if not auth then
                    ngx.status = 401
                    ngx.header.content_type = "application/json"
                    ngx.say(cjson.encode({error = "No token provided"}))
                    return ngx.exit(ngx.HTTP_OK)
                end

                local token = auth:match("Bearer%s+(.+)")
                if not token then
                    ngx.status = 401
                    ngx.header.content_type = "application/json"
                    ngx.say(cjson.encode({error = "Malformed Authorization header"}))
                    return ngx.exit(ngx.HTTP_OK)
                end

                -- Replace the hard-coded secret with one loaded from a secure source
                local verified = jwt:verify("your-secret-key", token)
                if not verified.verified then
                    ngx.status = 401
                    ngx.header.content_type = "application/json"
                    ngx.say(cjson.encode({error = "Invalid token"}))
                    return ngx.exit(ngx.HTTP_OK)
                end

                ngx.req.set_header("X-User-ID", verified.payload.sub)
            }
            proxy_pass http://backend;
        }
    }
}
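Once the config is loaded (for example with `sudo openresty -s reload`), the endpoints can be smoke-tested with curl; the exact body depends on your hostname and build:

```shell
# JSON endpoint handled entirely in Lua
curl -s http://localhost/api/hello

# Hit /api/data twice and watch the X-Cache header flip from MISS to HIT
curl -sI http://localhost/api/data | grep X-Cache
curl -sI http://localhost/api/data | grep X-Cache
```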

Use Cases

  • API Gateway: Rate limiting, authentication, request transformation
  • Edge caching: Cache backend responses in shared memory
  • Request routing: Dynamic upstream selection based on headers, cookies, or body
  • WAF: Custom web application firewall rules
  • A/B testing: Route traffic to different backends based on user attributes
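As an illustration of the routing and A/B-testing use cases, here is a minimal sketch using set_by_lua_block to pick an upstream per client; the backend addresses and the 10% split are placeholders:

```nginx
# Consistently route ~10% of clients to a canary backend
location / {
    set_by_lua_block $ab_backend {
        -- Hash the client IP so each client always gets the same variant
        local bucket = ngx.crc32_short(ngx.var.remote_addr) % 100
        if bucket < 10 then
            return "127.0.0.1:8081"  -- canary
        end
        return "127.0.0.1:8080"      -- stable
    }
    proxy_pass http://$ab_backend;
}
```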

Best Practices

  • Use cosocket API (resty.http) for non-blocking I/O — never use blocking calls
  • Use lua_shared_dict for sharing data between workers (caching, rate limiting)
  • Keep Lua code modular: Put reusable code in lua_package_path modules
  • Benchmark: OpenResty can sustain 100K+ requests/sec when handlers avoid blocking I/O; always load-test your own workload
  • Use init_by_lua_block for one-time initialization (require modules, load configs)
