
PageSpeed Insights API Automation

By Admin · Mar 15, 2026 · Updated Apr 24, 2026

Google PageSpeed Insights provides Lighthouse performance scoring and real-world Chrome User Experience Report (CrUX) data. By automating the API, you can continuously monitor Core Web Vitals, detect performance regressions, and maintain a historical baseline. This guide covers setting up the API, building automated monitoring scripts, and integrating results into CI/CD pipelines.

Getting an API Key

# 1. Go to https://console.cloud.google.com
# 2. Create a new project (or use existing)
# 3. Enable "PageSpeed Insights API"
# 4. Create credentials > API Key
# 5. Restrict key to PageSpeed Insights API only

# Rate limits: 25,000 queries/day free, 400 queries/100 seconds
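
Bursting past the 100-second quota returns HTTP 429. Recent versions of curl count 429 as a transient error when retries are enabled, so a simple mitigation is to let curl retry with a delay; the flags below are a minimal sketch, not a full backoff strategy:

# Retry transient failures (modern curl treats HTTP 429 as transient)
curl -s --retry 5 --retry-delay 10 \
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?\
url=https://example.com&key=YOUR_API_KEY&strategy=mobile"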

Basic API Usage

# Desktop analysis
curl "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?\
url=https://example.com&\
key=YOUR_API_KEY&\
strategy=desktop&\
category=performance"

# Mobile analysis (performance, accessibility, and SEO)
curl "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?\
url=https://example.com&\
key=YOUR_API_KEY&\
strategy=mobile&\
category=performance&\
category=accessibility&\
category=seo"
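
Both responses also carry CrUX field data under loadingExperience when the URL has enough real-user traffic. A minimal jq sketch for pulling the p75 field metrics (key names match the v5 response shape; expect nulls for low-traffic URLs, and note the API reports the CLS percentile multiplied by 100):

# Field data (CrUX) from the same response
curl -s "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?\
url=https://example.com&key=YOUR_API_KEY&strategy=mobile" | jq '{
    overall: .loadingExperience.overall_category,
    lcp_p75_ms: .loadingExperience.metrics.LARGEST_CONTENTFUL_PAINT_MS.percentile,
    cls_p75_x100: .loadingExperience.metrics.CUMULATIVE_LAYOUT_SHIFT_SCORE.percentile
}'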

Bash Monitoring Script

#!/bin/bash
# /usr/local/bin/pagespeed-monitor.sh

API_KEY="YOUR_API_KEY"
LOG_DIR="/var/log/pagespeed"
mkdir -p "$LOG_DIR"
DATE=$(date +%Y-%m-%d_%H%M)

URLS=(
    "https://example.com"
    "https://example.com/products"
    "https://example.com/blog"
)

for url in "${URLS[@]}"; do
    slug=$(echo "$url" | sed 's|https://||;s|/|_|g')

    for strategy in mobile desktop; do
        result=$(curl -s "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?\
url=${url}&key=${API_KEY}&strategy=${strategy}&category=performance")

        # Extract key metrics
        score=$(echo "$result" | jq '.lighthouseResult.categories.performance.score * 100')

        # Skip this run if the API returned an error payload
        if [ -z "$score" ] || [ "$score" = "null" ]; then
            echo "  ${strategy}: API error for ${url}" >&2
            sleep 2
            continue
        fi

        fcp=$(echo "$result" | jq '.lighthouseResult.audits["first-contentful-paint"].numericValue')
        lcp=$(echo "$result" | jq '.lighthouseResult.audits["largest-contentful-paint"].numericValue')
        cls=$(echo "$result" | jq '.lighthouseResult.audits["cumulative-layout-shift"].numericValue')
        tbt=$(echo "$result" | jq '.lighthouseResult.audits["total-blocking-time"].numericValue')

        # Log CSV format
        echo "${DATE},${url},${strategy},${score},${fcp},${lcp},${cls},${tbt}" \
            >> "$LOG_DIR/${slug}_${strategy}.csv"

        # Alert on poor scores
        if (( $(echo "$score < 50" | bc -l) )); then
            echo "ALERT: ${url} (${strategy}) score dropped to ${score}"
            # Send webhook/email notification here
        fi

        echo "  ${strategy}: Score=${score} LCP=${lcp}ms CLS=${cls}"
        sleep 2  # Respect rate limits
    done
done
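
To turn the script into continuous monitoring, schedule it with cron; an hourly entry, for example:

# crontab -e: run hourly, appending output to a log
0 * * * * /usr/local/bin/pagespeed-monitor.sh >> /var/log/pagespeed/monitor.log 2>&1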

Python Monitoring with History

#!/usr/bin/env python3
"""PageSpeed Insights monitor with SQLite history."""

import json
import os
import sqlite3

import requests

API_KEY = "YOUR_API_KEY"
DB_PATH = "/var/lib/pagespeed/history.db"

def init_db():
    # Ensure the data directory exists before SQLite opens the file
    os.makedirs(os.path.dirname(DB_PATH), exist_ok=True)
    conn = sqlite3.connect(DB_PATH)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS results (
            id INTEGER PRIMARY KEY,
            url TEXT,
            strategy TEXT,
            score REAL,
            fcp REAL,
            lcp REAL,
            cls REAL,
            tbt REAL,
            si REAL,
            raw_json TEXT,
            tested_at DATETIME DEFAULT CURRENT_TIMESTAMP
        )
    """)
    conn.commit()
    return conn

def test_url(url, strategy="mobile"):
    params = {
        "url": url,
        "key": API_KEY,
        "strategy": strategy,
        "category": "performance"
    }
    resp = requests.get(
        "https://www.googleapis.com/pagespeedonline/v5/runPagespeed",
        params=params, timeout=120
    )
    resp.raise_for_status()  # surface quota or server errors early
    return resp.json()

def extract_metrics(data):
    lr = data.get("lighthouseResult", {})
    audits = lr.get("audits", {})
    return {
        "score": lr.get("categories", {}).get("performance", {}).get("score", 0) * 100,
        "fcp": audits.get("first-contentful-paint", {}).get("numericValue", 0),
        "lcp": audits.get("largest-contentful-paint", {}).get("numericValue", 0),
        "cls": audits.get("cumulative-layout-shift", {}).get("numericValue", 0),
        "tbt": audits.get("total-blocking-time", {}).get("numericValue", 0),
        "si": audits.get("speed-index", {}).get("numericValue", 0),
    }

def check_regression(conn, url, strategy, current_score):
    # Average the last 10 runs. LIMIT must sit in a subquery:
    # applied next to the outer AVG() it would aggregate every row first.
    row = conn.execute(
        "SELECT AVG(score) FROM ("
        "  SELECT score FROM results WHERE url=? AND strategy=? "
        "  ORDER BY tested_at DESC LIMIT 10)",
        (url, strategy)
    ).fetchone()
    avg = row[0] if row and row[0] is not None else 0
    return (avg > 0 and current_score < avg - 10), avg

URLS = [
    "https://example.com",
    "https://example.com/products",
]

conn = init_db()
for url in URLS:
    for strategy in ["mobile", "desktop"]:
        data = test_url(url, strategy)
        metrics = extract_metrics(data)

        conn.execute(
            "INSERT INTO results (url, strategy, score, fcp, lcp, cls, tbt, si, raw_json) "
            "VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)",
            (url, strategy, metrics["score"], metrics["fcp"], metrics["lcp"],
             metrics["cls"], metrics["tbt"], metrics["si"], json.dumps(data))
        )
        conn.commit()

        regressed, avg = check_regression(conn, url, strategy, metrics["score"])
        status = "REGRESSION" if regressed else "OK"
        print(f"[{status}] {url} ({strategy}): {metrics['score']:.0f} (avg: {avg:.0f})")

conn.close()
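
Once the table has a few days of history, trend queries are straightforward; for example, daily average scores per URL and strategy from the sqlite3 CLI:

sqlite3 /var/lib/pagespeed/history.db \
    "SELECT date(tested_at) AS day, url, strategy, ROUND(AVG(score), 1) AS avg_score
     FROM results
     GROUP BY day, url, strategy
     ORDER BY day DESC;"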

CI/CD Integration

# GitHub Actions workflow
# .github/workflows/lighthouse.yml
name: Performance Check
on:
  pull_request:
    branches: [main]

jobs:
  lighthouse:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Deploy preview
        run: |
          # Deploy to preview URL
          echo "PREVIEW_URL=https://preview-${{ github.sha }}.example.com" >> $GITHUB_ENV

      - name: Run PageSpeed Test
        run: |
          RESULT=$(curl -s "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?\
          url=${{ env.PREVIEW_URL }}&key=${{ secrets.PSI_API_KEY }}&strategy=mobile&category=performance")

          SCORE=$(echo "$RESULT" | jq '.lighthouseResult.categories.performance.score * 100')
          LCP=$(echo "$RESULT" | jq '.lighthouseResult.audits["largest-contentful-paint"].numericValue')

          echo "Performance Score: $SCORE"
          echo "LCP: ${LCP}ms"

          # Fail if score below threshold
          if (( $(echo "$SCORE < 70" | bc -l) )); then
            echo "::error::Performance score $SCORE is below threshold of 70"
            exit 1
          fi

Core Web Vitals Thresholds

Metric               Good       Needs Improvement    Poor
LCP                  < 2.5s     2.5s - 4.0s          > 4.0s
INP (replaced FID)   < 200ms    200ms - 500ms        > 500ms
CLS                  < 0.1      0.1 - 0.25           > 0.25
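
These bands can be applied directly to the raw numericValue figures from the audits above; a minimal bash helper for LCP (milliseconds in, rating out), in the same bc style as the scripts:

# Bucket a raw LCP value (ms) into the Core Web Vitals bands
rate_lcp() {
    if (( $(echo "$1 < 2500" | bc -l) )); then echo "good"
    elif (( $(echo "$1 <= 4000" | bc -l) )); then echo "needs improvement"
    else echo "poor"
    fi
}

rate_lcp 3100   # prints: needs improvement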

Summary

Automating PageSpeed Insights transforms performance from a periodic manual check into continuous monitoring. The free API allows 25,000 tests per day — enough to test dozens of pages hourly. Store historical results to track trends, set up regression alerts for score drops, and integrate into CI/CD pipelines to catch performance issues before they reach production. The combination of lab data (Lighthouse) and field data (CrUX) provides a comprehensive picture of real-world performance.
