FEP-001 (Recommended): Page Rendering

Automated Lighthouse reports

Lighthouse CI integrated into the CI/CD pipeline to catch performance regressions automatically.

Question to ask

"What's your Lighthouse score right now?"

Verification guide

Severity: Recommended

Lighthouse should run automatically in CI/CD to catch performance regressions before deployment. Manual PageSpeed Insights checks are not repeatable or enforceable.

Check automatically:

  1. Look for Lighthouse CI config:
# Check for Lighthouse CI configuration files
ls -la lighthouserc.* .lighthouserc.* lhci.* 2>/dev/null

# Check for lighthouse in package.json scripts or devDeps
grep -E "lighthouse|lhci" package.json 2>/dev/null
  2. Check GitHub Actions for Lighthouse:
# Look for lighthouse steps in CI
grep -rE "lighthouse|lhci|treosh/lighthouse-ci-action" .github/workflows/ 2>/dev/null
  3. Check for performance budgets:
# Budgets in Lighthouse CI config
grep -rE "budgets|performance.*budget" lighthouserc.* .lighthouserc.* 2>/dev/null

# Or in package.json
grep -E "budgets" package.json 2>/dev/null
  4. Check for alternatives (Calibre, SpeedCurve, WebPageTest):
# Look for other performance monitoring tools
grep -rE "calibre|speedcurve|webpagetest" . --include="*.json" --include="*.yml" --include="*.yaml" 2>/dev/null | head -5
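To see what a positive result from the step-1 probe looks like, it can be exercised against a hypothetical package.json (the file path, dependency version, and script name below are made up for illustration):

```shell
# Hypothetical sample manifest; path and contents are illustrative only
cat > /tmp/sample-package.json <<'EOF'
{
  "devDependencies": { "@lhci/cli": "^0.13.0" },
  "scripts": { "lhci": "lhci autorun" }
}
EOF

# The step-1 probe should surface both the dependency and the script entry
grep -E "lighthouse|lhci" /tmp/sample-package.json
```

A repo where this grep returns nothing from the real package.json, and steps 2–4 also come up empty, is a likely fail.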

Cross-reference with:

  • FEP-002 (automated reports should track Core Web Vitals over time)
  • DEPLOY-002 (performance check as part of deployment pipeline)

Pass criteria:

  • Lighthouse CI or equivalent integrated in CI/CD pipeline
  • Reports generated on PRs or scheduled runs
  • Performance budgets defined (bonus, not required for pass)
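For reference, a passing setup with the bonus budget criterion might carry a minimal lighthouserc.json like the sketch below. The field names follow the Lighthouse CI config schema; the URL, run count, and threshold are illustrative, not recommendations:

```json
{
  "ci": {
    "collect": { "url": ["http://localhost:3000/"], "numberOfRuns": 3 },
    "assert": {
      "assertions": {
        "categories:performance": ["error", { "minScore": 0.9 }]
      }
    },
    "upload": { "target": "temporary-public-storage" }
  }
}
```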

Fail criteria:

  • No automated performance testing
  • Only manual PageSpeed Insights checks (not repeatable)
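A common remediation for the fail case is wiring Lighthouse into CI. One possible GitHub Actions sketch using the treosh/lighthouse-ci-action mentioned in step 2 (the action's version tag and the staging URL are placeholders; inputs follow the action's public README):

```yaml
name: Lighthouse CI
on: pull_request
jobs:
  lhci:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: treosh/lighthouse-ci-action@v12
        with:
          urls: |
            https://staging.example.com/
          uploadArtifacts: true
          temporaryPublicStorage: true
```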

Evidence to capture:

  • Lighthouse CI config location (if found)
  • Where reports are stored/published
  • Performance budget thresholds (if defined)
  • Alternative tools in use

Section

22. Front-End Performance

Infrastructure Features