ARCH-002 (Recommended) · Code Quality

Code Quality Measurement Tools

Tooling to measure code health: duplication, complexity, and overall quality, with results visible to the team

Question to ask

"How much duplication is hiding in your codebase?"

What to check

  • Quality tools in dependencies (SonarQube, CodeClimate, jscpd, Plato)
  • Duplication detection configured
  • Complexity measurement (ESLint rules or dedicated tools)
  • CI integration with visibility

Pass criteria

  • At least one code quality tool configured
  • Duplication detection available
  • Complexity metrics accessible
  • Results visible to team
  • Team actively reviews and acts on findings

Verification guide

Severity: Recommended

Projects should have tooling to measure code health: duplication, complexity, and overall quality. Results should be visible and acted upon.

Check automatically:

  1. Check for code quality tools in dependencies:
# JavaScript/TypeScript quality tools
grep -E "sonarqube|sonar-scanner|@sonarqube|codeclimate|eslint-plugin-sonarjs|jscpd|plato|es6-plato|typhonjs-escomplex" package.json 2>/dev/null

# Python quality tools
grep -E "radon|xenon|flake8|pylint|bandit|vulture|mccabe|prospector" requirements*.txt pyproject.toml setup.py 2>/dev/null

# Go quality tools
grep -E "golangci-lint|gocyclo|dupl" go.mod Makefile 2>/dev/null

# General config files
ls -la .codeclimate.yml sonar-project.properties .sonarcloud.properties .codacy.yml 2>/dev/null
  2. Check for duplication detection (a sample .jscpd.json follows this list):
# jscpd (copy-paste detector)
grep -E "jscpd" package.json 2>/dev/null
ls -la .jscpd.json .jscpd.yaml 2>/dev/null

# Check CI for duplication checks
grep -rE "jscpd|cpd|duplication|duplicate" .github/workflows/ 2>/dev/null

# SonarQube includes duplication detection
grep -rE "sonar" .github/workflows/ 2>/dev/null
  3. Check for complexity measurement (sample ESLint complexity rules follow this list):
# ESLint complexity rules
grep -rE "complexity|max-depth|max-lines|max-statements|max-nested-callbacks" .eslintrc* eslint.config.* 2>/dev/null

# Dedicated complexity tools
grep -E "plato|es6-plato|typhonjs-escomplex|radon" package.json requirements*.txt 2>/dev/null

# Check for complexity thresholds in CI
grep -rE "complexity|cyclomatic" .github/workflows/ 2>/dev/null
  4. Check for CI integration:
# Quality gates in CI
grep -rE "sonar|codeclimate|quality|lint" .github/workflows/ 2>/dev/null

# Check if quality checks are required (not just informational)
grep -rE "fail-on-|--max-|threshold|gate" .github/workflows/ 2>/dev/null
  5. Check ESLint configuration for quality rules:
# Look for quality-focused ESLint plugins
grep -rE "eslint-plugin-sonarjs|eslint-plugin-unicorn|eslint-plugin-complexity" package.json 2>/dev/null

# Check actual complexity settings
cat .eslintrc* eslint.config.* 2>/dev/null | grep -A5 -E "complexity|max-"
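
For reference, the sketch below shows minimal versions of the configuration files these steps look for. Every threshold, path, and project name is an illustrative assumption, not a recommendation; tune them to the codebase.

.jscpd.json (step 2; jscpd fails the run when duplication exceeds the threshold percentage):
{
  "threshold": 5,
  "reporters": ["console", "html"],
  "ignore": ["**/node_modules/**", "**/dist/**"]
}

.eslintrc.json excerpt (steps 3 and 5; ESLint core complexity rules with illustrative limits):
{
  "rules": {
    "complexity": ["error", 10],
    "max-depth": ["error", 4],
    "max-nested-callbacks": ["error", 3],
    "max-statements": ["warn", 25]
  }
}

sonar-project.properties (step 1; a minimal scanner configuration, names are placeholders):
sonar.projectKey=example-project
sonar.sources=src
sonar.exclusions=**/node_modules/**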

Ask user:

  • "What tools do you use to measure code quality?"
  • "Where can developers see code quality metrics?"
  • "Are there thresholds that fail the build?"
  • "How often are quality reports reviewed?"

Common tools (for reference):

  • SonarQube/SonarCloud - Comprehensive (duplication, complexity, bugs, smells)
  • CodeClimate - Quality metrics with GitHub integration
  • jscpd - Copy-paste detection
  • ESLint complexity rules - Built into linting
  • Plato/ES6-Plato - JavaScript complexity reports
  • Radon - Python complexity metrics
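
If none of these are configured, a lightweight starting point is to run a detector directly and enforce a threshold from the command line. A minimal sketch, assuming npx is available and src/ is the source root:

# jscpd exits non-zero when duplication exceeds the threshold percentage
npx jscpd src/ --threshold 5 --reporters console

# xenon (built on radon) fails when Python complexity grades exceed the given limits
xenon --max-absolute B --max-modules A --max-average A src/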

Cross-reference with:

  • ARCH-001 (SOLID audits - related but different angle)
  • TEST-005 (CRAP score - combines complexity with coverage)
  • Section 40 (Technical Debt tracking - complexity feeds into debt)
  • DEP-001 (Vulnerability scanning - often same tools)

Pass criteria:

  • At least one code quality tool configured
  • Duplication detection available (jscpd, SonarQube, or similar)
  • Complexity metrics accessible (tool output, ESLint rules, or reports)
  • Results visible to team (dashboard, CI output, or reports)
  • Team actively reviews and acts on findings

Partial pass:

  • Tools configured but results ignored
  • Only ESLint rules, no dedicated quality tool
  • Quality checks run but don't fail builds (visibility without enforcement)

Fail criteria:

  • No code quality tooling configured
  • "We just eyeball it in code review"
  • No visibility into code health metrics
  • Complexity warnings consistently ignored

Evidence to capture:

  • Tools configured (list)
  • Where results are visible (CI output, dashboard URL, reports)
  • Whether thresholds/gates are enforced or advisory
  • Recent example of acting on quality findings

Section: 28. Code Architecture (Code Quality & Architecture)