
CI/CD and Automation

Lesson 2 - Module 6: Automated pipelines for continuous quality

🎯 Why Is CI/CD Essential for tRPC?

🔄 Continuous Quality

With TypeScript and tRPC, every change needs to go through rigorous validation. CI/CD ensures that type safety, tests, and the build are always working before deploy.

🚀 Safe Deploys

Automation eliminates human error in the deploy process, ensuring that only tested and validated code reaches production.

📊 Immediate Feedback

Developers receive instant feedback about build breakage, test failures, or type-safety issues on every commit.

🎭 Environment Parity

Pipelines ensure that the development, staging, and production environments are identical, eliminating surprises at deploy time.

⚠️ Key Concepts to Understand

Quality Gates:

Automated checkpoints that block deploys of code with problems

Matrix Testing:

Tests across multiple Node.js versions and operating systems

Artifact Management:

Storage and distribution of compiled builds and dependencies

Progressive Deployment:

Gradual rollout with automatic rollback when problems occur

🐙 GitHub Actions for tRPC

🏗️ Main CI/CD Pipeline

.github/workflows/ci-cd.yml
# 📁 .github/workflows/ci-cd.yml
name: tRPC CI/CD Pipeline

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main ]

env:
  NODE_VERSION: '18'
  PNPM_VERSION: '8'

jobs:
  # 🧪 Quality Assurance
  quality:
    name: Quality Checks
    runs-on: ubuntu-latest
    
    steps:
      - name: 📥 Checkout Code
        uses: actions/checkout@v4
        
      - name: 📦 Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: ${{ env.NODE_VERSION }}
          
      - name: 📦 Setup pnpm
        uses: pnpm/action-setup@v2
        with:
          version: ${{ env.PNPM_VERSION }}
          
      - name: 📁 Get pnpm store directory
        shell: bash
        run: echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
        
      - name: 🗂️ Setup pnpm cache
        uses: actions/cache@v3
        with:
          path: ${{ env.STORE_PATH }}
          key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
          restore-keys: |
            ${{ runner.os }}-pnpm-store-
            
      - name: 📦 Install Dependencies
        run: pnpm install --frozen-lockfile
        
      - name: 🔍 Type Check
        run: pnpm run type-check
        
      - name: 🎨 Lint Check
        run: pnpm run lint
        
      - name: 💄 Format Check
        run: pnpm run format:check
        
      - name: 🏗️ Build Check
        run: pnpm run build
        
      - name: 📊 Upload Build Artifacts
        uses: actions/upload-artifact@v4
        with:
          name: build-artifacts
          path: |
            dist/
            .next/
          retention-days: 1

  # 🧪 Tests Matrix
  test:
    name: Tests
    runs-on: ${{ matrix.os }}
    needs: quality
    
    strategy:
      matrix:
        os: [ubuntu-latest, windows-latest, macos-latest]
        node-version: ['16', '18', '20']
        
    steps:
      - name: 📥 Checkout Code
        uses: actions/checkout@v4
        
      - name: 📦 Setup Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v4
        with:
          node-version: ${{ matrix.node-version }}
          
      - name: 📦 Setup pnpm
        uses: pnpm/action-setup@v2
        with:
          version: ${{ env.PNPM_VERSION }}
          
      - name: 📦 Install Dependencies
        run: pnpm install --frozen-lockfile
        
      - name: 🗄️ Setup Test Database
        run: |
          cp .env.example .env.test
          pnpm run db:test:setup
          
      - name: 🧪 Run Unit Tests
        run: pnpm run test:unit
        env:
          NODE_ENV: test
          
      - name: 🔗 Run Integration Tests
        run: pnpm run test:integration
        env:
          NODE_ENV: test
          
      - name: 📊 Upload Coverage
        uses: codecov/codecov-action@v3
        with:
          file: ./coverage/lcov.info
          flags: unittests
          name: codecov-umbrella
          fail_ci_if_error: false

  # 🎭 E2E Tests
  e2e:
    name: E2E Tests
    runs-on: ubuntu-latest
    needs: quality
    
    services:
      postgres:
        image: postgres:15
        env:
          POSTGRES_PASSWORD: postgres
          POSTGRES_DB: trpc_test
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
        ports:
          - 5432:5432
          
      redis:
        image: redis:7-alpine
        options: >-
          --health-cmd "redis-cli ping"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
        ports:
          - 6379:6379
          
    steps:
      - name: 📥 Checkout Code
        uses: actions/checkout@v4
        
      - name: 📦 Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: ${{ env.NODE_VERSION }}
          
      - name: 📦 Setup pnpm
        uses: pnpm/action-setup@v2
        with:
          version: ${{ env.PNPM_VERSION }}
          
      - name: 📦 Install Dependencies
        run: pnpm install --frozen-lockfile
        
      - name: 📥 Download Build Artifacts
        uses: actions/download-artifact@v4
        with:
          name: build-artifacts
          
      - name: 🗄️ Setup Database
        run: |
          cp .env.example .env.test
          pnpm run db:migrate:deploy
          pnpm run db:seed
        env:
          DATABASE_URL: postgresql://postgres:postgres@localhost:5432/trpc_test
          
      - name: 🚀 Start Application
        run: |
          pnpm run start:test &
          sleep 10
        env:
          NODE_ENV: test
          DATABASE_URL: postgresql://postgres:postgres@localhost:5432/trpc_test
          REDIS_URL: redis://localhost:6379
          
      - name: 📦 Install Playwright Browsers
        run: pnpm exec playwright install --with-deps
        
      - name: 🎭 Run E2E Tests
        run: pnpm run test:e2e
        env:
          BASE_URL: http://localhost:3000
          
      - name: 📊 Upload E2E Reports
        uses: actions/upload-artifact@v4
        if: failure()
        with:
          name: e2e-reports
          path: |
            test-results/
            playwright-report/
          retention-days: 7

  # 🚀 Deploy to Staging
  deploy-staging:
    name: Deploy to Staging
    runs-on: ubuntu-latest
    needs: [test, e2e]
    if: github.ref == 'refs/heads/develop'
    
    environment:
      name: staging
      url: https://staging.myapp.com
      
    steps:
      - name: 📥 Checkout Code
        uses: actions/checkout@v4
        
      - name: 📥 Download Build Artifacts
        uses: actions/download-artifact@v4
        with:
          name: build-artifacts
          
      - name: 🔑 Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
          
      - name: 🐳 Build and Push Docker Image
        run: |
          aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin ${{ secrets.ECR_REGISTRY }}
          docker build -t ${{ secrets.ECR_REGISTRY }}/trpc-app:staging .
          docker push ${{ secrets.ECR_REGISTRY }}/trpc-app:staging
          
      - name: 🚀 Deploy to ECS
        run: |
          aws ecs update-service --cluster staging-cluster --service trpc-service --force-new-deployment

  # 🚀 Deploy to Production
  deploy-production:
    name: Deploy to Production
    runs-on: ubuntu-latest
    needs: [test, e2e]
    if: github.ref == 'refs/heads/main'
    
    environment:
      name: production
      url: https://myapp.com
      
    steps:
      - name: 📥 Checkout Code
        uses: actions/checkout@v4
        
      - name: 📥 Download Build Artifacts
        uses: actions/download-artifact@v4
        with:
          name: build-artifacts
          
      - name: 🔑 Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
          
      - name: 🐳 Build and Push Docker Image
        run: |
          aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin ${{ secrets.ECR_REGISTRY }}
          docker build -t ${{ secrets.ECR_REGISTRY }}/trpc-app:latest .
          docker push ${{ secrets.ECR_REGISTRY }}/trpc-app:latest
          
      - name: 🚀 Deploy with Blue-Green Strategy
        run: |
          # Deploy to green environment
          aws ecs update-service --cluster production-cluster --service trpc-service-green --force-new-deployment
          
          # Wait for deployment
          aws ecs wait services-stable --cluster production-cluster --services trpc-service-green
          
          # Health check
          curl -f https://green.myapp.com/health || exit 1
          
          # Switch traffic
          aws elbv2 modify-rule --rule-arn ${{ secrets.ALB_RULE_ARN }} --actions Type=forward,TargetGroupArn=${{ secrets.GREEN_TARGET_GROUP }}
          
      - name: 📊 Post-Deploy Health Check
        run: |
          sleep 30
          curl -f https://myapp.com/health || exit 1
          curl -f https://myapp.com/api/trpc/health.check || exit 1

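The deploy jobs above probe /health and /api/trpc/health.check after every rollout. These routes are application code, not something tRPC ships; below is a minimal sketch of what they could look like, assuming an Express server with tRPC v10 (only the two route paths are the contract the pipeline relies on).

src/server/health.ts
// 📁 src/server/health.ts (illustrative sketch; adapt to your actual server setup)
import express from 'express';
import { initTRPC } from '@trpc/server';
import * as trpcExpress from '@trpc/server/adapters/express';

const t = initTRPC.create();

// 🏥 tRPC procedure reached by the pipeline at GET /api/trpc/health.check
const healthRouter = t.router({
  check: t.procedure.query(() => ({
    status: 'ok' as const,
    timestamp: new Date().toISOString(),
  })),
});

export const appRouter = t.router({
  health: healthRouter,
});

const app = express();

// 🏥 Plain HTTP endpoint reached by the pipeline at GET /health
app.get('/health', (_req, res) => {
  res.status(200).json({ status: 'ok', uptime: process.uptime() });
});

app.use('/api/trpc', trpcExpress.createExpressMiddleware({ router: appRouter }));

app.listen(3000);
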
🔒 Security Scanning Pipeline

.github/workflows/security.yml
# 📁 .github/workflows/security.yml
name: Security Scanning

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main ]
  schedule:
    # 🕐 Run daily at 2 AM UTC
    - cron: '0 2 * * *'

jobs:
  # 🔍 Dependency Vulnerability Scan
  dependency-scan:
    name: Dependency Security Scan
    runs-on: ubuntu-latest
    
    steps:
      - name: 📥 Checkout Code
        uses: actions/checkout@v4
        
      - name: 📦 Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'
          
      - name: 📦 Setup pnpm
        uses: pnpm/action-setup@v2
        with:
          version: '8'
          
      - name: 📦 Install Dependencies
        run: pnpm install --frozen-lockfile
        
      - name: 🔍 Run npm audit
        run: pnpm audit --audit-level moderate
        
      - name: 🔒 Run Snyk Security Test
        uses: snyk/actions/node@master
        env:
          SNYK_TOKEN: ${{ secrets.SNYK_TOKEN }}
        with:
          args: --severity-threshold=medium --sarif-file-output=snyk.sarif
          
      - name: 📊 Upload Snyk Results to GitHub
        uses: github/codeql-action/upload-sarif@v3
        if: always()
        with:
          sarif_file: snyk.sarif

  # 🔒 Code Security Analysis
  code-analysis:
    name: Static Code Analysis
    runs-on: ubuntu-latest
    
    permissions:
      actions: read
      contents: read
      security-events: write
      
    steps:
      - name: 📥 Checkout Code
        uses: actions/checkout@v4
        
      - name: 🔍 Initialize CodeQL
        uses: github/codeql-action/init@v3
        with:
          languages: javascript
          queries: security-extended,security-and-quality
          
      - name: 🏗️ Autobuild
        uses: github/codeql-action/autobuild@v3
        
      - name: 🔍 Perform CodeQL Analysis
        uses: github/codeql-action/analyze@v3
        
      - name: 🔒 Run Semgrep Security Scan
        uses: returntocorp/semgrep-action@v1
        with:
          config: >-
            p/security-audit
            p/secrets
            p/typescript
        env:
          SEMGREP_APP_TOKEN: ${{ secrets.SEMGREP_APP_TOKEN }}

  # 🐳 Container Security Scan
  container-scan:
    name: Container Security Scan
    runs-on: ubuntu-latest
    
    steps:
      - name: 📥 Checkout Code
        uses: actions/checkout@v4
        
      - name: 🐳 Build Docker Image
        run: |
          docker build -t trpc-app:security-scan .
          
      - name: 🔍 Run Trivy Container Scan
        uses: aquasecurity/trivy-action@master
        with:
          image-ref: 'trpc-app:security-scan'
          format: 'sarif'
          output: 'trivy-results.sarif'
          severity: 'CRITICAL,HIGH'
          
      - name: 📊 Upload Trivy Results
        uses: github/codeql-action/upload-sarif@v3
        with:
          sarif_file: 'trivy-results.sarif'
          
      - name: 🔒 Run Docker Bench Security
        run: |
          docker run --rm --net host --pid host --userns host --cap-add audit_control \
            -e DOCKER_CONTENT_TRUST=$DOCKER_CONTENT_TRUST \
            -v /etc:/etc:ro \
            -v /usr/bin/containerd:/usr/bin/containerd:ro \
            -v /usr/bin/runc:/usr/bin/runc:ro \
            -v /usr/lib/systemd:/usr/lib/systemd:ro \
            -v /var/lib:/var/lib:ro \
            -v /var/run/docker.sock:/var/run/docker.sock:ro \
            docker/docker-bench-security || true

  # 🕵️ Secrets Scanning
  secrets-scan:
    name: Secrets Detection
    runs-on: ubuntu-latest
    
    steps:
      - name: 📥 Checkout Code
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          
      - name: 🔍 Run TruffleHog Secrets Scan
        uses: trufflesecurity/trufflehog@main
        with:
          path: ./
          base: main
          head: HEAD
          extra_args: --debug --only-verified
          
      - name: 🔒 Run GitLeaks
        uses: zricethezav/gitleaks-action@master
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          GITLEAKS_LICENSE: ${{ secrets.GITLEAKS_LICENSE }}

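The dependency scan above relies on the exit code of pnpm audit. If you prefer to keep the severity policy in the repository (for example, tolerate low findings but block anything moderate or above), a small script can enforce it explicitly. A hedged sketch, assuming pnpm audit --json emits the npm-style report with a metadata.vulnerabilities summary:

scripts/audit-gate.ts
// 📁 scripts/audit-gate.ts (illustrative sketch; assumes npm-style audit JSON output)
import { execSync } from 'child_process';

// Severities that should fail the pipeline
const BLOCKING = ['moderate', 'high', 'critical'] as const;

function main(): void {
  let raw = '';
  try {
    // pnpm audit exits non-zero when vulnerabilities exist; capture its output anyway
    raw = execSync('pnpm audit --json', { encoding: 'utf8' });
  } catch (error: any) {
    raw = error.stdout?.toString() ?? '';
  }

  const report = JSON.parse(raw || '{}');
  const summary: Record<string, number> = report?.metadata?.vulnerabilities ?? {};
  const blocking = BLOCKING.reduce((sum, level) => sum + (summary[level] ?? 0), 0);

  console.log('Vulnerability summary:', summary);
  if (blocking > 0) {
    console.error(`❌ ${blocking} vulnerabilities at moderate severity or above`);
    process.exit(1);
  }
  console.log('✅ No blocking vulnerabilities');
}

main();

It could be wired in with something like npx ts-node scripts/audit-gate.ts next to (or instead of) the pnpm audit step.
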
🦊 GitLab CI for tRPC

🏗️ GitLab CI/CD Pipeline

.gitlab-ci.yml
# 📁 .gitlab-ci.yml
stages:
  - quality
  - test
  - security
  - build
  - deploy

variables:
  NODE_VERSION: '18'
  PNPM_VERSION: '8'
  POSTGRES_DB: trpc_test
  POSTGRES_USER: postgres
  POSTGRES_PASSWORD: postgres

# 📦 Cache configuration (shared cache body, referenced via YAML anchor)
.pnpm-cache: &pnpm-cache
  key:
    files:
      - pnpm-lock.yaml
  paths:
    - .pnpm-store
  policy: pull

# 🐳 Base image with Node.js and pnpm
.node-template: &node-template
  image: node:${NODE_VERSION}-alpine
  before_script:
    - corepack enable
    - corepack prepare pnpm@${PNPM_VERSION} --activate
    - pnpm config set store-dir .pnpm-store
    - pnpm install --frozen-lockfile

# 🔍 Quality stage
quality:lint:
  <<: *node-template
  cache: *pnpm-cache
  stage: quality
  script:
    - pnpm run lint
    - pnpm run format:check
  artifacts:
    reports:
      junit: reports/lint-results.xml
    when: always

quality:typecheck:
  <<: *node-template
  cache: *pnpm-cache
  stage: quality
  script:
    - pnpm run type-check
  artifacts:
    reports:
      junit: reports/typecheck-results.xml
    when: always

quality:build:
  <<: *node-template
  stage: quality
  script:
    - pnpm run build
  artifacts:
    paths:
      - dist/
      - .next/
    expire_in: 1 hour
  cache:
    <<: *pnpm-cache
    policy: pull-push

# 🧪 Test stage
.test-template: &test-template
  <<: *node-template
  stage: test
  services:
    - postgres:15-alpine
    - redis:7-alpine
  variables:
    POSTGRES_HOST_AUTH_METHOD: trust
    DATABASE_URL: postgresql://postgres:postgres@postgres:5432/trpc_test
    REDIS_URL: redis://redis:6379
  before_script:
    - !reference [.node-template, before_script]
    - cp .env.example .env.test
    - pnpm run db:migrate:deploy
  coverage: '/Lines\s*:\s*(\d+\.?\d*)%/'

test:unit:
  <<: *test-template
  cache: *pnpm-cache
  script:
    - pnpm run test:unit --coverage
  artifacts:
    reports:
      junit: reports/unit-test-results.xml
      coverage_report:
        coverage_format: cobertura
        path: coverage/cobertura-coverage.xml
    paths:
      - coverage/
    when: always

test:integration:
  <<: *test-template
  cache: *pnpm-cache
  script:
    - pnpm run test:integration --coverage
  artifacts:
    reports:
      junit: reports/integration-test-results.xml
    when: always

test:e2e:
  <<: *node-template
  cache: *pnpm-cache
  stage: test
  image: mcr.microsoft.com/playwright:v1.40.0-focal
  services:
    - postgres:15-alpine
    - redis:7-alpine
  variables:
    POSTGRES_HOST_AUTH_METHOD: trust
    DATABASE_URL: postgresql://postgres:postgres@postgres:5432/trpc_test
    REDIS_URL: redis://redis:6379
    CI: true
  before_script:
    - !reference [.node-template, before_script]
    - cp .env.example .env.test
    - pnpm run db:migrate:deploy
    - pnpm run db:seed
  script:
    - pnpm run start:test &
    - sleep 15
    - pnpm run test:e2e
  artifacts:
    paths:
      - test-results/
      - playwright-report/
    expire_in: 1 week
    when: failure

# 🔒 Security stage
security:dependency-scan:
  <<: *node-template
  cache: *pnpm-cache
  stage: security
  script:
    - pnpm audit --audit-level moderate
    - npx snyk test --severity-threshold=medium --json-file-output=snyk-report.json
  allow_failure: true
  artifacts:
    reports:
      dependency_scanning: snyk-report.json
    when: always

security:sast:
  stage: security
  image: 
    name: returntocorp/semgrep:latest
    entrypoint: [""]
  script:
    - semgrep --config=auto --json --output=semgrep-report.json .
  artifacts:
    reports:
      sast: semgrep-report.json
  allow_failure: true

security:container-scan:
  stage: security
  image: 
    name: aquasec/trivy:latest
    entrypoint: [""]
  services:
    - docker:dind
  variables:
    DOCKER_HOST: tcp://docker:2376
    DOCKER_TLS_CERTDIR: "/certs"
    DOCKER_TLS_VERIFY: 1
    DOCKER_CERT_PATH: "$DOCKER_TLS_CERTDIR/client"
  before_script:
    - docker build -t trpc-app:$CI_COMMIT_SHA .
  script:
    - trivy image --format template --template "@contrib/gitlab.tpl" --output container-scan-report.json trpc-app:$CI_COMMIT_SHA
  artifacts:
    reports:
      container_scanning: container-scan-report.json
  allow_failure: true

# 🏗️ Build stage
build:docker:
  stage: build
  image: docker:latest
  services:
    - docker:dind
  variables:
    DOCKER_HOST: tcp://docker:2376
    DOCKER_TLS_CERTDIR: "/certs"
    DOCKER_TLS_VERIFY: 1
    DOCKER_CERT_PATH: "$DOCKER_TLS_CERTDIR/client"
  before_script:
    - echo $CI_REGISTRY_PASSWORD | docker login -u $CI_REGISTRY_USER --password-stdin $CI_REGISTRY
  script:
    - docker build -t $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA .
    - docker push $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA
    - |
      if [ "$CI_COMMIT_BRANCH" = "main" ]; then
        docker tag $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA $CI_REGISTRY_IMAGE:latest
        docker push $CI_REGISTRY_IMAGE:latest
      fi
  dependencies:
    - quality:build
  only:
    - main
    - develop

# 🚀 Deploy stages
deploy:staging:
  stage: deploy
  image:
    name: alpine/helm:latest
    entrypoint: [""]
  environment:
    name: staging
    url: https://staging.myapp.com
  before_script:
    - apk add --no-cache curl
    - curl -LO "https://dl.k8s.io/release/$(curl -L -s https://dl.k8s.io/release/stable.txt)/bin/linux/amd64/kubectl"
    - chmod +x kubectl
    - mv kubectl /usr/local/bin/
    - mkdir -p ~/.kube
    - echo $KUBE_CONFIG | base64 -d > ~/.kube/config
  script:
    - |
      helm upgrade --install trpc-app-staging ./helm/trpc-app \
        --namespace staging \
        --set image.repository=$CI_REGISTRY_IMAGE \
        --set image.tag=$CI_COMMIT_SHA \
        --set environment=staging \
        --set ingress.hosts[0].host=staging.myapp.com \
        --wait --timeout=5m
    - kubectl rollout status deployment/trpc-app-staging -n staging
    - curl -f https://staging.myapp.com/health || exit 1
  dependencies:
    - build:docker
  only:
    - develop

deploy:production:
  stage: deploy
  image:
    name: alpine/helm:latest
    entrypoint: [""]
  environment:
    name: production
    url: https://myapp.com
  before_script:
    - apk add --no-cache curl
    - curl -LO "https://dl.k8s.io/release/$(curl -L -s https://dl.k8s.io/release/stable.txt)/bin/linux/amd64/kubectl"
    - chmod +x kubectl
    - mv kubectl /usr/local/bin/
    - mkdir -p ~/.kube
    - echo $KUBE_CONFIG | base64 -d > ~/.kube/config
  script:
    - |
      # Blue-Green Deployment
      CURRENT_COLOR=$(kubectl get service trpc-app-production -o jsonpath='{.spec.selector.color}' 2>/dev/null || echo "blue")
      NEW_COLOR=$(if [ "$CURRENT_COLOR" = "blue" ]; then echo "green"; else echo "blue"; fi)
      
      echo "Deploying to $NEW_COLOR environment"
      
      helm upgrade --install trpc-app-production-$NEW_COLOR ./helm/trpc-app \
        --namespace production \
        --set image.repository=$CI_REGISTRY_IMAGE \
        --set image.tag=$CI_COMMIT_SHA \
        --set environment=production \
        --set color=$NEW_COLOR \
        --set ingress.hosts[0].host=myapp.com \
        --wait --timeout=10m
        
      # Health check
      kubectl rollout status deployment/trpc-app-production-$NEW_COLOR -n production
      
      # Smoke tests
      TEMP_URL=$(kubectl get ingress trpc-app-production-$NEW_COLOR -o jsonpath='{.status.loadBalancer.ingress[0].hostname}')
      curl -f http://$TEMP_URL/health || exit 1
      curl -f http://$TEMP_URL/api/trpc/health.check || exit 1
      
      # Switch traffic
      kubectl patch service trpc-app-production -p '{"spec":{"selector":{"color":"'$NEW_COLOR'"}}}'
      
      echo "Deployment successful! Traffic switched to $NEW_COLOR"
      
      # Cleanup old deployment after 5 minutes
      sleep 300
      helm uninstall trpc-app-production-$CURRENT_COLOR --namespace production || true
  dependencies:
    - build:docker
  when: manual
  only:
    - main

# 📊 Performance testing
performance:
  stage: deploy
  image:
    name: grafana/k6:latest
    entrypoint: [""]
  script:
    - k6 run --out json=k6-results.json tests/performance/load-test.js
  artifacts:
    reports:
      performance: k6-results.json
    when: always
  dependencies:
    - deploy:staging
  only:
    - develop
  allow_failure: true

📊 Performance Testing with k6

tests/performance/load-test.js
// 📁 tests/performance/load-test.js
import http from 'k6/http';
import { check, sleep } from 'k6';
import { Rate, Trend } from 'k6/metrics';

// 📊 Custom metrics
const errorRate = new Rate('error_rate');
const tRPCResponseTime = new Trend('trpc_response_time');

// ⚙️ Test configuration
export const options = {
  stages: [
    // 🚀 Ramp up
    { duration: '2m', target: 20 },
    { duration: '5m', target: 20 },
    { duration: '2m', target: 50 },
    { duration: '5m', target: 50 },
    { duration: '2m', target: 100 },
    { duration: '5m', target: 100 },
    // 📉 Ramp down
    { duration: '5m', target: 0 },
  ],
  thresholds: {
    http_req_duration: ['p(95)<500'], // 95% of requests < 500ms
    http_req_failed: ['rate<0.1'],     // Error rate < 10%
    error_rate: ['rate<0.05'],         // Custom error rate < 5%
    trpc_response_time: ['p(90)<300'], // 90% tRPC calls < 300ms
  },
};

// 🔑 Authentication token
let authToken = '';

export function setup() {
  // 🔑 Login to get auth token
  const loginResponse = http.post(
    `${__ENV.BASE_URL || 'https://staging.myapp.com'}/api/auth/login`,
    JSON.stringify({
      email: 'load-test@example.com',
      password: 'LoadTest123!',
    }),
    {
      headers: { 'Content-Type': 'application/json' },
    }
  );

  check(loginResponse, {
    'login successful': (r) => r.status === 200,
  });

  return { token: loginResponse.json('token') };
}

export default function (data) {
  authToken = data.token;
  
  // 🎯 Test scenarios
  const scenarios = [
    testHealthCheck,
    testUserOperations,
    testPostOperations,
    testBatchOperations,
    testWebSocketConnection,
  ];

  // 🎲 Random scenario selection
  const scenario = scenarios[Math.floor(Math.random() * scenarios.length)];
  scenario();

  // ⏰ Random think time
  sleep(Math.random() * 3 + 1);
}

function testHealthCheck() {
  const response = http.get(`${__ENV.BASE_URL}/health`);
  
  check(response, {
    'health check status is 200': (r) => r.status === 200,
    'health check response time < 100ms': (r) => r.timings.duration < 100,
  });

  errorRate.add(response.status !== 200);
}

function testUserOperations() {
  const baseUrl = `${__ENV.BASE_URL}/api/trpc`;
  
  // 🔍 Get user profile
  const startTime = Date.now();
  const profileResponse = http.get(`${baseUrl}/user.getProfile`, {
    headers: {
      'Authorization': `Bearer ${authToken}`,
      'Content-Type': 'application/json',
    },
  });
  const responseTime = Date.now() - startTime;

  check(profileResponse, {
    'get profile status is 200': (r) => r.status === 200,
    'profile has user data': (r) => {
      const data = JSON.parse(r.body);
      return data.result?.data?.id !== undefined;
    },
  });

  tRPCResponseTime.add(responseTime);
  errorRate.add(profileResponse.status !== 200);

  // ✏️ Update profile
  const updateResponse = http.post(
    `${baseUrl}/user.updateProfile`,
    JSON.stringify({
      name: `Load Test User ${Math.random().toString(36).substr(2, 9)}`,
      bio: 'Updated by load test',
    }),
    {
      headers: {
        'Authorization': `Bearer ${authToken}`,
        'Content-Type': 'application/json',
      },
    }
  );

  check(updateResponse, {
    'update profile status is 200': (r) => r.status === 200,
  });

  errorRate.add(updateResponse.status !== 200);
}

function testPostOperations() {
  const baseUrl = `${__ENV.BASE_URL}/api/trpc`;

  // 📝 Create post
  const createResponse = http.post(
    `${baseUrl}/post.create`,
    JSON.stringify({
      title: `Load Test Post ${Date.now()}`,
      content: 'This is a post created during load testing',
      published: true,
    }),
    {
      headers: {
        'Authorization': `Bearer ${authToken}`,
        'Content-Type': 'application/json',
      },
    }
  );

  let postId = null;
  check(createResponse, {
    'create post status is 200': (r) => r.status === 200,
    'post has ID': (r) => {
      const data = JSON.parse(r.body);
      postId = data.result?.data?.id;
      return postId !== undefined;
    },
  });

  errorRate.add(createResponse.status !== 200);

  if (postId) {
    // 📖 Get post
    const getResponse = http.get(`${baseUrl}/post.getById?input=${JSON.stringify({ id: postId })}`, {
      headers: {
        'Authorization': `Bearer ${authToken}`,
      },
    });

    check(getResponse, {
      'get post status is 200': (r) => r.status === 200,
    });

    errorRate.add(getResponse.status !== 200);
  }
}

function testBatchOperations() {
  const baseUrl = `${__ENV.BASE_URL}/api/trpc`;

  // 📦 Batch request
  const batchRequest = [
    {
      id: 1,
      method: 'query',
      params: { path: 'user.getProfile', input: {} },
    },
    {
      id: 2,
      method: 'query',
      params: { path: 'post.getRecent', input: { limit: 10 } },
    },
    {
      id: 3,
      method: 'query',
      params: { path: 'notification.getUnread', input: {} },
    },
  ];

  const batchResponse = http.post(
    baseUrl,
    JSON.stringify(batchRequest),
    {
      headers: {
        'Authorization': `Bearer ${authToken}`,
        'Content-Type': 'application/json',
      },
    }
  );

  check(batchResponse, {
    'batch request status is 200': (r) => r.status === 200,
    'batch response is array': (r) => {
      const data = JSON.parse(r.body);
      return Array.isArray(data) && data.length === 3;
    },
  });

  errorRate.add(batchResponse.status !== 200);
}

function testWebSocketConnection() {
  // 🔌 Test WebSocket upgrade
  const wsResponse = http.get(`${__ENV.BASE_URL}/api/ws`, {
    headers: {
      'Upgrade': 'websocket',
      'Connection': 'Upgrade',
      'Sec-WebSocket-Key': 'dGhlIHNhbXBsZSBub25jZQ==',
      'Sec-WebSocket-Version': '13',
    },
  });

  check(wsResponse, {
    'websocket upgrade status is 101': (r) => r.status === 101,
  });

  errorRate.add(wsResponse.status !== 101);
}

export function teardown(data) {
  // 🧹 Cleanup: logout
  http.post(
    `${__ENV.BASE_URL}/api/auth/logout`,
    null,
    {
      headers: {
        'Authorization': `Bearer ${data.token}`,
      },
    }
  );
}

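Note that --out json=k6-results.json (used by the GitLab performance job above) streams raw data points, not a final summary. If a compact end-of-run summary is also wanted, k6's handleSummary hook (available since k6 0.30) can produce one; a minimal sketch that could be appended to the same load-test.js:

// 📦 Optional addition to tests/performance/load-test.js
// Defining handleSummary replaces k6's default console summary.
export function handleSummary(data) {
  const p95 = data.metrics.http_req_duration.values['p(95)'];
  return {
    stdout: `p(95) request duration: ${p95}ms\n`,
    'k6-summary.json': JSON.stringify(data, null, 2),
  };
}
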
🏭 Jenkins Pipeline for tRPC

🏗️ Main Jenkinsfile

Jenkinsfile
// 📁 Jenkinsfile
pipeline {
    agent {
        kubernetes {
            yaml """
                apiVersion: v1
                kind: Pod
                spec:
                  containers:
                  - name: node
                    image: node:18-alpine
                    command:
                    - cat
                    tty: true
                    volumeMounts:
                    - name: docker-sock
                      mountPath: /var/run/docker.sock
                  - name: docker
                    image: docker:latest
                    command:
                    - cat
                    tty: true
                    volumeMounts:
                    - name: docker-sock
                      mountPath: /var/run/docker.sock
                  - name: helm
                    image: alpine/helm:latest
                    command:
                    - cat
                    tty: true
                  volumes:
                  - name: docker-sock
                    hostPath:
                      path: /var/run/docker.sock
            """
        }
    }

    environment {
        NODE_VERSION = '18'
        PNPM_VERSION = '8'
        DOCKER_REGISTRY = credentials('docker-registry')
        KUBE_CONFIG = credentials('kube-config')
        SNYK_TOKEN = credentials('snyk-token')
        SONAR_TOKEN = credentials('sonar-token')
        SLACK_WEBHOOK = credentials('slack-webhook')
    }

    options {
        buildDiscarder(logRotator(numToKeepStr: '10'))
        timeout(time: 45, unit: 'MINUTES')
        timestamps()
        ansiColor('xterm')
    }

    stages {
        stage('🔧 Setup') {
            steps {
                container('node') {
                    script {
                        // 📦 Install pnpm
                        sh '''
                            corepack enable
                            corepack prepare pnpm@${PNPM_VERSION} --activate
                            pnpm config set store-dir .pnpm-store
                        '''
                        
                        // 📦 Install dependencies
                        sh 'pnpm install --frozen-lockfile'
                        
                        // 📊 Display environment info
                        sh '''
                            echo "Node.js version: $(node --version)"
                            echo "pnpm version: $(pnpm --version)"
                            echo "Build number: ${BUILD_NUMBER}"
                            echo "Git commit: ${GIT_COMMIT}"
                        '''
                    }
                }
            }
            post {
                always {
                    // 📊 Archive pnpm cache for future builds
                    stash includes: '.pnpm-store/**', name: 'pnpm-cache', allowEmpty: true
                }
            }
        }

        stage('🔍 Quality Gates') {
            parallel {
                stage('Lint & Format') {
                    steps {
                        container('node') {
                            script {
                                // 🎨 Lint check
                                sh 'pnpm run lint --format=checkstyle > lint-results.xml || true'
                                
                                // 💄 Format check
                                sh 'pnpm run format:check'
                            }
                        }
                    }
                    post {
                        always {
                            // 📊 Publish lint results
                            recordIssues tools: [checkStyle(pattern: 'lint-results.xml')]
                        }
                    }
                }

                stage('Type Check') {
                    steps {
                        container('node') {
                            sh 'pnpm run type-check'
                        }
                    }
                }

                stage('Security Scan') {
                    steps {
                        container('node') {
                            script {
                                // 🔍 Dependency audit
                                sh 'pnpm audit --audit-level moderate --json > audit-results.json || true'
                                
                                // 🔒 Snyk security scan
                                sh '''
                                    npx snyk auth ${SNYK_TOKEN}
                                    npx snyk test --json > snyk-results.json || true
                                    npx snyk code test --json > snyk-code-results.json || true
                                '''
                            }
                        }
                    }
                    post {
                        always {
                            // 📊 Archive security scan results
                            archiveArtifacts artifacts: '*-results.json', allowEmptyArchive: true
                        }
                    }
                }
            }
        }

        stage('🏗️ Build') {
            steps {
                container('node') {
                    script {
                        // 🏗️ Build application
                        sh 'pnpm run build'
                        
                        // 📦 Create build info
                        sh '''
                            echo "{\"buildNumber\": \"${BUILD_NUMBER}\", \"gitCommit\": \"${GIT_COMMIT}\", \"timestamp\": \"$(date -u +%Y-%m-%dT%H:%M:%SZ)\"}" > dist/build-info.json
                        '''
                    }
                }
            }
            post {
                success {
                    // 📦 Archive build artifacts
                    archiveArtifacts artifacts: 'dist/**', fingerprint: true
                    stash includes: 'dist/**', name: 'build-artifacts'
                }
            }
        }

        stage('🧪 Tests') {
            parallel {
                stage('Unit Tests') {
                    steps {
                        container('node') {
                            script {
                                // 🗄️ Setup test database
                                sh '''
                                    cp .env.example .env.test
                                    export DATABASE_URL="sqlite::memory:"
                                    pnpm run db:migrate:deploy
                                '''
                                
                                // 🧪 Run unit tests
                                sh 'pnpm run test:unit --coverage --reporter=junit'
                            }
                        }
                    }
                    post {
                        always {
                            // 📊 Publish test results
                            junit testResults: 'test-results/unit/junit.xml', allowEmptyResults: true
                            
                            // 📊 Publish coverage
                            publishCoverage adapters: [
                                istanbulCoberturaAdapter('coverage/cobertura-coverage.xml')
                            ], sourceFileResolver: sourceFiles('STORE_LAST_BUILD')
                        }
                    }
                }

                stage('Integration Tests') {
                    steps {
                        container('node') {
                            script {
                                // 🗄️ Setup test services
                                sh '''
                                    export DATABASE_URL="sqlite:./test.db"
                                    export REDIS_URL="redis://localhost:6379"
                                    
                                    # Start test services
                                    docker run -d --name redis-test -p 6379:6379 redis:7-alpine
                                    
                                    # Setup database
                                    cp .env.example .env.test
                                    pnpm run db:migrate:deploy
                                    pnpm run db:seed
                                '''
                                
                                // 🔗 Run integration tests
                                sh 'pnpm run test:integration --reporter=junit'
                            }
                        }
                    }
                    post {
                        always {
                            // 🧹 Cleanup test services
                            sh 'docker rm -f redis-test || true'
                            
                            // 📊 Publish results
                            junit testResults: 'test-results/integration/junit.xml', allowEmptyResults: true
                        }
                    }
                }

                stage('E2E Tests') {
                    when {
                        anyOf {
                            branch 'main'
                            branch 'develop'
                            changeRequest()
                        }
                    }
                    steps {
                        container('node') {
                            script {
                                // 🚀 Start application for E2E
                                sh '''
                                    export NODE_ENV=test
                                    export DATABASE_URL="sqlite:./e2e-test.db"
                                    
                                    # Setup
                                    cp .env.example .env.test
                                    pnpm run db:migrate:deploy
                                    pnpm run db:seed
                                    
                                    # Start app in background
                                    pnpm run start:test &
                                    APP_PID=$!
                                    echo $APP_PID > app.pid
                                    
                                    # Wait for app to be ready
                                    timeout 60 bash -c 'until curl -f http://localhost:3000/health; do sleep 1; done'
                                '''
                                
                                // 🎭 Install Playwright and run tests
                                sh '''
                                    pnpm exec playwright install --with-deps
                                    pnpm run test:e2e
                                '''
                            }
                        }
                    }
                    post {
                        always {
                            // 🧹 Stop application
                            sh '''
                                if [ -f app.pid ]; then
                                    kill $(cat app.pid) || true
                                    rm app.pid
                                fi
                            '''
                            
                            // 📊 Archive E2E results
                            archiveArtifacts artifacts: 'test-results/**, playwright-report/**', allowEmptyArchive: true
                            junit testResults: 'test-results/e2e/junit.xml', allowEmptyResults: true
                        }
                    }
                }
            }
        }

        stage('📊 Code Quality') {
            when {
                anyOf {
                    branch 'main'
                    branch 'develop'
                }
            }
            steps {
                container('node') {
                    script {
                        // 📊 SonarQube analysis
                        sh '''
                            npx sonar-scanner \
                                -Dsonar.projectKey=trpc-app \
                                -Dsonar.sources=src \
                                -Dsonar.tests=tests \
                                -Dsonar.typescript.lcov.reportPaths=coverage/lcov.info \
                                -Dsonar.testExecutionReportPaths=test-results/unit/sonar.xml \
                                -Dsonar.host.url=${SONAR_HOST_URL} \
                                -Dsonar.login=${SONAR_TOKEN}
                        '''
                    }
                }
            }
        }

        stage('🐳 Docker Build') {
            when {
                anyOf {
                    branch 'main'
                    branch 'develop'
                }
            }
            steps {
                container('docker') {
                    script {
                        // 📥 Unstash build artifacts
                        unstash 'build-artifacts'
                        
                        // 🐳 Build Docker image
                        def imageName = "trpc-app"
                        def imageTag = env.BRANCH_NAME == 'main' ? 'latest' : env.BRANCH_NAME
                        def fullImageName = "${DOCKER_REGISTRY}/${imageName}:${imageTag}"
                        
                        sh "docker build -t ${fullImageName} ."
                        
                        // 🔒 Security scan
                        sh "docker run --rm -v /var/run/docker.sock:/var/run/docker.sock aquasec/trivy image --severity HIGH,CRITICAL ${fullImageName}"
                        
                        // 📤 Push image
                        sh "docker push ${fullImageName}"
                        
                        // 🏷️ Tag with build number
                        sh "docker tag ${fullImageName} ${DOCKER_REGISTRY}/${imageName}:${BUILD_NUMBER}"
                        sh "docker push ${DOCKER_REGISTRY}/${imageName}:${BUILD_NUMBER}"
                        
                        // 📝 Save image info for deployment
                        writeFile file: 'image-info.json', text: """
                        {
                            "registry": "${DOCKER_REGISTRY}",
                            "image": "${imageName}",
                            "tag": "${imageTag}",
                            "fullName": "${fullImageName}",
                            "buildNumber": "${BUILD_NUMBER}"
                        }
                        """
                        
                        stash includes: 'image-info.json', name: 'image-info'
                    }
                }
            }
        }

        stage('🚀 Deploy') {
            parallel {
                stage('Deploy to Staging') {
                    when {
                        branch 'develop'
                    }
                    steps {
                        container('helm') {
                            script {
                                deployToEnvironment('staging')
                            }
                        }
                    }
                }

                stage('Deploy to Production') {
                    when {
                        branch 'main'
                    }
                    steps {
                        // 🛡️ Manual approval for production
                        input message: 'Deploy to Production?', ok: 'Deploy',
                              submitterParameter: 'DEPLOYER'
                        
                        container('helm') {
                            script {
                                deployToEnvironment('production')
                            }
                        }
                    }
                }
            }
        }

        stage('🧪 Post-Deploy Tests') {
            when {
                anyOf {
                    branch 'main'
                    branch 'develop'
                }
            }
            steps {
                container('node') {
                    script {
                        def environment = env.BRANCH_NAME == 'main' ? 'production' : 'staging'
                        def baseUrl = environment == 'production' ? 'https://myapp.com' : 'https://staging.myapp.com'
                        
                        // 🏥 Health checks
                        sh """
                            timeout 60 bash -c 'until curl -f ${baseUrl}/health; do sleep 5; done'
                            curl -f ${baseUrl}/api/trpc/health.check
                        """
                        
                        // 💨 Smoke tests
                        sh """
                            export BASE_URL=${baseUrl}
                            pnpm run test:smoke
                        """
                    }
                }
            }
        }
    }

    post {
        always {
            // 🧹 Cleanup workspace
            cleanWs()
        }
        
        success {
            script {
                // 📢 Success notification
                slackSend(
                    channel: '#deployments',
                    color: 'good',
                    message: "✅ Build #${BUILD_NUMBER} succeeded for ${env.BRANCH_NAME} - ${env.GIT_COMMIT[0..7]}"
                )
            }
        }
        
        failure {
            script {
                // 📢 Failure notification
                slackSend(
                    channel: '#deployments',
                    color: 'danger',
                    message: "❌ Build #${BUILD_NUMBER} failed for ${env.BRANCH_NAME} - ${env.GIT_COMMIT[0..7]}"
                )
            }
        }
        
        unstable {
            script {
                // 📢 Unstable notification
                slackSend(
                    channel: '#deployments',
                    color: 'warning',
                    message: "⚠️ Build #${BUILD_NUMBER} unstable for ${env.BRANCH_NAME} - ${env.GIT_COMMIT[0..7]}"
                )
            }
        }
    }
}

// 🚀 Deployment function
def deployToEnvironment(String environment) {
    // 📥 Unstash image info
    unstash 'image-info'
    
    def imageInfo = readJSON file: 'image-info.json'
    
    // 🔧 Setup kubectl
    sh '''
        mkdir -p ~/.kube
        echo "${KUBE_CONFIG}" | base64 -d > ~/.kube/config
        chmod 600 ~/.kube/config
    '''
    
    // 🚀 Deploy with Helm
    sh """
        helm upgrade --install trpc-app-${environment} ./helm/trpc-app \
            --namespace ${environment} \
            --create-namespace \
            --set image.repository=${imageInfo.registry}/${imageInfo.image} \
            --set image.tag=${imageInfo.tag} \
            --set environment=${environment} \
            --set ingress.hosts[0].host=${environment == 'production' ? 'myapp.com' : 'staging.myapp.com'} \
            --wait --timeout=10m
    """
    
    // ✅ Verify deployment
    sh "kubectl rollout status deployment/trpc-app-${environment} -n ${environment}"
}

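The Post-Deploy Tests stage runs pnpm run test:smoke against the URL that was just deployed. That script is not shown in this lesson; one possible minimal implementation (an assumption, using only Node 18+'s built-in fetch) is:

scripts/smoke-test.ts
// 📁 scripts/smoke-test.ts (hypothetical implementation of "pnpm run test:smoke")
// Hits a handful of critical endpoints on BASE_URL and fails fast on any error.
const baseUrl = process.env.BASE_URL ?? 'http://localhost:3000';

const endpoints = ['/health', '/api/trpc/health.check'];

async function main(): Promise<void> {
  for (const path of endpoints) {
    const url = `${baseUrl}${path}`;
    const started = Date.now();
    const response = await fetch(url);
    const elapsed = Date.now() - started;

    if (!response.ok) {
      console.error(`❌ ${url} responded with ${response.status}`);
      process.exit(1);
    }
    console.log(`✅ ${url} ok in ${elapsed}ms`);
  }
}

main().catch((error) => {
  console.error('❌ Smoke test failed:', error);
  process.exit(1);
});
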
⚙️ Jenkins Shared Library

vars/deployTRPCApp.groovy
// 📁 vars/deployTRPCApp.groovy
def call(Map config) {
    def environment = config.environment
    def imageTag = config.imageTag ?: 'latest'
    def namespace = config.namespace ?: environment
    def timeout = config.timeout ?: '10m'
    
    echo "🚀 Deploying tRPC app to ${environment}"
    
    // 🔧 Validate required parameters
    if (!environment) {
        error "Environment is required"
    }
    
    try {
        // 🎯 Deploy with Helm
        sh """
            helm upgrade --install trpc-app-${environment} ./helm/trpc-app \
                --namespace ${namespace} \
                --create-namespace \
                --set image.tag=${imageTag} \
                --set environment=${environment} \
                --set replicaCount=${config.replicas ?: (environment == 'production' ? 3 : 1)} \
                --set resources.requests.memory=${config.memory ?: '256Mi'} \
                --set resources.requests.cpu=${config.cpu ?: '250m'} \
                --wait --timeout=${timeout} \
                --values helm/trpc-app/values-${environment}.yaml
        """
        
        // ✅ Verify deployment
        sh "kubectl rollout status deployment/trpc-app-${environment} -n ${namespace}"
        
        // 🏥 Health check
        def healthUrl = getHealthUrl(environment)
        sh "timeout 60 bash -c 'until curl -f ${healthUrl}/health; do sleep 5; done'"
        
        echo "✅ Deployment to ${environment} successful"
        
        // 📊 Update deployment metrics
        updateDeploymentMetrics(environment, 'success')
        
    } catch (Exception e) {
        echo "❌ Deployment to ${environment} failed: ${e.message}"
        
        // 📊 Update deployment metrics
        updateDeploymentMetrics(environment, 'failure')
        
        // 🔄 Optional rollback
        if (config.autoRollback) {
            echo "🔄 Auto-rollback enabled, rolling back..."
            rollbackDeployment(environment, namespace)
        }
        
        throw e
    }
}

def getHealthUrl(String environment) {
    switch(environment) {
        case 'production':
            return 'https://myapp.com'
        case 'staging':
            return 'https://staging.myapp.com'
        default:
            return "https://${environment}.myapp.com"
    }
}

def rollbackDeployment(String environment, String namespace) {
    echo "🔄 Rolling back deployment in ${environment}"
    
    sh """
        helm rollback trpc-app-${environment} --namespace ${namespace}
        kubectl rollout status deployment/trpc-app-${environment} -n ${namespace}
    """
    
    echo "✅ Rollback completed"
}

def updateDeploymentMetrics(String environment, String status) {
    // 📊 Send metrics to monitoring system (METRICS_ENDPOINT is expected in the shell environment)
    def timestamp = new Date().format("yyyy-MM-dd'T'HH:mm:ss'Z'", TimeZone.getTimeZone('UTC'))
    sh """
        curl -X POST \${METRICS_ENDPOINT}/deployment \
            -H "Content-Type: application/json" \
            -d '{
                "environment": "${environment}",
                "status": "${status}",
                "timestamp": "${timestamp}",
                "build_number": "${env.BUILD_NUMBER}",
                "git_commit": "${env.GIT_COMMIT}"
            }' || true
    """
}

🚦 Quality Gates and Policies

📏 SonarQube Quality Gate

sonar-project.properties
# 📁 sonar-project.properties
sonar.projectKey=trpc-app
sonar.projectName=tRPC Application
sonar.projectVersion=1.0

# 📂 Source and test directories
sonar.sources=src
sonar.tests=tests
sonar.exclusions=**/*.test.ts,**/*.spec.ts,**/node_modules/**,**/dist/**

# 📊 Coverage settings
sonar.typescript.lcov.reportPaths=coverage/lcov.info
sonar.testExecutionReportPaths=test-results/sonar.xml

# 🔧 TypeScript configuration
sonar.typescript.tsconfigPath=tsconfig.json

# 🎯 Quality Gate conditions
sonar.qualitygate.wait=true

# 📊 New Code thresholds (mirroring the Quality Gate conditions configured in SonarQube)
sonar.coverage.new_code.minimum=80
sonar.duplicated_lines_density.new_code.maximum=3
sonar.maintainability_rating.new_code.minimum=A
sonar.reliability_rating.new_code.minimum=A
sonar.security_rating.new_code.minimum=A

# 🔍 Code smell thresholds
sonar.sqale_rating.new_code.minimum=A
sonar.complexity.threshold=10
sonar.cognitive_complexity.threshold=15

🔒 Branch Protection Rules

.github/branch-protection.json
# 📁 .github/branch-protection.json
{
  "required_status_checks": {
    "strict": true,
    "contexts": [
      "quality/lint",
      "quality/typecheck", 
      "quality/build",
      "tests/unit",
      "tests/integration",
      "tests/e2e",
      "security/dependency-scan",
      "security/code-analysis",
      "sonarcloud",
      "codecov/patch",
      "codecov/project"
    ]
  },
  "enforce_admins": false,
  "required_pull_request_reviews": {
    "required_approving_review_count": 2,
    "dismiss_stale_reviews": true,
    "require_code_owner_reviews": true,
    "require_last_push_approval": true,
    "bypass_pull_request_allowances": {
      "users": [],
      "teams": ["platform-team"],
      "apps": ["dependabot"]
    }
  },
  "restrictions": {
    "users": [],
    "teams": ["senior-developers", "platform-team"],
    "apps": []
  },
  "required_conversation_resolution": true,
  "allow_deletions": false,
  "allow_force_pushes": false,
  "allow_fork_syncing": true,
  "block_creations": false,
  "required_linear_history": true
}

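GitHub does not read this file on its own; the rules have to be applied to the repository, either once by an administrator or from a pipeline. A hedged sketch using the REST endpoint PUT /repos/{owner}/{repo}/branches/{branch}/protection (the owner and repository names below are placeholders):

scripts/apply-branch-protection.ts
// 📁 scripts/apply-branch-protection.ts (illustrative; owner/repo are placeholders)
import fs from 'fs';

const OWNER = 'my-org';                 // ⚠️ placeholder
const REPO = 'trpc-app';                // ⚠️ placeholder
const BRANCH = 'main';
const TOKEN = process.env.GITHUB_TOKEN; // token needs repository administration permission

async function main(): Promise<void> {
  const rules = JSON.parse(
    fs.readFileSync('.github/branch-protection.json', 'utf8')
  );

  const response = await fetch(
    `https://api.github.com/repos/${OWNER}/${REPO}/branches/${BRANCH}/protection`,
    {
      method: 'PUT',
      headers: {
        Authorization: `Bearer ${TOKEN}`,
        Accept: 'application/vnd.github+json',
        'Content-Type': 'application/json',
      },
      body: JSON.stringify(rules),
    }
  );

  if (!response.ok) {
    throw new Error(`GitHub API returned ${response.status}: ${await response.text()}`);
  }
  console.log(`✅ Branch protection applied to ${BRANCH}`);
}

main().catch((error) => {
  console.error('❌ Failed to apply branch protection:', error);
  process.exit(1);
});
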
🎯 Custom Quality Checks

scripts/quality-gate.sh
#!/bin/bash
# 📁 scripts/quality-gate.sh

set -e

echo "🚦 Running Quality Gate Checks..."

# 📊 Variables
COVERAGE_THRESHOLD=80
COMPLEXITY_THRESHOLD=10
DUPLICATION_THRESHOLD=3
SECURITY_THRESHOLD="moderate"

# 🧪 Check test coverage
echo "📊 Checking test coverage..."
COVERAGE=$(cat coverage/coverage-summary.json | jq '.total.lines.pct')
if (( $(echo "$COVERAGE < $COVERAGE_THRESHOLD" | bc -l) )); then
    echo "❌ Coverage $COVERAGE% is below threshold $COVERAGE_THRESHOLD%"
    exit 1
fi
echo "✅ Coverage: $COVERAGE%"

# 🔍 Check code complexity
echo "🔧 Checking code complexity..."
COMPLEXITY=$(npx ts-node scripts/check-complexity.ts)
if (( $(echo "$COMPLEXITY > $COMPLEXITY_THRESHOLD" | bc -l) )); then
    echo "❌ Average complexity $COMPLEXITY is above threshold $COMPLEXITY_THRESHOLD"
    exit 1
fi
echo "✅ Complexity: $COMPLEXITY"

# 📝 Check code duplication
echo "📝 Checking code duplication..."
DUPLICATION=$(npx jscpd src --format json | jq '.statistics.total.percentage')
if (( $(echo "$DUPLICATION > $DUPLICATION_THRESHOLD" | bc -l) )); then
    echo "❌ Code duplication $DUPLICATION% is above threshold $DUPLICATION_THRESHOLD%"
    exit 1
fi
echo "✅ Duplication: $DUPLICATION%"

# 🔒 Check security vulnerabilities
echo "🔒 Checking security vulnerabilities..."
VULNS=$(pnpm audit --audit-level $SECURITY_THRESHOLD --json | jq '.metadata.vulnerabilities | ((.moderate // 0) + (.high // 0) + (.critical // 0))')
if [ "$VULNS" -gt 0 ]; then
    echo "❌ Found $VULNS $SECURITY_THRESHOLD+ security vulnerabilities"
    pnpm audit --audit-level $SECURITY_THRESHOLD
    exit 1
fi
echo "✅ No $SECURITY_THRESHOLD+ security vulnerabilities found"

# 📏 Check bundle size
echo "📦 Checking bundle size..."
MAX_BUNDLE_SIZE=500000 # 500KB
BUNDLE_SIZE=$(du -b dist/main.js | cut -f1)
if [ "$BUNDLE_SIZE" -gt "$MAX_BUNDLE_SIZE" ]; then
    echo "❌ Bundle size $(($BUNDLE_SIZE / 1024))KB exceeds limit $(($MAX_BUNDLE_SIZE / 1024))KB"
    exit 1
fi
echo "✅ Bundle size: $(($BUNDLE_SIZE / 1024))KB"

# 🎨 Check TypeScript strict mode
echo "🎨 Checking TypeScript configuration..."
STRICT_MODE=$(cat tsconfig.json | jq '.compilerOptions.strict')
if [ "$STRICT_MODE" != "true" ]; then
    echo "❌ TypeScript strict mode is not enabled"
    exit 1
fi
echo "✅ TypeScript strict mode enabled"

# 📊 Performance budget check
echo "⚡ Checking performance budget..."
if [ -f "lighthouse-results.json" ]; then
    PERFORMANCE_SCORE=$(cat lighthouse-results.json | jq '.categories.performance.score * 100')
    MIN_PERFORMANCE=90
    if (( $(echo "$PERFORMANCE_SCORE < $MIN_PERFORMANCE" | bc -l) )); then
        echo "❌ Performance score $PERFORMANCE_SCORE is below $MIN_PERFORMANCE"
        exit 1
    fi
    echo "✅ Performance score: $PERFORMANCE_SCORE"
fi

# 📝 Check API documentation coverage
echo "📚 Checking API documentation..."
if [ -f "docs/api-coverage.json" ]; then
    DOC_COVERAGE=$(cat docs/api-coverage.json | jq '.coverage')
    MIN_DOC_COVERAGE=95
    if (( $(echo "$DOC_COVERAGE < $MIN_DOC_COVERAGE" | bc -l) )); then
        echo "❌ API documentation coverage $DOC_COVERAGE% is below $MIN_DOC_COVERAGE%"
        exit 1
    fi
    echo "✅ API documentation coverage: $DOC_COVERAGE%"
fi

echo "🎉 All quality gates passed!"
exit 0

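The gate above shells out to scripts/check-complexity.ts, which is not included in the lesson. One possible minimal implementation (an assumption, using the TypeScript compiler API plus the glob package to approximate cyclomatic complexity and print the project average as a single number):

scripts/check-complexity.ts
// 📁 scripts/check-complexity.ts (hypothetical implementation of the script used above)
// Approximates cyclomatic complexity per file by counting branching nodes,
// then prints the project-wide average so quality-gate.sh can capture it.
import * as ts from 'typescript';
import { globSync } from 'glob';

const BRANCHING = new Set<ts.SyntaxKind>([
  ts.SyntaxKind.IfStatement,
  ts.SyntaxKind.ForStatement,
  ts.SyntaxKind.ForInStatement,
  ts.SyntaxKind.ForOfStatement,
  ts.SyntaxKind.WhileStatement,
  ts.SyntaxKind.DoStatement,
  ts.SyntaxKind.CaseClause,
  ts.SyntaxKind.CatchClause,
  ts.SyntaxKind.ConditionalExpression,
]);

function complexityOf(sourceFile: ts.SourceFile): number {
  let complexity = 1; // base execution path
  const visit = (node: ts.Node): void => {
    if (BRANCHING.has(node.kind)) complexity++;
    if (ts.isBinaryExpression(node)) {
      const op = node.operatorToken.kind;
      if (op === ts.SyntaxKind.AmpersandAmpersandToken || op === ts.SyntaxKind.BarBarToken) {
        complexity++;
      }
    }
    ts.forEachChild(node, visit);
  };
  visit(sourceFile);
  return complexity;
}

const files = globSync('src/**/*.ts', { ignore: ['**/*.test.ts', '**/*.spec.ts'] });
const scores = files.map((file) =>
  complexityOf(
    ts.createSourceFile(file, ts.sys.readFile(file) ?? '', ts.ScriptTarget.Latest, true)
  )
);

const average = scores.length
  ? scores.reduce((sum, value) => sum + value, 0) / scores.length
  : 0;

// quality-gate.sh reads this single number from stdout
console.log(average.toFixed(2));
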
📊 Quality Metrics Collector

scripts/collect-metrics.ts
// 📁 scripts/collect-metrics.ts
import fs from 'fs';
import path from 'path';
import { execSync } from 'child_process';

interface QualityMetrics {
  timestamp: string;
  buildNumber: string;
  gitCommit: string;
  coverage: {
    lines: number;
    statements: number;
    functions: number;
    branches: number;
  };
  complexity: {
    average: number;
    max: number;
    files: Array<{
      file: string;
      complexity: number;
    }>;
  };
  duplication: {
    percentage: number;
    lines: number;
    files: number;
  };
  security: {
    vulnerabilities: {
      critical: number;
      high: number;
      medium: number;
      low: number;
    };
    lastScan: string;
  };
  performance: {
    bundleSize: number;
    buildTime: number;
    testTime: number;
  };
  codeQuality: {
    lintErrors: number;
    lintWarnings: number;
    typeErrors: number;
  };
}

class QualityMetricsCollector {
  private metrics: QualityMetrics;

  constructor() {
    this.metrics = {
      timestamp: new Date().toISOString(),
      buildNumber: process.env.BUILD_NUMBER || 'local',
      gitCommit: this.getGitCommit(),
      coverage: { lines: 0, statements: 0, functions: 0, branches: 0 },
      complexity: { average: 0, max: 0, files: [] },
      duplication: { percentage: 0, lines: 0, files: 0 },
      security: {
        vulnerabilities: { critical: 0, high: 0, medium: 0, low: 0 },
        lastScan: new Date().toISOString(),
      },
      performance: { bundleSize: 0, buildTime: 0, testTime: 0 },
      codeQuality: { lintErrors: 0, lintWarnings: 0, typeErrors: 0 },
    };
  }

  // 🔧 Collect coverage metrics
  async collectCoverage(): Promise<void> {
    try {
      const coveragePath = 'coverage/coverage-summary.json';
      if (fs.existsSync(coveragePath)) {
        const coverage = JSON.parse(fs.readFileSync(coveragePath, 'utf8'));
        this.metrics.coverage = {
          lines: coverage.total.lines.pct,
          statements: coverage.total.statements.pct,
          functions: coverage.total.functions.pct,
          branches: coverage.total.branches.pct,
        };
      }
    } catch (error) {
      console.warn('⚠️ Could not collect coverage metrics:', error);
    }
  }

  // 🔧 Collect complexity metrics
  async collectComplexity(): Promise<void> {
    try {
      const complexityResults = execSync(
        'npx ts-node scripts/analyze-complexity.ts',
        { encoding: 'utf8' }
      );
      const complexity = JSON.parse(complexityResults);
      
      this.metrics.complexity = {
        average: complexity.average,
        max: complexity.max,
        files: complexity.files.slice(0, 10), // Top 10 most complex files
      };
    } catch (error) {
      console.warn('⚠️ Could not collect complexity metrics:', error);
    }
  }

  // 📝 Collect duplication metrics
  async collectDuplication(): Promise<void> {
    try {
      const duplicationResults = execSync(
        'npx jscpd src --format json',
        { encoding: 'utf8' }
      );
      const duplication = JSON.parse(duplicationResults);
      
      this.metrics.duplication = {
        percentage: duplication.statistics.total.percentage,
        lines: duplication.statistics.total.duplicatedLines,
        files: duplication.statistics.total.duplicatedFiles,
      };
    } catch (error) {
      console.warn('⚠️ Could not collect duplication metrics:', error);
    }
  }

  // 🔒 Collect security metrics
  async collectSecurity(): Promise<void> {
    try {
      // pnpm audit exits non-zero when vulnerabilities exist, so the JSON
      // report must also be recovered from the error thrown by execSync
      let auditResults: string;
      try {
        auditResults = execSync('pnpm audit --json', { encoding: 'utf8' });
      } catch (auditError) {
        auditResults = (auditError as { stdout?: string }).stdout ?? '{}';
      }
      const audit = JSON.parse(auditResults);

      this.metrics.security.vulnerabilities = {
        critical: audit.metadata?.vulnerabilities?.critical || 0,
        high: audit.metadata?.vulnerabilities?.high || 0,
        medium: audit.metadata?.vulnerabilities?.moderate || 0,
        low: audit.metadata?.vulnerabilities?.low || 0,
      };
    } catch (error) {
      console.warn('⚠️ Could not collect security metrics:', error);
    }
  }

  // ⚡ Collect performance metrics
  async collectPerformance(): Promise<void> {
    try {
      // Bundle size
      const bundlePath = 'dist/main.js';
      if (fs.existsSync(bundlePath)) {
        this.metrics.performance.bundleSize = fs.statSync(bundlePath).size;
      }

      // Build time (from CI environment or build logs)
      if (process.env.BUILD_START_TIME) {
        const buildStart = parseInt(process.env.BUILD_START_TIME);
        this.metrics.performance.buildTime = Date.now() - buildStart;
      }

      // Test time (from test results)
      const testResultsPath = 'test-results/performance.json';
      if (fs.existsSync(testResultsPath)) {
        const testResults = JSON.parse(fs.readFileSync(testResultsPath, 'utf8'));
        this.metrics.performance.testTime = testResults.totalTime;
      }
    } catch (error) {
      console.warn('⚠️ Could not collect performance metrics:', error);
    }
  }

  // 🎨 Collect code quality metrics
  async collectCodeQuality(): Promise<void> {
    try {
      // Lint results
      const lintResultsPath = 'lint-results.json';
      if (fs.existsSync(lintResultsPath)) {
        const lintResults = JSON.parse(fs.readFileSync(lintResultsPath, 'utf8'));
        this.metrics.codeQuality.lintErrors = lintResults.errorCount || 0;
        this.metrics.codeQuality.lintWarnings = lintResults.warningCount || 0;
      }

      // TypeScript errors
      try {
        execSync('pnpm run type-check', { stdio: 'pipe' });
        this.metrics.codeQuality.typeErrors = 0;
      } catch (error) {
        // Count "error TSxxxx" occurrences in the compiler output
        // (stdout/stderr are Buffers because no encoding was passed)
        const execError = error as { stdout?: Buffer; stderr?: Buffer };
        const output =
          (execError.stdout?.toString() ?? '') + (execError.stderr?.toString() ?? '');
        const errorMatches = output.match(/error TS\d+/g);
        this.metrics.codeQuality.typeErrors = errorMatches ? errorMatches.length : 1;
      }
    } catch (error) {
      console.warn('⚠️ Could not collect code quality metrics:', error);
    }
  }

  // 🔧 Helper methods
  private getGitCommit(): string {
    try {
      return execSync('git rev-parse HEAD', { encoding: 'utf8' }).trim();
    } catch {
      return process.env.GIT_COMMIT || 'unknown';
    }
  }

  // 📊 Collect all metrics
  async collectAll(): Promise<QualityMetrics> {
    console.log('📊 Collecting quality metrics...');
    
    await Promise.all([
      this.collectCoverage(),
      this.collectComplexity(),
      this.collectDuplication(),
      this.collectSecurity(),
      this.collectPerformance(),
      this.collectCodeQuality(),
    ]);

    return this.metrics;
  }

  // 💾 Save metrics
  async save(outputPath: string = 'quality-metrics.json'): Promise<void> {
    fs.writeFileSync(outputPath, JSON.stringify(this.metrics, null, 2));
    console.log(`📄 Quality metrics saved to ${outputPath}`);
  }

  // 📤 Send to monitoring system
  async sendToMonitoring(): Promise<void> {
    const monitoringEndpoint = process.env.METRICS_ENDPOINT;
    if (!monitoringEndpoint) {
      console.warn('⚠️ METRICS_ENDPOINT not configured, skipping...');
      return;
    }

    try {
      const response = await fetch(monitoringEndpoint, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(this.metrics),
      });

      if (response.ok) {
        console.log('✅ Metrics sent to monitoring system');
      } else {
        console.error('❌ Failed to send metrics:', response.statusText);
      }
    } catch (error) {
      console.error('❌ Error sending metrics:', error);
    }
  }
}

// 🚀 Main execution
async function main() {
  const collector = new QualityMetricsCollector();
  
  try {
    const metrics = await collector.collectAll();
    await collector.save();
    await collector.sendToMonitoring();
    
    console.log('🎉 Quality metrics collection completed');
    console.log(`📊 Coverage: ${metrics.coverage.lines}%`);
    console.log(`🔧 Complexity: ${metrics.complexity.average}`);
    console.log(`📝 Duplication: ${metrics.duplication.percentage}%`);
    console.log(`🔒 Vulnerabilities: ${Object.values(metrics.security.vulnerabilities).reduce((a, b) => a + b, 0)}`);
    
  } catch (error) {
    console.error('❌ Error collecting metrics:', error);
    process.exit(1);
  }
}

if (require.main === module) {
  main();
}

export { QualityMetricsCollector, type QualityMetrics };
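
The generated quality-metrics.json can also feed an extra gate of its own. Below is a minimal sketch, assuming a hypothetical scripts/check-metrics.ts and illustrative thresholds that each team would tune, that fails the build when the collected metrics cross those limits:

scripts/check-metrics.ts
// 📁 scripts/check-metrics.ts (hypothetical gate built on top of quality-metrics.json)
import fs from 'fs';
import type { QualityMetrics } from './collect-metrics';

// Example thresholds — tune these values for your project
const THRESHOLDS = {
  minLineCoverage: 80,
  maxAverageComplexity: 10,
  maxDuplicationPercentage: 3,
  maxHighOrCriticalVulnerabilities: 0,
};

const metrics: QualityMetrics = JSON.parse(
  fs.readFileSync('quality-metrics.json', 'utf8')
);

const failures: string[] = [];

if (metrics.coverage.lines < THRESHOLDS.minLineCoverage) {
  failures.push(`Line coverage ${metrics.coverage.lines}% < ${THRESHOLDS.minLineCoverage}%`);
}
if (metrics.complexity.average > THRESHOLDS.maxAverageComplexity) {
  failures.push(`Average complexity ${metrics.complexity.average} > ${THRESHOLDS.maxAverageComplexity}`);
}
if (metrics.duplication.percentage > THRESHOLDS.maxDuplicationPercentage) {
  failures.push(`Duplication ${metrics.duplication.percentage}% > ${THRESHOLDS.maxDuplicationPercentage}%`);
}
const severeVulns =
  metrics.security.vulnerabilities.critical + metrics.security.vulnerabilities.high;
if (severeVulns > THRESHOLDS.maxHighOrCriticalVulnerabilities) {
  failures.push(`${severeVulns} high/critical vulnerabilities found`);
}

if (failures.length > 0) {
  console.error('❌ Metrics quality gate failed:');
  failures.forEach((failure) => console.error(`  - ${failure}`));
  process.exit(1);
}

console.log('✅ Metrics quality gate passed');

In the pipeline, this gate would run right after the collector, for example with npx ts-node scripts/collect-metrics.ts && npx ts-node scripts/check-metrics.ts.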

✅ What you accomplished in this lesson

Complete GitHub Actions pipeline
GitLab CI with matrix testing
Enterprise Jenkins pipeline
Automated Security Scanning
Strict Quality Gates
Performance Testing with K6
Safe Blue-Green Deployment
Automatic Metrics Collection

🎯 Next Steps

In the next lesson, we will explore Performance and Optimization, implementing advanced caching, lazy loading, and query optimization techniques for high-performance tRPC applications.
