CI/CD Pipeline Integrations

Integrate Donobu with popular CI/CD platforms like GitHub Actions, Jenkins, and GitLab CI for automated testing and deployment workflows.

CI/CD Integration Overview

Integrating Donobu into your continuous integration and deployment pipelines ensures automated testing at every stage of development. Every platform integration below follows the same pattern: start Donobu (typically as a Docker service container), wait for its API to respond, register a model configuration, then create flows and poll them until they succeed or fail.
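
The sketch below shows that core loop in plain shell; it assumes Donobu is reachable at http://localhost:31000 on the build machine and that curl and jq are installed, and the flow name, target site, and objective are placeholders. Every platform-specific example on this page is a variation of it.

#!/usr/bin/env bash
# Minimal Donobu CI pattern: wait for the API, create a flow, poll it to completion.
set -euo pipefail

API="http://localhost:31000/api"

# 1. Wait until Donobu answers its health check
until curl -sf "$API/ping" > /dev/null; do sleep 2; done

# 2. Create a flow (name, site, and objective are placeholders)
flow_id=$(curl -sf -X POST "$API/flows" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "CI Smoke Test",
    "targetWebsite": "https://staging.example.com",
    "overallObjective": "Test main user journey",
    "maxToolCalls": 25
  }' | jq -r '.id')

# 3. Poll until the flow succeeds or fails
while true; do
  state=$(curl -s "$API/flows/$flow_id" | jq -r '.state')
  case "$state" in
    SUCCESS) echo "Flow passed"; exit 0 ;;
    FAILED)  curl -s "$API/flows/$flow_id" | jq '.result'; exit 1 ;;
    *)       sleep 5 ;;
  esac
done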

GitHub Actions Integration

Complete Workflow Example

# .github/workflows/e2e-tests.yml
name: E2E Testing with Donobu
on:
  push:
    branches: [main, develop]
  pull_request:
    branches: [main]

jobs:
  e2e-tests:
    runs-on: ubuntu-latest
    services:
      donobu:
        image: donobu/studio:latest
        ports:
          - 31000:31000
        env:
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}

    steps:
      - uses: actions/checkout@v4
      
      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'
          cache: 'npm'
      
      - name: Install dependencies
        run: npm install
      
      - name: Wait for Donobu to be ready
        run: |
          until curl -f http://localhost:31000/api/ping; do
            echo "Waiting for Donobu..."
            sleep 2
          done
      
      - name: Setup test environment
        run: |
          # Configure GPT
          curl -X POST http://localhost:31000/api/gpt-configs/ci-config \
            -H "Content-Type: application/json" \
            -d '{
              "type": "OPENAI",
              "apiKey": "${{ secrets.OPENAI_API_KEY }}",
              "modelName": "gpt-4"
            }'
          
          # Set environment variables
          curl -X POST http://localhost:31000/api/env/BASE_URL \
            -H "Content-Type: application/json" \
            -d '{"value": "https://staging.example.com"}'
      
      - name: Run E2E tests
        run: npm run test:e2e
        env:
          CI: true
      
      - name: Upload test artifacts
        if: failure()
        uses: actions/upload-artifact@v4
        with:
          name: test-results
          path: |
            screenshots/
            videos/
            test-reports/

Advanced GitHub Actions Setup

# .github/workflows/matrix-testing.yml
name: Cross-Browser Testing
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        browser: [chrome, firefox, webkit]  # Safari itself is unavailable on Linux runners; webkit is the closest stand-in
        environment: [staging, production]
    
    steps:
      - uses: actions/checkout@v4
      
      - name: Run browser-specific tests
        env:
          BROWSER: ${{ matrix.browser }}
          ENVIRONMENT: ${{ matrix.environment }}
        run: |
          npm run test:donobu -- --browser=$BROWSER --env=$ENVIRONMENT
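
If the matrix environments point at different deployments, the per-environment base URL can be handed to Donobu through the same /api/env endpoint used in the complete workflow above. A minimal sketch of that step; the URLs are placeholders and the test script name mirrors the example above:

#!/usr/bin/env bash
# Resolve the matrix ENVIRONMENT value to a base URL, register it with Donobu,
# then run the browser-specific tests.
set -euo pipefail

case "$ENVIRONMENT" in
  staging)    BASE_URL="https://staging.example.com" ;;
  production) BASE_URL="https://example.com" ;;
  *) echo "Unknown environment: $ENVIRONMENT" >&2; exit 1 ;;
esac

curl -sf -X POST "http://localhost:31000/api/env/BASE_URL" \
  -H "Content-Type: application/json" \
  -d "{\"value\": \"$BASE_URL\"}"

npm run test:donobu -- --browser="$BROWSER" --env="$ENVIRONMENT"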

Jenkins Pipeline Integration

Declarative Pipeline

// Jenkinsfile
pipeline {
    agent any
    
    environment {
        DONOBU_API_URL = 'http://localhost:31000/api'
        OPENAI_API_KEY = credentials('openai-api-key')
    }
    
    stages {
        stage('Setup') {
            steps {
                script {
                    // Start Donobu container
                    sh '''
                        docker run -d --name donobu-test \
                          -p 31000:31000 \
                          -e OPENAI_API_KEY=${OPENAI_API_KEY} \
                          donobu/studio:latest
                    '''
                    
                    // Wait for service to be ready
                    sh '''
                        until curl -f ${DONOBU_API_URL}/ping; do
                            sleep 2
                        done
                    '''
                }
            }
        }
        
        stage('Configure') {
            steps {
                sh '''
                    # Setup GPT configuration
                    curl -X POST ${DONOBU_API_URL}/gpt-configs/jenkins-config \
                      -H "Content-Type: application/json" \
                      -d "{
                        \\"type\\": \\"OPENAI\\",
                        \\"apiKey\\": \\"${OPENAI_API_KEY}\\",
                        \\"modelName\\": \\"gpt-4\\"
                      }"
                '''
            }
        }
        
        stage('E2E Tests') {
            steps {
                script {
                    def flowConfigs = [
                        [
                            name: 'Critical Path Test',
                            objective: 'Test main user journey',
                            website: 'https://staging.example.com'
                        ],
                        [
                            name: 'Admin Flow Test', 
                            objective: 'Test admin functionality',
                            website: 'https://staging.example.com/admin'
                        ]
                    ]
                    
                    def results = [:]
                    
                    flowConfigs.each { config ->
                        def flowId = sh(
                            script: """
                                curl -X POST ${DONOBU_API_URL}/flows \
                                  -H "Content-Type: application/json" \
                                  -d '{
                                    "name": "${config.name}",
                                    "targetWebsite": "${config.website}",
                                    "overallObjective": "${config.objective}",
                                    "maxToolCalls": 25
                                  }' | jq -r '.id'
                            """,
                            returnStdout: true
                        ).trim()
                        
                        results[config.name] = flowId
                    }
                    
                    // Wait for all flows to complete
                    results.each { name, flowId ->
                        sh """
                            while true; do
                                state=\$(curl -s ${DONOBU_API_URL}/flows/${flowId} | jq -r '.state')
                                if [ "\$state" = "SUCCESS" ]; then
                                    echo "${name} completed successfully"
                                    break
                                elif [ "\$state" = "FAILED" ]; then
                                    echo "${name} failed"
                                    curl -s ${DONOBU_API_URL}/flows/${flowId} | jq '.result'
                                    exit 1
                                fi
                                sleep 5
                            done
                        """
                    }
                }
            }
        }
    }
    
    post {
        always {
            sh 'docker rm -f donobu-test || true'
        }
        failure {
            script {
                // Download artifacts for failed tests
                sh '''
                    mkdir -p test-artifacts
                    # Download screenshots and videos
                    # Implementation depends on specific needs
                '''
            }
            archiveArtifacts artifacts: 'test-artifacts/**/*', fingerprint: true, allowEmptyArchive: true
        }
    }
}

Jenkins Shared Library

// vars/donobuTest.groovy
def call(Map config) {
    def flowId = sh(
        script: """
            curl -X POST ${env.DONOBU_API_URL}/flows \
              -H "Content-Type: application/json" \
              -d '${groovy.json.JsonOutput.toJson(config)}' \
              | jq -r '.id'
        """,
        returnStdout: true
    ).trim()
    
    // Wait for completion
    sh """
        while true; do
            state=\$(curl -s ${env.DONOBU_API_URL}/flows/${flowId} | jq -r '.state')
            if [ "\$state" = "SUCCESS" ]; then
                echo "Flow completed successfully"
                break
            elif [ "\$state" = "FAILED" ]; then
                echo "Flow failed"
                exit 1
            fi
            sleep 5
        done
    """
    
    return flowId
}
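
Once this file is on the shared library path, a Jenkinsfile can trigger a flow with a single step call, for example donobuTest(name: 'Critical Path Test', targetWebsite: 'https://staging.example.com', overallObjective: 'Test main user journey'); the map you pass is serialized verbatim as the flow-creation payload.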

GitLab CI Integration

Complete Pipeline Configuration

# .gitlab-ci.yml
stages:
  - test
  - deploy

variables:
  DONOBU_IMAGE: donobu/studio:latest
  DOCKER_DRIVER: overlay2

services:
  - docker:dind

e2e-tests:
  stage: test
  image: node:18
  services:
    - name: $DONOBU_IMAGE
      alias: donobu
      variables:
        OPENAI_API_KEY: $OPENAI_API_KEY
  
  before_script:
    - apt-get update && apt-get install -y curl jq
    - npm install
    
    # Wait for Donobu to be ready
    - |
      until curl -f http://donobu:31000/api/ping; do
        echo "Waiting for Donobu..."
        sleep 2
      done
    
    # Configure GPT
    - |
      curl -X POST http://donobu:31000/api/gpt-configs/gitlab-config \
        -H "Content-Type: application/json" \
        -d "{
          \"type\": \"OPENAI\",
          \"apiKey\": \"$OPENAI_API_KEY\",
          \"modelName\": \"gpt-4\"
        }"
  
  script:
    - npm run test:e2e
  
  artifacts:
    when: on_failure
    paths:
      - screenshots/
      - videos/
      - test-reports/
    expire_in: 1 week
  
  only:
    - main
    - develop
    - merge_requests

parallel-tests:
  stage: test
  parallel: 3
  script:
    - |
      case $CI_NODE_INDEX in
        1) npm run test:smoke ;;
        2) npm run test:regression ;;
        3) npm run test:integration ;;
      esac

GitLab Multi-Environment Testing

# .gitlab-ci.yml - Multi environment
.donobu_test_template: &donobu_test
  script:
    - |
      flow_id=$(curl -sf -X POST http://donobu:31000/api/flows \
        -H "Content-Type: application/json" \
        -d "{
          \"name\": \"$TEST_NAME\",
          \"targetWebsite\": \"$TARGET_URL\",
          \"overallObjective\": \"$TEST_OBJECTIVE\"
        }" | jq -r '.id')
      # Poll so the job result reflects the flow result
      while true; do
        state=$(curl -s http://donobu:31000/api/flows/$flow_id | jq -r '.state')
        if [ "$state" = "SUCCESS" ]; then break; fi
        if [ "$state" = "FAILED" ]; then
          curl -s http://donobu:31000/api/flows/$flow_id | jq '.result'
          exit 1
        fi
        sleep 5
      done

test:staging:
  <<: *donobu_test
  variables:
    TARGET_URL: "https://staging.example.com"
    TEST_NAME: "Staging E2E Tests"
    TEST_OBJECTIVE: "Test the main user journeys on staging"
  environment:
    name: staging
  only:
    - develop

test:production:
  <<: *donobu_test
  variables:
    TARGET_URL: "https://example.com"
    TEST_NAME: "Production Smoke Tests"
    TEST_OBJECTIVE: "Smoke test critical user journeys in production"
  environment:
    name: production
  only:
    - main
  when: manual

Azure DevOps Integration

Azure Pipelines YAML

# azure-pipelines.yml
trigger:
- main
- develop

pool:
  vmImage: 'ubuntu-latest'

variables:
  DONOBU_API_URL: 'http://localhost:31000/api'

# Azure Pipelines attaches service containers through container resources,
# referenced by alias in the (here implicit) job-level services mapping.
resources:
  containers:
    - container: donobu
      image: donobu/studio:latest
      ports:
        - 31000:31000
      env:
        OPENAI_API_KEY: $(OPENAI_API_KEY)

services:
  donobu: donobu

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '18.x'
  displayName: 'Install Node.js'

- script: |
    npm install
  displayName: 'Install dependencies'

- script: |
    until curl -f $(DONOBU_API_URL)/ping; do
      echo "Waiting for Donobu..."
      sleep 2
    done
  displayName: 'Wait for Donobu'

- script: |
    curl -X POST $(DONOBU_API_URL)/gpt-configs/azure-config \
      -H "Content-Type: application/json" \
      -d "{
        \"type\": \"OPENAI\",
        \"apiKey\": \"$(OPENAI_API_KEY)\",
        \"modelName\": \"gpt-4\"
      }"
  displayName: 'Configure Donobu'

- script: |
    npm run test:e2e
  displayName: 'Run E2E Tests'

- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/test-results.xml'
  condition: always()

- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: 'test-artifacts'
    artifactName: 'test-results'
  condition: failed()

CircleCI Integration

CircleCI Configuration

# .circleci/config.yml
version: 2.1

executors:
  node-executor:
    docker:
      - image: cimg/node:18.16
      - image: donobu/studio:latest
        environment:
          OPENAI_API_KEY: ${OPENAI_API_KEY}

jobs:
  e2e-tests:
    executor: node-executor
    steps:
      - checkout
      - run:
          name: Install dependencies
          command: npm install
      - run:
          name: Wait for Donobu
          command: |
            until curl -f http://localhost:31000/api/ping; do
              echo "Waiting for Donobu..."
              sleep 2
            done
      - run:
          name: Configure Donobu
          command: |
            curl -X POST http://localhost:31000/api/gpt-configs/circleci-config \
              -H "Content-Type: application/json" \
              -d "{
                \"type\": \"OPENAI\",
                \"apiKey\": \"$OPENAI_API_KEY\",
                \"modelName\": \"gpt-4\"
              }"
      - run:
          name: Run tests
          command: npm run test:e2e
      - store_artifacts:
          path: test-artifacts
      - store_test_results:
          path: test-results

workflows:
  version: 2
  test-and-deploy:
    jobs:
      - e2e-tests:
          context: donobu-testing

Best Practices for CI/CD Integration

1. Environment Management

  • Use separate configurations for different environments (see the sketch after this list)
  • Manage secrets securely through CI/CD platform secret management
  • Implement environment-specific test suites
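
For example, each environment can be given its own model configuration and environment variables through the API, so a pipeline only needs to pick a name. A minimal sketch, assuming ENV_NAME, BASE_URL, and OPENAI_API_KEY are supplied by the pipeline:

#!/usr/bin/env bash
# Register a per-environment model config and base URL in Donobu.
set -euo pipefail

API="http://localhost:31000/api"

curl -sf -X POST "$API/gpt-configs/${ENV_NAME}-config" \
  -H "Content-Type: application/json" \
  -d "{\"type\": \"OPENAI\", \"apiKey\": \"$OPENAI_API_KEY\", \"modelName\": \"gpt-4\"}"

curl -sf -X POST "$API/env/BASE_URL" \
  -H "Content-Type: application/json" \
  -d "{\"value\": \"$BASE_URL\"}"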

2. Resource Management

  • Use appropriate timeouts for different test types (a timeout sketch follows this list)
  • Implement proper cleanup procedures
  • Monitor resource usage and costs
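
One way to enforce a budget is to wrap the polling loop in a hard timeout and always remove the container afterwards. A sketch, assuming the polling loop shown earlier has been factored into a hypothetical poll-flow.sh helper and that TEST_SUITE and FLOW_ID are set by earlier steps:

#!/usr/bin/env bash
# Give smoke tests 5 minutes and regressions 30, and always clean up the container.
set -euo pipefail

trap 'docker rm -f donobu-test > /dev/null 2>&1 || true' EXIT

case "${TEST_SUITE:-smoke}" in
  smoke)      LIMIT=300  ;;
  regression) LIMIT=1800 ;;
  *)          LIMIT=600  ;;
esac

# poll-flow.sh is the polling loop from earlier, factored into its own script
timeout "$LIMIT" ./poll-flow.sh "$FLOW_ID"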

3. Parallel Execution

  • Run independent tests in parallel (sketched after this list)
  • Use matrix builds for cross-browser testing
  • Implement proper test isolation
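
Beyond matrix builds, independent flows can also be triggered concurrently against the API and awaited together. A sketch; the sites are placeholders and poll-flow.sh is the same hypothetical polling helper as above:

#!/usr/bin/env bash
# Start several independent flows at once, then wait for all of them.
set -euo pipefail

API="http://localhost:31000/api"
pids=()

for site in "https://staging.example.com" "https://staging.example.com/admin"; do
  (
    flow_id=$(curl -sf -X POST "$API/flows" \
      -H "Content-Type: application/json" \
      -d "{\"name\": \"Parallel test of $site\", \"targetWebsite\": \"$site\", \"overallObjective\": \"Test main user journey\"}" \
      | jq -r '.id')
    ./poll-flow.sh "$flow_id"
  ) &
  pids+=($!)
done

# Fail the step if any background flow failed
for pid in "${pids[@]}"; do wait "$pid"; done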

4. Failure Handling

  • Capture comprehensive debugging information
  • Implement retry mechanisms for flaky tests (see the retry sketch after this list)
  • Send notifications for test failures
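
A small retry wrapper keeps a genuinely flaky flow from failing the whole pipeline while still surfacing persistent failures. A minimal sketch, where run-flow.sh is a hypothetical wrapper around the create-and-poll calls shown earlier:

#!/usr/bin/env bash
# Retry a flaky flow up to MAX_ATTEMPTS times before declaring failure.
set -euo pipefail

MAX_ATTEMPTS=3

for attempt in $(seq 1 "$MAX_ATTEMPTS"); do
  if ./run-flow.sh; then   # hypothetical wrapper: create the flow and poll it
    echo "Flow passed on attempt $attempt"
    exit 0
  fi
  echo "Attempt $attempt failed; retrying..." >&2
  sleep 10
done

echo "Flow failed after $MAX_ATTEMPTS attempts" >&2
exit 1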

5. Reporting and Metrics

  • Integrate with existing monitoring and alerting systems
  • Track test execution metrics and trends
  • Generate comprehensive test reports (a reporting sketch follows this list)
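
Because the flow record returned by the API is plain JSON, a short script can extract the fields you care about and feed them to whatever reporting system you already use. A sketch, assuming the state and result fields used in the Jenkins example; the metrics file name is a placeholder:

#!/usr/bin/env bash
# Append a one-line summary per flow for later trending, and save the full result.
set -euo pipefail

API="http://localhost:31000/api"
FLOW_ID="$1"

flow=$(curl -sf "$API/flows/$FLOW_ID")
state=$(echo "$flow" | jq -r '.state')

echo "$(date -u +%FT%TZ) flow=$FLOW_ID state=$state" >> donobu-metrics.log
echo "$flow" | jq '.result' > "flow-${FLOW_ID}-result.json"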

6. Security Considerations

  • Never expose API keys in logs (the sketch after this list shows one way)
  • Use secure methods for credential storage
  • Implement proper access controls
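
One easy leak to avoid is the API key appearing in shell traces or the process list when it is sent to the configuration endpoint; reading the request body from stdin keeps it out of both. A minimal sketch:

#!/usr/bin/env bash
# Send the model configuration without tracing the key or putting it in argv.
set -euo pipefail
set +x   # make sure shell tracing is off around secret handling

curl -sf -X POST "http://localhost:31000/api/gpt-configs/ci-config" \
  -H "Content-Type: application/json" \
  --data @- <<EOF
{"type": "OPENAI", "apiKey": "${OPENAI_API_KEY}", "modelName": "gpt-4"}
EOF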

By integrating Donobu into your CI/CD pipelines, you can ensure that your applications are thoroughly tested at every stage of development, from feature branches to production deployments.