Performance Benchmark CLI - Configuration Guide

Overview

The Performance Benchmark CLI is configured through appsettings.json located in the PerformanceBenchmarkCli directory. This guide documents all available configuration options.

Configuration File Structure

The configuration file has the following main sections:

{
  "Benchmark": {
    "Output": { },
    "Execution": { },
    "Reports": { },
    "Analysis": { }
  },
  "ConnectionStrings": { },
  "Logging": { }
}

Benchmark Configuration

Output Settings

Controls where and how benchmark results are saved.

{
  "Benchmark": {
    "Output": {
      "ResultsDirectory": "./Results",
      "KeepHistoryDays": 90,
      "BaselineFileName": "baseline.json"
    }
  }
}
Property Type Default Description
ResultsDirectory string "./Results" Directory where benchmark results are saved
KeepHistoryDays int 90 Number of days to retain old result files
BaselineFileName string "baseline.json" Name of the baseline file

Examples:

// Save results to a network share
"ResultsDirectory": "\\\\server\\benchmarks\\essert-mf"

// Keep results for 1 year
"KeepHistoryDays": 365

// Custom baseline name
"BaselineFileName": "production-baseline.json"

Execution Settings

Controls how benchmarks are executed.

{
  "Benchmark": {
    "Execution": {
      "DefaultIterations": 10,
      "DefaultWarmup": 3,
      "IncludeMemoryDiagnostics": true,
      "Suites": {
        "Connectivity": {
          "Enabled": true,
          "ThresholdMs": 100,
          "Description": "Database connection tests"
        },
        "Repository": {
          "Enabled": true,
          "ThresholdMs": 500,
          "Description": "CRUD operations and queries"
        },
        "Service": {
          "Enabled": true,
          "ThresholdMs": 100,
          "Description": "CRC calculation and caching"
        },
        "Scenario": {
          "Enabled": true,
          "ThresholdMs": 2000,
          "Description": "Complex workflows and scenarios"
        }
      }
    }
  }
}
Property Type Default Description
DefaultIterations int 10 Number of times each benchmark runs
DefaultWarmup int 3 Number of warmup iterations before measurements
IncludeMemoryDiagnostics bool true Collect memory allocation metrics

Iteration Guidelines:

  • Quick testing: DefaultIterations: 3 (1-2 minutes)
  • Development: DefaultIterations: 10 (5-10 minutes)
  • CI/CD: DefaultIterations: 5 (3-5 minutes)
  • Production baseline: DefaultIterations: 20 (15-20 minutes)
  • Detailed analysis: DefaultIterations: 50 (30-60 minutes)
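As a rough back-of-envelope, total wall-clock time scales with warmup plus measured iterations per benchmark. A minimal sketch of that estimate (the benchmark count and per-run cost below are illustrative assumptions, not values taken from the CLI):

```python
def estimated_minutes(iterations: int, warmup: int,
                      benchmark_count: int = 30,
                      avg_seconds_per_run: float = 1.0) -> float:
    """Rough wall-clock estimate: each benchmark runs warmup + measured iterations.

    benchmark_count and avg_seconds_per_run are hypothetical placeholders;
    measure your own suite to calibrate them.
    """
    total_runs = benchmark_count * (warmup + iterations)
    return total_runs * avg_seconds_per_run / 60

# DefaultIterations: 10, DefaultWarmup: 3 under the assumptions above
print(round(estimated_minutes(10, 3), 1))  # 6.5 (minutes)
```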

Suite Configuration:

Each benchmark suite can be individually configured:

Property Type Description
Enabled bool Whether this suite runs by default
ThresholdMs int Performance threshold in milliseconds
Description string Human-readable description

Performance Status Classification:

Based on the threshold, benchmarks are classified as:

  • Excellent: below 50% of threshold
  • Good: 50% to below 100% of threshold
  • Warning: 100% to below 200% of threshold
  • Critical: 200% of threshold or more
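The classification above amounts to a simple ratio check against the suite threshold. A minimal sketch (`classify` is a hypothetical helper for illustration, not part of the CLI):

```python
def classify(mean_ms: float, threshold_ms: float) -> str:
    """Map a benchmark's mean time to a status relative to its suite threshold."""
    ratio = mean_ms / threshold_ms
    if ratio < 0.5:
        return "Excellent"
    if ratio < 1.0:
        return "Good"
    if ratio < 2.0:
        return "Warning"
    return "Critical"

# e.g. a Repository benchmark at 180 ms against ThresholdMs: 500
print(classify(180, 500))  # Excellent (36% of threshold)
```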

Examples:

// Fast testing during development
"DefaultIterations": 5,
"DefaultWarmup": 2,
"IncludeMemoryDiagnostics": false

// Disable specific suites
"Suites": {
  "Connectivity": { "Enabled": false },
  "Repository": { "Enabled": true, "ThresholdMs": 250 }
}

// Stricter thresholds for production
"Suites": {
  "Repository": { "ThresholdMs": 300 },
  "Service": { "ThresholdMs": 50 }
}

Reports Settings

Controls report generation and formatting.

{
  "Benchmark": {
    "Reports": {
      "DefaultFormats": [ "html", "pdf", "json" ],
      "Html": {
        "IncludeCharts": true,
        "Theme": "dark",
        "Title": "Essert.MF Performance Benchmark Report"
      },
      "Pdf": {
        "IncludeDetailedTimings": true,
        "IncludeCharts": true,
        "PageSize": "A4",
        "Orientation": "Portrait"
      },
      "Markdown": {
        "IncludeSummaryTable": true,
        "IncludeDetailedResults": false,
        "MaxTableRows": 20
      }
    }
  }
}

Global Report Settings

Property Type Default Description
DefaultFormats array ["html", "pdf", "json"] Default report formats to generate

HTML Report Settings

Property Type Default Description
IncludeCharts bool true Include performance charts
Theme string "dark" Color theme ("dark" or "light")
Title string "Essert.MF Performance..." Report title

PDF Report Settings

Property Type Default Description
IncludeDetailedTimings bool true Include detailed timing tables
IncludeCharts bool true Include charts (increases file size)
PageSize string "A4" Page size ("A4", "Letter", "Legal")
Orientation string "Portrait" Page orientation ("Portrait" or "Landscape")

Markdown Report Settings

Property Type Default Description
IncludeSummaryTable bool true Include summary table at top
IncludeDetailedResults bool false Include all benchmark details
MaxTableRows int 20 Maximum rows in tables

Examples:

// Console-only output (no file reports)
"DefaultFormats": [ "console" ]

// Light theme for HTML
"Html": {
  "Theme": "light",
  "IncludeCharts": true
}

// Landscape PDF for wide tables
"Pdf": {
  "PageSize": "A4",
  "Orientation": "Landscape"
}

// Detailed markdown for documentation
"Markdown": {
  "IncludeSummaryTable": true,
  "IncludeDetailedResults": true,
  "MaxTableRows": 100
}

Analysis Settings

Controls performance analysis and regression detection.

{
  "Benchmark": {
    "Analysis": {
      "RegressionThresholdPercent": 10,
      "TrendAnalysisMinRuns": 5
    }
  }
}
Property Type Default Description
RegressionThresholdPercent double 10.0 Percentage increase to flag as regression
TrendAnalysisMinRuns int 5 Minimum runs needed for trend analysis

Threshold Guidelines:

  • Strict (production): 5% - Catches small performance degradations
  • Balanced (default): 10% - Good for most use cases
  • Lenient (development): 20% - Reduces noise during active development
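Regression detection boils down to comparing the percentage increase over the baseline against RegressionThresholdPercent. A minimal sketch (`is_regression` is a hypothetical helper, not the CLI's actual implementation):

```python
def is_regression(baseline_ms: float, current_ms: float,
                  threshold_percent: float = 10.0) -> bool:
    """Flag a regression when the mean time grows by more than the threshold."""
    change_percent = (current_ms - baseline_ms) / baseline_ms * 100
    return change_percent > threshold_percent

# Baseline 200 ms -> current 230 ms is a 15% increase
print(is_regression(200, 230))        # True  (15% > default 10%)
print(is_regression(200, 230, 20.0))  # False (15% <= lenient 20%)
```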

Examples:

// Strict regression detection for production
"RegressionThresholdPercent": 5.0

// Lenient for noisy test environments
"RegressionThresholdPercent": 25.0

// Require more data for trend analysis
"TrendAnalysisMinRuns": 10

Connection Strings

Database connection strings for all Essert.MF databases.

{
  "ConnectionStrings": {
    "ProcessDb": "Server=192.168.101.128;Database=db_process;User=Service;Password=***;",
    "StatisticsDb": "Server=192.168.101.128;Database=db_statistics;User=Service;Password=***;",
    "ChangelogsDb": "Server=192.168.101.128;Database=db_changelogs;User=Service;Password=***;",
    "EssertDb": "Server=192.168.101.128;Database=db_essert;User=Service;Password=***;",
    "ProductParameterDb": "Server=192.168.101.128;Database=db_productparameter;User=Service;Password=***;",
    "RobotsDb": "Server=192.168.101.128;Database=db_robots;User=Service;Password=***;",
    "WpcDb": "Server=192.168.101.128;Database=db_wpc;User=Service;Password=***;"
  }
}

Connection String Format

MySQL/MariaDB connection strings use the following format:

Server=<hostname_or_ip>;Database=<database_name>;User=<username>;Password=<password>;[Options]

Common Options:

Server=192.168.101.128;
Database=db_process;
User=Service;
Password=MyPassword;
Port=3306;
ConnectionTimeout=30;
SslMode=Required;

Best Practices:

  1. Use Environment Variables (Production)
{
  "ConnectionStrings": {
    "ProcessDb": "${ESSERT_PROCESS_DB_CONNECTION}"
  }
}
  2. Use Separate Configuration File (Development)

Create appsettings.Development.json:

{
  "ConnectionStrings": {
    "ProcessDb": "Server=localhost;Database=db_process;User=dev;Password=dev123;"
  }
}
  3. Use Secrets Manager (CI/CD)

Reference secrets in your pipeline and inject at runtime.

Logging Configuration

Controls application logging output.

{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft": "Warning",
      "Microsoft.EntityFrameworkCore": "Warning"
    }
  }
}
Log Level Description Use Case
Trace Most verbose Deep debugging
Debug Detailed information Development troubleshooting
Information General information Default for development
Warning Warning messages Default for production
Error Error messages only Minimal logging
Critical Critical errors only Minimal logging

Examples:

// Verbose logging for troubleshooting
"Logging": {
  "LogLevel": {
    "Default": "Debug",
    "Microsoft.EntityFrameworkCore": "Information"
  }
}

// Minimal logging for production
"Logging": {
  "LogLevel": {
    "Default": "Warning",
    "Microsoft": "Error"
  }
}

// Entity Framework query logging
"Logging": {
  "LogLevel": {
    "Microsoft.EntityFrameworkCore.Database.Command": "Information"
  }
}

Environment-Specific Configuration

The CLI supports environment-specific configuration files:

File Priority

  1. appsettings.json (base configuration)
  2. appsettings.{Environment}.json (environment overrides)
  3. Environment variables (highest priority)
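Assuming the CLI uses the standard .NET host configuration builder, individual settings can also be overridden through environment variables, with `__` as the hierarchical section separator (a sketch; verify against the CLI's host setup):

```shell
# Override nested settings without editing appsettings.json
export Benchmark__Execution__DefaultIterations=20
export Benchmark__Analysis__RegressionThresholdPercent=5.0

# Then run as usual:
# dotnet run -- run
```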

Usage

# Use Development configuration
export DOTNET_ENVIRONMENT=Development
dotnet run -- run

# Use Production configuration
export DOTNET_ENVIRONMENT=Production
dotnet run -- run

# Windows
set DOTNET_ENVIRONMENT=Production
dotnet run -- run

Example: appsettings.Production.json

{
  "Benchmark": {
    "Execution": {
      "DefaultIterations": 20,
      "DefaultWarmup": 5
    },
    "Analysis": {
      "RegressionThresholdPercent": 5.0
    }
  },
  "Logging": {
    "LogLevel": {
      "Default": "Warning"
    }
  }
}

Command-Line Overrides

Some configuration values can be overridden via command-line arguments:

# Override output directory
dotnet run -- run --output /path/to/results

# Override regression threshold
dotnet run -- compare --threshold 5

# Override iterations (future feature)
dotnet run -- run --iterations 20 --warmup 5

# Override formats
dotnet run -- report --format html,pdf

Configuration Validation

The CLI validates configuration on startup:

  • Connection strings are tested
  • Output directories are created if missing
  • Invalid threshold values are rejected
  • Missing required settings trigger errors

Validation Examples:

Error: Invalid RegressionThresholdPercent: -5. Must be > 0.
Error: ProcessDb connection string is missing.
Warning: ResultsDirectory does not exist. Creating: ./Results

Configuration Best Practices

Development

{
  "Benchmark": {
    "Execution": {
      "DefaultIterations": 5,
      "DefaultWarmup": 2,
      "IncludeMemoryDiagnostics": true
    },
    "Reports": {
      "DefaultFormats": [ "console", "json" ]
    }
  }
}

CI/CD

{
  "Benchmark": {
    "Execution": {
      "DefaultIterations": 10,
      "DefaultWarmup": 3,
      "IncludeMemoryDiagnostics": false
    },
    "Reports": {
      "DefaultFormats": [ "markdown", "json" ]
    },
    "Analysis": {
      "RegressionThresholdPercent": 10.0
    }
  }
}

Production Baseline

{
  "Benchmark": {
    "Execution": {
      "DefaultIterations": 50,
      "DefaultWarmup": 10,
      "IncludeMemoryDiagnostics": true
    },
    "Reports": {
      "DefaultFormats": [ "html", "pdf", "json", "markdown" ]
    },
    "Analysis": {
      "RegressionThresholdPercent": 5.0
    }
  }
}

Troubleshooting Configuration Issues

Connection Errors

Error: Unable to connect to database 'db_process'

Solution: Verify connection strings, network access, and credentials.

Invalid Configuration

Error: DefaultIterations must be greater than 0

Solution: Check configuration values are within valid ranges.

Missing Configuration

Warning: Using default value for ResultsDirectory: ./Results

Solution: Add explicit configuration values or use defaults.

File Permission Errors

Error: Unable to write to ResultsDirectory: ./Results

Solution: Check directory permissions and disk space.