Dynamic Workloads

What this page helps you do

Use dynamic workloads when the traffic should change over time instead of staying flat. This is how you model ramps, bursts, uneven demand, or runtime-driven stops.

Who this is for

Engineers writing or reviewing scenario code in one of the supported SDKs.

Prerequisites

  • A scenario or runtime surface you want to wire correctly in code

By the end

You will have the exact SDK surface you need for this part of the runtime.

Use this page when

Use this reference when you already know the workflow and need the exact Dynamic Workloads API surface in code.

Visual guide

Sequence diagram showing how a LoadStrike workflow moves from setup to report output.
This page fits into the same setup, run, correlate, and report flow as the rest of the public LoadStrike runtime.

Guide

Adaptive Traffic Shape

Use InjectRandom and the Ramping* simulations when the workload should rise, fall, or vary over time instead of holding one steady shape from start to finish.

Scenario Weighting

Use WithWeight when clustered runs should give some scenarios a larger share of assignment or throughput than others.
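A minimal sketch of weighting two scenarios, assuming WithWeight takes a relative integer share (the scenario names, weights, and the parameterless LoadStrikeResponse.Ok() return shape here are illustrative, not prescriptive):

```csharp
using LoadStrike;

// Two scenarios in one clustered run. Weights are relative shares:
// with 70 and 30, "browse-catalog" receives roughly 70% of assignment.
var browse = LoadStrikeScenario.Create("browse-catalog", async context =>
{
    // ... scenario body elided ...
    return LoadStrikeResponse.Ok();
})
.WithWeight(70);

var checkout = LoadStrikeScenario.Create("checkout", async context =>
{
    // ... scenario body elided ...
    return LoadStrikeResponse.Ok();
})
.WithWeight(30);

LoadStrikeRunner.RegisterScenarios(browse, checkout).Run();
```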

Runtime Control

Use context.StopScenario or context.StopCurrentTest when the scenario itself needs to stop because of a runtime signal, not just because the original load profile ended.

SDK reference samples

Use these SDK samples to compare how Dynamic Workloads is exposed across the supported languages before you wire it into a full scenario.

If you run these examples locally, add a valid runner key before execution starts. Set it with WithRunnerKey("...") or the config key LoadStrike:RunnerKey.

Dynamic Workload

using System;
using System.Net.Http;
using System.Net.Http.Json;
using LoadStrike;

var httpClient = new HttpClient
{
    BaseAddress = new Uri("https://api.example.com")
};

var scenario = LoadStrikeScenario.Create("submit-orders", async context =>
{
    if (context.InvocationNumber >= 500)
    {
        context.StopScenario("submit-orders", "enough-orders");
    }

    var step = await LoadStrikeStep.Run<string>("POST /orders", context, async () =>
    {
        using var response = await httpClient.PostAsJsonAsync("/orders", new
        {
            orderId = $"ord-{context.InvocationNumber}",
            amount = 49.95m
        });

        return response.IsSuccessStatusCode
            ? LoadStrikeResponse.Ok<string>(statusCode: ((int)response.StatusCode).ToString())
            : LoadStrikeResponse.Fail<string>(statusCode: ((int)response.StatusCode).ToString());
    });

    return step.AsReply();
})
.WithLoadSimulations(
    LoadStrikeSimulation.InjectRandom(5, 15, TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(20)),
    LoadStrikeSimulation.RampingConstant(6, TimeSpan.FromSeconds(20))
);

LoadStrikeRunner.RegisterScenarios(scenario)
    .WithRunnerKey("rkl_your_local_runner_key")
    .Run();

Dynamic workload tools

InjectRandom(minRate, maxRate, interval, during)

Use this when the workload should fluctuate inside a bounded range instead of staying flat.

RampingInject(rate, interval, during)

Use this when request rate should climb gradually toward a target value.

RampingConstant(copies, during)

Use this when concurrent copies should rise gradually instead of starting at the full steady-state count immediately.

Multiple simulations in order

Pass multiple simulation definitions to WithLoadSimulations when the scenario should move through warmup, ramp, steady load, and cooldown phases.
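A phased shape could be composed from the simulations documented on this page, applied to the submit-orders scenario defined in the sample above. Sequential phase execution in argument order follows the description here; all rates and durations are placeholder values:

```csharp
// Warmup -> ramp -> steady load -> cooldown, one simulation per phase.
var shaped = scenario.WithLoadSimulations(
    // Warmup: a few concurrent copies, ramped in gently.
    LoadStrikeSimulation.RampingConstant(2, TimeSpan.FromSeconds(10)),
    // Ramp: climb the injection rate toward the steady target.
    LoadStrikeSimulation.RampingInject(50, TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(30)),
    // Steady load: bounded randomness around the target rate.
    LoadStrikeSimulation.InjectRandom(40, 60, TimeSpan.FromSeconds(1), TimeSpan.FromMinutes(2)),
    // Cooldown: ramp the rate back toward zero before the run ends.
    LoadStrikeSimulation.RampingInject(0, TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(20))
);
```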

Shared HTTP transports under spikes

When the workload issues HTTP requests, reuse one shared client, session, or dispatcher so connection pooling survives burst traffic without socket churn.
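One common way to get this in .NET is a single process-wide HttpClient held in a static field (the class name here is illustrative); the underlying handler maintains a connection pool, so burst traffic reuses sockets instead of opening new ones per invocation:

```csharp
using System;
using System.Net.Http;

// One shared client for the whole process. Creating an HttpClient per
// invocation would exhaust sockets under burst traffic; a single shared
// instance lets connection pooling absorb the spikes.
public static class SharedTransport
{
    public static readonly HttpClient Http = new HttpClient
    {
        BaseAddress = new Uri("https://api.example.com")
    };
}
```

Inside scenario bodies, call SharedTransport.Http rather than constructing a new client per invocation or per step.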

context.InvocationNumber

Useful for dynamic in-code branching, such as stopping after a certain number of executions or changing payload behavior later in the run.

context.StopScenario(...) / context.StopCurrentTest(...)

Lets the scenario react to runtime conditions instead of waiting only for the original load-shape duration to finish.
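A sketch of both stop calls reacting to runtime signals, following the call shapes used elsewhere on this page. The step.IsError check and the single-argument StopCurrentTest reason string are assumptions for illustration:

```csharp
var guarded = LoadStrikeScenario.Create("guarded-orders", async context =>
{
    var step = await LoadStrikeStep.Run<string>("POST /orders", context, async () =>
    {
        // ... request elided ...
        return LoadStrikeResponse.Ok<string>(statusCode: "200");
    });

    // Stop only this scenario once it has done enough work ...
    if (context.InvocationNumber >= 1000)
        context.StopScenario("guarded-orders", "quota-reached");

    // ... or stop the whole test when a runtime signal says to abort.
    // (step.IsError is an assumed property, shown here for illustration.)
    if (step.IsError)
        context.StopCurrentTest("downstream-unhealthy");

    return step.AsReply();
});
```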