PowerShell for a Quick and Dirty Load Test

I needed a quick way to run a simple load test on an air-gapped remote server, where it wasn’t possible to install regular load-testing tools. In a Windows Server environment, PowerShell comes built-in, so I thought I’d see how much mileage I could get out of it.

Measure an HTTP request #

$url = 'https://target-server/'
Measure-Command { Invoke-RestMethod $url }

Gives me some basic timings for the request (remembering that this includes things like client-side processing, network latency and remote server time). Still handy as a rough “finger in the wind” read on how fast the server responds:

Days              : 0
Hours             : 0
Minutes           : 0
Seconds           : 0
Milliseconds      : 25
Ticks             : 251252
TotalDays         : 2.90800925925926E-07
TotalHours        : 6.97922222222222E-06
TotalMinutes      : 0.000418753333333333
TotalSeconds      : 0.0251252
TotalMilliseconds : 25.1252
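
If you only want the one number out of that, the TimeSpan object that Measure-Command returns exposes the totals directly, so you can grab just the milliseconds (a minimal snippet, reusing the $url from above):

$ms = (Measure-Command { Invoke-RestMethod $url }).TotalMilliseconds
Write-Output ('Response time (ms): ' + [math]::Round($ms, 2))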

But a single request isn’t much of a load test. I could run this a bunch of times by button-mashing my keyboard, but we can do better.

Measure a bunch of HTTP requests #

Let’s wrap it in a loop, fire off a bunch of requests and aggregate the results:

$url = 'https://target-server/'
$iterations = 1000
$result = @()   # per-request times in milliseconds

# time the whole run as well, so we can work out requests per second
$metaResult = Measure-Command {
  for ($i = 1; $i -le $iterations; $i++)
  {
    $result += (Measure-Command { Invoke-RestMethod $url }).TotalMilliseconds
  }
}

Write-Output ('Request count: ' + $iterations)
Write-Output ('Average time (ms): ' + [math]::Round(($result | Measure-Object -Average).Average, 2))
Write-Output ('Average RPS: ' + [math]::Round($iterations / $metaResult.TotalSeconds, 2))

Gives me something like this:

Request count: 1000
Average time (ms): 129.32
Average RPS: 7.73
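
The average hides outliers, though. Measure-Object can also hand back the minimum and maximum if you want a rough feel for the spread (a small add-on, assuming the $result array from the script above is still in the session):

$stats = $result | Measure-Object -Minimum -Maximum
Write-Output ('Min time (ms): ' + [math]::Round($stats.Minimum, 2))
Write-Output ('Max time (ms): ' + [math]::Round($stats.Maximum, 2))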

That’s a bit better, but 7 requests per second? That’s piddly.

We’re still firing these requests off in series one after the other. Can we simulate a bit of concurrency with a few clients sending requests?

Measure a bunch of HTTP requests in parallel #

We can! As of PowerShell 7, there’s a handy ForEach-Object -Parallel feature that saves messing around herding PSJobs. One gotcha: the parallel script block runs in its own runspace, so it can’t see variables from the calling scope directly; they need the $using: prefix (hence $using:url below).

$url = 'https://target-server/'
$iterations = 5000

# time the whole run, so we can work out requests per second
$metaResult = Measure-Command {

  $results = 1..$iterations | ForEach-Object -ThrottleLimit 5 -Parallel {
    $fail = $false
    $time = (Measure-Command {
        try {
          Invoke-RestMethod $using:url | Out-Null
        }
        catch {
          Write-Warning -Message ("StatusCode:" + $_.Exception.Response.StatusCode.value__ )
          $fail = $true
        }
      }).TotalMilliseconds

    # only emit the timing for requests that succeeded
    if (! $fail) { return $time }
  } | Measure-Object -AllStats
}

Write-Output ('Successful request count: ' + $results.Count + '/' + $iterations + ' (' + ($results.Count / $iterations * 100) + '%)')
Write-Output ('Average time (ms): ' + [math]::Round($results.Average, 2))
Write-Output ('Average RPS: ' + [math]::Round($iterations / $metaResult.TotalSeconds, 2))

Great! We’ve got a bunch more throughput:

Request count: 1000
Average time (ms): 724.39
Average RPS: 121.08
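
Because the results already went through Measure-Object -AllStats, the $results object carries more than the average; Minimum, Maximum and StandardDeviation are sitting there too, so a rough spread costs nothing extra:

Write-Output ('Max time (ms): ' + [math]::Round($results.Maximum, 2))
Write-Output ('Std dev (ms): ' + [math]::Round($results.StandardDeviation, 2))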

Notes:

  • Bump $iterations to launch more requests (it drives the 1..$iterations range operator)
  • Raise -ThrottleLimit 5 if you want more simultaneous requests
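
If the box is stuck on Windows PowerShell 5.1, ForEach-Object -Parallel isn’t available, so you’re back to herding PSJobs by hand. A rough sketch of the same idea with Start-Job (the $clients and $perClient numbers are just illustrative):

$url = 'https://target-server/'
$clients = 5
$perClient = 200

# one background job per simulated client, each firing requests in series
$jobs = 1..$clients | ForEach-Object {
  Start-Job -ArgumentList $url, $perClient -ScriptBlock {
    param($url, $count)
    1..$count | ForEach-Object {
      (Measure-Command { Invoke-RestMethod $url }).TotalMilliseconds
    }
  }
}

$times = $jobs | Wait-Job | Receive-Job
$jobs | Remove-Job

Write-Output ('Request count: ' + $times.Count)
Write-Output ('Average time (ms): ' + [math]::Round(($times | Measure-Object -Average).Average, 2))

Each Start-Job spawns a whole new PowerShell process, so it’s heavier and clunkier than the runspace-based -Parallel, but it works with nothing extra installed.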