TechEmpower benchmark entry
Hi!
https://github.com/TechEmpower/FrameworkBenchmarks
Would you be interested in adding this framework to the TechEmpower benchmarks? I can do it if you don't mind. The reasoning is to have more C# entries, for a broader comparison between frameworks.
Hi,
I agree, having more C# projects in the TechEmpower benchmarks would be great. I'll look into this and send a sample if it doesn't take too much time.
Ok great, thank you.
I can open a PR to TechEmpower with plaintext and JSON tests if you need; I've done it with some frameworks already.
You can submit a PR to TechEmpower. Let me know once you're done, and I will add a link to the documentation.
Thanks for your time!
Ok, I started working on it. To follow the rules I just need to adjust two things; maybe you can give me a quick tip on how to:
- Disable gzip encoding
- Add a Date header (how to add a new header entry to the response, or maybe something like a middleware if the framework supports it)
The fork is here: https://github.com/MDA2AV/FrameworkBenchmarks/tree/add_simplew/frameworks/CSharp/simplew in case you want to make some adjustments.
I ran some local benchmarks (i9-14900K, 128 GB RAM @ 6000 MHz):
wrk -t16 -c512 -d5s http://localhost:8080/api/json
Result: ~2.3 million RPS

Running 5s test @ http://localhost:8080/api/json
  16 threads and 512 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   702.80us    4.41ms 111.71ms   98.39%
    Req/Sec   147.81k    43.27k  331.26k    73.35%
  11833299 requests in 5.09s, 1.25GB read
Requests/sec: 2322826.21
Transfer/sec:    250.32MB
wrk -t16 -c512 -d5s http://localhost:8080/api/plaintext -s pipeline.lua
where pipeline.lua is a Lua script for HTTP/1.1 pipelining.
Result: didn't work. Does SimpleW support HTTP/1.1 pipelining?
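For reference, on the wire HTTP/1.1 pipelining is just several requests written back-to-back on one connection before reading any response. A quick manual check (a sketch only; it assumes a server on localhost:8080 and nc/netcat installed):

```shell
# Two GET requests concatenated back-to-back: this is all pipelining is
# on the wire. A pipelining-capable server answers both, in order, on
# the same connection.
REQ='GET /api/plaintext HTTP/1.1\r\nHost: localhost\r\n\r\nGET /api/plaintext HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n'
printf "$REQ"
# Against a running server (hypothetical local instance):
#   printf "$REQ" | nc localhost 8080
```

A server without pipelining support typically answers only the first request, or drops the connection after it.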
Some points:
- gzip: there are quick tips to disable it
- headers: you can add them with Response.SetHeader(), but due to an underlying architectural choice of chronoxor (1, 2), you need to write the entire Response. It works, but that's not the way I want users to do it in SimpleW
- pipelining: not handled for now. Can you send me your pipeline.lua so I can use it as a test?
That's why I've been working on it for a couple of hours. I've rewritten some parts and it's cleaner now. I still need a few more hours to test for regressions or performance issues.
After that, I'll post the modifications here along with a new release, and you'll finally be able to complete the PR.
My pipeline.lua:
-- HTTP/1.1 pipelining script for wrk
-- Default pipeline depth; can be overridden from the command line
local pipeline_depth = 16
local pipelined_request = ""

-- init() runs once per thread, before requests start
function init(args)
    -- Allow "wrk ... -s pipeline.lua -- <depth>" to override the depth
    if args[1] then
        pipeline_depth = tonumber(args[1]) or pipeline_depth
    end
    local req = {}
    -- Headers table with keep-alive
    local headers = {}
    headers["Connection"] = "keep-alive"
    -- Build all pipelined requests
    for i = 1, pipeline_depth do
        req[i] = wrk.format("GET", wrk.path, headers, nil)
    end
    -- Concatenate all requests into a single string
    pipelined_request = table.concat(req)
end

-- Called for each request: return the whole pipelined batch at once
function request()
    return pipelined_request
end

-- Optional: custom response handler
function response(status, headers, body)
    -- Add custom response handling here if needed
end
And the shell script I usually use:
#!/usr/bin/env bash
# Usage: ./pipeline.sh [URL] [DEPTH]
URL=${1:-http://localhost:8080/api/plaintext}
DEPTH=${2:-16}
THREADS=${THREADS:-16}
CONNECTIONS=${CONNECTIONS:-512}
DURATION=${DURATION:-15s}
echo "Running pipelined plaintext benchmark:"
echo " URL: $URL"
echo " Depth: $DEPTH"
echo " Threads: $THREADS"
echo " Connections: $CONNECTIONS"
echo " Duration: $DURATION"
echo
wrk -t"$THREADS" -c"$CONNECTIONS" -d"$DURATION" --timeout 8 --latency \
-H 'Accept: text/plain,text/html;q=0.9,application/xhtml+xml;q=0.9,application/xml;q=0.8,*/*;q=0.7' \
-H 'Connection: keep-alive' \
"$URL" -s pipeline.lua -- "$DEPTH"
OK sounds great, I'll wait for the new release
I've just released SimpleW v16.1.0.
I've rewritten the documentation to explain how a response can be fully customized — there are several code snippets showing how to add headers, disable compression, and more.
So, to get the best results in your benchmark, use MakeResponse(), a versatile method for returning data.
public class BenchmarksController : Controller {

    [Route("GET", "/json")]
    public object Json() {
        return Response.MakeResponse(
            new { message = "Hello, World!" }, // object will be serialized
            // HTTP requires the Date header in RFC1123 format ("r"), in UTC
            addHeaders: new Dictionary<string, string>() { { "Date", DateTime.UtcNow.ToString("r") } }
            // compress parameter defaults to null, so no compression
        );
    }

    [Route("GET", "/plaintext")]
    public object Plaintext() {
        return Response.MakeResponse(
            "Hello, World!",
            "text/plain",
            addHeaders: new Dictionary<string, string>() { { "Date", DateTime.UtcNow.ToString("r") } }
            // compress parameter defaults to null, so no compression
        );
    }
}
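One more thought on the Date header: at millions of requests per second, formatting a DateTime on every call isn't free. A common trick in TechEmpower entries is to cache the RFC1123 string and refresh it once per second. A minimal sketch (CachedDate is a hypothetical helper, not a SimpleW API):

```csharp
using System;
using System.Threading;

// Hypothetical helper: caches the RFC1123-formatted Date header value
// and refreshes it once per second, instead of formatting a DateTime
// on every request.
public static class CachedDate
{
    private static string _value = DateTime.UtcNow.ToString("r");

    // Keep a reference to the timer so it is not garbage-collected.
    private static readonly Timer _timer = new Timer(
        _ => Volatile.Write(ref _value, DateTime.UtcNow.ToString("r")),
        null,
        TimeSpan.FromSeconds(1),
        TimeSpan.FromSeconds(1));

    public static string Now => Volatile.Read(ref _value);
}
```

The controller could then pass CachedDate.Now as the Date value; note that HTTP expects the Date header in RFC1123 format ("r"), in UTC, rather than ISO 8601 ("o").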
The pipelining will take more time. Maybe you can submit the PR without pipelining?
Let me know.
Ok looks fine now, thanks.
About the pipelined test: we can still submit, but it will either fall back to non-pipelined or produce a lot of errors. That's fine, we can improve it later; runs happen roughly every 10 days, and we can just update the entry once pipelining is fixed.
Edit: the PR is here https://github.com/TechEmpower/FrameworkBenchmarks/pull/10237 and has already passed all the tests, so it should be approved in the next few days. After approval it will start running in each test run, which has a typical cycle of ~10 days; you can see ongoing runs here: https://tfb-status.techempower.com/
Thanks, I will let you know once the pipelining is ready.
Well, I've been rewriting the server/session part from scratch for the last 3 weeks. It supports pipelining, but I still have some code to backport from the main line.
I'm closing the issue.