
Create ansible script to set up and run benchmark in AWS (nice to have)


Goal: create an Ansible script to automate the AWS benchmarking that is currently being set up and run manually. This should both build Ansible familiarity and improve the repeatability of the tests.

Inputs:

  • AWS keys
  • region to use
  • instance type (optional, default c4.large)
  • optional boolean to use private IPs instead of public
  • benchmark git branch name (default master)
  • optional CLI args for the benchmark
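The inputs above could live in a small vars file. A possible sketch (variable names are illustrative, not from the issue; the AWS keys themselves would come from the usual `AWS_ACCESS_KEY_ID`/`AWS_SECRET_ACCESS_KEY` environment variables rather than being committed to a vars file):

```yaml
# vars.yml -- illustrative names for the inputs listed above
aws_region: us-east-1          # region to use
instance_type: c4.large        # optional, the default from the issue
keypair: my-existing-keypair   # EC2 keypair name ($selected_keypair)
use_private_ips: false         # optional boolean
benchmark_branch: master       # git branch of the benchmark repo
benchmark_cli_args: ""         # optional extra CLI args for the benchmark
```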

Steps performed:

Playbook 1: provisioning

  1. If the instances DON'T exist: create two instances of $type in $region with $selected_keypair, a 5 GB gp2 EBS volume (enough) for the benchmark host and 2 GB for the client, and a security group open on ports 22/8080/4443 (SSH/HTTP/HTTPS). Name them pycl-benchmark and pycl-benchmark-client, and store their private and public IPs.
  2. Run updates on both.
  3. Install git and docker on both.
  4. Clone the repo to /tmp on both and check out $benchmark_branch.
  5. On pycl-benchmark-client, install python, python-pip, and python-pycurl, then pip install the client libraries from requirements.txt.
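A rough sketch of how playbook 1 might look. Module and parameter names follow the amazon.aws collection; the AMI id, security group, and repo URL are placeholders that would need checking:

```yaml
# Playbook 1 (provisioning) -- illustrative sketch, not a tested playbook.
- hosts: localhost
  connection: local
  vars_files: [vars.yml]
  tasks:
    - name: Create benchmark host and client (idempotent if they exist)
      amazon.aws.ec2_instance:
        name: "{{ item.name }}"
        region: "{{ aws_region }}"
        key_name: "{{ keypair }}"
        instance_type: "{{ instance_type }}"
        image_id: ami-xxxxxxxx            # placeholder: pick a distro AMI
        security_group: pycl-benchmark-sg # assumed to open 22/8080/4443
        volumes:
          - device_name: /dev/sda1
            ebs: {volume_size: "{{ item.disk_gb }}", volume_type: gp2}
        state: running
        wait: true
      loop:
        - {name: pycl-benchmark, disk_gb: 5}
        - {name: pycl-benchmark-client, disk_gb: 2}
      register: ec2   # private/public IPs available in the result

- hosts: benchmark_hosts   # group populated via add_host or dynamic inventory
  become: true
  tasks:
    - name: Run updates and install git + docker
      ansible.builtin.package:
        name: [git, docker]
        state: latest
    - name: Clone the benchmark repo at the requested branch
      ansible.builtin.git:
        repo: https://github.com/svanoort/python-client-benchmarks.git
        dest: /tmp/python-client-benchmarks
        version: "{{ benchmark_branch | default('master') }}"
```

The client-only pip installs (python-pycurl, requirements.txt) would be an additional play scoped to pycl-benchmark-client.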

Playbook 2: benchmark run

  1. Check that the instances exist; fail if they don't. If they exist but aren't running in EC2, start them.
  2. On pycl-benchmark, start the docker service and launch the benchmark container (the image needs to be pushed to Docker Hub so it can be pulled).
  3. On pycl-benchmark-client, start the benchmark against the (public or private) IP of pycl-benchmark as target, with the option to pass CLI options through to the benchmark. Log the overall run to a logfile and write results to a CSV.
  4. When the benchmark is done, retrieve the log and the results to a local folder.
  5. Stop the instances (don't terminate them, though).
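Playbook 2 could be sketched along these lines (module names are from the amazon.aws collection; the benchmark script name, flags, and paths are illustrative):

```yaml
# Playbook 2 (benchmark run) -- illustrative sketch, not a tested playbook.
- hosts: localhost
  connection: local
  vars_files: [vars.yml]
  tasks:
    - name: Look up the provisioned instances
      amazon.aws.ec2_instance_info:
        region: "{{ aws_region }}"
        filters: {"tag:Name": "pycl-benchmark*"}
      register: info
    - name: Fail if provisioning never ran
      ansible.builtin.fail:
        msg: "Run the provisioning playbook first"
      when: info.instances | length < 2
    - name: Start the instances if they are stopped
      amazon.aws.ec2_instance:
        region: "{{ aws_region }}"
        filters: {"tag:Name": "pycl-benchmark*"}
        state: running

- hosts: pycl-benchmark-client
  tasks:
    - name: Run the benchmark, logging the run and writing a CSV
      ansible.builtin.shell: >
        python benchmark.py --url http://{{ target_ip }}:8080
        {{ benchmark_cli_args }} --output results.csv > run.log 2>&1
      args: {chdir: /tmp/python-client-benchmarks}
      # target_ip = public or private IP of pycl-benchmark, per use_private_ips
    - name: Pull the log and CSV back to the control machine
      ansible.builtin.fetch:
        src: "/tmp/python-client-benchmarks/{{ item }}"
        dest: "results/{{ item }}"
        flat: true
      loop: [run.log, results.csv]
```

Stopping the instances at the end would be one more `amazon.aws.ec2_instance` task with `state: stopped`.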

Note: running it WILL COST MONEY: call it around $2 for setup plus the first run, and $0.20-$0.40 per run after that. EBS alone is approximately $1.60/month with 8 GB volumes (2 x 8 GB @ $0.10 per provisioned GB per month), plus instance cost (say $0.20-$0.30). Perhaps we can under-provision storage to 2-4 GB per host to keep costs under $1? -- see how much of the volumes is actually used currently.
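A quick sanity check of the storage arithmetic above (the rates are the 2016 figures quoted in the issue, not current AWS pricing):

```python
# Back-of-envelope EBS cost from the note above (2016-era gp2 pricing).
GB_MONTH_RATE = 0.10  # $ per provisioned GB-month (gp2, 2016)

ebs_8gb = 2 * 8 * GB_MONTH_RATE    # two hosts at 8 GB each -> $1.60/month
ebs_small = 2 * 4 * GB_MONTH_RATE  # under-provisioned to 4 GB -> $0.80/month

print(round(ebs_8gb, 2))
print(round(ebs_small, 2))
```

At the 4 GB-per-host level, storage plus a $0.20-$0.30 instance run does land under $1, consistent with the note.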

svanoort avatar Feb 26 '16 15:02 svanoort

If a public IP is used, a real certificate could be obtained from Let's Encrypt to test the whole chain; not sure if it's worth it? (One setup among others: https://blog.heckel.xyz/2015/12/04/lets-encrypt-5-min-guide-to-set-up-cronjob-based-certificate-renewal/ ) In this case:

  • start the docker instance
  • get the certificate (the nginx configuration has to be modified for the ACME challenge)
  • reload nginx (by sending it the HUP signal)
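Those steps could be expressed as Ansible tasks along these lines (the certbot flags, webroot path, email, and container name are illustrative assumptions, not from the issue; nginx inside the container has to serve the webroot for the HTTP-01 challenge):

```yaml
# Illustrative sketch of the Let's Encrypt variant.
- name: Obtain a certificate via the HTTP-01 (webroot) challenge
  ansible.builtin.command: >
    certbot certonly --webroot -w /var/www/letsencrypt
    -d {{ public_dns_name }} -m admin@example.com --agree-tos -n

- name: Reload nginx inside the container by sending HUP
  ansible.builtin.command: docker kill --signal=HUP pycl-benchmark
```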

Another note, about requests: the pyopenssl package can be installed, which modifies the TLS behavior of requests.
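For reference, that opt-in happens through urllib3's contrib shim. A guarded sketch (it assumes nothing about which packages are installed, and the shim has been deprecated in recent urllib3 releases, so failures are tolerated):

```python
# Switch requests/urllib3 to pyOpenSSL-backed TLS when available.
# Everything is guarded: the contrib shim needs pyOpenSSL installed
# and is deprecated/removed in newer urllib3 versions.
try:
    from urllib3.contrib import pyopenssl
    pyopenssl.inject_into_urllib3()  # requests now negotiates TLS via pyOpenSSL
    backend = "pyopenssl"
except Exception:                    # urllib3 or pyOpenSSL not usable
    backend = "stdlib-ssl"

print(backend)
```

Which backend ends up active is exactly the kind of variable that would skew an HTTPS benchmark, which is why it is worth pinning down.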

dalf avatar Feb 26 '16 16:02 dalf

@dalf It's worth considering down the road, definitely (might be overkill for this case though). Initially my goal is getting a solid core benchmark for HTTP & HTTPS across different request sizes (setting up for this has eaten more than a reasonable amount of time already, unfortunately).

I don't want to go too far down the path of comparing different SSL options though, because... well libcurl has more than a few choices too. I've already had headaches around this in my PyRestTest library, which uses pyCurl internally.

svanoort avatar Feb 26 '16 21:02 svanoort