Benchmark time to schedule a workload

Open dgrnbrg opened this issue 10 years ago • 1 comments

This will give us an idea of how long it should take to start some number of jobs, of various sizes.

The motivation is to understand how long it should take to launch a Spark cluster, so that we can figure out how multitenancy affects this, and if something special is needed.

dgrnbrg avatar Sep 11 '15 23:09 dgrnbrg

There are three ways to do this:

  • On a real, large cluster (most accurate, but very hard to set up)
  • Within the simulator (this would require making the APIs match the offer pipeline and simulating it in real time)
  • Writing a model and testing with that (least accurate, but probably the easiest approach)
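As a starting point for the third option, here is a minimal sketch of what such a model might look like. It assumes a Mesos-style offer cycle: offers arrive at a fixed interval, and each cycle can place at most a fixed number of tasks. The function name, parameters, and the constant-rate assumption are all hypothetical, not anything Cook currently implements — a real model would need to account for offer fragmentation, preemption, and multitenant contention.

```python
import math

def time_to_schedule(num_tasks: int,
                     tasks_per_cycle: int,
                     offer_interval_s: float) -> float:
    """Estimate seconds to launch `num_tasks` tasks, assuming a
    steady offer stream where each offer cycle arrives every
    `offer_interval_s` seconds and can place up to
    `tasks_per_cycle` tasks (a deliberately crude model)."""
    cycles_needed = math.ceil(num_tasks / tasks_per_cycle)
    return cycles_needed * offer_interval_s

# Example: a 100-executor Spark cluster, 10 placements per cycle,
# offers every 5 seconds -> 10 cycles -> 50 seconds.
print(time_to_schedule(100, 10, 5.0))
```

Even a model this crude would let us ask how the estimate degrades when `tasks_per_cycle` shrinks under multitenancy, which is the question motivating this issue.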

dgrnbrg avatar Sep 23 '15 17:09 dgrnbrg