
Error handling in custom benchmark

Open jikim-msft opened this issue 1 year ago • 1 comments

Description

#AB12925

Found a bug where the benchmarkCustom() function does not throw an error when the benchmark body fails.

This PR makes the following changes:

  • Adds try-catch logic to the benchmarkCustom() function.
  • Adds an error event that benchmarkCustom() emits to the MochaReporter.
  • Enables --verbose on the test:customBenchmarks script for better logging.

Result

With the changes above, the following code now surfaces an error on the console:

			benchmarkCustom({
				only: true,
				type: BenchmarkType.Measurement,
				title: `SharedTree`,
				run: async (reporter) => {
					throw new Error("This error is INTENTIONAL.");
				},
			});

Screenshot 2024-08-22 at 16 55 35

jikim-msft avatar Aug 16 '24 23:08 jikim-msft

@fluid-example/bundle-size-tests: +245 Bytes
Metric Name             | Baseline Size | Compare Size | Size Diff
aqueduct.js             | 461.15 KB     | 461.18 KB    | +35 Bytes
azureClient.js          | 559.19 KB     | 559.24 KB    | +49 Bytes
connectionState.js      | 680 Bytes     | 680 Bytes    | No change
containerRuntime.js     | 261.99 KB     | 262 KB       | +14 Bytes
fluidFramework.js       | 399.76 KB     | 399.77 KB    | +14 Bytes
loader.js               | 134.26 KB     | 134.28 KB    | +14 Bytes
map.js                  | 42.39 KB      | 42.39 KB     | +7 Bytes
matrix.js               | 146.56 KB     | 146.56 KB    | +7 Bytes
odspClient.js           | 526.47 KB     | 526.52 KB    | +49 Bytes
odspDriver.js           | 97.72 KB      | 97.74 KB     | +21 Bytes
odspPrefetchSnapshot.js | 42.78 KB      | 42.79 KB     | +14 Bytes
sharedString.js         | 163.26 KB     | 163.26 KB    | +7 Bytes
sharedTree.js           | 390.27 KB     | 390.28 KB    | +7 Bytes
Total Size              | 3.3 MB        | 3.3 MB       | +245 Bytes

Baseline commit: b7e4e1e71add4e695c715122aac6afde13403c29

Generated by :no_entry_sign: dangerJS against 3f2a5c19d5d5779e6c7b5a3f4bb78383d61e01bc

msfluid-bot avatar Aug 23 '24 01:08 msfluid-bot

/*!
 * Copyright (c) Microsoft Corporation and contributors. All rights reserved.
 * Licensed under the MIT License.
 */

"use strict";

const getFluidTestMochaConfig = require("@fluid-internal/mocha-test-setup/mocharc-common");

const packageDir = __dirname;
const config = getFluidTestMochaConfig(packageDir);

// TODO: Revert this
const spec =
	config["spec"] !== undefined
		? Array.isArray(config["spec"])
			? config["spec"]
			: [config["spec"]] // If a string, wrap it in an array so entries can be pushed onto it
		: []; // If undefined, use an empty array

spec.push("--perfMode");

module.exports = {
	...config,
	spec: spec,
};
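The spec normalization in the config above can be sketched in isolation; `normalizeSpec` is a hypothetical helper name used here only to show the same undefined/string/array handling:

```javascript
// Normalize a mocharc "spec" value to an array so extra entries can be
// pushed onto it, mirroring the nested ternary in the config above.
function normalizeSpec(spec) {
	if (spec === undefined) {
		return []; // no spec configured yet
	}
	return Array.isArray(spec) ? spec : [spec]; // wrap a single string
}
```

For example, `normalizeSpec("dist/test/**/*.js")` yields `["dist/test/**/*.js"]`, which can then safely receive additional entries via push.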
  1. Add --perfMode to .mocharc.cjs's spec and run pnpm run test:mocha
Screenshot 2024-08-29 at 14 29 37
  2. Add the --perfMode flag to the script and run pnpm run test:mocha -- --perfMode
Screenshot 2024-08-29 at 14 31 39

I added logging to ensure that the script and .mocharc BOTH work in --perfMode:

if (isInPerformanceTestingMode) {
    console.log(`PERFMODE Benchmark error: ${benchmarkError.error}`);
    return;
}
  3. pnpm run test:mocha:verbose -- --perfMode
Screenshot 2024-08-29 at 15 12 57
  4. "test:mocha": "cross-env FLUID_TEST_VERBOSE=1 mocha \"dist/test/**/*.js\""
Screenshot 2024-08-29 at 15 13 41

jikim-msft avatar Aug 29 '24 21:08 jikim-msft

With the change in the Reporter.ts, errors are correctly logged using the table.

Screenshot 2024-09-03 at 16 59 33

jikim-msft avatar Sep 04 '24 00:09 jikim-msft

> With the change in the Reporter.ts, errors are correctly logged using the table.

Awesome! Can you confirm that things look right if there's a mix of successful and failing tests in a given suite? The successful test should still get the custom data printed, and I'd like to see that the failed tests that won't report those columns still look correct and don't affect the rendering of the successful tests.

alexvy86 avatar Sep 04 '24 16:09 alexvy86

Screenshot 2024-09-04 at 10 44 41

benchmarkCustom() is now able to report all the test results without crashing.

https://github.com/microsoft/FluidFramework/pull/22245#issuecomment-2327650099

Also, following up on the comment above, I verified that the error messages from --perfMode and non-perfMode differ, since the events get emitted at different times:

--perfMode: return; non-perfMode: throw the error.

Both emit the benchmark end event before the isInPerformanceTestingMode check runs.
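The described branching could be sketched like this (`handleBenchmarkError` is an illustrative name, and `isInPerformanceTestingMode` is passed as a plain boolean parameter here rather than read from the real module):

```javascript
// Sketch: after the "benchmark end" event has been emitted in both modes,
// handle a captured benchmark error. In perf mode, log and return so the
// run continues; otherwise rethrow so Mocha fails the test.
function handleBenchmarkError(benchmarkError, isInPerformanceTestingMode) {
	if (isInPerformanceTestingMode) {
		console.log(`PERFMODE Benchmark error: ${benchmarkError.error}`);
		return "logged";
	}
	throw benchmarkError.error;
}
```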

jikim-msft avatar Sep 04 '24 17:09 jikim-msft

pnpm run test

image

pnpm run test:customBenchmarks

image

pnpm run test:customBenchmarks -- --perfMode

image

jikim-msft avatar Sep 04 '24 20:09 jikim-msft

🔗 No broken links found! ✅

Your attention to detail is admirable.

linkcheck output


> [email protected] ci:linkcheck /home/runner/work/FluidFramework/FluidFramework/docs
> start-server-and-test ci:start 1313 linkcheck:full

1: starting server using command "npm run ci:start"
and when url "[ 'http://127.0.0.1:1313' ]" is responding with HTTP status code 200
running tests using command "npm run linkcheck:full"


> [email protected] ci:start
> http-server ./public --port 1313 --silent


> [email protected] linkcheck:full
> npm run linkcheck:fast -- --external


> [email protected] linkcheck:fast
> linkcheck http://localhost:1313 --skip-file skipped-urls.txt --external

Crawling...

Stats:
  378023 links
    2980 destination URLs
       2 URLs ignored
       0 warnings
       0 errors


github-actions[bot] avatar Sep 04 '24 23:09 github-actions[bot]

image

image

Verified that the errors are logging in both non-perfMode and perfMode.

jikim-msft avatar Sep 04 '24 23:09 jikim-msft