Demo of frontend JS unit testing
Fixes #7152
What changes did you make?
- Selected Jest as the test framework
- Added "hfla_test" container that installs Jest and test dependencies, then runs all tests
- Made some minor refactors to two frontend JS files to extract pure-logic functions
- Added a Jest transformer that renders Jekyll front-matter and Liquid into pure JS
- Added two unit tests for pure-logic functions
- Added two integration tests for DOM/Jekyll-dependent scripts
- Added GHA workflow to run the tests on PRs targeting gh-pages
Why did you make the changes (we will use this info to test)?
Test Framework Selection: Jest
I landed on targeting Jest for this infrastructure. Compared to the other options I was considering (Mocha, Jasmine, Vitest), Jest still best meets the "widely used" criterion -- see here: https://npmtrends.com/jasmine-vs-jest-vs-mocha-vs-vitest
Over the course of this development, I came to the conclusion that we could probably use any of these frameworks successfully -- but Jest has some nice features already built in like code coverage, function mocking, and assertions that some other frameworks don't, and would require more dependencies to include.
Vitest was probably the next best choice. I read that Expunge Assist is transitioning from Jest to Vitest due to some React dependency issues, and Vitest is growing in popularity. However, we don't have that dependency issue, Jest is more ubiquitous, and we aren't using Vite, so I think Jest is still a preferable choice for us.
Testing Environment
I opted to create an additional container in our docker-compose.yml so that every dev will have access to the same testing environment. When this container is launched, docker pulls down node:20-alpine, installs Jest and the other libraries I used in the tests, and then runs all Jest tests.
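For illustration, a minimal sketch of what such a service could look like in docker-compose.yml -- the service name matches this PR, but the working directory, volume mapping, and command here are assumptions, not necessarily the exact config in the diff:

```yaml
services:
  hfla_test:
    image: node:20-alpine
    working_dir: /srv          # assumed mount point
    volumes:
      - .:/srv                 # mount the repo so tests see the real source
    command: sh -c "npm ci && npx jest"
```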
The other notable thing in the Jest "environment" setup is the transformer, jekyll-js-transformer.js. Our frontend JS scripts use frontmatter and LiquidJS statements that are rendered away in Jekyll, but that means we can't import them as-is into JS tests without hitting syntax issues. I debated just using the Jekyll build output, but it's slow and doesn't play nicely with CI, so I opted to leverage Jest's "transformer" functionality. This transformer runs on any files that match the pattern described in the jest.config.js file, then essentially performs a render of those files using the real Jekyll data in the repo. Fortunately, the CJS-syntax-transformer works on both CJS and ESM targets!
Not being able to dynamically adjust the data in the render was a pain point for me, but ultimately all the solutions I came up with to allow for injecting custom test data (rather than the actual data) were super complicated, fragile, and slow, so I abandoned that effort. I think time is better spent refactoring Liquid/etc. out of logic that needs to be tested to that degree.
Code Updates
There are two things that I think need to be done to the frontend JS scripts to make them more testable -- and I did some of that in my "examples":
- Files that use ESM syntax should have the extension `.mjs`. Since we use both CJS and ESM in our frontend scripts, the test framework (and test writers!) need a way to know which syntax a target file uses.
- Functions that need to be unit tested need to be exported by the source so they can be imported by the test. The way I went about this was by creating separate `-utils` files containing the exported functions, which the original sources then import, so that I didn't change the public API of the original sources. However, we could also do something like a conditional export of those functions when in a "test mode" or something... so I'm very open to opinions on that.
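As a sketch of that `-utils` extraction pattern -- function and file names below are illustrative, not the actual repo code:

```javascript
// --- current-projects-utils.js (hypothetical): pure logic lives here ---
function normalizeStatus(status) {
  // Pure function: no DOM access, no globals, trivial to unit test.
  return String(status).trim().toLowerCase();
}
module.exports = { normalizeStatus };

// --- current-projects.js would then consume it, keeping its public API unchanged ---
// const { normalizeStatus } = require('./current-projects-utils');
// ...existing DOM/event code keeps calling normalizeStatus(...) as before...
```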
Tests
The scope of "unit testing" varies a lot depending on who you ask. I decided to break the potential scope of useful and relevant unit tests into two levels:
- "Unit" Tests: the unit-under-test in each test is an individual function, and test files should contain all the tests for the relevant functions in a given source file. Functions in this category should ideally not interact with the DOM or have any state or global side-effects. I'd also probably prefer if they didn't have Liquid or Frontmatter since that adds some dependency outside the function itself.
- "Integration" Tests: the unit-under-test is a complete script as it would be loaded in an HTML file. Test files should focus on an individual source file, and (if needed) provide a simulated DOM and expect that the source will be translated with real Jekyll data.
I provided two examples of Unit Tests:
- `current-projects-utils.unit.test.js` to show an example of unit testing a set of fairly complicated functions from a CJS-syntax script
- `vrms-events-utils.unit.test.mjs` to show basic testing of an ESM-syntax unit script
I also created two examples of Integration Tests:
- `hamburger-nav.intg.test.js` to show testing a script that interfaces heavily with the DOM
- `vrms-events.intg.test.mjs` to show testing a script that depends heavily on Jekyll-injected content
I'm not ready to claim that these scripts are completely sufficient, but they did produce ~100% branch and statement coverage on the target files, so I think they are representative.
Running Tests as a Developer
The tests run automatically when you launch the docker container. This can be done with the following command:
docker compose up hfla_test
Expand for a screenshot of a test run. (Darn that last branch I can't test!)
Running Tests as CI
In addition to the developer environment, I've added the Jest call to the workflow that triggers on PR opening and updates targeting gh-pages, so the tests will run in those cases as well. The workflow steps are basically identical to the steps in setting up the new docker container -- install npm, install dependencies, run the tests.
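A hedged sketch of what that workflow looks like -- the trigger and steps follow the description above, but the file name, action versions, and exact commands are assumptions, not a copy of the workflow in this PR:

```yaml
name: Frontend JS tests
on:
  pull_request:
    branches: [gh-pages]
jobs:
  jest:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci     # install dependencies
      - run: npx jest   # run all tests
```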
The tests are run on the source branch rather than the target branch, so I hope they run on this PR!
Currently Known Gotchas
- With the current configuration, we pull down node:20-alpine and install the dependencies each time the tests are re-run. Some caching happens, but it still feels like a lot of unnecessary work. It would probably be easier to have a step that builds a docker image from node:20-alpine with our dependencies installed, and have the container use that image... but that felt out of scope. Still, I think there's room for optimization here.
- Mapping coverage "Uncovered Line #s" to the actual file lines is a little wonky. Jest is reporting the line numbers from the transformed file, which in this setup is never actually visible to the user, so you have to do some reasoning to match them to source file line numbers. This could probably be helped by storing the transformed files somewhere accessible to users, OR by having the transformer emit a source map and handing that to Jest, but for now that's the situation.
Summary
I'm VERY interested in differing opinions or thoughts on how this can be better. I'm not married to really anything I've done here, and I think there are several ways it can be done differently -- and I'm super happy to change or re-do things. I'd also understand if this PR can't be merged as-is because it has several not-necessarily-trivial changes to source code.
All that said, I think this framework is pretty functional and I'm stoked to share it. I think that having some tests in place will make things like refactoring a lot easier, since we'll have the behavior already codified and an instant way to see success/failure. (I'm already eyeing that createFilter function in current-project-utils.unit.test.js).
Thanks for taking a look.
CodeQL Alerts
After the PR has been submitted and the resulting GitHub actions/checks have been completed, developers should check the PR for CodeQL alert annotations.
Check the PR's comments. If present on your PR, the CodeQL alert will look similar to the one shown.
Please let us know that you have checked for CodeQL alerts. Please do not dismiss alerts.
- [ ] I have checked this PR for CodeQL alerts and none were found.
- [x] I found CodeQL alert(s), and (select one):
- [x] I have resolved the CodeQL alert(s) as noted
- [x] I believe the CodeQL alert(s) is a false positive (Merge Team will evaluate)
- [ ] I have followed the Instructions below, but I am still stuck (Merge Team will evaluate)
Instructions for resolving CodeQL alerts
If CodeQL alert/annotations appear, refer to How to Resolve CodeQL alerts.
In general, CodeQL alerts should be resolved prior to PR reviews and merging.
Screenshots of Proposed Changes To The Website (if any, please do not include screenshots of code changes)
- No visual changes to the website
Want to review this pull request? Take a look at this documentation for a step-by-step guide!
From your project repository, check out a new branch and test the changes.
git checkout -b ryanfkeller-frontend-unit-tests gh-pages
git pull https://github.com/ryanfkeller/website.git frontend-unit-tests
One additional comment -- setting up the framework to enable the demos required in the issue was pretty complicated. I think that this issue was quite a bit bigger than 2pts. For reference, the research, experimentation, and implementation of this PR took me somewhere around 30-40h. I'd say at least 20h of that was strictly necessary. The sister task #7151 is likely to be similarly challenging and should maybe be scaled up.
@ryanfkeller I updated the original issue size and complexity to reflect the actual time it took, thanks for bringing that up. Looks like Will is assigned to #7151 so hopefully he'll have a head start from looking at your work here.