JavaScript test performance: getting the best out of Jest


In recent years Jest has established itself as the go-to testing framework for JavaScript and TypeScript development. It provides a complete toolkit (test runner, assertion library, mocking library, code coverage and more) out of the box, and requires zero or minimal configuration.

Jest tests are executed in a Node.js runtime, and jsdom (also included) is used to simulate the browser API where required. This usually brings significant performance advantages in comparison to browser-based test runners such as Karma, the default runner for Angular.

Yet, having said that, we often come across Jest performance problems in larger projects, in particular those which are part of monorepos or have large dependencies. Such performance issues tend to be further exacerbated on Windows machines. In one project where we were recently called in to help out, the developers had to wait so long for their Jest tests (even individual tests) to execute that they avoided writing new ones wherever possible. This is a sure recipe for declining software quality and should, of course, be investigated: a fast feedback loop is one of the key benefits of automated testing and a necessary precondition for test-driven development.

We spent some time exploring the reasons for such performance problems and made some interesting findings. In this article I will share those findings and explain how we went about tackling the problems.

Test profiling

ndb is a Node.js debugging tool developed by the Google Chrome team. As Jest tests execute in a Node runtime, we can use this tool to analyse test performance and identify bottlenecks. I’ve prepared a simple demo project to illustrate how this works.

The demo project is based on a basic React app bootstrapped using the well-known Create React App. After installing the dependencies, we can start an ndb profiling session in a Chromium browser by running npm run profile. Then, in the scripts sidebar under the Sources tab, we can record profiling information for the various npm scripts (which contain sample test cases) by clicking the ⏺ button next to those scripts.
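The profile script itself is not reproduced here, but a minimal version (assuming ndb is installed as a dev dependency; the demo project's actual script may differ) might look like this in package.json:

```json
{
  "scripts": {
    "profile": "ndb ."
  }
}
```

Running `ndb .` opens the project in ndb's Chromium-based UI, where the npm scripts appear in the sidebar with run and record buttons.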

Test case 1 is a benchmark which runs the sample test against the unchanged sample application generated by Create React App.

As a general comment, please note that all test executions are significantly slower when profiled using ndb. On my Mac M1, for example, the benchmark test took 0.7 seconds when run normally, whereas it took 1.7 seconds to run with ndb profiling.

However, we can immediately see some interesting results from this benchmark test. Although the total test time was 1.7 seconds, only 109 ms was required for the actual test file (App.test.tsx) itself. The rest of the time was required for initialising the testing environment: Jest’s own initialisation code, instantiating the jsdom environment, and running Create React App’s default setup file (setupTests.ts).

The problem with module imports

When we analysed our customer’s tests, we soon noticed a recurring problem: the most significant impact on performance came from the number of modules imported by the tests in question. Jest creates a new module registry for each test file and has to crawl the whole dependency chain of every module imported by the test suite, even modules which are completely unused. It is easy for developers to overlook this fact, given that build tools such as webpack can eliminate dead code via tree-shaking — but tree-shaking only applies to production builds and does not help with test execution or development servers.

By way of example, we can reproduce this problem in test case 2 by importing and rendering a button component from the Material UI React component library. In this test case, we import the component via the library’s barrel file, i.e. the dependency’s root index.js file which re-exports all of the library’s public modules.
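The barrel-file import in test case 2 looks roughly like this (a sketch — the exact file contents in the demo project may differ):

```tsx
// AppWithMuiRootImport.test.tsx (sketch)
import { render, screen } from '@testing-library/react';
// Importing via the barrel file forces Jest to crawl the whole library:
import { Button } from '@mui/material';

test('renders a Material UI button', () => {
  render(<Button>Click me</Button>);
  expect(screen.getByRole('button')).toBeInTheDocument();
});
```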

In this case we see an immediate increase in test execution time from 1.7 to 2.7 seconds. The actual test file (AppWithMuiRootImport.test.tsx) took 1.1 seconds to execute, as opposed to 109 ms in the benchmark test. Nearly all of this additional time was associated with importing the entire Material UI component library, even though we are only using a single button from this library.

Barrel files are very common in JavaScript development (both for internal and public packages) and other users have reported similar problems when importing their dependencies from large packages in this way. Their effect on test performance is not mentioned in the Jest documentation and many developers are not aware of the potential consequences.

Tips for improving test performance

Avoiding barrel files

In test case 3 the same button component is rendered, but we import it directly from the package’s button folder instead of from the root index.js. This reduces the test time from 2.7 to 2 seconds, a saving of 0.7 seconds.
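The deep import changes only the import line, for example:

```tsx
// Importing directly from the component's folder avoids crawling the barrel file:
import Button from '@mui/material/Button';
```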

In this simple example we are only importing from one large dependency. But in more complicated examples in our customer’s project, where multiple large dependency chains were imported across a large number of modules under test, the cost of barrel-file imports compounded and had very significant consequences for test execution and development time.

It may therefore be beneficial to use deep imports, or to split internal packages’ barrel files into smaller chunks, in order to keep the cost of dependency imports manageable.

Mocking

As a quick win, we looked at ways of improving test performance by mocking dependencies:

Automatic mocking

Users of Jest will be aware of the ability to automock a module by calling jest.mock(moduleName). We can verify the effect of such automocking on test performance in test case 4 by mocking out Material UI.
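The automocking in test case 4 amounts to a single call at the top of the test file:

```tsx
// Jest hoists this call above the imports and replaces every export
// of the module with an automatically generated mock.
jest.mock('@mui/material');
```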

This has no effect on test performance, because Jest’s automocking still has to import the module in order to determine its exports and generate mocks for each of those exports.

Factory mocking

In test case 5 we repeat the mocking of Material UI, but this time by providing a factory function for mocking the imported module:
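A sketch of such a factory mock (the actual demo code may differ) could look like this:

```tsx
import type { ReactNode } from 'react';

// The factory's return value replaces the module entirely, so Jest never
// imports the real library and its dependency chain is never crawled.
jest.mock('@mui/material', () => ({
  // Only the exports actually used by the component under test need mocking.
  Button: ({ children }: { children?: ReactNode }) => <button>{children}</button>,
}));
```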

In this case we manage to completely eliminate the module crawling time for Material UI, and reduce the test execution time to that of the benchmark test, because the factory function avoids the need for Jest to carry out its own automocking.

This approach can help to speed up suitable test cases or improve test performance during the refactoring of legacy code.

Good code design

In many cases the presence of a large number of module imports is a symptom of bad code design. We encountered, for example, a number of tests that were ostensibly unit tests intended to test application logic, but where the code structure was so poor that running them triggered all manner of imports and side effects: the instantiation of state management stores, attempted network requests, thrown errors, and other unintended consequences.

Detailed advice on improving code design is beyond the scope of this article, but some areas that might need to be looked at include:

  • Focus on separating the concerns of your application by using architectural patterns such as hexagonal architecture and flux/redux concepts.
  • Where possible abstract external dependencies and use dependency injection principles, so that your code can be independently unit-tested.
  • Avoid large global helper objects which are directly imported and used throughout the application.
  • Avoid importing modules which carry out low-level configuration and initiate side effects. Such configuration should only be carried out at the application root.
  • Avoid circular dependencies and use tools such as dependency-cruiser to validate and visualise the dependency graph for your application.
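As an illustration of the dependency-injection point above, a unit can receive its collaborators instead of importing them, so its test never touches the heavy real implementation. The names here are hypothetical:

```typescript
// Hypothetical example: the gateway is injected rather than imported,
// so a unit test never pulls in the real implementation's dependency chain.
interface PriceGateway {
  fetchPrice(sku: string): number;
}

function totalPrice(gateway: PriceGateway, skus: string[]): number {
  return skus.reduce((sum, sku) => sum + gateway.fetchPrice(sku), 0);
}

// In a test, a tiny stub stands in for the real gateway:
const stubGateway: PriceGateway = { fetchPrice: () => 10 };
console.log(totalPrice(stubGateway, ['a', 'b'])); // → 20
```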

Node vs jsdom test environments

By default Create React App uses the jsdom test environment for running tests. As jsdom incurs some setup cost, non-UI test suites which do not require a DOM can often be sped up by setting the test environment to node.
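The switch can be made per file with Jest’s docblock pragma at the very top of the test file (or project-wide via the testEnvironment option in the Jest configuration):

```ts
/**
 * @jest-environment node
 */

// Non-UI tests in this file now run without jsdom's setup overhead.
```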

Jest is slow on Windows

Unfortunately Jest tests run significantly slower on Windows than on macOS and Linux due to slower crawling of the Windows file system. In one extreme example involving a huge number of imports, we found a test which took 5 seconds to run on a Linux system and 33 seconds on a similarly powerful Windows machine. Other users have reported similar performance problems on Windows.

Colleagues with a Windows machine were able to bypass these performance problems by installing and running their tests on Windows Subsystem for Linux (WSL2).

Alternatively, developers may wish to consider other test runners such as Mocha, for which better performance has been reported.

Summary

Jest is a very powerful testing tool, but due to its module-importing process it can suffer from serious performance problems in some scenarios. This article has hopefully shed some light on how such scenarios come about, how they can be minimised, and how test performance can be analysed and improved.

Edward Byne is a software developer and consultant at codecentric AG. He feels comfortable working with both frontend and backend technologies. He places great value on writing clean, understandable and test-driven code and implementing practical and efficient solutions for his customers.
