Using pytest asyncio to Reduce Test Execution Time
Milos Kajkut
Posted On: November 30, 2023
Test execution time refers to the duration it takes for a suite of tests to run within the software testing process. This aspect profoundly impacts various stages of the development life cycle.
When execution times are shorter, feedback loops are faster, enabling developers to quickly identify and address issues during the development phase. This agility is crucial in maintaining an efficient continuous integration process, accelerating the release cycle, and enhancing overall software quality.
Additionally, shorter test times optimize resource utilization, enabling support for parallel test execution strategies and positively impacting the productivity and satisfaction of development teams. On the other hand, longer test execution times can hinder development cycles, slow down continuous integration pipelines, and impede overall responsiveness in the software development process. Therefore, optimizing test execution time is critical in achieving streamlined, agile, and high-quality software development processes.
When discussing how to decrease execution time for automated test cases, the first thought is parallel execution. But is parallel execution the only solution, or does the bottleneck lie elsewhere in the test automation framework? We should ask ourselves this question whenever we want to reduce test execution time.
This blog will look into different ways of managing test execution time and how pytest asyncio helps reduce it.
Methods to Reduce Test Execution Time
When it comes to managing and optimizing test execution time in pytest, two main approaches are prominent: the classical setup and teardown mechanism and harnessing the potential of asynchronous programming using asyncio.
- Using Setup and Teardown mechanism
- Using asyncio mechanism
First, let’s talk about automated test cases. Each automated test case contains three main parts: Setup, the actual Test Case, and Teardown. In the Setup phase, we perform all actions necessary to prepare the environment for test execution, such as reading or writing database records, performing API requests, or reading and writing specific files. Once setup completes, the Test Case itself runs, executing the test steps and assertions. Finally, the Teardown phase reverts all changes made during setup and test execution.
Setup and Teardown in pytest
The Setup and Teardown process can be handled by Fixtures. Fixtures are a very powerful pytest feature: they are Python functions that pytest executes before (and, as we will see, after) the actual test function. So, let’s look at how to create a Fixture:
```python
import pytest


@pytest.fixture()
def first_fixture():
    """This fixture will return the result of 2 + 2"""
    return 2 + 2
```
Fixture implementation is fully flexible, as the example above shows, but the right question is: “Where are the before and after parts in this fixture?” For the setup and teardown process, we need to implement the Fixture as a generator function using the yield keyword. Everything before the yield keyword is considered setup, and everything after it is considered teardown. This is how it looks in our example:
```python
@pytest.fixture()
def first_fixture():
    """This fixture will calculate 2 + 2 as the setup process
    and print a message as the teardown process"""
    result = 2 + 2  # before the actual test function (setup)
    yield result
    print("Start Teardown")  # after the actual test function (teardown)
```
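As an aside, pytest also supports registering teardown callbacks explicitly through the built-in request fixture instead of using yield. A minimal sketch, equivalent in effect to the generator version above:

```python
@pytest.fixture()
def first_fixture(request):
    result = 2 + 2  # setup
    # addfinalizer registers a callback that pytest invokes after the test,
    # playing the same role as the code after yield in the generator form
    request.addfinalizer(lambda: print("Start Teardown"))
    return result
```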
After implementing the Fixture, we are ready to write the first test case:
```python
@pytest.mark.usefixtures("first_fixture")
def test_first(first_fixture):
    """First test"""
    result_from_fixture = first_fixture
    assert result_from_fixture == 4
```
So far, we have implemented one Fixture with a setup and teardown process and one test case that uses it. From this point, we can set up a problem that pytest asyncio can solve. Imagine a situation where you have implemented 200 automated test cases and run them in parallel, but test execution still takes too long.
The first step is to detect bottlenecks in the framework, that is, to answer the question: “Where do we have time leakage?” For this purpose, we will set up the problem situation as follows.

We will add one more test case and modify both tests with time.sleep() to simulate test case execution time:
```python
import time

import pytest


class TestBlog:
    @pytest.mark.usefixtures("first_fixture")
    def test_first(self, first_fixture):
        """First test"""
        result_from_fixture = first_fixture
        time.sleep(2)  # simulate test execution time
        assert result_from_fixture == 4

    @pytest.mark.usefixtures("first_fixture")
    def test_second(self, first_fixture):
        """Second test"""
        result_from_fixture = first_fixture
        time.sleep(2)  # simulate test execution time
        assert result_from_fixture == 4
```
Also, to simulate precondition work, we will add two precondition functions whose execution time is likewise simulated with time.sleep():
```python
import logging
import time


def first_precondition_function():
    logging.info("First Precondition function called")
    time.sleep(3)  # simulate precondition work
    logging.info("First Precondition function finished")


def second_precondition_function():
    logging.info("Second Precondition function called")
    time.sleep(3)  # simulate precondition work
    logging.info("Second Precondition function finished")
```
Finally, let’s modify the Fixture to execute the precondition functions:
```python
@pytest.fixture()
def first_fixture():
    """This fixture will calculate 2 + 2 as the setup process
    and print a message as the teardown process"""
    result = 2 + 2  # before the actual test function

    # precondition functions
    logging.info("Precondition started")
    first_precondition_function()
    second_precondition_function()
    logging.info("Precondition finished")

    yield result
    logging.info("Start Teardown")  # after the actual test function
```
From the execution report, we can see the following results:
```
============================= 2 passed in 16.19s ==============================
2023-11-05 14:40:19,102 [INFO] Precondition started
2023-11-05 14:40:19,103 [INFO] First Precondition function called
2023-11-05 14:40:22,103 [INFO] First Precondition function finished
2023-11-05 14:40:22,103 [INFO] Second Precondition function called
2023-11-05 14:40:25,104 [INFO] Second Precondition function finished
2023-11-05 14:40:25,104 [INFO] Precondition finished
PASSED [ 50%]2023-11-05 14:40:27,107 [INFO] Start Teardown
2023-11-05 14:40:27,109 [INFO] Precondition started
2023-11-05 14:40:27,109 [INFO] First Precondition function called
2023-11-05 14:40:30,110 [INFO] First Precondition function finished
2023-11-05 14:40:30,110 [INFO] Second Precondition function called
2023-11-05 14:40:33,111 [INFO] Second Precondition function finished
2023-11-05 14:40:33,111 [INFO] Precondition finished
PASSED [100%]2023-11-05 14:40:35,114 [INFO] Start Teardown
```
As a starting point, we should detect and count bottlenecks in the framework:
- 1st Bottleneck: Assume that 2 seconds per test is the best execution time we can achieve with the current test implementation. Running the two test cases sequentially therefore takes 4 seconds.
- 2nd Bottleneck: Assume that 3 seconds per precondition function is the best time we can achieve with the current implementation. Since the two precondition functions run one after the other, they add 6 seconds per test case.
We have set up the problem situation and can now discuss solutions for each bottleneck. The solution for the first bottleneck is obvious: introduce parallel execution, which reduces test execution time from 4 seconds to 2 seconds. However, parallel execution does not solve the second bottleneck; the sequential precondition functions still add 6 seconds to the total execution time.
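A common way to parallelize pytest runs is the pytest-xdist plugin, which distributes tests across worker processes; the [gw0] and [gw1] labels in the report below are its worker IDs. With the plugin installed (pip install pytest-xdist), the two tests can be run in parallel with a command such as pytest -n 2.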
```
============================== 2 passed in 9.54s ==============================
test_blog.py::TestBlog::test_second
2023-11-05 14:57:38,953 [INFO] Precondition started
2023-11-05 14:57:38,953 [INFO] First Precondition function called
2023-11-05 14:57:41,954 [INFO] First Precondition function finished
2023-11-05 14:57:41,954 [INFO] Second Precondition function called
2023-11-05 14:57:44,956 [INFO] Second Precondition function finished
2023-11-05 14:57:44,956 [INFO] Precondition finished
[gw0] [ 50%] PASSED test_blog.py::TestBlog::test_first
2023-11-05 14:57:46,959 [INFO] Start Teardown
2023-11-05 14:57:38,964 [INFO] Precondition started
2023-11-05 14:57:38,965 [INFO] First Precondition function called
2023-11-05 14:57:41,966 [INFO] First Precondition function finished
2023-11-05 14:57:41,966 [INFO] Second Precondition function called
2023-11-05 14:57:44,967 [INFO] Second Precondition function finished
2023-11-05 14:57:44,967 [INFO] Precondition finished
[gw1] [100%] PASSED test_blog.py::TestBlog::test_second
2023-11-05 14:57:46,970 [INFO] Start Teardown
```
Asyncio Module in Python
Asyncio is a powerful Python module that enables developers to write asynchronous code by leveraging the intuitive async/await syntax. This module is especially valuable when dealing with I/O-bound operations, including network requests, file operations, and other tasks requiring external resources. Introduced in Python 3.4, the asyncio module has emerged as an integral component of Python’s robust asynchronous programming capabilities. It provides a structured and consistent approach to asynchronous programming using coroutines, the event loop, and other related concepts.
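As a minimal, self-contained sketch of the idea (separate from the test code in this post, with hypothetical names), two simulated I/O operations of 2 seconds each complete in roughly 2 seconds of wall-clock time because they run concurrently:

```python
import asyncio
import time


async def fetch(name: str, delay: float) -> str:
    # Simulate an I/O-bound operation such as a network request
    await asyncio.sleep(delay)
    return f"{name} done"


async def main() -> None:
    start = time.perf_counter()
    # gather() runs both coroutines concurrently on the event loop,
    # so total time is ~2 seconds instead of ~4
    results = await asyncio.gather(fetch("first", 2), fetch("second", 2))
    print(results, f"elapsed: {time.perf_counter() - start:.1f}s")


asyncio.run(main())
```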
How to use asyncio with pytest?
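A quick prerequisite: the asyncio support shown below comes from the pytest-asyncio plugin (and the parallel reports in this post were produced with pytest-xdist), so both need to be installed first, for example with pip install pytest-asyncio pytest-xdist.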
Our goal with pytest asyncio is to run the two precondition functions concurrently. To achieve this, we first need to turn the precondition functions into coroutines:
```python
import asyncio
import logging


async def first_precondition_function():
    logging.info("First Precondition function called")
    await asyncio.sleep(3)  # non-blocking sleep: yields control to the event loop
    logging.info("First Precondition function finished")


async def second_precondition_function():
    logging.info("Second Precondition function called")
    await asyncio.sleep(3)
    logging.info("Second Precondition function finished")
```
To run the coroutines concurrently, we need to use await asyncio.gather():
```python
@pytest.fixture()
async def first_fixture():
    """This fixture will calculate 2 + 2 as the setup process
    and print a message as the teardown process"""
    result = 2 + 2  # before the actual test function

    # run both precondition functions concurrently
    logging.info("Precondition started")
    await asyncio.gather(first_precondition_function(),
                         second_precondition_function())
    logging.info("Precondition finished")

    yield result
    logging.info("Start Teardown")  # after the actual test function
```
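One caveat worth flagging: in recent pytest-asyncio releases, the default mode is strict, and async fixtures must then be declared with the plugin's own decorator rather than @pytest.fixture. With asyncio_mode=auto (used later in this post), the plain decorator works as shown above. A sketch of the strict-mode variant:

```python
import pytest_asyncio


# In pytest-asyncio's strict mode, async fixtures use the plugin decorator
@pytest_asyncio.fixture()
async def first_fixture():
    ...
```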
At the end, we need to make the test functions coroutines as well. To do that, the pytest-asyncio plugin provides two possible solutions:
- The first solution is to mark the test functions with @pytest.mark.asyncio and declare them with the async keyword:
- The second solution is to set the parameter asyncio_mode=auto in the pytest.ini file. In that case, we do not need to decorate the test functions with the @pytest.mark.asyncio marker:
```python
class TestBlog:
    @pytest.mark.asyncio
    @pytest.mark.usefixtures("first_fixture")
    async def test_first(self, first_fixture):
        """First test"""
        result_from_fixture = first_fixture
        time.sleep(2)  # simulate test execution time
        assert result_from_fixture == 4

    @pytest.mark.asyncio
    @pytest.mark.usefixtures("first_fixture")
    async def test_second(self, first_fixture):
        """Second test"""
        result_from_fixture = first_fixture
        time.sleep(2)  # simulate test execution time
        assert result_from_fixture == 4
```
```ini
[pytest]
asyncio_mode=auto
```
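The same setting can also be passed on the command line with pytest --asyncio-mode=auto, or placed in pyproject.toml under the [tool.pytest.ini_options] section, depending on how your project is configured.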
Now, see the results:
```
============================== 2 passed in 6.50s ==============================
test_blog.py::TestBlog::test_first
2023-11-07 22:04:22,972 [INFO] Precondition started
2023-11-07 22:04:22,972 [INFO] First Precondition function called
2023-11-07 22:04:22,973 [INFO] Second Precondition function called
2023-11-07 22:04:25,985 [INFO] First Precondition function finished
2023-11-07 22:04:25,985 [INFO] Second Precondition function finished
2023-11-07 22:04:25,986 [INFO] Precondition finished
[gw0] [ 50%] PASSED test_blog.py::TestBlog::test_first
[gw1] [100%] PASSED test_blog.py::TestBlog::test_second
2023-11-07 22:04:27,989 [INFO] Start Teardown
2023-11-07 22:04:22,970 [INFO] Precondition started
2023-11-07 22:04:22,971 [INFO] First Precondition function called
2023-11-07 22:04:22,971 [INFO] Second Precondition function called
2023-11-07 22:04:25,969 [INFO] First Precondition function finished
2023-11-07 22:04:25,969 [INFO] Second Precondition function finished
2023-11-07 22:04:25,970 [INFO] Precondition finished
2023-11-07 22:04:27,973 [INFO] Start Teardown
```
From the results, we can see that running the precondition functions concurrently reduced execution time by a further 3 seconds, bringing the total down to 6.50 seconds, almost 10 seconds less than the original run.
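To summarize the three measured runs from the reports above:

- Sequential execution: 16.19 seconds
- Parallel execution across two workers: 9.54 seconds
- Parallel execution with concurrent preconditions: 6.50 seconds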
Conclusion
In conclusion, we delved into two powerful strategies, each tailored to optimize the efficiency of test execution in pytest. The approach of setup and teardown provides a structured means of preparing the testing environment before execution and cleaning up afterward, ensuring a consistent and controlled context for each test. Adopting asynchronous programming through pytest asyncio introduces a dynamic paradigm, particularly advantageous for handling I/O-bound tasks and achieving concurrent execution.
By exploring both these approaches, you can balance meticulous test preparation and swift execution, ultimately reducing test execution time and enhancing the overall effectiveness of the testing process. Whether leveraging traditional setup practices or embracing the asynchronous capabilities of asyncio, pytest offers a robust framework for optimizing test workflows in diverse testing scenarios.
Got Questions? Drop them on LambdaTest Community.