Add configuration support
Related to #772 and #179
Supports automatic discovery of VHDL configurations of testbench entities. These are added as VUnit configurations to the testbench. Each such VUnit configuration carries a newly introduced vhdl_configuration_name attribute that, when set, makes VUnit call the simulator with the VHDL configuration name rather than the testbench entity + architecture names.
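As a rough illustration (library and testbench names are placeholders, and the comments merely restate the mechanism described above), a run.py could stay as simple as:

```python
# Minimal run.py sketch; "lib" and "tb_example" are assumed names.
from vunit import VUnit

vu = VUnit.from_argv()
lib = vu.add_library("lib")
# The source files include tb_example.vhd and any VHDL configuration
# declarations targeting the tb_example entity.
lib.add_source_files("*.vhd")

# Per this PR, each discovered VHDL configuration of the testbench entity
# becomes a VUnit configuration whose vhdl_configuration_name attribute
# makes the simulator elaborate the VHDL configuration instead of the
# entity + architecture pair.
vu.main()
```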
I have provided a minimal reproducible example of using a VHDL configuration together with test cases. The two do not work together: either the VHDL configuration is used but no test cases are found, or the test cases are found but the VHDL configuration is ignored.
I hope that helps. As soon as there is a solution to test, I am ready.
Best regards, Sebastian Joerss
@sjoerss I'm returning to this and had a look at your use case. It's not the style of a classic OSVVM testbench, but it is nevertheless something that is reasonable for us to support.
This PR will also take a new direction. It was built on the assumption that top-level generics can't be combined with configurations. When testing with a number of simulators, that no longer seems to be the case.
The previous approach also introduced reading the runner configuration from file and running parallel threads in different root directories. These features still have value since they support other use cases, but I will release them later in separate PRs.
@sjoerss I think my original approach was too focused on the OSVVM use case. Your use case is one of a family of use cases where the expressiveness of VHDL is too limited. In VHDL, the configuration is bound to an entity, i.e. a VUnit testbench. However, we also want to be able to bind the configuration to a subset of test cases in the testbench. In your case the subset is the test cases in the entity selected by the configuration, but there are other examples as well. For instance, we can have a standard VUnit testbench with the test cases in the same file and then a set of VHDL configurations selecting which implementation of the DUT to use. We may not want to run all tests on every DUT.
With standard VUnit configurations we express the creation of a configuration and the binding in a single line, for example my_testbench.add_config(...) or my_test_case.add_config(...), but with VHDL configurations we should think of it as two steps. The VHDL code creates the configuration, and then we need to bind it to a testbench or a set of test cases, for example my_test_case.bind_config(my_vhdl_configuration).
This would also allow us to scan for test cases first, before we do the binding. It would also allow us to set other properties of the configurations before binding, such as which generics to use, as sketched below.
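A sketch of what this two-step flow could look like in run.py (bind_config is only a proposal in this discussion, not an existing VUnit API, and all names are illustrative):

```python
# Hypothetical sketch of the proposed two-step binding; bind_config does
# not exist in VUnit today and its exact signature is an assumption.
from vunit import VUnit

vu = VUnit.from_argv()
lib = vu.add_library("lib")
lib.add_source_files("*.vhd")

tb = lib.test_bench("tb_example")

# Step 1 happened in VHDL: a configuration declaration, say cfg_dut_rtl,
# was created and discovered when scanning the sources.
# Step 2 happens here: scan for test cases first, set properties such as
# generics, and only then bind the VHDL configuration to a subset of them.
test = tb.test("Test something basic")
test.set_generic("data_width", 32)  # existing API; value is illustrative
test.bind_config("cfg_dut_rtl")     # proposed API, not yet implemented

vu.main()
```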
@sjoerss Thinking about this a bit more made me realize that your use case is more than just allowing us to bind a VHDL configuration to a test case. Today a testbench file is scanned for test cases or, if the test suite is located elsewhere, VUnit can be instructed to scan another file using the testbench method scan_tests_from_file. In either case there is only one test suite in each testbench/simulation. In your case there are two test suites in different files, and if we simply extended scan_tests_from_file to allow scanning of several files, the testbench would "see" several test suites. You would use configurations to only allow one test suite to run in each simulation, but internally there would be two test suites for the testbench, and unless we restrict that and keep the clean/standard structure of unit tests, I fear that we're opening Pandora's box.
Rather than going down that road I would prefer that this use case is solved in other ways. What is your reason for partitioning the test cases into two test suites? I know of at least two reasons why splitting test suites can be a good idea:
- Putting all test cases for a DUT in the same file can make the file very large and unmanageable. However, you may still want to reuse the "test fixture" of the testbench (DUT and surrounding verification components such as BFMs and clock generators). In that case I create two testbenches, one for each test suite, and put the test fixture in a separate entity which I instantiate in both testbenches.
- All test cases may fit in a single testbench but some of them may be slow, especially if the DUT is at the system level. In those cases it can be nice to have a basic and fast subset of the test suite that you run frequently, while the full test suite is executed once every night. You can achieve that with VUnit attributes. For example, tag the basic test cases with .basic (the attribute namespace without a leading dot is reserved for VUnit):

  ```vhdl
  if run("Test something basic") then
    -- vunit: .basic
    ...
  end if;
  ```

  and then call:

  ```sh
  python run.py --with-attributes .basic
  ```
What is your use case and would any of these solutions work?
This PR is replaced by #951 when it comes to adding support for top-level configurations. This PR is kept alive since it also provides functionality for running concurrent simulator threads in different directories. That capability is needed to solve #877, and people are using it as a temporary solution. Once the split of simulator directories has been merged (via another PR), this PR will be closed.
@LarsAsplund Thanks a lot for your good thoughts about how to handle configurations and test cases.
I have attached a simple diagram of our system-level testbench. The VHDL configuration is used to instantiate the test suite and the different DUT (design under test) combinations. There is a different test suite for each DUT combination, and each test suite should contain one or several test cases. At the moment no test cases are possible; the test suite can only be executed as a single test case run.
The testbench method scan_tests_from_file is an option for me, although it only lets you define the name of a single test suite file in run.py. The test cases in that test suite are then found, and they would have to be bound to the VHDL configuration.
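For reference, this is roughly how I would use the existing scan_tests_from_file API today (library and file names are placeholders for my setup):

```python
# Sketch using the existing scan_tests_from_file API; names are placeholders.
from vunit import VUnit

vu = VUnit.from_argv()
lib = vu.add_library("lib")
lib.add_source_files("*.vhd")

tb = lib.test_bench("tb_system_level")
# Only a single test suite file can be named here; the test cases found
# in it would then still have to be bound to the VHDL configuration.
tb.scan_tests_from_file("test_suite_dut_a.vhd")

vu.main()
```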
Could the changes you made in this branch work with my testbench setup? Shall I test them?
It is not possible to have test cases in test suites the way you suggest. My recommendation is to have a testbench for each test suite and then put all your BFMs in a single entity that you instantiate in each testbench to reduce copy-pasting.