Massively Reducing the Effort to Test Xtext Languages
Most people agree that automated tests are helpful for developing stable and mature software. This also applies to developing domain-specific languages (DSLs) and their tool support.
However, in recent years of consulting around Xtext, we have noticed that test coverage tends to be too low to rely on, or even non-existent. People perceive tests as too difficult to write and prefer to focus their efforts on the to-be-delivered parts of the software.
We will present an approach that makes testing domain specific languages significantly easier by embedding test expectations into comments of the DSL documents that you want to test. The approach is non-invasive for your software and integrates well with existing tooling since it is based on JUnit.
The ease of writing tests comes from being able to reuse test implementations from a library for standard scenarios such as content assist and validation. Furthermore, since the test data is specified directly inside your DSL documents, you can use your self-developed DSL editor to edit your test data.
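To illustrate the idea, here is a minimal, self-contained sketch of what an expectation embedded in a DSL comment might look like, together with a toy checker that compares expected and actual validation errors. All names (the `EXPECT error:` comment syntax, the `entity` keyword, the duplicate-name check) are hypothetical stand-ins for a real Xtext grammar and validator, not the actual library API:

```java
import java.util.*;
import java.util.regex.*;

public class CommentExpectationDemo {

    // A hypothetical DSL document; the first line embeds a test expectation in a comment.
    static final String DOC = String.join("\n",
        "// EXPECT error: \"duplicate entity name\" at line 3",
        "entity Person",
        "entity Person");

    // Toy stand-in for the real DSL validator: flags duplicate entity names.
    static List<String> validate(String doc) {
        List<String> errors = new ArrayList<>();
        Set<String> seen = new HashSet<>();
        String[] lines = doc.split("\n");
        for (int i = 0; i < lines.length; i++) {
            Matcher m = Pattern.compile("entity (\\w+)").matcher(lines[i]);
            if (m.matches() && !seen.add(m.group(1))) {
                errors.add("\"duplicate entity name\" at line " + (i + 1));
            }
        }
        return errors;
    }

    // Collects expectations of the form "// EXPECT error: <text>" from the document.
    static List<String> expectations(String doc) {
        List<String> result = new ArrayList<>();
        for (String line : doc.split("\n")) {
            Matcher m = Pattern.compile("// EXPECT error: (.*)").matcher(line.trim());
            if (m.matches()) {
                result.add(m.group(1));
            }
        }
        return result;
    }

    public static void main(String[] args) {
        // A JUnit-based runner would turn each expectation into a test case;
        // here we simply compare expected and actual errors.
        List<String> expected = expectations(DOC);
        List<String> actual = validate(DOC);
        System.out.println(expected.equals(actual)
            ? "PASS"
            : "FAIL expected=" + expected + " actual=" + actual);
    }
}
```

In the real approach, a JUnit runner discovers these comments in the DSL files and reports each expectation as its own test, so the test data stays in the document while the test implementation lives in a reusable library.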
We have been using this approach with several large customers and want to pass on the experience we have gained, since the approach provides benefits beyond simplifying the writing of tests: thanks to a clear separation of test data, configuration, and expectations from the test implementation, the tests become artifacts that users of the DSL can understand without any Xtext knowledge. The tests thereby become a valuable asset for discussing the language and its properties with your customers.