Before I start, I'd like to tell you a bit about myself, since it gives context on why I created Xpect. As you may know, I've been a committer on the Eclipse Xtext project since January 2009. Besides working on Xtext (and Xtend), I have been applying this technology in several customer projects: project sizes varied from a few person-days to several person-years, and my role varied between designing a language and providing professional support for technical challenges.
All projects had in common that we were developing Xtext languages. Sometimes we were free to design syntax and semantics from scratch, and sometimes we were bound to a language specification. In most projects we needed to...
- have automated tests for our language tooling.
- discuss design and implementation decisions.
- explain the language and tooling to fellow developers and prospective users.
- allow our implementation to be reviewed for completeness and correctness, i.e. pass an acceptance test.
- set up test stubs for fellow developers to implement tests.
To cover these use cases, I found myself writing plenty of JUnit tests and creating many example documents in the language, often with redundant content. I could not use the JUnit tests to explain and discuss the language with others, since the Java code distracted them from what I wanted to show. Furthermore, I was annoyed that I could not use the language's Xtext editor to craft test data, since the test data was embedded in string literals in the tests' Java code. Also, when I wrote example documents, I found myself inserting comments to explain the language: "This expression will evaluate to xxx", "This statement causes error yyy", "This cross reference links to zzz", etc.
The idea arose to converge test cases and example documents: What if I could embed test expectations within comments inside my example documents? What if I could find a format for these comments so that they are both human-readable and automatically verifiable by a test suite?
Xpect realizes this idea. The version introduced by this blog post, as well as older versions, is actively being used in in-house projects, several customer projects, and open source projects such as Xcore and Spray.
One File, Two Languages
What you can see in the screenshot above is a JUnit test that has been executed and passed. The test class is org.domainmodel.tests.validation.DMValidationTest and it has executed a file named test1.dmodel.xt. The file defines two test cases, one called warnings and one called errors. On the right-hand side you can see the file's contents in an editor. The editor generically combines support for two languages: First, the language that is being tested; in the screenshot this is the Domainmodel language (*.dmodel), which ships with Xtext as an example language. Second, the editor supports the Xpect language (*.xt). The two languages do not interfere with each other, since the Domainmodel language ignores text inside comments and the Xpect language ignores text that doesn't start with the "XPECT" keyword. The editor applies a greenish background color to Xpect syntax.
Let's take a closer look at the Xpect syntax. At the beginning of test1.dmodel.xt, there is a region called XPECT_SETUP. It holds a reference to the JUnit test that can run this file. Further down we find test cases such as:
// capitalized property names are discouraged
// XPECT warnings --> "Name should start with a lowercase" at "Property1"
Property1 : String
The first line is an optional title for the test. warnings references a JUnit test method (implemented in Java), and the part following the --> is the test expectation, which is passed as a parameter into the JUnit test method. The test method can then compare the test expectation with what it calculated to be the actual test value and pass or fail accordingly. For this example, the test expectation is composed of the error/warning message (Name should start with a lowercase) and the text that would be underlined by the red squiggly line in the editor (Property1). For this validation test, the XPECT statement collects all errors and warnings occurring on the next line. A feature you might find very useful: the XPECT statement consumes error and warning markers, so an expected error or warning will not be shown as an error marker in the Eclipse editor.
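To make the file structure concrete, here is a minimal sketch of what a complete test file could look like. The one-line setup block (including the END_SETUP keyword) and the entity wrapping the property are assumptions of mine based on the example above:

/* XPECT_SETUP org.domainmodel.tests.validation.DMValidationTest END_SETUP */

entity Person {
    // capitalized property names are discouraged
    // XPECT warnings --> "Name should start with a lowercase" at "Property1"
    Property1 : String
}

Everything outside the comments is ordinary Domainmodel syntax, so the Domainmodel editor's content assist and validation remain available while crafting the test data.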
Synchronize Expectations and Implementation
The fact that Xpect uses textual expectations and embeds them into DSL documents opens the door for another awesome (IMHO) feature: using the Eclipse comparison editor to inspect failed tests and to fix outdated test expectations:
When one or more tests fail and you want to fix them, it is crucial to quickly get an overview of all failed tests. With Xpect you can not only see all failed test cases from one file in a single comparison editor; there are also no assert statements that abort a test early, which can otherwise prevent execution of follow-up assert statements and thereby hide valuable hints on why the test failed. The comparison editor, as the name suggests, also lets you edit the test file.
Reusable Test Library
In Xtext projects, there are several scenarios where it is reasonable to have test coverage. The validation test I explained earlier in this article is just one of these scenarios. Xpect ships the Java part of such tests as a library. There is also an example project that demonstrates their usage.
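To give an impression of the Java side, a language-specific test class might reuse the library's validation test roughly like this. The package names, annotations, and base class in this sketch are assumptions and may differ between Xpect versions:

import org.junit.runner.RunWith;
// Hypothetical package names; consult the Xpect library for the actual ones.
import org.xpect.runner.XpectRunner;
import org.xpect.runner.XpectTestFiles;
import org.xpect.xtext.lib.tests.ValidationTest;

// Runs every matching *.xt file as a JUnit test and makes the library's
// test methods (such as errors and warnings) available to those files.
@RunWith(XpectRunner.class)
@XpectTestFiles(fileExtensions = "xt")
public class DMValidationTest extends ValidationTest {
}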
There are tests for the following (a short usage sketch follows the list):
- The parser and Abstract Syntax Tree (AST) structure (demo only, no library).
- Code generators implemented via Xtext's IGenerator interface.
- Validation: Test for absence, presence, message and location of errors and warnings.
- Linking: Verify a cross reference resolves to the intended model element.
- Scoping: Verify the expected names are included in or excluded from a cross reference's scope.
- ResourceDescriptions: Verify a document exports the intended model elements with proper names.
- JvmModelInferrer: For languages using Xbase, test the inferred JVM model.
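As a rough illustration of how such library tests read in practice, a linking and a scoping check could be embedded like this; the method names (linkedName, scope) and the expectation formats are assumptions on my part and may differ from the actual library:

entity Blog {
    // XPECT linkedName at Person --> Person
    author : Person
    // XPECT scope at Person --> Blog, Person
    owner : Person
}

As with the validation test, the at clause picks the text region under test, and the expectation after the --> is compared with what the respective test method computes for that region.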
There will be more tests in future versions of Xpect.
Support for Standalone and Workspace Tests
UI-independent parts of Xtext, such as the parser, can operate standalone (i.e. without OSGi and an Eclipse workspace). The same is true for Xpect: for capabilities where Xtext does not require OSGi or an Eclipse workspace, Xpect does not either. Consequently, Xpect tests can be executed as plain JUnit tests or as Plug-in JUnit tests.
Since the two scenarios require different kinds of setups, both can be configured separately in the XPECT_SETUP section: when executed as a plain JUnit test, the ResourceSet configuration is used; for Plug-in JUnit tests, the Workspace configuration is used.
The screenshot also illustrates how an additional file can be loaded so that it is included in the current ResourceSet or Workspace during test execution.
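For illustration, a setup section providing both configurations and an additional file might look roughly like this; the exact keywords inside the block are assumptions of mine and may deviate from the actual Xpect syntax:

/* XPECT_SETUP org.domainmodel.tests.validation.DMValidationTest
    ResourceSet {
        ThisFile {}
        File { from = "types.dmodel" }
    }
    Workspace {
        Project {
            ThisFile {}
            File { from = "types.dmodel" }
        }
    }
END_SETUP */

The ThisFile entry stands for the test file itself, and the File entries make an additional document (here the assumed types.dmodel) visible during test execution.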
Suites: Combine Tests in the Same File
When explaining a specific concept of a language, it is helpful to look at it from all sides: this is the valid syntax, these scenarios are disallowed, this is how cross references resolve, this is how it is executed, etc. So far, testers were required to create a new test file for every technical aspect. With Xpect test suites, however, it is possible to combine the Java parts of various tests into a single test suite.
The test suite XtextTests combines several tests so that, in CombiningMultipleTests.dmodel.xt, test methods from all of them can be used. This makes it possible to group tests by language concept.
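Such a suite might look roughly like the following sketch. The suite annotation and the referenced library classes are assumptions based on the description above and may differ in the actual Xpect API:

import org.junit.runner.RunWith;
// Hypothetical package names; consult the Xpect library for the actual ones.
import org.xpect.runner.XpectRunner;
import org.xpect.runner.XpectSuiteClasses;
import org.xpect.xtext.lib.tests.LinkingTest;
import org.xpect.xtext.lib.tests.ScopingTest;
import org.xpect.xtext.lib.tests.ValidationTest;

// Makes the test methods of all listed classes available to every
// test file that references this suite in its XPECT_SETUP section.
@RunWith(XpectRunner.class)
@XpectSuiteClasses({ ValidationTest.class, LinkingTest.class, ScopingTest.class })
public class XtextTests {
}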
Works with CI Builds
Xpect tests have proven to run fine with Maven, Tycho, Surefire, Buckminster, and Jenkins. No special integration is necessary, since Xpect tests run as plain JUnit tests.
Source & Download
You can find a link to the update site at www.xpect-tests.org. The source code and an issue tracker are available on the GitHub project page.