When dealing with integration tests, specifically around APIs, the goal is to be able to test the entire process, from the HTTP REST calls through to the database. Sure, you can set up a MySQL database for integration testing and populate it with data, but how do you handle updates to the database? You cannot roll back in an integration environment. The only way to guarantee consistent results is to start with a fresh database every time the tests run. One nice way to do this is with an in-memory DB. My first attempt at this failed miserably.

Liquibase DDL CHANGELOG Generation

My team uses Liquibase to maintain its DB schema changes and updates to static data. I tried to leverage that to create the schema for H2. What I found, though, is that exporting DDL from one DB to another through Liquibase updates doesn’t work very well. Liquibase likes to create tables in little steps: a table definition first, then alters on the table to add indexes and foreign key constraints. Unfortunately, or fortunately, I was unable to get Liquibase to export from MySQL to H2.
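To illustrate those "little steps", here is a hedged sketch of what a typical generated changelog looks like (the changeSet ids, author, and column names are invented for this example, not from our actual changelog):

```xml
<changeSet id="1" author="example">
    <createTable tableName="Assignments">
        <column name="id" type="BIGINT" autoIncrement="true">
            <constraints primaryKey="true" nullable="false"/>
        </column>
        <column name="companyId" type="BIGINT"/>
    </createTable>
</changeSet>
<!-- The index arrives in a separate step, as an alter on the existing table -->
<changeSet id="2" author="example">
    <createIndex tableName="Assignments" indexName="idx_assignments_companyId">
        <column name="companyId"/>
    </createIndex>
</changeSet>
```

It is this incremental, dialect-specific sequence of steps that makes replaying a MySQL changelog against H2 fragile.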

Hibernate DDL Generation

We also use Hibernate as our ORM, and it provides the ability to generate a DB schema based on your Java entity classes. It may not be the best approach, but it works for what we are trying to achieve. One benefit of this over Liquibase is that I don’t have to generate the entire DB schema, just the tables the project uses.

With Spring Boot, it is very simple to get an H2 in-memory DB set up. I won’t go into all the details of setting up Spring Boot, but make sure you have spring-data-jpa and H2 as dependencies in your pom.
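A minimal sketch of the relevant pom.xml dependencies, assuming versions are managed by the Spring Boot parent POM:

```xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
    <groupId>com.h2database</groupId>
    <artifactId>h2</artifactId>
    <scope>test</scope>
</dependency>
```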



Annotate your test class to load a specific integration properties file with the following:
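A minimal sketch of such a test class, assuming JUnit 4 and Spring Boot’s profile-specific properties support (the class name here is hypothetical):

```java
@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@ActiveProfiles("integration") // loads application-integration.properties
public class AssignmentApiIT {
    // integration tests go here
}
```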


We can have an application-integration.properties file specifically for integration tests with the following properties:





spring.jpa.hibernate.ddl-auto = create
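Alongside that property, a minimal application-integration.properties might look like the following. The JDBC URL, driver, and dialect lines are assumptions for illustration; Spring Boot can also auto-configure an H2 datasource without them when H2 is on the classpath:

```properties
spring.datasource.url=jdbc:h2:mem:integrationdb;DB_CLOSE_DELAY=-1;MODE=MySQL
spring.datasource.driver-class-name=org.h2.Driver
spring.jpa.database-platform=org.hibernate.dialect.H2Dialect
spring.jpa.hibernate.ddl-auto=create
```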


After the DB schema is generated, it needs to be populated. To populate the DB I chose a tool called DBUnit, which gives you the flexibility to run queries against an existing DB to gather the data. We already have a JUnit DB that I can run queries against to gather a subset of the data used for integration testing, since it may not be possible to load that entire DB into memory. DBUnit is very straightforward to use. We can create a simple script that generates our data for us, and it only needs to be run when we need additional data generated. In this scenario, we only want a subset of tables and only a subset of data in those tables, around companyId=1. The output is an XML file that can be imported into a number of different DBs.

public class DBDataExport {

    @Autowired
    private DataSource datasource;

    public void exportData() throws DatabaseUnitException, SQLException, IOException {
        IDatabaseConnection connection = new DatabaseConnection(datasource.getConnection());
        // Escape MySQL identifiers with backticks
        connection.getConfig().setProperty(DatabaseConfig.PROPERTY_ESCAPE_PATTERN, "`?`");

        // Export only the rows belonging to companyId=1
        QueryDataSet partialDataSet = new QueryDataSet(connection);
        partialDataSet.addTable("Assignments", "SELECT a.* FROM Assignments a WHERE a.companyId=1");
        partialDataSet.addTable("Nags", "SELECT n.* FROM Nags n JOIN Assignments a ON a.id=n.parentId AND n.parentType=1 AND a.companyId=1");
        partialDataSet.addTable("AssignmentUserModules", "SELECT aum.* FROM AssignmentUserModules aum JOIN Assignments a ON a.id=aum.assignmentId AND a.companyId=1");
        FlatXmlDataSet.write(partialDataSet, new FileOutputStream("full.xml"));

        // Write a DTD describing the full schema so imports can resolve column metadata
        FlatDtdDataSet.write(connection.createDataSet(), new FileOutputStream("test.dtd"));
    }
}
Before your integration test suite starts, you just need to populate your DB with the data. The only thing to note is that the data will be inserted as it appears in the file, and foreign key constraints must be met, so export the data in the correct order.

private void setupData() throws Exception {
    IDatabaseConnection connection = new DatabaseConnection(dataSource.getConnection());

    // Build the dataset from the exported XML, using the DTD for column metadata
    FlatXmlDataSetBuilder builder = new FlatXmlDataSetBuilder();
    builder.setMetaDataSetFromDtd(new FileInputStream("test.dtd"));
    IDataSet dataSet = builder.build(new File("full.xml"));

    // Declare composite primary keys explicitly for tables that need them
    final Map<String, List<String>> tablePrimaryKeyMap = new HashMap<String, List<String>>();
    tablePrimaryKeyMap.put("ROLEPERMISSIONS", Arrays.asList("ROLEID", "PERMISSIONID"));

    DatabaseConfig config = connection.getConfig();
    config.setProperty(DatabaseConfig.PROPERTY_PRIMARY_KEY_FILTER, new IColumnFilter() {
        public boolean accept(String tableName, Column column) {
            String upper = tableName.toUpperCase();
            if (tablePrimaryKeyMap.containsKey(upper)) {
                return tablePrimaryKeyMap.get(upper).contains(column.getColumnName().toUpperCase());
            } else {
                // you can add default handling if, for example, all
                // other tables' ids are named 'id'
                return column.getColumnName().equalsIgnoreCase("id");
            }
        }
    });

    try {
        // Clean out any existing rows, then insert, so every run starts from the same state
        DatabaseOperation.CLEAN_INSERT.execute(connection, dataSet);
    } finally {
        connection.close();
    }
}



This will have your application up and running against an H2 in-memory DB instead of a local MySQL DB, and you can manipulate the data as you wish. Each time the tests run, the DB is recreated from your DB export.

For APIs, we can then use something such as Rest-Assured to easily test the complete functionality, from REST calls through to the DB.

@Test
public void testGetAssignment() throws Exception {
    given()
        .header("Authorization", TOKEN)
        .pathParam("id", 1L)
    .when()
        .get("/api/assignments/{id}")
    .then()
        .body("data.id", equalTo("1"))
        .body("data.relationships.userModules.links.self", equalTo("/api/assignments/1/relationships/userModules"));
}