Combining Spring Boot and Liquibase

We’re now at the point in our application’s development where we have a deployed production server and a long list of improvements to the service that need to be implemented. These enhancements will inevitably require changes to the database, and rather than trying to stage those changes by hand, it is better to use a tool designed for the task, such as Flyway or Liquibase.

There are plenty of articles on using Liquibase with Spring or Spring Boot, but like most tutorials, they usually just provide some simple examples of how to hook it up, then expect you to do the hard work of figuring out how to actually get it working in a real application. I hope this article will at least provide some better information about that last part.

I spent some time looking at the differences between the two migration tools, and eventually settled on Liquibase. The choice ultimately came down to support for multiple database types, as we have been using a variety (mainly Postgres and SQL Server) during development. Since Liquibase uses an abstraction to describe database changes, rather than raw SQL, it seemed like the best approach for us.
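For example, adding a table is described in database-agnostic XML along these lines (the table and columns here are purely illustrative), and Liquibase generates the appropriate SQL for whichever database it is pointed at:

<changeSet id="example-1" author="me">
  <createTable tableName="customer">
    <column name="id" type="bigint" autoIncrement="true">
      <constraints primaryKey="true" nullable="false"/>
    </column>
    <column name="customer_name" type="varchar(255)">
      <constraints nullable="false"/>
    </column>
  </createTable>
</changeSet>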

Having decided on a tool, the next step was to get it set up with our backend server. Since our application uses Spring Boot, the natural choice was to take a look at what Baeldung has done [1]. Although this looked pretty straightforward, I quickly ran into some issues that Baeldung doesn’t cover. The first was that there are actually two different times that Liquibase is run. The first is as part of the Spring Boot application, when it checks for any outstanding database updates and applies them. The second is when you run Liquibase stand-alone to generate change logs: an initial log containing the full database definition, then a change log for each model change you make. It turns out that these two modes require different configuration setups, which I’ll describe now.

I’m not a fan of YAML, and Baeldung mentions that the default change log for Spring Boot is db/changelog/db.changelog-master.yaml, which can be modified using the liquibase.change-log property (presumably in the application.properties file). I found that this didn’t work at all and, after much searching, discovered that this property has been renamed to spring.liquibase.change-log in later versions of Spring Boot. Another issue I encountered is that this property is only used when running the Spring Boot application. When running Liquibase stand-alone, such as to generate log files, a property called changeLogFile in the liquibase.properties file is used instead. This properties file is specified in the <propertyFile> entry for the liquibase-maven-plugin in the pom.xml.
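So the Spring Boot side of the configuration ends up as a single line in application.properties; the path below is the one I use later in this article, so adjust it to wherever your master change log lives:

# application.properties – used when Spring Boot runs Liquibase at startup
spring.liquibase.change-log=classpath:db/liquibase-changelog.xml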

Once I’d figured out how to set the location of the change logs, the next thing I wanted to do was keep them in separate files, one file per change. Eventually I came across this article [2], which shows how to use a master change log file to include the separate changes. I didn’t like the idea of having to update this file for every new change, so I’m using the includeAll option rather than an individual include for each change. If you use this option, you need to take care naming the change log files so that they are executed in the correct order.
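For reference, a minimal master change log using includeAll looks something like this (the schema version and directory are just examples; they need to match your own setup):

<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.5.xsd">

  <!-- runs every change log under db/changelog/ in alphabetical file name order -->
  <includeAll path="db/changelog/"/>

</databaseChangeLog>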

So now that the change log files were set up, the next step was to populate the liquibase.properties file. Going back to Baeldung, there was a nice example. But hang on a minute, what are these username and password entries in the file? We’ve removed all authentication information from application.properties by using resource filtering, and we have separate properties files for development, test, production, etc. How do you set this up with Liquibase? It turns out the Liquibase Maven documentation [3] describes how to do it, although it is not very clear. First, you have to make sure that liquibase.properties is covered by the resources plugin. In our setup, we have to specifically list the files that we want filtered, so liquibase.properties needed to be added as follows:

<resources>
  <resource>
    <directory>src/main/resources</directory>
    <filtering>false</filtering>
  </resource>
  <resource>
    <directory>src/main/resources</directory>
    <includes>
      <include>liquibase.properties</include>
      <include>application.properties</include>
      <include>application-dev.properties</include>
      <include>application-test.properties</include>
      <include>application-prod.properties</include>
    </includes>
    <filtering>true</filtering>
  </resource>
</resources>

Next, the liquibase-maven-plugin needs to refer to the target version of the liquibase.properties file rather than the source, as this is the one that has been filtered. So the plugin declaration becomes:

<plugin>
  <groupId>org.liquibase</groupId>
  <artifactId>liquibase-maven-plugin</artifactId>
  <version>${liquibase.version}</version>
  <configuration>
    <propertyFile>target/classes/liquibase.properties</propertyFile>
  </configuration>
  ...
</plugin>

Here’s the liquibase.properties file I eventually used. I specified an outputChangeLogFile because I wanted to generate an initial change log.

changeLogFile=src/main/resources/db/liquibase-changelog.xml
url=jdbc:postgresql://localhost:5432/
driver=org.postgresql.Driver
username=@database.username@
password=@database.password@
outputChangeLogFile=src/main/resources/db/changelog/initial_db.xml

The other thing I found out was that the Maven command to generate a change log does not execute the resource filtering by default. It needs to include the extra resources:resources goal, as well as specifying which profile to enable. So here’s the command I used for my first attempt at generating the initial change log:

mvn resources:resources liquibase:generateChangeLog -Pdev

Wow, it worked! It created the initial_db.xml file with a respectable looking set of changes.

Now, I already had a database change planned, so next I wanted to see if I could generate a change log that reflected that change. According to Baeldung, it should be possible to use the Liquibase Hibernate plugin to do this without requiring a second database. I added the dependencies indicated, except that I used liquibase-hibernate5 and appropriate versions of the spring-beans and spring-data-jpa dependencies. I also added the following lines to the liquibase.properties file:

referenceUrl=hibernate:spring:server.model?dialect=org.hibernate.dialect.PostgreSQL92Dialect&hibernate.implicit_naming_strategy=org.springframework.boot.orm.jpa.hibernate.SpringImplicitNamingStrategy&hibernate.physical_naming_strategy=org.springframework.boot.orm.jpa.hibernate.SpringPhysicalNamingStrategy
diffChangeLogFile=src/main/resources/db/changelog/db_1_0_1.xml

The key here is the referenceUrl, which tells Liquibase which database to use as the reference for the diff. By using hibernate:spring:... it’s possible to use your annotated entities directly. I added the naming strategy settings to the referenceUrl because our database uses underscores in table and column names rather than camel case. The diff file is going to be called db_1_0_1.xml because it’s the first change.

Unfortunately, when I ran the command:

mvn resources:resources liquibase:diff -Pdev

I got the following error (at the end of the usual mass of output):

Cannot find database driver: Driver class was not specified and could not be determined from the url

Well, it looks to me like the url is correct, so what is going on? After much searching, it turns out I was wrong to assume that the Spring Boot dependencies would be loaded automatically when the mvn command is run. Apparently, Liquibase marks a bunch of its dependencies as “provided”, so it is necessary to declare them explicitly as dependencies of the plugin. Because of this, Liquibase was unable to locate the class associated with the driver, hence the error message. Again, after much searching, I found this stackoverflow answer [4], which gives some suggestions as to which dependencies are required. I added these (making sure the versions matched what my version of Spring Boot was using). This seemed to fix the url problem, but now I was getting a different error: javax.validation.ValidatorFactory could not be found on the classpath. On a guess, I added javax.validation:validation-api as a dependency and it finally worked!
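To save the next person some searching, here’s roughly what the plugin ended up looking like for me. Treat this as a sketch: the exact dependency set comes from the stackoverflow answer, and the version properties are placeholders that you need to match to your own Spring Boot version.

<plugin>
  <groupId>org.liquibase</groupId>
  <artifactId>liquibase-maven-plugin</artifactId>
  <version>${liquibase.version}</version>
  <configuration>
    <propertyFile>target/classes/liquibase.properties</propertyFile>
  </configuration>
  <dependencies>
    <!-- lets Liquibase read the annotated entities via a hibernate:spring: referenceUrl -->
    <dependency>
      <groupId>org.liquibase.ext</groupId>
      <artifactId>liquibase-hibernate5</artifactId>
      <version>${liquibase-hibernate5.version}</version>
    </dependency>
    <!-- Spring classes needed by the hibernate:spring: integration -->
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-beans</artifactId>
      <version>${spring.version}</version>
    </dependency>
    <dependency>
      <groupId>org.springframework.data</groupId>
      <artifactId>spring-data-jpa</artifactId>
      <version>${spring-data-jpa.version}</version>
    </dependency>
    <!-- fixes the javax.validation.ValidatorFactory not found error -->
    <dependency>
      <groupId>javax.validation</groupId>
      <artifactId>validation-api</artifactId>
      <version>${validation-api.version}</version>
    </dependency>
  </dependencies>
</plugin>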

Now I had one problem left: how to get my tests working again. I had been using the hibernate.import_files property to set up a test database, but now that I was using Liquibase, it made sense to let it handle the database initialization. Rather than convert my SQL test data to XML, I simply added some Liquibase metadata to the beginning of the file:

--liquibase formatted sql
--changeset user:1 context:test

The first line just indicates that it is a Liquibase file and the second line provides the context that must be set for the file to be loaded. I then added the following line to application-test.properties:

spring.liquibase.contexts=test

Now the Liquibase context will be set to “test” when the tests are being run. I also added context="test" to the <databaseChangeLog> element in the initial change log file, since this should only be run when initializing the test database (or a new database). Finally, the spring.liquibase.contexts property also needs to be set in the application-dev.properties and application-prod.properties files (or whatever ones you have). It doesn’t actually matter what the value is, as long as it’s not empty and not “test”. Annoyingly, an empty value runs all the change logs regardless of their context settings.
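Putting the pieces together, the root element of the initial change log gets a context attribute, and each profile’s properties file selects which contexts to run. The “dev” value below is arbitrary; it just has to be non-empty and different from “test”.

<!-- initial_db.xml: only applied when the "test" context is active -->
<databaseChangeLog context="test"
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
        http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.5.xsd">
  <!-- generated change sets for the initial schema go here -->
</databaseChangeLog>

# application-test.properties
spring.liquibase.contexts=test

# application-dev.properties / application-prod.properties
spring.liquibase.contexts=dev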

I hope this article will help the next person who is trying to get Liquibase and Spring Boot running together in a real application environment.

References:
[1] https://www.baeldung.com/liquibase-refactor-schema-of-java-app
[2] https://medium.com/@harittweets/evolving-your-database-using-spring-boot-and-liquibase-844fcd7931da
[3] http://www.liquibase.org/documentation/maven/
[4] https://stackoverflow.com/a/46414892/3841907

IDE Wars – 1

There’s been a lot of discussion lately about the problems with Eclipse, and how much better other IDEs like IntelliJ and Netbeans are. I was particularly intrigued by a recent post to the cdt-dev list saying (admittedly second hand) that Netbeans remote debugging “just works”. Having worked on remote support for C/C++ development for many years, I know just how difficult it is. If someone else had solved this problem, it was time to take another look. It also seemed like a good opportunity to compare the out-of-box Netbeans experience to that of Eclipse.

Heading over to netbeans.org I found the latest version to be 8.2 and a pleasingly large download button. Clicking on this took me to a page that listed the contents of the download bundles. Unlike Eclipse, Netbeans only has six bundles so the choice is fairly simple. I decided to get the “All” bundle since it contained C/C++ and would give me a chance to play with some of the other features. I’m using Mac OS X, so this resulted in a DMG file (disk image) being downloaded. Once mounted, the image contained an Installer .pkg file. One point for Netbeans using the standard Mac OS X installation process rather than Eclipse’s unusual gzipped tar format.

Unfortunately, this is where things started to go horribly wrong. After the install completed, I launched the Netbeans app and all hell broke loose. Well, to put it more accurately, hell froze over. Netbeans had taken over my entire machine and slowed it to a crawl, to the extent that when I moved the mouse I had to wait 30 seconds for the pointer to update. Only after about 10 minutes of painstakingly inching the mouse to the dock, right-clicking, and quitting the application was I able to restore my computer to normal operation. Not a great first-time experience, to say the least. After spending too much time fruitlessly searching for information on what might be going wrong, I decided to try plan B.

Back to netbeans.org, and this time I downloaded the C++-only bundle which I duly installed. This time, however, the following dialog popped up during the installation:

Huh? I didn’t get this when installing the full bundle, so that is weird. Well, at least they provided a link. But what should I do? Exit, or disable the modules and continue? I opted for disabling the modules in the hope that the installation would complete successfully. The link provided steps for editing the netbeans.conf file, which I found inside the application bundle, and setting netbeans_jdkhome to the location of the JDK. The original value was set to the path of a JRE that is shipped with Netbeans, which is strange if a JDK is required for this bundle. In any case, launching this time was more successful; at least my computer remained responsive.
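For reference, the fix boils down to a single line in netbeans.conf; the JDK path below is just an example from my machine, so point it at whatever JDK you have installed:

# netbeans.conf, inside the application bundle
netbeans_jdkhome="/Library/Java/JavaVirtualMachines/jdk1.8.0_151.jdk/Contents/Home"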

Unfortunately, by this time I had spent way too much time just getting Netbeans installed and so had to put it aside to work on other things.

To be continued…