Synchronizing Data During Automated Testing with DB Ghost
We often develop complex applications that require a large number of automated tests, and a good chunk of these tests involve making changes to the test data stored in the application’s database. To ensure the integrity and reliability of each test, the database has to be returned to its original state before each test is run, discarding any data changes made by a previous test. Our existing database deployment manager performed this task accurately; however, given the large and growing number of automated tests the application required, we went looking for a way of making the process faster.
While investigating the feasibility of using another tool for this job, we came across a piece of software called DB Ghost Change Manager. This program provides the functionality not only to compare two databases, but also to synchronize the target database with the source database. As this process would only involve comparing the source and target databases and updating the latter based solely on its differences from the former, it seemed like a much faster way of restoring a database to its original state than re-deploying it in its entirety.
Although the intention was for the database comparison and synchronization to happen automatically via a command-driven trigger, we first had to follow the setup process in the DB Ghost Change Manager UI to generate the files that make this possible.
The main Tasks screen provided several functions, including scripting database objects and data to files, building databases using scripts, and copying databases, but for the purposes of resetting our environment we only required the functionality to compare two databases to identify the differences, and to make one database the same as another. The image below shows the Tasks screen with these items selected.
Clicking Next takes you to another screen, where you choose the databases to compare. The server name, authentication details, and the name of the database must be supplied in order to progress. Our automated tests only made changes to the data, rather than the structure, of the database, so it was only necessary to select “Compare Database Data” from the further options below the database selection.
The next screen provides the option to select what data should be compared. As we required the entirety of our data to be compared, we only had to ensure that three options were selected: Check for missing rows to insert; Check for different rows to update; Check for extra rows to delete. This ensures that any data found in the source database is inserted into the target database, and that any updated or deleted rows are reverted to their original ‘source’ state. Clicking Next on this page displays a Create Files page, where a log file can be specified to record the results of the database compare and synchronize. Clicking Next again takes you to a ‘Final Process Review’.
This page allows you to see a review of the tasks which will be performed by the DB Ghost Change Manager. Clicking ‘Create CMD line’ at the bottom of the page provides the option to create a DB Ghost Change Manager settings file and a batch file. It is the settings file that we used to incorporate the compare and synchronize functionality into our application.
The generated settings file is essentially an XML file consisting of the various configuration settings required for a database comparison and synchronization, and can be edited in any text editor. Once the file was generated, it was simple enough to re-use it, changing the configuration settings to suit different databases and environments, rather than having to follow the UI process again. This was convenient because, given the ever-changing structure of the application, new tables and columns were likely to be added to the database in future changes to the system. The generated file initially contained details for each of the tables to be included in the synchronization; however, it was clear that any tables added to the database later would not be included in the process unless a new settings file was generated using the Change Manager UI. We therefore tweaked the settings file by removing the entries in the ‘UserSuppliedTableList’ element, which meant that all tables in the source database would be compared.
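The exact schema of the settings file is defined by DB Ghost, so the fragment below is only an illustrative sketch: apart from ‘UserSuppliedTableList’, the element names are assumptions rather than the real schema. The key change was leaving the table list empty so that every table in the source database is included in the comparison:

```xml
<!-- Illustrative sketch of a DB Ghost Change Manager settings file.
     Element names other than UserSuppliedTableList are assumptions. -->
<DBGhostSettings>
  <CompareDatabaseData>True</CompareDatabaseData>
  <CheckForMissingRowsToInsert>True</CheckForMissingRowsToInsert>
  <CheckForDifferentRowsToUpdate>True</CheckForDifferentRowsToUpdate>
  <CheckForExtraRowsToDelete>True</CheckForExtraRowsToDelete>
  <!-- The generated file listed each table explicitly here; removing the
       entries causes all tables in the source database to be compared. -->
  <UserSuppliedTableList />
</DBGhostSettings>
```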
To incorporate the Change Manager into our application, we created a new project in our solution called DatabaseCompareAndSynchronize. This included a class called ‘Deployer’, which read the locations of the Change Manager console application and the Change Manager settings file from the web.config of the main project. The console application path was set as the FileName of the Process’s StartInfo, with the settings file passed as an argument. The process was then started and its output written to a log file. A successful compare and synchronize returned a response with a 200 HTTP status code, while any exception thrown during the procedure caused it to return a 500 HTTP status code along with the exception’s message. A DLL was then built from the project and referenced in the rest of the solution.
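A minimal sketch of such a handler is shown below. The appSettings key names and the log-file path are assumptions for illustration, and error handling is reduced to the essentials described above:

```csharp
using System;
using System.Configuration;
using System.Diagnostics;
using System.IO;
using System.Web;

namespace DatabaseCompareAndSynchronize
{
    // Sketch of the Deployer handler; config key names are assumptions.
    public class Deployer : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            try
            {
                // Paths to the Change Manager console app and settings file
                // are read from the main project's web.config.
                var exePath = ConfigurationManager.AppSettings["ChangeManagerExePath"];
                var settingsPath = ConfigurationManager.AppSettings["ChangeManagerSettingsPath"];

                var process = new Process();
                process.StartInfo.FileName = exePath;
                process.StartInfo.Arguments = "\"" + settingsPath + "\"";
                process.StartInfo.UseShellExecute = false;
                process.StartInfo.RedirectStandardOutput = true;
                process.Start();

                // Capture the console output to a log file, then wait for exit.
                string output = process.StandardOutput.ReadToEnd();
                process.WaitForExit();
                File.WriteAllText(context.Server.MapPath("~/App_Data/dbghost.log"), output);

                // A successful compare and synchronize returns a 200.
                context.Response.StatusCode = 200;
            }
            catch (Exception ex)
            {
                // Any failure surfaces as a 500 plus the exception message.
                context.Response.StatusCode = 500;
                context.Response.Write(ex.Message);
            }
        }
    }
}
```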
The system.webServer section of our web.config file included a new custom HTTP handler that responds to requests to any resource with the extension ‘.dbg’. This handler was defined as the class ‘DatabaseCompareAndSynchronize.Deployer’ in the assembly ‘DatabaseCompareAndSynchronize’.
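For IIS in integrated pipeline mode, the registration in system.webServer looks roughly like the fragment below; the value of the name attribute is an assumption, while the path and type reflect the handler described above:

```xml
<system.webServer>
  <handlers>
    <!-- Routes any request ending in .dbg to the Deployer class.
         The name attribute is an assumption. -->
    <add name="DbGhostDeployer"
         path="*.dbg"
         verb="*"
         type="DatabaseCompareAndSynchronize.Deployer, DatabaseCompareAndSynchronize" />
  </handlers>
</system.webServer>
```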
Once these settings were in place, all that was required to run this process at the end of an automated test was to create a web request pointing to the root URL followed by any path ending in ‘.dbg’. Checking the HTTP status code returned in the response allowed us to determine whether or not the database comparison and synchronization was successful.
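Triggering the reset from test teardown might then look like the sketch below; the ‘reset.dbg’ path is a placeholder (any path with the ‘.dbg’ extension is routed to the handler):

```csharp
using System;
using System.Net;

// Sketch of the test-teardown call; the URL path is a placeholder.
public static class DatabaseReset
{
    public static bool CompareAndSynchronize(string rootUrl)
    {
        // Any path ending in .dbg is routed to the Deployer handler.
        var request = (HttpWebRequest)WebRequest.Create(rootUrl + "/reset.dbg");
        try
        {
            using (var response = (HttpWebResponse)request.GetResponse())
            {
                // 200 indicates the compare and synchronize succeeded.
                return response.StatusCode == HttpStatusCode.OK;
            }
        }
        catch (WebException)
        {
            // A 500 (carrying the exception message) surfaces as a WebException.
            return false;
        }
    }
}
```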
The compare and synchronize process was around a minute faster per test than our existing database re-deployment operation. Before the implementation of DB Ghost, the overall process of building and deploying the application and running all of the automated tests took, on average, 6.8 hours to complete. The post-implementation average run time for the build was 5.1 hours, a saving of 1.7 hours. This might not seem like a large enough saving to justify the purchase of a licence for the software, but as the number of automated tests grows, the approach will become increasingly cost-effective compared with the existing setup.