My initial task at Gliffy was to develop the integration API (which is almost ready!). That quickly morphed into me taking primary ownership of the back-end code (this is the part of Gliffy that you don't see; the part that stores diagrams, manages users, creates JPGs of your drawings, etc.). After some initial planning and exploration of the code base, I suggested a few under-the-hood changes that might make the task a bit smoother, at the cost of a slightly longer delivery time. Clint, Chris and I discussed some options and we agreed to replace the SQL-based database layer with an object/relational mapping system (this is the cool thing about working for a small organization; decisions can be made quickly and easily, and there's no paperwork :)

Feeling pretty good about the changes, I was suddenly hit with the Fear of Breaking Something. This is why we had to have COBOL programmers come out of hibernation to fix the Y2K problem in 20 year old code; 15 years prior, The Fear had kept that code alive.

The good news is that this Fear can be alleviated with an army of automated tests. The bad news was that we didn't have such an army. We did, however, have some marching orders: Clint had created a detailed set of test scripts that testers had used for previous releases. While these weren't code, just "recipes" for using Gliffy, they were a huge help in creating some automated test cases.

Adapting a technique I'd used on a previous project, I figured I could record myself executing Clint's tests and then play them back via a script. Using those recorded tests, I could rid myself of the Fear (or, at the very least, turn it into the Shame of Test Failure, which is much better than the Disgrace of Bringing Down the Website).

It turns out, Gliffy's layered architecture made it really easy to insert some code to record the tests. I just needed a small bit of code to record the requests that the Flash application was making, as well as the information the server was sending back.
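The post doesn't show the actual recording code, but the idea can be sketched in a few lines. Assume each round trip between the Flash client and the server can be captured as a pair of strings; in a real Java web stack this would live in a servlet filter that wraps the request and response, but here the transport is abstracted away (all names below are hypothetical, not Gliffy's real classes):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: capture each request/response pair as a human
// walks through the manual test script. A servlet Filter would call
// record() once per round trip; the transport details are omitted here.
public class TestRecorder {
    // One recorded exchange: what the client sent and what came back.
    public static final class Exchange {
        public final String request;
        public final String response;
        public Exchange(String request, String response) {
            this.request = request;
            this.response = response;
        }
    }

    private final List<Exchange> exchanges = new ArrayList<>();

    // Append one round trip to the recording.
    public void record(String request, String response) {
        exchanges.add(new Exchange(request, response));
    }

    // The full recording, in the order the requests were made.
    public List<Exchange> getExchanges() {
        return exchanges;
    }
}
```

The recording then becomes the test fixture: the ordered list of exchanges is saved off and later replayed against the server.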

The second part was to write another piece of code that would essentially pretend to be a web browser; it would read my recorded tests and send the same information to the server that the Flash application did. If my pseudo-browser got back the same data that Gliffy did when I recorded the test, I knew I hadn't broken anything.
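The playback side can be sketched the same way. To keep the sketch self-contained, the "server" is abstracted as a function from request to response rather than a live HTTP client; the replay loop re-sends each recorded request and collects the ones whose live response no longer matches the recording (again, all names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.UnaryOperator;

// Hypothetical sketch of the playback "pseudo-browser": replay each
// recorded exchange and report the requests whose live response
// differs from the recorded one. In practice the server would be
// reached over HTTP; here it is just a request -> response function.
public class TestPlayback {
    // Each recorded exchange is a two-element array: {request, expectedResponse}.
    public static List<String> replay(List<String[]> recorded,
                                      UnaryOperator<String> server) {
        List<String> failures = new ArrayList<>();
        for (String[] exchange : recorded) {
            String request = exchange[0];
            String expected = exchange[1];
            String actual = server.apply(request);
            if (!expected.equals(actual)) {
                failures.add(request);  // this request's behavior changed
            }
        }
        return failures;
    }
}
```

An empty failure list means the refactored back end still answers every recorded request the way the old one did.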

It didn't go quite that smoothly, as I had to make my tests a bit smarter, so they could ignore things like timestamps and database keys, but ultimately, it worked out great. I was able to get rid of almost all usage of SQL and replace it with calls to the Java Persistence API (JPA), which should serve us well for the foreseeable future. As a bonus, Chris was able to use these tests to remove Struts almost entirely, which not only simplifies our deployment, but significantly reduces the size of the plugin for Confluence.
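The "smarter tests" part boils down to normalizing responses before comparing them: fields that legitimately change between runs (timestamps, database-generated keys) are masked out on both sides, so only meaningful differences fail the test. A minimal sketch, with field names and formats that are purely illustrative rather than Gliffy's actual wire format:

```java
// Hypothetical sketch: mask volatile fields before comparing a live
// response against a recorded one, so that timestamps and generated
// database keys don't cause spurious test failures.
public class ResponseNormalizer {
    // Replace volatile attribute values with a fixed placeholder.
    public static String normalize(String response) {
        return response
            // e.g. timestamp="2008-06-02T12:00:00" -> timestamp="*"
            .replaceAll("timestamp=\"[^\"]*\"", "timestamp=\"*\"")
            // e.g. id="12345" -> id="*"
            .replaceAll("\\bid=\"\\d+\"", "id=\"*\"");
    }

    // Equality check that ignores the masked fields.
    public static boolean sameIgnoringVolatileFields(String a, String b) {
        return normalize(a).equals(normalize(b));
    }
}
```

The playback comparison then uses `sameIgnoringVolatileFields` instead of plain string equality, so a re-run against a fresh database still passes even though every row has new keys.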

Interested in the long technical details? They are all on my personal blog.