Recently, I was asked to head up a two-man team to deliver our Solr-based search infrastructure, moving search off our existing, creaking database.
It was a great project that was really enjoyable to work on, and it was delivered quickly and successfully. I just wanted to say a few words about why.
Focused Kanban Board
The first job was to create our own Kanban board, with a work-in-progress (WIP) limit of 1. This let us pair on one job at a time, helping maintain focus and making sure knowledge of each piece of work was adequately shared.
The second job was to take the existing search jobs from the backlog and analyse them to see whether we could split them in any way. We used yellow cards for large tasks that had not yet been split, and blue cards for each sub-task, which we would then work on within our WIP limit. Blue cards could only be created once their yellow card had moved into our ‘In Development’ column, unless there was a very good reason. Again, this helped maintain focus.
The result was that we got work out very quickly indeed, as we could focus clearly on one job at a time; if any blockers occurred, we would stop until the problem was resolved, or re-prioritise our other blue cards. Another plus was that it was easy for other members of staff to see exactly what we were working on, as our board was not littered with noise.
Finally, we kept our usual pink cards for production bugs, which would always have to jump the queue. Luckily we were able to keep these to a minimum.
Clearly defined roadmap
Another great help was that we had a very clearly defined brief: simply put, we needed to replace the current search system with a better one. The acceptance criteria were clearly laid out, easy to understand, and easy to build initial tests from, leading to a very successful acceptance-test-led project. This meant we could deliver manageable slices of the new Search-Api, endpoint by endpoint, and get quick feedback in the real world.
This allowed us to focus on delivering the work from a technical point of view, without worrying overly about usability issues. As long as we adhered to the acceptance criteria, we were OK.
Clean, test-driven, modular code from the off
As this was an entirely new project, existing as a loosely coupled ‘sub’ API called over HTTP from our existing API service, we were able to work in a largely greenfield way. We used OpenRasta to provide an immediate RESTful interface around the Solr-backed search, which lets you work very quickly. It also encourages small, loosely coupled classes, which again helps with maintainability and speed of development.
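The real service was written in C# on OpenRasta, but the heart of such a service is a thin translation layer that turns an API request into a Solr select query. As a rough sketch (in Python here for brevity; the `tracks` core and `title` field are hypothetical, not from our actual schema):

```python
from urllib.parse import urlencode

def build_solr_select_url(base_url: str, core: str, term: str,
                          field: str = "title", rows: int = 10) -> str:
    """Build the URL for a Solr select query against a given core.

    The field name and defaults are illustrative only; a real schema
    defines its own searchable fields and request handlers.
    """
    params = urlencode({
        "q": f"{field}:{term}",  # search a single field for the term
        "rows": rows,            # cap the number of results returned
        "wt": "json",            # ask Solr for a JSON response
    })
    return f"{base_url}/solr/{core}/select?{params}"

# The endpoint handler would then fetch this URL over HTTP and
# reshape Solr's JSON response into the API's own contract.
url = build_solr_select_url("http://localhost:8983", "tracks", "mozart")
```

Because the translation is a pure function, it is trivially unit-testable without a running Solr instance, which is part of what keeps classes small and tests fast.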
Builds and deployments were handled by our usual TeamCity build system, with a stripped-down set of build projects and no release branch. We used the concept of pinning the latest stable build to deploy into our near-live environment, and once signed off we could deploy straight into live. This enabled us to release small changes often.
The whole project was properly test-driven from the off, with a focus on keeping the tests fast. Our acceptance tests used a dedicated Solr instance, injecting records at test-fixture setup time so that each set of tests ran against a known data set. Solr is lightning quick, and this method of setting up our test environment on the fly and tearing it down at the end allowed us to write extensive, robust tests whilst maintaining running speed. All of which gave us heightened release confidence.
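Our tests were C#/NUnit fixtures, but the setup/teardown pattern is language-neutral. As a sketch (Python; the core URL is hypothetical, and the HTTP call is injected so the transport can be faked in unit tests), the fixture posts known records to Solr's update handler before the tests run, then deletes everything afterwards:

```python
import json

class SolrFixture:
    """Sketch of a per-fixture Solr setup/teardown against a dedicated
    test core. `post` is an injected callable (url, json_body) so the
    HTTP transport can be swapped for a fake when testing the fixture
    itself."""

    def __init__(self, core_url: str, post):
        # commit=true makes the injected docs visible to queries at once
        self.update_url = f"{core_url}/update?commit=true"
        self.post = post

    def setup(self, docs):
        # Index the known records the acceptance tests will query.
        self.post(self.update_url, json.dumps(docs))

    def teardown(self):
        # Wipe the core so the next fixture starts from a clean slate.
        self.post(self.update_url, json.dumps({"delete": {"query": "*:*"}}))
```

Because each fixture owns its data for its whole lifetime, tests stay independent of one another, and the delete-all teardown keeps the dedicated instance clean between runs.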
Since switching the functionality over to the new service, we have been able to deal easily with our recent upsurge in demand, with incredibly fast response times (<25ms) and much more relevant search results!
Next stop, our entire track catalogue gets the Solr treatment.