
Improving code quality with Sonar and Jenkins

Last year, we started an ambitious internal project, and since we were starting more or less from a greenfield, we thought it was a good opportunity to test and integrate new development methodologies. At the time we were already using Jenkins and Sonar, but we felt we could get more out of those tools, and that they were definitely not the only things to consider to achieve our goal.

What is code quality?

The first step is to define precisely what code quality means for us and how we can measure it. We decided to focus on these areas:

  • Duplications (copy-pasted code)
  • Rule violations (static analysis)
  • Code coverage
  • Documentation (API)
  • Package tangles
  • Compiler warnings
  • TODOs, FIXMEs and other reminders in the code

From a tool standpoint, Jenkins can easily track compiler warnings and open tasks (TODOs, etc.). Sonar can manage all the rest, and much more.
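
To make this concrete, here is a small, entirely hypothetical Java class that would trip several of these checks at once: a compiler warning, an open task and a documentation gap (the class name and logic are made up for illustration):

    import java.util.Date;

    public class OrderReport {

        // Open task: the Jenkins task scanner picks up TODO/FIXME markers.
        // TODO: move the date formatting into a shared utility class.

        // Compiler warning: Date(int, int, int) is deprecated, so javac
        // flags this line when building with -Xlint:deprecation.
        private final Date createdOn = new Date(112, 0, 1);

        // Documentation gap: a public method without Javadoc lowers the
        // API documentation metric tracked by Sonar.
        public String formatTotal(double total) {
            return String.format("%.2f EUR (created %s)", total, createdOn);
        }
    }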

So where do we go from there? Our team uses several agile practices, in particular the daily stand-up, iteration-based development, and the iteration review and retrospective. The end goal is not the measurement itself; it is controlling how that measurement evolves. Everything starts with an initial measurement that gives an overview. Alongside it, we need guidelines that define what the objectives are and how to achieve them. These guidelines should improve over time and adapt to new requirements.

At the end of an iteration (a development cycle), we measure the quality, and if not all quality criteria are met, we agree on a set of actions for the next one. Of course, this only works if those actions are known to the whole team and every single team member embraces them.

How do we measure code quality?

Measuring quality is about applying the concept of Continuous Inspection. Just like continuous integration makes sure your project builds and all tests pass, continuous inspection is about looking at your code and notifying you when its quality decreases. Both Jenkins and Sonar can show a trend of what is going on over time.

The plugins we use in Jenkins provide a history of compiler warnings and open tasks.

The problem with those graphs is that you need to keep enough builds to get a proper history. By default (especially for the Maven project style), Jenkins stores a lot of data for each build, but you can easily fix that by checking the Disable automatic artifact archiving advanced option in the build section of the project configuration. Even then, Jenkins does not play nice with a big job history, since it seems to parse the whole history at startup. To work around that, you have to either discard old builds or keep only the latest X builds. In both cases you are back to the original problem: the history of your project is lost once you reach that threshold. Luckily, there is a solution. In our case, we eventually want to keep one build per iteration, so we use the Keep this build forever action on the last build of every iteration. That way the graphs keep a deeper history.
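
If marking those builds by hand becomes tedious, the same flag can be set through the Jenkins model API. The sketch below uses Java syntax and is meant to run inside Jenkins (for instance from the script console, which accepts it almost verbatim); the job name myproject is a placeholder:

    import hudson.model.Job;
    import hudson.model.Run;
    import jenkins.model.Jenkins;

    // Mark the latest successful build of a job as "keep this build forever"
    // so the build retention policy will not discard it.
    Job job = Jenkins.getInstance().getItemByFullName("myproject", Job.class);
    Run build = job.getLastSuccessfulBuild();
    build.keepLog(true); // same effect as the "Keep this build forever" action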

Sonar has a wonderful feature called the differential dashboard, which basically allows you to diff two quality snapshots. As with Jenkins, Sonar purges those quality snapshots according to a configurable policy, but it also allows you to name one particular snapshot, either by assigning a version to it or by attaching an event. What we do at the end of each iteration is assign the name of the iteration to its latest quality snapshot.

To do so, go to the project in Sonar and, in the left menu, choose the History element under the Configuration section. Then pick the quality snapshot and click the create button in the version column. Once you have done that, you can register that version so that it becomes available in the Time changes drop-down list: go to Settings > Differential views and fill in the name of your version in one of the period fields.

Once Sonar is configured that way, we are able to diff the current state of the project against the last iteration or the one before it.
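
The differential views live in the web UI, but the underlying numbers can also be pulled programmatically, which is handy for ad-hoc reporting. As a sketch, assuming the Sonar Java web-service client (sonar-ws-client), something like this fetches a few of the metrics listed earlier; the server URL and project key are placeholders:

    import org.sonar.wsclient.Sonar;
    import org.sonar.wsclient.services.Resource;
    import org.sonar.wsclient.services.ResourceQuery;

    public class QualitySnapshot {
        public static void main(String[] args) {
            // Connect to the Sonar server (placeholder URL).
            Sonar sonar = Sonar.create("http://localhost:9000");
            // Fetch the latest measures for a project (placeholder key).
            Resource project = sonar.find(ResourceQuery.createForMetrics(
                    "org.example:myproject",
                    "coverage", "violations", "duplicated_lines_density"));
            System.out.println("Coverage:     " + project.getMeasureValue("coverage") + "%");
            System.out.println("Violations:   " + project.getMeasureValue("violations"));
            System.out.println("Duplications: " + project.getMeasureValue("duplicated_lines_density") + "%");
        }
    }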

How do we improve code quality?

Properly configured tools are not enough to improve the overall quality of a project, but a feature like the differential view in Sonar definitely gives you very good leverage towards that goal. On top of these tools, we need Key Performance Indicators (KPIs), guidelines, rules and conventions. But most important of all, we need to do all this as a team. This is why we have incrementally built a team agreement page where we write down the things that we have openly discussed and agreed upon.

Concretely, we have the following code quality checks during the iteration:

  • A Jenkins mail notification (using the email-ext plugin) that provides a summary of what is going on: commits since the last build, failing tests and the relevant excerpt of the console log
  • Sonar notifications that are sent when new violations are introduced in the project
  • A manual email and/or a reminder at the stand-up that any team member can trigger if they feel something needs our attention
  • The iteration retrospective where we look at Jenkins and Sonar to figure out together how the iteration went and determine if concrete actions are needed for the next iteration

If the quality objectives are not met, each team member gets a concrete objective for the next iteration: fix 10 items. An item is a generic concept: it can be one rule violation, one compiler warning, one open task, one package tangle, one copy-pasted block, 0.1% of code coverage or 0.1% of API documentation. Obviously, this may sound unfair and poorly balanced (fixing a package tangle is probably much harder than fixing a simple rule violation), but on the other hand the simple items never last until the end of the iteration. This gives the team a concrete objective, and nobody (including me as team lead) cares about who does what as long as we meet our team objective (6 team members = 60 items to fix).
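
For illustration only, here is a hypothetical sketch of that bookkeeping (the helper's name and signature are made up): every unit counts as one item, and each 0.1% of coverage or API documentation counts as one item too:

    public class IterationObjective {

        // Convert the remaining quality deltas into a number of items.
        // Each 0.1% of coverage or API documentation is worth one item.
        static int items(int violations, int warnings, int openTasks, int tangles,
                         int duplicatedBlocks, double coveragePct, double docPct) {
            return violations + warnings + openTasks + tangles + duplicatedBlocks
                    + (int) Math.round(coveragePct / 0.1)
                    + (int) Math.round(docPct / 0.1);
        }

        public static void main(String[] args) {
            // 6 team members x 10 items each = 60 items for the iteration.
            System.out.println("Team target: " + (6 * 10) + " items");
            System.out.println("Backlog: " + items(25, 10, 8, 1, 4, 0.8, 0.4) + " items");
        }
    }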

There are also things that a tool will have a hard time tracking: conventions. And code quality is, among other things, about consistency. To make sure we do the same thing the same way, we have built a set of rules & conventions pages. These pages are also built incrementally and are based on concrete issues and/or inconsistencies in the code. They are invaluable when we have to remember what we decided years ago or when a new teammate joins the team.

This process is also continuously improved: any team member can write a proposal or request a change in the Sonar configuration. Because each proposal is a fully linkable page on our Wiki, we can discuss it together at the stand-up, at the iteration retrospective or in a dedicated meeting, depending on its complexity. Discussing such things improves our understanding of what others see as good code and why: it helps us improve our code quality in general, even for aspects that are not tracked by the tools.

Wrapping up

Tools like Sonar and Jenkins really helped us improve the quality of our code on a daily basis. Sonar is very powerful yet easy to configure, and the differential dashboard is a feature that makes a huge difference. But having good tools is not enough: we need to make sure that the way we write code and the design principles we apply are consistent. Making that an open process, with a formalized trace on the Wiki, helped us as a team to achieve our goal.

