Among the admirable hallmarks of software developers is that they always write tools to automate the boring or repetitive stuff. Tools can certainly assist in the code review process -- particularly in ensuring that the code adheres to corporate style, collecting metrics and applying departmental programming policies -- but some caution that you shouldn't depend overmuch on them.
In particular, look at automated tools as a way to help collect and manage information before or during a code review, or to learn the structure of the software -- not to do the analysis for you. "Software will help the code review process, but initially your investment will be in trimming down the rule set to get the most value out of the tool," says Micheal Lalande, director of technology at QLogitek, a SaaS supply chain solution provider. "Work with the smallest set of rules possible, so that you can tune out the noise. As you tune the rule set, you will find that software will become better and better within the organization."
The tools exist to automate the things that computers do better, such as checking naming conventions or comments, so developers don't have to waste their time on those details. "Some of these things can be caught by tools like FXCop, for instance, so a senior programmer and a group of peers should not be going over them," says Christopher Buchino, director of software engineering at GotVMail Communications.
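To illustrate the kind of mechanical check Buchino has in mind, here is a minimal sketch in Python. The convention and the helper function are invented for illustration; FXCop itself targets .NET assemblies and does far more, but the principle is the same: a machine, not a reviewer, verifies naming rules.

```python
import re

# Functions in Python are conventionally snake_case; this pattern encodes
# that rule so a tool, not a human reviewer, can enforce it.
SNAKE_CASE = re.compile(r"^[a-z_][a-z0-9_]*$")

def naming_violations(source: str) -> list:
    """Return the names of functions that break the snake_case convention."""
    violations = []
    for match in re.finditer(r"^\s*def\s+(\w+)\s*\(", source, re.MULTILINE):
        name = match.group(1)
        if not SNAKE_CASE.match(name):
            violations.append(name)
    return violations

code = """
def ProcessOrder(order):
    pass

def ship_order(order):
    pass
"""
print(naming_violations(code))  # ['ProcessOrder']
```

A real linter does this over whole source trees with configurable patterns, but even this toy version shows why a roomful of senior engineers should not spend meeting time on casing rules.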
J. Schwan, managing partner of Solstice Consulting, points out that while software can help automate the creation of unit test cases and measure code coverage during unit testing, the tools can't replace code reviews. "There aren't any tools to my knowledge that can offer perspectives on opportunities for code reusability and efficiency. This is where the human mind prevails over the CPU and should be leveraged accordingly," says Schwan.
Or to put it in different terms: Someone once asked Fats Waller, in the early days of boogie-woogie piano, how he kept the right hand from doing what the left hand was doing. He said, "That's what I'm in the middle for."
But do turn to the tools where they can help the team. Lalande suggests you run all code through an automated tool as often as possible. "This will handle the low-hanging fruit, and will train developers in best practices for development. Focus this automated review on the rule sets that you care about, as most code review tools cover more non-functional requirements than you are concerned about in your project." Doing so reduces the size of the report and provides more relevant results. Plus, he says, "Review all automated test results before making changes. You cannot take the results of an automated code review tool as gospel. Since these tools are designed to cover general cases, there may be specific reasons that you are breaking those rules."
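Lalande's advice about focusing on the rules you care about amounts to a filter over the tool's findings. The sketch below shows the idea in Python; the rule IDs and the `(rule_id, file, message)` record format are invented for illustration, since every real tool has its own report format.

```python
# Only findings whose rule ID appears here survive into the review report;
# everything else is noise for this project. The IDs are hypothetical.
APPROVED_RULES = {"naming", "unused-variable", "security"}

def trim_report(findings):
    """Keep only findings whose rule ID is in the approved set."""
    return [f for f in findings if f[0] in APPROVED_RULES]

findings = [
    ("naming", "orders.py", "function ProcessOrder is not snake_case"),
    ("line-length", "orders.py", "line 120 exceeds 79 characters"),
    ("security", "auth.py", "possible hard-coded password"),
]
for rule, path, message in trim_report(findings):
    print(f"{path}: [{rule}] {message}")
```

In practice you would do this trimming in the tool's own configuration file rather than in code, but the effect is the same: a smaller report that developers will actually read.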
Naturally, the people we spoke with whose companies sell these tools see a major role for them in the process. (That's okay. These guys only put kibble in their kitties' bowls when they create software you're willing to pay for. They have reason to lie awake at night thinking about how to make their software useful.)
Jason Cohen, founder of Smartbear Software, explains, "I do not believe a tool 'makes' you find more bugs or builds a team. But I do believe a tool can remove busywork and mundane tasks so code review is more efficient and therefore people like it more, and also maybe then they'll actually do it."
Doug Carrier, the Compuware Devpartner product manager, points out that automated code reviews subject everyone's code to the same set of rules regardless of who wrote the code. (That might be a benefit in avoiding political disharmony; it's not us criticizing your code, it's the software!) "Developers who have on-demand access to the same static analysis technology that is used to review their code during an automated build/test cycle can dramatically improve code compliance and maintainability while minimizing downstream problems," says Carrier. "Such problems can be disruptive to the business and very expensive to fix once an application goes live. Supplementing the peer code review process with static analysis tools can also raise the confidence and proficiency of developers at all levels and can help keep projects on track!"
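Carrier's build-gate idea can be sketched in a few lines: the same check runs on demand at a developer's desk or inside the automated build, and a non-zero status fails the build. The `check_file` rule here (flagging over-long lines) is an invented stand-in for a real analyzer.

```python
def check_file(path: str, source: str) -> list:
    """A stand-in for a real static analyzer: flag lines over 100 characters."""
    return [
        f"{path}:{lineno}: line exceeds 100 characters"
        for lineno, line in enumerate(source.splitlines(), start=1)
        if len(line) > 100
    ]

def gate(files: dict) -> int:
    """Run every file through the checks; a non-zero return fails the build."""
    violations = []
    for path, source in files.items():
        violations.extend(check_file(path, source))
    for v in violations:
        print(v)
    return 1 if violations else 0

# A real build script would read files from disk and call sys.exit(gate(...)).
status = gate({"demo.py": "x = 1\n" + "y" * 120})
print("exit status:", status)  # exit status: 1
```

Because the check is identical in both places, a developer never gets a nasty surprise from the build server that they couldn't have caught locally, which is exactly the point Carrier makes.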
But it isn't only the vendors who see value in the tools. Ben Sweet, principal engineer at Lear Corporation, says, "Don't make reviewers review anything that can be automated." Code should compile successfully before it is reviewed; static source code analysis tools can review against industry-standard guidelines. "Yes, they may cost money, but a roomful of developers reviewing code is not free!" Sweet says. "Make sure that they are reviewing code that is worthy of their effort."
Tim Rosenblatt, the Agile Development Director at Cloudspace, appreciates automated testing tools because they can be run by the coder as part of their day-to-day work, and they provide an objective metric that ensures all code is held to the same standard. "We've used some automated testing tools previously, and I plan on continuing to use them. There are code review tools for most languages that do a good job, but it's important to realize that an automated tool can only go so far. Good code could still be given a bad score for reasons that a human will understand."
James Pitts, VP of development and program management at Embarcadero Technologies, says the most important role for tools in code reviews is to enforce the workflow in your bug tracking and project management systems. "Every bug and task should be reviewed," he says. "Optional code reviews eventually mean no code reviews, especially when there is a lot of pressure to get a release out and it is most important to you not to introduce problems." Plus, he says, tools can find some problems, such as language misuse or interface issues, more easily than people can. "However, I would always use both, because people can catch problems with the intended purpose and apply mentorship to less skilled developers."
Several open source tools that can automatically review code are available, Pitts says. "Commercial packages can do a better job, but something is better than nothing. Just don't ignore the warnings!" he adds.
Manoranjan (Mano) Paul, the software assurance advisor for (ISC)², which specializes in information security education and certification (its newest certification is the Certified Secure Software Lifecycle Professional), cautions that code review software detects only issues matching the patterns designed into the software before the review is conducted, and the tools don't address design issues. "This could lead to a lot of vulnerabilities in code going undetected," Paul says. Despite this, Paul says, the tools can help developers home in on the portions of code that need the most attention, as long as everyone understands they do not replace the human intellect. "Vendor claims can be trusted but must be verified," Paul says. "An effective process is to start with software to assist with code review, capture the issues, followed by a manual code review of design and code not covered by the code review software."
Still, crank up your cynicism. E. William Horne, systems architect at William Warren Consulting, advises, "Be very leery of 'magic' tools and those selling them; I've seen good men's careers ruined by dependence on magic that never materialized."
These aren't the only tools you can use, obviously. I encourage you to recommend tools you've found particularly helpful in the comments below.
Now that we have the tools in place, it's time to get people into the conference room, looking at the code and doing a proper evaluation. We cover that in How to Lead a Code Review.