Dead Simple Rails Metrics with metric_fu
Every time I create a new Rails project, I put off writing tasks to analyze the code's quality 'cause it takes time, and time is, you know, finite. So I've decided to extract some code into a Rails plugin which I call metric_fu.
It's a bunch of rake tasks that produce reports on code coverage (using Rcov), cyclomatic complexity (using Saikuro), Flog scores (using Flog), and Rails stats (using 'rake stats'). It knows when it's being run inside a CruiseControl.rb build and puts the output in the Custom Build Artifacts folder, so when you view a build you see this:
The coverage report is your standard Rcov report:
Flog output is thrown into an HTML file:
At the end metric_fu calculates the average flog score per method:
You might want to check out my previous posts on what to do with a Flog report: The Method Hit List and When You Should Ignore Metrics
Saikuro's output is the same as always:
(I changed the warning and error levels for this pic -- more on how I did that later)
And 'rake stats' is always useful:
So how do you get all these reports?
1. install Flog
sudo gem install flog
2. install rcov
sudo gem install rcov
3. install metric_fu
ruby script/plugin install \
http://metric-fu.rubyforge.org/svn/tags/REL_0_5_1/metric_fu/
(run this from the root of your Rails app)
4. rake metrics:all
That should work fine if you use standard Rails testing and like my defaults. But what if you use a combination of RSpec and stock Rails testing? The namespace isn't strictly necessary, but I like it because it makes the intent explicit. Multiple paths are useful if, like on my last project, you need to be specific about which tests to run because some tests go after external services (and the people who manage them get cranky if you hammer 'em a lot). Insert something like this into your Rakefile:
namespace :metrics do
  TEST_PATHS_FOR_RCOV = ['spec/**/*_spec.rb', 'test/**/*_test.rb']
end
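For instance (a hypothetical layout, just to illustrate the point above), if your integration specs are the ones that hit external services, you could point Rcov at only the unit-level tests:
namespace :metrics do
  # Example paths: measure coverage only from unit tests and model specs,
  # leaving out integration specs that call external services.
  TEST_PATHS_FOR_RCOV = ['test/unit/**/*_test.rb', 'spec/models/**/*_spec.rb']
end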
If you also want Rcov to sort by lines of code (loc) and use more aggressive cyclomatic complexity thresholds, do this:
namespace :metrics do
  TEST_PATHS_FOR_RCOV = ['spec/**/*_spec.rb', 'test/**/*_test.rb']
  RCOV_OPTIONS = { "--sort" => "loc" }
  SAIKURO_OPTIONS = { "--warn_cyclo" => "3", "--error_cyclo" => "4" }
end
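Since RCOV_OPTIONS looks like a plain hash of command-line flags, other Rcov switches should slot in the same way. A minimal sketch, assuming the hash is passed through to Rcov unchanged (--exclude is a standard Rcov flag, but whether the plugin forwards it as-is is an assumption on my part):
namespace :metrics do
  TEST_PATHS_FOR_RCOV = ['spec/**/*_spec.rb', 'test/**/*_test.rb']
  # Assumption: these flags are handed straight to rcov.
  # --exclude keeps framework config and vendored gems out of the coverage report.
  RCOV_OPTIONS = { "--sort" => "loc", "--exclude" => "config/,vendor/" }
end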
That's it -- I hope you find it useful. Let me know if you find a bug, and check out the project home page at:
http://metric-fu.rubyforge.org
Oh, and thanks to all my co-workers who helped write the original code, in its various forms, that became this plugin.
Update 9/22/2008 - metric_fu is now a gem, lives on GitHub, and is useful for any Ruby project. Check the home page for current information.
Comments
I attempted something similar; it was a mix between metric_fu and autotest. I called it autometric: http://benburkert.com/2007/11/9/introducing-autometric
It was a hack, and the performance hit on my system wasn't worth the realtime-iness of the metrics. If metric_fu is better performance-wise, it might be worth taking another go at it.
Like Chad, I'm interested to know whether this can be set up on the CI machine without polluting the Rails apps themselves.
One option is CC.rb plugins -- define a task like this:
task :my_metrics => ["db:migrate", "metrics:all"]
Then you can just call 'my_metrics' in the CruiseControl config file.
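To make that concrete, here's a minimal cruise_config.rb sketch, assuming CruiseControl.rb's standard Project.configure API (wiring it up this way is my reading of the comment above, not something from the post itself):
Project.configure do |project|
  # Run the migrate-then-metrics task defined above instead of the default build task.
  project.rake_task = 'my_metrics'
end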
http://silkandspinach.net/2008/09/23/reek-a-code-smells-detector-for-ruby/
Should I add it on GitHub?