[ANN] MetricsReloaded 0.1 released

Announcing the release of MetricsReloaded 0.1. MetricsReloaded is a plugin for IntelliJ IDEA which calculates and displays a large number of source code metrics for any IDEA project. MetricsReloaded is designed for use in project tracking, quality assurance, and developer training. Over 100 metrics are currently available at the project, module, package, class, and method levels. Metrics can be sorted, graphed, compared with previous metrics runs, and exported to file. MetricsReloaded is available in binary form via the Plugin Manager, and will be available in source form at www.intellij.org shortly.

MetricsReloaded is currently supported only against the EAP 4.1 version of IDEA. Indeed, MetricsReloaded has only been tested against Pallada 2008, and won't even compile against previous versions. It is possible, but by no means guaranteed, that a version of MetricsReloaded compatible with IDEA 4.0 will be released in the future.


Using MetricsReloaded

In use, MetricsReloaded acts much like the "Inspect Code..." panel built into IDEA, except that instead of finding code weaknesses, it calculates numeric values
about the code. MetricsReloaded can be run from the main menu (under Tools/Calculate Metrics...) or by right-clicking in the Project or Package views. If selected from the main menu, metrics will be calculated for all files in the project. If selected from the Project or Package views, metrics are calculated for the selected file, package, directory, or module.
Next, a panel will appear, asking whether you wish to calculate metrics for all Java files, only product files, or only test files. After you've made a selection, you are shown the main metrics configuration panel. Like the inspection panel, the metrics configuration panel lets you choose which metrics you wish to run. Also like the inspection panel, it lets you create multiple named metrics profiles, so you can keep standard sets of metrics for different uses. Finally, the metrics configuration panel allows you to configure thresholds for each metric. Metrics values which fall outside the selected threshold values will be highlighted in the metrics display. When calculation is complete, the metrics display panel will become visible. The metrics display panel consists of a tabbed set of tables, filled with metrics values. Metrics values can be sorted on any column, saved as a snapshot file, compared with values from a previous snapshot file, or exported to a comma-separated value (.csv) file. By right-clicking on a column, that metric's values may be displayed in a histogram, distribution graph, or pie chart (where appropriate). It should all be pretty obvious.

Metrics Currently Supported

Project Metrics
Comment lines of code
Comment ratio
Javadoc lines of code
Lines of code
Lines of product code
Lines of test code
Number of abstract classes
Number of classes
Number of concrete classes
Number of interfaces
Number of methods
Number of packages
Number of product classes
Number of test classes
Number of top-level classes
Number of top-level interfaces
Test ratio
Module Metrics
Comment lines of code
Comment ratio
Encapsulation ratio
Javadoc lines of code
Lines of code
Lines of product code
Lines of test code
Number of classes
Number of abstract classes
Number of concrete classes
Number of interfaces
Number of methods
Number of product classes
Number of test classes
Number of top-level classes
Number of top-level interfaces
Test ratio
Package Metrics
Abstractness
Afferent coupling
Comment lines of code
Comment ratio
Distance to main sequence
Efferent coupling
Encapsulation ratio
Instability
Javadoc lines of code
Lines of code
Lines of product code
Lines of test code
Number of abstract classes
Number of classes
Number of concrete classes
Number of interfaces
Number of methods
Number of product classes
Number of test classes
Number of top-level classes
Number of top-level interfaces
Test ratio
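
Three of the package metrics above (abstractness, instability, and distance to the main sequence) are Robert C. Martin's package-design metrics. For readers who want to check the numbers, here is a toy sketch of the arithmetic; this is an illustration only, not MetricsReloaded's actual code:

```java
// Toy illustration of Robert Martin's package-design metrics,
// as listed under "Package Metrics". Not MetricsReloaded's code.
public class MartinMetrics {
    // Abstractness A: abstract classes (and interfaces) / total classes.
    static double abstractness(int abstractClasses, int totalClasses) {
        return (double) abstractClasses / totalClasses;
    }

    // Instability I: efferent coupling Ce / (Ca + Ce).
    static double instability(int afferent, int efferent) {
        return (double) efferent / (afferent + efferent);
    }

    // Distance from the main sequence: |A + I - 1|.
    static double distance(double a, double i) {
        return Math.abs(a + i - 1.0);
    }

    public static void main(String[] args) {
        double a = abstractness(2, 10); // 0.2
        double i = instability(3, 12);  // 12 / (3 + 12) = 0.8
        System.out.println(distance(a, i)); // 0.0: on the main sequence
    }
}
```

Packages with a distance near 0 balance abstractness against stability; a large distance flags packages that are concrete and stable (rigid) or abstract and unstable (useless).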
Interface Metrics
Comment lines of code
Comment ratio
Interface size (attributes)
Interface size (operations + attributes)
Interface size (operations)
Javadoc lines of code
Lines of code
Number of dependencies
Number of dependent classes
Number of implementations
Number of subinterfaces
Class Metrics
Average number of parameters
Average operation complexity
Average operation size
Class size (attributes)
Class size (operations + attributes)
Class size (operations)
Comment lines of code
Comment ratio
Depth of inheritance tree
Javadoc lines of code
Lines of code
Maximum operation complexity
Maximum operation size
Number of attributes added
Number of attributes inherited
Number of constructors
Number of dependencies
Number of dependent classes
Number of inner classes
Number of operations added
Number of operations inherited
Number of operations overridden
Number of statements
Number of subclasses
Weighted method complexity
Method Metrics
Comment lines of code
Comment ratio
Control density
Cyclomatic complexity
Javadoc lines of code
Lines of code
Nesting depth
Number of branch statements
Number of control statements
Number of executable statements
Number of implementations
Number of loop statements
Number of method calls
Number of overriding methods
Number of parameters
Number of return points
Number of statements
Relative lines of code
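
For concreteness, cyclomatic complexity (listed under the method metrics above) is conventionally 1 plus the number of decision points in a method. The toy counter below works over raw source text, which is only a rough approximation; MetricsReloaded itself works on IDEA's parse tree, not on strings:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Crude illustration of cyclomatic complexity: 1 + number of decision
// points. A real tool walks a parse tree; keyword counting over raw
// text is only a rough approximation, shown here to make the metric
// concrete.
public class CyclomaticToy {
    private static final Pattern DECISION =
        Pattern.compile("\\b(if|while|for|case|catch)\\b|&&|\\|\\||\\?");

    static int complexity(String methodBody) {
        Matcher m = DECISION.matcher(methodBody);
        int decisions = 0;
        while (m.find()) {
            decisions++;
        }
        return 1 + decisions;
    }

    public static void main(String[] args) {
        String body = "if (a && b) { return 1; } else { for (;;) break; }";
        System.out.println(complexity(body)); // if + && + for -> 4
    }
}
```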

Future directions
Export to XML and HTML.
Ability to navigate from metrics panel to editing panel.
Kiviat diagrams.
Ability to show breakdowns of individual metric values.
And of course: More Metrics. Suggestions for additional metrics to be calculated are strongly encouraged.

Caveats
MetricsReloaded is available in beta form. This implies that there are certain to be bugs still in the code. Users of MetricsReloaded are encouraged to report bugs on the IDEA Plugins forum. Perhaps more worryingly, it is entirely possible (indeed almost guaranteed) that metrics profiles or snapshots created with a beta version of MetricsReloaded will not be readable by future versions of MetricsReloaded. Finally, MetricsReloaded relies on portions of the IntelliJ OpenAPI which have only recently been opened, and are subject to change without notice. It is entirely possible that future versions of IDEA will break MetricsReloaded. Consider yourself warned.

Terms of use
MetricsReloaded is public domain software, available in either binary or source form for any use, commercial or non-commercial. In particular, inclusion of MetricsReloaded in the core IntelliJ IDEA product is expressly allowed and desired. The graphing functionality of MetricsReloaded is provided by the open source JFreeChart library (www.jfree.org). JFreeChart is available under the GNU Lesser General Public License, which is included with the MetricsReloaded distribution. Any redistribution of MetricsReloaded or JFreeChart is expected to follow the letter and spirit of that license.

Important Admonition
The primary purpose of MetricsReloaded is to allow developers and quality assurance personnel greater insight into their code, so as to help them find weaknesses and places for improvement. As a secondary purpose, MetricsReloaded can be used by individual developers as a project management tool, to track their efforts and help them improve their estimates. The one thing MetricsReloaded shouldn't be used for, however, is development management. USING THE NUMBERS CREATED BY METRICSRELOADED FOR PURPOSES OF RANKING DEVELOPERS' OUTPUT FOR HIRING, REVIEW, PROMOTION, OR RETENTION IS DISCOURAGED IN THE STRONGEST TERMS.

Hope you like it. Let me know what you think.

--Dave Griffith


Hi!

Thanks for a good thing! :)

It would be nice to have something like a "Recommended Metrics Set": predefined metrics for different development areas that we could just choose from the metrics profiles :)

Thanks!

PS. Why "metrics reloaded", and not "metrix reloaded"? :)


Looks very like the Matrix indeed :)
Seriously, I'd suggest using IDEA's standard progress scheme, which is currently in the OpenAPI.
I think all you need is ApplicationManager.getApplication().runProcessWithProgressSynchronously(), setting the fraction and
text values on the ProgressIndicator that may be acquired via ProgressManager.getInstance().getProgressIndicator()

--
Maxim Shafirov
IntelliJ Labs / JetBrains Inc.
http://www.intellij.com
"Develop with pleasure!"


Great stuff, as always.

- The icon for the Metrics Reloaded tool window is so big that all
the tool window buttons are twice as tall as normal

- It seems like there should be a "select all / deselect all" button
on the metrics selection dialog, or maybe a way to select
"all Project Metrics" or all "Module Metrics", etc

- When "Save As..."'ing to a new profile, pressing enter in
the profile name dialog has no effect (requires clicking OK)

- it would be useful to have "autoscroll to source" and "jump
to source" actions or buttons when clicking on method,
interface, or class names.

- How about having summarizing rows in the results that would
show totals / averages / deviation / etc ? Of course for some
columns they wouldn't make sense but for example it would be
interesting to know the total number of subinterfaces in the
project directly from the "interface metrics" tab. They might
be redundant with project/module/package metrics but they
would be easy to understand.

- I miss the little arrow on the column headers to tell me which
column the table is currently sorted on.

- When hovering over the column name, a small tool-tip shows
up to shortly describe the metric. A right-click action (i.e.
"explain") that would pop up a ctrl-q-like description explaining
what the metric is all about would be useful (basically, reuse the
description available in the metrics-selection dialog)

- when a column is selected (blue), if I right-click on another
column and select histogram or distribution, I get the graph related
to the selected column, not the column I right-clicked on. I think the
selection should change on right-click.

- Are you going to have a MetricsReloadedForInspectionGadgetPlugin
that will allow us to create inspections based on metrics gathered by
MetricsReloaded? :)))

Impressive first release, by the way.

Vince.



I tried to use the standard progress indicator (really tried, as in "wasted a day on it"), but was unable to get it to work. A request for examples was met with silence. If you've got any sample code, I'd really love to use the standard, but otherwise I don't really want to bang my head against that particular wall again.

--Dave Griffith


>- The icon for the Metrics Reloaded tool window is so big that all the tool window buttons are twice as tall as normal


Doh! Shows what I get for turning off tool-window bars

>- It seems like there should be a "select all / deselect all" button on the metrics selection dialog, or maybe a way to select "all Project Metrics" or all "Module Metrics", etc

Hurm. Maybe. The thing is that I'm guessing that people will essentially never want to run all the metrics, or indeed more than a double-handful at a time. Unlike inspections, metrics bombard you with data whether your code is good or bad. I figure most of my users will set up a handful of metrics profiles, each with no more than a dozen or so metrics.

>- When "Save As..."'ing to a new profile, pressing enter in the profile name dialog has no effect (requires clicking OK).

Whoops, missed that one. Fixed in next release, promise.

>- I miss the little arrow on the column headers to tell me which column the table is currently sorted on.

Yeah, my wife complained about that as well. Before 1.0, promise.

>- When hovering over the column name, a small tool-tip shows up to shortly describe the metric. A right-click action (i.e. "explain") that would pop up a ctrl-q-like description explaining what the metric is all about would be useful (basically, reuse the description available in the metrics-selection dialog)


Good one. Should be easy.

>- How about having summarizing rows in the results that would show totals / averages / deviation / etc ? Of course for some columns they wouldn't make sense but for example it would be interesting to know the total number of subinterfaces in the project directly from the "interface metrics" tab. They might be redundant with project/module/package metrics but they would be easy to understand.

Good one. I like it, and it should be really easy. Nothing fancy here, just an extra row at the bottom in a different color.

>- when a column is selected (blue), if I right-click on another column and select histogram or distribution, I get the graph related to the selected column, not the column I right-clicked on. I think the selection should change on right-click.

One of my great learning experiences so far in this has been finding that I have a deep and abiding hatred of the JTable event model. This is one of the reasons. It's on my "to fix" list, but no guarantees.

Glad you like it.

--Dave Griffith


>Hurm. Maybe. The thing is that I'm guessing that people will essentially
>never want to run all the metrics, or indeed more than a double-handful at
>a time. Unlike inspections, metrics bombard you with data whether your
>code is good or bad. I figure most of my users will set up a handful of
>metrics profiles, each with no more than a dozen or so metrics.

That's what I figured too, but I was thinking of the metrics discovery
process.

The first thing I wanted to do with this plugin was to select all metrics,
run all of them and then investigate the results to better understand what
they mean (it's always easier to see with real numbers) and see how metrics
correlate with each other. That would help my process of constructing a few
useful profiles. Hence the need for an "explain" action (I want to
understand the metrics once I have the numbers rather than reading before
choosing metrics).

>Glad you like it.


Oh yeah. And the source would be nice, just to take a crack at porting it to
4.0 :)

Vince.



Whoops, missed one

>- it would be useful to have "autoscroll to source" and "jump to source" actions or buttons when clicking on method, interface, or class names.

Eventually there will be "jump to source". It's tougher to justify spending effort on "Autoscroll to source", simply because it makes me queasy every time I try to use IDEA with autoscroll on in other tool windows, but I could be convinced.



Okay, with your clue about runProcessWithProgressSynchronously, I was able to get it to work. 0.2 will definitely use the standard progress setup. The only weird thing is that I don't seem to be completely in control of the progress window. It looks as though any call I make to the (wicked cool) PsiSearchHelper class hijacks the progress monitor for its own nefarious purposes. While in progress, it displays the title and progress of its search, and then reverts to the metrics calculation progress when done. While this seems to be the behaviour of the inspection progress panel as well, I find it very odd.

On the plus side, it'll help a lot with my performance tuning. According to the panel, I'm spending a lot of time searching for references to private inner classes. Looks like tightening up my uses of GlobalSearchScope will be a big win. That must be what the AccessScope thing is for.

--Dave "API documentation is for wussies" Griffith


Dave Griffith wrote:

>Okay, with your clue about runProcessWithProgressSynchronously, I was able to get it to work. 0.2 will definitely use the standard progress setup. The only weird thing is that I don't seem to be completely in control of the progress window. It looks as though any call I make to the (wicked cool) PsiSearchHelper class hijacks the progress monitor for its own nefarious purposes. While in progress, it displays the title and progress of its search, and then reverts to the metrics calculation progress when done. While this seems to be the behaviour of the inspection progress panel as well, I find it very odd.
>
>On the plus side, it'll help a lot with my performance tuning. According to the panel, I'm spending a lot of time searching for references to private inner classes. Looks like tightening up my uses of GlobalSearchScope will be a big win. That must be what the AccessScope thing is for.
>
>--Dave "API documentation is for wussies" Griffith

You may prevent other activities from using the common progress indicator by running them via
ProgressManager.getInstance().runProcess(), passing null as the ProgressIndicator. I'd recommend passing some meaningful info
into the original ProgressIndicator, though. This is how inspection works. It runs any search activities with a null
ProgressIndicator and sets a "Searching for xxx" message into the original one at the same time.

--
Maxim Shafirov
IntelliJ Labs / JetBrains Inc.
http://www.intellij.com
"Develop with pleasure!"


It's not working for me. I run the metrics, and it says "analyzing", which looks very promising, but I never see any output; instead I see the exception below in the log.

I am using IDEA 4.0.3 (build 1176)

Thanks,
Ilya

java.lang.NoClassDefFoundError: com/intellij/openapi/util/Pair
at com.siyeh.metrics.ui.table.MetricTableModel.sort(MetricTableModel.java:162)
at com.siyeh.metrics.ui.table.MetricTableModel.<init>(MetricTableModel.java:35)
at com.siyeh.metrics.ui.MetricsDisplay.loadTable(MetricsDisplay.java:93)
at com.siyeh.metrics.ui.MetricsDisplay.setMetricsResults(MetricsDisplay.java:51)
at com.siyeh.metrics.MetricsPluginImpl.showToolWindow(MetricsPluginImpl.java:107)
at com.siyeh.metrics.ui.MetricsConfigurationPanel$8.run(MetricsConfigurationPanel.java:341)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:171)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:454)
at com.intellij.ide.q.b(q.java:36)
at com.intellij.ide.q.a(q.java:136)
at com.intellij.ide.q.dispatchEvent(q.java:48)
at java.awt.EventDispatchThread.pumpOneEventForHierarchy(EventDispatchThread.java:201)
...



Due to recent OpenAPI changes, MetricsReloaded currently works only with Pallada EAP 2008. Back-porting to 4.0 should be possible, but is not in scope at this time.

--Dave Griffith


Hi Dave,

Once again, another great plug-in for IDEA.

I'm not big on metrics myself, but those to whom I report find them useful.

One question I have is about the test classes. How does the tool
differentiate between a product class and a test class? I know I've got
lots of test classes, but MetricsReloaded is reporting 0 test classes.

Thanks,
Ted



Now I don't often say this, but this is rather cool. Sure, it's rough around the edges, but it's exactly the sort of plugin that makes selling the whole idea of plugins a lot more realistic ;)


Test classes are classes in test source roots. If you put all of your test classes in the same source root as your product code, you'll see 0 test classes. Otherwise, it's a bug. I don't check for subclasses of TestCase, for instance. It's all in how your project paths are set up.

--Dave


High praise indeed. I've never heard what you think of InspectionGadgets or IntentionPowerPack, but I'll take what I can get.

--Dave Griffith


Hi,

Thanks for the plugin.

I found a small bug in the CSV export: if a number is rendered with a comma (for example in the German locale), the value should be wrapped in
a text qualifier, for instance quotation marks. It would be preferable to have the value separator and the text qualifier configurable.
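
Stefan's point, sketched in code: quote any CSV field that contains the separator, since locales such as German render numbers with a decimal comma. The class and method names here are hypothetical, not MetricsReloaded's API:

```java
import java.util.Locale;

// Sketch of the CSV-quoting fix Stefan describes: in locales such as
// German, numbers are rendered with a decimal comma ("3,5"), which
// collides with a comma field separator. Quote any field containing
// the separator or a quote character, doubling embedded quotes.
// Class and method names are hypothetical.
public class CsvQuoting {
    static String quoteIfNeeded(String field, char separator) {
        if (field.indexOf(separator) >= 0 || field.indexOf('"') >= 0) {
            // Double embedded quotes, then wrap the whole field.
            return '"' + field.replace("\"", "\"\"") + '"';
        }
        return field;
    }

    public static void main(String[] args) {
        String german = String.format(Locale.GERMANY, "%.1f", 3.5); // "3,5"
        System.out.println(quoteIfNeeded(german, ','));  // quoted: "3,5"
        String english = String.format(Locale.US, "%.1f", 3.5);     // "3.5"
        System.out.println(quoteIfNeeded(english, ',')); // unquoted: 3.5
    }
}
```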

Regards,
Stefan


Ahh. Is there any way you could look for class names that end in Test in
addition to looking in the test root? We organize our project so that the
test classes are in the same package as the classes being tested.

Thanks,
Ted

"Dave Griffith" <dave.griffith@cnn.com> wrote in message
news:14126637.1081824694302.JavaMail.itn@is.intellij.net...

>Test classes are classes in test source roots. If you put all of your
>test classes in the same source root as your product code, you'll see 0 test
>classes. Otherwise, it's a bug. I don't check for subclasses of TestCase,
>for instance. It's all in how your project paths are set up.
>
>--Dave





I'll put it on the TODO list, but I confess it will probably have pretty low priority. The test source root seems to be the "blessed" way of determining test classes, and I'm loath to go beyond that. I put my tests in the same package as well, but in a different directory tree.

--Dave Griffith


In article <24732843.1081961629548.JavaMail.javamailuser@localhost>,
Dave Griffith <dave.griffith@cnn.com> wrote:

>I'll put it on the TODO list, but I confess it will probably have pretty low
>priority. The test source root seems to be the "blessed" way of determining
>test classes, and I'm loath to go beyond that. I put my tests in the same
>package as well, but in a different directory tree.


Same here - I have foo/source and foo/test/source in most of my projects.

Scott


Hello Dave,

Great plugin like usual.

You asked for more metrics to support. Here they are:

Halstead metrics - Effort, Difficulty, Bug Prediction -> http://www.sei.cmu.edu/activities/str/descriptions/halstead.html
Lack of cohesion of methods (LCOM) -> http://www.comp.glam.ac.uk/pages/staff/bfjones/oot/lco.htm
Coupling between objects (CBO) -> http://www.comp.glam.ac.uk/pages/staff/bfjones/oot/cbo.htm
Response for class (RFC) -> http://www.comp.glam.ac.uk/pages/staff/bfjones/oot/rfc.htm

Others that I have not used yet but sound interesting are
Essential Complexity -> http://www.sei.cmu.edu/activities/str/descriptions/cyclomatic.html
Extended Cyclomatic Complexity
Maintainability Index -> http://www.sei.cmu.edu/activities/str/descriptions/mitmpm_body.html

There are several tools that compute these and other interesting metrics. You might want to take a look at these
http://www.virtualmachinery.com/jhawkmetrics.htm
http://www.powersoftware.com/em/

Heh. You should have known when you asked... ;)

If you want, we have built an HTML reporting site using XSLT from the XML output of Essential Metrics. The Emetrics XML is straightforward.
Email me directly if you are interested.

The other thing you might want to consider is trend graphs. These would be rather important in order to help people improve (incidentally, it is something we are missing in our internal tool ;)

Have fun

Jacques

But the real question is: does chicken really taste like chicken?


Cool. I already had many of these on the TODO list, but the links will be very helpful. I'd love to see your internal tool's XML format and XSLTs as well.

--Dave Griffith


I'm still hoping to see the sources show up on the wiki,
to mess around with getting it to run in 4.0..

Maybe we need a "DG's weekly status report" :)

Vince.




Sure, why not.

Changes so far on 0.2:

Total and average rows added to results tables
Switched to using stock IDEA progress panel.
Sort arrows added to column headers
"Explain" dialog (popping up the metric description from the right-click menu).
Fixes to cyclomatic complexity calculation.
Selected profile bound persistently to project
Much UI polishing (table selection, default buttons, toggling button enablement, etc).

New Metrics Added:

Number of interfaces implemented (class)
% Classes javadoced (package, module, project)
% Fields javadoced (class, interface, package, module, project)
% Methods javadoced (class, interface, package, module, project)
Total cyclomatic complexity (package, module, project)
Average cyclomatic complexity (package, module, project)
True comment ratio (all levels)
Non-comment lines of code (all levels)
Num expressions (method)
Num commands (class, interface)
Num queries (class, interface)
Num typecast expressions (method)
Extended cyclomatic complexity (method)
Essential cyclomatic complexity (method)

Additional metrics planned for 0.2:

RFC
MPC
LCOM
CBO
Fan-in
Fan-out
Exceptions thrown
Exceptions caught

Currently in mid-refactoring, after changing snapshot and profile persistence from Java serialization to JDOM. This will make it easier to store column sorting and sizing information persistently with the profile, so that metrics displays can be tuned "just like you want them" and that will persist between sessions. On the downside, snapshots and profiles from 0.1 will no longer work (you were warned).

Features deferred till at least 0.3:

Halstead Metrics
MOOD Metrics
SEI and related indices
Export to XML
Export to HTML
Kiviat diagrams
Jump to source
Drilldown

I plan to ship both Aurora and Pallada versions of 0.2, probably about two weeks from today. Source will be available for the 0.2 version onward.

--Dave Griffith


I don't know how easy this would be, or whether anyone else would find it useful, but it would be great if the exported CSV file didn't use acronyms for the column headings (or there was a key of some kind at the start/end of the file).

Also, I know someone else mentioned a "Select All" tick box - that would be useful I think, but it might be more useful if you could at least turn on/off a section of metrics.

Other than that, it's a fantastic add-on :D Keep up the good work!


>I don't know how easy this would be, or whether anyone else would find it useful, but it would be great if the exported CSV file didn't use acronyms for the column headings (or there was a key of some kind at the start/end of the file).

Already done.


>Also, I know someone else mentioned a "Select All" tick box - that would be useful I think, but it might be more useful if you could at least turn on/off a section of metrics.

Don't know how that would work, really. The Inspection panel has something like that to allow you to disable entire categories of inspection, but I've never found it all that useful.

Glad you like my toy.

--Dave Griffith


>Don't know how that would work, really. The Inspection panel has
>something like that to allow you to disable entire categories of inspection,
>but I've never found it all that useful.

That's already two people asking for it.. :)

Thanks for the update!

Vince.




Ah, I actually find that category selection quite useful in InspectionGadgets! That's exactly what I was trying to describe in my previous post.

Thanks for the update.

Simon
