Graphical integration of running tests in plugin
Hello,
I am able to run tests in the language for which I'm making a plugin. Currently I extend the CommandLineState class and implement the startProcess function which will run the process that will actually perform the tests on the command line.
However, I would also like to integrate with the visual UI part of the test framework: I would like to show a green or red status bar depending on whether a test reports failure, show the names of the failed tests (if any) at the end, and so on. I have been trying to look through some examples, notably the JUnit plugin, but I can't find my way around it well enough to see where the UI integration comes in. Can anybody give me a pointer on where I'd best start looking?
Kasper
Thanks for the quick response, Anna! I just tried that and it seems to work fine. Not exactly what I was going for, but I think it solves the problem. Thanks again!
Anna, now that I've gotten unit testing going, is there a good reference for implementing a custom code coverage provider/engine? I'm starting with the coverageEngine EP and trying to build out the graph of required classes, but obviously I'm approaching it from a brute force standpoint. In particular my code coverage engine uses an API to fetch coverage data. As far as I can tell I probably still need to be able to store that information in local files to represent coverage suites. Anyway, any information about how this hangs together would definitely help direct my implementation. Thanks!
Hi Scott,
I am afraid that the only public coverage implementation available is the one for Java. I'll try to explain how coverage generally works in IDEA:
1. Start configuration with coverage
When you start a configuration with coverage, it is started with com.intellij.coverage.CoverageExecutor. Each coverage implementer then needs to provide a runner which accepts this executor; in the case of Java, it's com.intellij.coverage.DefaultJavaCoverageRunner. The runner should check both the executor (it should not run 'debug sessions') and the configuration to be executed (the Python coverage runner should not run JUnit tests). Java coverage attaches a javaagent to the process, which does the actual work; this is done in com.intellij.execution.coverage.CoverageJavaRunConfigurationExtension#updateJavaParameters. A more general way to hook into the RunProfileState of your configuration is to implement com.intellij.execution.configuration.RunConfigurationExtensionBase.
2. Load coverage data in IDE
After the tests are done, the coverage information is stored somewhere. RunConfigurationExtensionBase#attachToProcess lets you get notified that the process has terminated and the coverage is ready (in your implementation, call com.intellij.coverage.CoverageDataManager#attachToProcess). Here you'll see that you need a com.intellij.execution.configurations.coverage.CoverageEnabledConfiguration to store coverage-specific data (e.g. where your generated coverage lives). You would create your own in com.intellij.coverage.CoverageEngine#createCoverageEnabledConfiguration. Other methods of the coverage engine allow you to map sources to output (CoverageEngine#getCorrespondingOutputFiles and the various getQualifiedName methods), check that the coverage information is up to date (in Java this checks whether there is compiler output; see CoverageEngine#recompileProjectAndRerunAction), and provide coverage information for files missing from the generated coverage (classes that were not loaded, in the Java case).
3. Present coverage information for the user
CoverageEngine forces you to create:
1. an annotator which, when requested (JavaCoverageAnnotator#createRenewRequest), fills internal maps so that it can quickly answer questions such as how many lines were covered in a given file, etc.
2. a coverage view extension which builds the table with coverage data in a separate toolwindow (com.intellij.coverage.CoverageEngine#createCoverageViewExtension); a directory-based implementation is com.intellij.coverage.view.DirectoryCoverageViewExtension, and the Java one is com.intellij.coverage.view.JavaCoverageViewExtension
3. an implementation of com.intellij.coverage.CoverageEngine#generateReport to generate a report
That looks like all you need to do to integrate coverage for your tests. Some of the API was not available in IDEA 12 and some things changed in 14.1, but the main workflow is unchanged. I am sure I've forgotten something, so please ask if I've skipped anything important.
P.S. CoverageSuite is just a place to store coverage settings.
Anna
Wow, Anna! I'll need to digest everything you've provided here, but this is exceptionally helpful! Thanks so much!
Okay, I'm taking a look at this and it's still not quite clicking with me. Let me provide some specific context, then I'll tell you where I am right now in my implementation, and finally I'll ask some specific questions.
Context
For the custom language I'm supporting (Salesforce Apex), code coverage metrics are automatically computed every single time unit tests are run for the involved product classes. There's not really a need for a separate run configuration for "Run with coverage" vs. just "Run". Once the unit test run completes, I can call an API to query the computed coverage metrics for not only the classes tested by the unit test run, but for all classes for which coverage has been computed. As a result, a local file-based version of the metrics into which I'd merge additional info isn't as useful as it might be with other code coverage tools. I can build the entire picture with a series of API calls at unit test run completion.
Current implementation status
Given that there really isn't a distinction between "Run with coverage" and "Run", I've just updated my existing unit test implementation of BasicProgramRunner.canRun() to accept both DefaultRunExecutor.EXECUTOR_ID and CoverageExecutor.EXECUTOR_ID (and of course only when it's my plugin's run configuration). This obviously causes the "Run with coverage" button to enable as well as the "Run" button. Right now they both do the same thing.
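For reference, the gist of that canRun() logic can be sketched in plain Java, with string literals standing in for the SDK's executor-ID constants (the values "Run" and "Coverage" are assumptions here for illustration; the real code compares against DefaultRunExecutor.EXECUTOR_ID and CoverageExecutor.EXECUTOR_ID and inspects the actual run profile):

```java
import java.util.Set;

public class RunnerAcceptanceSketch {
    // Stand-ins for DefaultRunExecutor.EXECUTOR_ID and CoverageExecutor.EXECUTOR_ID;
    // the literal values are assumptions for illustration only.
    static final String RUN_EXECUTOR_ID = "Run";
    static final String COVERAGE_EXECUTOR_ID = "Coverage";
    static final Set<String> ACCEPTED_EXECUTORS =
            Set.of(RUN_EXECUTOR_ID, COVERAGE_EXECUTOR_ID);

    // Mirrors the shape of ProgramRunner.canRun(executorId, profile): accept both
    // plain runs and coverage runs, but only for this plugin's configuration type.
    static boolean canRun(String executorId, boolean isOurConfiguration) {
        return isOurConfiguration && ACCEPTED_EXECUTORS.contains(executorId);
    }
}
```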
I've added dependencies in my plugin.xml on coverage. I'll still need to sort out how this will work with IDEA 12/13 Community Edition, but let's get this going with IDEA 14 CE first and then I can deal with that.
Also in plugin.xml, I've registered a coverageEngine EP with a partial implementation of CoverageEngine. Right now isApplicableTo() and canHavePerTestCoverage() return true only for my plugin's run configuration.
I've stubbed out an implementation of CoverageEnabledConfiguration that I return from createCoverageEnabledConfiguration(). Right now that just sets the coverage runner to my CoverageRunner implementation which is registered with the coverageRunner EP in plugin.xml. My coverage runner currently returns true from acceptsCoverageEngine() for my coverage engine and has no implementation of loadCoverageData() or getDataFileExtension().
I've also stubbed out an implementation of CoverageSuite that I return from the various signatures of create*CoverageSuite(), but it sounds like that may not be necessary?
I've stubbed out an implementation of CoverageAnnotator with no real implementation of createRenewRequest(), getDirCoverageInformationString(), and getFileCoverageInformationString(). I generally feel fine about how to implement the get*CoverageInformationString() methods once I have all of the computed coverage metrics, but I'm not sure what I need to do for createRenewRequest().
Questions
I guess the first question is whether I need to create and register a runConfigurationExtension, given that there's no distinction between "Run" and "Run with coverage" for me. A follow-on question: if I do have to do this, what does it mean that the EP is constrained by RunConfigurationExtension (not RunConfigurationExtensionBase), which has very Java-specific behavior, e.g., updateJavaParameters()? I'm not trying to implement coverage for Java.
You mentioned "After tests are done, coverage information is stored somewhere". In my case, they're stored at the other end of a series of API calls and not in the file system. Everything seems very file-oriented. Do I need to download the metrics into local files to fit into this workflow?
Along those lines, is my implementation of CoverageRunner, in particular loadCoverageData() and getDataFileExtension() going to require some form of local file-based storage of computed coverage metrics?
Do I need to do anything with CoverageSuites? In particular, how do I register a new coverage suite with Analyze>Show Coverage Data for display against the local source code?
What role does CoverageAnnotator.createRenewRequest() play?
Sorry if you've already answered some of these questions in your previous email. This is just the way I've organized my thoughts on the code coverage piece.
Thanks again in advance!
Scott,
IDEA 12 and 13 CE didn't have coverage (it was an Ultimate feature), so you will need to register an optional part in your plugin.xml (if you need details, please ask).
It seems to me that you only need to register a RunConfigurationExtensionBase to override attachToProcess, where you call com.intellij.coverage.CoverageHelper#attachToProcess. It creates the coverage suite (and registers it everywhere, so you don't need to do anything else to get your suites into the dialog) and calls coverageGathered, which notifies the editor, project view, etc. that coverage information is available. Please ensure that you throw com.intellij.openapi.util.WriteExternalException in RunConfigurationExtensionBase#writeExternal so you don't modify settings (I'll change the default, but you are working with IDEA 14).
If you don't need files, you may create your CoverageSuite (returned from CoverageEngine#createCoverageSuite) with your own com.intellij.coverage.CoverageFileProvider with an empty implementation, and then load the coverage information with API calls. The main idea here is to provide a ProjectData with a class-to-lines map inside.
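That empty-provider idea can be modeled outside the SDK roughly like this. The FileProvider interface below is a hypothetical stand-in for com.intellij.coverage.CoverageFileProvider; treat the method names as assumptions and check the real interface before implementing:

```java
public class EmptyFileProviderSketch {
    // Hypothetical stand-in for com.intellij.coverage.CoverageFileProvider;
    // the method names here are assumptions for illustration.
    interface FileProvider {
        String getCoverageDataFilePath();
        boolean ensureFileExists();
        boolean isValid();
    }

    // A provider backing a suite with no real file on disk, because the
    // coverage data is fetched through API calls instead.
    static FileProvider emptyProvider() {
        return new FileProvider() {
            public String getCoverageDataFilePath() { return ""; }
            public boolean ensureFileExists() { return true; }
            public boolean isValid() { return true; }
        };
    }
}
```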
CoverageRunner.getDataFileExtension is used to check whether a selected file could contain coverage data. That doesn't look like your case, so return null there.
CoverageAnnotator.createRenewRequest() normally accumulates the coverage data, because the get-calls are made on the EDT. You may do nothing in advance, but then be prepared for UI freezes if the 'get' calls are not 'just' getters.
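A minimal sketch of that accumulate-then-answer pattern, with a plain map standing in for the annotator's internals (the getFileCoverageInformationString name mimics CoverageAnnotator; everything else is an assumption for illustration):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class AnnotatorCacheSketch {
    // Summaries computed up front, off the EDT, by the renew request;
    // the later get-calls then reduce to cheap map lookups.
    private final Map<String, String> fileSummaries = new ConcurrentHashMap<>();

    // Stands in for the work a createRenewRequest() runnable would do:
    // aggregate raw counts into a per-file display string.
    void renew(String filePath, int coveredLines, int totalLines) {
        int percent = totalLines == 0 ? 0 : 100 * coveredLines / totalLines;
        fileSummaries.put(filePath, percent + "% lines covered");
    }

    // Called on the EDT; must stay a plain getter, never an API call.
    String getFileCoverageInformationString(String filePath) {
        return fileSummaries.get(filePath);
    }
}
```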
Hopefully this helps,
Anna
Great! Thanks again! I'll take this and see what I can do with it this evening. If you have a pointer on how to make coverage in my plugin work with IDEA UE 12/13 and IDEA CE 14+, that would definitely save me some searching! I'm pretty sure I do it by using a dependency-based include, but explicit documentation or examples would be wonderful.
For IDEA 12 and 13 you need to register <depends optional="true" config-file="your-optional-coverage-plugin.xml">Coverage</depends> in your main plugin.xml. I am afraid this won't work for IDEA 14/15, where you won't have the plugin and would need to register the extensions directly in your plugin.xml.
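For concreteness, the optional config file referenced by that depends line could look roughly like the sketch below; the file name comes from the depends declaration above, while the extension class names are placeholders for this plugin's own implementations:

```xml
<!-- your-optional-coverage-plugin.xml: loaded only when the Coverage plugin
     is present (IDEA 12/13 Ultimate); on IDEA 14+ CE these registrations
     would move into the main plugin.xml instead -->
<idea-plugin>
  <extensions defaultExtensionNs="com.intellij">
    <!-- class names below are hypothetical placeholders -->
    <coverageEngine implementation="com.example.apex.ApexCoverageEngine"/>
    <coverageRunner implementation="com.example.apex.ApexCoverageRunner"/>
  </extensions>
</idea-plugin>
```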
Anna
Okay. That helps. In order to support IDEA 12/13/14 and soon 15, I already have preprocessing and conditional compilation as part of my build. I'll just have the same preprocessing operate on the plugin.xml to create the appropriate contents for each version of the IDE. Thanks YET again!
Anna, I've made quite good progress and have code coverage annotations appearing in the gutter now, though it's not quite what I'd want/expect. I have a few questions that I think should allow me to get this thing wrapped up, though. The API that I call to query code coverage metrics will tell me the following:
1) The test class and method from which code coverage was driven.
2) The class in which code was executed.
3) The covered and uncovered line numbers in that class.
Now I'm trying to figure out how to translate this information into the ProjectData that gets returned from my suite's getCoverageData(). Basically I'm doing the following (the CodeCoverage records are the data queried through the API):
From what I see in the UI, though, the lines I've added there are being annotated as uncovered. How do I include covered and uncovered information in the returned ProjectData? Also, is there some way I can include the information about which test methods helped drive which coverage given that I have that information?
I'm almost there! Just need a little more help with these last pieces hopefully!
Thanks!
Scott, you need to specify hits for the line data; the corresponding piece of code from ProjectDataLoader serves as a sample:
{code}
LineData lineInfo = ...;
{code}
Lines are treated as covered when lineInfo.getHits() > 0. Looks logical to me ;)
Anna
Anna, I really apologize if I'm missing something obvious here, but some of this terminology isn't making sense to me. The code coverage metrics that are available to me are in terms of lines. I get information about the exact lines that are covered and those that are uncovered. Obviously there are other lines in files, e.g., whitespace, that will be neither. I'm not sure how to map this information to LineData and its hits, jumps, and switches. Unfortunately I'm only able to look at the coverage plugin source through the decompiler, which may be losing some of the internal documentation and meaning as well, so I'm not really able to infer the meaning of these additional concepts.
If possible, can you explain how, given a line number, I can say in the project data and class data whether that line is covered or uncovered?
Thanks, and sorry I'm just missing the obvious here!
Scott,
you can add coverage-src.zip as sources for the IDEA JDK; it should show you the sources for ProjectData.
If I understand you correctly: for covered lines you set lineInfo.setHits(1), for uncovered lines you leave it at 0, and for 'not executable' lines you don't create a lineInfo at all (the lineData array of the classData would contain null values at those line numbers).
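That convention can be sketched with plain types; the Line class below is a stand-in for com.intellij.rt.coverage.data.LineData (only the hit count matters for the covered/uncovered decision), and everything else is an assumption for illustration:

```java
public class LineArraySketch {
    // Stand-in for com.intellij.rt.coverage.data.LineData; only hits matter here.
    static final class Line {
        final int number;
        final int hits;
        Line(int number, int hits) { this.number = number; this.hits = hits; }
        int hits() { return hits; }
    }

    // Build the per-class line array Anna describes: the index is the 1-based
    // line number, covered lines get hits > 0, uncovered lines get hits == 0,
    // and non-executable lines (whitespace, comments) stay null.
    static Line[] buildLines(int maxLine, int[] covered, int[] uncovered) {
        Line[] lines = new Line[maxLine + 1];
        for (int n : covered) lines[n] = new Line(n, 1);
        for (int n : uncovered) lines[n] = new Line(n, 0);
        return lines;
    }
}
```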
Hope this helps,
Anna
That did it! Definitely more work to do on my side to get aggregates displaying and such, but I'm showing green and red in the gutter properly now. Thanks!!!
Anna, thanks to your extensive guidance, I was able to wrap up a decent implementation of code coverage integration over the weekend. I do have a few follow-up questions, though.
First, the API I'm invoking actually does provide sufficient information for me to be able to support coverage-by-test, but I'm a bit perplexed as to how to use it. I tried to create a distinct LineData for each combination of test method that drove coverage and class/line number that was covered by that test:
Hi Scott,
actually tests-per-line was implemented for Java only, and there is no real API for that yet. Trace files are generated during the run (placed in the same directory as the other coverage data) and are afterwards used as in ShowCoveringTestsAction#extractTests. Honestly, I am not sure it's possible to add this part in your case, but most probably we will do something in this direction in IDEA 15.
The ArrayIndexOutOfBounds could appear due to a ±1 problem; I think the line numbers should start from 1 here, but I am not sure ;) Otherwise it could be that IDEA thinks some changes were made after the coverage data was produced and calculates a map to remap old lines to new lines. Sorry, it's hard to say what's going on without an example.
The last one, with the blinking coverage view, looks like a bug in the toolwindow subsystem. Which version are you checking?
Thanks,
Anna
Thanks for the quick response, Anna. I assumed that coverage-by-test was Java-only right now, but I appreciate the confirmation.
As for the off-by-one error, I thought that might be the case as well. Often in the plugin SDK line and column numbers are specified starting at zero, e.g., in the Problems view, even though they're shown in the editor starting at one. I thought that might be the case here, but line numbers for coverage seem to be specified starting at one and that's how I'm supplying them. And like I said, I see coverage appearing properly aligned in the UI when the suite is active. I'll see if I can supply a concrete example with a specific file, the line data that I'm supplying, and the resulting exceptions that get raised.
And as for the blinking coverage view, it has the same behavior on IDEA 12, 13, and 14, so I'm assuming it's something I'm doing wrong. At first I thought it might be an EDT/non-EDT thing where I was doing something on the wrong thread, but that doesn't seem to be the case. I'll keep playing with it and see if I can corner it a bit better.
Thanks again! I sincerely appreciate your help on all of this!
Okay, Anna. I have a specific example of that ArrayIndexOutOfBoundsException. In SrcFileAnnotator.createRangeHighlighter(), lines has the following contents:
LineNumber 46, Hits 1
LineNumber 48, Hits 1
LineNumber 51, Hits 1
LineNumber 53, Hits 1
LineNumber 56, Hits 1
LineNumber 58, Hits 1
The exception is being raised because it's using line=45 and of course that means it's trying to index lines[46] in an array of only length 6. Looking up two levels in the stack at SrcFileAnnotator.showCoverageInformation(), it appears that lines is being passed in from executableLines which is based on postProcessedLines which ultimately is from my created ClassData.getLines().
It seems like I'm either supposed to be returning one entry per line (not just per executable line) there, or there's a bug in how this indexing is taking place.
Presumably this doesn't happen with Java's coverage so I'm assuming there's something I should be able to do in how I create my ProjectData/ClassData/LineData to avoid this issue. Thoughts?
Scott,
you need to return an array of lines containing null values, where null corresponds to a non-executable line; otherwise a lineInfo with 0 hits for uncovered lines and > 0 hits for covered lines. I've already said that, but it looks like I was not clear enough.
Anna
Ah, okay. I interpreted that as only needing to return line data for executable lines with hits=0/>0 for uncovered/covered respectively. I can definitely do that!
Hi, Anna. Hopefully you're still around and getting notifications on this thread! Unit testing and code coverage have been working very well in my plugin thanks to your help a few months back. However, now I have one user seeing an issue that I can't explain so I wanted to check in here to see if you might have a thought on why it's happening.
Basically the user runs tests which creates a coverage suite. When the suite is activated in IntelliJ IDEA, I retrieve the coverage metrics and build line data and class data from them. Line hits are reported properly for covered and uncovered executable lines, and other lines are reported as null. When all is said and done, the user can see the rollup metrics in the Project View and in the Coverage View, but there are no gutter annotations displayed in the editor window for files that have reported coverage.
I've also verified that the values for getQualifiedNames()/getQualifiedName() returned by my coverage engine implementation match the keys I'm using to add class data as well.
Is there some way this might have been disabled by the user? The only configuration setting I could find was under Colors & Fonts where you can set the color for full/partial coverage and uncovered lines, but he verified that to be properly enabled. Is there some other reason the coverage gutter annotations might not be showing up for this one user?
Thanks in advance for any insights!
Hi Scott,
I can imagine that something goes wrong with the library flag: IDEA doesn't show coverage for library classes, and there are corresponding checks in the engine for that. Please look at com.intellij.coverage.SrcFileAnnotator#showCoverageInformation for more details. I can admit that we have reports of coverage not being shown in the editor (while available in the separate view/project view) for Java.
Anna
Thanks, Anna. I'll update my engine implementation to include an override of that method to see how it's perceiving these files. That should help me know whether that's contributing or not.
Oh, and just so I understand your comment "I can admit that we have...": you're saying that there are certain situations in the Java coverage engine where coverage annotations aren't shown in the editor even though the class data and line data are configured properly? That would indicate that there might be some underlying bug that could be hitting me here as well? Just trying to make sure I understand the implications...
There must be a bug, but we can't reproduce it, so no additional information is available, sorry.
No, that's actually very helpful. I'll see if my user can create a standalone reproducible test case of the issue. If so, I'll debug it including into the coverage plugin itself if it's not a bug in my own plugin. I'll let you know what I find.