BuildManagerListener isn't being triggered outside of debug mode
We're using a correctly registered BuildManagerListener that should pick up a configuration setting right before a build starts.
We're observing that this works as expected in build process debug mode, but outside of debug mode, any changes to that setting aren't picked up before build time.
Hi João,
BuildManagerListener.beforeBuildProcessStarted(), as its name suggests, is called right before the build process is actually launched.
Note that this does not mean that BuildManagerListener.buildStarted() will be called immediately after beforeBuildProcessStarted(). A significant period of time can separate these two events. For example, IDEA uses an optimization trick: the build process starts, initializes its caches and remains waiting. As soon as the IDE needs to perform compilation, it just takes this waiting build process and performs the compilation there. In this scenario the "beforeBuildProcessStarted" event is sent before launching the preloaded process, and the "buildStarted" event is sent only before the actual build starts. In debug mode this "preload" optimization is always turned off.
I can also confirm that these events are always sent at the expected moments.
Thanks a lot for the explanation, Eugene.
That "beforeBuildProcessStarted" caching could explain why the setting wasn't getting updated. But I just tried adding the same logic to "buildStarted" and the setting is still not being picked up before the build starts.
If it's any help (although I seriously doubt it), my method grabs a file path from a central location and stores it in a facet (so it can be read by our builder). I noticed that if I change the facet (even though the file location isn't living there), the setting is now read. Maybe this update causes a cache miss somehow? That is very undesirable for us, though -- we want this query and write to always run, even when no facet was modified between builds.
For reference, here is a GitHub PR containing the work I'm mentioning. https://github.com/GoogleCloudPlatform/google-cloud-intellij/pull/1096
So the task is to pass some context data from the IDE to the build process so that your custom builder running in this process can read it. Since JPS reads all project information from disk, the project data must be saved after your listener modifies it. This kind of operation is highly undesirable at that moment. The preloaded process would also have to be cancelled, because it has already read the old project state.
There is a better way to accomplish this.
Ideally, your builder in the JPS process should read this data directly from the project files. Something feels wrong about a scheme where the project model must be altered by some code every time before the build starts. Isn't the project model supposed to already store all necessary data? So the first solution would be to reconsider the current approach and update data in the model when it is actually needed, based on some objective criteria. This approach will also work for automatically started builds (auto-makes).
If you cannot avoid data update right before the build starts, you may consider another option:
- Register your custom "before task" with CompilerManager.addBeforeTask(). This task will be called before the build starts and will take over the role of your current BuildManagerListener, which is a kind of low-level API.
- In your CompileTask.execute(CompileContext context) implementation, you can get a CompileScope object from the context instance passed to the task:
CompileScope scope = context.getCompileScope();
- You can put any custom key-value pair into the scope:
scope.putUserData(key, data);
This data will be passed to the corresponding CompileContext object in the JPS process. While you can put arbitrary objects as keys and values, only their toString() representations will be passed to JPS.
- In the JPS process your builder can read the data using the org.jetbrains.jps.incremental.CompileContext.getBuilderParameter(key) API.
The disadvantage of this approach is that these tasks are not called for auto-makes, so the data won't be passed to your builder. For all builds invoked explicitly by the user this should work fine.
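To make the key-value handoff concrete, here is a minimal, self-contained model of the mechanism described above. The class and method names (UserDataHandoff, serializeForJps) are hypothetical stand-ins; the real APIs are CompileScope.putUserData() on the IDE side and CompileContext.getBuilderParameter() on the JPS side. The sketch only illustrates the constraint that arbitrary objects go in, but only their toString() forms cross the process boundary:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical model of how CompileScope user data reaches the JPS process:
// only the String forms of each key and value survive the handoff.
public class UserDataHandoff {

    // IDE side: arbitrary objects are stored under arbitrary keys...
    static Map<String, String> serializeForJps(Map<?, ?> userData) {
        Map<String, String> builderParameters = new HashMap<>();
        for (Map.Entry<?, ?> e : userData.entrySet()) {
            // ...but only their toString() representations are sent across.
            builderParameters.put(e.getKey().toString(), e.getValue().toString());
        }
        return builderParameters;
    }

    public static void main(String[] args) {
        Map<Object, Object> scopeData = new HashMap<>();
        scopeData.put("cloud.sdk.path", new java.io.File("/opt/google-cloud-sdk"));

        Map<String, String> params = serializeForJps(scopeData);

        // JPS side: the builder reads the value back as a plain String.
        System.out.println(params.get("cloud.sdk.path"));
    }
}
```

Because only strings arrive in the builder, any richer object (a File, a version descriptor) has to be re-parsed on the JPS side.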
Eugene, let me explain the situation a bit better:
We're storing a setting (i.e., the Cloud SDK path) centrally, in a service, instead of in a facet. On each build, we want to use the Cloud SDK to generate source context files in the output directory, to be packaged together with the resulting artifact (war/jar/etc.).
Consider the case where the user stores an invalid Cloud SDK path (e.g., one that doesn't contain any SDK). They try to build the artifact, it won't work, so they will go to Settings, change the Cloud SDK there and re-run the build.
One option that would go in line with IJ's current build model is to trigger an SDK update everywhere it is used, from the CloudSdkService. That way, the project information (i.e., the facets) would be changed; I believe this would trigger cache misses and the new SDK would be used. However, a big limitation of this is the maintenance overhead on the service, which would have to maintain a list of all the places where this path lives and update it there.
We wanted to hook into the build process instead, so that each build always grabs the freshest Cloud SDK path. But, in the previous situation, no project information would change between builds, only the centrally stored SDK location. I believe this is what is causing BuildManagerListener not to get triggered between builds, since no project information has changed.
I just tried CompilerManager.addBeforeTask(), but it also won't get called between builds unless I go to the facet and press OK, which isn't the desired behaviour. CompileContext.getBuilderParameter(key) also doesn't sound like the right choice, since we want source context to be generated regardless of whether the build is manual or automatic. (By the way, what are some examples of automated builds? Before-launch-task builds?)
Do you have any suggestions for an alternative design?
> I just tried CompilerManager.addBeforeTask(), but it also won't get called in between builds,
Could you please elaborate on this? The before- and after- compilation tasks are always called for every build launched explicitly by the user.
> We wanted to hook to the build process instead, so that each build always grabs the freshest Cloud SDK path
If you need to read the Cloud SDK path for every build, why not have the builder itself obtain it whenever it needs it? In the facet you can store only the data describing how and where to look for the path, while the actual path is read lazily, directly by the builder. This approach will work for all kinds of builds, and you don't have to bother with passing data from the IDE process to the JPS process.
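A minimal sketch of this lazy-read idea, using plain Java NIO rather than any IntelliJ API (the class name LazySdkPathReader and the pointer-file convention are hypothetical): the facet only records where to look, and the builder resolves the actual path at the moment it needs it, so every build, manual or automatic, sees the freshest value:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical lazy reader: the facet stores only *where* to look
// (pointerFile); the actual SDK path is read on demand inside the builder.
public class LazySdkPathReader {
    private final Path pointerFile;

    public LazySdkPathReader(Path pointerFile) {
        this.pointerFile = pointerFile;
    }

    // Called from the builder whenever the path is actually needed, so the
    // value is never stale, no matter how the build was triggered.
    public String readSdkPath() throws IOException {
        return Files.readString(pointerFile).trim();
    }

    public static void main(String[] args) throws IOException {
        // Simulate the IDE writing the current setting to the agreed location.
        Path pointer = Files.createTempFile("sdk-location", ".txt");
        Files.writeString(pointer, "/opt/google-cloud-sdk\n");

        // Simulate the builder reading it lazily at build time.
        System.out.println(new LazySdkPathReader(pointer).readSdkPath());
    }
}
```

The point of the design is that no state is pushed into the project model before each build; the reader pulls it when compilation actually runs.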
> Could you please elaborate on this? The before- and after- compilation tasks are always called for every build launched explicitly by the user.
You're actually right -- addBeforeTask() is always being called, and writes the correct path to the facet. But if I call Build Artifact right after I've changed the path, the old path is still read. It takes one or two Build Artifact calls for the correct value to go through. Is there a way to prevent this?
> If you need to read the cloud sdk path for every build, why not to have the builder itself to obtain it whenever the builder needs it?
I agree that would be great and would totally solve our problem. The issue is that it results in a circular dependency: our main plugin depends on jps-plugin for building the artifacts, and jps-plugin would also need to depend on the main plugin to use CloudSdkService to get the freshest SDK path value. Extracting CloudSdkService into its own module and making both the main plugin and jps-plugin depend on it failed for other reasons. I spent some time looking into this and couldn't see a way through other than the one I'm trying to implement. Let me know if I'm overlooking anything.
Thanks!
Hi, we're still stuck on getting the Cloud SDK path without any delay from the build process. Thanks!
> Extracting CloudSdkService to its own module and make the main plugin and jps-plugin depend on it
This would be the best solution in my opinion; otherwise there will always be a time overhead for reading data from a remote location right before the build starts. Also, as I mentioned above, modifying the facet state in the "beforeBuildStart" event is not a solution, as facet data is not supposed to be modified at that moment. As an alternative I see only a hack: you can store this piece of data somewhere on disk (e.g. in the build's system data directory for the particular project; see BuildDataPaths.getDataStorageRoot()) and make sure this information is always up to date. The builder should be modified so that it reads the setting from the new location. Ideally this data should be updated only when the setting actually changes, not before every build; but even if you do update it before the build starts, the difference is that you are writing the data directly to disk and not to the project model, which needs further synchronization with the disk.
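Here is a self-contained sketch of that disk-based handoff, assuming a plain directory stands in for the build data storage root (in the real plugin the JPS side would resolve it via BuildDataPaths.getDataStorageRoot(); the class SdkSettingsFile and the file name cloud-sdk.properties are invented for illustration). The writer rewrites the file only when the value actually changes, matching the "update only on change" advice above:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

// Sketch of a disk-based settings handoff between the IDE and JPS processes.
// A plain directory stands in for the build's system data directory.
public class SdkSettingsFile {
    private static final String FILE_NAME = "cloud-sdk.properties";
    private static final String KEY = "cloud.sdk.path";

    // IDE side: rewrite the file only when the value actually changed,
    // so nothing is touched on disk before every build.
    static void writeIfChanged(Path storageRoot, String sdkPath) throws IOException {
        Path file = storageRoot.resolve(FILE_NAME);
        if (Files.exists(file) && sdkPath.equals(read(storageRoot))) {
            return; // already up to date
        }
        Properties props = new Properties();
        props.setProperty(KEY, sdkPath);
        try (OutputStream out = Files.newOutputStream(file)) {
            props.store(out, "Cloud SDK location");
        }
    }

    // JPS side: the builder reads the freshest value directly from disk.
    static String read(Path storageRoot) throws IOException {
        Properties props = new Properties();
        try (InputStream in = Files.newInputStream(storageRoot.resolve(FILE_NAME))) {
            props.load(in);
        }
        return props.getProperty(KEY);
    }

    public static void main(String[] args) throws IOException {
        Path root = Files.createTempDirectory("build-data");
        writeIfChanged(root, "/opt/google-cloud-sdk");
        System.out.println(read(root));
    }
}
```

Because both processes agree only on a file location and format, this avoids any dependency between the main plugin and jps-plugin beyond the shared path convention.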
Eugene, thanks a lot for your response.
I was able to extract CloudSdkService into a new Gradle module and build everything without dependency issues. However, the only really useful thing CloudSdkService was doing for us was persisting the Cloud SDK value through PropertiesComponent.getInstance(). Which file this is persisted to, I don't know; since we're not calling PropertiesComponent.getInstance(project), it might be in a file outside the project's .idea folder.
I haven't found a way to make PropertiesComponent work from the jps-plugin yet, because PropertiesComponent.getInstance() calls ServiceManager.getService(), which in turn calls ApplicationManager.getApplication(), which returns null, I suspect because no Application is set up in the jps-plugin process.
It seems that something passed to our ModuleLevelBuilder through build() could give us access to this persisted data, but I'm not sure what.
Earlier you mentioned the project model, but what is the "project model"? I can get a JpsModel object through compileContext.getProjectDescriptor().getModel(), but it doesn't seem to contain anything useful. JpsGlobal stores the JDK and libraries, but neither seems suitable for storing this piece of configuration.
I feel like I'm close to a solution here, I just need to crack reading the project file from jps-plugin, for which there must be an existing utility.