Always OutOfMemoryError with the External Build feature

Hi All,

I'm using 130.1619. My project has only 1 module and more than 15,000 Java files. If I use the External Build feature, as shown in Untitled2.png, I always get an OutOfMemoryError. That doesn't happen with the internal build, shown in Untitled1.png.

Regards.

5 comments
Permanently deleted user

What is your operating system?
What is the exact version of the project JDK (the one configured in your project settings)?
Are you using a 32-bit or a 64-bit version of the project JDK?
Could you please also locate the IDEA logs directory via the "Help | Show Log..." action? There should be a subdirectory "build-log" containing logs from the build process. Please zip it and send it to me for investigation.
It is also a good idea to create an issue in YouTrack and continue the investigation there.

Regards,
   Eugene.

Permanently deleted user

Now IDEA doesn't give an OutOfMemoryError anymore, but the compilation takes forever. It gets stuck at the Parsing step, as shown in Untitled1.png. The internal build takes about 17 minutes; after 38 minutes, the external build is still at the Parsing step.

My machine runs Windows 7 Professional 64-bit. I start IDEA 132.197 with JDK 7 64-bit (1.7.0_25) by setting IDEA_JDK=C:\jdk1.7.0, and the project JDK is 1.6.0_45 64-bit.



Attachment(s):
log.zip
Untitled1.png
Permanently deleted user

Now I cannot build my project (more than 17,000 Java files) anymore, because I just switched to 132.425. Could you tell me how to create an issue in YouTrack?

Thanks.

PS: the build.log has almost nothing. After 12 minutes, IDEA is still at "Parsing java..." and the build.log contains:

2013-09-27 11:38:49,355 [      0]   INFO - jps.cmdline.JpsModelLoaderImpl - Loading model: project path = C:/Users/c0dant2/IdeaProjects/dev2, global options path = C:/Users/c0dant2/.IntelliJIdea13/config/options
2013-09-27 11:38:50,766 [   1411]   INFO - jps.cmdline.JpsModelLoaderImpl - Model loaded in 1412 ms
2013-09-27 11:38:50,767 [   1412]   INFO - jps.cmdline.JpsModelLoaderImpl - Project has 1 modules, 1 libraries
2013-09-27 11:38:51,480 [   2125]   INFO - ellij.util.io.PagedFileStorage - lower=100; upper=200; buffer=10; max=1887961088
2013-09-27 11:38:51,765 [   2410]   INFO - .incremental.IncProjectBuilder - Building project; isRebuild:true; isMake:false parallel compilation:false
2013-09-27 11:38:51,795 [   2440]   INFO - penapi.util.io.win32.IdeaWin32 - Native filesystem for Windows is operational
2013-09-27 11:39:15,846 [  26491]   INFO - s.incremental.java.JavaBuilder - Using javac 1.6.0_45 to compile java sources
2013-09-27 11:39:15,848 [  26493]   INFO - s.incremental.java.JavaBuilder - Compiling 17011 java files; module: dev2


Edit: the compilation finally completed in 48 min 53 sec.



Attachment(s):
Untitled.png
Permanently deleted user

Hi,

Have a look at this topic: http://devnet.jetbrains.com/message/5504096

Seems like a few people are having the same problem as you; at least it sounds a lot like the problems I've been having.
My project is similar - single module, 10k+ files.

Try the "eclipse" compiler option I mentioned in that topic - might work for you.

Cheers,
Shorn.

Permanently deleted user

Most likely the reason is memory. Javac is written so that it must have enough memory to hold the parsed representation of _all_ input source files, so the more files you compile in one go, the more memory you need to allocate to the build process. All snapshots taken in similar situations (over 10k large source files in one module) show that javac consumes almost all of the heap available to the process and either fails with an out-of-memory error or takes far too long because of GC pauses.
As we cannot influence or change javac's memory requirements, possible solutions are:
1. Split your module into several modules, each containing fewer files. There will be a separate javac call for each module, after which its parsed AST structures can be collected, so the build process fits into the specified memory limit.
2. Increase the heap size for the build process, thus giving javac more memory. We also have positive experience with the concurrent garbage collector in such cases (very big modules). You can enable it by adding the "-XX:+UseConcMarkSweepGC" option to the "Settings | Compiler | Additional build process VM options" field (see the example after this list).
3. Use the alternative Eclipse compiler, as Shorn suggested.
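For illustration only (the heap size below is an assumption, not a recommendation; choose a value that fits your machine and project), the "Additional build process VM options" field could contain something like:

-Xmx2048m -XX:+UseConcMarkSweepGC

Here -Xmx raises the heap limit of the external build process, which is the memory javac has to work with in this scenario, and -XX:+UseConcMarkSweepGC enables the concurrent collector mentioned in point 2.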
