Anything suspicious with my syntax highlighting code?

Currently I'm getting an "out of memory" notification from IntelliJ when I try to go to the "Colors & Fonts" settings page for my language plugin. However, I verified that the lexer quickly and correctly produces all the tokens in the demo file. Also, the same code worked for the "Colors & Fonts" page previously, so fixing my lexer has apparently introduced a bug in the syntax highlighting. I believe the only significant custom code relevant here is the following:

package com.atslangplugin;

import com.intellij.lexer.FlexAdapter;
import com.intellij.lexer.Lexer;
import com.intellij.openapi.editor.colors.TextAttributesKey;
import com.intellij.openapi.fileTypes.SyntaxHighlighterBase;
import com.intellij.psi.tree.IElementType;
import org.jetbrains.annotations.NotNull;

import java.io.Reader;

import static com.intellij.openapi.editor.DefaultLanguageHighlighterColors.*;
import static com.intellij.openapi.editor.colors.TextAttributesKey.createTextAttributesKey;

public class ATSSyntaxHighlighter extends SyntaxHighlighterBase {
    public static final TextAttributesKey ATS_IDENTIFIER =
            createTextAttributesKey("IDENTIFIER", IDENTIFIER);
    public static final TextAttributesKey ATS_BLOCK_COMMENT =
            createTextAttributesKey("BLOCK_COMMENT", BLOCK_COMMENT);
    public static final TextAttributesKey ATS_LOCAL_VARIABLE =
            createTextAttributesKey("LOCAL_VARIABLE", LOCAL_VARIABLE);

    private static final TextAttributesKey[] ATS_IDENTIFIER_KEYS =
            new TextAttributesKey[]{ATS_IDENTIFIER};
    private static final TextAttributesKey[] ATS_BLOCK_COMMENT_KEYS =
            new TextAttributesKey[]{ATS_BLOCK_COMMENT};
    private static final TextAttributesKey[] ATS_LOCAL_VARIABLE_KEYS =
            new TextAttributesKey[]{ATS_LOCAL_VARIABLE};

    @NotNull
    @Override
    public Lexer getHighlightingLexer() {
        return new FlexAdapter(new ATSLexer((Reader) null));
    }

    @NotNull
    @Override
    public TextAttributesKey[] getTokenHighlights(IElementType tokenType) {
        if (tokenType.equals(ATSTokenTypes.IDENTIFIER)) {
            return ATS_IDENTIFIER_KEYS;
        } else if (tokenType.equals(ATSTokenTypes.COMMENT_LINE)
                || tokenType.equals(ATSTokenTypes.COMMENT)) {
            return ATS_BLOCK_COMMENT_KEYS;
        } else {
            return ATS_LOCAL_VARIABLE_KEYS;
        }
    }
}


I admit I'm a bit new to Java development. Maybe I can try JProfiler to find the issue, though I'm open to suggestions.

I've also attached an image of the "working" page from a point in time when the lexer was more buggy and not getting all the way through the demo file:

Some_comments_working.PNG

Happy Holidays!

9 comments

This class is implemented correctly. The problem must be elsewhere.

To diagnose the problem easily, you can add -XX:+HeapDumpOnOutOfMemoryError to the VM options of your debug IDEA instance; it will then write an .hprof file when the OOME occurs, and you'll be able to open it with JProfiler, YourKit or another Java profiling tool.
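For reference, the VM options might look like this (the HeapDumpPath entry is optional, and the path shown is only an example):

```
-XX:+HeapDumpOnOutOfMemoryError
-XX:HeapDumpPath=C:\dumps
```

Without HeapDumpPath, the dump is written to the working directory of the JVM process.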


This has me on the right track. When I use '-XX:+HeapDumpOnOutOfMemoryError', VisualVM verifies that 'Heap dump on OOME' is enabled, and indeed, for a simple java program I can find the hprof file in the project directory. However, for the plugin, I never see the message that an hprof file has been generated in IntelliJ, and a search on my C: drive doesn't show anything. Since IntelliJ has a dialog for handling OOMEs when they occur, I guess this mechanism is somehow preventing the heap dump. Any way to disable it?

Thanks,
Brandon


No, it's not preventing anything. The .hprof file should be generated in the "bin" subdirectory of the IntelliJ IDEA installation directory.


Thanks! It was an issue with the default permissions on the bin directory.


Well, the top of the heap dump from the OOME (taken while looking at my custom language's Colors & Fonts page) certainly looks different from the heap dump for Java's Colors & Fonts page. Here is the top of the heap dump (unfortunately nothing looks terribly obvious, to me at least, but I'll keep looking):

Name                                                          Instance count    Size (bytes)
com.intellij.openapi.vfs.newvfs.impl.VirtualFileImpl          608817            19482144
java.util.concurrent.LinkedBlockingQueue$Node                 514187            12340488
int[]                                                         151931            149990488
com.intellij.util.containers.ConcurrentIntObjectHashMap$Node  132515            4240480
com.intellij.openapi.vfs.newvfs.impl.VfsData$DirectoryData    122228            2933472
com.intellij.openapi.util.Pair [2 classes]                    92812             2227488
com.intellij.openapi.util.io.FileAttributes                   92656             3706240
char[]                                                        87749             8953736
java.lang.String                                              86633             2079192
com.intellij.openapi.vfs.newvfs.impl.VirtualDirectoryImpl     83079             3323160
byte[]                                                        75007             10952744
com.intellij.util.containers.IntObjectLinkedMap$MapEntry      60032             1921024
com.intellij.util.text.ByteArrayCharSequence                  60000             960000
com.intellij.util.containers.CharTrie$Node                    30666             735984
java.util.HashMap$Node                                        27659             885088

I've seen one thing in your lexer which is definitely very wrong and could cause this behavior. If a lexer sees some input that it cannot process, it must never throw an exception. Instead, the correct behavior is to return TokenType.BAD_CHARACTER and advance to the next character in the file.
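In JFlex specs this is typically done with a catch-all rule placed at the very end, so it only matches when nothing else does (a sketch; it assumes com.intellij.psi.TokenType is imported in the spec's user code section):

```
[^]    { return TokenType.BAD_CHARACTER; }
```

In JFlex, [^] matches any single character, including line terminators, which is what makes it a safe fallback.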


Thanks! I found another problem that was (or was also) responsible. The following line in the lexer:

<<EOF>>                     { return ATSTokenTypes.EOF; }


I'm not sure why this caused a problem exactly.

Yes, it's a problem indeed. This breaks IntelliJ IDEA's contract for detecting that the lexer has reached EOF (Lexer.getTokenType() returning null) and instead causes an infinite stream of EOF tokens to be generated.
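Concretely, the usual fix is simply to delete the <<EOF>> rule; with the standard IntelliJ JFlex setup, the generated lexer already returns null at end of input, and FlexAdapter turns that into the end-of-token-stream signal. If an <<EOF>> rule is kept for cleanup work, it must end by returning null rather than a token type (a sketch, assuming the spec declares advance() to return IElementType):

```
<<EOF>>                     { return null; }
```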
