AI Assistant context awareness
Hi,
Before the holidays I used the AI Assistant during the trial period, and it seemed to be mostly context aware. I could ask how specific functions were used within the project and the answers were pretty good. So I went ahead and bought the subscription.
Now if I ask a similar question, the answers tend to be "Without more specific information about the full content of file X, and without knowing the full list of functions defined in the file, it is impossible to definitively list out…". So apparently the assistant is not context aware?
When I asked why it can't access the file, it said I need to paste the code into the conversation for it to see it. It apparently cannot access files for security reasons.
But what's the point of using the assistant if it isn't context aware even within the project? Or am I missing something here?
Hello roblob,
Unfortunately, I wasn't able to reproduce this issue.
There is a similar report in YouTrack: https://youtrack.jetbrains.com/issue/LLM-1646
Can you please collect additional information and attach it to LLM-1646 for analysis?
Required information:
1) IDE version
2) Screenshot of the issue
3) It would be helpful if it's possible to reproduce the issue with an open source project and you can provide a link to the project and steps to reproduce.
4) IDE log with debug logging enabled
Exactly the same thing happened to me.
Now, each time I want to ask something, the AI Assistant is suddenly as dumb as they come. I have to painstakingly specify each file or block of code my question refers to, and even then that file or code block is treated in isolation; the assistant doesn't see the other files it is clearly related to.
It's maddening.
Arman Shahinyan, please collect the same logs, upload them to https://uploads.jetbrains.com/browse, and share the upload ID with us.
Hi, I found the same problem and managed to fix it by going to Settings > Tools > AI Assistant. There I turned off all three checkboxes: "Enable smart chat mode", "Enable automatic inline code ..", and "Provide AI-generated name suggestions". Restart, turn them all back on, restart again, and context awareness is back to normal.
There's perhaps a faster way to do this, but it works for me, so I just want to share it. If I couldn't get this working I would have cancelled my subscription, since the AI Assistant is worthless without context awareness.
Joe Wandy
Which version of IDE / AI Assistant plugin are you using?
Context awareness should be improved in the latest version. For example, for IntelliJ - 2023.3.5: https://www.jetbrains.com/idea/download.
Ivan Pajic I am having a similar problem with the AI Assistant in IntelliJ, WebStorm, and PyCharm, each on the latest stable release. I have tested a prompt like
And I get a response that talks about a theoretical implementation of MyConstants, not the actual MyConstants definition.
Joe Wandy 's workaround to turn off the AI Assistant features is not working for me on WebStorm.
Kawin Nikomborirak
It looks like you encountered this issue: https://youtrack.jetbrains.com/issue/LLM-2725.
Support for referencing files and symbols in chat is planned for the next major release, 2024.2: https://youtrack.jetbrains.com/issue/LLM-2217. It will be possible to refer to any file in the project using the following syntax:
@file:/foo/bar/Foo.kt
Sending the file as part of the chat prompt will allow the model to provide a more relevant response.
Got it, thanks Ivan Pajic
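To illustrate the planned syntax, a chat prompt using such a file reference might look like the following sketch (the path and question here are hypothetical examples, not taken from any project in this thread):

```
@file:/src/main/kotlin/com/example/Foo.kt
Which functions in this file are called from elsewhere in the project, and where?
```

The referenced file's content would then be attached to the prompt, so the model answers against the actual code rather than a theoretical implementation.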
Ivan Pajic I am having the same issue with PyCharm Professional Edition (2024.1.4). I asked a specific question about my codebase and got this response: `I don't have direct access to your code; it's processed in a privacy-preserving manner to support this interaction.`. So is the AI assistant supposed to actually have full context awareness? If not, I don't see much value in paying for this service versus using another LLM where I'd need to copy/paste code in anyway.
I pressed the assistant on this point, and got this response:
`As an AI developed by OpenAI, I do offer context-aware assistance in a conversational manner. However, when it comes to specific codebase structures or proprietary code, I cannot access this information directly due to privacy and security measures in place.`
So am I to understand that Jetbrains AI is merely a wrapper for ChatGPT?
UPDATE: it does seem that by enabling `allow detailed data collection` in Tools > AI Assistant > Data Sharing, I got slightly better responses that initially seemed to have at least SOME context from the project code. However, when given detailed questions related to that context, the response is still:
`I don't have direct access to your codebase, but I can provide general advice based on typical Django structures and practices`
Arosenfeld2003
AI Assistant should be able to search through the codebase and access the content of project files. This requires that Smart Chat mode is enabled in Settings (⌘ + , or Ctrl + Alt + S) > Tools > AI Assistant. AI Assistant is integrated with the IDE through access to specific functions that provide more information about the project context, e.g. searching the project tree, reading files, accessing project dependencies, accessing the project build system configuration, and accessing metadata such as the technologies/frameworks used.
Apart from AI chat, it also provides other features, such as code generation, documentation generation, commit message generation, test generation, code completion, name suggestions, runtime error explanation, language conversion, etc.
Detailed data collection should not affect the behavior of AI Assistant features (e.g. AI chat). It enables additional logging on the IDE side, which sends more detailed information to JetBrains servers in order to understand product usage and identify opportunities for improvement.
We are currently analyzing some issues related to project context awareness of AI chat. It would help us in our investigation, if you could file a new issue report on our issue tracker: https://youtrack.jetbrains.com/newIssue?project=LLM, and provide more details like a sample project and steps to reproduce the issue you encountered. Thank you!
Yeah, so far it seems to be atrociously bad. I paid for a subscription because my trial finished. At this point it's completely useless and has nothing to do with actual context awareness. Even the most basic things, like creating a FastAPI app with Uvicorn set up in your main.py and then asking the chat assistant to generate a Dockerfile for it (an extremely basic task even GPT-3 would normally handle), are impossible with the chat assistant.
It has zero knowledge of the project, which is extremely discouraging when a feature just jumps on the LLM bandwagon while producing negative results for its users. There's no reason to use this for small projects, let alone bigger ones. For small projects, copy-pasting your entire existing codebase into the prompt is the only approach that works, and the problem with that is obvious: iterating on code this way is impossible.
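For reference, the kind of output being asked for above is roughly the following: a Dockerfile for a FastAPI app served by Uvicorn from main.py. This is a sketch only; the Python version, port, file layout, and module path are assumptions for illustration, not details from the poster's actual project:

```dockerfile
# Hypothetical Dockerfile for a FastAPI app served by Uvicorn.
# Assumes main.py at the project root exposes an `app` object
# and dependencies are listed in requirements.txt.
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```

Generating something like this requires knowing at minimum the entry-point module and app variable name, which is exactly the project context the chat assistant fails to pick up here.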
NIkita Mikhailov
Could you please share some examples, screenshots, chat dumps, or a minimal reproduction with steps, so that we can try to recreate the same scenario on our end? Which feature were you using when this issue occurred (AI Chat, code generation, etc.)? Which IDE version and AI Assistant version did you have installed when the issue reproduced?
You can zip and upload the data to: https://uploads.jetbrains.com, just provide the uploaded file ID. Thank you.
I've recently returned to using AI Assistant after stepping away for a month or two, and I'm stunned by its current lack of context awareness. At work, we only have access to Copilot. A month or two ago, when I was frequently switching between the two (Copilot at work, AI Assistant at home), I always favored the JetBrains product. I have no idea what has been done to cripple it recently, but it is borderline useless right now.
I'll be editing a Kotlin file, pop over to chat, and ask a question about my code, and it returns a completely generic answer. I just now asked about the filter I was using on a list, and the answer ended with "if you have the code for the specific filter you're asking about, I can inspect it to provide a better description of its behavior".
This happens 100% of the time now. Unless I actually PASTE my code that ALREADY exists in my editor into the chat window, I cannot get any answers relevant to my actual code. I can't possibly understand how this is useful to anyone.
Just to make sure I wasn't crazy, I disabled AI Assistant and installed Copilot and it worked perfectly. It was always aware of my code, what I was doing, and provided context aware answers.
I want to use the Jetbrains Assistant. I don't actually like Copilot. It frankly gives completely WRONG answers far too often, but at least it answers questions about my working code.
@Matthew
Could you please file a new bug report on our issue tracker: https://youtrack.jetbrains.com/newIssue?project=LLM, and share more details about the issue you encountered: examples, chat dumps, and a minimal reproduction project or steps so we can try to reproduce the issues on our end? Which version of the IDE / AI Assistant did you have installed when reproducing the mentioned issues? Thank you!