AI Assistant context awareness

Answered

Hi,

Before the holidays I used the AI Assistant during the trial period, and it seemed to be mostly context aware. I could ask how specific functions were used within the project, and the answers were pretty good. So I went ahead and got the subscription.

Now if I ask a similar question, the answers are along the lines of “Without more specific information about the full content of file X, and without knowing the full list of functions defined in the file, it is impossible to definitively list out…”. So apparently the assistant is not context aware?

When I asked why it can't access the file, it said that I need to supply the code in the conversation for it to be able to access it. Apparently it cannot access files for security reasons.

But what's the point of using the assistant if it isn't context aware even within the project? Or am I missing something here?

 

5

Hello roblob,

Unfortunately, I wasn't able to reproduce this issue.

There is a similar report in YouTrack: https://youtrack.jetbrains.com/issue/LLM-1646

Can you please collect additional information and attach it to LLM-1646 for analysis?

Required information:

1) IDE version

2) Screenshot of the issue

3) If possible, reproduce the issue with an open-source project and provide a link to the project plus steps to reproduce.

4) Log with debug: 

  • add the following lines to Help | Diagnostic Tools | Debug Log Settings…:
com.intellij.ml.llm
com.intellij.util.io.HttpRequests
com.intellij.util.proxy.CommonProxy
  • restart IDE
  • reproduce the issue and collect logs via Help | Collect Logs and Diagnostic Data (entire archive)
0

Exactly the same thing happened to me.
Now each time I ask something, the AI Assistant is as dumb as they come: I have to painstakingly specify each file or block of code my question refers to, and even then that file or code block is treated in isolation, so the assistant doesn't see the other files it is clearly related to.
It's maddening.

1

Arman Shahinyan, please collect the same logs, upload them to https://uploads.jetbrains.com/browse, and share the upload ID with us.

0

Hi, I ran into the same problem and managed to fix it by going to Settings > Tools > AI Assistant. I turned off all three checkboxes for “Enable smart chat mode”, “Enable automatic inline code ..”, and “Provide AI-generated name suggestions”, restarted, then turned them all back on and restarted again, and now context awareness is back to normal.

There's perhaps a faster way to do this, but this works for me, so I just wanted to share. If I hadn't gotten this working I would have cancelled my subscription, as the AI Assistant is worthless without context awareness.

1

Joe Wandy 
Which version of IDE / AI Assistant plugin are you using? 
Context awareness should be improved in the latest version; for example, for IntelliJ IDEA, 2023.3.5: https://www.jetbrains.com/idea/download.

0

Ivan Pajic I am having a similar problem with the AI Assistant on IntelliJ IDEA, WebStorm, and PyCharm, each on the latest stable release. I have tested a prompt like

What other constants in MyConstants do I define other than THIS_VALUE

And I get a response that talks about a theoretical implementation of MyConstants, not the actual MyConstants definition.

Joe Wandy's workaround of toggling the AI Assistant features off and on is not working for me on WebStorm.

0

Kawin Nikomborirak 
It looks like you encountered this issue: https://youtrack.jetbrains.com/issue/LLM-2725.

Support for referencing files and symbols in chat is planned for the next major release, 2024.2: https://youtrack.jetbrains.com/issue/LLM-2217. It will be possible to refer to any file in the project by using the following syntax: @file:/foo/bar/Foo.kt. Sending the file as part of the chat prompt will allow the model to provide a more relevant response.
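To illustrate (the file path and question below are hypothetical, not from an actual project), a chat prompt using that syntax might look like:

```
@file:/src/main/kotlin/service/UserService.kt
What other public functions does this file declare besides findById?
```

The referenced file's content would then be attached to the prompt, so the model can answer from the actual code rather than guessing at a theoretical implementation.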

1

Ivan Pajic I am having the same issue with PyCharm Professional Edition (2024.1.4). I asked a specific question about my codebase and got this response: `I don't have direct access to your code; it's processed in a privacy-preserving manner to support this interaction.` So is the AI Assistant supposed to actually have full context awareness? If not, I don't see much value in paying for this service versus using another LLM where I'd need to copy/paste code in.

I pressed the assistant on this point, and got this response:
`As an AI developed by OpenAI, I do offer context-aware assistance in a conversational manner. However, when it comes to specific codebase structures or proprietary code, I cannot access this information directly due to privacy and security measures in place.`

So am I to understand that JetBrains AI is merely a wrapper for ChatGPT?

UPDATE: by enabling `allow detailed data collection` in Tools > AI Assistant > Data Sharing, I was able to get slightly better responses that initially seemed to have at least SOME context from the project code. However, when given detailed questions related to that context, the response is still:

`I don't have direct access to your codebase, but I can provide general advice based on typical Django structures and practices`

0

Arosenfeld2003 

So is the AI assistant supposed to actually have full context awareness?

AI Assistant should be able to search through the codebase and access project files' content. It requires that the Smart Chat mode is enabled in Settings (⌘ + , or Ctrl + Alt + S) > Tools > AI Assistant.

So am I to understand that Jetbrains AI is merely a wrapper for ChatGPT?

AI Assistant is integrated with the IDE by having access to specific functions which provide more information about the project context - e.g. search through the project tree, read files, access project dependencies, access project build system configuration, access metadata information like technologies/frameworks used etc.
Also, it provides other features, apart from AI chat, such as: code generation, documentation generation, commit message generation, test generation, code completion, name suggestions, runtime error explanation, language conversion, etc.

so it does seem that by enabling `allow detailed data collection` in Tools > AI Assistant > Data Sharing, I was able to get slightly better responses

Detailed data collection should not affect the behavior of AI Assistant features (e.g. AI chat). It enables additional logging on IDE side which sends more detailed information to JetBrains servers in order to understand product usage and identify opportunities for improvement.

We are currently analyzing some issues related to project context awareness of AI chat. It would help us in our investigation, if you could file a new issue report on our issue tracker: https://youtrack.jetbrains.com/newIssue?project=LLM, and provide more details like a sample project and steps to reproduce the issue you encountered. Thank you!

0

AI Assistant should be able to search through the codebase and access project files' content. It requires that the Smart Chat mode is enabled in Settings (⌘ + , or Ctrl + Alt + S) > Tools > AI Assistant.

Yeah, so far it has been atrociously bad. I paid for a subscription because my trial finished. At this point it's completely useless and has nothing to do with actual context awareness. The most basic things, like creating a FastAPI app with uvicorn in your main.py and then asking the chat assistant to generate a Dockerfile for it (an extremely basic task that even GPT-3 would normally handle), are impossible to do with the chat assistant.

It has zero knowledge of the project, which is extremely discouraging from a feature that just jumps on the LLM bandwagon while producing negative results for users. There's no reason to use this for small projects, let alone bigger ones. For small projects, copy-pasting your entire existing codebase into the prompt is the only approach that can work, and its problem is obvious: iterating on code like this is impossible.

 

2

NIkita Mikhailov 

At this point it's completely useless and has nothing to do with actual context awareness. The most basic things, like creating a FastAPI app with uvicorn in your main.py and then asking the chat assistant to generate a Dockerfile for it (an extremely basic task that even GPT-3 would normally handle), are impossible to do with the chat assistant.

Could you please share some examples, screenshots, chat dumps, or a minimal reproduction/steps to reproduce the issues you encountered, so that we can try to recreate the same scenario on our end? Which feature were you using when the issue occurred (AI chat, code generation, etc.)? Which IDE version and AI Assistant version did you have installed when the issue reproduced?

You can zip and upload the data to: https://uploads.jetbrains.com, just provide the uploaded file ID. Thank you.

0

I've recently returned to using AI Assistant after having stepped away for a month or two, and I'm stunned by its current lack of context awareness. At work we only have access to Copilot. A month or two ago, when I was frequently switching back and forth between the two (Copilot at work and AI Assistant at home), I always favored the JetBrains product. I have no idea what has been done to cripple it recently, but it's borderline useless right now.

I'll be editing a Kotlin file, pop over to chat to ask a question about my code, and it returns a completely generic answer. I just asked it about the filter I was using on a list, and the answer ended with “if you have the code for the specific filter you're asking about, I can inspect it to provide a better description of its behavior”.

This happens 100% of the time now. Unless I actually PASTE my code that ALREADY exists in my editor into the chat window, I cannot get any answers relevant to my actual code. I can't possibly understand how this is useful to anyone. 

Just to make sure I wasn't crazy, I disabled AI Assistant and installed Copilot and it worked perfectly. It was always aware of my code, what I was doing, and provided context aware answers. 

I want to use the JetBrains Assistant. I don't actually like Copilot; it frankly gives completely WRONG answers far too often, but at least it answers questions about my working code.

1

@Matthew

Could you please file a new bug report on our issue tracker: https://youtrack.jetbrains.com/newIssue?project=LLM, and share more details about the issue you encountered - examples, chat dumps, and a minimal reproduction project/steps so we can try to reproduce the issues on our end? Which IDE / AI Assistant version did you have installed when reproducing the mentioned issues? Thank you!

0

Well, I see that this happens to me as well.

Here is the typical response of the AI assistant after I ask why the code base is not being considered:

My apologies for the confusion. If there's already a function named fetch_inspections_for_plants, we don't need to create a new one. We should instead use the existing one in place of the duplicated logic we were previously using.

0

Toghrul Maharramov, please create a bug report on our issue tracker: https://youtrack.jetbrains.com/newIssue?project=LLM, and share more details about the issue you encountered - a reproduction project/steps to try, screenshots that represent the issue, debug logs (see How to collect IntelliJ IDE logs for AI Assistant troubleshooting), and chat dumps.

0

This is a big problem with all AI assistants. For tracking down tricky behavior, codebase context is the most important thing: it lays the foundation for a question to work at all. It needs to be supported on the IDE side, so that we can simply add folders through a dedicated user interface. A separate, generic interface for adding files to a dedicated AI context would also make the IDE more robust; it isn't that hard, as it would add only one new, well-defined component to the runtime.

The old notions of “context” running inside the IDE are really messing things up.

I have a test case that took me several hours to solve, and the solution turned out to be very simple in the end. So I started testing which AI assistants could handle the case, and how. The first requirement is being able to define the code context exactly, 100%. It looks like PhpStorm messes this up from the start; it could be fixed by building a totally separate, dedicated interface for delivering context to AI assistants.

As a first try I ended up asking: what classes are declared in my codebase? This would mean all the directories under htdocs. Currently no assistant can get anywhere with that question, because the interface for querying the codebase context is clearly not mature or generic.

The tricky trace leading to my error depends entirely on the AI being able to follow the current directory, and it must have access to all the actual classes in the project tree. It makes no sense for me to supply individual files, since the problem is universally not solvable yet: the interface simply isn't there. AI could be much better at this, even in its current infancy, if given a chance with a correct and consistent codebase-delivery interface. AI is still extremely good at lower-level, language-related questions.

I have tried numerous AI assistants and none can deal with PhpStorm's codebase-context concept.

 

0
