Trouble with view and proc code getting out of sync

I am using DataGrip with SQL Server 2008. I started using DataGrip because of my extremely positive experience with IntelliJ, PyCharm, and CLion.

I am having a heck of a time with stored procedure and view code not refreshing. I can synchronize, and I can select Help -> Invalidate Caches, but I still get stale code when I open the objects.

I can turn on "notify when object has changed" and then click Revert Local Changes after I open it, but it takes 30-40 seconds each time before the revert link actually works. Then, every time I make a change and commit it, it pops up again. Shouldn't the version that was just committed to the server match what is in my editor?

I do not understand why reverting is slow and why synchronizing is not working. It seems that if I synchronize (or invalidate caches) I should get the "current" version?

It has gotten to the point that I don't trust DataGrip despite all its awesome features. I am always uncomfortable that I may be losing some important change made outside of DataGrip.

Is there something I should be doing in the connection settings or in setup that will force newly opened objects to always reflect the current state of the server?

Thanks much,

Chris

38 comments
Official comment

Isak, thanks for the clear comment. 

In fact, to get the actual version of the code you need to press Refresh on the toolbar. Before 2017.3.4 this didn't work properly; now it does.

This means that when you open the source code of a procedure, you cannot be sure it is fully up to date. If it has been changed from another tool, DataGrip doesn't know about it. Why not just refresh the code when you open it? Let me try to explain.

In DataGrip we have a source code storage: when DataGrip syncs the schema, we keep all the sources locally, current as of the moment of synchronization. It is a kind of model of your database: it not only lets you search through the source code base, but also provides navigation → you can see, for example, that table A is used in procedure B, that procedure B is executed from procedure C, and so on. This model needs to be consistent for DataGrip's smart navigation and search features to work. That is why, if we refreshed only one procedure's code, we would break the consistency of the model.

So you need to refresh the whole schema. Why don't we do that when you open the code? Because synchronizing the schema on every click on a procedure is not a good idea. If you perform some kind of update, we sync if Auto-sync is on, but refreshing the whole schema when you just view the source code is not what the user expects. Imagine it worked that way: I open procedure A, then navigate to view B, then navigate to some other procedure (as you may know, we resolve objects in the code). Would we need to refresh the schema three times? No way.

But, as we can see, a usability problem exists.

We are considering the following feature: when you open the source code, we run a small query to check whether our local version of the code is the same as the one in the database. If not, you will see a note, something like 'you need to click Refresh to see the actual code'. If you really need to see the new version (i.e. not just to understand what the proc does, but to update it), you click Refresh, and it refreshes the source code of the whole schema.

We think this will bring back your confidence in what you see while keeping the fast navigation/search features.
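For anyone who wants to run that kind of freshness check by hand, here is a rough sketch in T-SQL using SQL Server's catalog views. This is not DataGrip's actual internal query; the `@last_sync` value is an illustrative assumption standing in for the time of the last schema sync.

```sql
-- Illustrative sketch only: not DataGrip's internal implementation.
-- List procedures and views modified on the server since the last schema sync.
DECLARE @last_sync datetime2 = '2018-01-01T00:00:00'; -- assumed time of last sync

SELECT o.name,
       o.type_desc,
       o.modify_date
FROM sys.objects AS o
WHERE o.type IN ('P', 'V')          -- stored procedures and views
  AND o.modify_date > @last_sync;   -- changed after the local copy was taken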


Why do you keep asking the same question, "Is auto-sync enabled"? We are all telling you that we *manually* execute "Synchronize" and it DOES NOT SHOW THE CORRECT/RECENT VERSION OF THE PROC from the server. We have told you that auto-sync IS enabled yet you still do not offer a solution.


Hi,

I think the reason is that unless you use the specific technology every day, you don't fully understand it... So I figured I would try to make a small video to show what I find so confusing/frustrating and how this issue happens. I think it may appear that auto-sync is not working either way, but here it is, for everybody to watch: https://www.screencast.com/t/Z94gaFb1Lko

Apologies, it is a full five minutes, but I wanted to explore the issue in detail. I think there are a couple of points:

1) auto-sync takes time and isn't immediate

2) the issue only gets noticed when two people are working on the same thing in fast succession, which is more common in today's world of continuous integration

3) I suspect this isn't the case just for SQL Server, but that is where you notice it most, probably because of stored procedures and their little quirks. SQL Server is also more common in company environments with multiple developers

4) and before you ask : https://www.screencast.com/t/kFV7qYptEehh

 


I thought I had a synchronization issue as described above, but found that I was not using the tool properly. I was expecting it to act exactly like SQL Server Management Studio. Below is what I was doing.

  1. Double-click a procedure, which brings up the CREATE PROCEDURE script.
  2. Change the CREATE to ALTER and then make my change.
  3. Select all the text and click Run.
  4. Expect my procedure to be updated.
  5. It did not alter the procedure on the server. Forcing a refresh did not help.

The problem was that the file opened by double-clicking the procedure was not capable of running the script. This is what worked for me:

  1. Double-click a procedure, which brings up the CREATE PROCEDURE script.
  2. Select all of the text and create a new scratch file, pasting the text into it.
  3. Change the CREATE to ALTER and then make my change.
  4. Select all the text and click Run.
  5. The ALTER statement has now altered the procedure on the database.
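A minimal sketch of what the scratch file might contain after step 3, with CREATE already changed to ALTER. The procedure name, parameter, and body are made up for illustration; they do not refer to real objects in anyone's database.

```sql
-- Hypothetical example: dbo.usp_GetOrders and dbo.Orders are illustrative names.
ALTER PROCEDURE dbo.usp_GetOrders
    @CustomerId int
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderId, OrderDate
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
END;
```

Running this from the scratch file updates the procedure on the server, whereas running it from the object's own source tab did not.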

I hope this helps someone.

Hi,

Could you explain the way you submit your changes to a database? How big is your database? Could you provide screenshots of your data source settings?

Thank you


Generally I am just opening the proc, changing it to an ALTER (or choosing "Modify" in SQL Server Management Studio), making my change, and executing the ALTER. I do understand that if I have a proc open in a window when I bring up DataGrip, it may not reflect a change I made elsewhere, but I do expect to see the most recent version when I open a new edit window with the proc.

At a guess, I would say the DB has 30-40 tables, and we have 30+ procs or views, maybe quite a lot more; it varies. The server itself will have 20-50 databases depending on how much we have archived. Some tables are quite small; a few have millions of records.

I will see what I can do for screenshots. It is a super busy time for us right now, so I may have to revisit this issue later.

I do seem to be having less trouble with the "SQL source modification detection" setting since I fixed all my LF/CR settings to be the same across my editors, but I have not tested that.

I've been experiencing this issue since updating to 2017.2.2. I'm using DataGrip to connect to a SQL Server 2008 database. I can Synchronize, which appears to execute without errors, but opening a stored proc which I know was modified on the server does not show the update locally. In addition, the "SQL source modification detection" notification doesn't appear to work any more. While editing a proc, it previously would notify me that my local changes are not synced with the server and ask whether to keep local changes or revert. Now it doesn't give me the option. This is a show-stopper issue, as it will cause my changes to overwrite the most recent version on the server.

To answer Vasily's questions above: I open the proc, change CREATE to ALTER, modify as necessary, and execute the SQL. This happens across all databases, across all projects, big or small. What specifically do you need to see in the screenshot? I'm not going to screenshot the Data Source Properties with the connection string info.


Hi,

Try using "Forget Cached Schemas..." and don't forget to alter your procs.

Thank you.


I have the same problem... I modified a stored procedure, clicked "Forget Cached Schemas", then clicked "Synchronize", then opened the stored procedure I had just modified, and an old version is displayed (the version before my changes). However, the changes were made, because I can see them using another tool (SQL Server Management Studio).


Hi,

 

I'm experiencing something similar on Azure SQL Server 2016 with the latest DataGrip 2017.2. I change a procedure using SSMS on the server and it shows me that it has changed, but opening it with DataGrip I still see the old version. I tried Synchronize and Forget Cached Schemas. Is there a fix?


Hi,

Did you enable Auto sync?



Thank you.


Hi,

If you modify routines outside DataGrip you can refresh sources to ensure everything is correct:

Thank you.


Hi, auto-sync is enabled, and I tried syncing both the data source and individual procedures (aka routines). I even deleted the cache and loaded it again. In my desperation I even restarted the PC.


I'm having the same issue (running 2017.2.3, SQL Server 2016). It shows a stale version of a procedure, and does not update even if I try to do it manually. This issue is so serious that I cannot trust or recommend this tool.

Showing a stale version of a procedure when we navigate to it is absolutely crazy, something that should never happen no matter what. I think most people could understand a list of tables not being 100% up to date all the time and sometimes needing a manual nudge, but with an issue like this you have crossed a line.


@Isak Sky Hi,

Could you describe your scenario? Could you attach a screenshot of your data source settings? Do you have "Auto-sync" enabled?

Thank you.


@vasily chernov here are my settings. Auto-sync is enabled.

My scenario is just wanting to see the latest version of a procedure on a database. The database can get modified by other people, tooling, using many different programs.

 


@Katerina Wait Hi,

We understand the problem you're talking about.

Thank you.


We have the same issue. If you double-click the name of a function/procedure, it will open a new console with the OLD (local) version of that object.

But if you right-click the name of the object, then go to DDL and Sources -> Generate DDL to Console, it will open the NEW, most recent version.

Don't double click to open any objects in DataGrip.


We fixed many problems connected with this in the latest release. I kindly ask everybody who wrote here to try the current version: 2017.3.4

If the problem still exists, please describe the case (or clarify that it is the same as above :)

Thanks!


@Maxim

 

Thanks for the update. I just gave this a try using 2017.3.4, and the problem still exists in my case.

Steps:

1. Open procedure X on a SQL Server 2017 database via the Control-N menu in DataGrip.

2. Change procedure X in another program, e.g., SQL Server Management Studio.

3. Close the procedure tab in DataGrip, and then open it again via Control-N. It will show an out-of-date version.
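To confirm what the server actually holds at step 3, one option is to query the definition directly from any client. `dbo.X` below stands in for the procedure in the steps above; the name is illustrative.

```sql
-- Returns the server-side source of the procedure, bypassing any client cache.
SELECT OBJECT_DEFINITION(OBJECT_ID(N'dbo.X')) AS server_source;
```

If this output differs from what DataGrip's tab shows, the tab is serving a stale local copy.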

 


@Maxim, thanks for the response.

That sounds like it would be an improvement, but I still wonder: why do you think it is OK to show any stale code whatsoever? Do your programmers do this when opening Java source files? Of course not; it would be completely crazy, and programmers would stop using such a tool immediately. I wonder why you seem to think it is acceptable here. It seems to me that any competent DBA would ban a tool like this from making any code changes.

This is a shame, because DataGrip has a lot of other features where it is way ahead of any competition, like the data editor and the object finder.

I can understand not wanting to do complete, expensive schema rebuilding constantly, but that should only apply to information that is not absolutely crucial, like autocomplete metadata, type information, etc. Not to showing the code!

This is how it works in normal programming environments: open a file and you always see the latest version, though you may sometimes see red error squiggles that are incorrect because they haven't taken the latest changes to other files into account yet. That is completely understandable; programmers know they need to either wait or manually trigger a cache invalidation. Showing a stale version of the file, however, is not: that is a recipe for disaster.

I would consider maintaining more than one version of the schema in memory: one that can be used for operations that are not crucial, like autocomplete and type information, and one for operations where getting stale information is unacceptable.

 


@Isak

So, the first and most important question: what is the problem if DataGrip has the feature I described above? You would see that the code needs to be updated and click the Refresh button. That would make you completely sure, wouldn't it?

Then, when you describe working with files, you say: "OK to show any stale code whatsoever? Do your programmers do this when opening java source files? Of course not". The answer is "it's possible", because we use a VCS. How do you know that a file has changed in the repository? You manually update your project. That is nearly the same here: you have some consistent version of the database you work with. Want to stay updated? We will warn you (right now we don't, but we all agree that's not OK).

 


@Maxim 

> So, the first and most important question: what is the problem if DataGrip has the feature I described above? You would see that the code needs to be updated and click the Refresh button. That would make you completely sure, wouldn't it?

You are right, it would, but it is still strange. Imagine if Visual Studio also maintained its own version of C# class definitions, separate from the normal source file system and separate from VCS. Could they then, as in our case here, alleviate the problem with a warning when their version of a class did not match the one on the file system? Yes, but the obviously better alternative would be to simply never show it for any reason. The same principle applies here.

It is true that VCS also adds a version that you have to think about, but in that case the costs and benefits are well understood: it is something needed to work as a team, and it is consistent regardless of which editor you use. In this case, it is hard to see what the benefits are, unless one considers what is currently convenient for the internals of DataGrip.


> We are considering the following feature: when you open the source code, we run a small query to check whether our local version of the code is the same as the one in the database. If not, you will see a note, something like 'you need to click Refresh to see the actual code'.

After updating DataGrip to 2017.3.4, I can confirm that the Refresh button now works and refreshes the code to the latest version. But you actually have to remember to do that every time you open any function, because it DOES NOT show a note reminding you that you have a stale version and need to refresh, as you described. That feature still does not work.

P.S. I do have "Auto Sync" enabled in data source settings.


@Neket01 Yes, as for the "showing" part, we hope to have it in 2018.1, as I wrote above.


I had been away from DataGrip for a while, but now that I have a few projects suited to it (and a little time to experiment) I have given it another shot. I was delighted to see that the sync seems to work now, and I look forward to 2018.1 to get the warning.

The way I develop when working with SQL is different from, say, Scala with IntelliJ, which I work with out of a repository. There is a focus on real-time operational work directly against the server. I would still prefer that DataGrip auto-sync each time I re-open a proc to edit, but I can be happy just being prompted to push a button.


We didn't manage to include the warning in 2018.1.

Please, follow the ticket: https://youtrack.jetbrains.com/issue/DBE-6185


I am having a heck of a time with stored procedure and view code not refreshing (SQL Server 2014). I can synchronize, I can select Help -> Invalidate Caches, but I still get stale code when I open the objects. I have tried the refresh action from the sproc editor, the data source, and the parent data source node, and auto-sync is on.

Please let me know if I'm missing something?


As a temporary workaround, one needs to invoke the "Database tools -> Forget Cached Schemas" action to get up-to-date routine sources.


I can confirm this issue is present, and there is something else going on... Whenever I connect to a database, say after a few hours or a day, and it syncs automatically on connection, the older versions of objects that were in DataGrip's cache overwrite the work other people did while I was away. Please confirm if anyone else has faced this issue.

I have questions for the DG devs/support:

  1. Does "Synchronize" actually mean both ways: downloading changes from the server and uploading my changes to the server? Or does it just download and update the local cache?
  2. Can we have an automated, scheduled (configurable) cache refresh job in the background, say every 1, 3, or 10 minutes?