Database tool functionality needed

0xDBE is shaping up nicely, but it still needs some database "tool" functionality. Right now 0xDBE seems focused mainly on being a query tool; it needs to expand into functionality that works on the underlying data and across connections.

Here are some examples:

- Execute SQL script

The ability to run a SQL script without loading it into the editor. I can easily have a 200+ MB SQL file that 0xDBE can't open. Optionally, the script should run in batches of N statements.

- Import data from external file

Import data from CSV, Excel, fixed-width files, etc. into tables, either into an existing table or into a newly created one.

- Copy table(s)

Copy a table (schema and data) within a database, or optionally across to another database connection.
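A cross-connection copy is conceptually simple, which is why it's frustrating that the tool doesn't offer it. A rough sketch of the idea in Python (sqlite3 stands in for both connections; a cross-engine copy would also need DDL translation):

```python
import sqlite3

def copy_table(src, dst, table, batch_size=500):
    """Copy the schema and rows of `table` from connection `src` to
    connection `dst`, streaming rows in batches so large tables are
    never fully materialized in memory."""
    # Recreate the table on the destination using the stored DDL.
    ddl = src.execute(
        "SELECT sql FROM sqlite_master WHERE type='table' AND name=?",
        (table,),
    ).fetchone()[0]
    dst.execute(ddl)
    cur = src.execute(f'SELECT * FROM "{table}"')
    placeholders = ", ".join("?" for _ in cur.description)
    while True:
        rows = cur.fetchmany(batch_size)
        if not rows:
            break
        dst.executemany(
            f'INSERT INTO "{table}" VALUES ({placeholders})', rows
        )
    dst.commit()
```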

- Back up table(s)/database

Pull out schema and associated data for later import.


Hi Charles,

Huge SQL files can be executed from the Files view (use the context menu action).

Two other features you've mentioned are already in our issue tracker and will be implemented eventually.

Thank you for your input.


That doesn't give you the option to run the file in batches.

In general, the tool-level features of 0xDBE are hidden in right-click menus and awkward to get at. Here's an example: I want to dump the results of a large query and import the data into a new table in another database.

1. Open a console on the source database.
2. Type in and run the query.
3. Search around for a save button, see the Comma...(CSV) item and click on it.
4. Choose SQL Insert Statements. It's selected but nothing happens.
5. Right-click on the data grid. See the option 'Save Data to File...' and the 'Data Extractor: SQL I...ments'. Choose the Save Data option and save the data to a file.

[manipulate the file on the command line, vi, sed, etc. to]

6. File -> Open the enormous file. Get the error that the file is too big to be loaded.
7. Hunt around for some way to run a given file. Stumble on the Files view. It doesn't show any files. Take forever to click on the Scratches item and discover the Files option.
8. Follow the direction to right-click on the view and add the directory the manipulated file is in.
9. Right-click the manipulated file and choose Run, then choose the datasource. Get no option to batch the statements in the file.

Spelunking through data is awesome with 0xDBE, but manipulating that data is very sub-par. It's set up in an object -> function manner, while most database tools are set up in a task -> function manner. As a result, things that don't fit well into the object -> function matrix are hard to find, and it's hard to remember where they are.


@Charles thanks for posting this. This is the last big obvious hurdle to using only DataGrip. I can do these things with scripts, of course, but that's part of what I hope the tooling will do for me. In fact, I still use Navicat for quick backups and restores in two (OK, three) clicks; I don't have to tell it where to store the dump. Importing files is the same.

DataGrip has improved this, but setting up the export destination is still a multi-step process, and I can't dump a whole database without dropping down to the terminal. I also don't get timestamped exports.

@Sergey do you have the links for these features on the issue tracker?

