Extractor - data stream iterated only once assertion error

Answered

I would like to iterate over ALL_COLUMNS to get each column's name and data type, and then process the ROWS data stream. The reason for this is that in my MERGE extractor I want to build the static text of the MERGE statement while excluding the blob, clob, nclob, bfile, image, and sys.xmltype data types.

I looked at using FORMATTER.getTypeName(object, column), but I could only get it to work by supplying a ROW as the object, and that triggered the assertion error. As a workaround I iterated ROWS once to get the data types and then again to build the body of the merge statement.

Is there a way to use getTypeName without iterating ROWS?

Uploaded my merge work in progress [Upload id: 2020_10_25_J878XCAn8o4fAWRh (file: Merge.sql.groovy)]

3 comments

Sorry for such a slow reply.

For anyone who has the same problem:
You may pass a null value to getTypeName as the object, and you'll get the column type:

COLUMNS.forEach { column ->
    OUT.append(FORMATTER.getTypeName(null, column))
    OUT.append("\n")
}


Actually, the only case where you need to pass a real object to getTypeName is MongoDB, because in MongoDB the type is determined by the object and not by the column.
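For the original MERGE use case, the null-object call can drive the column filtering without touching ROWS. A minimal sketch using the DataGrip extractor bindings (COLUMNS, FORMATTER, OUT); the SKIP_TYPES list and the exact type-name strings returned by getTypeName are assumptions you'd need to adjust for your dialect:

// Hypothetical list of type names to exclude; verify the actual
// strings getTypeName returns for your database
def SKIP_TYPES = ["blob", "clob", "nclob", "bfile", "image", "sys.xmltype"]

// Filter columns using metadata only - ROWS is never consumed here
def mergeColumns = COLUMNS.findAll { column ->
    !SKIP_TYPES.contains(FORMATTER.getTypeName(null, column).toLowerCase())
}

// Build the static column list of the MERGE statement once;
// ROWS can then be iterated a single time for the row data
OUT.append(mergeColumns.collect { it.name() }.join(", "))
OUT.append("\n")

This keeps the single pass over ROWS for the row values, since the column metadata alone decides which columns appear in the statement.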


In SQLite, other than an INTEGER PRIMARY KEY column, a column can contain values of multiple data types.


create table TEST_TBL (
    ID INTEGER,
    DESC TEXT,
    TEST
);

INSERT INTO TEST_TBL (ID, DESC, TEST) VALUES (1, 'integer', 255);
INSERT INTO TEST_TBL (ID, DESC, TEST) VALUES (2, 'real', 0.333314829616256247390992939472198486328125);
INSERT INTO TEST_TBL (ID, DESC, TEST) VALUES (3, 'text', 'Lorem ipsum dolor sit amet, consectetur adipiscing elit.');
INSERT INTO TEST_TBL (ID, DESC, TEST) VALUES (4, 'blob', x'0123456789ABCDEF0123456789ABCDEF');
INSERT INTO TEST_TBL (ID, DESC, TEST) VALUES (5, 'numeric', 2048.33);
INSERT INTO TEST_TBL (ID, DESC, TEST) VALUES (6, 'date', datetime('now'));
INSERT INTO TEST_TBL (ID, DESC, TEST) VALUES (7, 'boolean', true);
INSERT INTO TEST_TBL (ID, DESC, TEST) VALUES (8, 255, 'integer');
INSERT INTO TEST_TBL (ID, DESC, TEST) VALUES (9, 0.333314829616256247390992939472198486328125, 'real');
INSERT INTO TEST_TBL (ID, DESC, TEST) VALUES (10, 'Lorem ipsum dolor sit amet, consectetur adipiscing elit.', 'text');
INSERT INTO TEST_TBL (ID, DESC, TEST) VALUES (11, x'0123456789ABCDEF0123456789ABCDEF', 'blob');
INSERT INTO TEST_TBL (ID, DESC, TEST) VALUES (12, 2048.33, 'numeric');
INSERT INTO TEST_TBL (ID, DESC, TEST) VALUES (13, datetime('now'), 'date');
INSERT INTO TEST_TBL (ID, DESC, TEST) VALUES (14, true, 'boolean');

commit;

select * from TEST_TBL TT order by 1;

Currently, DataGrip can infer the type from a value only in MongoDB and Couchbase.
If you work with SQLite, you can only get the type of a column.
I created a ticket for it: https://youtrack.jetbrains.com/issue/DBE-12406

You can also get the Java type of a value:

def COLUMN_NAME = "TEST"

def multiTypeColumn = COLUMNS.find { column -> column.name() == COLUMN_NAME }

if (multiTypeColumn == null) {
    OUT.append("No column named ")
    OUT.append(COLUMN_NAME)
    OUT.append(" was found")
    return
}

ROWS.forEach { row ->
    def v = row.value(multiTypeColumn)
    if (v == null) OUT.append("null")
    else OUT.append(v.getClass().getSimpleName())
    OUT.append("\n")
}
