[MCOL-2276] Unable to insert certain row Created: 2019-04-04 Updated: 2020-01-28 Resolved: 2019-11-06 |
|
| Status: | Closed |
| Project: | MariaDB ColumnStore |
| Component/s: | DMLProc |
| Affects Version/s: | 1.2.3 |
| Fix Version/s: | 1.2.6 |
| Type: | Bug | Priority: | Major |
| Reporter: | Piotr | Assignee: | Bharath Bokka (Inactive) |
| Resolution: | Fixed | Votes: | 0 |
| Labels: | None |
| Environment: | Ubuntu 18.04.2 LTS |
| Attachments: | |
| Issue Links: | |
| Sprint: | 2019-04, 2019-05, 2019-06 |
| Description |
|
Hi! I'm having trouble inserting one row (statement in attachment) into the table: CREATE TABLE `text` ( The INSERT appears in the process list in the Update state and hangs there forever; I can't see any progress. I checked the logs in /var/logs/mariadb/columnstore but couldn't find anything helpful. What I find weird is that if I remove the last few characters from the 2nd field, the insert works properly; likewise, if I add a few more characters, it also works. Unfortunately I have no idea what the cause is. Thank you in advance for your time. |
| Comments |
| Comment by Andrew Hutchings (Inactive) [ 2019-04-04 ] | ||
|
Column 2 is 8176 bytes, which happens to be the block boundary size for TEXT data (an 8192-byte block minus its 16-byte header). I have a feeling the block allocation loop in Dctnry::insertDctnry is spinning trying to find a suitable block and never finding one. | ||
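The arithmetic behind that observation can be sketched as follows. This is only an illustration of the boundary condition described in the comment, not ColumnStore's actual allocator code; the block and header sizes are taken from the comment above:

```python
# Illustrative sketch of the block-boundary condition described above.
# Assumption: ColumnStore dictionary blocks are 8192 bytes with a
# 16-byte header, leaving 8176 bytes of usable space per block.
BLOCK_SIZE = 8192
HEADER_SIZE = 16
USABLE = BLOCK_SIZE - HEADER_SIZE  # 8176 bytes

def fills_block_exactly(value_len: int) -> bool:
    """A value whose length is an exact multiple of the usable block
    space leaves no free room in the current block -- the case the
    allocator reportedly fails to handle, causing the hang."""
    return value_len > 0 and value_len % USABLE == 0

print(USABLE)                     # 8176
print(fills_block_exactly(8176))  # True: hits the boundary exactly
print(fills_block_exactly(8175))  # False: a shorter value fits
print(fills_block_exactly(8177))  # False: spills into the next block
```

This matches the reporter's observation that removing or adding a few characters in the second field makes the insert succeed.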
| Comment by Piotr [ 2019-04-04 ] | ||
|
I have SQL dumps with lots of inserts. Do you have any idea how to handle or avoid those unfortunate rows? I'm a little stuck. | ||
| Comment by Andrew Hutchings (Inactive) [ 2019-04-05 ] | ||
|
Hi arcturus, unfortunately I don't yet know of a good workaround beyond making sure the length of the data in a TEXT column isn't an exact multiple of 8176 bytes. Alternatively, you could store the text in an InnoDB table and join as required (this may be faster anyway). cpimport may also not have the same bug, though I have not yet checked. | ||
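A dump could be pre-checked for rows that would hit this boundary before attempting the INSERTs. The helper below is hypothetical (not part of any ColumnStore tooling) and only sketches the length test the workaround implies:

```python
# Hypothetical pre-check for dump data: flag TEXT values whose byte
# length is an exact multiple of 8176 (assumed usable dictionary-block
# size: 8192-byte block minus 16-byte header), since those are the
# rows reported to hang on INSERT.
USABLE_BLOCK = 8176

def suspect_values(rows, column_index):
    """Yield (row_number, byte_length) for values on the boundary."""
    for i, row in enumerate(rows, start=1):
        n = len(row[column_index].encode("utf-8"))
        if n and n % USABLE_BLOCK == 0:
            yield i, n

# Example: rows 1 and 3 land exactly on one and two blocks.
rows = [("a", "x" * 8176), ("b", "x" * 100), ("c", "x" * 16352)]
print(list(suspect_values(rows, 1)))  # [(1, 8176), (3, 16352)]
```

Flagged rows could then be padded by a character or routed through a different load path, per the suggestions above.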
| Comment by Piotr [ 2019-04-08 ] | ||
|
Thank you. cpimport (run from the command line) successfully inserted this row and all the others from the CSV file, although I had to change the data types from BLOB to TEXT. I have now encountered another problem importing my data into a different table. This time my CSV file is 130GB, and I'm running cpimport the same way as for the previous (7GB) file. Despite setting <NumBlocksPct>50</NumBlocksPct> in Columnstore.xml, and even lower values (I restarted the server too), memory usage climbs to the maximum (~15GB), then the swapfile also fills (2GB), and the server restarts. I got the following errors: Turning off swap didn't fix the problem. Sorry to mention my next problem here, but I don't feel it deserves a new topic; please correct me if I'm wrong. | ||
| Comment by Andrew Hutchings (Inactive) [ 2019-04-08 ] | ||
|
arcturus, it would actually be great if you could create a new Jira ticket for this, along with details of your setup (number of UMs and PMs). It looks like we may not be freeing memory correctly in cpimport for TEXT/BLOB, which is a different problem. | ||
| Comment by Bharath Bokka (Inactive) [ 2019-11-06 ] | ||
|
Verified: 'insert4.sql' works fine on the above build. | ||