Details
Type: Bug
Status: Closed
Priority: Major
Resolution: Fixed
Affects Version/s: 1.4.0
Fix Version/s: None
Description
Build tested: 1.4.0-1
server commit: 67452bc
engine commit: 64ceb86
With S3 LocalStorage on a single-server configuration, I created a dbt3 database and tried to cpimport a 1 GB dataset. While it was loading the lineitem table, cpimport emitted the error shown in the output below and never finished; after almost an hour I had to kill the cpimport process. The same test with S3 cloud storage (AWS) completed successfully.
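For reference, a minimal reproduction sketch. The exact command line is not recorded in this report, and the storagemanager.cnf keys shown in the comments are assumptions based on the default 1.4.0 configuration:

# Storage backend is selected in storagemanager.cnf (assumed key names):
#   [ObjectStorage] service = LocalStorage    <- failing case
#   [ObjectStorage] service = S3              <- AWS case, which passed
# Standard single-file load form: cpimport <db> <table> <load file>
cpimport dbt3 lineitem /data/qa/autopilot/data/source/dbt3/1g/lineitem.tbl

cpimport output: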
Using table OID 3092 as the default JOB ID
Input file(s) will be read from : /root/tests
Job description file : /usr/local/mariadb/columnstore/data/bulk/tmpjob/3092_D20190910_T205837_S825658_Job_3092.xml
Log file for this job: /usr/local/mariadb/columnstore/data/bulk/log/Job_3092.log
2019-09-10 20:58:37 (17949) INFO : successfully loaded job file /usr/local/mariadb/columnstore/data/bulk/tmpjob/3092_D20190910_T205837_S825658_Job_3092.xml
2019-09-10 20:58:37 (17949) INFO : Job file loaded, run time for this step : 0.096211 seconds
2019-09-10 20:58:37 (17949) INFO : PreProcessing check starts
2019-09-10 20:58:37 (17949) INFO : input data file /data/qa/autopilot/data/source/dbt3/1g/lineitem.tbl
2019-09-10 20:58:37 (17949) INFO : PreProcessing check completed
2019-09-10 20:58:37 (17949) INFO : preProcess completed, run time for this step : 0.120688 seconds
2019-09-10 20:58:37 (17949) INFO : No of Read Threads Spawned = 1
2019-09-10 20:58:37 (17949) INFO : No of Parse Threads Spawned = 3
SocketPool: warning! Probably got a bad length field! payload length = 12 endOfData = 42 startOfPayload = 9
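Reading the warning's fields: the declared payload length (12) does not match the bytes actually buffered (endOfData 42 - startOfPayload 9 = 33), which is presumably why SocketPool flags the length field as bad. This suggests the message stream to the storage layer is being misframed under the LocalStorage backend.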
Issue Links
relates to: MCOL-3566 cpimport fails on PM2 with S3 storage backend (Closed)