Details
Type: Bug
Status: Closed
Priority: Major
Resolution: Fixed
Fix Version/s: 1.1.4
Component/s: None
Labels: None
Environment: CS 1.1.3 on CentOS 7
Sprint: 2018-07, 2018-08
Description
1. Start a bulk load of a large CSV file with the Pentaho bulk load adapter. The file contains 121 million rows (9 GB in size).
2. The Pentaho CSV file input step reads the data in chunks and passes them to the Bulk Loader step (see the sketch after the log below).
3. After the last chunk has been read, the following error is displayed on the server where the Pentaho job was run:
2018/03/26 15:06:05 - MariaDB ColumnStore Bulk Loader.0 - Linenr 121250000
2018/03/26 15:06:06 - CSV file input.0 - Finished processing (I=121270191, O=0, R=0, W=121270191, U=0, E=0)
#
# A fatal error has been detected by the Java Runtime Environment:
#
# SIGSEGV (0xb) at pc=0x00007fe93dee54dc, pid=32515, tid=0x00007fe9255ea700
#
# JRE version: OpenJDK Runtime Environment (8.0_161-b14) (build 1.8.0_161-b14)
# Java VM: OpenJDK 64-Bit Server VM (25.161-b14 mixed mode linux-amd64 compressed oops)
# Problematic frame:
# C [libc.so.6+0x804dc] cfree+0x1c
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /home/elena.kotsinova/downloads/data-integration/hs_err_pid32515.log
#
# If you would like to submit a bug report, please visit:
# http://bugreport.java.com/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
See the attached hs_err_pid32515.log for the full error report.
Result:
The data are not loaded into the database.