[CONJ-421] Spark error: java.sql.SQLException: Out of range value for column 'i' : value i is not in Integer range Created: 2017-02-01 Updated: 2018-08-03 |
|
| Status: | Open |
| Project: | MariaDB Connector/J |
| Component/s: | None |
| Affects Version/s: | 1.5.7 |
| Fix Version/s: | None |
| Type: | Task | Priority: | Minor |
| Reporter: | David Thompson (Inactive) | Assignee: | Diego Dupin |
| Resolution: | Unresolved | Votes: | 1 |
| Labels: | None | ||
| Issue Links: |
|
| Description |
|
Install Spark 2.0 using Docker for simplicity, following this image: copy the sample docker-compose.yml locally and run it. After a while the master and worker will both start and logging stops. Now:
the df.show will result in a stack trace with the error:
If I instead use the MySQL connector (mysql-connector-java-5.1.40.tar.gz), it works. This Docker image uses Java 8; however, I modified it to use Java 7 and rebuilt, and the error still happens, so this is not a Java 8 specific issue. The table definition is very simple: |
| Comments |
| Comment by David Thompson (Inactive) [ 2017-02-01 ] |
|
This also happens with Spark 2.1, which is the latest version of Spark. |
| Comment by Diego Dupin [ 2017-02-01 ] |
|
Reproduced, so I'll be able to debug the issue. |
| Comment by Diego Dupin [ 2017-02-03 ] |
|
Back on it: somehow, Spark executes the query 'SELECT "i","ip" FROM tmp', not 'SELECT i,ip FROM tmp'. |
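The error message is consistent with the server treating "i" as a string literal: with MySQL's default SQL mode (ANSI_QUOTES off), double quotes delimit strings, so SELECT "i","ip" FROM tmp returns the constant strings i and ip rather than the column values. The driver then tries to convert the text "i" into the INT type it previously retrieved for the column, which fails. A minimal Python sketch of that driver-side conversion (the function name and message format are illustrative, not the actual Connector/J code):

```python
def get_int(raw: str, column: str) -> int:
    """Illustrative driver-side conversion of a text result to an INT column.

    If the result set holds the literal string "i" instead of a numeric
    value, the conversion fails with an out-of-range style error, mirroring
    the exception reported in this issue.
    """
    try:
        value = int(raw)
    except ValueError:
        raise ValueError(
            f"Out of range value for column '{column}' : "
            f"value {raw} is not in Integer range"
        )
    # A real driver also range-checks values that parsed but overflow INT.
    if not (-2**31 <= value <= 2**31 - 1):
        raise ValueError(
            f"Out of range value for column '{column}' : "
            f"value {raw} is not in Integer range"
        )
    return value
```

Here get_int("42", "i") returns 42, while get_int("i", "i") raises an error with the same shape as the one in the reported stack trace.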
| Comment by Diego Dupin [ 2017-02-03 ] |
|
All works well with Spark when using a connection string with "jdbc:mysql:...", but not when using "jdbc:mariadb:...", because the MySQL dialect is then not used. When it is not used, the default identifier quote is ", not `. So an internal query generated by Spark like "SELECT `i`,`ip` FROM tmp" will then be executed as "SELECT "i","ip" FROM tmp" with the data types previously retrieved, causing the exception. I'll make a pull request to Spark so that "jdbc:mariadb:" connection strings can be handled. |
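The dialect selection described above can be sketched as follows: Spark's MySQLDialect registers for URLs starting with "jdbc:mysql:" and quotes identifiers with backticks, while the fallback dialect quotes with ANSI double quotes. The helpers below are an illustrative stand-in for that behavior, not Spark's actual JdbcDialect code:

```python
def quote_identifier(column: str, jdbc_url: str) -> str:
    # Spark's MySQLDialect only matches URLs starting with "jdbc:mysql:";
    # anything else (including "jdbc:mariadb:") falls back to the default
    # dialect, which quotes identifiers with ANSI double quotes.
    if jdbc_url.startswith("jdbc:mysql:"):
        return f"`{column}`"
    return f'"{column}"'


def build_select(columns: list, table: str, jdbc_url: str) -> str:
    # Mimics the internal SELECT that Spark generates when reading a table.
    quoted = ",".join(quote_identifier(c, jdbc_url) for c in columns)
    return f"SELECT {quoted} FROM {table}"
```

With a "jdbc:mariadb:" URL this produces SELECT "i","ip" FROM tmp, which MySQL (with ANSI_QUOTES off) reads as two string literals. As a possible workaround until Spark recognizes the "jdbc:mariadb:" prefix, the MariaDB driver should also accept "jdbc:mysql:" connection strings, in which case Spark applies the MySQL dialect and quotes with backticks.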
| Comment by Diego Dupin [ 2017-02-03 ] |
|
(pull request is dependent on |
| Comment by Russell Spitzer [ 2017-05-25 ] |
|
I couldn't find a corresponding Spark JIRA for this. Has one been made? If so, can you please link it here? |
| Comment by Dieter Vekeman [ 2018-08-03 ] |
|
I ran into the problem recently and created one now. Not sure if the patch has been submitted and is hanging somewhere? |