[CONJ-1029] After upgrading to MariaDB JDBC Driver for Java version 3.1.0 I'm now receiving error message SQL state [HY000]; error code [1210] Incorrect arguments to mysqld_stmt_bulk_execute Created: 2022-11-25 Updated: 2023-01-12 Resolved: 2022-12-21 |
|
| Status: | Closed |
| Project: | MariaDB Connector/J |
| Component/s: | Other |
| Affects Version/s: | 3.1.0 |
| Fix Version/s: | 3.1.1 |
| Type: | Bug | Priority: | Major |
| Reporter: | Thierry Giguere | Assignee: | Diego Dupin |
| Resolution: | Fixed | Votes: | 2 |
| Labels: | None | ||
| Environment: |
Java 19.0.1 / JDBC Driver 3.1.0 / MariaDb 10.6.11 |
||
| Attachments: |
|
| Issue Links: |
|
| Description |
|
Using the new JDBC Driver for Java version 3.1.0, I'm now receiving an error message I have never seen before.

create table mytable
INSERT INTO mytable (p1, p2, p3, p4, p5, p6, p7)

SQL state [HY000]; error code [1210]; (conn=8568) Incorrect arguments to mysqld_stmt_bulk_execute; nested exception is java.sql.BatchUpdateException: (conn=8568) Incorrect arguments to mysqld_stmt_bulk_execute

In Java, the parameter p7 is a long primitive while parameters p1 to p6 are BigDecimal. Some SQL inserts succeed, others fail. While using JDBC Driver version 3.0.8 I didn't have any problem.

I suspect it's related to CONN-1015 but don't have much more information.

JDBC Driver configuration: useBulkStmts=true

Will provide more detailed information at trace level soon. |
| Comments |
| Comment by Diego Dupin [ 2022-11-25 ] |
|
Please provide trace-level info. Can you indicate whether you use some sort of proxy, and if so, its version as well? I suspect the issue occurs at that level. The trace info will confirm it, but if the change of |
| Comment by Thierry Giguere [ 2022-11-25 ] |
|
Using an AWS RDS fully managed version of MariaDB. No multi-AZ. No RDS proxy.

With permitPipeline=false added, I still have the SQLException.

I just took a look at an individual commit: https://github.com/mariadb-corporation/mariadb-connector-j/commit/eee5184a#diff-7e2ab5e25a936263af07174854494c6233782687b7a2e268391b1422e6f43e70

So I also tried with disablePipeline=true, but I still have the SQLException.

I will now try JDBC Driver version 3.0.9 to narrow down the problem. |
| Comment by Thierry Giguere [ 2022-11-25 ] |
|
Version 3.0.9 is fine, so the problem seems to have been introduced by the upgrade from 3.0.9 to 3.1.0. |
| Comment by Thierry Giguere [ 2022-11-25 ] |
|
I will also try to investigate the packet size, either by playing with the maxAllowedPacket option and/or by sending small batches at the Java level instead of thousands of rows. More to come. |
| Comment by Thierry Giguere [ 2022-11-28 ] |
|
Additional info: for the very same database, it works in a different scenario (a different table/statement):

create table table2
INSERT INTO table2 (c3, c4, c5, c6)

So I think we can rule out proxies. The storage engine here is InnoDB, not MEMORY. The table does have a primary key. Both tables are empty before the insertion. The batch size here is around 600 rows. There is no decimal(10,6) field in this table. |
| Comment by Thierry Giguere [ 2022-11-30 ] |
|
I provided in attachment a simple plain-main Java program to reproduce the problem.

A basic table:

CREATE TABLE mytable

A basic Java program (key lines):

import java.math.BigDecimal;
public class TestMariaDbBatchUpdateBulInsertPipeline {
DriverManager.registerDriver((Driver) Class.forName("org.mariadb.jdbc.Driver").newInstance());
ps.addBatch();
ps.executeBatch();

Some conclusion : |
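The attached reproducer is collapsed by this export. Below is a minimal sketch of its shape, reconstructed from details quoted in the thread: the table and column names (`mytable`, `p1`..`p7`), the `useBulkStmts=true` option, the BigDecimal/long parameter mix, and the `addBatch()`/`executeBatch()` calls come from the issue; the class name, JDBC URL, sample values, and batch size are placeholders, not the reporter's actual code.

```java
import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BulkBigDecimalRepro {

    // Sample decimal values; the thread suggests decimal(10,6)-style data.
    static final String[] VALUES = {"0.100000", "1.234567", "42.000001",
                                    "3.141592", "2.718281", "0.000001"};

    // Build one row of BigDecimal parameters (DB-independent helper).
    static BigDecimal[] buildRow() {
        BigDecimal[] row = new BigDecimal[VALUES.length];
        for (int j = 0; j < VALUES.length; j++) {
            row[j] = new BigDecimal(VALUES[j]);
        }
        return row;
    }

    public static void main(String[] args) throws SQLException {
        // Placeholder URL: adjust host/database/credentials for your environment.
        String url = "jdbc:mariadb://localhost:3306/test?user=root&useBulkStmts=true";
        try (Connection con = DriverManager.getConnection(url);
             PreparedStatement ps = con.prepareStatement(
                 "INSERT INTO mytable (p1, p2, p3, p4, p5, p6, p7) VALUES (?,?,?,?,?,?,?)")) {
            for (int i = 0; i < 1000; i++) {          // a large batch, as in the issue
                BigDecimal[] row = buildRow();
                for (int j = 0; j < row.length; j++) {
                    ps.setBigDecimal(j + 1, row[j]);  // p1..p6: BigDecimal
                }
                ps.setLong(7, i);                     // p7: long primitive
                ps.addBatch();
            }
            // On driver 3.1.0 this reportedly fails with error 1210
            // "Incorrect arguments to mysqld_stmt_bulk_execute".
            ps.executeBatch();
        }
    }
}
```

Running `main` requires a live MariaDB server and the Connector/J jar on the classpath; the `buildRow` helper can be exercised on its own.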
| Comment by Thierry Giguere [ 2022-12-06 ] |
|
Using double works well:

ps.setDouble(j + 2, Double.valueOf(VALUES[j]));

instead of

ps.setBigDecimal(j + 2, new BigDecimal(VALUES[j]));

and I can reach over 100,000 rows inserted. So it sounds like the problem is limited to BigDecimal usage. |
| Comment by Gaël Jourdan-Weil [ 2022-12-09 ] |
|
We are facing this issue as well when upgrading the client from 3.0.9 to 3.1.0. If needed I can try to provide reproduction cases. |
| Comment by Diego Dupin [ 2022-12-21 ] |
|
Alright, after correcting |
| Comment by Gaël Jourdan-Weil [ 2023-01-12 ] |
|
Is it really fixed in 3.1.1? I don't see any reference to this issue in the 3.1.1 changelog nor in the commit history. |
| Comment by Thierry Giguere [ 2023-01-12 ] |
|
I can see the fix for |
| Comment by Thierry Giguere [ 2023-01-12 ] |
|
Using the test provided in this issue, I confirmed that switching to version 3.1.1 fixes the problem. Thanks. |