[CONJ-929] OOM happening when querying large datasets and calling isValid on connection Created: 2022-02-15 Updated: 2022-03-08 Resolved: 2022-03-08 |
|
| Status: | Closed |
| Project: | MariaDB Connector/J |
| Component/s: | performance |
| Affects Version/s: | 2.7.5 |
| Fix Version/s: | N/A |
| Type: | Bug | Priority: | Blocker |
| Reporter: | Cyril Joui | Assignee: | Diego Dupin |
| Resolution: | Won't Fix | Votes: | 0 |
| Labels: | None | ||
| Description |
|
I am using the following
Hello, I'm getting the same issue as https://jira.mariadb.org/browse/CONJ-715 with the Kafka Connect JDBC Source. We have a large table to read with the connector and we are facing an OOM. The relevant driver code is marked with the comment "// load active result if any so buffer are clean for next query". Do you have any idea why calling the isValid method has this impact?

STACK: |
| Comments |
| Comment by Diego Dupin [ 2022-02-15 ] |
|
Ah, I just answered that. When streaming data, rows are read one at a time (or, to be precise, in bunches corresponding to the fetch size). See the red paragraph in https://mariadb.com/kb/en/about-mariadb-connector-j/#streaming-result-sets :
The problem is that if the same connection that is streaming is used to execute another command (a Connection.isValid() in your case), the connector will force the loading of the streaming result set in order to read the new command's result. That forced loading of the streaming result set can cause this OOM error. |
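To make the hazard concrete, here is a minimal sketch of a streaming read as described above. The URL, credentials, table name, and fetch size are placeholders I chose for illustration; the main method deliberately catches the connection failure so the sketch can run without a server.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

class StreamingSketch {

    // Streams a large result set in bunches instead of loading every row at once.
    static void streamLargeTable(Connection conn) throws SQLException {
        try (PreparedStatement stmt = conn.prepareStatement(
                "SELECT id, payload FROM big_table")) {
            // A fetch size tells the connector to read rows in bunches of this
            // size rather than buffering the whole result set in memory.
            stmt.setFetchSize(1000);
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    // process rs.getLong(1), rs.getString(2), ...
                    // DANGER: issuing any other command on `conn` here, such as
                    // conn.isValid(1), forces the connector to load the rest of
                    // the streaming result set into memory -> potential OOM on
                    // a very large table.
                }
            }
        }
    }

    public static void main(String[] args) {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mariadb://localhost:3306/test", "user", "pw")) {
            streamLargeTable(conn);
        } catch (SQLException e) {
            // Expected when no server or driver is available.
            System.out.println("streaming sketch: no database reachable");
        }
    }
}
```

The key point is that the streaming result set and any other command share one connection, which is why validation calls mid-iteration are dangerous.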
| Comment by Cyril Joui [ 2022-02-15 ] |
|
Hi! I will give you feedback. |
| Comment by Diego Dupin [ 2022-02-15 ] |
|
Still, since you indicate that this seems to work with the MySQL driver, I'll check what they do; that's strange. |
| Comment by Diego Dupin [ 2022-02-15 ] |
|
There could be two other possible implementations:
|
| Comment by Diego Dupin [ 2022-02-15 ] |
|
I just checked with the MySQL driver. That corresponds to the first of the other options indicated above. I just wonder why it hasn't failed when you were using the MySQL driver. |
| Comment by Cyril Joui [ 2022-02-15 ] |
|
You're right. That's so strange ...
________________________________________________________________________________________________

package test.aurora;

import java.sql.*;

public class SelectAccount {
    public static void main(String[] args) throws SQLException {
        Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3308/loyalty?useSSL=false&useCursorFetch=true",
                user, password);
        // BIG QUERY (80 million rows)
        PreparedStatement stmt = conn.prepareStatement("select .... FROM TABLE WHERE ");
        // This code affects the current ResultSet iteration => OOM
        // load active result if any so buffer are clean for next query */
        }).start();
        System.out.println("test before executeQuery");
    } |
| Comment by AAOUAD TOUFIQ [ 2022-03-06 ] |
|
Hello, |
| Comment by Diego Dupin [ 2022-03-08 ] |
|
The streaming implementation executes the whole command; the driver just fetches some rows at a time into memory. This is by far the fastest implementation when streaming data, but it means the connection cannot be used for another command until the streaming results have been fully read. Some servers/drivers use a different streaming implementation based on a cursor (MariaDB provides this functionality in the protocol), but this isn't implemented by our Java driver. The MySQL driver works the same way, even throwing an exception when another command is executed while streaming; our driver instead loads the streaming result set into memory, which keeps the previous streaming result set usable but can lead to OOM. I think the real solution there is that Kafka makes a connection.isValid() call where there is no real need for one: |
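Since the recommendation above is to avoid validating the connection while a streaming result set is open, one possible caller-side pattern is a small wrapper that answers isValid() optimistically during streaming. This class is entirely hypothetical (not part of the driver or of Kafka Connect), and the demo passes a null delegate because the streaming branch never touches it.

```java
import java.sql.Connection;
import java.sql.SQLException;

class StreamingGuard {
    // In a real wrapper this would be the actual driver connection;
    // it may be null in this standalone demo.
    private final Connection delegate;
    private volatile boolean streaming;

    StreamingGuard(Connection delegate) {
        this.delegate = delegate;
    }

    void streamingStarted()  { streaming = true; }
    void streamingFinished() { streaming = false; }

    // While a streaming result set is open, skip the real validity check:
    // issuing isValid() would force the connector to buffer the remaining
    // rows in memory (the OOM described in this ticket).
    boolean isValidSafe(int timeoutSeconds) throws SQLException {
        if (streaming) {
            return true; // assume valid until streaming completes
        }
        return delegate.isValid(timeoutSeconds);
    }

    public static void main(String[] args) throws SQLException {
        StreamingGuard guard = new StreamingGuard(null); // demo only
        guard.streamingStarted();
        System.out.println("isValid while streaming: " + guard.isValidSafe(1));
    }
}
```

The trade-off is that a genuinely dead connection is only detected once streaming finishes, which is usually acceptable since the streaming read itself would fail on a dead connection anyway.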