Details
- Type: Bug
- Status: Closed
- Priority: Critical
- Resolution: Fixed
- Affects Version/s: 2.0.0-RC, 1.6.0, 2.0.1, 1.6.1, 2.0.2
- Labels: None
Description
As I can't copy-paste code here because our application uses Connector/J inside a complicated self-written framework, I'll describe the reproduction steps:
- Create a table someTable with a field foo type longtext
- Use a prepared statement with a simple query like INSERT INTO someTable (foo) VALUES ( ? )
- Generate the test strings with the code below (StringUtils is org.apache.commons.lang3.StringUtils from Apache Commons Lang):

      private String createString(int length) {
          return StringUtils.leftPad("", length, "\"");
      }
- Use this to generate one string of 8000000 (8 million) doublequotes and one of 10000000 (10 million) doublequotes
- Check the results (SELECT LENGTH(foo) FROM someTable)
You'll see (or at least I see) that the 8000000-character string is inserted at its full length of 8000000 (as it should be), but the 10000000-character string is inserted at half its length: 5000000.
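The steps above can be sketched as a minimal self-contained program. The JDBC URL, credentials, and database name are assumptions, and the helper builds the string directly instead of using the Apache Commons dependency; this is an illustration of the reproduction, not code from the report:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class LongTextRepro {

    // Equivalent of the reporter's helper without Apache Commons:
    // returns a string of `length` doublequote characters.
    static String createString(int length) {
        StringBuilder sb = new StringBuilder(length);
        for (int i = 0; i < length; i++) {
            sb.append('"');
        }
        return sb.toString();
    }

    // Insert one value into someTable.foo via a prepared statement,
    // exactly as described in the reproduction steps.
    static void insert(Connection conn, String value) throws Exception {
        try (PreparedStatement ps =
                 conn.prepareStatement("INSERT INTO someTable (foo) VALUES ( ? )")) {
            ps.setString(1, value);
            ps.executeUpdate();
        }
    }

    public static void main(String[] args) throws Exception {
        // Connection details are placeholders; adjust to your environment.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mariadb://localhost:3306/test", "user", "password")) {
            insert(conn, createString(8_000_000));
            insert(conn, createString(10_000_000));
        }
        // Then check with: SELECT LENGTH(foo) FROM someTable
        // On affected versions the second row reportedly has length
        // 5000000 instead of 10000000.
    }
}
```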
I think I also found the problem and the solution, so I will add a pull request.
In our case (where we insert very large JSON strings containing a lot of doublequotes) this loses data and produces invalid JSON strings. This does not happen in 1.5.9, but it does happen in 1.6.x and 2.0.x, as far as I have tested.
This is a critical bug as this results in invalid and lost data.
Activity
Field | Original Value | New Value
---|---|---
Description | (the full description text above, with the query written as {{INSERT INTO someTable (foo) VALUES (?)}}) | (the same text, with the query written as {{INSERT INTO someTable (foo) VALUES ( ? )}})
Fix Version/s | | 1.6.2 [ 22560 ]
Fix Version/s | | 2.0.3 [ 22559 ]
Component/s | | Other [ 12201 ]
Resolution | | Fixed [ 1 ]
Status | Open [ 1 ] | Closed [ 6 ]
Workflow | MariaDB v3 [ 81335 ] | MariaDB v4 [ 134997 ]
Pull request:
https://github.com/MariaDB/mariadb-connector-j/pull/107