Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Fix Version: 2.7.1
Description
Given the following table definition:
CREATE TABLE mytable(
    col1 CHAR(100)
) DEFAULT CHARSET=utf8mb4;
When executing a query against this table and calling resultSet.getMetaData().getPrecision(1), the return value is 400, which is the column length in bytes rather than in characters.
The Javadoc for java.sql.ResultSetMetaData#getPrecision(int) states:
Get the designated column's specified column size. For numeric data, this is the maximum precision. For character data, this is the length in characters. For datetime datatypes, this is the length in characters of the String representation (assuming the maximum allowed precision of the fractional seconds component). For binary data, this is the length in bytes. For the ROWID datatype, this is the length in bytes. 0 is returned for data types where the column size is not applicable.
The expected result for resultSet.getMetaData().getPrecision(1) in this case is 100.
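A minimal sketch of the discrepancy and a possible client-side workaround (the class and method names here are illustrative, not part of the driver's API): utf8mb4 encodes each character in at most 4 bytes, so a CHAR(100) column occupies up to 400 bytes, which matches the value the driver returned. Until running a version with the fix, the reported byte length can be divided by the charset's maximum bytes per character:

```java
// Hypothetical workaround sketch: convert the byte-length precision that the
// driver reported for a utf8mb4 character column back to a length in characters.
// MySQL's utf8mb4 charset uses at most 4 bytes per character.
public class PrecisionWorkaround {
    static final int UTF8MB4_MAX_BYTES_PER_CHAR = 4;

    // Converts a precision reported in bytes to a length in characters.
    static int bytesToChars(int precisionInBytes) {
        return precisionInBytes / UTF8MB4_MAX_BYTES_PER_CHAR;
    }

    public static void main(String[] args) {
        // From the report: CHAR(100) in a utf8mb4 table, getPrecision(1) returned 400.
        int reported = 400;
        System.out.println(bytesToChars(reported)); // prints 100, the expected precision
    }
}
```

This only applies to character columns in a utf8mb4 table; for binary columns, the Javadoc quoted above says a byte length is the correct return value, so no conversion should be applied there.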