[MDEV-6853] Unexpected zero result on 1 % <very small decimal> (wrong default precision of the returned result?) Created: 2014-10-08  Updated: 2015-01-27

Status: Open
Project: MariaDB Server
Component/s: None
Affects Version/s: 5.5.40
Fix Version/s: 5.5

Type: Bug Priority: Minor
Reporter: Elena Stepanova Assignee: Sergei Golubchik
Resolution: Unresolved Votes: 0
Labels: None

Issue Links:
relates to MDEV-6841 "test decimal math" (Open)

 Description   

MariaDB [test]> select 1 % 0.00000000000000000000000000000032349 ;
+-------------------------------------------+
| 1 % 0.00000000000000000000000000000032349 |
+-------------------------------------------+
|          0.000000000000000000000000000000 |
+-------------------------------------------+
1 row in set (0.00 sec)

Technically, the second operand exceeds the maximum DECIMAL scale of 30 (it has 30 zeros after the decimal point before the first significant digit), so it wouldn't be strange if the query treated it as 0 and produced 0. But the current result is difficult to explain.

MySQL 5.5 and PostgreSQL return a non-zero value:

postgres=# select 1 % 0.00000000000000000000000000000032349 ;
               ?column?                
---------------------------------------
 0.00000000000000000000000000000018574

MySQL [test]> select 1 % 0.00000000000000000000000000000032349 ;
+-------------------------------------------+
| 1 % 0.00000000000000000000000000000032349 |
+-------------------------------------------+
|     0.00000000000000000000000000000018574 |
+-------------------------------------------+
1 row in set (0.00 sec)
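For reference, the value MySQL and PostgreSQL return is the mathematically exact remainder. A quick cross-check with Python's decimal module (not part of the original report, just a sketch of the expected arithmetic, and of how truncating the scale to 30 digits would wipe out the result):

```python
from decimal import Decimal, getcontext

# Enough precision to represent the ~31-digit quotient exactly
getcontext().prec = 50

divisor = Decimal("0.00000000000000000000000000000032349")  # 32349E-35
remainder = Decimal(1) % divisor

print(remainder)  # 1.8574E-31, i.e. 0.00000000000000000000000000000018574

# If the server caps the result scale at 30 fractional digits (the DECIMAL
# maximum), every significant digit of the remainder is lost:
print(remainder.quantize(Decimal("1E-30")))  # 0E-30, displayed as 0.000...0
```

This matches the non-zero value reported by MySQL 5.5 and PostgreSQL above, and the truncated variant matches what the MariaDB client displays.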



 Comments   
Comment by Elena Stepanova [ 2014-10-08 ]

Actually, it's not really 0. If I assign the result of the expression to a variable, I can see the real value:

MariaDB [test]> set @a = 1 % 0.00000000000000000000000000000032349;
Query OK, 0 rows affected (0.00 sec)
 
MariaDB [test]> select @a;
+---------------------------------------+
| @a                                    |
+---------------------------------------+
| 0.00000000000000000000000000000018574 |
+---------------------------------------+
1 row in set (0.00 sec)

But if the problem is the default precision of the returned value, it can be seen in much simpler examples:

MariaDB [test]> select 1.1 / 10000000;
+----------------+
| 1.1 / 10000000 |
+----------------+
|        0.00000 |
+----------------+
1 row in set (0.00 sec)
 
MariaDB [test]> set @a = 1.1 / 10000000;
Query OK, 0 rows affected (0.00 sec)
 
MariaDB [test]> select @a;
+-------------+
| @a          |
+-------------+
| 0.000000110 |
+-------------+

It is strange; I can't find anything in the documentation that would explain such low precision of the result.
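One candidate explanation (my assumption, not confirmed in this report) is the documented div_precision_increment system variable (default 4): if the result scale of a DECIMAL division is fixed up front as the dividend's scale plus that increment, then 1.1 / 10000000 gets scale 1 + 4 = 5 and the true value 0.00000011 is rounded away. A sketch of that hypothetical rule in Python's decimal module:

```python
from decimal import Decimal

# Assumed reconstruction of the server's scale rule:
#   result scale = scale of the dividend + div_precision_increment (default 4)
DIV_PRECISION_INCREMENT = 4

def mysql_like_div(a: str, b: str) -> Decimal:
    a_dec, b_dec = Decimal(a), Decimal(b)
    # Fractional digits of the dividend, plus the fixed increment
    scale = -a_dec.as_tuple().exponent + DIV_PRECISION_INCREMENT
    return (a_dec / b_dec).quantize(Decimal(1).scaleb(-scale))

print(mysql_like_div("1.1", "10000000"))  # 0.00000 -- matches the report
```

Under this rule the displayed 0.00000 follows directly from the fixed scale of 5, while the user variable @a evidently retains more fractional digits (0.000000110).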

Generated at Thu Feb 08 07:15:04 UTC 2024 using Jira 8.20.16#820016-sha1:9d11dbea5f4be3d4cc21f03a88dd11d8c8687422.