Details
Type: Bug
Status: Closed
Priority: Critical
Resolution: Not a Bug
Affects Version/s: 10.4.10
Fix Version/s: None
Environment: Linux
Description
CREATE TABLE `sip404` (
  `number` bigint(20) unsigned NOT NULL,
  `id` int(11) NOT NULL AUTO_INCREMENT,
  PRIMARY KEY (`id`),
  UNIQUE KEY `UK_callerid_number` (`number`)
) ENGINE=ROCKSDB AUTO_INCREMENT=3898030 DEFAULT CHARSET=latin1
select number,count(*) from sip404 group by number having count(*) > 1 limit 10;
+------------+----------+
| number     | count(*) |
+------------+----------+
| 2012053466 |        4 |
| 2012084337 |        4 |
| 2012087180 |        4 |
| 2012088653 |        5 |
| 2012109447 |        4 |
| 2012120237 |        4 |
| 2012120241 |        4 |
| 2012121846 |        4 |
| 2012185042 |        4 |
| 2012204752 |        4 |
+------------+----------+
10 rows in set (0.001 sec)
This breaks basic database-engine functionality.
How did it happen?
I have a file ${FILE} with a single column of numbers in random order, and then I run:
mysql "$DATAB" --execute="set session rocksdb_bulk_load_allow_unsorted=ON;set session rocksdb_bulk_load=1;LOAD DATA INFILE '${FILE}' IGNORE INTO TABLE ${sqltable} FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' IGNORE 0 LINES (${field}); SHOW WARNINGS;set session rocksdb_bulk_load=0;"
Every time I run this, the same numbers are inserted again, even though duplicate numbers should be ignored. The unique index on (number) is not being enforced.
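If bulk load mode is what bypasses the unique-key check (MyRocks bulk loading is generally documented to skip unique constraint enforcement, which would explain the "Not a Bug" resolution), one workaround is to deduplicate the input file before loading it. A minimal sketch, assuming ${FILE} holds one value per line; the `.dedup` output name is made up for illustration:

```shell
# Keep only the first occurrence of each line, preserving input order:
# awk prints a line when its seen-count is still zero, then increments it.
awk '!seen[$0]++' "${FILE}" > "${FILE}.dedup"
```

The deduplicated file can then be passed to LOAD DATA INFILE in place of ${FILE}, so bulk load never sees a repeated value of `number`.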
I can give access to Elena and she can verify the issue.