There is an issue with logging queries to the slow log table.
With log_output=TABLE, the slow log is written to the CSV-based mysql.slow_log table, whose 'rows_examined' column is declared as INT. When the Rows_examined value exceeds the maximum allowed for INT, an error is logged to the error log:
([ERROR] Unable to write to mysql.slow_log)
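The column type can be confirmed on the affected server as follows (the int(11) result is what we observe here; the exact definition may vary by server version):

-- Check the declared type of the rows_examined column:
SELECT COLUMN_NAME, COLUMN_TYPE
FROM information_schema.COLUMNS
WHERE TABLE_SCHEMA = 'mysql'
  AND TABLE_NAME = 'slow_log'
  AND COLUMN_NAME = 'rows_examined';
-- Returns int(11), i.e. a signed 32-bit integer with maximum 2147483647.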
This happens because the INSERT into the slow_log table fails when the value exceeds the column's maximum. For comparison, this is the corresponding entry when we change log_output to FILE:
# Thread_id: 380229  Schema: andre_k  QC_hit: No
# Query_time: 0.028396  Lock_time: 0.001841  Rows_sent: 2  Rows_examined: 10958383778436017378
# Rows_affected: 0  Bytes_sent: 124
# Tmp_tables: 2  Tmp_disk_tables: 0  Tmp_table_sizes: 380968
# Full_scan: Yes  Full_join: No  Tmp_table: Yes  Tmp_table_on_disk: No
# Filesort: Yes  Filesort_on_disk: No  Merge_passes: 0  Priority_queue: Yes
SET timestamp=1677147120;
SELECT get_id(CAST(aes_decrypt(tab1.NAME_F,'V41iNM0n4') AS char),'KM_ID_PL') as get_string,
       (CASE WHEN (SELECT ID FROM tab2 where tab2.TAB1_ID = tab1.ID LIMIT 1) IS NULL THEN 0 ELSE 1 END) AS IS_ATT
FROM tab1
ORDER BY 2 DESC
LIMIT 2;
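Note that 10958383778436017378 is larger even than the signed BIGINT maximum (9223372036854775807), so only a BIGINT UNSIGNED column could store it. As a stopgap until the table definition is fixed, the column could be widened; this is an untested sketch, and the slow log must be disabled while altering the log table:

-- Possible workaround (untested sketch): widen the column so the INSERT no longer overflows.
SET GLOBAL slow_query_log = OFF;   -- logging must be off while altering a log table
ALTER TABLE mysql.slow_log MODIFY rows_examined BIGINT UNSIGNED NOT NULL;
SET GLOBAL slow_query_log = ON;

Keep in mind that upgrade tooling (e.g. mysql_upgrade / mariadb-upgrade) may revert system-table definitions, so this is a local mitigation, not a fix.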
A test case, huge_value_rows_examined.txt, is attached.
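For completeness, a minimal sketch of the logging setup under which the failure occurs (the attached file is the authoritative test case; any query whose Rows_examined exceeds 2147483647 will do):

SET GLOBAL log_output = 'TABLE';   -- write the slow log to mysql.slow_log
SET GLOBAL slow_query_log = ON;
SET GLOBAL long_query_time = 0;    -- log every statement, for demonstration only
-- Running a query that examines more rows than INT can hold then produces
-- "[ERROR] Unable to write to mysql.slow_log" in the error log.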