Details
Type: Task
Status: Open
Priority: Minor
Resolution: Unresolved
Description
Hello.
In optimized environments, especially when many small queries run consecutively, the statistics stage causes a very visible slowdown.
This is my typical situation (performance schema off; slow log off; userstat = 1):
starting 0.000073
Waiting for query cache lock 0.000020
init 0.000023
checking query cache for query 0.000074
checking permissions 0.000021
checking permissions 0.000017
checking permissions 0.000017
Opening tables 0.000070
After opening tables 0.000023
System lock 0.000019
Table lock 0.000018
After opening tables 0.000018
Waiting for query cache lock 0.000017
After opening tables 0.000040
init 0.000068
optimizing 0.000045
statistics 0.001257
preparing 0.000135
executing 0.000023
Sending data 0.000278
Waiting for query cache lock 0.000023
Sending data 0.000214
end 0.000022
query end 0.000016
closing tables 0.000031
freeing items 0.000021
updating status 0.000016
Waiting for query cache lock 0.000014
updating status 0.000023
Waiting for query cache lock 0.000026
updating status 0.000016
storing result in query cache 0.000021
cleaning up 0.000018
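(For reference, a per-stage breakdown like the one above can be captured with the server's built-in query profiler; presumably something like the following, where the SELECT stands in for the query being measured:)

SET profiling = 1;
SELECT ...;                 -- placeholder: the query under test
SHOW PROFILES;              -- lists recent statements with their Query_IDs
SHOW PROFILE FOR QUERY 1;   -- per-stage timings, as quoted above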
To cut a long story short: 58% of the time is spent in the "statistics" stage (1.3 milliseconds for this particular query). It is not consistent; sometimes it is about 30%, and it generally varies between 30% and 80%. I understand that this is still fast, but sometimes there are 10,000 queries per page or per second, and statistics can then waste 10 seconds per page load (on a very badly optimized website, admittedly).
I am thinking that user statistics collection has no need to be done synchronously (blocking the return of the "DONE" state to the client). It would be awesome to have logic like this (a rough sketch follows the list):
execute the query
return the result ("query DONE") so the client can continue with other queries
asynchronously write a new row to a "statistics buffer"
later (asynchronously) flush the new rows from the "statistics buffer" into the statistics tables
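To make the idea concrete, here is a rough user-level emulation of that buffering, not a server patch: queries append rows to a small in-memory buffer table, and a scheduled event periodically folds the buffer into an aggregate table, so the per-query path never waits on the statistics write. All names here (stats_buffer, table_stats_agg, flush_stats_buffer) are made up for illustration.

SET GLOBAL event_scheduler = ON;

CREATE TABLE stats_buffer (
  table_schema VARCHAR(64) NOT NULL,
  table_name   VARCHAR(64) NOT NULL,
  rows_read    BIGINT      NOT NULL
) ENGINE=MEMORY;

CREATE TABLE table_stats_agg (
  table_schema VARCHAR(64) NOT NULL,
  table_name   VARCHAR(64) NOT NULL,
  rows_read    BIGINT      NOT NULL,
  PRIMARY KEY (table_schema, table_name)
);

DELIMITER //
CREATE EVENT flush_stats_buffer
  ON SCHEDULE EVERY 1 SECOND
DO
BEGIN
  -- Fold the buffered rows into the aggregate table...
  INSERT INTO table_stats_agg (table_schema, table_name, rows_read)
    SELECT table_schema, table_name, SUM(rows_read)
    FROM stats_buffer
    GROUP BY table_schema, table_name
  ON DUPLICATE KEY UPDATE
    rows_read = table_stats_agg.rows_read + VALUES(rows_read);
  -- ...then empty the buffer. Not race-free: rows inserted between
  -- the two statements would be lost; a real in-server version would
  -- swap buffers atomically instead.
  DELETE FROM stats_buffer;
END//
DELIMITER ;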