Details
- Type: New Feature
- Status: Open
- Priority: Major
- Resolution: Unresolved
Description
My use case is monitoring for unused/inactive tables and for "X depends on Y" relationships. I don't currently care so much about auditing "every single access", but rather about knowing "that X accessed Y recently". I'd like to see some sort of windowed/buffered table audit option that lets me say "Log first access every N minutes" or "Skip logging duplicates from the last N minutes".
The goal: say Fred accesses employees+salaries 100 times, and then employees+benefits 100 times. If that all happens in short order, I'd prefer to see just TWO entries in the audit log. This wouldn't have to work over a long interval, but it should at least handle the case of "row-by-slow" querying in loops without flooding the audit log.
Note that this can't be pure percentage-based sampling, since that could easily miss the single "select * from salaries"; the goal is to eliminate duplicate combinations, not to thin the log randomly.
Implementation suggestion: just keep a limited-size in-memory hash keyed on user+host+table1...tableN, and skip the log entry if the timestamp stored for that key is more recent than the cutoff.
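A minimal sketch of that suggestion, in Python rather than plugin C for readability. All names here (`AuditDeduper`, `should_log`, the parameter names) are hypothetical illustrations, not an existing audit-plugin API; the bounded map is approximated with an LRU built on `OrderedDict`:

```python
import time
from collections import OrderedDict

class AuditDeduper:
    """Hypothetical dedup filter: bounded in-memory map of
    (user, host, tables) -> timestamp of the last logged access."""

    def __init__(self, window_seconds=300, max_entries=10000):
        self.window = window_seconds          # "skip duplicates from last N" cutoff
        self.max_entries = max_entries        # limited-size hash, as suggested
        self._seen = OrderedDict()            # key -> last-logged timestamp

    def should_log(self, user, host, tables, now=None):
        now = time.time() if now is None else now
        key = (user, host, tuple(sorted(tables)))
        last = self._seen.get(key)
        if last is not None and now - last < self.window:
            self._seen.move_to_end(key)       # keep hot keys resident
            return False                      # duplicate within window: skip
        self._seen[key] = now                 # first access in this window: log it
        self._seen.move_to_end(key)
        if len(self._seen) > self.max_entries:
            self._seen.popitem(last=False)    # evict oldest to cap memory
        return True
```

With a 5-minute window, the Fred scenario above would yield exactly one `True` for the 100 employees+salaries accesses and one for the 100 employees+benefits accesses. Sorting the table names makes the key order-insensitive, which may or may not be desirable depending on whether join order matters for the audit.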
This could potentially be used for QUERY events too, though those likely wouldn't be strict duplicates.