[MDEV-6882] TOKENIZE query Created: 2014-10-16 Updated: 2022-12-06
| Status: | Open |
| Project: | MariaDB Server |
| Component/s: | None |
| Fix Version/s: | None |
| Type: | Task | Priority: | Major |
| Reporter: | VAROQUI Stephane | Assignee: | Unassigned |
| Resolution: | Unresolved | Votes: | 8 |
| Labels: | None |
| Issue Links: |
|
| Description |
|
Derived from the "EXPLAIN query" syntax, "TOKENIZE query" would print a resultset describing the tokens of the given query.
|
| Comments |
| Comment by Sergei Petrunia [ 2014-10-19 ] | |
|
The question is whether the server should be used to do it. MaxScale has a similar feature, which uses MariaDB's parser to parse the query and then replaces constants with '?'. | |
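A minimal sketch of that canonicalization idea (this is an illustration, not MaxScale's actual skygw_get_canonical()): scan the SQL text once and replace string and numeric literals with '?'.

```cpp
#include <cctype>
#include <string>

// Hypothetical sketch of query canonicalization: replace string and
// numeric literals with '?' in a single left-to-right scan.
std::string canonicalize(const std::string& sql) {
    std::string out;
    size_t i = 0;
    while (i < sql.size()) {
        char c = sql[i];
        if (c == '\'' || c == '"') {             // quoted string literal
            char quote = c;
            ++i;
            while (i < sql.size() && sql[i] != quote) ++i;
            if (i < sql.size()) ++i;             // skip the closing quote
            out += '?';
        } else if (std::isdigit((unsigned char)c) &&
                   (out.empty() || !std::isalnum((unsigned char)out.back()))) {
            // numeric literal, but not a digit inside an identifier like t1
            while (i < sql.size() &&
                   (std::isdigit((unsigned char)sql[i]) || sql[i] == '.'))
                ++i;
            out += '?';
        } else {
            out += c;
            ++i;
        }
    }
    return out;
}
```

For example, `SELECT * FROM t1 WHERE a=10 AND b='x'` becomes `SELECT * FROM t1 WHERE a=? AND b=?`; the `1` in `t1` survives because it follows an identifier character.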
| Comment by Sergei Petrunia [ 2014-10-19 ] | |
|
More details about how MaxScale does it: see skygw_get_canonical(). It walks through thd->free_list and replaces Item::STRING_ITEM, Item::INT_ITEM, Item::DECIMAL_ITEM etc. with '?'. I have a doubt about how it does this, though. It calls replace_literal(). Is there a guarantee that it replaces the right occurrence of the literal? | |
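That doubt can be illustrated: a purely textual search for a literal's text can match inside an identifier (or inside a longer literal) that appears earlier in the query. A hypothetical replace_first() helper, standing in for a replace_literal-style textual replacement, shows the failure mode:

```cpp
#include <string>

// Naive textual replacement, as a replace_literal-style helper might do:
// replace the FIRST occurrence of `lit` in `sql` with '?'.
std::string replace_first(std::string sql, const std::string& lit) {
    size_t pos = sql.find(lit);
    if (pos != std::string::npos)
        sql.replace(pos, lit.size(), "?");
    return sql;
}
```

Replacing the literal "1" in `SELECT c1 FROM t WHERE x=1` hits the `1` inside the column name `c1` first, producing `SELECT c? FROM t WHERE x=1` instead of replacing the actual constant.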
| Comment by Sergei Petrunia [ 2014-10-19 ] | |
|
The technique used by MaxScale to catch constants is not directly applicable here, because the server copies token text out of the source query. The copying is done in sql_lex.cc, in get_token() and get_quoted_token(). So, if we want to have info about where "table.column" was located in the original query, the lexer has to save it. One way to save it would be to add another element into %union, and then the lexer
should also save the data about the source's location. | |
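A toy illustration of what saving the source's location buys: if the lexer records begin/end offsets for every token (the kind of data an extra %union member could carry), a literal can later be replaced by exact position rather than by textual search. This is a sketch, not the server's get_token():

```cpp
#include <cctype>
#include <string>
#include <vector>

// Each token carries its text plus the offsets of where it came from
// in the source query -- the data an extra %union member could hold.
struct Token {
    std::string text;
    size_t begin;   // offset of the first character in the source
    size_t end;     // offset one past the last character
};

std::vector<Token> lex(const std::string& sql) {
    std::vector<Token> tokens;
    size_t i = 0;
    while (i < sql.size()) {
        if (std::isspace((unsigned char)sql[i])) { ++i; continue; }
        size_t begin = i;
        // word tokens; '.' is included so "table.column" stays one token
        if (std::isalnum((unsigned char)sql[i]) || sql[i] == '_' || sql[i] == '.') {
            while (i < sql.size() &&
                   (std::isalnum((unsigned char)sql[i]) ||
                    sql[i] == '_' || sql[i] == '.'))
                ++i;
        } else {
            ++i;    // single-character punctuation token
        }
        tokens.push_back({sql.substr(begin, i - begin), begin, i});
    }
    return tokens;
}
```

With offsets in hand, replacing the literal at tokens[k] touches exactly the right span of the query, so the wrong-occurrence problem above cannot arise.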
| Comment by Sergei Petrunia [ 2014-10-19 ] | |
|
The above says how to get info about token locations from the lexer. The lexer itself doesn't know whether the tokens are table names or column names. In the parser, when we use a token as e.g. a table name, we could record that fact together with the token's location. |
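As a sketch of that idea, a hypothetical, much-simplified parser for `SELECT <ident> FROM <ident>` can record the grammatical role of each identifier at the point where the rule consumes it (none of these names are actual server code):

```cpp
#include <map>
#include <sstream>
#include <string>

// Only the parser knows grammatical roles. For the trivial grammar
//   SELECT <ident> FROM <ident>
// record, per identifier, whether the rule that consumed it used it
// as a column name or as a table name.
std::map<std::string, std::string> classify(const std::string& sql) {
    std::istringstream in(sql);
    std::map<std::string, std::string> role;
    std::string kw, ident;
    in >> kw >> ident;                 // SELECT <ident>
    if (kw == "SELECT") role[ident] = "column";
    in >> kw >> ident;                 // FROM <ident>
    if (kw == "FROM") role[ident] = "table";
    return role;
}
```

In a real grammar the same recording would happen inside the Bison rule actions, pairing each role with the token location saved by the lexer.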