Details
Type: Bug
Status: Closed
Priority: Major
Resolution: Fixed
Description
modutil_get_SQL uses signed char pointers to determine the query length. When a byte of the data is 0x80 or larger, it is read as a negative number and is then incorrectly sign-extended when converted to the unsigned int that holds the length.
Here is an isolated test case that demonstrates the problem:
#include <stdio.h>

int main()
{
    /* simulates the data coming in; packet length should be 0x80 */
    char buf[4] = { 0x80, 0, 0, 0 };

    char* ptr = buf;
    unsigned int length;

    /* the signed char is sign-extended before the conversion */
    length = *ptr;
    printf("length when using signed char*: 0x%x\n", length);

    unsigned char* uptr = (unsigned char*)buf;
    length = *uptr;
    printf("length when using unsigned char*: 0x%x\n", length);

    return 0;
}
|
And the output is:
$ cc t.c
$ ./a.out
length when using signed char*: 0xffffff80
length when using unsigned char*: 0x80
The signed char value 0x80 is sign-extended to int and then converted to unsigned int, yielding 0xffffff80 (about 4 billion), so modutil_get_SQL then tries to allocate nearly 4GB of memory unnecessarily.
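For reference, a minimal sketch of the kind of fix, assuming the length is the standard 3-byte little-endian MySQL packet header (the exact layout read by modutil_get_SQL may differ, and packet_length is a hypothetical helper name):

#include <stdint.h>

/* Hypothetical helper: reads the 3-byte little-endian payload length
 * from a MySQL packet header. Reading through unsigned char avoids
 * the sign extension shown above. */
static uint32_t packet_length(const char* header)
{
    const unsigned char* p = (const unsigned char*)header;
    return (uint32_t)p[0] | ((uint32_t)p[1] << 8) | ((uint32_t)p[2] << 16);
}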
I will send a PR.