Hi, I wrote a small program to do some maintenance here on a big table (330M rows) and I noticed that the connector is not releasing the memory of consumed result sets.
Since I can't load all the data into memory at once, I'm reading it in chunks, as shown below:
import mariadb
import resource

db = mariadb.connect(...)
qr = db.cursor()  # doesn't matter if it's buffered or not

sz = 400
last_id = 0

while True:
    # select the id too, and order by it, so the keyset pagination is deterministic
    qr.execute('select id, a, b, c from big_table where id > ? order by id limit ?',
               (last_id, sz))

    usage = resource.getrusage(resource.RUSAGE_SELF)
    print(f'Mem: {usage[2]}')  # here I'm monitoring how much memory is allocated - it's always a growing number

    rows = qr.fetchall()  # consume the result set
    if not rows:
        break

    last_id = rows[-1][0]  # advance from the last id actually seen, not by a fixed step
    if len(rows) < sz:
        break
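One note on the measurement itself: `usage[2]` is `ru_maxrss`, the process's *peak* resident set size (reported in kilobytes on Linux, bytes on macOS). It is a high-water mark, so the number can only grow or stay flat; it never shrinks when memory is freed. A minimal, connector-free sketch:

```python
import resource

def max_rss():
    # ru_maxrss: peak resident set size of the process so far (a high-water mark)
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

before = max_rss()
blob = bytearray(50 * 1024 * 1024)  # touch ~50 MiB so the peak moves up
after = max_rss()
del blob
assert after >= before  # the counter never goes back down, even after the del
```

So a steadily growing `usage[2]` means the peak keeps being pushed higher on every iteration, which is still consistent with the connector holding on to each chunk, but a current-RSS probe would make the comparison between connectors sharper.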
With mysql.connector the same loop keeps the allocated memory almost constant.
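For anyone who wants to exercise the loop shape itself without a MariaDB instance, here is the same keyset-pagination pattern against an in-memory sqlite3 table (the table name and row count are made up for illustration):

```python
import sqlite3

# Hypothetical stand-in for big_table, just to exercise the pagination loop
db = sqlite3.connect(':memory:')
db.execute('create table big_table (id integer primary key, a, b, c)')
db.executemany('insert into big_table (id, a, b, c) values (?, ?, ?, ?)',
               [(i, i, i * 2, i * 3) for i in range(1, 1001)])

sz = 400
last_id = 0
total = 0
qr = db.cursor()
while True:
    qr.execute('select id, a, b, c from big_table where id > ? '
               'order by id limit ?', (last_id, sz))
    rows = qr.fetchall()
    if not rows:
        break
    total += len(rows)
    last_id = rows[-1][0]  # advance from the data, not by a fixed step
    if len(rows) < sz:
        break

print(total)  # → 1000: every row visited exactly once
```

Advancing `last_id` from the last row seen (rather than adding `sz`) keeps the loop correct even when ids have gaps, e.g. after deletes on the big table.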