python - Memory issues when writing to file from MySQL


db.connect()
sentencetb = sentencetb.select()
for i, sentence in enumerate(sentencetb):
    with open('./commentary/sentence%i.txt' % i, 'w', encoding='utf-8') as f:
        f.write(sentence.sntenc)
db.close()

I use this code to connect to the database, select a table, and write each row of the table to a separate file. The table has over 1 million records. Everything went great at first, but around the 900,000th record the computer slowed down badly. PyCharm keeps asking me to allocate more memory, and while the first 500k records were written in about an hour, it now takes an hour to write 50-100 records.

I suspect this is somehow connected to releasing memory, but I don't know how to do it.

Any help is appreciated.

Do you need to close the cursor? It looks like you're opening a new one on each iteration, which would explain running out of memory after a few hundred thousand iterations.

db.connect()
sentencetb = sentencetb.select()
for i, sentence in enumerate(sentencetb):
    with open('./commentary/sentence%i.txt' % i, 'w', encoding='utf-8') as f:
        f.write(sentence.sntenc)
    cursor.close()
db.close()
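The `sentencetb.select()` call and `sentence.sntenc` attribute access suggest a peewee model query; peewee caches every row it fetches from a query result, so iterating a million-row query keeps all rows in memory. If that is the case here, iterating with `for sentence in sentencetb.select().iterator():` disables that caching. For comparison, if you work with the DB-API directly (MySQLdb/pymysql expose the same interface), a minimal sketch of streaming rows in batches with `fetchmany()` so memory stays flat regardless of table size (the in-memory SQLite database, table, and column names below are stand-ins for the real MySQL setup):

```python
import sqlite3

# Hypothetical in-memory database standing in for the MySQL connection;
# the cursor calls below are plain DB-API 2.0 and work the same with pymysql.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE sentencetb (sntenc TEXT)')
conn.executemany('INSERT INTO sentencetb VALUES (?)',
                 [('sentence %d' % n,) for n in range(5)])

cursor = conn.cursor()
cursor.execute('SELECT sntenc FROM sentencetb')

written = []
# fetchmany() pulls the result set in fixed-size batches instead of
# accumulating all rows, so memory use does not grow with the table.
while True:
    rows = cursor.fetchmany(1000)
    if not rows:
        break
    for i, (sntenc,) in enumerate(rows, start=len(written)):
        # In the real script this would be:
        # with open('./commentary/sentence%i.txt' % i, 'w', encoding='utf-8') as f:
        #     f.write(sntenc)
        written.append(sntenc)

cursor.close()   # close the cursor once, after the loop -- not per row
conn.close()
```

The key point in both variants is the same: only one batch (or one row) is alive at a time, and the cursor is opened and closed exactly once.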

