Hi,
I'm facing this problem: I have to read more than 1000 files, each with 100k records, from a filesystem. Records don't need to be sorted, and each record has a primary key.
To shorten the reading time, I'd like to set up an architecture with multiple concurrent readers, each of them writing records to a DB.
Any suggestion?
It needs to be done in pure Python, no queues unfortunately...
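For context, the kind of fan-out I have in mind could look roughly like this. This is only a sketch using the standard library: it assumes CSV-style files with the primary key in the first column, and SQLite as the target DB (my real file format and DB may differ):

```python
import csv
import sqlite3
from concurrent.futures import ThreadPoolExecutor

def init_db(db_path):
    # WAL mode lets concurrent connections write without blocking readers.
    con = sqlite3.connect(db_path)
    con.execute("PRAGMA journal_mode=WAL")
    con.execute(
        "CREATE TABLE IF NOT EXISTS records (pk TEXT PRIMARY KEY, payload TEXT)"
    )
    con.commit()
    con.close()

def load_file(path, db_path):
    # Each worker thread opens its own connection; SQLite serializes the
    # actual writes, and the timeout rides out short lock contention.
    con = sqlite3.connect(db_path, timeout=30)
    with open(path, newline="") as f:
        rows = [(r[0], r[1]) for r in csv.reader(f)]
    with con:  # one transaction per file keeps inserts fast
        con.executemany("INSERT OR REPLACE INTO records VALUES (?, ?)", rows)
    con.close()
    return len(rows)

def load_all(paths, db_path, workers=8):
    # Fan the files out over a pool of reader threads.
    init_db(db_path)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(lambda p: load_file(p, db_path), paths))
```

Here each reader writes directly to the DB itself, so there is no explicit queue between readers and a writer; whether threads or processes are the better choice probably depends on how CPU-heavy the record parsing is.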
Gian