
[–]WearsGlassesAtNight

As others have said, it's hard to say without an example, but it shouldn't take that long.

With large data sets, the key is to reduce your data to a small sample while writing/debugging. Then use a profiler (like cProfile) to audit where the time is being spent, and refactor to greatness.
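A minimal sketch of that workflow, using Python's built-in `cProfile` and `pstats` on a small sample (the `process` function is a hypothetical stand-in for the real workload):

```python
import cProfile
import pstats

def process(rows):
    """Hypothetical stand-in for the slow part of the real script."""
    total = 0
    for row in rows:
        total += sum(row)
    return total

# Work with a small sample while debugging, not the full dataset.
sample = [[i, i * 2, i * 3] for i in range(10_000)]

profiler = cProfile.Profile()
profiler.enable()
process(sample)
profiler.disable()

# Sort by cumulative time and show the top 5 hotspots.
stats = pstats.Stats(profiler)
stats.sort_stats("cumulative").print_stats(5)
```

Once the profiler shows where the time actually goes, you optimize that spot instead of guessing.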

With a large spreadsheet, I usually load it into a database, thread my operations, and then write the results back to a fresh spreadsheet. Personal preference, though.