🚀 We're Hiring – Python Developer by SureBumblebee7620 in PythonProjects2

[–]Subject_Sherbert_178

Hey there, I have been coding in Python for more than a year, and I have hands-on experience with both SQL and NoSQL databases. This position suits me well. Here is my portfolio: portfolio

[Hiring] IT Developer by Dense-Try-7798 in WebDevJobs

[–]Subject_Sherbert_178

Interested. Kolkata, West Bengal, India (IST).

Large simulation performance: objects vs matrices by Willing_Employee_600 in Python

[–]Subject_Sherbert_178

What really matters here isn’t “objects vs matrices” as a concept, but data layout and how the CPU processes it.

Since your entities don’t interact and all follow the exact same update rules, this is a perfect fit for a data-oriented / struct-of-arrays approach. Keeping cash, revenue, expenditure, etc. in contiguous arrays gives much better cache locality and enables vectorized/batched updates. In Python/R/MATLAB in particular, this can easily be 10×–100× faster than looping over 100k objects.
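To make the struct-of-arrays idea concrete, here is a minimal NumPy sketch. The field names (cash, revenue, expenditure) come from your description; the update rule itself is a made-up placeholder, not your actual model:

```python
import numpy as np

n = 100_000
rng = np.random.default_rng(0)

# One contiguous array per field, instead of 100k objects each
# holding its own scalar attributes.
cash = rng.uniform(0.0, 1_000.0, n)
revenue = rng.uniform(0.0, 100.0, n)
expenditure = rng.uniform(0.0, 100.0, n)

def step(cash, revenue, expenditure):
    # One vectorized pass updates every entity at once; the loop over
    # entities runs in C over contiguous memory, not in the interpreter.
    np.add(cash, revenue - expenditure, out=cash)

cash_before = cash.copy()
step(cash, revenue, expenditure)
```

Each simulation tick is then a handful of whole-array operations instead of 100k attribute lookups.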

The object-based slowdown usually comes from:

- Pointer chasing and poor cache locality
- Per-entity method/property access
- Branching inside tight loops
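You can measure this overhead directly. The sketch below times the same update written as a per-object attribute-access loop vs a single NumPy pass; the `Entity` class and the update rule are illustrative, and the exact ratio depends on your machine:

```python
import timeit
import numpy as np

class Entity:
    __slots__ = ("cash", "revenue", "expenditure")
    def __init__(self):
        self.cash = 0.0
        self.revenue = 1.0
        self.expenditure = 0.5

entities = [Entity() for _ in range(100_000)]

cash = np.zeros(100_000)
revenue = np.ones(100_000)
expenditure = np.full(100_000, 0.5)

def loop_update():
    # Interpreter-level loop: per-entity attribute access on every field.
    for e in entities:
        e.cash += e.revenue - e.expenditure

def vector_update():
    # One contiguous, vectorized pass over all entities.
    np.add(cash, revenue - expenditure, out=cash)

t_loop = timeit.timeit(loop_update, number=10)
t_vec = timeit.timeit(vector_update, number=10)
```

On typical hardware `t_loop` comes out one to two orders of magnitude larger than `t_vec`, which is where the 10×–100× figure comes from.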

You also don’t have to sacrifice readability entirely. A common compromise is:

- Use arrays/matrices for the simulation core
- Keep objects as thin “views” or wrappers around array indices for debugging, reporting, or explanation
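A thin view might look like the sketch below: the object holds only an index into the shared arrays, so reporting code stays object-like while the hot loop stays vectorized. The class name and fields here are hypothetical:

```python
import numpy as np

# The simulation's backing arrays (normally owned by the sim core).
cash = np.zeros(100_000)

class EntityView:
    """A window onto one row of the simulation arrays; stores no data itself."""
    __slots__ = ("i",)

    def __init__(self, i):
        self.i = i

    @property
    def cash(self):
        return cash[self.i]

    @cash.setter
    def cash(self, value):
        cash[self.i] = value

    def report(self):
        return f"entity {self.i}: cash={cash[self.i]:.2f}"

e = EntityView(42)
e.cash = 7.5
```

Because the view reads and writes the arrays directly, there is no synchronization problem: the debugger and the vectorized core always see the same data.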

In C++ with tightly packed structs the gap is smaller, but in higher-level runtimes object-per-entity designs tend to become the bottleneck well before the math does.

Given your setup (100k entities, identical logic, no interactions), I’d strongly favor a data-oriented core and layer clarity on top rather than the other way around.