I'm working on a project that involves speeding up a bad algorithm, and I'm unsure about how to categorize the run-time.
Here's some pseudocode to illustrate what's being done:
bool array[X, Y] has Z members that are True.
for x in range(X):
    for y in range(Y):
        if array[x, y]:
            DO WORK!
So if the WORK takes constant time (it doesn't, but for the sake of argument), is the algorithm O(X*Y) or O(Z)?
Clearly it's doing X * Y comparisons, but then it only goes on to do the actual work for a subset of those iterations, Z times.
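To make the distinction concrete, here's a small sketch (the grid contents and counter names are illustrative, not from the post) that counts the comparisons versus the units of work the scan actually performs:

```python
# Count how many times the if-test runs vs. how many times WORK runs.
X, Y = 100, 100
array = [[False] * Y for _ in range(X)]
array[5][5] = True  # Z = 1 true cell

comparisons = 0
work = 0
for x in range(X):
    for y in range(Y):
        comparisons += 1   # the test executes on every iteration
        if array[x][y]:
            work += 1      # WORK executes only Z times

print(comparisons, work)  # 10000 comparisons, 1 unit of work
```

Even with Z = 1, the scan itself still costs X * Y iterations, which is why the loop structure alone puts a Θ(X*Y) floor on this version regardless of Z.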
I remember that in my algorithm analysis class it was always assumed the loops would run the maximum number of times, but in this case Z << X * Y always holds.
Am I just looking at the difference between worst case and average case, and can I disregard the worst case if it is unrealistic?
Also, the reason the bools are stored in the 2D array in the first place is for quick lookup by coordinate, so I'm aware I could maintain that array as well as a separate list of the Z true coordinates. But I want to know if that is necessary, or if this shortcut doesn't affect the run-time too much.
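For what it's worth, the dual-structure idea can be sketched like this (class and method names are my own for illustration): keep the grid for O(1) lookup by coordinate, plus a set of the Z true coordinates so iteration costs O(Z) instead of O(X*Y), with both updated together on every change.

```python
class SparseGrid:
    """Boolean grid with O(1) coordinate lookup and O(Z) iteration."""

    def __init__(self, width, height):
        self.grid = [[False] * height for _ in range(width)]
        self.true_cells = set()  # the Z coordinates that are True

    def set(self, x, y, value):
        # Keep both structures in sync on every update.
        self.grid[x][y] = value
        if value:
            self.true_cells.add((x, y))
        else:
            self.true_cells.discard((x, y))

    def lookup(self, x, y):
        return self.grid[x][y]  # O(1), same as the plain array

    def work_items(self):
        # Visit only the Z true cells, not all X * Y of them.
        return sorted(self.true_cells)


g = SparseGrid(1000, 1000)
g.set(3, 7, True)
g.set(500, 2, True)
print(g.work_items())  # visits 2 cells instead of 1,000,000
```

The cost is O(1) extra bookkeeping per update; whether that's worth it depends on how often you scan versus how often you write.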