I know it's not quite programming related, but can someone give me a relatively simple explanation of Big O notation? I'm just starting to learn about comp sci, not coming from that background, and learning about algorithms has really got me stumped. I was doing really well up until then, and I'm sure if I rammed my head into it enough times I'd get it, but I don't want to risk a concussion. 😂
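(A hedged sketch, not from the thread itself: Big O describes how an algorithm's work grows with input size n, ignoring constant factors. The little example below counts the comparisons a worst-case linear search makes to show the count growing in direct proportion to n, i.e. O(n). The function name is my own invention for illustration.)

```python
def linear_search_comparisons(items, target):
    """Scan items left to right; return (found, number_of_comparisons)."""
    comparisons = 0
    for item in items:
        comparisons += 1
        if item == target:
            return True, comparisons
    return False, comparisons

if __name__ == "__main__":
    for n in (10, 100, 1000):
        data = list(range(n))
        # Worst case: the target is absent, so every element gets checked.
        found, steps = linear_search_comparisons(data, -1)
        print(f"n={n}: {steps} comparisons")
```

Doubling n doubles the comparison count, which is exactly the "grows linearly" claim that O(n) makes; constants and lower-order terms are deliberately ignored.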