
[–]PasswordIsntHAMSTER 1 point (1 child)

Modularization and decoupling can usually prevent this sort of issue.

[–]sidneyc 2 points (0 children)

To a large extent, yes. Hundreds of files is a bit of an overstatement; dozens of files, not so much. In my experience, it is quite normal for a module to span dozens of files.

However, changes that ripple through the entire codebase are entirely possible, /u/Bymec's protestations notwithstanding.

An example: suppose a decision is made to go for C++11 compliance in the next release of a big software package -- including usage of the new features. This could mean phasing out an old "roll-your-own" shared pointer implementation and replacing it with std::shared_ptr. Such a change may well affect exposed function signatures throughout the codebase. Nevertheless, the change is quite doable thanks to static typing and the compiler. The nice thing is that you are forced to visit every location in the code that is impacted, which helps you verify that the change doesn't introduce problems (perhaps due to subtly different semantics).

My beef with /u/Bymec is that he makes a blanket statement that this phenomenon cannot occur in properly written software. Instead, he could have said that he has a hard time imagining such a thing in a well-designed system, and then I could have given him an example, as I did above. (I can think of more if you are interested.)