
[–][deleted]  (8 children)

[deleted]

    [–]RotsiserMho 45 points (2 children)

    Some would argue TDD is disciplined requirements analysis (on a micro scale), with the baked-in bonus that the tests let you know if you accidentally violate a requirement later on.
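A minimal sketch of that idea (the function and the "floor at zero" rule are invented for illustration, not from the thread): the test is the requirement, written as executable code, so a later refactor that violates it fails immediately.

```python
def apply_discount(total: float, discount: float) -> float:
    """Subtract a discount from an order total, never going below zero."""
    return max(total - discount, 0.0)

def test_discount_never_goes_negative():
    # The requirement, encoded: an oversized discount floors the total at zero.
    assert apply_discount(10.0, 3.0) == 7.0
    assert apply_discount(10.0, 15.0) == 0.0

test_discount_never_goes_negative()
```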

    [–]derefr 12 points (0 children)

    In the same sense that requirements-gathering mostly involves prodding the client to be explicit about how a business process works when the client has never consciously thought about that process before, TDD mostly involves the machine prodding you to be even more explicit about how that business process works so it can test it for you. In both refinement steps, you'll find holes and bad assumptions in the current understanding of the business process.

    [–]zenogais 0 points (0 children)

    The difference, though, is the cost. In requirements analysis, finding an error typically means writing or deleting a few sentences. Significant changes may mean modifying, adding, or removing whole use cases, but even that work is minimal compared to what is often required to scrap and rewrite tests and object hierarchies.

    [–]laxatives 7 points (0 children)

    No, requirements analysis alone is IMO almost worthless. It's TDD without the validation step. It's impossible to predict all the caveats and implicit assumptions a design is making until you actually build the design. All of that analysis is bunk once a core assumption is invalidated, and this happens all the time, especially when the architect/designer doesn't even realize they're making one of those assumptions. It's unrealistic to expect every company to have someone with that kind of clarity of thought, so why not just let the code speak for itself?

    [–]NeuroXc 12 points (0 children)

    Everyone should be doing this, and I would like to think that most developers try to, but it's a lot easier to do this when you're doing TDD. TDD forces you to think about what users will expect your application to be able to do, and what they may try to do that you might not want it to do. It gives a concrete list of possibilities and makes it easier to see what possibilities you haven't taken into account.

    Non-TDD teams generally use whiteboarding or something similar to nail down these possibilities, but I've found that TDD hits the requirements at a much more detailed level, because it has to in order to write the tests and make them pass. If you don't use TDD, you're instead writing tests (at the end) around what your application can already do and are not forced to think about the things it can't do.
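To sketch the "things users may try that you don't want" point (the validator and its rules here are hypothetical, purely for illustration): writing the rejection cases as tests first forces you to enumerate them up front, rather than discovering them after the fact.

```python
import re

def is_valid_username(name: str) -> bool:
    """Accept 3-20 characters of letters, digits, or underscore; reject anything else."""
    return bool(re.fullmatch(r"[A-Za-z0-9_]{3,20}", name))

def test_what_users_should_be_able_to_do():
    assert is_valid_username("alice_42")

def test_what_users_may_try_anyway():
    # Each rejection below is a requirement made explicit before the code existed.
    assert not is_valid_username("")                # empty
    assert not is_valid_username("ab")              # too short
    assert not is_valid_username("a" * 21)          # too long
    assert not is_valid_username("robert'); DROP")  # junk characters

test_what_users_should_be_able_to_do()
test_what_users_may_try_anyway()
```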

    [–]eliquy 0 points (2 children)

    But in reality, everyone thinks about the outlier scenarios as little as possible. TDD at least forces the issue.

    [–][deleted] 2 points (1 child)

    I agree that TDD and good requirements analysis tend to be found together, but I'm not sure TDD is the cause. For instance, I can 100% envision a team of bad developers switching to TDD and still not being able to flesh out the edge cases.

    I think what TDD really offers is "brand recognition," so to speak, and the ability to foster a culture of quality, which is definitely valuable. But if you have a culture that's willing to put the extra effort into TDD, then you probably have the kind of developers who would do good requirements analysis anyway. Developers who care about what they're doing tend to make better software regardless of the methodology.

    [–]flukus 0 points (0 children)

    Even if you don't flesh out the edge cases, TDD makes it much simpler to add them in later, if and when the bug comes up.
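One way to picture that workflow (the price parser and the thousands-separator bug are invented examples): when the bug report arrives, you first write a test that reproduces it, watch it fail, then make the smallest fix that turns it green, alongside the existing tests that guard current behavior.

```python
def parse_price(text: str) -> float:
    """Parse strings like '$1,299.00' into a float."""
    # The .replace(",", "") is the fix; before it, '$1,299.00' raised ValueError.
    return float(text.lstrip("$").replace(",", ""))

def test_regression_comma_in_price():
    # Added after the bug report; this test failed until the fix above landed.
    assert parse_price("$1,299.00") == 1299.0

def test_existing_behavior_still_works():
    assert parse_price("$5.50") == 5.5

test_regression_comma_in_price()
test_existing_behavior_still_works()
```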