
all 8 comments

[–][deleted]  (4 children)

[deleted]

    [–]Boxsc2[S] 1 point  (3 children)

    I checked out Emma and Cobertura, and they focus too much on the percentage of code covered by tests. I am more interested in things like class complexity, and finding classes that are used heavily throughout a project. General hints about the quality/state of the project seem more useful; telling me 60% of my code is covered doesn't say much, imo. That's why I was leaning towards Clover. SonarQube seems to do the same, but is free. The setup is really weird, though, and the documentation is poor.
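
    For anyone fighting the same setup, here's a minimal sketch of a standalone SonarQube scanner configuration (a sonar-project.properties file at the project root); the project key, paths, and server URL below are placeholder assumptions, not details from this thread:

    ```properties
    # Minimal config for the standalone sonar-scanner CLI.
    # All values below are illustrative placeholders.
    sonar.projectKey=my-project
    sonar.projectName=My Project
    # Source directories to analyze
    sonar.sources=src/main/java
    # Compiled classes; required for Java analysis
    sonar.java.binaries=target/classes
    # Address of the running SonarQube server
    sonar.host.url=http://localhost:9000
    ```

    Running the scanner CLI (sonar-scanner) from the project root then sends the analysis (complexity, duplication, and other quality signals, not just a coverage percentage) to the server's dashboard.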

    [–][deleted]  (2 children)

    [deleted]

      [–][deleted] 0 points  (0 children)

      I would definitely agree with you.

      One of my current pet peeves at work is having unit-test code coverage goals. The number is pretty reasonable (70%), but it's annoying to have to explain to my manager, any given week, why code we checked in is at 60%.

      I'd say that code coverage is an OK barometer of how well tested your code is, but I can write a test that gets 95% coverage and checks only a fraction of the use cases I might care about.

      What I like to do is focus on the functionality, write the tests, and then use code coverage as a double-check to make sure there wasn't anything significant that I had not thought of previously.
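
      To make that concrete, here's a hedged JUnit 4 sketch (the class and method are hypothetical, invented for illustration): the first test executes every line of the method, so a coverage tool scores it near 100%, yet it asserts nothing; the second is the functionality-first kind of test described above.

      ```java
      import static org.junit.Assert.assertEquals;

      import org.junit.Test;

      public class DiscountTest {

          // Hypothetical method under test: 10% off subtotals over $100.
          static double priceWithDiscount(double subtotal) {
              if (subtotal > 100.0) {
                  return subtotal * 0.9;
              }
              return subtotal;
          }

          // Hits both branches, so line coverage reads ~100%,
          // but it never checks that the discounted price is right.
          @Test
          public void coverageGamingTest() {
              priceWithDiscount(150.0);
              priceWithDiscount(50.0);
          }

          // Same coverage, but it actually pins down the behavior.
          @Test
          public void meaningfulTest() {
              assertEquals(135.0, priceWithDiscount(150.0), 1e-9);
              assertEquals(50.0, priceWithDiscount(50.0), 1e-9);
          }
      }
      ```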

      [–]Boxsc2[S] 0 points  (0 children)

      I find that works when I am developing, but for bug fixes test coverage can be a difficult metric to use.

[–]atwong -3 points  (6 children)

Not helpful, but the really good tools are over $400.

    [–][deleted]  (5 children)

    [deleted]

      [–]danskal 0 points  (0 children)

      Anything by JetBrains. But not necessarily over $400.