
Unit tests as documentation is, in my experience, only true in a very minor, almost trivial sense, and for reasons that are very much in line with the subject of the original article.

For the most part, unit tests aren't organized around what a unit is supposed to do from a use-case perspective, but around how it happens to be implemented today. The article refers to this when it points to the inability of most developers to indicate which business requirement would be violated if a given test failed.

When you're attempting to understand or refactor something where the tests mostly exercise the implementation of this particular unit rather than its purpose, the tests offer little more in either documentation or refactoring confidence than the implementation itself does.

What's worse, making your code "more testable" often compounds this effect. As code is taken to an extreme level of modularity, and unit isolation is enforced rigidly in tests, the tests often do nothing other than restate every line of the unit's implementation (i.e. method 'foo' on class 'Bar' calls method 'a' on class 'B' and passes its result to method 'x' on class 'Y', returning that result).
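
To make that concrete, here's a hypothetical sketch (Python, with the Bar/B/Y names from the parenthetical; none of this is from a real codebase) of a mock-heavy test that merely restates the implementation:

    from unittest.mock import MagicMock

    class Bar:
        def __init__(self, b, y):
            self.b = b
            self.y = y

        def foo(self, arg):
            # foo calls B.a and feeds the result to Y.x
            return self.y.x(self.b.a(arg))

    def test_foo_calls_a_then_x():
        # This test pins down *how* foo works, not *what* it is for.
        b, y = MagicMock(), MagicMock()
        result = Bar(b, y).foo(42)
        b.a.assert_called_once_with(42)
        y.x.assert_called_once_with(b.a.return_value)
        assert result == y.x.return_value

Any refactoring of foo's internals fails this test, even when foo's observable behaviour is unchanged.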



IMHO "unit tests as documentation" isn't quite the right description. Its more like "guaranteed to work example code and tutorials".

One example is where there might be preconditions for calling a function. That kind of information is essential to know, yet it can be hard to work out even with rich documentation. Without documentation, you're going to be reading a lot of code to try to reverse-engineer the internals.

With unit tests, you can do a quick "find all references", see how the method is called within different test cases, and be confident that the setup actions in the given test fixtures will work for you.
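
For instance (a made-up Session/process_batch pair, purely to illustrate the idea):

    class Session:
        def __init__(self):
            self.open = False

        def connect(self):
            self.open = True

    def process_batch(session, items):
        # Hidden precondition: the session must already be connected.
        if not session.open:
            raise RuntimeError("session must be connected first")
        return [item.upper() for item in items]

    def test_process_batch_with_connected_session():
        # The setup here doubles as a working recipe: connect first,
        # then process.
        session = Session()
        session.connect()
        assert process_batch(session, ["a", "b"]) == ["A", "B"]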


If a method has preconditions, say X cannot be null, Y must be positive, and X > Y, you could check this in a unit test. But honestly, make them the top three lines of the method with asserts. Living with the code is generally better than living off in some test. You can read it and see it very easily.
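
Something like this (a hypothetical compute(), with Python asserts standing in for whatever your language offers):

    def compute(x, y):
        # The preconditions, living with the code:
        assert x is not None, "x must not be None"
        assert y > 0, "y must be positive"
        assert x > y, "x must be greater than y"
        return (x - y) / y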


Assertions are, most of the time, completely useless and just a waste of time. If you put these assertions in a method, they will break only at runtime. Usually it's a rarely used method, so the code that caused the assertion to fail ships to production and you realise it's broken only when it's too late. If you had a proper unit test instead of an assertion, you would have caught the bug as soon as you pushed the code and continuous integration kicked in.


Why not put an assertion, then add a test that deliberately checks if it's triggered?
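
Reusing the hypothetical compute() sketched above, that could look like this (note Python strips asserts under -O, so this checks the default-mode contract):

    import pytest

    def test_compute_rejects_non_positive_y():
        # Deliberately violate a precondition; the assert should fire.
        with pytest.raises(AssertionError):
            compute(10, 0)

    def test_compute_rejects_x_not_greater_than_y():
        with pytest.raises(AssertionError):
            compute(1, 5)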


The comment that I replied to was about using assertions instead of tests. You can certainly write an assertion and test it after, but unless it is absolutely necessary I would prefer to avoid polluting the code with "documentation". The right place for that is in the unit tests, not in the middle of production code.


I disagree; the right place is in the code, where the functionality is defined. And I didn't read the comment you replied to as "assertions instead of tests". That would be absurd. You should have tests that exercise the code that contains the assertions. It's the best of both worlds: the code gets tested automatically, and you get inline documentation that can't lie to you.


I say code the happy path, and then have assertions.

It saves writing 12 unit tests that show that passing null breaks the code. No one cares about that.
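
E.g. (a made-up join_names, just to show the shape):

    def join_names(names, separator):
        # Guard clauses instead of a dozen null-input tests.
        assert names is not None
        assert separator is not None
        return separator.join(names)

    def test_join_names_happy_path():
        # One test for the behaviour that actually matters.
        assert join_names(["a", "b"], ", ") == "a, b"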


And the comment thread I was on was about using unit tests as documentation of valid inputs and outputs.

Asserts are documentation of valid inputs and outputs that breaks very cleanly when violated.


Sure. I don't mean simple local preconditions like argument validation, but more system-level issues that are inappropriate for asserts, e.g. maybe you need to hook an output sink to an event trigger before you invoke a function to see any results. Not having done this, or doing it in the wrong order, is not strictly a mistake from the system's perspective, but it would leave you stuck as a coder.
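
A made-up event-bus sketch of what I mean (all names invented):

    class EventBus:
        def __init__(self):
            self._subscribers = []

        def subscribe(self, handler):
            self._subscribers.append(handler)

        def publish(self, event):
            for handler in self._subscribers:
                handler(event)

    def run_job(bus):
        # With no subscribers, results silently vanish:
        # legal from the system's perspective, but useless to you.
        bus.publish("job-finished")

    def test_run_job_delivers_results():
        # The test documents the non-obvious ordering: hook the
        # output sink first, then invoke.
        bus = EventBus()
        received = []
        bus.subscribe(received.append)
        run_job(bus)
        assert received == ["job-finished"]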


An integration test should catch it if, for a given input, your entire app does nothing, right?


The issue isn't how to test preconditions, it's how to know they exist and how to meet them correctly. An integration test just gives you a big ball of mud of every production concern, whereas a unit test provides simple example code for a single concern that is maintained and tested.

It feels like discussing this without real code is likely to cause confusion. The type of systems I have in mind are those that invert control through e.g. pub/sub events, plugins, strategies, dependency injection etc. Unit tests help document how to use components in a system that has traded clarity of control flow in exchange for extensibility.
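
As a toy example (a hypothetical Report/CsvFormatter pair, sketching dependency injection):

    class CsvFormatter:
        def format(self, rows):
            return "\n".join(",".join(map(str, r)) for r in rows)

    class Report:
        def __init__(self, formatter):
            # Injected strategy: the control flow isn't visible
            # from reading Report alone.
            self.formatter = formatter

        def render(self, rows):
            return self.formatter.format(rows)

    def test_report_renders_csv():
        # Documents the wiring: Report needs something with .format(rows).
        report = Report(CsvFormatter())
        assert report.render([(1, 2), (3, 4)]) == "1,2\n3,4"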


I have found that these kinds of systems, the ones that put a ton of effort into unit testing, are wasting time. And when it comes time to refactor, you end up creating weeks of extra work to make the new unit tests pass.

How a system is wired together is not that exciting and not worth unit tests. If you screw up loading a config file on startup, I bet your first integration test fails. Why would you unit test this?
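
One smoke test like this (hypothetical load_config/start_app) covers all the wiring in one go:

    import json, os, tempfile

    def load_config(path):
        with open(path) as f:
            return json.load(f)

    def start_app(config_path):
        # If config loading is broken, this blows up immediately.
        config = load_config(config_path)
        return {"status": "up", "port": config["port"]}

    def test_app_boots_with_real_config():
        with tempfile.NamedTemporaryFile(
                "w", suffix=".json", delete=False) as f:
            json.dump({"port": 8080}, f)
            path = f.name
        try:
            assert start_app(path)["status"] == "up"
        finally:
            os.unlink(path)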



