
> All of us have used mocks in our testing code, to substitute the real things. I know I’m definitely guilty of this. Why though? Why do we do it? Well, It’s usually because initialising those collaborators is not trivial (i.e. can’t be done with a one-liner).

I practically never mock something because creating the real instance takes more than a single line of code or is otherwise non-trivial.

I don’t want my code making network requests if the actual instance is some kind of network/service client. As a matter of fact, I don’t want network requests in my unit tests. At that point, they’re not unit tests, and they could fail because the request failed or the system on the other end had issues - unrelated to what I’m actually trying to unit test.
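 
To make that concrete, here is a minimal sketch in plain Python using the standard-library unittest.mock (UserService, get_user, and the field names are hypothetical, not from the article). The dependency that would normally hit the network is replaced by a mock, so the test can only fail because of the logic actually under test:

    import unittest
    from unittest.mock import Mock

    class UserService:
        """Hypothetical code under test: formats a user's display name."""

        def __init__(self, client):
            self.client = client  # in production this would be a real network/service client

        def display_name(self, user_id):
            user = self.client.get_user(user_id)  # a network call in production
            return f"{user['first']} {user['last']}"

    class UserServiceTest(unittest.TestCase):
        def test_display_name_makes_no_network_request(self):
            fake_client = Mock()
            fake_client.get_user.return_value = {"first": "Ada", "last": "Lovelace"}

            service = UserService(fake_client)

            # Only the formatting logic is exercised; no request can fail here.
            self.assertEqual(service.display_name(42), "Ada Lovelace")

    if __name__ == "__main__":
        unittest.main()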

I think the author fundamentally has a different idea of unit tests. I write unit tests and mock a dependency because I want to test some code in isolation: the “mock” stands in for an interface whose outputs I can script, or which I can make fail, say by throwing an exception. I set up that behavior and then validate that my code produces the expected outputs or makes the expected calls on the dependency.
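 
Here is the other half of that, again just an illustrative sketch (OrderProcessor and the payment_gateway.charge call are made-up names, not from the article): script the dependency's output, or make it raise, then check both my code's result and that the dependency was called as expected:

    import unittest
    from unittest.mock import Mock

    class OrderProcessor:
        """Hypothetical code under test."""

        def __init__(self, payment_gateway):
            self.payment_gateway = payment_gateway

        def process(self, order_id, amount):
            try:
                receipt = self.payment_gateway.charge(order_id, amount)
            except ConnectionError:
                return "retry-later"
            return f"paid:{receipt}"

    class OrderProcessorTest(unittest.TestCase):
        def test_successful_charge(self):
            gateway = Mock()
            gateway.charge.return_value = "r-123"  # scripted output

            result = OrderProcessor(gateway).process("o-1", 9.99)

            self.assertEqual(result, "paid:r-123")
            gateway.charge.assert_called_once_with("o-1", 9.99)  # expected call on the dependency

        def test_charge_failure_is_handled(self):
            gateway = Mock()
            gateway.charge.side_effect = ConnectionError("network down")  # scripted failure

            result = OrderProcessor(gateway).process("o-1", 9.99)

            self.assertEqual(result, "retry-later")

    if __name__ == "__main__":
        unittest.main()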

The blog author seems to be conflating unit tests with integration tests, and I'm left wondering what the point is - if I take them as genuinely being onto something, perhaps we should throw away our tests and just test in production.



