
>I think this is supposed to mean using dependency injection instead.

I'm pretty sure he means "just use integration tests".




I think so too.

Unfortunately, integration tests are too slow, so the practice doesn't scale if one is trying to TDD.

Working integration testing into every user story will lead to friction. Test run times will balloon, cycle times will stretch out, and resentment will grow toward the test suite and the team's testing regime.

If your test suites cannot complete quickly (in seconds), then they cost more than they're worth. I've learned this doing outside-in TDD at scale: our code quality is glorious, but our test run times are untenable.

I'm experimenting with sociable tests to curb my appetite for integration tests, or at the very least to keep writing them while making it safe and productive to run the vast majority of them on the CI/CD server only.


> Unfortunately, integration tests are too slow, so the practice doesn't scale if one is trying to TDD.

If integration tests get more useful outcomes than unit tests in some situation but TDD only works well with unit tests, maybe that means TDD isn’t the best process to use in that situation. Isn’t the essence of agile development to recognise what is working well and what is not, so you can make changes where they are needed?

> If your test suites cannot complete quickly (in seconds), then they cost more than they're worth.

I disagree with this. The goal of testing in software development is to improve results. Any testing methodology should be evaluated accordingly. Fast feedback is good, other things being equal. However, if going slower means getting more valuable feedback — identifying more defects, providing more actionable information about how to correct them, checking for a broader range of potential problems — then maybe going slower is the right choice.


Integration tests, for my team at least, really are the workhorses for determining if we have shipping software or have more work to do. So much of our deployed code relies on configuration and packaging to function properly that unit tests just don't exercise it!

So many errors in our code over the years have only come out in our integration tests. I agree with you that, at my team's current maturity level, going slower and having multi-minute test runs is okay given our incredibly low defect rate in PROD.

That said, those long-running test suites have created too much friction for new feature development. So while the way we're working is comfy and life is easy, I feel we can do much better.

I believe it's possible to get rapid test cycles with our outside-in style of TDD: keep the added safety of integration tests but run the majority of them on the CI/CD server only, and change the way we unit test to rely less on mocks, which should get us the 2-5 second inner-loop cycle times we want to hit.
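A rough sketch of how that split could look, assuming a Python/pytest stack (the thread doesn't name one); the marker name and file layout are just illustrative:

    # conftest.py -- sketch with invented names: mark integration tests and
    # skip them unless the CI environment variable is set, as on the build server
    import os
    import pytest

    def pytest_configure(config):
        config.addinivalue_line(
            "markers", "integration: slow tests that hit real dependencies")

    def pytest_collection_modifyitems(config, items):
        if os.environ.get("CI") == "true":
            return  # on the CI/CD server, run everything
        skip = pytest.mark.skip(reason="integration tests run on the CI/CD server only")
        for item in items:
            if "integration" in item.keywords:
                item.add_marker(skip)

A slow test then opts in with @pytest.mark.integration; everyday local runs skip it automatically while the CI/CD server (with CI=true) runs the lot.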


I find I can do ITDD effectively with integration test times of < 30 seconds. I've even done it with tests of up to 5 minutes.

What makes it work is pairing it with a REPL. That way I have an "outer" loop that runs the slow test and primes the REPL, and an "inner" loop in the REPL where I can experiment in the area where I'm writing the code and get feedback quickly.

I might run the outer loop just a few times:

* At the beginning of the ticket

* When I've messed up the state with the REPL and want a clean slate.

* When I've pasted in code from the REPL that I think will make the test pass and I want final verification.

* One or two more times after that coz I missed something stupid.

Often the waiting times are a good time to get up and go for a walk/make a coffee/check my messages.
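For what it's worth, a bare-bones version of that outer/inner split might look like this, assuming Python (I'm not saying which language or REPL anyone actually uses; myapp.create_app is an invented entry point):

    # repl_session.py -- illustrative sketch only
    import code
    from myapp import create_app   # hypothetical application factory

    def build_world():
        # "Outer loop": the slow part -- real config, database, queues, etc.
        return create_app(config_name="integration")

    if __name__ == "__main__":
        app = build_world()   # pay the slow setup cost once
        # "Inner loop": experiment against the live objects, then paste what
        # works back into the real tests/code for a final verification run.
        code.interact(local={"app": app})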

>I'm experimenting with sociable tests

What's this? I'm unfamiliar with the term.


I don't have easy REPL opportunities for speeding up my cycles in the language/environment I have to work with. :/

Martin Fowler discussed it recently; note the section on sociable vs solitary tests: https://martinfowler.com/articles/2021-test-shapes.html

My team is oriented towards solitary tests. For unit tests, mocks stand in for the collaborators of the system under test. For integration tests, databases and queues are realistically simulated with locally running instances, and the downstream services we call are stubbed with WireMock, with local configs pointing service clients at those mocks. It's an incredibly effective way to produce reliable, fully tested code. It's also incredibly costly in execution time.
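For a concrete flavour of the solitary style (hypothetical Python; OrderService and its collaborators are invented names), every collaborator is mocked so only the unit's own logic runs:

    # Solitary: collaborators are mocks, and we assert on the interaction.
    from unittest.mock import Mock

    def test_checkout_charges_the_card():
        repo = Mock()                                    # every collaborator mocked
        gateway = Mock()
        gateway.charge.return_value = True
        service = OrderService(repo, payment_gateway=gateway)   # hypothetical class

        service.checkout(order_id=42, amount=100)

        gateway.charge.assert_called_once_with(100)      # verify the interaction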

James Shore has a nice take on sociable tests; I interpret what he's produced as a mishmash of unit and integration tests. I have trouble making my brain work that way, but I'm intrigued given his impressively low test run times. There's something special there; I feel it just needs a sexy paint job for people to take notice. https://www.youtube.com/watch?v=jwbKSiqG0DI

I liked this blog post on the topic as well: https://merge-conflict.com/post/sociable-unit-testing/
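For contrast, a sociable version of that same checkout test (hypothetical Python again, invented names) wires in real in-process collaborators and fakes only the process boundary, which is roughly how I read Shore's approach:

    # Sociable: real in-process collaborators, a hand-rolled fake at the network edge.
    def test_checkout_records_the_order():
        repo = InMemoryOrderRepository()                    # hypothetical real object, in-memory
        gateway = FakePaymentGateway(always_approve=True)   # hypothetical fake at the boundary
        service = OrderService(repo, payment_gateway=gateway)

        service.checkout(order_id=42, amount=100)

        assert repo.get(42).status == "paid"                # assert on observable state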


Generally you shouldn't really be using integration tests for TDD, but it's totally possible to write your tests in a way where the supplied dependency talks to a real system in one scenario and to a system with mocked responses in another - or any level of depth in between.
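Something like this, perhaps, assuming Python/pytest (all names invented): the test body stays the same and a fixture decides how real the injected dependency is.

    import os
    import pytest

    @pytest.fixture
    def search_client():
        # Flip between a real backend and canned responses without touching the test.
        if os.environ.get("USE_REAL_BACKEND") == "true":
            return RealSearchClient(url=os.environ["SEARCH_URL"])    # hypothetical
        return StubSearchClient(responses={"tdd": ["kent beck"]})    # hypothetical

    def test_search_finds_something(search_client):
        assert search_client.search("tdd")

Run with USE_REAL_BACKEND=true for the deep version, leave it unset for the fast one.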

However, I wouldn't start out writing a system this way (i.e., TDD). In my experience, the best-tested software looks like this as the end result, and the design of the software itself has been forged in the same furnace.

Not sure if that completely makes sense, since some of these concepts are from functional programming and I never know what is totally foreign or totally obvious.


Why run your whole test suite every time? Chances are you're only really interested in a small set of them; just let those run continuously.
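For example, assuming pytest is the runner (the thread doesn't say), you can select a subset and keep re-running it on every save:

    # run only the tests for the area you're touching, stop on first failure
    pytest tests/orders -k "checkout" -x

    # or re-run that same subset on every file save (pytest-watch plugin)
    ptw tests/orders -- -k "checkout" -x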


Definitely. Most of our developers already do that. We typically run only the subset for the service containing the feature we're focused on.

On our larger services, unit tests take 20-30 seconds to run (WAY too slow) and integration tests take 2-4 minutes. Both suites have hundreds of tests. There's no excuse for how slow the unit tests are; I chalk it up to heavy reflection in the mocking frameworks we rely on and inefficient use of those mocks. We've got to either optimize or reduce our use of mocks.


>I'm pretty sure he means "just use integration tests".

Really he means e2e testing. You can drive yourself mad writing integration tests that break constantly or give false positives. Better to unit test what is truly unit testable, then rely on e2e tests to ensure your integrations are fine. Ultimately, all that matters is the system running as expected at the user level.


I think that depends on the scope of what you're testing. If you have a database, a set of backend services, and one or more clients (websites, mobile apps, etc.), then I think it makes a lot of sense to test the backend services in isolation from the frontends (so not e2e tests), but backed by an actual database (so integration tests, not unit tests).
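A sketch of that middle layer, assuming Python (register_user and the schema are invented): exercise the backend against a real database, with no browser or mobile client in the loop.

    # Integration test of the backend in isolation: real database, no frontend.
    import sqlite3
    import pytest

    @pytest.fixture
    def db():
        conn = sqlite3.connect(":memory:")    # stand-in for the real database engine
        conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
        yield conn
        conn.close()

    def test_register_user_persists_a_row(db):
        register_user(db, email="a@example.com")    # hypothetical service entry point

        row = db.execute("SELECT email FROM users").fetchone()
        assert row == ("a@example.com",)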



