For those of you who have a software project with an automated test suite, in whatever language. Can the test suite run on a machine without network access, beyond localhost?
(There are no wrong answers. I'm trying to get a feeling of how common this is, for a possible future project of mine.)
@liw I voted localhost only, because the few network integration tests run against mock services that are executed locally.
@lpwaterhouse Good answer.
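A minimal sketch of that locally-executed mock-service pattern, assuming a pytest-style suite; the handler, fixture, and test names here are made up for illustration:

```python
# Sketch only: a throwaway HTTP mock bound to localhost so the
# integration test never leaves the machine; all names are illustrative.
import http.server
import threading
import urllib.request

import pytest


class _MockPaymentAPI(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Canned response standing in for the real third-party service.
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(b'{"status": "ok"}')

    def log_message(self, *args):
        # Keep test output quiet.
        pass


@pytest.fixture
def mock_service_url():
    # Bind to an ephemeral port on loopback and serve in a background thread.
    server = http.server.HTTPServer(("127.0.0.1", 0), _MockPaymentAPI)
    thread = threading.Thread(target=server.serve_forever, daemon=True)
    thread.start()
    yield f"http://127.0.0.1:{server.server_address[1]}/"
    server.shutdown()


def test_client_against_local_mock(mock_service_url):
    # A real test would point the client library at mock_service_url
    # instead of the production endpoint.
    with urllib.request.urlopen(mock_service_url) as resp:
        assert resp.status == 200
```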
@liw I learned the hard way not to test against (third-party) network resources: a payment provider that didn't check the "test mode" API flag when processing a "delete subscription" call :-P
@liw We added some checks to @librecast to detect the network level when running network-related tests.
Many build environments have no network, or very broken/restricted networking. So now we specify in a test the network level it requires, if any, e.g.:
`test_require_net(TEST_NET_ADDR6);`
This way we can skip the test with a warning, rather than fail the build, if, say, multicast routing is not found or an interface doesn't have an IPv6 address.
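The @librecast harness above is C, but the skip-with-a-warning idea translates to other test runners too. A rough pytest analogue, with a deliberately crude capability probe and made-up names, might look like:

```python
# Rough analogue of "require a network level, else skip" for pytest.
# The probe is best-effort and the names are illustrative; the point is
# skipping with a visible reason instead of failing the whole build.
import socket

import pytest


def _have_ipv6():
    # Can we create and bind an IPv6 socket on loopback?
    try:
        with socket.socket(socket.AF_INET6, socket.SOCK_DGRAM) as s:
            s.bind(("::1", 0))
        return True
    except OSError:
        return False


requires_ipv6 = pytest.mark.skipif(
    not _have_ipv6(), reason="no usable IPv6 on this build machine"
)


@requires_ipv6
def test_something_over_ipv6():
    # Network-level test body goes here; on machines without IPv6 it is
    # skipped with the reason above rather than reported as a failure.
    assert True
```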
@liw the tests run without access to the external network, but the setup for the test tooling requires access to the internet. I don't know which case this qualifies for. :)
@mariusor That would be "only localhost to run test".
@liw Doing the microserviceish stuff I do nowadays, I like to have multiple test suites which might have different requirements for network connectivity.
Still, I get annoyed when so-called unit tests require connectivity and perhaps even some credentials. They're not unit tests, in my opinion.
@liw "I have tests that require external network access" BUT "these are only executed if the optional environment variable TEST_ALLOW_NETWORK is set to 'true'".
That way it passes by default for Debian and friends, and my CI can flip it to true where I *know* network is OK.
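A minimal sketch of that opt-in gate, assuming pytest and the TEST_ALLOW_NETWORK variable quoted above:

```python
# Sketch of an opt-in gate: external-network tests only run when the
# environment explicitly allows them, so default builds stay offline.
import os

import pytest

network_allowed = os.environ.get("TEST_ALLOW_NETWORK") == "true"

needs_network = pytest.mark.skipif(
    not network_allowed,
    reason="set TEST_ALLOW_NETWORK=true to run external-network tests",
)


@needs_network
def test_talks_to_real_service():
    # Only runs in CI environments where network access is known to be OK.
    ...
```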
@liw absolutely, it's a necessity I'd say.
@liw I actually made a PR to Webdriverio because, even though the tests used Firefox and that was already downloaded, it still checked Google Chrome versions. The project was very helpful in shaping that PR, so at least after the tests have been run once, they run offline afterwards.
@liw Oh, the things I have seen in build systems.
@liw I divide my tests into two groups. The first group hits no external resources and usually runs quickly. The second group hits external resources (database, web services) and usually takes a lot longer.
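One common way to express that split, sketched with a hypothetical pytest marker (any test runner's tagging mechanism would do):

```python
# Sketch: tag the slow/external tests with a custom "external" marker
# (which would need to be registered in pytest.ini) so the fast group can
# run on its own. The marker name and test bodies are illustrative.
import pytest


def test_pure_logic():
    # First group: no external resources, runs quickly.
    assert sorted([3, 1, 2]) == [1, 2, 3]


@pytest.mark.external
def test_against_real_database():
    # Second group: hits a database or web service, takes a lot longer.
    ...
```

CI can then run `pytest -m "not external"` for the quick offline pass and the full suite only where the external resources are reachable.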
@liw Selected "localhost", but I do require a default route so that site-local multicast can be tested without a zone identifier (even though the responder is, again, local)
@liw in general, I like to have a part of the suite that can run completely locally. If the project relies on the network and/or remote third parties, I like to have additional layers checking them (closer to the real environment) and checking that my core suite is a good proxy.
@liw a project I work on has tests that depend on remote sites, but I've been working on eradicating them. One left, but I don't know what it returned as it stopped responding a couple of years ago... Makes it hard to replicate.
@liw The test suite for #SpamAssassin cannot be *completely* run without access to the (public) DNS for test domains that we control. This is due to the nature of the tool, but network tests can be easily skipped and tests that need more than DNS are off by default.
We *could* handle our deep dependency chain automatically at test time to make tests pass, but that would be wrong. :)