Client-server integration testing: mock or not?

I'm working on a project with two applications: an Android app (the client) and a REST service (the server). The Android app consumes the REST service.
Both applications are tested separately to ensure they do their jobs as expected.
During server tests I prepare requests and check the server's responses.
During client tests I set up a simple HTTP mock server and test the client's requests against different mocked responses.
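For illustration, such a client test might look roughly like the sketch below. It assumes OkHttp's MockWebServer and JUnit 4; FooClient and Foo are hypothetical stand-ins for the real client code.

// Client-side unit test against a mocked HTTP server (sketch).
// FooClient/Foo are placeholders; adapt the parsing and assertions to your client.
import java.util.List;
import okhttp3.mockwebserver.MockResponse;
import okhttp3.mockwebserver.MockWebServer;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class FooClientTest {

    @Test
    public void parsesFooList() throws Exception {
        MockWebServer server = new MockWebServer();
        server.enqueue(new MockResponse()
                .setResponseCode(200)
                .setHeader("Content-Type", "application/json")
                .setBody("[{\"id\":1,\"name\":\"foo1\"},{\"id\":2,\"name\":\"foo2\"}]"));
        server.start();

        FooClient client = new FooClient(server.url("/").toString());
        List<Foo> foos = client.getFooList();   // issues GET /foo-list.json against the mock

        assertEquals(2, foos.size());
        assertEquals("foo1", foos.get(0).getName());

        server.shutdown();
    }
}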
Now, this technique works pretty well. It gives me the flexibility I like: I can use different test frameworks and continuous integration environments. But there is one weak point: in both the client and the server test cases I specify the same API. I assume that, for example,
GET /foo-list.json
will return HTTP 200 with the JSON body
[{
  "id": 1,
  "name": "foo1"
}, {
  "id": 2,
  "name": "foo2"
}]
So I repeat myself, and if I change the response format, my client tests won't fail.
My question is about good practices for testing this kind of scenario. How do I create true integration tests without sacrificing the flexibility of independent tests? Should I test the client against a mocked server or against a real instance of my REST service?
Please share your professional experience.

In your scenario you should continue to write unit tests to test individual classes, and integration tests to test the interaction between multiple application layers (e.g. the business and database layers).
You ask:
"How to make true integration tests without sacrificing flexibility of independent tests"
All of your code should use abstractions, so that you can use dependency injection to unit test classes in complete isolation with mock dependencies. The use of mocks ensures that these tests remain independent, i.e. not coupled to any other classes. Taking this approach, the integration tests, which use your final concrete classes, will not affect the unit tests, which use the mocked classes.
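A minimal sketch of that idea, assuming Mockito and JUnit 4 (FooService and FooRepository are hypothetical types, not taken from the question):

// Unit test of a class in isolation via constructor injection and a mocked dependency.
import java.util.Arrays;
import org.junit.Test;
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

public class FooServiceTest {

    @Test
    public void returnsNamesFromRepository() {
        FooRepository repository = mock(FooRepository.class);    // mocked abstraction
        when(repository.findAll()).thenReturn(Arrays.asList("foo1", "foo2"));

        FooService service = new FooService(repository);         // dependency injected

        assertEquals(2, service.listFooNames().size());          // class tested in isolation
    }
}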
Also:
"Should I test client with mocked server or with a real instance of my rest service?"
In addition to unit and integration tests, you should also perform client-server integration testing; I use automated acceptance testing for this. With a test framework such as Cucumber (also check out calabash-android, which is written specifically for testing mobile applications) you can write tests for specific features and scenarios that interact with both the client (your Android application) and the server (your RESTful service). These client-server integration tests start up and stop concrete instances of the client and the server.
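As a hedged sketch of what such an acceptance test could look like with Cucumber-JVM (the scenario wording and the TestServer/TestClient helpers are illustrative assumptions, not part of Cucumber):

// Step definitions for a client-server acceptance test (Cucumber-JVM sketch).
//
//   Scenario: Client fetches the foo list
//     Given the REST service is running
//     When the client requests the foo list
//     Then the client shows 2 foos
//
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import static org.junit.Assert.assertEquals;

public class FooListSteps {

    private final TestServer server = new TestServer();   // hypothetical: starts the real REST service
    private final TestClient client = new TestClient();   // hypothetical: drives the Android client
    private int shownFoos;

    @Given("the REST service is running")
    public void theRestServiceIsRunning() {
        server.start();
    }

    @When("the client requests the foo list")
    public void theClientRequestsTheFooList() {
        shownFoos = client.fetchFooList(server.baseUrl()).size();
    }

    @Then("the client shows {int} foos")
    public void theClientShowsFoos(int expected) {
        assertEquals(expected, shownFoos);
        server.stop();
    }
}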

Mocks are for unit testing. Your description of the tests with mocks is exactly that: you test the client and the server as separate units.
Integration testing checks whether the units work well together. Since the interface is a REST interface, mocking makes no sense there; you have to test the real thing over HTTP.
See also What is the difference between integration and unit tests?
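For concreteness, a test of the real thing over HTTP could look roughly like this sketch (assuming JUnit 4, the JDK 11 HttpClient, and a service instance already running at the assumed base URL):

// Integration test that exercises the real REST service over HTTP (sketch).
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import org.junit.Test;
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;

public class FooListIntegrationTest {

    // Assumption: the service has been started by the build at this address.
    private static final String BASE_URL = "http://localhost:8080";

    @Test
    public void fooListReturnsJsonOverHttp() throws Exception {
        HttpClient http = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(BASE_URL + "/foo-list.json"))
                .GET()
                .build();

        HttpResponse<String> response =
                http.send(request, HttpResponse.BodyHandlers.ofString());

        assertEquals(200, response.statusCode());
        assertTrue(response.body().contains("\"name\""));   // same contract the client expects
    }
}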

If your service is based on Java, I'd strongly recommend looking into the Spock framework for mocking any calls that might come from the client. Since Spock builds on JUnit, you might also be able to use it for Android (though, to be fair, I've never done Android development).
I'd say you want to do two things: integration testing and unit testing. Integration testing would bring up the Android application and cause it to make service calls, ensuring the two sides interact with each other properly.
For your regular commits, however, I'd suggest unit tests that mock away everything but the class under test. Spock makes this pretty easy to do, and since it's built on top of JUnit, all it takes is a JAR.

There is no reason you can't run automated end-to-end tests against a real service instance. You can run a real service instance on the same test machine you use to run the unit tests, perhaps in the same container. You can make the server URL configurable so that the automated end-to-end tests point at that instance.
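One possible way to wire that up (the property and environment variable names are illustrative assumptions):

// Resolve the service base URL from configuration, defaulting to a local instance.
public final class TestConfig {

    private TestConfig() {
    }

    public static String serviceBaseUrl() {
        String fromEnv = System.getenv("SERVICE_BASE_URL");
        if (fromEnv != null && !fromEnv.isEmpty()) {
            return fromEnv;
        }
        // e.g. run the end-to-end tests with -Dservice.baseUrl=http://test-host:8080
        return System.getProperty("service.baseUrl", "http://localhost:8080");
    }
}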
Why would you do the extra work of creating a mock service if you can run the tests against the real service?
I would only create a mock service if it were an external service over which I had no control!

Related

Can you write tests for Express Google Cloud Functions?

I have built an Express API as a Cloud Function and I would like to know how to write tests for it.
To answer your question: of course you can write tests for Google Cloud Functions. I will even add that, as with any other application, you should write tests.
You can take a look at the Cloud Functions documentation about Testing and CI/CD. In the Node.js part, it shows how to test Cloud Functions with Mocha as a test framework and Sinon as a mocking framework. The testing process should be part of your local development and your CI tool, if you have one.
Basically, there are three types of tests:
unit tests
integration tests
system tests
In unit tests, you should mock your HTTP framework (Express) to test small parts of your code.
In integration tests, you should mock any external dependency between your Function and other components (for example, if your Function writes data to a Cloud SQL database, then you should mock this Cloud SQL database).
In system tests, you should deploy your Function to a specific GCP environment (most probably an isolated project), to ensure that your Function interacts well with other GCP components, or starts when you trigger it.
Finally, there was previously a Cloud Functions Node.js emulator for testing Functions locally. It has been deprecated and replaced by the Functions Framework, which can also be used to "spin up a local development server for quick testing".

How to test Service Contracts implemented as OSGi Bundles?

We are in the process of transitioning towards SOA.
Our current goal is to ensure that more of the application is developed as "services" (mainly to improve visibility of capabilities, encourage re-use and de-risk change). Some of those services will be exposed as web services, but many (probably the majority) will not, and will be used internally to help reap some of the benefits of SOA.
We currently intend to implement those "internal" services as OSGi bundles; however, we are struggling to understand how best to test them. Our goal is to enable the current system test team to test all types of services, and we have been investigating tools like SoapUI and SOA Test; however, it is becoming clear that we may face challenges in testing services implemented as OSGi bundles with tools like these, and indeed in asking the test team to do so.
So we're looking for some advice on how best to test aspects of our capability designed to act as a "service", but implemented as an OSGi bundle instead of a web service.
What tools would people recommend, and is this a type of testing that is traditionally done by a developer during unit testing, or can it be done by a less technical tester, applying the same basic principles of testing interfaces (i.e. inputs, processing, outputs)?
You could, in theory, use a Remote Service Admin implementation (like Aries RSA or Eclipse ECF) to expose your internal services during testing so that an external system test tool can access them.
I would not recommend letting an external team test your OSGi services, though. It is much better to test the services in your own build using an integration testing tool like Pax Exam. It lets you define which bundles and other configuration to install, then boots up an OSGi framework with your setup and runs (slightly modified) JUnit tests against it. The advantage is that such tests are quite realistic and still quite simple.
See the Pax Exam tests in Aries RSA or Apache Karaf for examples.
The first uses the Pax Exam forked container for very fast tests (<1 s per test), while the second uses the Apache Karaf container (~10 s per test) for tests that are very close to a production system.
So you get much faster feedback than with an external system test team, which will always lag a bit behind current development. It also lets you establish a policy that each team member runs the tests locally before committing.
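A hedged sketch of a Pax Exam test (the Maven coordinates and the FooService interface are placeholders; the exact options depend on your setup):

// Boots an OSGi framework, installs the bundles under test, and calls the registered service.
import javax.inject.Inject;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.ops4j.pax.exam.Configuration;
import org.ops4j.pax.exam.Option;
import org.ops4j.pax.exam.junit.PaxExam;
import static org.junit.Assert.assertNotNull;
import static org.ops4j.pax.exam.CoreOptions.junitBundles;
import static org.ops4j.pax.exam.CoreOptions.mavenBundle;
import static org.ops4j.pax.exam.CoreOptions.options;

@RunWith(PaxExam.class)
public class FooServiceOsgiTest {

    @Inject
    private FooService fooService;   // injected from the OSGi service registry

    @Configuration
    public Option[] config() {
        return options(
                mavenBundle("com.example", "foo-api", "1.0.0"),    // placeholder coordinates
                mavenBundle("com.example", "foo-impl", "1.0.0"),
                junitBundles());
    }

    @Test
    public void serviceIsRegisteredAndUsable() {
        assertNotNull(fooService);
        assertNotNull(fooService.getFooList());
    }
}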

WCF project will not be hosted when unit test runs

I created a WCF project using the IIS model. I then created unit tests which reference services from that project. When I start the unit tests, the service is not hosted and thus I get EndpointNotFoundException. When I simply hit F5 everything works fine and IIS Express comes up in the tray.
I checked the option "Always start when debugging" in the property pane of the service project, and it is set to true.
A unit test is a unit test: if it needs another process (some IIS or other web server), it is no longer a closed unit. If you want to test your service while it is hosted, I'd suggest hosting it in the test yourself; look into self-hosting services. Then you control which class is hosted, when, and where. For example, you may want a different URL for your unit test, and you may want to inject a different data layer so your tests don't need anything outside your unit, like a database.
You are not doing unit testing; you are actually doing integration testing, since your test suite has no direct knowledge of or binding to the service code and is just a client program of the service.
Both unit testing and integration testing contribute to good QA. Generally you should create unit tests which test the service code directly through in-process interfaces, and make sure they have comprehensive code coverage.
Then create integration tests using MSTest, NUnit or xUnit as the test harness, in order to test the run-time behavior of the service.
What puzzles you is how to run the test suite in the same VS solution while the service is running in debug mode. There are a few options:
Build the test suite using NUnit or xUnit, then run it outside the VS IDE that is running the service in debug mode. MSTest could support this as well, but only from the command line.
Host the service in IIS, with a batch file that copies the assemblies and web.config there on every update. Then attach the debugger to the respective w3wp.exe instance for the service code.
If you just want to test rather than debug, there is another option: use IIS Express. Use either C# code or a batch file to launch IIS Express with the service during test setup, and shut it down during teardown.

What is the best way to test clients written in different programming languages against a server?

We have written clients in different programming languages (Java, .NET/Silverlight, Flash, JavaScript) that communicate with a server, as our goal is to support various technologies on the client side. The functionality they are supposed to provide is the same.
One of the main challenges we are facing now is finding a simple and effective approach for testing this variety of client technologies against the server. Currently we use Maven, hooked up with many Maven plugins such as JSTestDriver, Flexmojo, NPanday and others which we have developed on our own. Is there a better approach?
Any help would be appreciated, whether it is recommendation for available frameworks/tools or innovative ideas to do this.
Thanks
What you need is a clean design; otherwise everything is a mess and you have to test everything together.
Your server should expose a single interface to other systems (browsers, desktop applications, mobile apps); then test that API thoroughly. You can do that using an appropriate framework, depending on the technology used for the server. This should be your main test effort. Then try to keep the API stable, so that for every new version of the server you just run a regression test.
Meanwhile, you can test each client application alone by creating a mock server that implements the same API (a minimal sketch of such a mock server follows below).
Finally, run integration tests against a live version of your server together with each client application.
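The mock server mentioned above doesn't have to be elaborate; since it serves plain HTTP, every client technology can hit it. A minimal sketch using the HTTP server built into the JDK (the path and payload are illustrative):

// Minimal mock server exposing the same API surface as the real backend (sketch).
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public final class MockApiServer {

    private final HttpServer server;

    public MockApiServer() throws Exception {
        server = HttpServer.create(new InetSocketAddress(0), 0);   // 0 = pick a free port
        server.createContext("/foo-list.json", exchange -> {
            byte[] body = "[{\"id\":1,\"name\":\"foo1\"}]".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
    }

    public void start() {
        server.start();
    }

    public int port() {
        return server.getAddress().getPort();   // clients under test point at this port
    }

    public void stop() {
        server.stop(0);
    }
}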
expect is a good framework for testing a program's external text interfaces, such as client-server interaction. Tests are written in Tcl and operate at a purely black-box level.

Best way to unit-test WCF REST/SOAP service while dynamically generating stubs

I have a web service written with WCF 4.0 that exposes REST and SOAP functions, and I want to set up my unit tests so that, as I work on the web service, the test framework starts the service outside of IIS and then runs the tests.
I want the stubs to be generated dynamically, as I am not certain what the interface will look like, and it is easier not to worry about generating them before I start the tests.
However, I couldn't get Groovy to work with my web service, so I am curious whether IronPython or IronRuby would work well for this, or whether there is another .NET language that might.
SoapUI can take your WSDL and/or WADL and generate your first set of tests. You can script more complex use cases. It is easy to use but powerful, through the use of Groovy or Java. It is without doubt the best test tool in this space.