How to stub HTTP requests in Common Lisp tests?

Is there any tool commonly used in Common Lisp tests to block all network requests and stub responses for specific URLs?
Just for reference, in Ruby we usually use:
https://github.com/bblimke/webmock
https://github.com/chrisk/fakeweb
(or even more powerful tools, like https://github.com/vcr/vcr, built on top of them)
I know similar tools exist in Python (I remember this one: https://github.com/gabrielfalcao/HTTPretty) and I have found:
https://github.com/johanhaleby/stub-http
created for Clojure and described here:
Strategy for stubbing HTTP requests in Clojure tests
Is there anything similar for Common Lisp? If not, how do you usually test code that opens connections and makes external requests? Do you just mock functions directly with tools like mockingbird and cl-mock, or is there something I'm missing?
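For what it's worth, stubbing at the level of the HTTP client function works well in Common Lisp. Below is a minimal sketch using cl-mock with FiveAM, assuming the code under test goes through drakma:http-request; the URL and the canned body are made-up placeholders, and drakma, fiveam, and cl-mock are assumed to be loaded (e.g. via Quicklisp):

    (defpackage #:my-app/tests
      (:use #:cl #:fiveam #:cl-mock))
    (in-package #:my-app/tests)

    (test status-request-is-stubbed
      (with-mocks ()
        ;; Within WITH-MOCKS, any call to DRAKMA:HTTP-REQUEST whose arguments
        ;; match this pattern returns the canned body instead of opening a
        ;; real connection -- including calls made by the code under test.
        (answer (drakma:http-request "https://api.example.com/status")
          "{\"status\":\"ok\"}")
        (is (string= "{\"status\":\"ok\"}"
                     (drakma:http-request "https://api.example.com/status")))))

Note that this replaces the function binding only for the dynamic extent of with-mocks, and it only intercepts what you explicitly stub; it does not block every outgoing request the way WebMock does by default.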

Related

Migrating from LoadRunner/The Grinder to JMeter: Where are the Scripts?

I've done load tests by building scripts in LoadRunner or The Grinder, and now I'm trying out JMeter, and it all feels incredibly clunky. Where are the scripts? Does everything have to be done through the UI? Can JMeter do complex scripting?
JMeter has a good, user-friendly GUI. You create scripts in JMeter through the UI, and JMeter saves them in XML format with a .jmx extension. Script creation is not as difficult as you suggest.
Check this site to get an idea.
Complex scripting can be done in JMeter using logic controllers.
Also, JMeter:
is free and open source
is lightweight and easy to install
supports any platform
supports many protocols - HTTP/HTTPS, FTP, SOAP, LDAP, JDBC, JMS, SMTP, POP, etc.
supports external plugins
can be extended with BeanShell scripts, Groovy, JavaScript, or Java.
I have tried out JMeter briefly, but I think it's best for you to try it yourself; there are a number of JMeter tutorials on the JMeter site itself, as well as on YouTube, which should prove useful.
Below are some links:
http://jmeter.apache.org/usermanual/intro.html
https://www.youtube.com/watch?v=cv7KqxaLZd8
Hope these links help you get a better understanding of JMeter.
What do you call "scripts"?
JMeter is designed so that anyone can create tests using the UI only. JMeter's Logic Controllers, Pre/Post-Processors, Assertions, etc. are quite enough to build a load test of any complexity.
If you feel constrained by the built-in JMeter Test Elements, you're welcome to extend it using:
Code-enabled test elements like the BSF, BeanShell, and JSR-223 Pre/Post-Processors, Samplers, Timers, and Assertions
Developing custom functions
Developing a custom implementation of the Java Request Sampler
Developing a custom implementation of your own Sampler or Function
Finally, it is possible to either run existing tests or create new ones using the JMeter API.

Using Common Lisp with Apache and FastCGI

Concerning a web app using Common Lisp, Apache, and FastCGI: can one match URLs to the desired functions defined at top level, rather than writing a separate script file per URL?
Is it possible to use both of the approaches above in a development environment built on the Common Lisp + Apache + FastCGI combination? If so, how, and with which tools?
Is it compulsory to use a server loaded into the Common Lisp implementation (such as the "simple-server" described on the page that presents sb-fastcgi) in such a development environment? Or is it not compulsory in an SBCL + Apache + sb-fastcgi setup, with the "simple-server" mentioned there being just one option among others? I would like to avoid servers like those that are not as well supported and maintained as Apache.
Does FastCGI connect the interpreter (here, a Common Lisp) to Apache directly, or via other server software loaded into Common Lisp, such as Hunchentoot or the "simple-server" mentioned above?
I want to have both of the approaches mentioned at the top and to be able to run Common Lisp with Apache via FastCGI. Which tools do I need, as necessary and sufficient?
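(For concreteness, the kind of setup being asked about could look like the sketch below, which assumes the Clack stack recommended in the answer that follows; the route, function name, and port are placeholders. Apache would then be pointed at that FastCGI port via mod_fastcgi, mod_fcgid, or mod_proxy_fcgi.)

    ;; One top-level Lisp function handles every URL; no per-URL script files.
    ;; Assumes Clack and its FastCGI handler are available through Quicklisp.
    (ql:quickload :clack)

    (defun app (env)
      (let ((path (getf env :path-info)))
        (cond
          ((string= path "/status")
           '(200 (:content-type "text/plain") ("up")))
          (t
           '(404 (:content-type "text/plain") ("not found"))))))

    ;; Run the app as a FastCGI responder that Apache connects to.
    (defvar *handler* (clack:clackup #'app :server :fcgi :port 9000))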
I use Clack for all web development in Lisp now.
With Caveman2 you will have a pleasant Lisp web experience :)
As the web documentation states:
The reason why Clack had only a few bugs so far is plenty of quality unit tests. There are 173 tests currently. The test coverage has been kept over 70% since its first official release.
All releases have passed the test suite on three CL implementations: Clozure CL, SBCL, and CLISP. You can check the current status at http://ci.clacklisp.org/.
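To give a taste of the URL-to-function style this enables, a Caveman2 route is just a top-level form (this follows the pattern shown in the Caveman2 README; the package name below is whatever your generated project uses and is only an assumption):

    (in-package :my-app.web)  ; package created by the Caveman2 skeleton (assumed)

    ;; URLs map straight onto top-level route definitions.
    (defroute "/hello/:name" (&key name)
      (format nil "Hello, ~A!" name))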

Artifice for Objective-C?

Is there an Objective-C version of Artifice?
If not, how would I design/develop/create it?
Related Questions
Mock HTTP response via Objective-C
Mock NSURLConnection
I think I might be able to help you here.
I have a Ruby library that is somewhat similar to Artifice, albeit more self-contained and built on top of Sinatra, called Mimic. I'm pretty happy with it, and one of my favourite features is that, as well as being configured using its Ruby DSL (or the Sinatra API directly), it can be configured remotely, from any process that speaks HTTP. This means you can use it in your Objective-C tests and configure it from the tests too (rather than having, say, a set of external fixtures in a Ruby file).
In the name of eating my own dog food, I recently converted the acceptance tests for my Objective-C RestClient port, Resty, to use Mimic. The Mimic daemon is started up as part of the build process, and my stubs are configured directly in the tests, using a thin Objective-C wrapper around the Mimic REST API.
As you can see, I strive very hard for test clarity!
Those tests use OCUnit but you can use this with Kiwi. In fact, the assertEventually macro in the above tests was the basis of the asynchronous testing support that I ported to Kiwi.
I've since extracted the Objective-C wrapper for Mimic from LRResty and moved it into the Mimic repository. You may want to check out the Resty project to see how my project and the tests are configured. If you have any questions, please ask.
One caveat: I haven't found a way of getting these tests to run successfully in Xcode 4, using the "Test" option, due to the way that it runs. In Xcode 3, I rely on Run Script build phases to start and stop the Mimic daemon, but because Xcode 4 doesn't run the tests as part of the build process this doesn't work. I've tried to accomplish something similar using pre/post test actions but unfortunately these are woefully inadequate due to various bugs.
Bonus tip: I find the Charles debugging proxy a massive help when working with web services, and you can use it with Mimic too; the Objective-C wrapper can be proxied through Charles so you can see exactly what is happening, both in terms of stub configuration and actual HTTP requests (Mimic can even be configured to return some helpful debugging data in the response headers).
Do let me know if you have any questions.

Jakarta Cactus alternative?

Greetings, we have a project with loads of beans, JSPs, etc. There is a desperate need for automated tests in our environment (we use Maven). Now, we can easily write tests for the database layer of the project and for the various security utilities we implemented, but the JSP pages remain untested.
I searched for utilities for server-side testing, and Cactus seems the best option. However, according to its change list, its last release was 1.8, more than two years ago!
So the question is: what happened to Cactus, is it still being developed? And what are the recent alternatives to Jakarta Cactus (if any exist)?
I've used a combination of Spring, JUnit and HttpClient with some success in recent projects.
Apache HttpClient provides a powerful and flexible API for constructing and sending HTTP requests into your application. It cannot replicate a web browser, say by running client-side scripts; however, if there is sufficient content within the resulting HTTP responses (headers, URI, body), then you can use this information to traverse pages within the application and validate the behavior. You can post forms, follow redirects, process cookies, and supply the inputs to your application.
JUnit (junit.org) drives the tests, invoking a series of pages with HttpClient and can be deployed alongside the application, run standalone with ant/maven, or run separately inside your IDE.
Spring (springsource.org) is, of course, optional as you may not be using it for your project. I've found it useful to stub/mock out parts of the application, such that I can isolate specific areas, such as front-end controllers, through to the business logic, by substituting the DAOs to return specific data values. It provides an excellent Test Context Framework and specialized TestRunners that hook in well to testing frameworks like JUnit (or TestNG if you prefer).
Cactus served as a good server-side testing framework in the EJB 2 era, but it's not supported anymore.
You can use a combination of mock testing (fine-grained) and in-container testing (coarse-grained) strategies to test your application completely.
Mock testing frameworks: Mockito, JMockit, EasyMock, etc.
Integration testing frameworks (Java EE): Arquillian, the Embeddable API, etc.
I prefer Mockito and Arquillian for server-side testing.
How about Arquillian? I haven't used it and it doesn't even have a stable version yet, but at least it's in active development.
You might want to try Selenium; I'm finding that it plus JBehave is a good combination. And the more support both of those projects get, the less likely they are to go defunct (like Cactus).

Apache module for RESTful services

My objective is to create an Apache module that will provide RESTful services (i.e., we have some legacy code that controls/queries some networking equipment, and we would now like to expose that functionality as a RESTful service). I guess the flow might look something like this:
Web browser -- issues RESTful URI --> [Apache (my_module)] --> interface to existing legacy code
I have been mucking around in various wikis, blogs, forums, articles, etc., but I just can't seem to understand how those RESTful URLs will get to my_module in Apache [you can tell I have never worked with web-server internals, much less modules, before]. I mean, do I have to edit the httpd.conf file and say something like "send all URLs that look like http://baseurl/restservices/... to my_module"? If so, how do I do it?
Also, what will my_module actually get? Does it receive the full HTTP request message and have to parse it like a typical CGI program does?
Further, what is the best way for my_module to interact with my legacy code? E.g., open a TCP connection to it, send messages, and write a wrapper around the legacy code to interpret those messages? Or could my_module directly invoke the functions in my legacy code somehow if I compiled the entire legacy codebase into an Apache module?
Thanks for any hints. If you know of a good tutorial, please point me to it. I'm looking for a high-level overview that will give me the architecture (the developers under me can then follow up on the nitty-gritty details).
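(For what it's worth, the httpd.conf side of this is usually just a mapping from a URL prefix to the content handler your module registers; a hypothetical sketch, with made-up module and handler names:)

    # Load the compiled module, then send everything under /restservices
    # to the handler it registers, i.e. the string your C code compares
    # r->handler against in its handler hook.
    LoadModule my_module modules/mod_my_module.so

    <Location "/restservices">
        SetHandler my-module-handler
    </Location>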
I'd write an extension for PHP or Python and use mod_php / mod_wsgi
I think you are approaching this in the wrong way: Apache modules are not really how you want to handle a URL if your requirements are quite basic. Depending on the language your legacy code is in, I would advise binding its API into a Python or PHP module and having that script called by Apache through normal means. It is also a lot simpler (in many cases) to glue a C-call-style compiled language to these scripting languages than to Apache itself.
This also has the advantage of adding an abstraction layer, which allows you to put additional logic, written in a scripting language, on top of your core legacy code. You may also want to preprocess and validate request data before handing it to your legacy code.
Both PHP and Python have RESTful frameworks and utilities.
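(As an illustration of "called by Apache through normal means": with the Python route the Apache glue can be as small as the hypothetical mod_wsgi sketch below, where the paths are placeholders and rest.wsgi is a WSGI script wrapping the legacy API.)

    # Map a URL prefix onto a WSGI script that wraps the legacy code.
    LoadModule wsgi_module modules/mod_wsgi.so
    WSGIScriptAlias /restservices /var/www/legacy/rest.wsgi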
If you do write an Apache module, then check out Writing Apache Modules with Perl and C
See: Developing PHP Extensions in C and Extending Python in C or C++; also, if you're using Python, check out the WSGI stuff.
I'd agree with Aiden. Writing Apache modules is not for the faint-hearted, and you definitely don't want to go there unless you absolutely must. You would need to be prepared to become very conversant with how Apache works.
If you still think you need to, then look at:
http://httpd.apache.org/apreq/
This is a library that uses the existing Apache Runtime libraries and provides higher-level functionality for dealing with POST data, cookies, etc. from C code hooked into Apache via a custom module.
The book Aiden mentions, though, is a bit dated. You're better off getting:
The Apache Modules Book: Application Development with Apache