Testing a mail server in Clojure tests

I need to test our mail server in a Clojure project. My idea is to start a mock server, send emails through it, and check that they are sent. For that I found, for example, this server.
To be able to execute
lein test
and have every test run, I need the SMTP server available: I could start it once before all the tests and shut it down at the end. I could also run the server in a fixture and stop it after each test, but since I am running about 100 tests, it does not make sense to start and shut down the server every time.
The approaches I have thought of are the following:
1 - Write a bash script that starts the mock mail server, runs lein test, then shuts the server down.
(Here I lose the convenience of running lein test from the IDE.)
2 - Have a fixture check whether the server is already started and start it if it is not. However, after the tests finish the server will still be running, which is not desired.
What is the correct way to solve this problem?
Can I order the tests in Clojure so that the last test file shuts down the mail server?

One solution is to use the Java GreenMail library. It lets you start an SMTP mail server inside your JVM, which makes it easy to start, stop, and inspect.
Here are a few snippets from my testing with GreenMail. First you create the mail server:
(import '(com.icegreen.greenmail.util GreenMail ServerSetup)
        '(javax.mail Message$RecipientType))
(require '[clojure.string :as str])

(def mail-setup (ServerSetup. ServerSetup/PORT_SMTP nil ServerSetup/PROTOCOL_SMTP))
(def green-mail (GreenMail. mail-setup))
(.start green-mail)
;; Now the server listens on localhost's SMTP port for emails.
;; Run the code under test. Then you can inspect the received emails:
(doseq [m (.getReceivedMessages green-mail)]
  ;; Verify the emails etc.
  (println "Subject: " (.getSubject m)
           " TO: " (str/join "," (.getRecipients m Message$RecipientType/TO))))
;; When done
(.stop green-mail)
Depending on your tests you can start and stop it per test, or keep a test server running for the whole suite.
Check the GreenMail documentation for more details. It supports tons of scenarios.
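To address the original question about starting the server once and shutting it down after the last test: clojure.test's use-fixtures with :once wraps a fixture around all tests in a namespace, so no test ordering is needed. A minimal sketch combining it with GreenMail (the mail-fixture name is made up):

```clojure
(ns mail-test
  (:require [clojure.test :refer [use-fixtures]])
  (:import (com.icegreen.greenmail.util GreenMail ServerSetup)))

;; Hypothetical :once fixture: starts GreenMail before the first test in
;; this namespace and always stops it after the last one.
(defn mail-fixture [f]
  (let [server (GreenMail. (ServerSetup. ServerSetup/PORT_SMTP nil
                                         ServerSetup/PROTOCOL_SMTP))]
    (.start server)
    (try
      (f)                    ; run every test in the namespace
      (finally
        (.stop server)))))   ; shut down even if a test throws

(use-fixtures :once mail-fixture)
```

Note that :once is per namespace, so with lein test each test namespace starts and stops the server once, which is still far cheaper than doing it per test; sharing one server across all namespaces would need a custom test runner.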

Related

Running E2E-Tests in parallel

We are trying to run Selenium tests on BrowserStack against an AWS-hosted Vaadin app from several Jenkins slaves in parallel.
Company's Jenkins -> BrowserStack -> AWS Vaadin app
Our test framework uses the Vaadin Testbench with a valid license key.
All tests start as expected with a login (at the app) and the business workflow. But after a while the connection is closed on all tests, and the Vaadin framework shows "server connection lost".
T0 -> T1
-> T2
-> Tn
If we run the same tests on just one Jenkins slave in sequence, everything is OK (we sometimes see "server connection lost" here too, but the Selenium tests wait and continue once the warning disappears, which usually works; in parallel it never does).
T0 -> T1 -> T2 -> Tn
Do you have an idea, why this happens? Could it be a problem with our Vaadin license?
It sounds like a problem with the server, not the tests. Is your server running out of memory from doing too much in parallel? Check the server logs; that is where you should find the reason. You can most likely reproduce the same "server connection lost" by manually opening a session in your browser while the tests are running.
We got help from BrowserStack support:
E.g. initiate a binary connection as:
./BrowserStackLocal --key $BROWSERSTACK_ACCESS_KEY --local-identifier test123
This initiates a binary connection with the unique identifier "test123". It can then be used when executing the tests by setting the capabilities below:
caps.setCapability("browserstack.local","true");
caps.setCapability("browserstack.localIdentifier", "test123");
The same details are also mentioned in the link: https://www.browserstack.com/local-testing/app-automate#multiple-local-testing-connections

How to start and pause a JMeter test plan run

I have a JMeter test plan that runs in non-GUI mode on Linux to test a server. I want to pause the test plan for some time to carry out maintenance on the server, and then resume it from where it was paused.
I don't know beforehand when I will need to stop the test plan, so I can't code this with timers in JMeter.
Is there a pause button in JMeter, in GUI or non-GUI mode, to pause the test plan?
Linux solution. If you're running Linux you can use the kill command like:
kill -STOP 1234 - pause JMeter
kill -CONT 1234 - resume JMeter
replacing 1234 with the JMeter Java process ID.
JMeter solution. You can add a Constant Throughput Timer to your test plan and set the desired throughput in "requests per minute" using the __P() function. When you need to suspend JMeter, set the desired throughput to 0 via the Beanshell Server. Check out the How to Change JMeter's Load During Runtime article for comprehensive information if needed.
It is not possible to pause JMeter execution. Thread groups are configured in such a way that they wind through their requests and execute the test plan fully. The only way to make changes on the server is to stop the test, change what you need to, and then execute the test again from the beginning.
Also, the test will not make any sense if you resume it after updating the server, because when updates are done on servers, Apache is usually restarted to be safe. That means the requests you sent previously are no longer in the queue. So even if there were a pause button in JMeter, the test after resuming would be the same as a new test.
Best practice: run the test completely before the server update and save the results.
Run the test again after the server update and save the results.
Then compare the two.

How to pause a JMeter run in non-GUI mode on a Linux server

I am posting data to a REST API using HTTP POST requests. The JMeter setup is single-threaded. As I am making 200,000 POST calls, I want to be able to pause the run when needed and resume it when needed.
NOTE: I am running JMeter in non-GUI mode on a Linux server, which has no GUI at all.
The other important thing is that I can't program this before the run starts, because I'm not sure when to pause or resume the suite.
The JMeter-specific solution would be to use the Beanshell Server and Constant Throughput Timer in combination.
Add a Constant Throughput Timer to your test plan and set your desired throughput in requests per minute. If you don't want to limit JMeter, set it to something very high using the __P() function:
${__P(throughput,10000000)}
Enable Beanshell Server by adding the next 2 lines to user.properties file:
beanshell.server.port=9000
beanshell.server.file=../extras/startup.bsh
Create 2 scripts:
suspend.bsh, containing the line:
setprop(throughput, 0);
and resume.bsh, containing the line:
setprop(throughput, 10000000);
Whenever you need to suspend your test, invoke the following command from the "lib" folder of your JMeter installation:
java -jar bshclient.jar localhost 9000 /path/to/your/suspend.bsh
Check out the How to Change JMeter's Load During Runtime article for more details.
The Linux-specific solution would be to use the kill command like:
to suspend: kill -SIGSTOP JMETER_JAVA_PID
to resume: kill -SIGCONT JMETER_JAVA_PID
where JMETER_JAVA_PID is the process ID of the JVM running JMeter; you can find it with the jps command.
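As a sketch, the suspend/resume cycle can be tried out on any process first; here a sleep stands in for the JMeter JVM, whose real pid you would take from the jps output:

```shell
# Stand-in for the JMeter JVM: a long-running background process
sleep 60 &
PID=$!

kill -STOP "$PID"        # suspend: the process state changes to T (stopped)
ps -o stat= -p "$PID"    # shows a state beginning with "T"

kill -CONT "$PID"        # resume: the process runs again
kill "$PID"              # clean up the demo process
```

SIGSTOP cannot be caught or ignored, so this freezes every JMeter thread at once; on resume, in-flight samplers simply continue where they left off.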

Before launch, run external tool asynchronously?

Is it possible to create run configurations where the IDE doesn't wait for the external tool to exit before launching?
I'm currently trying to get Dart's pub serve to run on my project directory before opening the Dartium browser on localhost:8080, but it seems the IDE just waits for pub to exit first, which won't happen, as pub serve keeps printing output from the local server.
Any ideas?
Probably not the best answer, but this should work:
Create a wrapper script, say pub_async.bat, that asynchronously invokes pub.bat with the correct command-line arguments and returns immediately.
You may lose the ability to "stop" the process, so shutting down / cleaning up may need to be handled in a different way.
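A minimal sketch of such a wrapper for Linux/macOS (a Windows .bat version would use start /b instead); the script name and the log/pid file paths are made up:

```shell
#!/bin/sh
# pub_async.sh - hypothetical wrapper: launch `pub serve` detached and
# return immediately, so a "before launch" step does not block on it.
nohup pub serve > /tmp/pub_serve.log 2>&1 &
echo $! > /tmp/pub_serve.pid   # record the pid so the server can be stopped later
echo "started pub serve, pid $(cat /tmp/pub_serve.pid)"
```

Saving the pid partially recovers the "stop" ability mentioned above: the server can later be shut down with kill $(cat /tmp/pub_serve.pid).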

JMeter hangs and won't return

I am running 340 concurrent users in a load test against a server using JMeter.
But in most cases JMeter hangs and won't return; even if I try to close the connection it just hangs, and eventually I have to kill the application.
Any idea how to check what is holding the requests, inspect the requests sent by JMeter, and find the bottleneck?
I got the following message when closing the threads:
Shutting down thread, please be patient
I've hit this several times over the past few years. In each of my cases (yours may differ) the issue was with the load balancer (F5) I was sending my traffic through. A feature called OneConnect was holding the connections in a TIME_WAIT state and never killing them.
Run a packet capture tool like Wireshark and see what's happening with the requests.
Try distributed testing; 340 concurrent users is not a big deal, but you can still try it and see if it decreases your pain. Also take a look at the following link:
http://jmeter.apache.org/usermanual/best-practices.html#lean_mean
First check your script is OK with one user.
Ensure you use assertions.
Then run your test following JMeter best practices:
no GUI
no costly listeners
You should then be able to see in the CSV output which requests take longest and fix your issue.
I also encountered this problem when I ran JMeter on my laptop (Core 2 Duo 1.5 GHz): it always hung in the middle of processing. I tried running on another PC that is more powerful than my laptop, and it now works smoothly. JMeter will run more effectively if your PC or laptop has better specs.
Note: it is also advisable to run JMeter in non-GUI mode.
Example to run JMeter in Linux box:
$ ./jmeter -t test.jmx -n -l /Users/home/test.jtl
I had the
one or more test threads won't exit
message because a firewall was blocking some requests. So I had to wait out the firewall's timeout for all the blocked requests... then it returned.
You are probably getting this error because the JVM is not capable of running so many threads. If you take a look at your terminal, you will see the exception:
Uncaught Exception java.lang.OutOfMemoryError: unable to create new native thread. See log file for details.
You can solve this by doing remote (distributed) testing with multiple machines running, instead of one.
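Before moving to distributed testing, it can also be worth shrinking the per-thread native stack so more threads fit in the same memory. JMeter's startup scripts honor the JVM_ARGS environment variable; the 256k value below is just an illustration to tune for your plan:

```shell
JVM_ARGS="-Xss256k" ./jmeter -n -t test.jmx -l results.jtl
```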