JMeter assessment - testing

Can anyone help solve this assessment?
Using the JMeter framework (https://jmeter.apache.org), please implement a load test script:
The script should send 10 concurrent requests to Capital API: https://restcountries.eu/rest/v2/capital/?fields=name;capital;currencies;latlng;regionalBlocs
The script should read the capital values from a CSV file (contains 10 capital names)
The script should perform a status code verification for the transaction response
The script should run for 2 minutes.
The script should contain at least 2 listeners.

My suggestion is: create the script on your own. It's the best way to learn any subject. Contributors to this forum will be more than happy to answer any specific questions if you get stuck in your quest.

The script should send 10 concurrent requests - concurrency is defined at the Thread Group level
Requests are configured using the HTTP Request sampler
Values can be read from the CSV file using the CSV Data Set Config
Status code verification is more or less done automatically by JMeter: it treats status codes below 400 as successful. Additionally, you can use a Response Assertion for this.
It's not recommended to use Listeners at all during the load test itself; they consume resources, so add them only while debugging the script.
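The two checks above are configured in the JMeter GUI or the .jmx file rather than written as code, but as a rough illustration the logic they perform can be sketched in plain Scala. The one-capital-per-line CSV layout is an assumption:

```scala
// Illustrative sketch only - this is NOT JMeter code, just the logic
// the JMeter elements above apply.
object CapitalCsvCheck {
  // JMeter's default success rule: any status code below 400 passes.
  def isSuccess(statusCode: Int): Boolean = statusCode < 400

  // CSV Data Set Config equivalent: one capital name per line.
  def readCapitals(csv: String): Seq[String] =
    csv.linesIterator.map(_.trim).filter(_.nonEmpty).toSeq
}
```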

Related

Is there any way to store the API response to a file while performing a load test with Gatling using Karate

I am performing a load test with Karate Gatling. As per my requirement, I need to create a booking, take the bookingId from the response, and pass it to the update/cancel booking request.
I have tried with below process:
In the test.feature file:
def createBooking = call read('createBooking')
def updateBooking = call read('updateBooking') { bookingid: createBooking.response.bookingId }
I am trying to apply 1000 ramp users at a time.
In the Gatling simulation file:
val testReq = scenario("testing").exec(karateFeature("classpath:test.feature"))
setUp(
testReq.inject(rampUsers(1000).during(1 seconds))
)
This process is unable to provide the required throughput, and I cannot tell whether the bottleneck is in Karate or in the API server. Each scenario has both create and update bookings, so I am trying to capture all 1000 booking ids from the responses during the load test and pass them to the update/cancel bookings. I will save them to a file and use the booking responses for updating bookings. As I am new to Karate, can anyone suggest a way to store all the load test API responses to a file?
The 1.0 RC version has better support for passing data across feature files; refer to this: https://github.com/intuit/karate/issues/1368
So in the Scala code you should be able to do something like this:
session("myVarName").as[String]
And to get the RC version, see: https://github.com/intuit/karate/wiki/1.0-upgrade-guide
That said - please be aware that getting complex data-driven tests to work as a performance test is not easy, so yes - you will need to do some research. My suggestion is to read and understand the info in the first link in this answer.
Writing to file is absolutely NOT recommended during a performance test. If you really want to go down that route, please read this: https://stackoverflow.com/a/54593057/143475
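If you do decide you need to aggregate booking ids across virtual users, an in-memory, thread-safe collection that you inspect once after the run is much cheaper than file writes during the test. A hypothetical sketch (`BookingIds` is not part of Karate or Gatling, just plain JVM code you could call from your hooks):

```scala
import java.util.concurrent.ConcurrentLinkedQueue
import scala.jdk.CollectionConverters._

// Hypothetical sketch: collect booking ids in memory during the run
// instead of writing to a file on every response.
object BookingIds {
  private val ids = new ConcurrentLinkedQueue[String]()

  def record(id: String): Unit = ids.add(id) // safe from any user/thread

  def all: Seq[String] = ids.asScala.toSeq   // read once, after the run
}
```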
Finally if you are still stuck, please follow the instructions here: https://github.com/intuit/karate/wiki/How-to-Submit-an-Issue

Jira API bulk create - create all or none

I want to create multiple issues at once using /rest/api/2/issue/bulk endpoint.
However, I want it to fail if ANY of the tickets fails. Right now it creates the tickets that are correct, but I would prefer to block it from adding any ticket if at least one fails.
Is there a way to do it? Thanks!
If the response from the server is an error message (i.e. the request failed), why not use that as the point to stop processing any further requests?
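A hedged sketch of that approach - `create` is a stand-in for your HTTP call returning a status code, not a real Jira client. Note this stops at the first failure rather than giving true all-or-none semantics, which (per the question) the bulk endpoint itself does not provide:

```scala
object BulkCreate {
  // Attempt the creates one at a time and stop at the first failure.
  // Returns the issues that were created and the first one that failed, if any.
  def createUntilFailure(
      issues: Seq[String],
      create: String => Int // hypothetical: sends one create request, returns HTTP status
  ): (Seq[String], Option[String]) = {
    val (ok, rest) = issues.span(issue => create(issue) < 400)
    (ok, rest.headOption) // issues after the failure are never attempted
  }
}
```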

Gatling, testing concurrency issues

We have a concurrency issue in our system. This one occurs mainly during burst load through our API from an external system and is not reproducible manually.
So I would like to create a Gatling test to 1) reproduce it whenever I want and 2) check that we have solved the issue.
1) I am done with the first point. I have created two requests checking for status 201, and I run them with many users.
2) The issue allows the creation of two resources with the same unique value. The expected behaviour is that one is created and the others fail with status 409. But I have no idea how to check that exactly one request completes with 201 while all the others fail with 409.
Can we do a kind of post-check on all requests with Gatling?
Thanks
Store results already seen in a global ConcurrentHashMap and compute the expected value for the is check in a function, based on presence in the CHM (201 if the key is missing, 409 if it already exists).
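A sketch of that idea in plain Scala (the Gatling wiring - calling this from inside a check's is function - is left out):

```scala
import java.util.concurrent.ConcurrentHashMap

// The first request for a given unique value should get 201, every later
// one 409. putIfAbsent is atomic, so concurrent users agree on who was first.
object ExpectedStatus {
  private val seen = new ConcurrentHashMap[String, Boolean]()

  def forKey(key: String): Int =
    if (seen.putIfAbsent(key, true) == null) 201 // first use of this key
    else 409                                     // key already created
}
```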
I don't think you can achieve what you're after with a check on the call itself, as Gatling users have no visibility of the results returned to other users, so you have no way of knowing whether the successful (201) request has already been made (short of some very messy hacking using a check transformer).
But you could use simulation-level assertions to do this.
So you have your request, where you assert that you expect a 201 response:
http("my request")
  .get("myUrl")
  .check(status.is(201))
This should result in all but one of these requests failing in the simulation, which you can specify using the assertion...
setUp(
  myScenario.inject(
    ...
  )
).assertions(
  details("my request").successfulRequests.count.is(1)
)

HTTP POST request gets cancelled after 2 minutes while the process keeps working

My problem is:
HTTP request gets cancelled after 2 minutes but server side processing still continues.
I have a large data-processing job and my database contains a huge amount of data, so I am using a normal form submit for the processing screen. When I check the browser console, the request status becomes "cancelled", but on the server the process continues after the request is cancelled. Also, when the request is cancelled, a file is downloaded automatically which cannot be opened and which has no file extension. I set the maximum execution time to unlimited using
set_time_limit(0);
but it didn't change the situation. In my code I write some content to a file, so after the request gets cancelled the file-writing operation continues. I am trying to resolve this error but haven't found a solution. Please help me.
I am using an Apache server.
Process I am doing:
1. Selecting a large number of records from a table which contains a large amount of data.
2. Checking whether each record matches certain conditions.
3. Matching records are written into a file, and that file is the generated report.
4. Allowing the user to download the file after the process completes.
I have heard that if the client doesn't receive any response within a particular time, it will cancel the request to the server. Is this my issue? If so, how can I resolve it?
PHP doesn't find out that the request has been cancelled until it tries to send data to the client. You should be able to resolve this by doing the following at regular intervals:
echo ' '; // send a byte so PHP notices a disconnected client
flush();
This will end your script if ignore_user_abort is false (the default).

How to update file upload messages using backbone?

I am uploading multiple files using javascript.
After I upload the files, I need to run several processing functions.
Because of the processing time that is required, I need a UI on the front telling the user the estimated time left of the entire process.
Basically I have 3 functions:
/upload - this is an endpoint for uploading the files
/generate/metadata - this is the next endpoint that should be triggered after /upload
/process - this is the last endpoint. Should be triggered after /generate/metadata
This is basically how I expect the screen to look.
Information such as percentage remaining and time left should be displayed.
However, I am unsure whether to allow server to supply the information or I do a hackish estimate solely using javascript.
I would also need to update the screen like telling the user messages such as
"currently uploading"
if I am at function 1.
"Generating metadata" if I am at function 2.
"Processing ..." if I am at function 3.
Function 2 only occurs after the successful completion of 1.
Function 3 only occurs after the successful completion of 2.
I am already using q.js promises to handle some parts of this, but the code has gotten scarily messy.
I recently came across Backbone, and it allows structured ways to handle single-page app behavior, which is what I wanted.
I have no problems with the server-side returning back json responses for success or failure of the endpoints.
I was wondering what would be a good way to implement this function using Backbone.js
You can use a "progress" file or DB entry which stores the state of the backend process. Have your backend process periodically update this file. For example, write this to the file:
{"status": "Generating metadata", "time": "3 mins left"}
After the user submits the files, have the frontend start pinging a backend progress function using a simple ajax call and setTimeout. The progress function will simply open this file, grab the JSON-formatted status info, and then update the frontend progress bar.
You'll probably want the ajax call to be attached to your model(s). Have your frontend view watch for changes to the status and update accordingly (e.g. a progress bar).
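The backend half of this can be written in any language; as a minimal sketch of the write/read sides, assuming the simple JSON shape shown above (naive string interpolation, no escaping), in Scala:

```scala
import java.nio.charset.StandardCharsets
import java.nio.file.{Files, Paths}

// Sketch: the worker overwrites a small status file as it progresses;
// the ajax-facing progress endpoint just reads and returns it.
object Progress {
  def write(path: String, status: String, time: String): Unit = {
    val json = s"""{"status": "$status", "time": "$time"}"""
    Files.write(Paths.get(path), json.getBytes(StandardCharsets.UTF_8))
  }

  def read(path: String): String =
    new String(Files.readAllBytes(Paths.get(path)), StandardCharsets.UTF_8)
}
```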
Long Polling request:
Polling request for updating Backbone Models/Views
Basically, when you upload a file you will assign a "FileModel" to every given file. The FileModel will start a long polling request every N seconds until it gets the status "complete".