How to capture the cluster names in the below ArrayList using Groovy

jsonResponse: [clustername1=Succeeded, clustername2=Succeeded, clustername3=Succeeded, clustername4=Succeeded]
I am hitting a GET URL to fetch the cluster names, and the above is the response I got.
Now I want to list the cluster names whose status is Succeeded and save them into a variable "clusterNames".
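The response shown is not actual JSON but a bracketed key=value list, so plain string splitting is enough. Below is a sketch of that parsing logic in Python (an illustration only; the same steps translate almost line for line to Groovy in a JSR223 sampler, e.g. with tokenize() and a findAll closure). The response string is copied from the question:

```python
# The bracketed "key=value" response, copied from the question.
json_response = ("[clustername1=Succeeded, clustername2=Succeeded, "
                 "clustername3=Succeeded, clustername4=Succeeded]")

# Strip the surrounding brackets, split the entries, keep the Succeeded ones.
entries = json_response.strip("[]").split(", ")
cluster_names = [name
                 for name, _, status in (e.partition("=") for e in entries)
                 if status == "Succeeded"]

print(",".join(cluster_names))
# → clustername1,clustername2,clustername3,clustername4
```

In a JSR223 sampler the joined result could then be stored for later samplers with vars.put("clusterNames", ...).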

Related

I have a JSON response in a string, so I used a JSR223 processor and extracted the required token

I have a JSON response in a string, so I used a JSR223 processor and extracted the required token. Is there any way to make that variable a global variable for all the threads to use?
I couldn't find any solutions, or haven't found the right link to check.
As per request, I have added a screenshot to my question for more clarity.
I even tried using
$(_setProperty(text3,${token},)};
but still no use.
I have extracted a string from the response and assigned it to text3, and I want to make it a global variable in the thread group so that it can be used by all the following API calls. I tried using props.put(), however it didn't work.

GET request for search in JMeter

I am performing a search request in JMeter. My test plan flow is home, then login, then product catalogue, then search. I tried to make a POST request for the search, but it fails every time. I used a CSV file so the query changes each iteration. Then I switched to a GET request and used the query variable in the search path like this: search?query=${search_input} and it passed, but when I checked the HTML it is not the correct page. In the HTML response I also see this:
{{noSearchResults.query}}. But if I put the URL in the browser it works fine. Can you please help me with this?
Double-check that your ${search_input} variable has the anticipated value using the Debug Sampler and View Results Tree listener combination.
It might be the case that your ${search_input} variable contains special characters which need to be URL-encoded, so you might need to wrap the variable in the __urlencode() function, like:
search?query=${__urlencode(${search_input})}
JMeter automatically treats responses with a status code below 400 as successful; if you need an extra layer of checking for the presence of results, or the absence of {{noSearchResults.query}}, use a Response Assertion.
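The escaping that __urlencode() applies is standard form URL-encoding. As an illustration only (Python's urllib stands in here for what JMeter does; this is not JMeter code), note how a space and an ampersand in the search term would otherwise corrupt the query string:

```python
from urllib.parse import quote_plus

# A search term with a space and an ampersand: sent raw, the "&" would
# terminate the query parameter early and break the request.
search_input = "red shoes & boots"

print("search?query=" + quote_plus(search_input))
# → search?query=red+shoes+%26+boots
```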

How to save JSON Input body parameter to global variable in POSTMAN (Check image)

Click the Environment quick look (eye button) in the top right of Postman and click Edit next to Globals.
Add a variable named timestamp and give it an initial value, Save and close the environment modal.
Open a new request tab and enter https://postman-echo.com/get?var={{timestamp}} as the URL. Hover over the variable name and you'll see the value.
Send the request. In the response, you'll see that Postman sent the variable value to the API.
Note: no need for "$".
In order to get that value from the Request Body, you can add a simple script like this to the Tests tab:
// Parse the raw request body and pull out the nested deposit reference
let depositRef = JSON.parse(pm.request.body.raw).api_data.deposit_reference
pm.globals.set('depositRef', depositRef)
This is using pm.request.body.raw from the pm.* API to grab the value from the Request Body.
You need to add this to the Tests tab rather than the Pre-request Script: the latter would set the global variable before the dynamic variable is resolved, so it would just store TX1{{$timestamp}}.

JMeter variable in GET request failing

I have a GET Request that is returning an XML that contains a TicketName. I have set up the Regular Expression Extractor with a Debug Sampler. It is picking up the TicketName as required and displaying it in the View Results Tree, with the correct variable name ticketID_g1.
However when I pass that variable to the next GET request the test plan fails with Non HTTP response message: Socket closed.
The thing is that the GET request looks fine when I look at the Request tab in the Results Tree.
I have changed my regular expression a number of times, with each one extracting the TicketName properly, but each time I apply it as a variable the GET request fails. However, if I copy the request shown in the Results Tree Request tab and paste it directly into my browser, I get the desired result.
I have been through the manuals and online tutorials and it appears that I am doing everything right, but obviously I am missing something.
The 1st GET Request returns an XML that contains name="2019-05-09-16-59-54cmrpip000613_EDASERVE" needsPrompt
I am using the following regular expression to extract the name for my variable ticketID
name="([^"]+)" needsPrompt - This works
The Results Tree is showing the following response from the Debug Sampler -
ticketID_g1=2019-05-09-16-59-54cmrpip000613_EDASERVE
When I pass the ticketID variable to the next GET request
//localhost:8080/ibi_apps/rs?IBIRS_action=getReport&IBIRS_ticketName=${ticketID_g1}cmrpip000589_EDASERVE&IBIRS_service=defer
The Response tab in the Results Tree for the second GET request is showing that the request is good but is failing.
GET http://localhost:8080/ibi_apps/rs?IBIRS_action=getReport&IBIRS_ticketName=2019-05-09-16-59-54cmrpip000613_EDASERVE&IBIRS_service=defer
What I am expecting is that this second GET will run with the variable and return a report, but it throws the Non HTTP response message: Socket closed error.
You have the below variable, which is capturing the ticket ID:
ticketID_g1=2019-05-09-16-59-54cmrpip000613_EDASERVE
But in the request below you are passing it with an extra literal suffix appended, namely
"cmrpip000589_EDASERVE"
Request: //localhost:8080/ibi_apps/rs?IBIRS_action=getReport&IBIRS_ticketName=${ticketID_g1}cmrpip000589_EDASERVE&IBIRS_service=defer
Please pass the ticketID variable correctly and hopefully that solves the issue. If I am correct, your request should look like:
Request: //localhost:8080/ibi_apps/rs?IBIRS_action=getReport&IBIRS_ticketName=${ticketID_g1}&IBIRS_service=defer

Scrapy view and requests.get difference for "https://www.mywebsite.com.sg"

I am trying to scrape https://www.mywebsite.com.sg, but the following command returns a 400 Bad Request error:
scrapy view https://www.mywebsite.com.sg
If I use:
data=requests.get("https://www.mywebsite.com.sg")
I can get the content of the webpage in data.text and data.content.
However, all the XPath operations in my script don't work, as data.xpath and data.content are both empty.
There seems to be no protection on the webpage, as the Postman software can get a result with a simple HTTP GET query.
How do I get the response object to be properly filled?
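Two things are likely going on here: scrapy view is probably being rejected because of Scrapy's default User-Agent (setting a browser-like USER_AGENT in the Scrapy settings is worth trying), and a requests response object simply has no .xpath method, so the returned text has to be parsed into a tree before XPath queries can run. Below is a minimal stdlib sketch of that parsing step, using a made-up HTML fragment since the real page layout is unknown; in a real script you would typically wrap data.text in parsel.Selector or lxml instead:

```python
from xml.etree import ElementTree as ET

# In practice the text would come from, e.g.:
#   data = requests.get(url, headers={"User-Agent": "Mozilla/5.0"})
#   html = data.text
# Here a stand-in fragment is used, since the real page layout is unknown.
html = """<html><body>
<div class="product"><span>Widget A</span></div>
<div class="product"><span>Widget B</span></div>
</body></html>"""

# requests only gives you raw text; build a tree before running XPath queries.
root = ET.fromstring(html)
names = [span.text for span in root.findall(".//div[@class='product']/span")]

print(names)
# → ['Widget A', 'Widget B']
```

ElementTree only supports a limited XPath subset, which is why parsel or lxml (what Scrapy itself uses) are the usual choices for real HTML.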