JMeter: Recallable BeanShell Assertion?

I'm performing API testing of basic CRUD functionality. For the creation of each record type, I am using the same BeanShell Assertion template, customized per thread group.
import org.apache.jmeter.services.FileServer;

if (ResponseCode != null && ResponseCode.equals("200")) {
    SampleResult.setResponseOK();
} else {
    // anything other than 200 (including a null response code) is a failure
    Failure = true;
    FailureMessage = "Creation of a new {insert record type} record failed. Response code " + ResponseCode + "."; // displays in Results Tree
    print("Creation of a new {insert record type} record failed: Response code " + ResponseCode + "."); // goes to stdout
    log.warn("Creation of a new {insert record type} record failed: Response code " + ResponseCode); // this goes to the JMeter log file

    // Static elements or calculations
    String part1 = "\n FAILED TO CREATE NEW {insert record type} RECORD via POST. The response code is: \"";
    String part2 = "\". \n\n - If 'Non-HTTP response code - org.apache.jorphan.util.JMeterStopThreadException' is received, verify the payload file still exists. \n - For response code = 409, \n\t a) check the payload for validity.\n\t b) verify the same {insert record type} name doesn't already exist in the {insert table name} table. If found, delete the record and re-run the test. \n - For response code = 500, verify the database and its host server are reachable. \n";

    // Open the error log in append mode, relative to the test plan's base directory
    FileOutputStream f = new FileOutputStream(FileServer.getFileServer().getBaseDir() + "\\error.log", true);
    //FileOutputStream f = new FileOutputStream("c:\\error.log", true);
    PrintStream p = new PrintStream(f);

    // Write data to file
    p.println(part1 + ResponseCode + part2);

    // Close file(s)
    p.close();
    f.close();
}
Is there a way to make this re-callable, as opposed to having to repeat it in each thread group? Right now I'm up to 20 thread groups, and thus 20 copies of this same assertion.
I've looked at multiple pages on this site and also at How to Use BeanShell: JMeter's Favorite Built-in Component, but I'm not finding a solution to this. Any feedback is appreciated.

If you place your Assertion (any Assertion) at the same level as your Thread Groups, it will be applied to both "Sampler 1" and "Sampler 2". Moreover, the assertion will be applied to each sampler in each thread group on each iteration.
See the How to Use JMeter Assertions in Three Easy Steps guide, which clarifies assertion scope, cost, and best practices.
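Another way to avoid duplicating the script (a sketch, not part of the answer above): every BeanShell Assertion also has a "Script file" field, so you can keep a single .bsh file and point each assertion at it, passing the per-thread-group record type through the "Parameters" field (it arrives in the script as bsh.args). The file name and exact layout below are assumptions:
// shared-assertion.bsh — referenced from every BeanShell Assertion's "Script file" field;
// the record type is passed via the "Parameters" field and read from bsh.args[0]
String recordType = (bsh.args != null && bsh.args.length > 0) ? bsh.args[0] : "unknown";
if (ResponseCode != null && ResponseCode.equals("200")) {
    SampleResult.setResponseOK();
} else {
    Failure = true;
    FailureMessage = "Creation of a new " + recordType + " record failed. Response code " + ResponseCode + "."; // displays in Results Tree
    log.warn(FailureMessage); // goes to the JMeter log file
}
With this layout, a fix to the shared file applies to all 20 thread groups at once.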


PLC4X: Exception during scraping of Job

I'm currently developing a project that reads data from 19 Siemens S7-1500 PLCs and 1 Modicon. I have used the scraper tool, following this tutorial:
PLC4x scraper tutorial
but after the scraper has been running for a short time, I get the following exception:
I have changed the scheduled time between 1 and 100, and I always get the same exception once the scraper reaches the same number of received messages.
I have tested whether using PlcDriverManager instead of PooledPlcDriverManager could be a solution, but the same problem persists.
In my pom.xml I use the following dependency:
<dependency>
    <groupId>org.apache.plc4x</groupId>
    <artifactId>plc4j-scraper</artifactId>
    <version>0.7.0</version>
</dependency>
I have tried changing the version to an older one like 0.6.0 or 0.5.0, but the problem still persists.
If I use the Modicon (Modbus TCP), I also get this exception after a short amount of time.
Does anyone know why this error is happening? Thanks in advance.
Edit: With scraper version 0.8.0-SNAPSHOT I continue to have this problem.
Edit 2: This is my code. I think the problem may be that my scraper is opening a lot of connections and fails when it reaches 65526 messages. But since all the processing is happening inside the lambda function and I'm using a PooledPlcDriverManager, I think the scraper is using only one connection, so I don't know where the mistake is (see the pooling sketch after my code).
try {
    // Create a new PooledPlcDriverManager
    PlcDriverManager S7_plcDriverManager = new PooledPlcDriverManager();

    // Trigger Collector
    TriggerCollector S7_triggerCollector = new TriggerCollectorImpl(S7_plcDriverManager);

    // Messages counter
    AtomicInteger messagesCounter = new AtomicInteger();

    // Configure the scraper, by binding a Scraper Configuration, a ResultHandler and a TriggerCollector together
    TriggeredScraperImpl S7_scraper = new TriggeredScraperImpl(S7_scraperConfig, (jobName, sourceName, results) -> {
        LinkedList<Object> S7_results = new LinkedList<>();
        messagesCounter.getAndIncrement();
        S7_results.add(jobName);
        S7_results.add(sourceName);
        S7_results.add(results);
        logger.info("Array: " + String.valueOf(S7_results));
        logger.info("MESSAGE number: " + messagesCounter);

        // Producer topics routing
        String topic = "s7" + S7_results.get(1).toString().substring(S7_results.get(1).toString().indexOf("S7_SourcePLC") + 9, S7_results.get(1).toString().length());
        String key = parseKey_S7("s7");
        String value = parseValue_S7(S7_results.getLast().toString(), S7_results.get(1).toString());
        logger.info("------- PARSED VALUE -------------------------------- " + value);

        // Create my own Kafka Producer record
        ProducerRecord<String, String> record = new ProducerRecord<String, String>(topic, key, value);

        // Send data to Kafka - asynchronous
        producer.send(record, new Callback() {
            public void onCompletion(RecordMetadata recordMetadata, Exception e) {
                // executes every time a record is successfully sent or an exception is thrown
                if (e == null) {
                    // the record was successfully sent
                    logger.info("Received new metadata. \n" +
                            "Topic: " + recordMetadata.topic() + "\n" +
                            "Partition: " + recordMetadata.partition() + "\n" +
                            "Offset: " + recordMetadata.offset() + "\n" +
                            "Timestamp: " + recordMetadata.timestamp());
                } else {
                    logger.error("Error while producing", e);
                }
            }
        });
    }, S7_triggerCollector);

    S7_scraper.start();
    S7_triggerCollector.start();
} catch (ScraperException e) {
    logger.error("Error starting the scraper (S7_scrapper)", e);
}
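For reference, this is the pattern I assumed the pool gives me: repeated getConnection() calls for the same URL reuse a pooled connection instead of opening a new socket each time (the connection URL below is made up):
import org.apache.plc4x.java.PlcDriverManager;
import org.apache.plc4x.java.api.PlcConnection;
import org.apache.plc4x.java.utils.connectionpool.PooledPlcDriverManager;

PlcDriverManager manager = new PooledPlcDriverManager();
// exception handling omitted; getConnection can throw a checked PlcConnectionException
try (PlcConnection connection = manager.getConnection("s7://192.168.0.1")) {
    // use the connection; close() hands it back to the pool rather than closing it
}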
So in the end it was indeed the PLC that was simply hanging up the connection randomly. However, the NiFi integration should have handled this situation more gracefully. I implemented a fix for this particular error ... could you please give version 0.8.0-SNAPSHOT a try (or use 0.8.0 if we happen to have released it already)?
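In case anyone else wants to try the SNAPSHOT: it is not on Maven Central, so the Apache snapshots repository has to be enabled in the pom first. A minimal sketch (the repository id is arbitrary):
<repositories>
    <repository>
        <id>apache-snapshots</id>
        <url>https://repository.apache.org/content/repositories/snapshots/</url>
        <snapshots>
            <enabled>true</enabled>
        </snapshots>
    </repository>
</repositories>
<dependency>
    <groupId>org.apache.plc4x</groupId>
    <artifactId>plc4j-scraper</artifactId>
    <version>0.8.0-SNAPSHOT</version>
</dependency>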

Iterate over a CSV Data Set Config with varying starting index in Apache JMeter

My requirement is to iterate over a CSV Data Set Config in Apache JMeter with a varying starting index. Let's assume I start a test plan in JMeter today and my CSV file has 8 rows. The first time, my sampler will run from the 1st row to the 8th row. The next time I run my test plan, I want the sampler to pick values from the 2nd row to the 8th row. In this manner, I want to iterate over the CSV file using the CSV Data Set Config.
I am able to initialize a counter for every test run in Apache JMeter using a setUp Thread Group and a tearDown Thread Group, and I am able to read it using __P(count) in JMeter.
In the setUp Thread Group I have included a JSR223 Sampler with a script like:
def file = new File('number')
def number
if (!file.exists() || !file.canRead()) {
    number = '1'
} else {
    number = file.text
}
props.put('number', number as String)
In the tearDown Thread Group, the JSR223 Sampler has a script like:
def number = props.get('number') as int
number++
new File('number').text = number
I want to loop over my CSV Data Set Config file using the counter from the properties file (which is incremented by 1 for every test run), something like the sketch below.
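In other words, this JSR223 sketch shows the behavior I'm after, but driven through the CSV Data Set Config (the file name input.csv and the log line are just for illustration):
// rows consumed in previous runs, taken from the persisted counter
def start = (props.get('number') as int) - 1
// skip those rows so run N starts reading at row N
def rows = new File('input.csv').readLines().drop(start)
rows.each { row ->
    log.info('would process: ' + row)
}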
Please check the below plan:
Input CSV example:
The If Controller has the below code, which lets a sampler run only while the row's 'Used' flag (its first character) is not 'Y':
${__groovy(vars.get('Used').take(1)!='Y')}
In the JSR223 PostProcessor, I have the below code:
def inputFile = new File("C:\\Path\\toFile\\Excel\\OutputCSV.csv")
def lines = inputFile.readLines()
boolean isWrite = false

lines.each { String line ->
    if (line.contains('Used')) {
        // header line: overwrite the file, truncating the old contents
        inputFile.write(line + '\n')
    } else {
        if (line.startsWith('Y')) {
            // row already consumed in a previous run
            inputFile.append(line + '\n')
        } else if (!isWrite) {
            // first unconsumed row: mark it as used for the next run
            inputFile.append('Y' + line + '\n')
            isWrite = true
        } else {
            inputFile.append(line + '\n')
        }
    }
}
First run output:
Second run output:
As you can see, in the first run Sample 1 executes 4 times, and in the second run it executes 3 times.
This is not the nicest or best code, just a first try.
Please check if it helps.

SoapUI: add new headers

We are using SoapUI to test an API, and we currently have some test cases. I have created a new test suite that contains more than 120 test cases, and I need to add an additional header to each of them. Please suggest a code snippet for how to add this header.
You really ought to try something before asking for help; it's the best way to learn. However, this is not very straightforward. The following snippet should help, but you'll have to adapt it to your needs.
testRunner.testCase.testSuite.project.testSuiteList.each { suite ->
    name = suite.getName()
    suite.testCaseList.each { TC ->
        // parse each Test Case
        TC.testStepList.each { TS ->
            // parse each Test Step
            if (TS.config.type == "restrequest") {
                // only on REST request type steps
                // check its headers
                headers = TS.getHttpRequest().getRequestHeaders()
                //log.info "headers = " + headers
                refHeaderName = "Accept" // search for the Accept header
                found = false
                headers.find() { hd ->
                    //log.info "header name = ${hd.key}, value = ${hd.value}"
                    if (hd.key == refHeaderName) {
                        found = true
                    }
                }
                if (found == false) {
                    log.info "testSuite $name - testCase ${TC.getName()} - testStep ${TS.getName()}"
                    // the header doesn't exist, so create it
                    headers.put("Accept", "application/json")
                    //log.info "add a new header : " + headers
                    TS.testRequest.setRequestHeaders(headers)
                }
            }
        } // each TS
    } // each TC
} // each TSuite
This is what I've done in my project (run from a Groovy Script test step, where testRunner is available); I use REST requests ...
Good luck,
Alexandre

ReactiveX collect elements processed before a failure

I'm using RxJava to create a background job synchronizing my db.
It connects to an external source, starts processing entries, maps them, and inserts them into the db.
When it ends, I need the list of all the processed elements. I can get it when everything goes right, but how can I collect all the elements processed so far if something fails during the flow?
final List<String> res = Observable.create(onSubscribe)
        .buffer(4)
        .flatMap(TestRx::doStuff)
        .buffer(8)
        .map(TestRx::calculateList)
        .toList()
        .toBlocking()
        .single();
System.out.println("strings = " + res);
What I would like is a way such that, if doStuff or calculateList throws an exception, the flow stops and returns the list with everything it processed until the error.
List<String> res = Observable.create(onSubscribe)
        .buffer(4)
        .flatMap(TestRx::doStuff)
        .onErrorResumeNext(Observable.empty()) // turn error into completion
        .buffer(8)
        .map(TestRx::calculateList)
        .onErrorResumeNext(Observable.empty()) // turn error into completion
        .toList()
        .toBlocking()
        .single();
System.out.println("strings = " + res);
Each onErrorResumeNext swallows a failure from the stage above it and replaces the error with an empty Observable, so the stream completes normally and toList still emits everything that was processed before the failure.

noflo 0.5.13 spreadsheet example broken?

I am new to noflo and am looking at examples in order to explore it. The spreadsheet example looked interesting, but I couldn't make it run. First, it takes some time and manual debugging to identify missing components; not a big deal, and I believe this will be improved in the future. For now, the error message I get is
return process.component.outPorts[port].attach(socket);
^
TypeError: undefined is not a function
Apparently, before this, isAddressable() was also undefined. I checked this SO issue, but I don't have noflo 0.4 as a dependency anywhere. I spent some time debugging it but got stuck, so I decided to post to SO.
The question is, what are the correct steps to run the spreadsheet example?
For reproduction, here is what I have done:
0) install the following components
noflo-adapters
noflo-core
noflo-couchdb
noflo-filesystem
noflo-groups
noflo-objects
noflo-packets
noflo-strings
noflo-tika
noflo-xml
i) edit spreadsheet/parse.fbp, because the first error was:
throw new Error("No outport '" + port + "' defined in process " + proc
^
Error: No outport 'error' defined in process Read (Read() ERROR -> IN Display())
Apparently the couchdb ReadDocument component does not provide an Error outport, so I replaced ReadDocument with ReadFile:
18c18
< 'tika-app-0.9.jar' -> TIKA Read(ReadDocument)
---
> 'tika-app-0.9.jar' -> TIKA Read(ReadFile)
ii) at this point, I received the following:
if (process.component.outPorts[port].isAddressable()) {
^
TypeError: undefined is not a function
and improvised a workaround by checking whether isAddressable is defined at this location in the code:
@@ -259,9 +261,11 @@
throw new Error("No outport '" + port + "' defined in process " + process.id + " (" + (socket.getId()) + ")");
return;
}
- if (process.component.outPorts[port].isAddressable()) {
+ if (process.component.outPorts[port].isAddressable && process.component.outPorts[port].isAddressable()) {
return process.component.outPorts[port].attach(socket, index);
}
return process.component.outPorts[port].attach(socket);
};
and it fails either way. Again, the question is: what are the correct steps to run the spreadsheet example?
Thanks in advance.