Doing a POST request in a Java Ninja test? - ninjaframework

I am using the Ninja framework and the documentation is quite limited. There is nothing on making a POST request with NinjaTest. I see the method:
ninjaTestBrowser.makePostRequestWithFormParameters();
But there is nothing explaining how to use it, apart from the parameter types: String, Map, Map.
An example would be very beneficial!

It isn't documented well, but I looked into the source code and found the method signature. The parameters are, in order: the URL, a map of headers, and a map of form parameters. You can easily test a controller POST like this:
@Test
public void test() {
    Map<String, String> headers = new HashMap<String, String>();
    Map<String, String> parameters = new HashMap<String, String>();

    headers.put("TESTHEADER", "value");

    parameters.put("email", "test@test.ca");
    parameters.put("username", "tester");
    parameters.put("secret", "pass123");

    String result = ninjaTestBrowser
            .makePostRequestWithFormParameters(getServerAddress() + "/", headers, parameters);

    assertTrue(result.contains("true"));
}

Related

Error when running my first pact-jvm test

I'm new to contract-testing automation and I've written my first test using pact-jvm with JUnit 5.
Below is the code:
@ExtendWith(PactConsumerTestExt.class)
@PactTestFor(providerName = "testProvider", port = "8081")
public class ConsumerTests {

    public static final String EXPECTED_BODY = "/integration/stubs/team_members/SingleTeamMember.json";

    @Pact(consumer = "testConsumer", provider = "testProvider")
    public RequestResponsePact singleTeamMemberSuccess(PactDslWithProvider builder) {
        Map<String, String> headers = new HashMap<>();
        headers.put("Content-Type", "application/json");

        return builder
                .given("I have at least one team member")
                .uponReceiving("a request for a single team member")
                .path("/team-members/1")
                .method("GET")
                .willRespondWith()
                .status(200)
                .headers(headers)
                .body(EXPECTED_BODY)
                .toPact();
    }

    @Test
    @PactTestFor(pactMethod = "singleTeamMemberSuccess")
    void testSingleTeamMemberSuccess(MockServer mockServer) throws IOException {
        HttpResponse httpResponse = (HttpResponse) Request.Get(mockServer.getUrl() + "/team-members/1")
                .execute().returnResponse();

        assertThat(httpResponse.getStatusLine().getStatusCode(), is(equalTo(200)));
        //assertThat(httpResponse.getEntity().getContent(), is(equalTo(TeamMemberSingle200.EXPECTED_BODY_SINGLE_TEAM_MEMBER)));
    }
}
I'm getting the error below when running mvn install:
ConsumerTests » The following methods annotated with @Pact were not executed during the test: ConsumerTests.singleTeamMemberSuccess. If these are currently a work in progress, add a @Disabled annotation to the method.
[ERROR] ConsumerTests.singleTeamMemberSuccess:42 » NoClassDefFound Could not initialize class org.codehaus.groovy.reflection.ReflectionCache
Could someone please take a look and advise whether I'm missing anything needed to run the test successfully?
Thanks,
Poonam

Use Java 8 Stream on JdbcTemplate results from Hive

I am using JdbcTemplate to query Hive and then write the results to a .csv file. I basically just generate a list of objects and then stream the list to write each record to the file.
I would like to stream the results as they come back from Hive and write them to the file, instead of waiting for the whole result and then processing it. Can anyone point me in the right direction? Thanks!
private List<Avs> queryAvsData(String asSql) {
    List<Avs> llistAvs = new ArrayList<Avs>();
    List<Map<String, Object>> rows = hiveJdbcTemplate.queryForList(asSql);
    Iterator<Map<String, Object>> it = rows.iterator();
    while (it.hasNext()) {
        Map<String, Object> row = it.next();
        Avs laAvs = Avs.builder()
                .make((String) row.get("make"))
                .model((String) row.get("model"))
                .build();
        llistAvs.add(laAvs);
    }
    return llistAvs;
}
It doesn't look like there's a built-in solution, but you can do it. Basically, you wrap the existing functionality in an iterator, and use a spliterator to turn it into a stream. Here's a blog post on the subject:
The code implements Spring’s ResultSetExtractor interface, which is a Single Abstract Method (SAM) interface, allowing the use of a lambda expression to implement it.
The implementation wraps the SQL ResultSet in an iterator, constructs a stream using the Spliterators and StreamSupport utility classes, and applies that to a Function taking a stream of row sets and returning a generic result.
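A minimal sketch of that approach, reusing the hiveJdbcTemplate and the Avs builder from the question (the streamQuery name and the Function parameter are illustrative, not from the blog post), could look like this:
// Needs: java.sql.*, java.util.*, java.util.function.Function, java.util.stream.*,
// org.springframework.jdbc.core.ResultSetExtractor
private <T> T streamQuery(String asSql, Function<Stream<Avs>, T> rowStreamProcessor) {
    ResultSetExtractor<T> extractor = rs -> {
        // Wrap the open ResultSet in an Iterator that advances the cursor lazily.
        Iterator<Avs> rowIterator = new Iterator<Avs>() {
            private boolean advanced;
            private boolean hasRow;

            @Override
            public boolean hasNext() {
                if (!advanced) {
                    try {
                        hasRow = rs.next();
                    } catch (SQLException e) {
                        throw new RuntimeException(e);
                    }
                    advanced = true;
                }
                return hasRow;
            }

            @Override
            public Avs next() {
                if (!hasNext()) {
                    throw new NoSuchElementException();
                }
                advanced = false;
                try {
                    return Avs.builder()
                            .make(rs.getString("make"))
                            .model(rs.getString("model"))
                            .build();
                } catch (SQLException e) {
                    throw new RuntimeException(e);
                }
            }
        };

        // Turn the Iterator into a Stream and let the caller consume it while
        // the ResultSet (and the Hive connection) is still open.
        Stream<Avs> rowStream = StreamSupport.stream(
                Spliterators.spliteratorUnknownSize(rowIterator, Spliterator.ORDERED), false);
        return rowStreamProcessor.apply(rowStream);
    };
    return hiveJdbcTemplate.query(asSql, extractor);
}
The calling code would then pass a lambda that writes each Avs to the CSV writer as it streams past, so the full list never has to be built in memory.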
It's possible to stream values from JdbcTemplate. The following example is a service based on Spring Boot 2.4.8.
Since I ran into problems (a connection leak) using queryForStream, I'm putting demo code here as a reminder that the stream must be closed after usage.
import lombok.RequiredArgsConstructor;
import org.springframework.jdbc.core.SingleColumnRowMapper;
import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate;
import org.springframework.stereotype.Service;

import java.util.Map;
import java.util.stream.Stream;

@Service
@RequiredArgsConstructor
public class DataCleaningService {

    private final NamedParameterJdbcTemplate jdbcTemplate;

    public void doSomeStreaming() {
        String nativeQuery = "SELECT string_value FROM my_table WHERE column = :valueToFilter";
        Map<String, Object> queryParameters = Map.of("valueToFilter", "my value");

        SingleColumnRowMapper<String> stringRowMapper = SingleColumnRowMapper.newInstance(String.class);

        try (Stream<String> stringValueStream = jdbcTemplate.queryForStream(nativeQuery, queryParameters, stringRowMapper)) {
            stringValueStream.forEach(stringValue -> {
                // do the needed action with the value
                // ...
                System.out.printf("My cool value: %s", stringValue);
            });
        }
    }
}

Dynamic testing for any Camunda BPMN process

I am working on Camunda Java code and I am looking for a testing methodology that I can use to test any of my BPMN processes.
I did some Google searching and found some ideas about unit testing in the Camunda documentation, but those tests are written for a specific BPMN model.
I need one that can test any BPMN model (just by passing the name of the BPMN file, the id of the process, etc.).
The strategy should take into account integration with the DB to get the candidate users and groups for any expected path. I know maybe I can't do that, but I have a large model and it would be a waste of time to test all of it in the traditional way.
Mohammad, your question is interesting - this goal can be achieved (a test on a BPMN model can be dynamic), as long as we are talking about a not very detailed test.
Look at my code below, written around your idea of such a generic test (as I understood it, of course).
It uses the camunda-bpm-assert-scenario and camunda-bpm-assert libraries.
First of all, you give your test information about the "wait states" in the process under test (this can be done via a JSON file, so the code does not have to change):
// optional - for mocking services for http-connector calls, etc.
private Map<String, Object> configs = withVariables(
        "URL_TO_SERVICE", "http://mock-url.com/service"
);

private Map<String, Map<String, Object>> userTaskIdToResultVars = new LinkedHashMap<String, Map<String, Object>>() {{
    put("user_task_0", withVariables(
            "a", 0
    ));
    put("user_task_1", withVariables(
            "var0", "var0Value",
            "var1", "var1Value"));
}};

// optional - if you want to check vars during process execution
private Map<String, Map<String, Object>> userTaskIdToAssertVars = new LinkedHashMap<String, Map<String, Object>>() {{
    put("user_task_1", withVariables(
            "a", 0
    ));
}};
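As noted above, these maps could also come from a JSON file rather than being hard-coded. A minimal sketch using Jackson (the file name, its structure, and the loadWaitStateVars helper are assumptions, not part of the original answer):
// Hypothetical resource "wait-states.json" on the test classpath:
// { "user_task_0": { "a": 0 }, "user_task_1": { "var0": "var0Value", "var1": "var1Value" } }
// Needs: com.fasterxml.jackson.databind.ObjectMapper, com.fasterxml.jackson.core.type.TypeReference
private Map<String, Map<String, Object>> loadWaitStateVars(String resourceName) throws IOException {
    ObjectMapper mapper = new ObjectMapper();
    try (InputStream in = getClass().getResourceAsStream(resourceName)) {
        // LinkedHashMap keeps the task order, which hasPassedInOrder relies on later.
        return mapper.readValue(in, new TypeReference<LinkedHashMap<String, Map<String, Object>>>() {});
    }
}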
Then you mock the user tasks (and other wait states) with the given info:
@Mock
private ProcessScenario processScenario;

@Before
public void defineHappyScenario() {
    MockitoAnnotations.initMocks(this);

    for (String taskId : userTaskIdToResultVars.keySet()) {
        when(processScenario.waitsAtUserTask(taskId)).thenReturn(
                (task) -> {
                    // optional - if you want to check vars during process execution
                    Map<String, Object> varsToCheck = userTaskIdToAssertVars.get(taskId);
                    if (varsToCheck != null) {
                        assertThat(task.getProcessInstance())
                                .variables().containsAllEntriesOf(varsToCheck);
                    }
                    task.complete(userTaskIdToResultVars.get(taskId));
                });
    }

    // If needed, mocks for other wait states are described in the same way, e.g.:
    // when(processScenario.waitsAtSignalIntermediateCatchEvent(signalCatchId)).thenReturn(...);
}
And your test will look something like this:
@Test
@Deployment(resources = "diagram_2.bpmn")
public void testHappyPath() {
    Scenario scenario = Scenario.run(processScenario).startByKey(PROCESS_DEFINITION_KEY, configs).execute();

    ProcessInstance process = scenario.instance(processScenario);
    assertThat(process).isStarted();
    assertThat(process).hasPassedInOrder( // or check the execution order of all elements -- not only wait states (provide them in an additional file)
            userTaskIdToResultVars.keySet().toArray(new String[0]) // user_task_0, user_task_1
    );
    assertThat(process).isEnded();
}
Hope this helps in your work.

Adding a custom response header to a Spring WebFlux controller endpoint

Is there a way to add a response header to a Spring WebFlux controller endpoint? For example, to the following method I have to add a custom header, say 'x-my-header':
@GetMapping(value = "/search/{text}")
@ResponseStatus(value = HttpStatus.OK)
public Flux<SearchResult> search(@PathVariable(value = "text") String text) {
    return searchService().find(text);
}
In the functional API, this is really easy; the ServerResponse builder has builders for almost everything you need, as shown in the sketch below.
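For example, a functional-style route might look like this (the bean method, route path, and SearchService wiring are illustrative, not from the question):
// Needs: org.springframework.web.reactive.function.server.* and org.springframework.context.annotation.Bean
@Bean
public RouterFunction<ServerResponse> searchRoute(SearchService searchService) {
    return RouterFunctions.route(RequestPredicates.GET("/search/{text}"),
            request -> ServerResponse.ok()
                    .header("x-my-header", "some value") // custom response header
                    .body(searchService.find(request.pathVariable("text")), SearchResult.class));
}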
With annotated controllers, you can return a ResponseEntity<Flux<T>> and set the headers:
@GetMapping(value = "/search/{text}")
public ResponseEntity<Flux<SearchResult>> search(@PathVariable(value = "text") String text) {
    Flux<SearchResult> results = searchService().find(text);
    return ResponseEntity.ok()
            .header("headername", "headervalue")
            .body(results);
}
Note that the updated code doesn't need the @ResponseStatus annotation now.
UPDATE:
Apparently the solution above works unless you have the spring-cloud-starter-netflix-hystrix-dashboard dependency. In that case you can use the following code:
@GetMapping(value = "/search/{text}")
public Mono<ResponseEntity<List<SearchResult>>> search(@PathVariable(value = "text") String text) {
    return searchService().find(text)
            .collectList()
            .map(list -> ResponseEntity.ok()
                    .header("Header-Name", "headervalue")
                    .body(list));
}
A couple of things to note:
The outer type should be Mono<ResponseEntity<T>>: there is one response per request. If you declare it to be a Flux, Spring will try to serialize the ResponseEntity as if it were a POJO.
You need to use an operator to transform the Flux into a Mono: collectList() or single() will do the job for you.
Checked with Spring Boot 2.0.3.RELEASE

OutOfMemory while using Jackson 1.9

I am using Jackson 1.9 in my web application, where I need to convert complex objects (e.g. Spring's ModelMap, BindingResult, java.util.Map) to JSON String objects.
Please consider the following code snippet where I am attempting one such conversion:
Map<String, Object> methodArgsMap = new HashMap<String, Object>();
methodArgsMap.put("map", map);/*map is an instance of org.springframework.ui.ModelMap*/
methodArgsMap.put("command", command);/*command is an instance of a custom POJO viz.ReportBeanParam*/
methodArgsMap.put("result", result);/*result is an instance of org.springframework.validation.BindingResult*/
The method JSONProcessUtil.getObjectsAsJSONString(...) is implemented as follows:
public final class JSONProcessUtil {

    private static ObjectMapper objectMapper;

    static {
        objectMapper = new ObjectMapper();
        /* Start : Configs. suggested by Jackson docs to avoid OutOfMemoryError */
        SerializationConfig serConfig = objectMapper.getSerializationConfig();
        serConfig.disable(SerializationConfig.Feature.FAIL_ON_EMPTY_BEANS);
        objectMapper.getJsonFactory().configure(
                JsonParser.Feature.INTERN_FIELD_NAMES, false);
        objectMapper.getJsonFactory().configure(
                JsonParser.Feature.CANONICALIZE_FIELD_NAMES, false);
        /* End : Configs. suggested by Jackson docs to avoid OutOfMemoryError */
    }

    public static Map<String, String> getObjectsAsJSONString(
            Map<String, Object> argsMap) throws JsonGenerationException,
            JsonMappingException, IOException {
        log.info("Source app.In JSONProcessUtil.getObjectsAsJSONString(...)");
        Map<String, String> jsonStrMap = null;
        if (!(argsMap == null || argsMap.isEmpty())) {
            jsonStrMap = new HashMap<String, String>();
            Set<String> keySet = argsMap.keySet();
            Iterator<String> iter = keySet.iterator();
            String argName = null;
            while (iter.hasNext()) {
                argName = iter.next();
                log.info("Source app. argName = {}, arg = {} ", argName,
                        argsMap.get(argName));
                jsonStrMap.put(argName,
                        objectMapper.writeValueAsString(argsMap.get(argName))); /* The line giving the error */
                log.info("Proceeding to the next arg !");
            }
        }
        log.info("Source app. Exit from JSONProcessUtil.getObjectsAsJSONString(...)");
        return jsonStrMap;
    }
}
I am getting an OutOfMemoryError as follows :
INFO [http-8080-7] (JSONProcessUtil.java:73) - Source app. argName = result, arg = org.springframework.validation.BeanPropertyBindingResult: 0 errors DEBUG [http-8080-7] (SecurityContextPersistenceFilter.java:89) - SecurityContextHolder now cleared, as request processing completed Feb 20, 2012 5:03:30 PM org.apache.catalina.core.StandardWrapperValve invoke
SEVERE: Servlet.service() for servlet saas threw exception
java.lang.OutOfMemoryError: Java heap space
at org.codehaus.jackson.util.TextBuffer._charArray(TextBuffer.java:674)
at org.codehaus.jackson.util.TextBuffer.expand(TextBuffer.java:633)
at org.codehaus.jackson.util.TextBuffer.append(TextBuffer.java:438)
at org.codehaus.jackson.io.SegmentedStringWriter.write(SegmentedStringWriter.java:69)
at org.codehaus.jackson.impl.WriterBasedGenerator._flushBuffer(WriterBasedGenerator.java:1810)
at org.codehaus.jackson.impl.WriterBasedGenerator._writeFieldName(WriterBasedGenerator.java:345)
at org.codehaus.jackson.impl.WriterBasedGenerator.writeFieldName(WriterBasedGenerator.java:217)
at org.codehaus.jackson.map.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:426)
at org.codehaus.jackson.map.ser.BeanSerializer.serializeFields(BeanSerializer.java:175)
at org.codehaus.jackson.map.ser.BeanSerializer.serialize(BeanSerializer.java:142)
at org.codehaus.jackson.map.ser.impl.ObjectArraySerializer.serializeContents(ObjectArraySerializer.java:121)
at org.codehaus.jackson.map.ser.impl.ObjectArraySerializer.serializeContents(ObjectArraySerializer.java:28)
at org.codehaus.jackson.map.ser.ArraySerializers$AsArraySerializer.serialize(ArraySerializers.java:56)
at org.codehaus.jackson.map.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:428)
at org.codehaus.jackson.map.ser.BeanSerializer.serializeFields(BeanSerializer.java:175)
at org.codehaus.jackson.map.ser.BeanSerializer.serialize(BeanSerializer.java:142)
at org.codehaus.jackson.map.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:428)
at org.codehaus.jackson.map.ser.BeanSerializer.serializeFields(BeanSerializer.java:175)
at org.codehaus.jackson.map.ser.BeanSerializer.serialize(BeanSerializer.java:142)
at org.codehaus.jackson.map.ser.MapSerializer.serializeFields(MapSerializer.java:287)
at org.codehaus.jackson.map.ser.MapSerializer.serialize(MapSerializer.java:212)
at org.codehaus.jackson.map.ser.MapSerializer.serialize(MapSerializer.java:23)
at org.codehaus.jackson.map.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:428)
at org.codehaus.jackson.map.ser.BeanSerializer.serializeFields(BeanSerializer.java:175)
at org.codehaus.jackson.map.ser.BeanSerializer.serialize(BeanSerializer.java:142)
at org.codehaus.jackson.map.ser.MapSerializer.serializeFields(MapSerializer.java:287)
at org.codehaus.jackson.map.ser.MapSerializer.serialize(MapSerializer.java:212)
at org.codehaus.jackson.map.ser.MapSerializer.serialize(MapSerializer.java:23)
at org.codehaus.jackson.map.ser.BeanPropertyWriter.serializeAsField(BeanPropertyWriter.java:428)
at org.codehaus.jackson.map.ser.BeanSerializer.serializeFields(BeanSerializer.java:175)
at org.codehaus.jackson.map.ser.BeanSerializer.serialize(BeanSerializer.java:142)
at org.codehaus.jackson.map.ser.MapSerializer.serializeFields(MapSerializer.java:287)
Please advise on how to resolve this.
Thanks and regards!
Based on the error message, it sounds like you are producing a huge JSON output, which gets buffered in memory.
Your choices are either:
Use streaming output to avoid buffering the whole result in memory (however, I am not sure if Spring allows you to do this) - see the sketch below, or
Increase the heap size so you have enough memory.
Note that the features to disable interning and canonicalization are only relevant for parsing, and you are generating JSON, not parsing it.
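To illustrate the first option, here is a minimal sketch (not from the original answer) of a variant of the utility that streams each value straight to a Writer with Jackson 1.x instead of building the whole String in memory; the writeObjectAsJSON name and the targetWriter parameter are hypothetical:
// Needs: java.io.Writer, java.io.IOException
public static void writeObjectAsJSON(Object value, Writer targetWriter) throws IOException {
    // ObjectMapper.writeValue(Writer, Object) serializes incrementally to the writer,
    // so the full JSON output is never buffered as a single String
    // (unlike writeValueAsString, which is what the stack trace above shows).
    objectMapper.writeValue(targetWriter, value);
    targetWriter.flush();
}
In the web application, targetWriter could be, for example, the servlet response's writer, so the JSON goes straight to the client instead of through an intermediate Map of Strings.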