How to Validate a Non-JSON response body using Karate (2)

This is a continuation of How to validate Non-JSON response body using Karate.
Details: When the API POST call is made and the employee already exists in the DB, an error response like the following is returned in the response body, which is not valid JSON (or a plain string):
{"error":{"text":SQLSTATE[23000]: Integrity constraint violation: 1062 Duplicate entry 'NewEmp' for key 'employee_name_unique'}}
My aim is to validate that this error response is returned as expected.
I tried the solution provided in How to validate Non-JSON response body using Karate, but it did not work as expected. Below are the details:
I do not understand how to use the * provided in the solution to my previous question. Could you please explain how to use it?
Karate Feature:
Scenario: Testing non-string response
Given url 'dummy.restapiexample.com/api/v1/create'
And request {"name":"PutTest8","salary":"123","age":"23"}
When method POST
Then status 200
* string temp = response
And match temp contains 'error'
The above throws the following error:
line 20:4 mismatched input '*' expecting <EOF>
17:43:46.230 [main] ERROR com.intuit.karate.core.FeatureParser - syntax error: mismatched input '*' expecting <EOF>
17:43:46.235 [main] ERROR com.intuit.karate.core.FeatureParser - not a valid feature file: src/test/java/learnKarate/postcall.feature - mismatched input '*' expecting <EOF>
NOTE: I also tried to 'assert' on the response, which also failed, with the error below.
Then assert $ contains 'error'
Error:
com.intuit.karate.exception.KarateException: postcall.feature:29 - javascript evaluation failed: $ contains 'error', <eval>:1:2 Expected ; but found contains
$ contains 'error'
^ in <eval> at line number 1 at column number 2
at ✽.Then assert $ contains 'error' (postcall.feature:29)

There is something seriously wrong with your example or your environment. The * is just a replacement for Given, When, etc. For example, paste this into a new Scenario; this works for me:
* def response = 'error'
* string temp = response
And match temp contains 'error'
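To make the point concrete, here is a minimal sketch of a complete feature file using the URL and payload from your question (note I've added an http:// scheme, and the status-200 assertion on the duplicate-entry call is an assumption carried over from your scenario):
Feature: validate a non-JSON error response body

Scenario: duplicate employee returns the SQL error text
Given url 'http://dummy.restapiexample.com/api/v1/create'
And request {"name":"PutTest8","salary":"123","age":"23"}
When method POST
Then status 200
# the two '*' steps below are interchangeable with And / Then
* string temp = response
* match temp contains 'error'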
Since you seem to be stuck, it is time for you to follow this process: https://github.com/intuit/karate/wiki/How-to-Submit-an-Issue
All the best!

Related

An error occurred when executing Randomwalk2dmobility in NS3

I am a newbie to NS3. I want to understand the execution status of handover in the Randomwalk2d module and visualize it. The default is two UEs and two eNBs, but errors always occur during execution. Can anyone help me solve the problem?
This is my code link: https://drive.google.com/file/d/163NQOyvs0bTh2J4P9_vpS4Y7iqocB3HJ/view?usp=sharing
When I execute the command ./waf --run scratch/lte_handover --visualize, the following error appears:
../scratch/lte_handover.cc: In function 'int main(int, char**)':
../scratch/lte_handover.cc:296:78: error: expected ')' before ';' token
"Bounds",RectangleValue (Rectangle (0,2000,0,2000)));
^
Build failed
->task in 'lte_handover' failed with exit status 1 (run with -v to display more information)
Following the instructions, I entered the command ./waf --run scratch/lte_handover -v, and the following information appears:
Several tasks use the same identifier. Please check the information on
https://waf.io/apidocs/Task.html?highlight=uid#waflib.Task.Task.uid
object 'SuidBuild_task'(
{task 139759060979784: SuidBuild_task -> }) defined in 'tap-creator'
object 'SuidBuild_task'(
{task 139759060980008: SuidBuild_task -> }) defined in 'tap-creator'
object 'SuidBuild_task'(
{task 139759065638504: SuidBuild_task -> }) defined in 'tap-creator'
It seems that you have an extra ) in that line. You are not closing this call, because you commented out all the lines that were supposed to end it:
ueMobility.SetPositionAllocator ("ns3::RandomRectanglePositionAllocator", // <-- close
ueMobility.SetMobilityModel ("ns3::RandomWalk2dMobilityModel","Bounds", RectangleValue (Rectangle (0,2000,0,2000)));
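In other words, each helper call must be closed on its own. A minimal sketch of the corrected pair of calls (using only the type string for the position allocator, since the attribute lines in your file are commented out) would be:
// close SetPositionAllocator before starting the next statement
ueMobility.SetPositionAllocator ("ns3::RandomRectanglePositionAllocator");
ueMobility.SetMobilityModel ("ns3::RandomWalk2dMobilityModel",
                             "Bounds", RectangleValue (Rectangle (0, 2000, 0, 2000)));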

Error while importing data into Redshift

I wanted to unload from one database (production) and reload into another database (QA) in Redshift, both having exactly the same schema.
I issued the S3 COPY command as follows:
copy table(col1,col2,col3,col4) from 's3://<bucket_path>/<file_name>.gzip' CREDENTIALS 'aws_access_key_id=<your_key>;aws_secret_access_key=<your_secret>' delimiter '|' gzip NULL AS 'null_string';
I got the following error:
ERROR: Failed writing body (0 != XXX) Cause: Failed to inflateinvalid or incomplete deflate data. zlib error code: -3
error: Failed writing body (0 != XXX) Cause: Failed to inflateinvalid or incomplete deflate data. zlib error code: -3
code: 9001
context: S3 key being read : s3://<some_s3_bucket>/<some_s3_bucket_file>
query: XXXXX
location: table_s3_scanner.cpp:355
process: query1_23 [pid=2008]
-----------------------------------------------
This happens when you tell COPY to treat the file as gzip during the load, but the file cannot actually be read as gzip.
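A common way this happens is unloading without the GZIP option but then pointing COPY ... gzip at the result. If that is the case here (an assumption on my part), keeping the two sides consistent should fix it; the paths and credentials below are just the placeholders from your command:
-- produce genuinely gzip-compressed slices on S3
unload ('select col1, col2, col3, col4 from table')
to 's3://<bucket_path>/<file_name>'
credentials 'aws_access_key_id=<your_key>;aws_secret_access_key=<your_secret>'
delimiter '|' gzip;
-- then load them back with the matching gzip option
copy table(col1,col2,col3,col4)
from 's3://<bucket_path>/<file_name>'
credentials 'aws_access_key_id=<your_key>;aws_secret_access_key=<your_secret>'
delimiter '|' gzip NULL AS 'null_string';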
If I’ve made a bad assumption please comment and I’ll refocus my answer.

Load File delimited by double colon :: in pig

The following is a sample dataset delimited by a double colon (::).
1::Toy Story (1995)::Animation|Children's|Comedy
I want to extract three fields from the above dataset as movieID, title, and genre. I have written the following code for that:
movies = LOAD 'location/of/dataset/on/hdfs '
using PigStorage('::')
as
(MovieID:int,title:chararray,genre:chararray);
But I am getting the following error:
ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1200: Pig script failed to parse:
<file script.pig, line 1, column 9> pig script failed to validate:
java.lang.RuntimeException: could not instantiate 'PigStorage' with arguments '[::]'
Use MyRegExLoader (PigStorage only accepts a single-character delimiter, which is why it cannot be instantiated with '::'). You will need piggybank.jar for this:
REGISTER '/path/to/piggybank.jar'
A = LOAD '/path/to/dataset' USING org.apache.pig.piggybank.storage.MyRegExLoader('([^\\:]+)::([^\\:]+)::([^\\:]+)')
as (movieid:int, title:chararray, genre:chararray);
Output:
(1,Toy Story (1995),Animation|Children's|Comedy)
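Alternatively, if you prefer to avoid piggybank, a common pattern (untested against your exact data) is to load each line whole and split it with the built-in STRSPLIT:
movies_raw = LOAD 'location/of/dataset/on/hdfs' AS (line:chararray);
movies = FOREACH movies_raw GENERATE FLATTEN(STRSPLIT(line, '::', 3))
         AS (movieid:chararray, title:chararray, genre:chararray);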

Pig Filter Syntax error, unexpected symbol

inputData = LOAD '$input' AS (line:chararray);
statusLineFilter = FILTER smallData BY (line MATHCES '^.* AppWrite-Dispatcher: Status code: [0-9]+$');
This code, when I run it, yields this error: ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1200: Syntax error, unexpected symbol at or near 'line'
The log file says the exact same thing. I'm at a loss, because the exact same syntax is working in other scripts I've written.
To avoid misspelling keywords, I recommend using an IDE or a text editor like Emacs with pig-mode.el, which adds syntax highlighting ;)
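For what it's worth, two things stand out in the snippet itself: the keyword is misspelled (MATHCES instead of MATCHES), and the FILTER refers to smallData while the loaded relation is inputData. A corrected version would look something like this:
inputData = LOAD '$input' AS (line:chararray);
statusLineFilter = FILTER inputData BY (line MATCHES '^.* AppWrite-Dispatcher: Status code: [0-9]+$');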

Unexpected error while loading data

I am getting an "Unexpected" error. I tried a few times, and I still could not load the data. Is there any other way to load data?
gs://log_data/r_mini_raw_20120510.txt.gz to 567402616005:myv.may10c
Errors:
Unexpected. Please try again.
Job ID: job_4bde60f1c13743ddabd3be2de9d6b511
Start Time: 1:48pm, 12 May 2012
End Time: 1:51pm, 12 May 2012
Destination Table: 567402616005:myvserv.may10c
Source URI: gs://log_data/r_mini_raw_20120510.txt.gz
Delimiter: ^
Max Bad Records: 30000
Schema:
zoneid: STRING
creativeid: STRING
ip: STRING
Update:
I am using the file that can be found here:
http://saraswaticlasses.net/bad.csv.zip
bq load -F '^' --max_bad_record=30000 mycompany.abc bad.csv id:STRING,ceid:STRING,ip:STRING,cb:STRING,country:STRING,telco_name:STRING,date_time:STRING,secondary:STRING,mn:STRING,sf:STRING,uuid:STRING,ua:STRING,brand:STRING,model:STRING,os:STRING,osversion:STRING,sh:STRING,sw:STRING,proxy:STRING,ah:STRING,callback:STRING
I am getting an error "BigQuery error in load operation: Unexpected. Please try again."
The same file works from Ubuntu while it does not work from CentOS 5.4 (Final)
Does the OS encoding need to be checked?
The file you uploaded has an unterminated quote. Can you delete that line and try again? I've filed an internal bigquery bug to be able to handle this case more gracefully.
$grep '"' bad.csv
3000^0^1.202.218.8^2f1f1491^CN^others^2012-05-02 20:35:00^^^^^"Mozilla/5.0^generic web browser^^^^^^^^
When I run a load from my workstation (Ubuntu), I get a warning about the line in question. Note that if you were using a larger file, you would not see this warning, instead you'd just get a failure.
$bq show --format=prettyjson -j job_e1d8636e225a4d5f81becf84019e7484
...
"status": {
"errors": [
{
"location": "Line:29057 / Field:12",
"message": "Missing close double quote (\") character: field starts with: <Mozilla/>",
"reason": "invalid"
}
]
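If you just want to drop the offending row and retry, something like this should work (29057 is the line number reported in the error above; the output file name is only an example):
sed '29057d' bad.csv > bad_fixed.csv
# then re-run the same bq load command against bad_fixed.csv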
My suspicion is that you have rows or fields in your input data that exceed the 64 KB limit. Perhaps re-check the formatting of your data, check that it is gzipped properly, and if all else fails, try importing uncompressed data. (One possibility is that the entire compressed file is being interpreted as a single row/field that exceeds the aforementioned limit.)
To answer your original question, there are a few other ways to import data: you could upload directly from your local machine using the command-line tool or the web UI, or you could use the raw API. However, all of these mechanisms (including the Google Storage import that you used) funnel through the same CSV parser, so it's possible that they'll all fail in the same way.