Exception in thread "main" com.fasterxml.jackson.core.JsonParseException: Unexpected character (' ' (code 160)): was expecting double-quote to start field name
at [Source: (String)"{
"CorrelationId": "{{$guid}}",
"UserId": 50,
"SenderId": "ICICIL",
"MSISDN": "9845628794",
"Message": "test message",
"MTag": "12345",
"MsgType": 1,
"Costcenter" : "B99-50330"
}"; line: 2, column: 2]
I have been trying to log an error from a web activity (POST method) into a field in a Synapse table. The problem is that there are special characters in the message key string, for example:
{
"value": [
{
"id": "",
"runId": "",
"debugRunId": ,
"runGroupId": "",
"pipelineName": "my_dynamic_pipeline_name",
"parameters": {
"region_code": "",
"data_start_date": "",
"data_end_date": "",
"etl_insert_batch_id": "",
"pipeline_subject_area": "",
"type_of_request": "",
"pipeline_name": "",
"pipeline_requested_by": "",
"debug": "",
"cdmloadtype": ""
},
"invokedBy": {
"id": "",
"name": "",
"invokedByType": ""
},
"runStart": "",
"runEnd": "",
"durationInMs": ,
"status": "",
"message": "Operation on target my_dynamic_pipeline_name failed: Operation on target my_dynamic_dataflow_name failed: {\"StatusCode\":\"DFExecutorUserError\",\"Message\":\"Job failed due to reason: at Sink 'SinkutilFailedDummy': java.sql.BatchUpdateException: Execution Status - FAILED ;Error number - 15165 ;Pipeline name - my_dynamic_pipeline_name; Stored procedure name - my_stored_proc_name ; Error step - Step 3: Key Hash and Util Type Hash Generation ; Insert batch ID - 1816 ; Error Message - Could not find object 'my object' or you do not have permission.\",\"Details\":\"java.sql.BatchUpdateException: Execution Status - FAILED ;Error number - 15165 ;Pipeline name - my_dynamic_pipeline_name; Stored procedure name - my_stored_proc_name ; Error step - Step 3: Key Hash and Util Type Hash Generation ; Insert batch ID - 1816 ; Error Message - Could not find object 'my_object' or you do not have permission.\\n\\tat shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerStatement.executeBatch(SQLServerStatement.java:1845)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter.executeBatchSQLs(JDBCStore.scala:462)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter.executeSQL(JDBCStore.scala:440)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter$$anonfun$executeTableOpAndPostSQLAndDDL$2.apply$mcV$sp(JDBCStore.scala:494)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter$$anonfun$executeTableOpAndPostSQLAndDDL$2.apply(JDBCStore.scala:494)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter$$anonfun$executeTableOpAndPostS\"}",
...
}
So I can filter down the output with:
#activity('pingPL').output.value[0].message
but there are {} and $ special characters that the Data Flow expression is trying to evaluate.
I have already tried replace and other string functions in the pipeline expression and in the data flow expression, without success.
Is there a way to parse this as a string or get to the Message key? Something like:
#activity('pingPL').output.value[0].message*.failed*.failed.Message
Update:
This seems to be working:
#json(split(activity('pingPL').output.value[0].message, 'failed: ')[2]).Message
I can split by 'failed: ', and index 2 gives me the error log within the {...}. I can parse that as JSON and use the Message key. It works, but it is not an ideal dynamic solution, since the error message won't always have the same structure.
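For illustration, the same split logic in Python (a sketch only, not ADF code; message stands in for activity('pingPL').output.value[0].message):

import json

message = ("Operation on target A failed: Operation on target B failed: "
           '{"StatusCode": "DFExecutorUserError", "Message": "Job failed due to reason: ..."}')

# Splitting on 'failed: ' yields three parts; parts[2] is the {...} JSON
# payload that follows the second 'failed: '.
parts = message.split('failed: ')
error = json.loads(parts[2])
print(error['Message'])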
I got a solution using substring and indexof to extract the {...} info:
substring(activity('pingPL').output.value[0].message,indexof(activity('pingPL').output.value[0].message,'{'),sub(indexof(activity('pingPL').output.value[0].message,'}'),sub(indexof(activity('pingPL').output.value[0].message,'{'),1)))
Getting this string as the output:
{\"StatusCode\":\"DFExecutorUserError\",\"Message\":\"Job failed due to reason: at Sink 'SinkutilFailedDummy': java.sql.BatchUpdateException: Execution Status - FAILED ;Error number - 15165 ;Pipeline name - my_dynamic_pipeline_name; Stored procedure name - my_stored_proc_name ; Error step - Step 3: Key Hash and Util Type Hash Generation ; Insert batch ID - 1816 ; Error Message - Could not find object 'my object' or you do not have permission.\",\"Details\":\"java.sql.BatchUpdateException: Execution Status - FAILED ;Error number - 15165 ;Pipeline name - my_dynamic_pipeline_name; Stored procedure name - my_stored_proc_name ; Error step - Step 3: Key Hash and Util Type Hash Generation ; Insert batch ID - 1816 ; Error Message - Could not find object 'my_object' or you do not have permission.\\n\\tat shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerStatement.executeBatch(SQLServerStatement.java:1845)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter.executeBatchSQLs(JDBCStore.scala:462)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter.executeSQL(JDBCStore.scala:440)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter$$anonfun$executeTableOpAndPostSQLAndDDL$2.apply$mcV$sp(JDBCStore.scala:494)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter$$anonfun$executeTableOpAndPostSQLAndDDL$2.apply(JDBCStore.scala:494)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter$$anonfun$executeTableOpAndPostS\"}
Then I used the json expression to extract the Message key:
json('extracted string').message
Then I used replace to remove the single quotation marks (') to avoid a SQL error.
This is the final expression I got to extract the error message:
#replace(json(substring(activity('pingPL').output.value[0].message,indexof(activity('pingPL').output.value[0].message,'{'),sub(indexof(activity('pingPL').output.value[0].message,'}'),sub(indexof(activity('pingPL').output.value[0].message,'{'),1)))).message,'''','-')
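For reference, a Python sketch that mirrors the final expression step by step (message again stands in for the activity output; it assumes the embedded JSON contains no nested braces, which is why the approach is not fully general):

import json

message = ("Operation on target A failed: Operation on target B failed: "
           '{"StatusCode": "DFExecutorUserError", "Message": "Could not find object \'my object\'"}')

start = message.index('{')               # indexof(message, '{')
end = message.index('}')                 # indexof(message, '}')
length = end - (start - 1)               # sub(indexof('}'), sub(indexof('{'), 1))
payload = message[start:start + length]  # substring(...): keeps the closing '}'
msg = json.loads(payload)['Message']     # json(...).message
print(msg.replace("'", '-'))             # replace(..., '''', '-') to avoid SQL errors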
I am trying to run a PyFlink job that takes data from a source Kafka topic and sinks it into HDFS. There is a weird SQL-related error that keeps arising. It comes from the SQL statement for the Apache Flink (PyFlink) Table API sink:
SQL:
sql_statement_sink = """
CREATE TABLE avro_sink (
timeTime STRING,
correlationId STRING,
spanId STRING,
appName STRING,
messageType STRING,
message STRING,
tag STRING,
journey as SPLIT_INDEX(tag, '_', 2)
) PARTITIONED BY (
journey,
appName,
messageType
) WITH (
'connector' = 'filesystem',
'partition.default-name' = 'others',
'format" = 'avro',
'path' = 'file:///Users/ahmedawny/PycharmProjects/ms_log_consumer/output'
)
"""
Full Error:
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.flink.api.java.ClosureCleaner (file:/Users/ahmedawny/PycharmProjects/%20ms_log_consumer/venv/lib/python3.8/site-packages/pyflink/lib/flink-dist_2.11-1.14.0.jar) to field java.util.Properties.serialVersionUID
WARNING: Please consider reporting this to the maintainers of org.apache.flink.api.java.ClosureCleaner
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Traceback (most recent call last):
File "log_consumer.py", line 96, in <module>
main(**vars(args))
File "log_consumer.py", line 77, in main
statement_set.add_insert(avro_sink, table_known_tag)
File "/Users/ahmedawny/PycharmProjects/ ms_log_consumer/venv/lib/python3.8/site-packages/pyflink/table/statement_set.py", line 116, in add_insert
self._j_statement_set.addInsert(target_path_or_descriptor, table._j_table, overwrite)
File "/Users/ahmedawny/PycharmProjects/ ms_log_consumer/venv/lib/python3.8/site-packages/py4j/java_gateway.py", line 1285, in __call__
return_value = get_return_value(
File "/Users/ahmedawny/PycharmProjects/ ms_log_consumer/venv/lib/python3.8/site-packages/pyflink/util/exceptions.py", line 146, in deco
return f(*a, **kw)
File "/Users/ahmedawny/PycharmProjects/ ms_log_consumer/venv/lib/python3.8/site-packages/py4j/protocol.py", line 326, in get_return_value
raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling o697.addInsert.
: org.apache.flink.table.api.SqlParserException: Invalid SQL identifier
CREATE TABLE avro_sink (
timeTime STRING,
correlationId STRING,
spanId STRING,
appName STRING,
messageType STRING,
message STRING,
tag STRING,
journey as SPLIT_INDEX(tag, '_', 2)
) PARTITIONED BY (
journey,
appName,
messageType
) WITH (
'connector' = 'filesystem',
'partition.default-name' = 'others',
'format" = 'avro',
'path' = 'file:///Users/ahmedawny/PycharmProjects/ms_log_consumer/output'
)
.
at org.apache.flink.table.planner.parse.CalciteParser.parseIdentifier(CalciteParser.java:96)
at org.apache.flink.table.planner.delegation.ParserImpl.parseIdentifier(ParserImpl.java:109)
at org.apache.flink.table.api.internal.StatementSetImpl.addInsert(StatementSetImpl.java:76)
at org.apache.flink.table.api.bridge.java.internal.StreamStatementSetImpl.addInsert(StreamStatementSetImpl.java:48)
at org.apache.flink.table.api.bridge.java.internal.StreamStatementSetImpl.addInsert(StreamStatementSetImpl.java:28)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at org.apache.flink.api.python.shaded.py4j.Gateway.invoke(Gateway.java:282)
at org.apache.flink.api.python.shaded.py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at org.apache.flink.api.python.shaded.py4j.commands.CallCommand.execute(CallCommand.java:79)
at org.apache.flink.api.python.shaded.py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: org.apache.flink.sql.parser.impl.ParseException: Encountered "TABLE" at line 2, column 20.
Was expecting one of:
<EOF>
"." ...
at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.generateParseException(FlinkSqlParserImpl.java:40981)
at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.jj_consume_token(FlinkSqlParserImpl.java:40792)
at org.apache.flink.sql.parser.impl.FlinkSqlParserImpl.TableApiIdentifier(FlinkSqlParserImpl.java:6316)
at org.apache.flink.table.planner.parse.CalciteParser.parseIdentifier(CalciteParser.java:87)
... 15 more
Thanks in advance.
There is a syntax error at the journey field. Either declare it as a plain journey STRING and compute it with the SPLIT_INDEX function when you insert data into the sink (a sketch of such an insert follows the code below), or drop it entirely as in the DDL here. Note also that the format option has mismatched quotes ('format" should be 'format'); that is corrected here as well.
Try this:
sql_statement_sink = """
CREATE TABLE avro_sink (
timeTime STRING,
correlationId STRING,
spanId STRING,
appName STRING,
messageType STRING,
message STRING,
tag STRING
) WITH (
'connector' = 'filesystem',
'partition.default-name' = 'others',
'format" = 'avro',
'path' = 'file:///Users/ahmedawny/PycharmProjects/ms_log_consumer/output'
)
"""
I run the following code snippet in Pig:
publisher_hour_listings_cdf = foreach rscf_pub_hours_cumsum_proj generate
rscf_publisher_id,
rscf_is_uw,
rscf_hour,
(int)rscf_hour as rscf_hour_int,
rscf_cum_listings,
total_daily_listings,
(rscf_cum_listings*1.0)/(total_daily_listings*1.0) as cdf,
'$handledDate$';
and I get an error:
2020-05-02 04:31:53,130 INFO - [Step] - thread 74038 - /opt/pig/pig-0.17.0/bin/pig: [main] ERROR org.apache.pig.tools.pigstats.PigStats - ERROR 0:
org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: publisher_hour_listings_cdf: New For Each(false,false,false,false,false,false,false,false)[bag] - scope-203 Operator Key: scope-203):
org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing [POCast (Name: Cast[long] - scope-171 Operator Key: scope-171) children: [[POProject (Name: Project[int][3] - scope-170 Operator Key: scope-170) children: null at []]] at [rscf_cum_listings[-1,-1]]]: java.lang.ClassCastException: java.lang.Long cannot be cast to java.lang.Integer
We thought that Pig fails to cast Long to Integer.
We tried to cast with "biginteger" instead of "int":
(biginteger)rscf_hour as rscf_hour_int
and the script also failed with the exact same error message.
We also tried removing the int cast altogether:
rscf_hour as rscf_hour_int
and we still got the same error.
Do you know why?
I'm new to MuleSoft and I'm following the Quickstart guide. In Step 2 (https://developer.mulesoft.com/guides/quick-start/developing-your-first-mule-application), I need to receive variables from the URI in this way:
[{'id' : attributes.uriParams.productId}]
But when I try my GET, I get the following error in the console:
Message : "Cannot coerce Array ([{id: "2" as String {class: "java.lang.String"}}]) to Object
1| [{'id' : attributes.uriParams.productId}]
   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Trace: at main (line: 1, column: 1)" evaluating expression: "[{'id' : attributes.uriParams.productId}]".
Error type : MULE:EXPRESSION
Element : get:\products(productId):test_daniel-config/processors/1 @ test6_db_connection:test_daniel.xml:133 (Select)
Element XML : SELECT product.*, CONCAT('["', (GROUP_CONCAT(variant.picture SEPARATOR '","')), '"]') AS pictures, CONCAT('[', GROUP_CONCAT('{"', variant.identifierType, '":"', variant.identifier, '"}'), ']') AS identifiers FROM product INNER JOIN variant ON product.uuid = variant.productUUID WHERE product.uuid = :id; #[[{'id' : attributes.uriParams.productId}]]
Any ideas? Thanks!
The "Cannot coerce Array to Object" error pops up when you use an array where an object is expected.
In the exception above, the URI param should be treated as an object, i.e. enclosed in {}, but it is being treated as an array of objects, [{}].
This is what causes the error.
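In other words, dropping the outer square brackets so the input parameters expression evaluates to a single object, i.e. #[{'id' : attributes.uriParams.productId}] instead of #[[{'id' : attributes.uriParams.productId}]], should resolve it.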
results <- SPARQL(endpoint,query,curl_args=list(style="post"))
Opening and ending tag mismatch: hr line 5 and body
Opening and ending tag mismatch: body line 3 and html
Premature end of data in tag html line 1
Error: 1: Opening and ending tag mismatch: hr line 5 and body
2: Opening and ending tag mismatch: body line 3 and html
3: Premature end of data in tag html line 1
In addition: Warning message:
In mapCurlOptNames(names(.els), asNames = TRUE) :
Unrecognized CURL options: style
I used curl_args=list(style="post"), but it resulted in "Unrecognized CURL options: style".