Passing an Argument to a Specific Function in a STAX Job Using the Command Line

I have a STAX job, new15.xml, in the /home/dharm/staf/services/stax/samples directory.
new15.xml has two functions, main and readFile.
file_name = '/home/dharm/datafiles/ReadData3.txt'
The required argument for readFile is file_name.
I want to execute the command below from the command line, but I also need to pass the required parameter to the readFile function:
staf local execute file /home/dharm/staf/services/stax/samples/new15.xml wait return result
What modifications should I make to the command to get this to work?
"What I have Trierd"
FIRST
dharm@ubuntu:~$ staf local stax execute file /home/dharm/staf/services/stax/samples/new15.xml ARGS file_name="'/home/dharm/datafiles/ReadData3.txt'" wait returnresult
Response
--------
{
Job ID : 3
Start Date-Time: 20130418-23:54:39
End Date-Time : 20130418-23:54:40
Status : Terminated
Result : None
Job Log Errors : [
{
Date-Time: 20130418-23:54:40
Level : Error
Message : STAXPythonEvaluationError signal raised. Terminating job.
===== XML Information =====
File: /home/dharm/staf/services/stax/samples/new15.xml, Machine: local://local
Line <Error in ARGS option>: Error in element type "<External>".
===== Python Error Information =====
com.ibm.staf.service.stax.STAXPythonEvaluationException:
Python object evaluation failed for:
file_name='/home/dharm/datafiles/ReadData3.txt'
SyntaxError: ("mismatched input '=' expecting EOF", ('<pyEval string>', 1, 9, "file_name='/home/dharm/datafiles/ReadData3.txt'\n"))
===== Call Stack for STAX Thread 1 =====
[]
}
]
Testcase Totals: {
Tests : 0
Passes: 0
Fails : 0
}
}
"2ND Command"
dharm@ubuntu:~$ staf local stax execute file /home/dharm/staf/services/stax/samples/new15.xml SCRIPT file_name="'/home/dharm/datafiles/ReadData3.txt'" wait returnresult
Response
--------
{
Job ID : 4
Start Date-Time: 20130418-23:56:01
End Date-Time : 20130418-23:56:02
Status : Terminated
Result : None
Job Log Errors : [
{
Date-Time: 20130418-23:56:02
Level : Error
Message : STAXFunctionArgValidate signal raised. Terminating job.
===== XML Information =====
File: /home/dharm/staf/services/stax/samples/new15.xml, Machine: local://local
Line 20: Error in element type "call".
Required argument "file_name" is not provided in the call to function "readFile".
===== Call Stack for STAX Thread 1 =====
[
function: main (Line: 19, File: /home/dharm/staf/services/stax/samples/new15.xml, Machine: local://local)
]
}
]
Testcase Totals: {
Tests : 0
Passes: 0
Fails : 0
}
}
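The STAXPythonEvaluationError from the first attempt indicates that the value given to ARGS is evaluated as a single Python expression, which is why file_name='...' is rejected as invalid Python. A hedged sketch of a command that might work, assuming the EXECUTE request's FUNCTION option is used to start the job at readFile and that readFile accepts its arguments as a Python dictionary (e.g. it is declared with function-map-args):

staf local stax execute file /home/dharm/staf/services/stax/samples/new15.xml FUNCTION readFile ARGS "{'file_name': '/home/dharm/datafiles/ReadData3.txt'}" wait returnresult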

Related

Gatling feeder/parameter issue - Exception in thread "main" java.lang.UnsupportedOperationException

I just got involved in a new project doing API testing for our service using Gatling. At this point, I want to run a search query; below is the code:
def chnSendToRender(testData: FeederBuilderBase[String]): ChainBuilder = {
  feed(testData)
  exec(api.AdvanceSearch.searchAsset(s"{\"all\":[{\"all:aggregate:text\":{\"contains\":\"#{edlAssetId}_Rendered\"}}]}", "#{authToken}")
    .check(status.is(200).saveAs("searchStatus"))
    .check(jsonPath("$..asset:id").findAll.optional.saveAs("renderedAssetList"))
  )
    .doIf(session => session("searchStatus").as[Int] == 200) {
      exec { session =>
        printConsoleLog("Rendered Asset ID List: " + session("renderedAssetList").as[String], "INFO")
        session
      }
    }
}
I have already declared the feeder in the simulation Scala file:
class GVRERenderEditor_new extends Simulation {
  private val edlToRender = csv("data/render/edl_asset_ids.csv").queue
  private val chnPostRender = components.notifications.notice.JobsPolling_new.chnSendToRender(edlToRender)
  private val scnSendEDLForRender = scenario("Search Post Render")
    .exitBlockOnFail(exec(preSimAuth))
    .exec(chnPostRender)

  setUp(
    scnSendEDLForRender.inject(atOnceUsers(1)).protocols(httpProtocol)
  )
    .maxDuration(sessionDuration.seconds)
    .assertions(global.successfulRequests.percent.is(100))
}
But the Gatling test failed to run, showing this error: Exception in thread "main" java.lang.UnsupportedOperationException: There were no requests sent during the simulation, reports won't be generated
If I hardcode the #{edlAssetId} (put the real edlAssetId in that query), I get a result. I think I am passing the parameter incorrectly in this case. I've tried to print the output in the console log, but no luck. What's wrong with this code? I would appreciate your help. Thanks!
feed(testData)
exec(api.AdvanceSearch.searchAsset(s"{\"all\":[{\"all:aggregate:text\":{\"contains\":\"#{edlAssetId}_Rendered\"}}]}", "#{authToken}")
.check(status.is(200).saveAs("searchStatus"))
.check(jsonPath("$..asset:id").findAll.optional.saveAs("renderedAssetList"))
)
You're missing a . (dot) before the exec to attach it to the feed.
As a result, your method returns only the last instruction, i.e. the exec alone.
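In other words, the chain from the question might be rewritten like this (a sketch based only on the snippet above; the only change is the dot that attaches exec, and therefore the following doIf, to feed):

def chnSendToRender(testData: FeederBuilderBase[String]): ChainBuilder = {
  feed(testData)
    .exec(api.AdvanceSearch.searchAsset(s"{\"all\":[{\"all:aggregate:text\":{\"contains\":\"#{edlAssetId}_Rendered\"}}]}", "#{authToken}")
      .check(status.is(200).saveAs("searchStatus"))
      .check(jsonPath("$..asset:id").findAll.optional.saveAs("renderedAssetList"))
    )
    .doIf(session => session("searchStatus").as[Int] == 200) {
      exec { session =>
        printConsoleLog("Rendered Asset ID List: " + session("renderedAssetList").as[String], "INFO")
        session
      }
    }
}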

Parse a web activity error message into a synapse field

I have been trying to log an error from a web activity (POST method) into a field in a Synapse table. The problem is that there are some special characters in the message key string, like:
{
"value": [
{
"id": "",
"runId": "",
"debugRunId": ,
"runGroupId": "",
"pipelineName": "my_dynamic_pipeline_name",
"parameters": {
"region_code": "",
"data_start_date": "",
"data_end_date": "",
"etl_insert_batch_id": "",
"pipeline_subject_area": "",
"type_of_request": "",
"pipeline_name": "",
"pipeline_requested_by": "",
"debug": "",
"cdmloadtype": ""
},
"invokedBy": {
"id": "",
"name": "",
"invokedByType": ""
},
"runStart": "",
"runEnd": "",
"durationInMs": ,
"status": "",
"message": "Operation on target my_dynamic_pipeline_name failed: Operation on target my_dynamic_dataflow_name failed: {\"StatusCode\":\"DFExecutorUserError\",\"Message\":\"Job failed due to reason: at Sink 'SinkutilFailedDummy': java.sql.BatchUpdateException: Execution Status - FAILED ;Error number - 15165 ;Pipeline name - my_dynamic_pipeline_name; Stored procedure name - my_stored_proc_name ; Error step - Step 3: Key Hash and Util Type Hash Generation ; Insert batch ID - 1816 ; Error Message - Could not find object 'my object' or you do not have permission.\",\"Details\":\"java.sql.BatchUpdateException: Execution Status - FAILED ;Error number - 15165 ;Pipeline name - my_dynamic_pipeline_name; Stored procedure name - my_stored_proc_name ; Error step - Step 3: Key Hash and Util Type Hash Generation ; Insert batch ID - 1816 ; Error Message - Could not find object 'my_object' or you do not have permission.\\n\\tat shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerStatement.executeBatch(SQLServerStatement.java:1845)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter.executeBatchSQLs(JDBCStore.scala:462)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter.executeSQL(JDBCStore.scala:440)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter$$anonfun$executeTableOpAndPostSQLAndDDL$2.apply$mcV$sp(JDBCStore.scala:494)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter$$anonfun$executeTableOpAndPostSQLAndDDL$2.apply(JDBCStore.scala:494)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter$$anonfun$executeTableOpAndPostS\"}",
...
}
So I can filter down the output with:
@activity('pingPL').output.value[0].message
but there are {} and $ special characters that the Data Flow expression is trying to evaluate.
I have already tried using replace or string functions in the pipeline expression or in the data flow expression, without success.
Is there a way to parse this as a string or get to the Message key? Something like:
@activity('pingPL').output.value[0].message*.failed*.failed.Message
Update:
This seems to be working:
@json(split(activity('pingPL').output.value[0].message, 'failed: ')[2]).Message
I can split on 'failed: ', and index 2 gives me the error log within the {...}. I can parse that as JSON and use the Message key. It works, but it is not an ideal dynamic solution, since the error message won't always have the same structure.
Got a solution using substring and indexof to extract the {...} info:
substring(activity('pingPL').output.value[0].message,indexof(activity('pingPL').output.value[0].message,'{'),sub(indexof(activity('pingPL').output.value[0].message,'}'),sub(indexof(activity('pingPL').output.value[0].message,'{'),1)))
Getting this string as the output:
{\"StatusCode\":\"DFExecutorUserError\",\"Message\":\"Job failed due to reason: at Sink 'SinkutilFailedDummy': java.sql.BatchUpdateException: Execution Status - FAILED ;Error number - 15165 ;Pipeline name - my_dynamic_pipeline_name; Stored procedure name - my_stored_proc_name ; Error step - Step 3: Key Hash and Util Type Hash Generation ; Insert batch ID - 1816 ; Error Message - Could not find object 'my object' or you do not have permission.\",\"Details\":\"java.sql.BatchUpdateException: Execution Status - FAILED ;Error number - 15165 ;Pipeline name - my_dynamic_pipeline_name; Stored procedure name - my_stored_proc_name ; Error step - Step 3: Key Hash and Util Type Hash Generation ; Insert batch ID - 1816 ; Error Message - Could not find object 'my_object' or you do not have permission.\\n\\tat shaded.msdataflow.com.microsoft.sqlserver.jdbc.SQLServerStatement.executeBatch(SQLServerStatement.java:1845)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter.executeBatchSQLs(JDBCStore.scala:462)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter.executeSQL(JDBCStore.scala:440)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter$$anonfun$executeTableOpAndPostSQLAndDDL$2.apply$mcV$sp(JDBCStore.scala:494)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter$$anonfun$executeTableOpAndPostSQLAndDDL$2.apply(JDBCStore.scala:494)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCWriter$$anonfun$executeTableOpAndPostS\"}
Then I used the json expression to extract the Message key:
json('extracted string').message
Then I used replace to remove the single quotation marks (') to avoid a SQL error.
This is the final expression I got to extract the error message:
@replace(json(substring(activity('pingPL').output.value[0].message,indexof(activity('pingPL').output.value[0].message,'{'),sub(indexof(activity('pingPL').output.value[0].message,'}'),sub(indexof(activity('pingPL').output.value[0].message,'{'),1)))).message,'''','-')

channel checks as empty even if it has content

I am trying to have a process that is launched only if a combination of conditions is met, but when checking whether a channel has a path to a file, it always comes back as empty. I am probably doing something wrong; in that case, please correct my code. I tried to follow some of the suggestions in this issue, but with no success.
Consider the following minimal example:
process one {

    output:
    file("test.txt") into _chProcessTwo

    script:
    """
    echo "Hello world" > "test.txt"
    """
}

// making a copy so I check first if something in the channel or not
// avoids raising exception of MultipleInputChannel
_chProcessTwo.into{
    _chProcessTwoView;
    _chProcessTwoCheck;
    _chProcessTwoUse
}

//print contents of channel
println "Channel contents: " + _chProcessTwoView.toList().view()

process two {

    input:
    file(myInput) from _chProcessTwoUse

    when:
    (!_chProcessTwoCheck.toList().isEmpty())

    script:
    def test = _chProcessTwoUse.toList().isEmpty() ? "I'm empty" : "I'm NOT empty"
    println "The outcome is: " + test
}
I want to have process two run if and only if there is a file in the _chProcessTwo channel.
If I run the above code I obtain:
marius@dev:~/pipeline$ ./bin/nextflow run test.nf
N E X T F L O W ~ version 19.09.0-edge
Launching `test.nf` [infallible_gutenberg] - revision: 9f57464dc1
[c8/bf38f5] process > one [100%] 1 of 1 ✔
[- ] process > two -
[/home/marius/pipeline/work/c8/bf38f595d759686a497bb4a49e9778/test.txt]
where the last line is actually the contents of _chProcessTwoView.
If I remove the when directive from the second process I get:
marius@mg-dev:~/pipeline$ ./bin/nextflow run test.nf
N E X T F L O W ~ version 19.09.0-edge
Launching `test.nf` [modest_descartes] - revision: 5b2bbfea6a
[57/1b7b97] process > one [100%] 1 of 1 ✔
[a9/e4b82d] process > two [100%] 1 of 1 ✔
[/home/marius/pipeline/work/57/1b7b979933ca9e936a3c0bb640c37e/test.txt]
with the contents of the second worker .command.log file being: The outcome is: I'm empty
I also tried without toList().
What am I doing wrong? Thank you in advance
Update: a workaround would be to check _chProcessTwoUse.view() != "" but that is pretty dirty
Update 2: as requested by @Steve, I've updated the code to reflect a bit more closely the actual conditions I have in my own pipeline:
def runProcessOne = true

process one {

    when:
    runProcessOne

    output:
    file("inputProcessTwo.txt") into _chProcessTwo optional true
    file("inputProcessThree.txt") into _chProcessThree optional true

    script:
    // this would replace the probability that output is not created
    def outputSomething = false
    """
    if ${outputSomething}; then
        echo "Hello world" > "inputProcessTwo.txt"
        echo "Goodbye world" > "inputProcessThree.txt"
    else
        echo "Sorry. Process one did not write to file."
    fi
    """
}

// making a copy so I check first if something in the channel or not
// avoids raising exception of MultipleInputChannel
_chProcessTwo.into{
    _chProcessTwoView;
    _chProcessTwoCheck;
    _chProcessTwoUse
}

//print contents of channel
println "Channel contents: " + _chProcessTwoView.view()
println _chProcessTwoView.view() ? "Me empty" : "NOT empty"

process two {

    input:
    file(myInput) from _chProcessTwoUse

    when:
    (runProcessOne)

    script:
    """
    echo "The outcome is: ${myInput}"
    """
}

process three {

    input:
    file(defaultInput) from _chUpstreamProcesses
    file(inputFromProcessTwo) from _chProcessThree

    script:
    def extra_parameters = _chProcessThree.isEmpty() ? "" : "--extra-input " + inputFromProcessTwo
    """
    echo "Hooray! We got: ${extra_parameters}"
    """
}
As @Steve mentioned, I shouldn't even have to check whether a channel is empty; Nextflow should know better than to initiate the process. But I think with this construct I will have to.
Marius
I think part of the problem here is that process 'one' creates only optional outputs. This makes dealing with the optional inputs in process 'three' a bit tricky. I would try to reconcile this if possible. If this can't be reconciled, then you'll need to deal with the optional inputs in process 'three'. To do this, you'll basically need to create a dummy file, pass it into the channel using the ifEmpty operator, then use the name of the dummy file to check whether or not to prepend the argument's prefix. It's a bit of a hack, but it works pretty well.
The first step is to actually create the dummy file. I like shareable pipelines, so I would just create this in your baseDir, perhaps under a folder called 'assets':
mkdir assets
touch assets/NO_FILE
Then pass in your dummy file if your '_chProcessThree' channel is empty:
params.dummy_file = "${baseDir}/assets/NO_FILE"
dummy_file = file(params.dummy_file)

process three {

    input:
    file(defaultInput) from _chUpstreamProcesses
    file(optfile) from _chProcessThree.ifEmpty(dummy_file)

    script:
    def extra_parameters = optfile.name != 'NO_FILE' ? "--extra-input ${optfile}" : ''
    """
    echo "Hooray! We got: ${extra_parameters}"
    """
}
Also, these lines are problematic:
//print contents of channel
println "Channel contents: " + _chProcessTwoView.view()
println _chProcessTwoView.view() ? "Me empty" : "NOT empty"
Calling view() will emit all values from the channel to stdout. You can ignore whatever value it returns. Unless you enable DSL2, the channel will then be empty. I think what you're looking for here is a closure:
_chProcessTwoView.view { "Found: $it" }
Be sure to append -ansi-log false to your nextflow run command so the output doesn't get clobbered. HTH.

How to make a checkbox enabled/disabled with if/else based on a SQL query in ASP.NET?

I want to make my checkboxes enabled or disabled when a user logs in, based on my data.
My syntax in ASP.NET:
if (Session["Berhasil"] != null)
{
    Label1.Visible = true;
    Label1.Text = "Berhasil..";
    if(Label1 = "select * from cs100020 where countno=2 and status=3");
    {
        cbxinven.Enabled=true
        cbxfinadmin.Enabled=true
        cbxkaskecil.Enabled=true
        cbxemail.Enabled=false
        cbxsap.Enabled=false
        cbxpc.Enabled=false
        cbxuserad.Enabled=false
    }
    else (Label1="select * from cs100020 where countno=3 and status=3);
    {
        cbxinven.Enabled=false
        cbxfinadmin.Enabled=false
        cbxkaskecil.Enabled=false
        cbxemail.Enabled=true
        cbxsap.Enabled=true
        cbxpc.Enabled=true
        cbxuserad.Enabled=true
    }
}
and I got this error:
Compilation Error
Description: An error occurred during the compilation of a resource required to service this request. Please review the following specific error details and modify your source code appropriately.
Compiler Error Message: CS1010: Newline in constant
Source Error:
Line 137: cbxuserad.Enabled=false
Line 138: }
Line 139: else (Label1="select * from cs100020 where countno=3 and status=3);
Line 140: {
Line 141: cbxinven.Enabled=false
Source File: d:\Sharing\Budiman\IAPHRM BACKUP 08022019\IapHRM_180119_Backup\ViewCS.aspx.cs Line: 139
The closing double quote on the SQL statement in the ELSE branch (the else (Label1 = ... line) is missing.
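Note that even with the quote closed, those branches still will not compile as written: else does not take a condition in C#, = is assignment rather than comparison, the stray semicolon after the if condition ends the statement early, the Enabled assignments are missing semicolons, and the comparison should be against Label1.Text rather than the control itself. A hedged sketch of the else branch, purely to get past the compiler errors (it still only compares the label's text to the literal SQL string; it does not execute the query):

else if (Label1.Text == "select * from cs100020 where countno=3 and status=3")
{
    cbxinven.Enabled = false;
    cbxfinadmin.Enabled = false;
    cbxkaskecil.Enabled = false;
    cbxemail.Enabled = true;
    cbxsap.Enabled = true;
    cbxpc.Enabled = true;
    cbxuserad.Enabled = true;
}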

TCL, get full error message in catch command

#!/usr/bin/tclsh
proc test {} {
    aaa
}
test
When I run this script, I get this error message:
invalid command name "aaa"
while executing
"aaa"
(procedure "test" line 2)
invoked from within
"test"
(file "./a.tcl" line 7)
If I run the test command inside catch, I get only the first line of the error message.
#!/usr/bin/tclsh
proc test {} {
    aaa
}
catch test msg
puts $msg
This prints:
invalid command name "aaa"
Is it possible to get the full error message (file, line, procedure) from the catch command? My program has many files, and with just one line of the error message it is difficult to find where the error came from.
The short answer is to look at the value of errorInfo, which will contain the stack trace.
The more complete answer is to look at the catch and return manual pages and make use of the optionsVarName parameter of the catch command to collect the more detailed information provided. The return manual page gives some information on using this. Here is a rough example from an interactive session:
% proc a {} { catch {funky} err detail; return $detail }
% a
-code 1 -level 0 -errorstack {INNER {invokeStk1 funky} CALL a} -errorcode NONE -errorinfo {invalid command name "funky"
while executing
"funky"} -errorline 1
%
The detail variable is a dictionary, so use dict get $detail -errorinfo to get that particular item.
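Applied to the script from the question, that might look like this (a sketch assuming Tcl 8.5 or later, where catch accepts an options-variable name as its third argument):

#!/usr/bin/tclsh
proc test {} {
    aaa
}

# The options dictionary captured by catch includes -errorinfo,
# which holds the full stack trace (procedure, line, file).
if {[catch test msg opts]} {
    puts [dict get $opts -errorinfo]
}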