Path has type Object rather than String - ssl

I am implementing an SSL connection for my web application. I have a keystore which I pass while running SBT, but I am getting the error "path has type OBJECT rather than STRING".
This is how I am passing the keystore:
run -Dhttp.port=disabled -Dhttps.port=9448 -Dhttps.keyStore.path="certs\example.com.jks" -Dhttps.keyStore.type=JKS -Dhttps.keyStore.password=changeit
I am getting the error below:
[error] p.c.s.AkkaHttpServer - Cannot load SSL context
java.lang.reflect.InvocationTargetException: null
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at play.core.server.ssl.ServerSSLEngine$.createScalaSSLEngineProvider(ServerSSLEngine.scala:103)
at play.core.server.ssl.ServerSSLEngine$.createSSLEngineProvider(ServerSSLEngine.scala:35)
at play.core.server.AkkaHttpServer.$anonfun$httpsServerBinding$1(AkkaHttpServer.scala:126)
at play.core.server.AkkaHttpServer.$anonfun$httpsServerBinding$1$adapted(AkkaHttpServer.scala:124)
at scala.Option.map(Option.scala:146)
at play.core.server.AkkaHttpServer.<init>(AkkaHttpServer.scala:124)
Caused by: com.typesafe.config.ConfigException$WrongType: system properties: path has type OBJECT rather than STRING
at com.typesafe.config.impl.SimpleConfig.findKeyOrNull(SimpleConfig.java:159)
at com.typesafe.config.impl.SimpleConfig.findOrNull(SimpleConfig.java:170)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:184)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:189)
at com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:246)
at play.core.server.ssl.DefaultSSLEngineProvider.createSSLContext(DefaultSSLEngineProvider.scala:34)
at play.core.server.ssl.DefaultSSLEngineProvider.<init>(DefaultSSLEngineProvider.scala:24)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
Please tell me if anything is wrong in the above.

The exception suggests that one of your paths is wrong.
Here is the code where the exception happened:
def createSSLContext(applicationProvider: ApplicationProvider): SSLContext = {
  val httpsConfig = serverConfig.configuration.underlying.getConfig("play.server.https")
  val keyStoreConfig = httpsConfig.getConfig("keyStore")
  val keyManagerFactory: KeyManagerFactory = if (keyStoreConfig.hasPath("path")) {
    val path = keyStoreConfig.getString("path") // HERE EXACTLY
As you can see, it expects a string at the config path play.server.https.keyStore.path.
It seems that you have configured that path incorrectly: check your application.conf, or adjust your run command.
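For reference, the same settings can live in application.conf instead of being passed on the command line. A minimal sketch, assuming the keystore really sits at certs/example.com.jks relative to the directory the server is started from (forward slashes also work on Windows and avoid backslash-escaping issues in quoted strings):
play.server.https.keyStore {
  path = "certs/example.com.jks"
  type = "JKS"
  password = "changeit"
}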

Problem with passing config as part of karate.callSingle()

As the title says, I'm experiencing issues when trying to pass a config item to the feature file I call from callSingle() in karate-config.js. Everything works as expected if I only pass one parameter, the path to the feature file. I tried copying the entire config from the example karate-config.js and was still getting the same error. I would appreciate any explanation.
karate-config.js:
function fn() {
    var env = karate.env; // get system property 'karate.env'
    karate.log('karate.env system property was:', env);
    if (!env) {
        env = 'kermit';
    }
    karate.log('karate.env system property was:', env);
    var config = {
        env: env,
        base_url: 'https://api.' + env + '.chdev.org',
        account_url: 'https://account.' + env + '.chdev.org'
    };
    var result = karate.callSingle('classpath:authorization/authorization.feature', config);
    karate.log(result);
    return config;
}
Error I'm getting:
16:48:10.548 [main] ERROR com.intuit.karate - evaluation of 'karate-config.js' failed: javascript function call failed: unexpected feature call arg type: class jdk.nashorn.internal.scripts.JO4
Exception in thread "main" java.lang.NullPointerException
at com.intuit.karate.core.ScenarioExecutionUnit.init(ScenarioExecutionUnit.java:147)
at com.intuit.karate.core.ScenarioExecutionUnit.run(ScenarioExecutionUnit.java:236)
at com.intuit.karate.core.FeatureExecutionUnit.run(FeatureExecutionUnit.java:164)
at com.intuit.karate.core.FeatureExecutionUnit.run(FeatureExecutionUnit.java:73)
at com.intuit.karate.core.Engine.executeFeatureSync(Engine.java:109)
at com.intuit.karate.IdeUtils.exec(IdeUtils.java:64)
at cucumber.api.cli.Main.main(Main.java:36)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
In fact, no matter what I pass as the second argument, I always get a variation of the jdk.nashorn class error.
config} definitely looks like a typo to me.
Sorry but I think the best option for you is to follow this process: https://github.com/intuit/karate/wiki/How-to-Submit-an-Issue
Mr. Peter Thomas has confirmed that the project I sent him as a GitHub issue works as expected, which proved it is an environment issue.
It was resolved by updating Maven and JDK 8 to their latest releases.
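For completeness: passing a second argument to karate.callSingle() is supported, and each key of the passed JSON becomes a variable in the called feature. A minimal sketch of what authorization.feature could look like under that assumption (the endpoint and response details here are hypothetical, not from the original project):
Feature: authorization

Background:
    # base_url arrives as a variable from the config map passed to callSingle()
    * url base_url

Scenario: obtain a token once per test run
    Given path 'auth', 'token'
    When method get
    Then status 200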

Lookup HBase table from UDF (Beeline, HBase, delegation tokens)

I have a requirement to write a custom UDF for data lookup from an HBase table.
NOTE: I have done unit testing with Hive, and it seems to be working.
But when I use the same UDF from Beeline, it fails. By default Cloudera restricts impersonation and allows only the hive user to run queries in Beeline. On job startup, YarnChild sets the following delegation tokens:
Kind: mapreduce.job
Kind: HDFS_DELEGATION_TOKEN
Kind: kms-dt
I want to add a token (Kind: HBASE_AUTH_TOKEN) for dealing with HBase.
I researched how HBaseStorageHandler uses the delegation token (i.e. HBASE_AUTH_TOKEN) for HBase, and used the same set of functions, but that is not working either.
Functions from HBaseStorageHandler (to obtain tokens for the job):
private void addHBaseDelegationToken(Configuration conf, JobConf jconf) throws IOException {
    if (User.isHBaseSecurityEnabled(conf)) {
        try {
            logger.info("isHbaseSecurityEnabled :True ");
            User e = User.getCurrent();
            logger.info("isHbaseSecurityEnabled :User ==> " + e.toString());
            Token authToken = getAuthToken(conf, e);
            logger.info("isHbaseSecurityEnabled :AuthToken==> " + authToken.toString());
            Job job = new Job(conf);
            if (authToken == null) {
                UserGroupInformation ugi = UserGroupInformation.getLoginUser();
                ugi.setAuthenticationMethod(UserGroupInformation.AuthenticationMethod.KERBEROS);
                e.obtainAuthTokenForJob(jconf);
            } else {
                logger.info("authToken is not null" + authToken.toString());
                job.getCredentials().addToken(authToken.getService(), authToken);
            }
            logger.info("obtained Token /....");
        } catch (InterruptedException var5) {
            throw new IOException("Error while obtaining hbase delegation token", var5);
        }
    }
}
private static Token<AuthenticationTokenIdentifier> getAuthToken(Configuration conf, User user) throws IOException, InterruptedException {
    ZooKeeperWatcher zkw = new ZooKeeperWatcher(conf, "mr-init-credentials", (Abortable) null);
    Token var4;
    try {
        String e = ZKClusterId.readClusterIdZNode(zkw);
        logger.info("====== clusterID : " + e);
        var4 = (new AuthenticationTokenSelector()).selectToken(new Text(e), user.getUGI().getTokens());
        if (var4 == null) {
            logger.info("var4 is null===========================");
        } else {
            logger.info("====== Hbase Token : " + var4.toString());
        }
    } catch (KeeperException var8) {
        throw new IOException(var8);
    } catch (NullPointerException np) {
        return null;
    } finally {
        zkw.close();
    }
    return var4;
}
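For context, this is roughly how the helpers above are invoked from the UDF's configure(). A reconstructed sketch only: the class name HbaseTblLookupUDF is taken from the stack trace below, the rest is assumed:
import java.io.IOException;
import org.apache.hadoop.hive.ql.exec.MapredContext;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.mapred.JobConf;

public class HbaseTblLookupUDF extends GenericUDF {
    @Override
    public void configure(MapredContext context) {
        JobConf jconf = context.getJobConf(); // JobConf is also a Configuration
        try {
            addHBaseDelegationToken(jconf, jconf);
        } catch (IOException e) {
            throw new RuntimeException("Could not obtain HBase delegation token", e);
        }
    }
    // initialize(), evaluate() and getDisplayString() omitted for brevity
}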
After calling addHBaseDelegationToken() in the configure() method of the UDF, I am getting the following exception. I am not sure how I can make the hive user talk to HBase, since hive.keytab is handled by Cloudera and is secured.
Any input would be helpful. Thanks!
Exception stack trace:
2018-10-11 04:48:07,625 WARN [main] org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:hive (auth:SIMPLE) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2018-10-11 04:48:07,627 WARN [main] org.apache.hadoop.hbase.ipc.RpcClientImpl: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
2018-10-11 04:48:07,628 FATAL [main] org.apache.hadoop.hbase.ipc.RpcClientImpl: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:181)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:618)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$700(RpcClientImpl.java:163)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:744)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:741)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:741)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:907)
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:874)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1246)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.execService(ClientProtos.java:34118)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1633)
at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:104)
at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:94)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:136)
at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:107)
at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callBlockingMethod(CoprocessorRpcChannel.java:73)
at org.apache.hadoop.hbase.protobuf.generated.AuthenticationProtos$AuthenticationService$BlockingStub.getAuthenticationToken(AuthenticationProtos.java:4512)
at org.apache.hadoop.hbase.security.token.TokenUtil.obtainToken(TokenUtil.java:86)
at org.apache.hadoop.hbase.security.token.TokenUtil$1.run(TokenUtil.java:111)
at org.apache.hadoop.hbase.security.token.TokenUtil$1.run(TokenUtil.java:108)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.hadoop.hbase.security.User$SecureHadoopUser.runAs(User.java:340)
at org.apache.hadoop.hbase.security.token.TokenUtil.obtainToken(TokenUtil.java:108)
at com.barclaycardus.hadoop.utils.udfs.HbaseTblLookupUDF.configure(HbaseTblLookupUDF.java:131)
at org.apache.hadoop.hive.ql.exec.MapredContext.setup(MapredContext.java:120)
at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:143)
at org.apache.hadoop.hive.ql.exec.Operator.initEvaluators(Operator.java:954)
at org.apache.hadoop.hive.ql.exec.Operator.initEvaluatorsAndReturnStruct(Operator.java:980)
at org.apache.hadoop.hive.ql.exec.SelectOperator.initializeOp(SelectOperator.java:63)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:385)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:469)
at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:425)
at org.apache.hadoop.hive.ql.exec.TableScanOperator.initializeOp(TableScanOperator.java:196)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:385)
at org.apache.hadoop.hive.ql.exec.MapOperator.initializeOp(MapOperator.java:431)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:385)
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:126)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:455)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
... 66 more
I have already tried the options below:
https://github.com/apache/oozie/blob/master/core/src/main/java/org/apache/oozie/action/hadoop/HbaseCredentials.java
https://github.com/ibm-research-ireland/sparkoscope/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/security/HBaseCredentialProvider.scala

Node Identifier of copied node in Jackrabbit not found while deleting

I copied a node in Jackrabbit using session.getWorkspace().copy(sourceNode.getPath(), destinationNode.getPath()).
As far as I know, this operation's changes are persisted instantly. But when I try to get the copied node in order to delete it, using session.getNodeByIdentifier("nodeId of copied node"), I get an ItemNotFoundException. The reason for that error is that the copied node loses its mix:referenceable property during the copy, which causes getNodeByIdentifier to fail.
The question is: how do I set the mix:referenceable property on the copied node, given that I am not able to get the node from the session after the copy operation? Could someone help me out with this?
UPDATE:
CODE:
Node srcNode = session.getNodeByIdentifier("SOURCE_NODE_ID");
if (srcNode == null) {
    System.out.println("File not found");
}
Node rootNode = session.getRootNode();
Node appNode = rootNode.getNode("JACKRABBIT");
Node destNode = appNode.addNode("Copy_Test_" + System.currentTimeMillis(), "nt:file");
session.getWorkspace().copy(srcNode.getPath(), destNode.getPath());
destNode.addMixin(MIX_VERSIONABLE);
destNode.addMixin(MIX_LOCKABLE);
destNode.addMixin(MIX_REFERENCEABLE);
destNode.addNode(DMSConstants.RESOURCE_NODE, "nt:unstructured");
session.refresh(true);
session.save();
EXCEPTION:
Exception in thread "main" javax.jcr.InvalidItemStateException: Unable to update a stale item: item.save()
at org.apache.jackrabbit.rmi.server.ServerObject.getRepositoryException(ServerObject.java:111)
at org.apache.jackrabbit.rmi.server.ServerSession.save(ServerSession.java:265)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:346)
at sun.rmi.transport.Transport$1.run(Transport.java:200)
at sun.rmi.transport.Transport$1.run(Transport.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.Transport.serviceCall(Transport.java:196)
at sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:568)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:826)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.lambda$run$0(TCPTransport.java:683)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:682)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
at sun.rmi.transport.StreamRemoteCall.exceptionReceivedFromServer(StreamRemoteCall.java:276)
at sun.rmi.transport.StreamRemoteCall.executeCall(StreamRemoteCall.java:253)
at sun.rmi.server.UnicastRef.invoke(UnicastRef.java:162)
at org.apache.jackrabbit.rmi.server.ServerXASession_Stub.save(Unknown Source)
at org.apache.jackrabbit.rmi.client.ClientSession.save(ClientSession.java:272)
Please note that I am using JCR 2.0. Also, if I change the code to session.refresh(false), the code works fine, but then I am not able to find the node identifier from the session for deleting the node, which is my original issue.
Why are you creating a node at the destination and then copying to the same place? I believe the stale-item exception occurs because the call to copy has updated the underlying node, making your destNode reference stale/out of date.
Simply remove the addNode call and then do something like this:
// Workspace.copy requires an absolute destination path
String destPath = appNode.getPath() + "/Copy_Test_" + System.currentTimeMillis();
session.getWorkspace().copy(srcNode.getPath(), destPath);
Node destNode = session.getNode(destPath); // Session#getNode, not getPath
As @TedTrippin pointed out, the issue was creating a destination node before the copy, which was not required: the node is created as part of the copy. So my final working code is as follows:
Node srcNode = session.getNodeByIdentifier("SOURCE_NODE_ID");
if (srcNode == null) {
    System.out.println("File not found");
}
Node rootNode = session.getRootNode();
Node appNode = rootNode.getNode("JACKRABBIT");
String destNodeName = "Copy_Test";
session.getWorkspace().copy(srcNode.getPath(), appNode.getPath() + "/" + destNodeName);
Node destNode = appNode.getNode(destNodeName);
destNode.addMixin(MIX_VERSIONABLE);
destNode.addMixin(MIX_LOCKABLE);
destNode.addMixin(MIX_REFERENCEABLE);
session.refresh(true);
session.save();
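With mix:referenceable saved on the copy, the node can now be looked up and deleted by identifier, which was the original goal. A minimal follow-up sketch (assuming the save above succeeded):
String copiedId = destNode.getIdentifier(); // stable once mix:referenceable is persisted
// ... later, possibly from another session:
Node toDelete = session.getNodeByIdentifier(copiedId);
toDelete.remove();
session.save();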

How to print a Groovy variable in Jenkins?

I have the following code within a Jenkins pipeline:
stage ('Question') {
    try {
        timeout(time: 1, unit: 'MINUTES') {
            userInput = input message: 'Choose server to publish to:', ok: '', parameters: [
                [$class: 'hudson.model.ChoiceParameterDefinition', choices: 'pc-ensureint\nother-server', description: 'Choose server to publish to:', name: 'server']
            ]
        }
    } catch (err) {
        userInput = [server: 'pc-ensureint'] // if an error is caught set this value
    }
}
node () {
    println ${server}
}
I'm trying to troubleshoot a problem with the server variable which is set in the ChoiceParameterDefinition.
When I run the build, I get the following error:
java.lang.NoSuchMethodError: No such DSL method '$' found among steps [AddInteractivePromotion, ArtifactoryGradleBuild, ArtifactoryMavenBuild, ConanAddRemote, ConanAddUser, InitConanClient, MavenDescriptorStep, RunConanCommand, ansiblePlaybook, archive, artifactoryDownload, artifactoryPromoteBuild, artifactoryUpload, bat, build, catchError, checkout, collectEnv, deleteDir, dir, dockerFingerprintFrom, dockerFingerprintRun, dockerPullStep, dockerPushStep, echo, emailext, emailextrecipients, envVarsForTool, error, fileExists, getArtifactoryServer, getContext, getDatabaseConnection, git, input, isUnix, library, libraryResource, load, mail, milestone, newArtifactoryServer, newBuildInfo, newGradleBuild, newMavenBuild, node, parallel, properties, publishBuildInfo, pwd, readFile, readTrusted, resolveScm, retry, script, sh, sleep, sql, stage, stash, step, svn, timeout, timestamps, tool, unarchive, unstash, validateDeclarativePipeline, waitForQualityGate, waitUntil, withContext, withCredentials, withDockerContainer, withDockerRegistry, withDockerServer, withEnv, wrap, writeFile, ws, xrayScanBuild] or symbols [all, allOf, always, ant, antFromApache, antOutcome, antTarget, any, anyOf, apiToken, architecture, archiveArtifacts, artifactManager, batchFile, booleanParam, branch, buildButton, buildDiscarder, caseInsensitive, caseSensitive, choice, choiceParam, cleanWs, clock, cloud, command, configFile, configFileProvider, cron, crumb, defaultView, demand, disableConcurrentBuilds, docker, dockerfile, downloadSettings, downstream, dumb, envVars, environment, expression, file, fileParam, filePath, fingerprint, frameOptions, freeStyle, freeStyleJob, git, github, githubPush, gradle, hyperlink, hyperlinkToModels, installSource, jdk, jdkInstaller, jgit, jgitapache, jnlp, jobName, junit, label, lastDuration, lastFailure, lastGrantedAuthorities, lastStable, lastSuccess, legacy, legacySCM, list, local, location, logRotator, loggedInUsersCanDoAnything, masterBuild, maven, maven3Mojos, mavenErrors, mavenMojos, mavenWarnings, modernSCM, msbuild, msbuildError, msbuildWarning, myView, node, nodeProperties, nonStoredPasswordParam, none, not, overrideIndexTriggers, paneStatus, parameters, password, pattern, pipeline-model, pipelineTriggers, plainText, plugin, pollSCM, projectNamingStrategy, proxy, queueItemAuthenticator, quietPeriod, remotingCLI, run, runParam, schedule, scmRetryCount, search, security, shell, skipDefaultCheckout, skipStagesAfterUnstable, slave, stackTrace, standard, status, string, stringParam, swapSpace, text, textParam, tmpSpace, toolLocation, unsecured, upstream, usernameColonPassword, usernamePassword, viewsTabBar, weather, withSonarQubeEnv, zfs, zip] or globals [Artifactory, currentBuild, docker, env, params, pipeline, scm]
at org.jenkinsci.plugins.workflow.cps.DSL.invokeMethod(DSL.java:149)
at org.jenkinsci.plugins.workflow.cps.CpsScript.invokeMethod(CpsScript.java:108)
at groovy.lang.GroovyObject$invokeMethod.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:151)
at org.kohsuke.groovy.sandbox.GroovyInterceptor.onMethodCall(GroovyInterceptor.java:21)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onMethodCall(SandboxInterceptor.java:115)
at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:149)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:146)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:123)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:123)
at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.methodCall(SandboxInvoker.java:16)
at WorkflowScript.run(WorkflowScript:16)
at ___cps.transform___(Native Method)
at com.cloudbees.groovy.cps.impl.ContinuationGroup.methodCall(ContinuationGroup.java:57)
at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.dispatchOrArg(FunctionCallBlock.java:109)
at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.fixArg(FunctionCallBlock.java:82)
at sun.reflect.GeneratedMethodAccessor637.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at com.cloudbees.groovy.cps.impl.ContinuationPtr$ContinuationImpl.receive(ContinuationPtr.java:72)
at com.cloudbees.groovy.cps.impl.ClosureBlock.eval(ClosureBlock.java:46)
at com.cloudbees.groovy.cps.Next.step(Next.java:74)
at com.cloudbees.groovy.cps.Continuable.run0(Continuable.java:154)
at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.access$001(SandboxContinuable.java:18)
at org.jenkinsci.plugins.workflow.cps.SandboxContinuable$1.call(SandboxContinuable.java:33)
at org.jenkinsci.plugins.workflow.cps.SandboxContinuable$1.call(SandboxContinuable.java:30)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.GroovySandbox.runInSandbox(GroovySandbox.java:108)
at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.run0(SandboxContinuable.java:30)
at org.jenkinsci.plugins.workflow.cps.CpsThread.runNextChunk(CpsThread.java:165)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.run(CpsThreadGroup.java:330)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.access$100(CpsThreadGroup.java:82)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:242)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:230)
at org.jenkinsci.plugins.workflow.cps.CpsVmExecutorService$2.call(CpsVmExecutorService.java:64)
at java.util.concurrent.FutureTask.run(Unknown Source)
at hudson.remoting.SingleLaneExecutorService$1.run(SingleLaneExecutorService.java:112)
at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Finished: FAILURE
As far as I know, server is a Groovy variable, and thus I'm supposed to be able to access it using ${ }.
So I've tried:
echo ${server}
print ${server}
println ${server}
println "${server}"
But no matter what I try, I keep getting this error.
Any idea what I'm doing wrong?
The following code worked for me:
echo userInput
You shouldn't use ${varName} when you're outside of strings; you should just use varName. Inside double-quoted strings you use it like this: echo "this is a string ${someVariable}". In fact, you can place a general Groovy expression inside ${...}: echo "this is a string ${func(arg1, arg2)}".
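Applied to the pipeline above, that means referencing the variable directly, or interpolating it inside a double-quoted string. A minimal sketch (userInput holds whatever the input step returned, or the fallback map set in the catch block):
node () {
    // direct reference, no ${} outside a string
    println userInput
    // interpolation inside a double-quoted GString
    echo "user selection: ${userInput}"
}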

I can't create a SipProvider - MjSip library

I am new to MjSip, and I want to create an instance of the SipProvider class.
So I wrote this little piece of code but get an error:
sip_provider = new SipProvider("192.168.0.254", 5060);
Here is the error stack:
java.io.FileNotFoundException: log\192.168.0.254.5060_events.log (The system cannot find the path specified)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(Unknown Source)
at java.io.FileOutputStream.<init>(Unknown Source)
at org.zoolu.tools.Log.<init>(Log.java:112)
at org.zoolu.tools.RotatingLog.<init>(RotatingLog.java:73)
at org.zoolu.sip.provider.SipProvider.initLog(SipProvider.java:295)
at org.zoolu.sip.provider.SipProvider.<init>(SipProvider.java:224)
at local.ua.UA.main(UA.java:539)
java.io.FileNotFoundException: log\192.168.0.254.5060_messages.log (The system cannot find the path specified)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(Unknown Source)
at java.io.FileOutputStream.<init>(Unknown Source)
at org.zoolu.tools.Log.<init>(Log.java:112)
at org.zoolu.tools.RotatingLog.<init>(RotatingLog.java:73)
at org.zoolu.sip.provider.SipProvider.initLog(SipProvider.java:296)
at org.zoolu.sip.provider.SipProvider.<init>(SipProvider.java:224)
at local.ua.UA.main(UA.java:539)
Exception in thread "main" java.lang.NullPointerException
at org.zoolu.tools.Log.flush(Log.java:147)
at org.zoolu.tools.Log.println(Log.java:177)
at org.zoolu.sip.provider.SipProvider.printLog(SipProvider.java:1161)
at org.zoolu.sip.provider.SipProvider.initLog(SipProvider.java:298)
at org.zoolu.sip.provider.SipProvider.<init>(SipProvider.java:224)
at local.ua.UA.main(UA.java:539)
I'm using a LAN network and the IP address is valid.
Why does this happen? Please help me!
There is no log folder. You need to create one, or you should give a specific directory:
String home = System.getProperty("user.home");
File f = new File(home + "//" + SipStack.log_path);
try {
    if (!f.exists() || !f.isDirectory()) {
        JOptionPane.showMessageDialog(null, home + "//" + SipStack.log_path + " will be created");
        File dir = new File(home + "//" + SipStack.log_path);
        dir.mkdir();
    }
} catch (Exception e) {
    JOptionPane.showMessageDialog(null, "The log folder could not be created");
}
if (SipStack.debug_level > 0) {
    String filename = home + "//" + SipStack.log_path + "//" + via_addr + "." + host_port;
    log = new RotatingLog(filename + "_events.log", SipStack.debug_level, SipStack.max_logsize * 1024, SipStack.log_rotations, SipStack.rotation_scale, SipStack.rotation_time);
    message_log = new RotatingLog(filename + "_messages.log", SipStack.debug_level, SipStack.max_logsize * 1024, SipStack.log_rotations, SipStack.rotation_scale, SipStack.rotation_time);
}
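Alternatively, the stack trace shows MjSip trying to open log\192.168.0.254.5060_events.log relative to the working directory, so the simplest fix may be to create that default folder before constructing the provider. A minimal sketch, assuming the default log path is "log" as the FileNotFoundException suggests:
// ensure the default log directory exists; mkdirs() is a no-op if it already does
new File("log").mkdirs();
sip_provider = new SipProvider("192.168.0.254", 5060);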