I'm having trouble reading a YAML file that contains a list in MuleSoft.
I have my YAML file set up as a configuration properties file in my Global Elements, with the following structure:
applications:
  - appId: "123456"
    appName: Application One
  - appId: "456789"
    appName: Application Two
I am able to read the values when there's no list, but when I set it up as a list, MuleSoft throws this error:
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+ Failed to deploy artifact 'test', see below +
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
org.mule.runtime.deployment.model.api.DeploymentException: Failed to deploy artifact [test]
Caused by: org.mule.runtime.api.exception.MuleRuntimeException: org.mule.runtime.deployment.model.api.DeploymentInitException: ConfigurationPropertiesException: Configuration properties does not support type a list of complex types. Complex type keys are: appId,appName
Caused by: org.mule.runtime.deployment.model.api.DeploymentInitException: ConfigurationPropertiesException: Configuration properties does not support type a list of complex types. Complex type keys are: appId,appName
Caused by: org.mule.runtime.core.api.config.ConfigurationException: Configuration properties does not support type a list of complex types. Complex type keys are: appId,appName
Caused by: org.mule.runtime.config.internal.dsl.model.config.ConfigurationPropertiesException: Configuration properties does not support type a list of complex types. Complex type keys are: appId,appName
Please help me out: is this not the right way to use a YAML file? Do I need to change my format?
Thank you!
Configuration properties do not support lists of complex types, because the properties have to be converted into Spring properties on the backend.
You can only use simple lists:
applications:
- "123456"
- "456789"
But a simple object would work just as well for properties:
applications:
  "123456":
    name: Application One
  "456789":
    name: Application Two
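With that shape, a specific entry can also be referenced statically as an ordinary placeholder (assuming the same keys as above):
<logger level="INFO" message="${applications.123456.name}" />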
Then you could look up dynamically by the appId like:
<set-variable variableName="appId" value="123456" />
<logger level="INFO" message="#[p('applications.' ++ vars.appId ++ '.name')]" />
I'm getting this warning message:
WARN [io.qua.hib.orm.dep.HibernateOrmProcessor] Could not find a suitable persistence unit for model classes:
- io.quarkus.hibernate.orm.panache.kotlin.PanacheEntity
- io.quarkus.hibernate.orm.panache.kotlin.PanacheEntityBase
The same issue occurs with both io.quarkus:quarkus-hibernate-orm-panache and io.quarkus:quarkus-hibernate-orm-panache-kotlin (PanacheCompanion).
My project has multiple named persistence units and datasources (no default). I'm also using the multitenancy feature.
INFO [io.quarkus] Installed features: [agroal, cache, cdi, config-yaml, hibernate-orm, hibernate-orm-panache-kotlin, jdbc-mysql, kotlin, mutiny, narayana-jta, resteasy, resteasy-jackson, security, smallrye-context-propagation, smallrye-jwt, smallrye-openapi, swagger-ui, vertx, vertx-web]
It seems that the ORM processor doesn't exclude those base entities and tries to attach them to a non-existent "default" persistence unit, hence the warning.
I could get rid of it by either defining a "default" PU or assigning io.quarkus.hibernate.orm.panache.kotlin to a named one:
quarkus:
  hibernate-orm:
    dummy:
      packages: io.quarkus.hibernate.orm.panache.kotlin
      datasource: dummy
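The named "dummy" datasource referenced there has to exist too; a minimal sketch of one, assuming MySQL via the jdbc-mysql extension already installed (the URL is a placeholder):
quarkus:
  datasource:
    dummy:
      db-kind: mysql
      jdbc:
        url: jdbc:mysql://localhost:3306/dummy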
I'm fairly new to working with NiFi. We're trying to validate an XML file, except we need to use a different XSD depending on some value passed in the file. Extracting and routing on the name wasn't an issue, and we stored the desired file path in an attribute (xsdFile).
However, when I try to use that attribute in the XMLValidation processor, it changes the path and gives an error. When I copy the path from the attribute directly into the Schema File property, it works, so the path itself isn't wrong.
Attribute passed in flowfile:
xsdFile:
C:\Users\MYNAME\Documents\NiFi\FLOW_RESOURCES\input\validatexml\camt.053.001.02_CvW_2.xsd
XMLValidation processor properties:
Schema File: ${xsdFile}
Error:
Failed to properly initialize Processor. If still scheduled to run, NiFi will attempt to initialize and run the Processor again after the 'Administrative Yield Duration' has elapsed. Failure is due to java.io.FileNotFoundException:
Schema file not found at specified location: C:\Users\MYNAME\DOCUME~1\NiFi\NIFI-1~1.0: java.io.FileNotFoundException:
Schema file not found at specified location: C:\Users\MYNAME\DOCUME~1\NiFi\NIFI-1~1.0
java.io.FileNotFoundException: Schema file not found at specified location: C:\Users\MYNAME\DOCUME~1\NiFi\NIFI-1~1.0
Why does this not work? Is there another way to do this, or do we need to route to different XMLValidators?
Check the documentation for this processor:
https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-standard-nar/1.9.2/org.apache.nifi.processors.standard.ValidateXml/index.html
Schema File:
The path to the Schema file that is to be used for validation
Supports Expression Language: true
(will be evaluated using variable registry only)
So a flow file attribute can't be used for this property.
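That leaves the approach you already suggested: route on the extracted value to a separate ValidateXml processor per XSD. Alternatively, if a path only needs to vary per environment rather than per flow file, it can be supplied through the variable registry; a hypothetical sketch (file name and property are illustrative):
# conf/custom.properties, referenced from nifi.properties via
# nifi.variable.registry.properties=./conf/custom.properties
xsdFile=C:/Users/MYNAME/Documents/NiFi/FLOW_RESOURCES/input/validatexml/camt.053.001.02_CvW_2.xsd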
I exported datasets from HBase to Hive via the Export utility. The data comes across in sequence file format.
Now I am creating a Hive table with "stored as sequencefile".
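The DDL looks roughly like this (table name, columns, and location are illustrative):
CREATE EXTERNAL TABLE hbase_export (
  rowkey STRING,
  payload STRING
)
STORED AS SEQUENCEFILE
LOCATION '/user/me/hbase_export';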
When I run a simple SELECT query with a LIMIT on the Hive table, I get the exception below:
java.io.IOException: Could not find a deserializer for the Value class: 'org.apache.hadoop.hbase.client.Result'. Please ensure that the configuration 'io.serializations' is properly configured,
Detailed error trace below:
Bad status for request TFetchResultsReq(fetchType=0,
operationHandle=TOperationHandle(hasResultSet=True,
modifiedRowCount=None, operationType=0,
operationId=THandleIdentifier(secret='\xb2\x9a^\xc5\xc5\xf3BV\xb1\xb8\xe7\xb6\xab\xe7f\xc0',
guid='\xca\x0c\xc9\\xc0%#\xb4\xa1{\xe4\xc7\xaa(;\xcb')),
orientation=4, maxRows=100):
TFetchResultsResp(status=TStatus(errorCode=0,
errorMessage="java.io.IOException: java.io.IOException: Could not find
a deserializer for the Value class:
'org.apache.hadoop.hbase.client.Result'. Please ensure that the
configuration 'io.serializations' is properly configured, if you're
using custom serialization.", sqlState=None,
infoMessages=["*org.apache.hive.service.cli.HiveSQLException:java.io.IOException:
java.io.IOException: Could not find a deserializer for the Value
class: 'org.apache.hadoop.hbase.client.Result'. Please ensure that the
configuration 'io.serializations' is properly configured, if you're
using custom serialization.:14:13",
'org.apache.hive.service.cli.operation.SQLOperation:getNextRowSet:SQLOperation.java:366',
'org.apache.hive.service.cli.operation.OperationManager:getOperationNextRowSet:OperationManager.java:277',
'org.apache.hive.service.cli.session.HiveSessionImpl:fetchResults:HiveSessionImpl.java:753',
'org.apache.hive.service.cli.CLIService:fetchResults:CLIService.java:438',
'org.apache.hive.service.cli.thrift.ThriftCLIService:FetchResults:ThriftCLIService.java:686',
'org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults:getResult:TCLIService.java:1553',
'org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults:getResult:TCLIService.java:1538',
'org.apache.thrift.ProcessFunction:process:ProcessFunction.java:39',
'org.apache.thrift.TBaseProcessor:process:TBaseProcessor.java:39',
'org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor:process:HadoopThriftAuthBridge.java:746',
'org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:286',
'java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1142',
'java.util.concurrent.ThreadPoolExecutor$Worker:run:ThreadPoolExecutor.java:617',
'java.lang.Thread:run:Thread.java:745',
"*java.io.IOException:java.io.IOException: Could not find a
deserializer for the Value class:
'org.apache.hadoop.hbase.client.Result'. Please ensure that the
configuration 'io.serializations' is properly configured, if you're
using custom serialization.:18:4",
'org.apache.hadoop.hive.ql.exec.FetchOperator:getNextRow:FetchOperator.java:508',
'org.apache.hadoop.hive.ql.exec.FetchOperator:pushRow:FetchOperator.java:415',
'org.apache.hadoop.hive.ql.exec.FetchTask:fetch:FetchTask.java:138',
'org.apache.hadoop.hive.ql.Driver:getResults:Driver.java:1798',
'org.apache.hive.service.cli.operation.SQLOperation:getNextRowSet:SQLOperation.java:361',
"*java.io.IOException:Could not find a deserializer for the Value
class: 'org.apache.hadoop.hbase.client.Result'. Please ensure that the
configuration 'io.serializations' is properly configured, if you're
using custom serialization.:26:8",
'org.apache.hadoop.io.SequenceFile$Reader:init:SequenceFile.java:2040',
'org.apache.hadoop.io.SequenceFile$Reader:initialize:SequenceFile.java:1878',
'org.apache.hadoop.io.SequenceFile$Reader:&lt;init&gt;:SequenceFile.java:1827',
'org.apache.hadoop.io.SequenceFile$Reader:&lt;init&gt;:SequenceFile.java:1841',
'org.apache.hadoop.mapred.SequenceFileRecordReader:&lt;init&gt;:SequenceFileRecordReader.java:49',
'org.apache.hadoop.mapred.SequenceFileInputFormat:getRecordReader:SequenceFileInputFormat.java:64',
'org.apache.hadoop.hive.ql.exec.FetchOperator$FetchInputFormatSplit:getRecordReader:FetchOperator.java:674',
'org.apache.hadoop.hive.ql.exec.FetchOperator:getRecordReader:FetchOperator.java:324',
'org.apache.hadoop.hive.ql.exec.FetchOperator:getNextRow:FetchOperator.java:446'],
statusCode=3), results=None, hasMoreRows=None)
I'm trying to use the Mule Credentials Vault security feature.
I've created a .properties file and a Security Property Placeholder, and defined the key and encryption algorithm.
Now I want to use some of the properties from the file when I return HTTP response.
I have the file src/main/resources/data.properties that contains, for example, an encrypted property number.
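With the Credentials Vault, encrypted values are wrapped in ![...], so the file would look something like this (the ciphertext below is made up):
number=![nHWo3JhvhMU+iA0eyGTxn3crgZD0sEFTyNvVL5NB5kE=]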
In my canvas, under Configuration XML I added:
<secure-property-placeholder:config name="Secure_Property_Placeholder" key="24681357" location="data.properties" doc:name="Secure Property Placeholder" encryptionAlgorithm="DES"/>
<set-variable variableName="card.number" value="${number}" />
In my canvas I have a message flow that builds XML ('Create XML response based on User'), and the value in its settings uses the ${...} placeholders.
This doesn't work. The error I get is:
-> org.mule.module.launcher.DeploymentInitException: IllegalArgumentException: Could not resolve placeholder 'key' in string value "${key}"
-> Caused by: org.mule.api.lifecycle.InitialisationException: Invalid bean definition with name 'org.mule.autogen.bean.13' defined in null: Could not resolve placeholder 'key' in string value "${key}"; nested exception is java.lang.IllegalArgumentException: Could not resolve placeholder 'key' in string value "${key}"
-> Caused by: java.lang.IllegalArgumentException: Could not resolve placeholder 'key' in string value "${key}"
Does anyone know how I can read the properties from the .properties file (Credentials Vault) and then use them in my flow?
Thanks,
Keren
If you simply want to get the value of the property number and add it to the XML, you can use ${number} from the .properties file. There's no need to define any other variables in the Configuration XML.
<set-payload value="&lt;user&gt;&lt;name&gt;Royal Bank of Canada&lt;/name&gt;&lt;id&gt;Royal_Bank_Of_Canada&lt;/id&gt;&lt;cc&gt;&lt;company&gt;Visa&lt;/company&gt;&lt;number&gt;${number}&lt;/number&gt;&lt;secret&gt;123&lt;/secret&gt;&lt;/cc&gt;&lt;/user&gt;" doc:name="Set Payload"/>
However, note that the property placeholder is resolved at startup, so you will not be able to dynamically retrieve a property based on some user input. For that you will have to do some Java coding. This SO post gives you some hints on how it can be achieved. Based on those answers I have created a simple example using a very simple helper bean.
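The gist of that approach, as a minimal sketch (class name and wiring are illustrative, not the exact code from the linked post): let Spring inject the decrypted value into a bean at startup, then look it up from MEL at runtime.
import java.util.HashMap;
import java.util.Map;

// Illustrative helper: the placeholder resolver decrypts ${number} at startup
// and injects it here; the flow can then read it at runtime via MEL,
// e.g. #[app.registry.credentialsBean.get('number')].
public class CredentialsBean {

    private final Map<String, String> values = new HashMap<String, String>();

    // Wired in the Configuration XML:
    // <spring:bean id="credentialsBean" class="com.example.CredentialsBean">
    //   <spring:property name="number" value="${number}"/>
    // </spring:bean>
    public void setNumber(String number) {
        values.put("number", number);
    }

    public String get(String key) {
        return values.get(key);
    }
}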
I'm afraid you just can't. The Mule Credentials Vault is an enterprise feature, and therefore you typically won't have access to the source code unless you are a MuleSoft customer.
Even if you were a customer, the API you'd use would be sort of unsupported. I suggest manually creating a custom Java component leveraging your code and Jasypt (not as a property placeholder but as a library).
The other option, if you are a customer (I guess you are, given that you are using the Credentials Vault), is to contact the official support so they take care of it for you.
The property placeholder is resolved at startup, so you will not be able to dynamically retrieve a property based on some user input.
Use ${propertyName} in MEL to access a particular property from the .properties file.
From DataWeave you can read it as shown below:
p('variablename')
where variablename is defined in the properties file, e.g. variablename = 15.
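For example, in a DataWeave script (assuming variablename = 15 as above, this returns { "configuredValue": "15" }):
%dw 2.0
output application/json
---
{ configuredValue: p('variablename') }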
I have an object with embedded members that I'm persisting without problems using the RDBMS datastore (MySQL).
When I change the datastore to S3 (json plugin) I get the following exception:
Dec 30, 2011 9:50:30 AM org.datanucleus.state.JDOStateManagerImpl isLoaded
WARNING: Exception thrown by StateManager.isLoaded
This constructor is only for objects using application identity.
org.datanucleus.exceptions.NucleusUserException: This constructor is only for objects using application identity.
at org.datanucleus.state.JDOStateManagerImpl.initialiseForHollowAppId(JDOStateManagerImpl.java:226)
at org.datanucleus.state.ObjectProviderFactory.newForHollowPopulatedAppId(ObjectProviderFactory.java:119)
at org.datanucleus.store.json.fieldmanager.FetchFieldManager.getObjectFromJSONObject(FetchFieldManager.java:322)
at org.datanucleus.store.json.fieldmanager.FetchFieldManager.fetchObjectField(FetchFieldManager.java:250)
at org.datanucleus.state.AbstractStateManager.replacingObjectField(AbstractStateManager.java:2228)
at myproject.MyObject.jdoReplaceField(Unknown Source)
at myproject.MyObject.jdoReplaceFields(Unknown Source)
at org.datanucleus.state.JDOStateManagerImpl.replaceFields(JDOStateManagerImpl.java:1949)
at org.datanucleus.state.JDOStateManagerImpl.replaceFields(JDOStateManagerImpl.java:1976)
at org.datanucleus.store.json.JsonPersistenceHandler.fetchObject(JsonPersistenceHandler.java:269)
at org.datanucleus.state.JDOStateManagerImpl.loadFieldsFromDatastore(JDOStateManagerImpl.java:1652)
at org.datanucleus.state.JDOStateManagerImpl.loadSpecifiedFields(JDOStateManagerImpl.java:1254)
at org.datanucleus.state.JDOStateManagerImpl.isLoaded(JDOStateManagerImpl.java:1742)
at myproject.MyObject.jdoGetmember_(Unknown Source)
at myproject.MyObject.getMember(Unknown Source)
member_ in myproject.MyObject is defined as:
@Persistent
@Embedded(members = {
    ...
})
private Member member_;
and
@PersistenceCapable(detachable="true")
@EmbeddedOnly
public class Member implements Serializable {
(no application identity, no key)
The jdoconfig.xml is roughly:
<jdoconfig
xmlns="http://java.sun.com/xml/ns/jdo/jdoconfig"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="http://java.sun.com/xml/ns/jdo/jdoconfig">
<persistence-manager-factory name="trans-optional">
<property name="javax.jdo.PersistenceManagerFactoryClass"
value="org.datanucleus.api.jdo.JDOPersistenceManagerFactory"/>
<property name="datanucleus.ConnectionURL"
value="amazons3:http://s3.amazonaws.com/"/>
<property name="datanucleus.ConnectionUserName"
value="..."/>
<property name="datanucleus.ConnectionPassword"
value="..."/>
<property name="datanucleus.cloud.storage.bucket"
value="mybucket"/>
</persistence-manager-factory>
</jdoconfig>
I've been to the Supported Features table but I must admit I don't fully understand it.
Does it say that the JSON plugin does NOT support embedded objects?
Why do my embedded objects need to have application identity? If I define them with application identity I'm also asked to provide a key, and I don't want that; I want them to be embedded.
Any help will be much appreciated!
As the Supported Features table says very clearly (to me), there is a cross against the JSON datastore column for the feature "Embedded PC", hence it is not supported for that datastore. Obviously, if some user/company wanted such a feature, they could either:
- update the JSON plugin to support it, as was done for the ODF plugin, for example, or
- sponsor that work.
Alternatively, don't use embedded objects with that datastore.
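A sketch of that last option (hypothetical: Member becomes a first-class persistable with application identity, which is what the JSON store insists on, and MyObject references it with a plain @Persistent field instead of @Embedded):
import java.io.Serializable;
import javax.jdo.annotations.IdGeneratorStrategy;
import javax.jdo.annotations.PersistenceCapable;
import javax.jdo.annotations.Persistent;
import javax.jdo.annotations.PrimaryKey;

// No @EmbeddedOnly any more: Member is stored as its own object.
@PersistenceCapable(detachable = "true")
public class Member implements Serializable {

    // A generated key, so no natural identifier has to be invented.
    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.UUIDHEX)
    private String id;

    // ...existing fields unchanged...
}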