I use AtomicLongMap in the constructor of a RichMapFunction, like:
public class PathAnalysis extends RichMapFunction<ApiLog, ApiLog> {

    private final AtomicLongMap<Object> mObjectAtomicLongMap;

    public PathAnalysis() {
        mObjectAtomicLongMap = AtomicLongMap.create();
    }
}
I registered a custom serializer class, but it does not work:
env.getConfig().registerTypeWithKryoSerializer(AtomicLongMap.class, new AtomicLongMapSerializer());
It causes:
org.apache.flink.api.common.InvalidProgramException: The implementation of the RichMapFunction is not serializable. The object probably contains or references non serializable fields.
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:99)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.clean(StreamExecutionEnvironment.java:1550)
at org.apache.flink.streaming.api.datastream.DataStream.clean(DataStream.java:184)
at org.apache.flink.streaming.api.datastream.DataStream.map(DataStream.java:528)
at com.ghzs.Topology.main(Topology.java:91)
Caused by: java.io.NotSerializableException: org.apache.flink.shaded.curator.org.apache.curator.shaded.com.google.common.util.concurrent.AtomicLongMap
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at org.apache.flink.util.InstantiationUtil.serializeObject(InstantiationUtil.java:315)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:81)
... 4 more
AtomicLongMap does not implement Serializable. How can I register an effective custom serialization method?
Mark AtomicLongMap as transient, and allocate it in the open() call that your function will receive (because it's a RichMapFunction). So something like:
public class PathAnalysis extends RichMapFunction<ApiLog, ApiLog> {

    private transient AtomicLongMap<Object> mObjectAtomicLongMap;

    public PathAnalysis() { }

    @Override
    public void open(Configuration parameters) throws Exception {
        super.open(parameters);
        mObjectAtomicLongMap = AtomicLongMap.create();
    }
}
Try with an implementation of this function like:
import org.apache.flink.api.common.functions.RichMapFunction;
import com.google.common.util.concurrent.AtomicLongMap;

public class PathAnalysis extends RichMapFunction<ApiLog, ApiLog> {

    private final AtomicLongMap<Object> mObjectAtomicLongMap;

    public PathAnalysis() {
        mObjectAtomicLongMap = AtomicLongMap.create();
    }

    @Override
    public ApiLog map(ApiLog value) throws Exception {
        // TODO Auto-generated method stub
        return null;
    }
}
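For reference, a minimal sketch that combines the transient field and open() initialization from the suggested approach with an actual map() implementation (ApiLog is the question's POJO; getPath() is only a hypothetical accessor used for illustration):

import com.google.common.util.concurrent.AtomicLongMap;
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;

public class PathAnalysis extends RichMapFunction<ApiLog, ApiLog> {

    // transient: the map is never shipped as part of the serialized closure
    private transient AtomicLongMap<Object> mObjectAtomicLongMap;

    @Override
    public void open(Configuration parameters) throws Exception {
        super.open(parameters);
        // created on the task manager, after the function has been deserialized
        mObjectAtomicLongMap = AtomicLongMap.create();
    }

    @Override
    public ApiLog map(ApiLog value) throws Exception {
        // hypothetical usage: count occurrences per path (getPath() is assumed to exist on ApiLog)
        mObjectAtomicLongMap.incrementAndGet(value.getPath());
        return value;
    }
}

Because the field is never part of the serialized function, there is no need to register a Kryo serializer for AtomicLongMap at all.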
Related
When trying to use the @Autowired feature in a standalone application, I am unable to do so and instead get a NullPointerException. Please highlight my mistakes, if any. Your help is appreciated.
Spring version is 5.1.5.RELEASE, and we're not using any XML config file to tell Spring which annotated classes to look for; instead we use @ComponentScan or @EnableAutoConfiguration on top of AppConfig and bootstrap the context from the main() class as the first line. Autowiring works perfectly with internal beans/JDK classes (Environment), but not with custom POJO classes. If we go through the getBean() method it works, but I'm trying to avoid creating the context everywhere and calling getBean(). Please refer to the code below and help me with your valuable guidance.
public class ContextMaster {

    private static AnnotationConfigApplicationContext appContext;

    public static AnnotationConfigApplicationContext getApplicationContext() {
        if (appContext == null) {
            appContext = new AnnotationConfigApplicationContext(ContextConfig.class);
            //appContext = new AnnotationConfigApplicationContext("com.xx.xx.xxx","xx.xxx.xxxx.xxx.datamanager");
            logger.debug("Context Invoked !!");
        }
        return appContext;
    }
}
@Configuration
@EnableAutoConfiguration
@PropertySource("classpath:db.properties")
@EnableTransactionManagement
@ComponentScans(value = {
        @ComponentScan(basePackages = "xxxxx.datamanager"),
        @ComponentScan(basePackages = "com.xx.xx.xxx"),
        @ComponentScan(basePackages = "com.xx.xx.xxx.utils")})
public class AppConfig {

    @Autowired
    private Environment env;

    @Bean
    public DataSource getDataSource() {
        BasicDataSource dataSource = new BasicDataSource();
        dataSource.setDriverClassName(env.getProperty("db.driver"));
        dataSource.setUrl(env.getProperty("db.url"));
        return dataSource;
    }

    @Bean
    public LocalSessionFactoryBean getSessionFactory() {
        LocalSessionFactoryBean factoryBean = new LocalSessionFactoryBean();
        //LocalSessionFactoryBean sessionFactoryBean = new AnnotationSessionFactoryBean();
        factoryBean.setDataSource(getDataSource());
        Properties props = new Properties();
        props.put("hibernate.show_sql", env.getProperty("hibernate.show_sql"));
        props.put("hibernate.hbm2ddl.auto", env.getProperty("hibernate.hbm2ddl.auto"));
        props.put("hibernate.cache.region.factory_class", env.getProperty("hibernate.cache.region.factory_class"));
        factoryBean.setHibernateProperties(props);
        factoryBean.setAnnotatedClasses(xx.class, xxxx.class, xxxx.class, xxx.class);
        return factoryBean;
    }

    @Bean
    public HibernateTransactionManager getTransactionManager() {
        return transactionManager;
    }
}
// Here is where the NPE is thrown when using the auto-wired bean
@Component
public class Good extends Good11 {

    @Autowired
    private RxxxDyyyyHelper rdh;
    //RxxxDyyyyHelper rdh = ContextHelper.getApplicationContext().getBean(RxxxDyyyyHelper.class);

    rdh.setProperty(); // NPE here
    rdh.getProperty(); // NPE
}
// Here we're trying to initiate the LosUtils class
public class LosUtils {
    public static void main(String args[]) throws Exception {
        AnnotationConfigApplicationContext applicationContext = ContextHelper.getApplicationContext();
    }
}
It seems like you didn't put the full code here, because your Good class won't compile this way.
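For illustration only, a version of that class that would at least compile moves the calls into a method or constructor; a rough sketch (class and bean names are from the question, doWork() is made up):

@Component
public class Good extends Good11 {

    @Autowired
    private RxxxDyyyyHelper rdh;

    // statements have to live inside a method or constructor, not in the class body
    public void doWork() {
        rdh.setProperty();
        rdh.getProperty();
    }
}

Whether rdh is then non-null still depends on Good itself being created by the Spring container (via component scanning) rather than with new.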
I am trying to implement a custom deserializer.
Because I only want to add functionality to the default deserializer, I tried to store the default deserializer inside my custom one: I would like to use the default one to deserialize the JSON and then add other information.
I am trying to use BeanDeserializerModifier to register the custom deserializer.
SimpleModule module = new SimpleModule("ModelModule", Version.unknownVersion());
module.setDeserializerModifier(new BeanDeserializerModifier() {
    @Override
    public JsonDeserializer<?> modifyDeserializer(DeserializationConfig config, BeanDescription beanDesc, JsonDeserializer<?> deserializer) {
        JsonDeserializer<?> configuredDeserializer = super.modifyDeserializer(config, beanDesc, deserializer);
        if (Document.class.isAssignableFrom(beanDesc.getBeanClass())) {
            logger.debug("Returning custom deserializer for documents");
            configuredDeserializer = new DocumentDeserializer(configuredDeserializer, (Class<Document>) beanDesc.getBeanClass());
        }
        return configuredDeserializer;
    }
});
As you can see, if the object to generate is a "Document", I am modifying the deserializer returning a custom deserializer. I am passing the default deserializer to the constructor so I can use it later.
When I try to deserialize, Jackson fails with the error:
No _valueDeserializer assigned(..)
I have investigated and it seems that the default deserializer does not have the correct deserializers for its properties: for all the properties, it is using the deserializer FailingDeserializer that, of course, fails and returns the error mentioned above. This deserializer is supposed to be substituted but it is not.
It seems that, after calling the method modifyDeserializer, Jackson completes the configuration.
The custom deserializer that I am using is:
@SuppressWarnings("serial")
public class DocumentDeserializer extends StdDeserializer<Document> {

    private JsonDeserializer<?> defaultDeserializer;

    private DocumentDeserializer(JsonDeserializer<?> defaultDeserializer, Class<? extends Document> clazz) {
        super(clazz);
        this.defaultDeserializer = defaultDeserializer;
    }

    @Override
    public Document deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException, JsonProcessingException {
        Document documentDeserialized = (Document) defaultDeserializer.deserialize(jp, ctxt);
        /* I want to modify the documentDeserialized before returning it */
        return documentDeserialized;
    }
}
UPDATE:
I solved the problem using a different Deserializer:
public class CustomDeserializerModifier extends BeanDeserializerModifier {

    private static final Logger logger = Logger.getLogger(CustomDeserializerModifier.class);

    // factory used to create the target documents
    private final Factory factory;

    public CustomDeserializerModifier(Factory factory) {
        this.factory = factory;
    }

    @Override
    public JsonDeserializer<?> modifyDeserializer(DeserializationConfig config, BeanDescription beanDesc, JsonDeserializer<?> deserializer) {
        JsonDeserializer<?> configuredDeserializer;
        if (CustomDeserializedNode.class.isAssignableFrom(beanDesc.getBeanClass())) {
            Converter<Object, Object> conv = beanDesc.findDeserializationConverter();
            JavaType delegateType = conv.getInputType(config.getTypeFactory());
            configuredDeserializer = new CustomDeserializedNodeDeserializer(conv, delegateType, (JsonDeserializer<Document>) deserializer,
                    (Class<? extends CustomDocument<?>>) beanDesc.getBeanClass());
        } else {
            configuredDeserializer = super.modifyDeserializer(config, beanDesc, deserializer);
        }
        return configuredDeserializer;
    }

    @SuppressWarnings("serial")
    public class CustomDeserializedNodeDeserializer extends StdDelegatingDeserializer<Object> {

        private Class<? extends CustomDocument<?>> beanClass;

        public CustomDeserializedNodeDeserializer(Converter<Object, Object> converter,
                JavaType delegateType, JsonDeserializer<Document> delegateDeserializer, Class<? extends CustomDocument<?>> beanClass) {
            super(converter, delegateType, delegateDeserializer);
            this.beanClass = beanClass;
        }

        @Override
        public CustomDeserializedNode deserialize(JsonParser jp, DeserializationContext ctxt)
                throws IOException, JsonProcessingException {
            CustomDeserializedNode node = (CustomDeserializedNode) factory.createCustomDocument(beanClass);
            CustomDeserializedNode documentDeserialized = (CustomDeserializedNode) super.deserialize(jp, ctxt, node);
            return documentDeserialized;
        }
    }
}
Probably extending StdDelegatingDeserializer does what @StaxMan is suggesting.
This should be added in a FAQ, but what you need to do is to implement 2 interfaces:
ResolvableDeserializer (method resolve(...))
ContextualDeserializer (method createContextual(...))
and delegate these calls to defaultDeserializer in case it implements one or both interfaces. These are required for deserializer initialization; especially ContextualDeserializer through which property annotations are made available to deserializers.
And ResolvableDeserializer is used by BeanDeserializer to get deserializers for properties it has, if any; this is where _valueDeserializer in question is likely to be fetched.
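A minimal sketch of that delegation, based on the DocumentDeserializer from the question (Jackson 2.x assumed; Document is the question's type):

import java.io.IOException;

import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.BeanProperty;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.JsonMappingException;
import com.fasterxml.jackson.databind.deser.ContextualDeserializer;
import com.fasterxml.jackson.databind.deser.ResolvableDeserializer;
import com.fasterxml.jackson.databind.deser.std.StdDeserializer;

public class DocumentDeserializer extends StdDeserializer<Document>
        implements ResolvableDeserializer, ContextualDeserializer {

    private final JsonDeserializer<?> defaultDeserializer;

    public DocumentDeserializer(JsonDeserializer<?> defaultDeserializer, Class<? extends Document> clazz) {
        super(clazz);
        this.defaultDeserializer = defaultDeserializer;
    }

    @Override
    public Document deserialize(JsonParser jp, DeserializationContext ctxt) throws IOException {
        Document document = (Document) defaultDeserializer.deserialize(jp, ctxt);
        // modify document here before returning it
        return document;
    }

    // Lets the delegate resolve the deserializers for its own properties
    // (this is where the missing _valueDeserializer would otherwise come from).
    @Override
    public void resolve(DeserializationContext ctxt) throws JsonMappingException {
        if (defaultDeserializer instanceof ResolvableDeserializer) {
            ((ResolvableDeserializer) defaultDeserializer).resolve(ctxt);
        }
    }

    // Lets the delegate pick up per-property configuration and annotations.
    @Override
    @SuppressWarnings("unchecked")
    public JsonDeserializer<?> createContextual(DeserializationContext ctxt, BeanProperty property)
            throws JsonMappingException {
        if (defaultDeserializer instanceof ContextualDeserializer) {
            JsonDeserializer<?> contextual =
                    ((ContextualDeserializer) defaultDeserializer).createContextual(ctxt, property);
            return new DocumentDeserializer(contextual, (Class<? extends Document>) handledType());
        }
        return this;
    }
}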
I have two routes in two separate projects.
The first route sets a header with the data format bean name as a constant:
setHeader("dataFormatBeanName", constant("myFirstList"))
First route:
public class MyTest {

    @Configuration
    public static class MyTestConfig extends CamelConfiguration {

        @Bean(name = "myFirstList")
        public DataFormat getMyFirstListDataFormat() {
            return new MyFirstListDataFormat();
        }

        @Bean(name = "mySecondList")
        public DataFormat getMySecondListDataFormat() {
            return new MySecondListDataFormat();
        }

        @Bean
        public RouteBuilder route() {
            return new RouteBuilder() {
                @Override
                public void configure() throws Exception {
                    from("direct:testFirstDataFormat").setHeader("dataFormatBeanName", constant("myFirstList")).to("direct:myRoute");
                    from("direct:testSecondDataFormat").setHeader("dataFormatBeanName", constant("mySecondList")).to("direct:myRoute");
                }
            };
        }
    }
}
The second route is supposed to retrieve the bean name from the header and use it as a custom marshaller. Something like:
custom(header("dataFormatBeanName"))
(doesn't compile)
Does anyone know how I'm supposed to get the bean name from the header and use it in the custom() method?
@Component
public class MyRouteBuilder extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        final RouteDefinition routedefinition = this.from("direct:myRoute");
        routedefinition.marshal().custom(??????????).to("netty4:tcp://{{route.address}}:{{port}}?textline=true&sync=true");
    }
}
After a few more hours of searching, here is the solution I found:
No changes in the first class.
The second class uses an anonymous DataFormat in which I retrieve the bean name from the header and get the Spring bean from the Camel context before calling its marshal method.
The AbstractXxxDataFormat class belongs to project 2 and is inherited by the project 1 DataFormat.
@Override
public void configure() throws Exception {
    final RouteDefinition routedefinition = this.from("direct:myRoute");
    routedefinition.marshal(new DataFormat() {
        @Override
        public void marshal(final Exchange exchange, final Object graph, final OutputStream stream) throws Exception {
            AbstractXxxDataFormat myDataFormat = (AbstractXxxDataFormat) getContext().getRegistry().lookupByName(exchange.getIn().getHeader("dataFormatBeanName", String.class));
            myDataFormat.marshal(exchange, graph, stream);
        }

        @Override
        public Object unmarshal(final Exchange exchange, final InputStream stream) throws Exception {
            return null;
        }
    });
    routedefinition.to("netty4:tcp://{{route.address}}:{{port}}?textline=true&sync=true");
}
If there's any better solution available, I'll be interested.
Have you tried simple("${header.dataFormatBeanName}") to access the header?
Also, rather than passing the format bean name in a header, why not factor each .marshal() call out into two subroutes (one for formatBeanA and one for formatBeanB) and call the appropriate subroute instead of setting the header in the first place? I believe this could be a cleaner approach.
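A rough sketch of that subroute idea, reusing the data format bean names from the first route (the "direct:marshalFirst"/"direct:marshalSecond" endpoint names are made up for illustration):

from("direct:testFirstDataFormat").to("direct:marshalFirst");
from("direct:testSecondDataFormat").to("direct:marshalSecond");

from("direct:marshalFirst")
    .marshal().custom("myFirstList")
    .to("netty4:tcp://{{route.address}}:{{port}}?textline=true&sync=true");

from("direct:marshalSecond")
    .marshal().custom("mySecondList")
    .to("netty4:tcp://{{route.address}}:{{port}}?textline=true&sync=true");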
If you really need to get it in the route as a variable (as opposed to a predicate to be used in the builder API), you could use an inline processor to extract it:
public class MyRouteBuilder extends RouteBuilder {
    public void configure() throws Exception {
        from("someEndpoint")
            .process(new Processor() {
                public void process(Exchange exchange) throws Exception {
                    String beanName = exchange.getIn().getHeader("beanNameHeader", String.class);
                }
            });
    }
}
Just be careful of scope and concurrency when storing the extracted beanName however.
A colleague of mine (thanks to him) found the definitive solution:
Set the bean name in the exchange properties:
exchange.setProperty("myDataFormat", "myDataFormatAutowiredBean");
Retrieve the DataFormat bean with the recipient list pattern and (un)marshal:
routedefinition.recipientList(simple("dataformat:${property.myDataFormat}:marshal"));
routedefinition.recipientList(simple("dataformat:${property.myDataFormat}:unmarshal"));
Very concise and works just fine.
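For completeness, a sketch of how the two pieces might fit together (property name as above; the dataformat: endpoint resolves the DataFormat bean from the registry at runtime, and the exact route layout here is assumed, not taken from the answer):

// first route: choose the data format via an exchange property instead of a header
from("direct:testFirstDataFormat")
    .setProperty("myDataFormat", constant("myFirstList"))
    .to("direct:myRoute");

// second route: marshal with whatever bean the property names, then send over netty
from("direct:myRoute")
    .recipientList(simple("dataformat:${property.myDataFormat}:marshal"))
    .to("netty4:tcp://{{route.address}}:{{port}}?textline=true&sync=true");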
The problem I have involves a pretty complex class structure, but I managed to summarize the gist of it in the following simpler example. I need to be able to serialize an object of class MyItem (including the private property 'text') and subsequently deserialize it, without having a parameterless constructor available and without being able to create one, because that would totally mess up the current logic.
class MyCollection:
@XmlRootElement(name="collection")
public class MyCollection {

    public MyCollection() {
        this.items = new ArrayList<MyItem>();
    }

    @XmlElement(name="item")
    private List<MyItem> items;

    public void addItem(String text) {
        this.items.add(new MyItem(text));
    }
}
class MyItem:
public class MyItem {

    public MyItem(String text) {
        this.text = text;
    }

    @XmlAttribute
    private String text;
}
The first requirement (serialize MyItem including the private property) is met out of the box, and I get the following XML as a result:
<collection>
    <item text="FIRST"/>
    <item text="SECOND"/>
    <item text="THIRD"/>
</collection>
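For context, that output is presumably produced with a plain JAXB marshalling call along these lines (just a sketch, nothing custom involved yet):

MyCollection collection = new MyCollection();
collection.addItem("FIRST");
collection.addItem("SECOND");
collection.addItem("THIRD");

JAXBContext context = JAXBContext.newInstance(MyCollection.class);
Marshaller marshaller = context.createMarshaller();
marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);
marshaller.marshal(collection, System.out);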
In order to meet the second requirement I decorated class MyItem with the @XmlJavaTypeAdapter annotation
@XmlJavaTypeAdapter(MyItemAdapter.class)
public class MyItem {
...
and introduced classes AdaptedMyItem
public class AdaptedMyItem {

    private String text;

    public void setText(String text) { this.text = text; }

    @XmlAttribute
    public String getText() { return this.text; }
}
and MyItemAdapter
public class MyItemAdapter extends XmlAdapter<AdaptedMyItem, MyItem> {

    @Override
    public MyItem unmarshal(AdaptedMyItem adaptedMyItem) throws Exception {
        return new MyItem(adaptedMyItem.getText());
    }

    @Override
    public AdaptedMyItem marshal(MyItem item) throws Exception {
        AdaptedMyItem result = new AdaptedMyItem();
        result.setText("???"); // CANNOT USE item.getText()
        return result;
    }
}
but this is where I get stuck, because in the marshal method I cannot access MyItem.text, and so I cannot use the standard approach for dealing with immutable classes in JAXB.
Bottom line: I would like to use the class adapter mechanism only when deserializing (because I need to invoke a non-parameterless constructor) but not when serializing (because I cannot access private properties). Would that be possible?
I have the following interface
@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "className")
public interface InfoChartInformation {
    public String name();
}
And the following implementations (enums):
public class InfoChartSummary {

    public static enum Immobilien implements InfoChartInformation {
        CITY, CONSTRUCTION_DATE;
    }

    public static enum Cars implements InfoChartInformation {
        POWER, MILEAGE;
    }
}
Then I use all of it in the following entity:
@Entity(noClassnameStored = true)
@Converters(InfoChartInformationMorphiaConverter.class)
public class TestEntity {

    @Id
    public ObjectId id;

    @Embedded
    public List<InfoChartInformation> order;
}
Jackson, in order to detect the type at unmarshalling time, will add the className to every enum in the list.
I thought Morphia would do the same, but there is no className field in the list of enums, and the unmarshalling cannot be done correctly: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassCastException: java.lang.String cannot be cast to com.mongodb.DBObject
I guess the correct behavior would be to save the full enum path (package + name), not only the enum name; at least that way the unmarshalling could be performed. Is there a way Morphia supports that by default, or do I need to create my own converter (similar to this)?
I tried creating a custom converter:
public class InfoChartInformationMorphiaConverter extends TypeConverter {

    public InfoChartInformationMorphiaConverter() {
        super(InfoChartInformation.class);
    }

    @Override
    public Object decode(Class targetClass, Object fromDBObject, MappedField optionalExtraInfo) {
        if (fromDBObject == null) {
            return null;
        }
        String clazz = fromDBObject.toString().substring(0, fromDBObject.toString().lastIndexOf("."));
        String value = fromDBObject.toString().substring(fromDBObject.toString().lastIndexOf(".") + 1);
        try {
            return Enum.valueOf((Class) Class.forName(clazz), value);
        } catch (ClassNotFoundException e) {
            return null;
        }
    }

    @Override
    public Object encode(final Object value, final MappedField optionalExtraInfo) {
        return value.getClass().getName() + "." + ((InfoChartInformation) value).name();
    }
}
Then I registered the converter with Morphia: morphia.getMapper().getConverters().addConverter(new InfoChartInformationMorphiaConverter());
However, when serializing (or marshalling) the object to save it into the database, the custom converter is ignored and the enum is saved using the default Morphia converter (only the enum name).
If I use a single InfoChartInformation attribute in the TestEntity class instead of the List<InfoChartInformation>, my custom converter works. However, I need support for List.
Use:
public class InfoChartInformationMorphiaConverter extends TypeConverter implements SimpleValueConverter
It is a marker interface required to make your converter work.
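In other words, only the class declaration changes; a minimal sketch, keeping the encode() and decode() methods from the question unchanged:

public class InfoChartInformationMorphiaConverter extends TypeConverter implements SimpleValueConverter {

    public InfoChartInformationMorphiaConverter() {
        super(InfoChartInformation.class);
    }

    // encode() and decode() stay exactly as in the question
}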