How to do programmatic indexing with infinispan-13? - infinispan

How is it possible to define indexes for POJOs programmatically with Infinispan 13? According to the online documentation here:
the code hasn't changed. However, the classes it refers to no longer exist in Hibernate Search 6, which is what Infinispan 13 uses.
Note that annotating the POJO class to be indexed is not an option, because the class is used elsewhere by code that cannot have any dependency on Infinispan or Hibernate etc., so I need to pursue the programmatic route that used to work in Infinispan 11.
This is the code that currently works in infinispan-11:
EmbeddedCacheManager cacheManager = new DefaultCacheManager(new GlobalConfigurationBuilder().jmx().build());
SearchMapping mapping = new SearchMapping();
mapping.entity(MyData.class).indexed().property("expiry", ElementType.FIELD).field();
Properties properties = new Properties();
properties.put(Environment.MODEL_MAPPING, mapping);
properties.put("hibernate.search.default.indexBase", "/some/path");
Configuration dcc = cacheManager.getDefaultCacheConfiguration();
ConfigurationBuilder b = new ConfigurationBuilder();
if (dcc != null)
b = b.read(dcc);
b.indexing().addIndexedEntity(MyData.class).withProperties(properties);

I also use Infinispan 13.0.6. This configuration works just fine.
Here is my configuration:
ConfigurationBuilder builder = new ConfigurationBuilder();
Properties p = new Properties();
try(Reader r = new FileReader("/srv/ws-emporium/hotrod-client.properties")) {
p.load(r);
builder.withProperties(p);
} catch (IOException e) {
// FileNotFoundException is a subclass of IOException, so a single catch covers both
e.printStackTrace();
}
remoteCacheManager = new RemoteCacheManager(builder.build());
cache = remoteCacheManager.getCache("teste15Fev22");
My properties file content:
infinispan.client.hotrod.server_list = server1:11222;server2:11322
infinispan.client.hotrod.auth_username=admin
infinispan.client.hotrod.auth_password=password
infinispan.client.hotrod.connection_pool.max_active = 10
infinispan.client.hotrod.connection_pool.exhausted_action = WAIT
infinispan.client.hotrod.connection_pool.max_wait = 1
infinispan.client.hotrod.connection_pool.min_idle = 20
infinispan.client.hotrod.connection_pool.min_evictable_idle_time = 300000
infinispan.client.hotrod.connection_pool.max_pending_requests = 20
infinispan.client.hotrod.marshaller=org.infinispan.commons.marshall.JavaSerializationMarshaller
infinispan.client.hotrod.java_serial_allowlist=.*
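For what it's worth, a minimal usage sketch against the cache configured above (SomeData is a hypothetical value class; with JavaSerializationMarshaller it must implement java.io.Serializable and match the java_serial_allowlist pattern):
import org.infinispan.client.hotrod.RemoteCache;

// SomeData is any Serializable value class allowed by the java_serial_allowlist above.
RemoteCache<String, SomeData> remote = remoteCacheManager.getCache("teste15Fev22");
remote.put("key1", new SomeData());
SomeData value = remote.get("key1");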

Related

throwing exception inside the java 8 stream foreach

I am using a Java 8 stream and I cannot throw exceptions inside the forEach of the stream.
stream.forEach(m -> {
try {
if (isInitial) {
isInitial = false;
String outputName = new SimpleDateFormat(Constants.HMDBConstants.HMDB_SDF_FILE_NAME).format(new Date());
if (location.endsWith(Constants.LOCATION_SEPARATOR)) {
savedPath = location + outputName;
} else {
savedPath = location + Constants.LOCATION_SEPARATOR + outputName;
}
File output = new File(savedPath);
FileWriter fileWriter = null;
fileWriter = new FileWriter(output);
writer = new SDFWriter(fileWriter);
}
writer.write(m);
} catch (IOException e) {
throw new ChemIDException(e.getMessage(),e);
}
});
and this is my exception class
public class ChemIDException extends Exception {
public ChemIDException(String message, Exception e) {
super(message, e);
}
}
I am using loggers to log the errors at an upper level, so I want to propagate the exception to the top. Thanks
Try extending RuntimeException instead. The lambda passed to forEach implements Consumer.accept, which does not declare any checked exceptions, so you need an exception type that can be thrown unchecked at runtime.
WARNING: THIS IS PROBABLY NOT A VERY GOOD IDEA
But it will probably work.
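For illustration, a minimal sketch of that approach; ChemIDRuntimeException is a hypothetical unchecked variant of the exception class above:
// Unchecked exception, so the lambda passed to forEach can throw it freely.
public class ChemIDRuntimeException extends RuntimeException {
    public ChemIDRuntimeException(String message, Exception cause) {
        super(message, cause);
    }
}

stream.forEach(m -> {
    try {
        writer.write(m);
    } catch (IOException e) {
        // No throws clause is needed because the exception is unchecked.
        throw new ChemIDRuntimeException(e.getMessage(), e);
    }
});
The caller can then catch it (or inspect its cause) around the terminal forEach call.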
Why are you using forEach, a method designed to process every element, when all you want to do, is to process the first element? Instead of realizing that forEach is the wrong method for the job (or that there are more methods in the Stream API than forEach), you are kludging this with an isInitial flag.
Just consider:
Optional<String> o = stream.findFirst();
if(o.isPresent()) try {
String outputName = new SimpleDateFormat(Constants.HMDBConstants.HMDB_SDF_FILE_NAME)
.format(new Date());
if (location.endsWith(Constants.LOCATION_SEPARATOR)) {
savedPath = location + outputName;
} else {
savedPath = location + Constants.LOCATION_SEPARATOR + outputName;
}
File output = new File(savedPath);
FileWriter fileWriter = null;
fileWriter = new FileWriter(output);
writer = new SDFWriter(fileWriter);
writer.write(o.get());
} catch (IOException e) {
throw new ChemIDException(e.getMessage(),e);
}
which has no issues with exception handling. This example assumes that the Stream’s element type is String. Otherwise, you have to adapt the Optional<String> type.
If, however, your isInitial flag is supposed to change more than once during the stream processing, you are definitely using the wrong tool for your job. You should have read and understood the “Stateless behaviors” and “Side-effects” sections of the Stream API documentation, as well as the “Non-interference” section, before using Streams. Just converting loops to forEach invocations on a Stream doesn’t improve the code.

JAX-RS 2.0 MULTIPART_FORM_DATA file upload not library specific

I need to create a JAX-RS 2.0 client that posts a file and a couple of parameters using the MULTIPART_FORM_DATA content type. (I don't need the service, just the client.) I've seen some examples that depend on a specific implementation, like Jersey or RESTEasy, but I'd like not to bind my code to any of them... in particular, to Apache CXF (I am using WAS Liberty Profile). Any ideas on how to do it? Do I have to stick to some specific classes? If so, how can I do it using Apache CXF 3.0 (Liberty uses CXF for JAX-RS 2.0)?
Thanks
[I currently cannot comment under the already written answer]
If someone is searching for the maven dependency of IMultipartBody from the answer of Anatoly:
<dependency>
<groupId>com.ibm.websphere.appserver.api</groupId>
<artifactId>com.ibm.websphere.appserver.api.jaxrs20</artifactId>
<version>1.0.39</version>
<scope>provided</scope>
</dependency>
Thanks to andymc12 from https://github.com/OpenLiberty/open-liberty/issues/11942#issuecomment-619996093
You can use this example of how to implement it using the JAX-RS 2.0 feature: https://www.ibm.com/support/knowledgecenter/SSD28V_8.5.5/com.ibm.websphere.wlp.nd.doc/ae/twlp_jaxrs_multipart_formdata_from_html.html This is an almost-working example (some statements need to be wrapped in try-catch blocks, but you'll see that once you paste it into your IDE).
package com.example.jaxrs;
@POST
@Consumes("multipart/form-data")
@Produces("multipart/form-data")
public Response postFormData(IMultipartBody multipartBody) {
List <IAttachment> attachments = multipartBody.getAllAttachments();
String formElementValue = null;
InputStream stream = null;
for (Iterator<IAttachment> it = attachments.iterator(); it.hasNext();) {
IAttachment attachment = it.next();
if (attachment == null) {
continue;
}
DataHandler dataHandler = attachment.getDataHandler();
stream = dataHandler.getInputStream();
MultivaluedMap<String, String> map = attachment.getHeaders();
String fileName = null;
String formElementName = null;
String[] contentDisposition = map.getFirst("Content-Disposition").split(";");
for (String tempName : contentDisposition) {
String[] names = tempName.split("=");
formElementName = names[1].trim().replaceAll("\"", "");
if ((tempName.trim().startsWith("filename"))) {
fileName = formElementName;
}
}
if (fileName == null) {
StringBuffer sb = new StringBuffer();
BufferedReader br = new BufferedReader(new InputStreamReader(stream));
String line = null;
try {
while ((line = br.readLine()) != null) {
sb.append(line);
}
} catch (IOException e) {
e.printStackTrace();
} finally {
if (br != null) {
try {
br.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
formElementValue = sb.toString();
System.out.println(formElementName + ":" + formElementValue);
} else {
//handle the file as you want
File tempFile = new File(fileName);
...
}
}
if (stream != null) {
stream.close();
}
return Response.ok("test").build();
}

Drools marshall/unmarshall not save global

I have a test that unmarshalls a KieSession with data in a global, but the unmarshall does not return the global.
The code is this:
Java Test
KieServices kieServices = KieServices.Factory.get();
KieContainer kContainer = kieServices.getKieClasspathContainer();
KieBase kBase1 = kContainer.getKieBase("KBase1");
KieSession kieSession1 = kContainer.newKieSession("KSession2_1");
Map<String, Object> map = new ConcurrentHashMap<String, Object>();
int tam = 10000;
for (int i = 0; i < tam; i++) {
map.put("map" + i, i);
}
kieSession1.setGlobal("map", map);
for (int i = 0; i < tam; i++) {
Client client = new Client();
client.setName("test");
client.setEdad(10);
kieSession1.insert(client);
}
kieSession1.fireAllRules();
Marshaller marshaller = MarshallerFactory.newMarshaller(kBase1);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
try {
marshaller.marshall(baos, kieSession1);
} catch (IOException e) {
fail("error");
}
byte[] data = baos.toByteArray();
try {
baos.close();
} catch (IOException e) {
fail("error");
}
//
kieSession1.dispose();
InputStream is = new ByteArrayInputStream(data);
try {
kieSession1 = marshaller.unmarshall(is);
} catch (ClassNotFoundException e) {
fail("error");
} catch (IOException e) {
fail("error : " + e);
} finally {
try {
is.close();
} catch (IOException e) {
}
}
assertEquals(tam, kieSession1.getFactCount());
assertNotNull("No existe Global !!", kieSession1.getGlobal("map"));
Drools Rule
global java.util.Map map
rule "test"
when
then
System.out.println("test !!" + map.size());
end
The versions are:
org.drools:drools-compiler:jar:6.1.0.Final
org.drools:drools-core:jar:6.1.0.Final
org.kie:kie-api:jar:6.1.0.Final.1.2
org.kie:kie-internal:jar:6.1.0.Final
Globals are not inserted into the Working Memory, consequently they are not saved with the KieSession's state.
Globals have to be inserted every time you restore KieSession's state.
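For example, in the test above, re-registering the global right after unmarshalling (inside the existing try block) would make the assertion pass; a minimal sketch:
kieSession1 = marshaller.unmarshall(is);
// Globals are not part of the marshalled state, so register them again
// on the restored session before asserting on them.
kieSession1.setGlobal("map", map);
assertNotNull("No existe Global !!", kieSession1.getGlobal("map"));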
Just ran into this awesome behaviour, so here's a solution for loading:
You can initialise your globals BEFORE loading the session by registering a resolver with the environment:
Environment environment = kieServices.getEnvironment();
MapGlobalResolver resolver = new MapGlobalResolver(droolsProvider.globals());
environment.set(EnvironmentName.GLOBALS, resolver);
The MapGlobalResolver is the default resolver anyway. By using this approach, the resolver will be pre-initialised with the correct globals. Personally, I am thinking of writing an InjectionResolver so that Guice will inject globals on demand, but that might not be for everyone's needs.
Then loading is as simple as passing the correct environment in:
KieSession loadedKieSession = kieServices.getKieService().getStoreServices().loadKieSession(session.getId(), kieBase, ksConf, environment);
Where the other arguments are the corresponding configuration objects needed to set up the Environment.

Kafka Serialization of an object [duplicate]

This question already has answers here:
Writing Custom Kafka Serializer
(3 answers)
Closed 2 years ago.
I started playing with Kafka. I've set up a ZooKeeper configuration, and I managed to send and consume String messages.
Now I am trying to pass an object (in Java), but for some reason, when parsing the message in the consumer I have header issues. I tried several serialization options (using Decoder/Encoder), and all of them return the same header issue.
Here is my code
The producer:
Properties props = new Properties();
props.put("zk.connect", "localhost:2181");
props.put("serializer.class", "com.inneractive.reporter.kafka.EventsDataSerializer");
ProducerConfig config = new ProducerConfig(props);
Producer<Long, EventDetails> producer = new Producer<Long, EventDetails>(config);
ProducerData<Long, EventDetails> data = new ProducerData<Long, EventDetails>("test3", 1, Arrays.asList(new EventDetails()));
try {
producer.send(data);
} finally {
producer.close();
}
And the consumer:
Properties props = new Properties();
props.put("zk.connect", "localhost:2181");
props.put("zk.connectiontimeout.ms", "1000000");
props.put("groupid", "test_group");
// Create the connection to the cluster
ConsumerConfig consumerConfig = new ConsumerConfig(props);
ConsumerConnector consumerConnector = Consumer.createJavaConsumerConnector(consumerConfig);
// create 4 partitions of the stream for topic “test”, to allow 4 threads to consume
Map<String, List<KafkaMessageStream<EventDetails>>> topicMessageStreams =
consumerConnector.createMessageStreams(ImmutableMap.of("test3", 4), new EventsDataSerializer());
List<KafkaMessageStream<EventDetails>> streams = topicMessageStreams.get("test3");
// create list of 4 threads to consume from each of the partitions
ExecutorService executor = Executors.newFixedThreadPool(4);
// consume the messages in the threads
for (final KafkaMessageStream<EventDetails> stream: streams) {
executor.submit(new Runnable() {
public void run() {
for(EventDetails event: stream) {
System.err.println("********** Got message" + event.toString());
}
}
});
}
and my Serializer:
public class EventsDataSerializer implements Encoder<EventDetails>, Decoder<EventDetails> {
public Message toMessage(EventDetails eventDetails) {
try {
ObjectMapper mapper = new ObjectMapper(new SmileFactory());
byte[] serialized = mapper.writeValueAsBytes(eventDetails);
return new Message(serialized);
} catch (IOException e) {
e.printStackTrace();
return null; // TODO
}
}
public EventDetails toEvent(Message message) {
EventDetails event = new EventDetails();
ObjectMapper mapper = new ObjectMapper(new SmileFactory());
try {
//TODO handle error
return mapper.readValue(message.payload().array(), EventDetails.class);
} catch (IOException e) {
e.printStackTrace();
return null;
}
}
}
And this is the error I get:
org.codehaus.jackson.JsonParseException: Input does not start with Smile format header (first byte = 0x0) and parser has REQUIRE_HEADER enabled: can not parse
at [Source: N/A; line: -1, column: -1]
When I worked with MessagePack and with plain writing to an ObjectOutputStream I got a similar header issue. I also tried adding the payload CRC32 to the message, but that didn't help either.
What am I doing wrong here?
Hm, I haven't run into the same header issue that you are encountering but my project wasn't compiling correctly when I didn't provide a VerifiableProperties constructor in my encoder/decoder. It seems strange that the missing constructor would corrupt Jackson's deserialization though.
Perhaps try splitting up your encoder and decoder and include the VerifiableProperties constructor in both; you shouldn't need to implement Decoder[T] for serialization. I was able to successfully implement json de/serialization using ObjectMapper following the format in this post.
Good luck!
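As a rough illustration of that suggestion, here is a sketch of a split encoder/decoder pair with VerifiableProperties constructors. It assumes the Kafka 0.8-style kafka.serializer.Encoder/Decoder interfaces (the question's code uses the older Message-based API), and the class names are illustrative:
import java.io.IOException;
import kafka.serializer.Decoder;
import kafka.serializer.Encoder;
import kafka.utils.VerifiableProperties;
import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.smile.SmileFactory;

// EventDetailsEncoder.java
// Kafka instantiates the encoder reflectively, so the VerifiableProperties constructor must exist.
public class EventDetailsEncoder implements Encoder<EventDetails> {
    private final ObjectMapper mapper = new ObjectMapper(new SmileFactory());
    public EventDetailsEncoder(VerifiableProperties props) { }
    public byte[] toBytes(EventDetails eventDetails) {
        try {
            return mapper.writeValueAsBytes(eventDetails);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}

// EventDetailsDecoder.java
// Kept separate from the encoder, with its own VerifiableProperties constructor.
public class EventDetailsDecoder implements Decoder<EventDetails> {
    private final ObjectMapper mapper = new ObjectMapper(new SmileFactory());
    public EventDetailsDecoder(VerifiableProperties props) { }
    public EventDetails fromBytes(byte[] bytes) {
        try {
            return mapper.readValue(bytes, EventDetails.class);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}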
ByteBuffer's .array() method is not very reliable; it depends on the particular implementation (the backing array may not exist, or the readable bytes may start at a non-zero offset within it). You might want to try:
ByteBuffer bb = message.payload();
byte[] b = new byte[bb.remaining()];
bb.get(b, 0, b.length);
return mapper.readValue(b, EventDetails.class);

Extract xml data from gzip file using apache tika?

I am working on a project in which I need to extract XML (sitemap) data from a gz file using Apache Tika [I am new to Tika].
The file name is something like sitemap01.xml.gz.
I can extract data from a normal text file or HTML, but I don't know how to extract the XML from the gz and then extract the metadata and data from the XML...
I have searched Google for the past two days.
Do I need to use a delegating parser in Tika to extract data from the XML?
Please guide me to some samples or articles....
Here is my try:
public void parseXml() throws IOException{
Metadata metadata = new Metadata();
ContentHandler handler = new BodyContentHandler();
Parser parser = new AutoDetectParser();
ParseContext context = new ParseContext();
InputStream stream =this.getClass().getResourceAsStream("sitemap.xml.gz");
try {
parser.parse(stream,handler,metadata,context);
for(int i = 0; i <metadata.names().length; i++) {
String name = metadata.names()[i];
System.out.println(name + " : " + metadata.get(name));
}
System.out.println(handler.toString());
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (SAXException e) {
// TODO Auto-generated catch block
e.printStackTrace();
} catch (TikaException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}finally{
if(stream!=null) {
stream.close();
}
}
}
The thing you're missing is setting a recursing parser on your ParseContext. You probably want something like:
Parser parser = new AutoDetectParser();
ParseContext context = new ParseContext();
context.set(Parser.class, parser);
parser.parse(....)
By setting a Parser on the ParseContext, you tell Tika to call that parser when it encounters embedded documents (such as the XML inside your GZip).
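Putting that together with the question's code, a minimal sketch (same Tika classes, with the ParseContext wiring added) might look like:
import java.io.InputStream;
import org.apache.tika.metadata.Metadata;
import org.apache.tika.parser.AutoDetectParser;
import org.apache.tika.parser.ParseContext;
import org.apache.tika.parser.Parser;
import org.apache.tika.sax.BodyContentHandler;
import org.xml.sax.ContentHandler;

public void parseSitemap() throws Exception {
    Metadata metadata = new Metadata();
    ContentHandler handler = new BodyContentHandler(-1); // -1 disables the default write limit
    Parser parser = new AutoDetectParser();
    ParseContext context = new ParseContext();
    // Re-use the same parser for embedded documents, i.e. the sitemap XML inside the GZip.
    context.set(Parser.class, parser);
    try (InputStream stream = getClass().getResourceAsStream("sitemap.xml.gz")) {
        parser.parse(stream, handler, metadata, context);
    }
    for (String name : metadata.names()) {
        System.out.println(name + " : " + metadata.get(name));
    }
    System.out.println(handler.toString());
}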
Here is how you can use the XML parser from Apache Tika for your case:
//detecting the file type
BodyContentHandler handler = new BodyContentHandler(-1);
Metadata metadata = new Metadata();
File inFile = new File("sitemap.xml.gz");
System.out.println(inFile.isFile());
FileInputStream inputstream = new FileInputStream(inFile);
ParseContext pcontext = new ParseContext();
//Xml parser
XMLParser xmlparser = new XMLParser();
xmlparser.parse(inputstream, handler, metadata, pcontext);
System.out.println(pcontext.toString());
System.out.println("Contents of the document:" + handler.toString());//this one contains all contents from xml files and tags are also removed
System.out.println("Metadata of the document:");
String[] metadataNames = metadata.names();
for(String name : metadataNames) {
System.out.println(name + ": " + metadata.get(name));