Migration of Hazelcast from 3.x.x to 5.x.x using the Java client causes a problem because of the absence of EntryBackupProcessor.java

What can I use instead of EntryBackupProcessor.java?
import java.io.Serializable;
import java.util.Map;

public interface EntryBackupProcessor<K, V, R> extends Serializable {
    void processBackup(Map.Entry<K, V> entry);
}
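For what it's worth, in Hazelcast 4.x/5.x the separate backup processor is gone and its role moved into com.hazelcast.map.EntryProcessor itself: you override getBackupProcessor(), which by default returns the processor so the same logic runs on the backup replicas. A minimal sketch under that assumption (IncrementProcessor and its key/value types are made up for illustration):
import java.util.Map;
import com.hazelcast.map.EntryProcessor;

public class IncrementProcessor implements EntryProcessor<String, Integer, Integer> {

    @Override
    public Integer process(Map.Entry<String, Integer> entry) {
        // runs on the partition owner
        int newValue = (entry.getValue() == null ? 0 : entry.getValue()) + 1;
        entry.setValue(newValue);
        return newValue;
    }

    @Override
    public EntryProcessor<String, Integer, Integer> getBackupProcessor() {
        // return this to apply the same change on backups, or null to skip them
        return this;
    }
}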

retrieving client IP address in quarkus

Just as the title says, I need help getting the client's IP address in a Quarkus resource. Any idea?
I already tried this, but it doesn't work:
import javax.enterprise.context.RequestScoped;
import javax.inject.Inject;
import javax.servlet.http.HttpServletRequest;
import javax.ws.rs.DELETE;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.container.ResourceContext;
import javax.ws.rs.core.Context;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import javax.ws.rs.core.Response.Status;

@Path("/auth")
@RequestScoped
public class AuthResource {

    @GET
    @Path("/getIpAddres")
    @Produces(MediaType.TEXT_PLAIN)
    public String getIpAddres(@Context HttpServletRequest request) {
        String ip = request.getRemoteAddr();
        return ip;
    }
}
Pretty simple. Instead of using @Context HttpServletRequest request, do this instead:
#Path("/auth")
#RequestScoped
public class AuthResource {
#Inject
RoutingContext context;
#GET
#Path("/getIpAddres")
#Produces(MediaType.TEXT_PLAIN)
public String getIpAddres(){
String ip = context.request().host();
return ip;
}
I faced this problem myself. The cool thing about RoutingContext is that it can also be injected into the service layer, and it will carry all of the request context (IP and so on), given that the service was called by the controller.
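As a rough sketch of what that looks like (the GreetingService name and method are made up; it assumes the bean is only called while a request is active, since RoutingContext is request-scoped):
import javax.enterprise.context.ApplicationScoped;
import javax.inject.Inject;
import io.vertx.ext.web.RoutingContext;

@ApplicationScoped
public class GreetingService {

    // Quarkus injects a proxy to the current request's RoutingContext
    @Inject
    RoutingContext context;

    public String callerAddress() {
        return context.request().remoteAddress().toString();
    }
}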
Hope this helps!
Just call the method remoteAddress()
#Path("/getIpAddres")
#Produces(MediaType.TEXT_PLAIN)
public String getIpAddres(#Context HttpServletRequest request){
String ip = request.remoteAddress().hostAddress();
return ip;
}

The import org.springframework.cloud.sleuth.Sampler cannot be resolved

I am working on Spring Boot and Cloud Sleuth, migrating from Spring Boot v1.4.1.RELEASE to Spring Boot 2.2.6.RELEASE.
When I upgraded the Maven dependencies, my code started breaking.
CustomSampler.java
import org.springframework.cloud.sleuth.Sampler;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import zipkin2.Span;

@Configuration
public class CustomSampler {

    @Bean
    public Sampler smartSampler() {
        return new Sampler() {
            @Override
            public boolean isSampled(Span span) {
                System.out.println("custom sampler used!");
                return true;
            }
        };
    }
}
I went through this link: https://github.com/spring-cloud/spring-cloud-sleuth/wiki/Spring-Cloud-Sleuth-2.0-Migration-Guide, but things are not clear.
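As far as I can tell from that guide, the Sleuth-specific Sampler was replaced by Brave's brave.sampler.Sampler, whose isSampled method takes a trace id rather than a Span. A rough sketch of the same bean against the Brave API (assuming Sleuth 2.x and Brave are on the classpath):
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import brave.sampler.Sampler;

@Configuration
public class CustomSampler {

    @Bean
    public Sampler smartSampler() {
        return new Sampler() {
            @Override
            public boolean isSampled(long traceId) {
                System.out.println("custom sampler used!");
                return true;
            }
        };
    }
}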

JAX-RS #EJB injection gives NullPointerException

I'm trying to deploy a war file to run on WildFly Swarm. When I make a GET request, a NullPointerException occurs because an injection fails and the reference is, obviously, null.
SomeDao.java
import java.util.List;
import java.util.UUID;
import javax.ejb.Local;

@Local
public interface SomeDao {
    public List<MyEntity> listAll();
    public void store(MyEntity entity);
}
SpecializedDao.java
import java.util.List;
import java.util.UUID;
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

@Stateless
public class SpecializedDao implements SomeDao {

    @PersistenceContext(unitName = "primary")
    protected EntityManager entityManager;

    public SpecializedDao() {}

    @Override
    public List<MyEntity> listAll() {
        return this.entityManager
            .createQuery("SELECT entity FROM MyEntity entity", MyEntity.class)
            .getResultList();
    }

    @Override
    public void store(MyEntity entity) {
        entityManager.getTransaction().begin();
        entityManager.persist(entity);
        entityManager.getTransaction().commit();
    }
}
Then, there's the endpoint where I need to inject a SpecializedDao instance.
MyEndpoint.java
import javax.annotation.PostConstruct;
import javax.ejb.EJB;
import javax.ejb.Stateless;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

@Stateless
@Path("/something")
public class MyEndpoint {

    @EJB
    private SomeDao dao;

    @GET
    @Path("/test")
    @Produces({MediaType.APPLICATION_JSON})
    public Response test() {
        MyEntity testEntity = new MyEntity("something", "something");
        dao.store(testEntity);
        return Response.ok("All done!").build();
    }
}
beans.xml
<beans xmlns="http://xmlns.jcp.org/xml/ns/javaee"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee http://xmlns.jcp.org/xml/ns/javaee/beans_1_1.xsd"
version="1.2" bean-discovery-mode="annotated">
</beans>
The NullPointerException is thrown at dao.store(testEntity), because dao references a null object. I'm pretty sure the persistence.xml file is correct because the EntityManager works in another test case, so I think the problem is with the injection.
Where did I do something wrong?
Other things you can check:
Is your beans.xml located in the correct folder (i.e. in WEB-INF in the case of a webapp/war)?
Did you include the Swarm CDI fraction/dependency (org.wildfly.swarm:cdi)?
I'm not a Swarm expert, but it may be that the CDI fraction only works when beans are injected "à la JSR-299", that is, using javax.inject's @Inject (rather than @EJB):
@Inject
private SomeDao dao;
Ultimately: try bean-discovery-mode="all" in beans.xml (...even though "annotated" seems correct).
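For reference, the questioner's beans.xml with the discovery mode switched would look like this (the version attribute is also set to 1.1 here to match the referenced beans_1_1.xsd):
<beans xmlns="http://xmlns.jcp.org/xml/ns/javaee"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee http://xmlns.jcp.org/xml/ns/javaee/beans_1_1.xsd"
       version="1.1" bean-discovery-mode="all">
</beans>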
Didn't you forget to put a beans.xml file (under the WEB-INF/META-INF folder)?

HSQLDB with JdbcTemplate, nothing is getting saved

For some reason, after making changes to my file-based HSQL database and shutting down the Java process, nothing seems to be saved in the database. I.e. I can rerun this program over and over without hitting the "table already exists" exception. What is going on?!
Main class:
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.jdbc.core.JdbcTemplate;
import java.io.IOException;
import java.sql.SQLException;

public class TestApp {
    public static void main(String[] args) throws IOException, SQLException, ClassNotFoundException {
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(DbConfig.class, TestDao.class);
        JdbcTemplate template = ctx.getBean(JdbcTemplate.class);
        TestDao dao = ctx.getBean(TestDao.class);
        dao.testTransactionality();
    }
}
Config:
import org.apache.commons.dbcp.BasicDataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;
import javax.sql.DataSource;

@Configuration
public class DbConfig {

    @Bean
    public DataSource getDataSource() {
        BasicDataSource ds = new BasicDataSource();
        ds.setDriverClassName("org.hsqldb.jdbcDriver");
        ds.setUrl("jdbc:hsqldb:file:databaseFiles/test/");
        ds.setUsername("sa");
        ds.setPassword("1");
        return ds;
    }

    @Bean
    JdbcTemplate getJdbcTemplate(DataSource ds) {
        return new JdbcTemplate(ds);
    }

    @Bean
    PlatformTransactionManager getTransactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }
}
DAO:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import org.springframework.transaction.annotation.Transactional;

@Repository
@EnableTransactionManagement
@Transactional
public class TestDao {

    @Autowired
    private JdbcTemplate template;

    @Transactional
    public void testTransactionality() {
        template.execute("create table LIBRARY (LIBRARY_ID INT, LIBRARY_TITLE VARCHAR(400))");
        template.execute("insert into library values (1, 'Library')");
    }
}
I have tried doing something similar with plain JDBC classes, as well as doing explicit commits; nothing seems to help. I am guessing it's an HSQLDB problem. Please help.
Your database URL is not quite right (shouldn't end with a slash). You should also change the write delay to 0 to see the changes:
ds.setUrl("jdbc:hsqldb:file:databaseFiles/test;hsqldb.write_delay_millis=0");

WELD-001408 Unsatisfied dependencies for type [Logger] with qualifiers [@Default] at injection point [[field] using Arquillian

I am running a basic Arquillian unit test, using the Greeter example from the Arquillian site. The only difference is that I am doing a log.debug in the greet(PrintStream to, String name) method in Greeter.java. I am using slf4j for logging.
Greeter.java
package org.arquillian.example;

import java.io.PrintStream;
import javax.inject.Inject;
import org.slf4j.Logger;

public class Greeter {

    @Inject
    private Logger log;

    public void greet(PrintStream to, String name) {
        log.debug("Greeter Testing");
        to.println(createGreeting(name));
    }

    public String createGreeting(String name) {
        return "Hello, " + name + "!";
    }
}
GreeterTest.java
package org.arquillian.example;

import javax.inject.Inject;
import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.asset.EmptyAsset;
import org.jboss.shrinkwrap.api.spec.JavaArchive;
import org.junit.Assert;
import org.junit.Test;
import org.junit.runner.RunWith;

@RunWith(Arquillian.class)
public class GreeterTest {

    @Inject
    Greeter greeter;

    @Deployment
    public static JavaArchive createDeployment() {
        return ShrinkWrap.create(JavaArchive.class)
            .addClass(Greeter.class)
            .addAsManifestResource(EmptyAsset.INSTANCE, "beans.xml");
    }

    @Test
    public void should_create_greeting() {
        Assert.assertEquals("Hello, Earthling!",
            greeter.createGreeting("Earthling"));
        greeter.greet(System.out, "Earthling");
    }
}
I am getting the WELD-001408 Unsatisfied dependencies for type [Logger] with qualifiers [@Default] at injection point [[field] @Inject private org.arquillian.example.Greeter.log] error when running the test. Can someone please help with this?
This is a CDI issue. You don't have a producer for your Logger in the first place.
Secondly, any such producer should be added to the ShrinkWrap deployment.
A producer for the Logger is usually written like this:
import javax.enterprise.inject.Produces;
import javax.enterprise.inject.spi.InjectionPoint;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class SLF4JProducer {

    @Produces
    public Logger producer(InjectionPoint ip) {
        return LoggerFactory.getLogger(
            ip.getMember().getDeclaringClass().getName());
    }
}
This producer receives an injection point and returns an SLF4J Logger instance. The instance has the same name as the class containing the injection point.
Also, change bean-discovery-mode in beans.xml to all:
bean-discovery-mode="all"
Instead of injecting Logger, it worked just fine for me when I used LoggerFactory.
private Logger log = LoggerFactory.getLogger(Greeter.class);
In my case I had to provide the injection programmatically.
Import:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
Initialization:
private Logger logger;

@Inject
public LoggingInterceptor() {
    logger = LoggerFactory.getLogger(LoggingInterceptor.class);
}