LettuceConnectionFactory with SentinelTopologyProvider - redis

I am using Lettuce through the Spring Boot API org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory.
My bean is configured as
@Bean
public LettuceConnectionFactory lettuceSentinelConnectionFactory() {
    RedisSentinelConfiguration sentinelConfig =
            new RedisSentinelConfiguration(mastername, new HashSet<>(sentinelnodes));
    return new LettuceConnectionFactory(sentinelConfig);
}
This works fine, except that it does not reconnect properly after topology changes, resulting in exceptions like:
io.lettuce.core.RedisCommandExecutionException: READONLY You can't write against a read only slave.
Some bug reports seem to confirm this behaviour.
With great interest I read that Lettuce implements a dynamic topology provider for Sentinel, SentinelTopologyProvider, which should fix the issue elegantly.
However, I am unable to make the LettuceConnectionFactory use this dynamic SentinelTopologyProvider, and googling did not turn up anything either.
Can anyone give a hint or some sample code on how to write this code?
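Since there does not seem to be a documented hook for this on LettuceConnectionFactory, one way to at least see SentinelTopologyProvider in action is to bypass the factory and use Lettuce's MasterSlave API directly, which wires in SentinelTopologyProvider internally when given a Sentinel URI. This is only a minimal sketch, not a LettuceConnectionFactory integration; the Sentinel host, port and master name are placeholders for your own setup.
import io.lettuce.core.ReadFrom;
import io.lettuce.core.RedisClient;
import io.lettuce.core.RedisURI;
import io.lettuce.core.codec.StringCodec;
import io.lettuce.core.masterslave.MasterSlave;
import io.lettuce.core.masterslave.StatefulRedisMasterSlaveConnection;

public class SentinelMasterSlaveSketch {

    public static void main(String[] args) {
        RedisClient client = RedisClient.create();

        // A Sentinel URI makes MasterSlave.connect() discover the topology via
        // SentinelTopologyProvider, so master/slave changes are picked up dynamically.
        RedisURI sentinelUri = RedisURI.Builder
                .sentinel("sentinel-host", 26379, "mymaster")
                .build();

        StatefulRedisMasterSlaveConnection<String, String> connection =
                MasterSlave.connect(client, StringCodec.UTF8, sentinelUri);
        connection.setReadFrom(ReadFrom.MASTER_PREFERRED);

        connection.sync().set("key", "value");

        connection.close();
        client.shutdown();
    }
}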

Related

Inconsistent Results in neo4j-ogm - Related to Session Scope?

We are developing a Spring Boot REST application using Spring Data Neo4j. Recently we upgraded to Spring Data Neo4j 4.2 along with OGM 2.1.1, using the embedded driver.
In our application we provide some GET operations in which we build some object structures from nodes fetched from Neo4j.
As soon as we process multiple requests in parallel, we are facing inconsistent results, i.e., our object structures have a different size on each request.
We are not really sure about the reasons for this inconsistent behavior, but it is probably related to the session handling in OGM. We know that Sessions are not thread safe, but we have no idea how to deal with this issue in SD 4.2. Before 4.2 we changed the session scope to prototype when defining the session bean, but the configuration changed in SD 4.2.
Configuration before 4.2
@Bean
@Scope(value = "prototype", proxyMode = ScopedProxyMode.TARGET_CLASS)
public Session getSession() throws Exception {
    return; // method body elided in the original post
}
We could narrow the source of our problems to the place where we are loading elements from Neo4j via a Repository class:
repository.findOne(id,-1);
If we place this call in a synchronized block, no inconsistencies occur.
synchronized (this) {
    repository.findOne(id, -1);
}
We are probably missing some important point in using SD Neo4j 4.2/OGM, but could not find any useful information in the documentation or on the web.
Is it still possible/necessary to change the session scope in SD 4.2?
This is a bug in the OGM. See: https://jira.spring.io/browse/DATAGRAPH-951
We hope to have a fix for this in the next version (OGM 2.1.2).
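For completeness: a prototype-scoped Session bean can still be declared in SD Neo4j 4.2 by delegating to the OGM SessionFactory. The sketch below assumes a SessionFactory bean is already configured elsewhere, and it is not a confirmed fix for the repository-level inconsistency tracked in the issue above.
import org.neo4j.ogm.session.Session;
import org.neo4j.ogm.session.SessionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Scope;
import org.springframework.context.annotation.ScopedProxyMode;

@Configuration
public class Neo4jSessionConfig {

    @Bean
    @Scope(value = "prototype", proxyMode = ScopedProxyMode.TARGET_CLASS)
    public Session getSession(SessionFactory sessionFactory) {
        // Each injection point gets its own Session instance.
        return sessionFactory.openSession();
    }
}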

Camel sql component cron schedule customization

As far as I can tell, the Camel sql component does not support cron expressions, only fixed delays and the like. I have checked the source code of the component, but I could not find an easy way to customize it. Is there any other way to do this, or should I extend the whole component, endpoint, consumer and producer to make it work?
Thanks
See the documentation about the polling consumer (http://camel.apache.org/polling-consumer.html), in the section further down about scheduled poll consumers.
You can configure it to use a different scheduler, such as spring or quartz2, which has cron capabilities.
I blogged about how to do this: http://www.davsclaus.com/2013/08/apache-camel-212-even-easier-cron.html. It should work with the sql component as well.
I second @Neron's comment above. I believe this is a bug in the camel-sql component. I am currently using version 2.16.2, but I don't see any changes in a higher version that would have resolved this.
For those interested, you can work around this by creating a subclass of SqlComponent like this:
import java.util.Map;

import org.apache.camel.Endpoint;
import org.apache.camel.component.sql.SqlComponent;

public class SQLComponentPatched extends SqlComponent {

    @Override
    protected Endpoint createEndpoint(String uri, String remaining, Map<String, Object> parameters) throws Exception {
        Endpoint endpoint = super.createEndpoint(uri, remaining, parameters);
        // Apply the remaining URI parameters (e.g. scheduler options) to the endpoint.
        setProperties(endpoint, parameters);
        return endpoint;
    }
}
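As a usage sketch (assuming camel-quartz2 is on the classpath and a DataSource is available; the query, cron expression and route are placeholders), the patched component can be registered and combined with the scheduler options from the polling consumer documentation:
import javax.sql.DataSource;

import org.apache.camel.builder.RouteBuilder;

public class CronSqlRoute extends RouteBuilder {

    private final DataSource dataSource;

    public CronSqlRoute(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    @Override
    public void configure() throws Exception {
        // Register the patched component under its own scheme so the scheduler
        // options on the endpoint URI are actually applied.
        SQLComponentPatched sqlPatched = new SQLComponentPatched();
        sqlPatched.setDataSource(dataSource);
        getContext().addComponent("sql-patched", sqlPatched);

        // Poll every 5 minutes using the quartz2 scheduler's cron support.
        from("sql-patched:select * from orders?scheduler=quartz2&scheduler.cron=0+0/5+*+*+*+?")
            .to("log:polled-rows");
    }
}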

Need simple code: Get Glassfish InitialContext & lookup ConnectionFactory

My glassfish4 server runs on localhost.
My java client is a simple GUI run from Eclipse.
My glassfish4 admin console assures me I have a Resources/JMS Resources/Connection Factories/jms/goConnectionFactory properly configured.
So I thought it would be trivial for the client to get an InitialContext from GlassFish and use it to look up "jms/goConnectionFactory". Not so.
And after reading answered questions (all too wrapped up in other special issues to be helpful) I still can't connect.
Can anyone tell me the properties I need to load to make this work?
Properties prop = new Properties();
prop.setProperty(Context.INITIAL_CONTEXT_FACTORY,
        "com.sun.enterprise.naming.impl.SerialInitContextFactory");
prop.setProperty(Context.URL_PKG_PREFIXES,
        "com.sun.enterprise.naming");
prop.setProperty("org.omg.CORBA.ORBInitialPort", "3700");
prop.setProperty("org.omg.CORBA.ORBInitialHost", "localhost");
System.out.println(jndiContext.getClass().getName());
InitialContext jndiContext = new InitialContext(prop);
ConnectionFactory factory =
        (ConnectionFactory) jndiContext.lookup("jms/goConnectionFactory");
The first println() prints "javax.naming.InitialContext" to the console.
But a try/catch block (not shown) around the lookup() catches a NullPointerException.
And I don't know how to tell whether my InitialContext object is a valid one from my glassfish4 server, or whether I'm using it incorrectly.
Anyway, a GlassFish server and a Java GUI client (run from Eclipse), both running on the same machine, seems like a pretty generic arrangement. So I'm sure it would help all newbies to JMS to have just one trivial snippet of code that proves it works, without having to spend a week hunting to no avail.
Note: I also tried
factory = (ConnectionFactory) PortableRemoteObject.narrow(
        jndiContext.lookup("jms/goConnectionFactory"),
        ConnectionFactory.class);
But I'm pretty sure I won't need that till I try to get my local client to talk to my glassfish4 running on a virtual server elsewhere.
I also tried "localhost:8080" instead of just "localhost" in the properties above. Same result: I get some kind of InitialContext object, but it won't retrieve a ConnectionFactory.
Can anyone offer any help?
...
System.out.println(jndiContext.getClass().getName());
InitialContext jndiContext = new InitialContext(prop);
...
Isn't the first line the cause of your null pointer? The jndiContext is not yet initialized at that point.
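For reference, here is the same snippet from the question with the two statements reordered so the context is created before it is used; the host, port and JNDI name are taken from the question and may need adjusting for your setup.
Properties prop = new Properties();
prop.setProperty(Context.INITIAL_CONTEXT_FACTORY,
        "com.sun.enterprise.naming.impl.SerialInitContextFactory");
prop.setProperty(Context.URL_PKG_PREFIXES, "com.sun.enterprise.naming");
prop.setProperty("org.omg.CORBA.ORBInitialHost", "localhost");
prop.setProperty("org.omg.CORBA.ORBInitialPort", "3700");

// Create the context first, then use it.
InitialContext jndiContext = new InitialContext(prop);
System.out.println(jndiContext.getClass().getName());

ConnectionFactory factory =
        (ConnectionFactory) jndiContext.lookup("jms/goConnectionFactory");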

How to setup Spring Data Solr with EmbeddedSolrServer and multicore support?

I'm using Spring Data Solr to implement the search module in my project. To enable multicore support, I simply instantiate a HttpSolrServer and then declare a Java-based Spring configuration class with @EnableSolrRepositories(multicoreSupport=true). Everything works perfectly, until I try to write integration tests for the Solr-related code and schema.
I want to use EmbeddedSolrServer for testing so that the tests can run without depending on an external Solr server, but I can't find a way to configure it correctly. Please advise.
This cannot currently be done directly, due to DATASOLR-203.
Once the issue mentioned above is resolved you can do it as follows:
@Configuration
@EnableSolrRepositories(multicoreSupport = true)
static class SolrConfiguration {

    @Bean
    SolrServer solrServer() throws FileNotFoundException {
        String solrHome = ResourceUtils.getURL("classpath:your/path/here").getPath();
        CoreContainer container = CoreContainer.createAndLoad(solrHome, new File(solrHome + "/solr.xml"));
        return new EmbeddedSolrServer(container, null);
    }
}

ServiceLoader issue in WebLogic12c

I have been trying to refactor our Activiti implementation to use CDI but ran into a number of problems. I've spent way too much time trying to resolve this already, but I just can't let it go... I think I've pinned the problem down now: I set up a clean, structured WAR without involving Activiti and have been able to reproduce what I think is the main problem.
Basically I have jar1 and jar2, both CDI-enabled by including META-INF/beans.xml. Both jars specify a class in META-INF/services/test.TheTest pointing to an implementation local to the respective jar. jar1 depends on jar2. Also, both jars register an implementation of javax.enterprise.inject.spi.Extension, triggering the scenario. In each implementation of Extension, I have a method like:
public void afterDeploymentValidation(
        @Observes AfterDeploymentValidation event, BeanManager beanManager) {
    System.out.println("In jar1 extension");
    ServiceLoader<TheTest> loader = ServiceLoader.load(TheTest.class);
    Iterator<TheTest> serviceIterator = loader.iterator();
    List<TheTest> discoveredLookups = new ArrayList<TheTest>();
    while (serviceIterator.hasNext()) {
        TheTest serviceInstance = (TheTest) serviceIterator.next();
        discoveredLookups.add(serviceInstance);
        System.out.println(serviceInstance.getClass().getName());
    }
}
Now, my problem is that the ServiceLoader does not see any implementations in either case when running on WebLogic 12c. The same code works perfectly fine in both JBoss 7.1.1 and GlassFish, listing both implementations of the test.TheTest interface.
Is it fair to assume that this is indeed a problem in WebLogic 12c, or am I doing something wrong? Please bear in mind that I am simply trying to emulate the production setup we use when incorporating Activiti.
Regards,
/Petter
There is a Classloader Analysis Tool provided with WLS; have you checked whether it helps with diagnosing your issue?
You can access this tool by going to ip:port/wls-cat/index.jsp
Where port will be the port of the managed server where your application is deployed.
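Separately from the tool above, one thing that can be worth ruling out with ServiceLoader under WebLogic's filtering class loaders (this is an assumption, not a confirmed fix for this case) is which class loader is used for discovery. ServiceLoader.load(Class) relies on the thread context class loader, which may not be the application class loader during extension callbacks; passing a class loader explicitly makes the choice visible:
// Sketch: load services with the class loader that loaded the interface,
// instead of the thread context class loader used by ServiceLoader.load(Class).
ServiceLoader<TheTest> loader =
        ServiceLoader.load(TheTest.class, TheTest.class.getClassLoader());
for (TheTest serviceInstance : loader) {
    System.out.println(serviceInstance.getClass().getName());
}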