Apache Ignite Exception - Failed to initialize cache store (data source is not provided)

I am trying to implement a persistent store for my Ignite cache using CacheJdbcPojoStoreFactory. My cache store factory initialization looks like this:
@Autowired
DataSource dataSource;

@Bean
public CacheJdbcPojoStoreFactory<?, ?> cacheJdbcdPojoStorefactory() {
    CacheJdbcPojoStoreFactory<?, ?> factory = new CacheJdbcPojoStoreFactory<>();
    factory.setDataSource(dataSource);
    return factory;
}
My cache creation looks like this:
CacheConfiguration personConfig = new CacheConfiguration();
personConfig.setName("personCache");
cacheJdbcdPojoStorefactory.setTypes(jdbcTypes.toArray(new JdbcType[jdbcTypes.size()]));
Collection<QueryEntity> qryEntities = new ArrayList<>();
qryEntities.add(qryEntity);
personConfig.setQueryEntities(qryEntities);
personConfig.setCacheStoreFactory((Factory<? extends CacheStore<Integer, Person>>) cacheJdbcdPojoStorefactory);
ROCCache<Integer, Person> personCache = rocCachemanager.createCache(personConfig);
personCache.put(1, p1);
personCache.put(2, p2);
(I am passing correct QueryEntities and JdbcTypes; for simplicity I have not shown that code here.)
But when I run this code I get the stack trace below:
Failed to initialize cache store (data source is not provided).
at org.apache.ignite.internal.util.IgniteUtils.startLifecycleAware(IgniteUtils.java:8385)
at org.apache.ignite.internal.processors.cache.GridCacheProcessor.createCache(GridCacheProcessor.java:1269)
at org.apache.ignite.internal.processors.cache.GridCacheProcessor.prepareCacheStart(GridCacheProcessor.java:1638)
at org.apache.ignite.internal.processors.cache.GridCacheProcessor.prepareCachesStart(GridCacheProcessor.java:1563)
at org.apache.ignite.internal.processors.cache.distributed.dht.preloader.GridDhtPartitionsExchangeFuture.startCaches(GridDhtPartitionsExchangeFuture.java:944)
at org.apache.ignite.internal.processors.cache.distributed.dht.preloader.GridDhtPartitionsExchangeFuture.init(GridDhtPartitionsExchangeFuture.java:511)
at org.apache.ignite.internal.processors.cache.GridCachePartitionExchangeManager$ExchangeWorker.body(GridCachePartitionExchangeManager.java:1297)
at org.apache.ignite.internal.util.worker.GridWorker.run(GridWorker.java:110)
at java.lang.Thread.run(Thread.java:745)
Caused by: class org.apache.ignite.IgniteException: Failed to initialize cache store (data source is not provided).
at org.apache.ignite.cache.store.jdbc.CacheAbstractJdbcStore.start(CacheAbstractJdbcStore.java:297)
at org.apache.ignite.internal.util.IgniteUtils.startLifecycleAware(IgniteUtils.java:8381)
... 8 more
When I debug, I can see that the data source parameters are correctly set inside the cacheJdbcdPojoStorefactory object. Where am I going wrong?

Instead of wiring the data source bean and setting it on the factory, you can provide its bean ID, and the factory will fetch it from the application context. Here is an example:
@Bean
public CacheJdbcPojoStoreFactory<?, ?> cacheJdbcdPojoStorefactory() {
    CacheJdbcPojoStoreFactory<?, ?> factory = new CacheJdbcPojoStoreFactory<>();
    factory.setDataSourceBean("data-source-bean");
    return factory;
}
The issue is that the factory is serialized when the cache starts, but the data source field is transient, so it is lost along the way. This makes the setDataSource() property very confusing; I think it should be deprecated and reworked.
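To see why the wired data source disappears, here is a minimal standalone sketch. The Factory class below is hypothetical (not the real CacheJdbcPojoStoreFactory); it just demonstrates that a transient field does not survive a Java serialization round trip, while a plain String bean name does:

```java
import java.io.*;

public class TransientDemo {
    // Hypothetical stand-in for the store factory: the data source
    // reference is transient, the bean name is a regular field.
    static class Factory implements Serializable {
        private static final long serialVersionUID = 1L;
        transient Object dataSource;   // dropped during serialization
        String dataSourceBean;         // survives serialization
    }

    // Serialize the factory to bytes and read it back, as Ignite
    // effectively does when deploying the cache configuration.
    static Factory roundTrip(Factory f) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(f);
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            return (Factory) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Factory f = new Factory();
        f.dataSource = new Object();
        f.dataSourceBean = "data-source-bean";
        Factory copy = roundTrip(f);
        System.out.println(copy.dataSource);      // null: transient field lost
        System.out.println(copy.dataSourceBean);  // data-source-bean
    }
}
```

This is why setDataSourceBean() works where setDataSource() fails: the bean name travels with the serialized factory, and the real data source is looked up again on the receiving side.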

Related

JdbcTemplate separate transactions created for each query

I'm using JDBC for some SQL queries, and I want to execute several separate queries in one method within a single transaction. I tried to set a configuration value inside the transaction in one query and read it back in another:
@Transactional
public void testJDBC() {
    SqlRowSet rowSet = jdbcTemplate.queryForRowSet("select set_config('transaction_test','im_here',true)");
    String result;
    while (rowSet.next()) {
        result = rowSet.getString("set_config");
        System.out.println("Result1: " + result);
    }
    SqlRowSet rowSet2 = jdbcTemplate.queryForRowSet("select current_setting('transaction_test',true)");
    String result2;
    while (rowSet2.next()) {
        result2 = rowSet2.getString("current_setting");
        System.out.println("Result2: " + result2);
    }
}
But my second query either uses another transaction or both queries are non-transactional, because the result looks like this:
Result1: im_here
Result2:
I don't get what is wrong here: despite the @Transactional annotation, it is still not transactional.
Here are my bean definitions:
@Bean
public PlatformTransactionManager transactionManager(EntityManagerFactory emf) {
    JpaTransactionManager transactionManager = new JpaTransactionManager();
    transactionManager.setEntityManagerFactory(emf);
    return transactionManager;
}

public BasicDataSource getApacheDataSource() {
    BasicDataSource dataSource = new BasicDataSource();
    dataSource.setDriverClassName(environment.getRequiredProperty("jdbc.driverClassName"));
    dataSource.setUrl(getUrl());
    dataSource.setUsername(getEnvironmentProperty("spring.datasource.username"));
    dataSource.setPassword(getEnvironmentProperty("spring.datasource.password"));
    return dataSource;
}

@Bean
public JdbcTemplateExtended jdbc() {
    return new JdbcTemplateExtended(getApacheDataSource());
}
I think making sure the @Transactional annotation is actually being handled is the first troubleshooting step. To do this, add the following settings to your application.properties (or application.yml) file. I assume you are using Spring Boot.
logging:
  level:
    org:
      springframework:
        transaction:
          interceptor: trace
If you run the logic after applying the above settings, you can see the following log message.
2020-10-02 14:45:07,162 TRACE - Getting transaction for [com.Class.method]
2020-10-02 14:45:07,273 TRACE - Completing transaction for [com.Class.method]
Make sure the @Transactional annotation is handled by the TransactionInterceptor.
Note: @Transactional works through proxy objects. If you call the method from another method of the same class, or instantiate the class directly instead of autowiring it, the proxy object is not created, and the annotation's expected behavior is not applied.
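The self-invocation pitfall can be illustrated without Spring. In this sketch the hypothetical TxProxy class stands in for Spring's generated transaction proxy: it counts how many calls actually pass through the "advice". When outer() self-invokes inner() on this, the proxy never sees that inner call:

```java
// Plain-Java sketch (no Spring) of why self-invocation bypasses the proxy.
interface Service {
    void outer();
    void inner();
}

class ServiceImpl implements Service {
    public void outer() {
        inner(); // self-invocation on "this": the proxy is skipped here
    }
    public void inner() {}
}

class TxProxy implements Service {
    static int interceptedCalls = 0; // calls that went through the "advice"
    private final Service target;
    TxProxy(Service target) { this.target = target; }
    public void outer() { interceptedCalls++; target.outer(); }
    public void inner() { interceptedCalls++; target.inner(); }
}

public class ProxyDemo {
    public static void main(String[] args) {
        Service s = new TxProxy(new ServiceImpl());
        s.outer(); // both outer() and inner() run, but only outer() is intercepted
        System.out.println(TxProxy.interceptedCalls); // 1, not 2
    }
}
```

This mirrors why a @Transactional method called from a sibling method of the same bean does not get its own transaction semantics: only calls entering through the proxy are intercepted.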

NotNavigableException cause

Whenever I run my Hibernate connection Java code I get this exception: org.hibernate.metamodel.NotNavigableException: com.javaa2z.hibernate.Customer is not a navigable (managed-type or collection)
I have written the hibernate.cfg.xml and Customer.hbm.xml files.
I am using MySQL 8.0.
org.hibernate.metamodel.NotNavigableException: com.javaa2z.hibernate.Customer is not a navigable (managed-type or collection)
at org.hibernate.metamodel.spi.AbstractRuntimeModel.getEntityDescriptor(AbstractRuntimeModel.java:129)
at org.hibernate.internal.SessionImpl.getEntityDescriptor(SessionImpl.java:1492)
at org.hibernate.event.internal.AbstractSaveEventListener.saveWithGeneratedId(AbstractSaveEventListener.java:126)
at org.hibernate.event.internal.DefaultSaveOrUpdateEventListener.saveWithGeneratedOrRequestedId(DefaultSaveOrUpdateEventListener.java:190)
at org.hibernate.event.internal.DefaultSaveEventListener.saveWithGeneratedOrRequestedId(DefaultSaveEventListener.java:36)
at org.hibernate.event.internal.DefaultSaveOrUpdateEventListener.entityIsTransient(DefaultSaveOrUpdateEventListener.java:175)
at org.hibernate.event.internal.DefaultSaveEventListener.performSaveOrUpdate(DefaultSaveEventListener.java:30)
at org.hibernate.event.internal.DefaultSaveOrUpdateEventListener.onSaveOrUpdate(DefaultSaveOrUpdateEventListener.java:71)
at org.hibernate.internal.SessionImpl.fireSave(SessionImpl.java:682)
at org.hibernate.internal.SessionImpl.save(SessionImpl.java:674)
at org.hibernate.internal.SessionImpl.save(SessionImpl.java:669)
at com.javaa2z.hibernate.Lab1A.main(Lab1A.java:14)
I faced the same issue because I hadn't set the model package on the session factory:
@Bean
public LocalSessionFactoryBean sessionFactory() {
    System.out.println("Creating entity manager");
    logger.info("DATASOURCE: " + dataSource());
    LocalSessionFactoryBean factoryBean = new LocalSessionFactoryBean();
    factoryBean.setDataSource(dataSource());
    factoryBean.setPackagesToScan(new String[]{"your.model.package.goes.here"});
    factoryBean.setHibernateProperties(additionalProperties());
    return factoryBean;
}

Spring - Rabbit template - Bulk operation

Does anyone know if it is possible to send a collection of messages to a queue using RabbitTemplate?
Obviously I can send them one at a time, but I want to do it in a single bulk operation (to gain performance).
Thanks!
You can create a BatchingRabbitTemplate bean and use it. Here is a working example:
@Bean
public BatchingRabbitTemplate batchingRabbitTemplate(ConnectionFactory connectionFactory) {
    BatchingStrategy strategy = new SimpleBatchingStrategy(500, 25_000, 3_000);
    TaskScheduler scheduler = new ConcurrentTaskScheduler();
    BatchingRabbitTemplate template = new BatchingRabbitTemplate(strategy, scheduler);
    template.setConnectionFactory(connectionFactory);
    // ... other settings
    return template;
}
Now you can inject the BatchingRabbitTemplate into another bean and use it:
@Bean
public ApplicationRunner runner(BatchingRabbitTemplate template) {
    MessageProperties props = // ...
    return args -> template.send(new Message("Test".getBytes(), props));
}
See Reference Manual about batching support:
Starting with version 1.4.2, the BatchingRabbitTemplate has been introduced. This is a subclass of RabbitTemplate with an overridden send method that batches messages according to the BatchingStrategy; only when a batch is complete is the message sent to RabbitMQ.

In-memory H2 database, insert not working in SpringBootTest

I have a Spring Boot application which I wish to test.
Below are the details of my files.
application.properties
PRODUCT_DATABASE_PASSWORD=
PRODUCT_DATABASE_USERNAME=sa
PRODUCT_DATABASE_CONNECTION_URL=jdbc:h2:file:./target/db/testdb
PRODUCT_DATABASE_DRIVER=org.h2.Driver
RED_SHIFT_DATABASE_PASSWORD=
RED_SHIFT_DATABASE_USERNAME=sa
RED_SHIFT_DATABASE_CONNECTION_URL=jdbc:h2:file:./target/db/testdb
RED_SHIFT_DATABASE_DRIVER=org.h2.Driver
spring.datasource.platform=h2
ConfigurationClass
@SpringBootConfiguration
@SpringBootApplication
@Import({ProductDataAccessConfig.class, RedShiftDataAccessConfig.class})
public class TestConfig {
}
Main Test Class
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(classes = {TestConfig.class, ConfigFileApplicationContextInitializer.class}, webEnvironment = SpringBootTest.WebEnvironment.NONE)
public class MainTest {
    @Autowired(required = true)
    @Qualifier("dataSourceRedShift")
    private DataSource dataSource;

    @Test
    public void testHourlyBlock() throws Exception {
        insertDataIntoDb(); // data successfully inserted
        SpringApplication.run(Application.class, new String[]{}); // no data found
    }
}
Data Access In Application.class;
try (Connection conn = dataSourceRedShift.getConnection();
     Statement stmt = conn.createStatement()) {
    // access inserted data
}
Please help!
PS: For the Spring Boot application the test beans are being picked up, so bean instantiation is definitely not the problem. I think I am missing some properties.
I do not use Hibernate in my application, and the data disappears even within the same application context (child context), i.e. when I run a Spring Boot application which reads the data inserted earlier.
Problem solved: removing spring.datasource.platform=h2 from application.properties made my H2 data persist.
But I still wish to know how H2 is being initialized automatically.

Hazelcast No DataSerializerFactory registered for namespace: 0 on standalone process

I am trying to set up a Hazelcast cluster with tcp-ip enabled, running as a standalone process.
My class looks like this:
public class Person implements Serializable {
    private static final long serialVersionUID = 1L;
    int personId;
    String name;
    Person() {}
    // getters and setters
}
Hazelcast is loaded as follows:
final Config config = createNewConfig(mapName);
HazelcastInstance node = Hazelcast.newHazelcastInstance(config);

Config createNewConfig(String mapName) {
    final PersonStore personStore = new PersonStore();
    XmlConfigBuilder configBuilder = new XmlConfigBuilder();
    Config config = configBuilder.build();
    config.setClassLoader(LoadAll.class.getClassLoader());
    MapConfig mapConfig = config.getMapConfig(mapName);
    MapStoreConfig mapStoreConfig = new MapStoreConfig();
    mapStoreConfig.setImplementation(personStore);
    mapConfig.setMapStoreConfig(mapStoreConfig); // attach the store to the map config
    return config;
}
and my Hazelcast config has this:
<tcp-ip enabled="true">
<member>machine-1</member>
<member>machine-2</member>
</tcp-ip>
Do I need to populate the member tags in my XML?
I get this error when a second instance is brought up:
com.hazelcast.nio.serialization.HazelcastSerializationException: No DataSerializerFactory registered for namespace: 0
at com.hazelcast.nio.serialization.DataSerializer.read(DataSerializer.java:98)
at com.hazelcast.nio.serialization.DataSerializer.read(DataSerializer.java:39)
at com.hazelcast.nio.serialization.StreamSerializerAdapter.read(StreamSerializerAdapter.java:41)
at com.hazelcast.nio.serialization.SerializationServiceImpl.toObject(SerializationServiceImpl.java:276)
Any help is highly appreciated.
Solved my problem: my pom.xml depended on hazelcast-wm only, so the actual hazelcast jar was not in my bundled jar. Including it fixed the issue.
Note that this same "No DataSerializerFactory registered for namespace: 0" error message can also occur in an OSGi environment when you're attempting to use more than one Hazelcast instance within the same VM, but initializing the instances from different bundles. The reason being that the com.hazelcast.util.ServiceLoader.findHighestReachableClassLoader() method will sometimes pick the wrong class loader during Hazelcast initialization (as it won't always pick the class loader you set on the config), and then it ends up with an empty list of DataSerializerFactory instances (hence causing the error message that it can't find the requested factory with id 0). The following shows a way to work around that problem by taking advantage of Java's context class loader:
private HazelcastInstance createHazelcastInstance() {
    // Use the following if you're only using the Hazelcast data serializers
    final ClassLoader classLoader = Hazelcast.class.getClassLoader();
    // Use the following if you have custom data serializers that you need
    // final ClassLoader classLoader = this.getClass().getClassLoader();
    final com.hazelcast.config.Config config = new com.hazelcast.config.Config();
    config.setClassLoader(classLoader);
    final ClassLoader previousContextClassLoader = Thread.currentThread().getContextClassLoader();
    try {
        Thread.currentThread().setContextClassLoader(classLoader);
        return Hazelcast.newHazelcastInstance(config);
    } finally {
        if (previousContextClassLoader != null) {
            Thread.currentThread().setContextClassLoader(previousContextClassLoader);
        }
    }
}