Using SQL Queries in Apache Ignite without a database - sql

I'm using Apache Ignite as a distributed cache whose configuration I generated from an existing database using the Ignite Web Console. It's a write-through cache that periodically persists cached data to the Postgres database. However, I want to write unit tests in Java for my project, and I don't have a reliable test database to use.
Part of what I want to test are the cache queries I occasionally run on my Ignite cache, and I wanted to use SQL queries to do this. However, I can't figure out how to preserve the queryEntities from my cache configuration without also having the database. I tried making a new XML file for test purposes that only configures the caches I need and only sets the query entities (no datastore or DB information), but when I run the test I get a "Failed to initialize DB connection" error, even though there is no DB defined in my config.
Is there a way to leverage these query entities without actually connecting the cache to a database? If not, is there a good way to spin up a Postgres database as part of a unit test?

First, check the persistence store configuration and disable it so that everything stays in memory.
Next, make sure you are not initializing any DB connection anywhere in your test cache configuration (you already said you checked for this).

cacheCfg.setWriteThrough(false).setReadThrough(false) should do the trick when defining a cache (note that the configuration cannot be changed after the cache has started).
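A minimal, test-only Spring XML fragment along these lines keeps the queryEntities while omitting the cache store entirely, so Ignite never attempts a DB connection (the cache name, types, and fields below are placeholders for your own):

```xml
<bean class="org.apache.ignite.configuration.CacheConfiguration">
    <property name="name" value="personCache"/>
    <!-- No cacheStoreFactory, readThrough, or writeThrough here,
         so no DB connection is ever initialized. -->
    <property name="queryEntities">
        <list>
            <bean class="org.apache.ignite.cache.QueryEntity">
                <property name="keyType" value="java.lang.Long"/>
                <property name="valueType" value="com.example.Person"/>
                <property name="fields">
                    <map>
                        <entry key="name" value="java.lang.String"/>
                        <entry key="age" value="java.lang.Integer"/>
                    </map>
                </property>
            </bean>
        </list>
    </property>
</bean>
```

With this, SqlQuery/SqlFieldsQuery against personCache work purely in memory, since the query metadata lives in the QueryEntity rather than in the store.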

Related

Ignite query on local node - potential issue?

I'm new to Ignite and have a use case: I need to run a cleanup job. Ignite is embedded in our Spring Boot application, which runs as multiple instances. I'm thinking of having the job run on each instance and then just query the local data and clean up those entries. Do you see any issues with this? I'm not sure how often Ignite reshuffles its data.
Thanks
Shannon
You can surely do that.
With regards to data reshuffling, it only happens when a node is added to or removed from the cluster. However, the ignite.compute().affinityRun() family of calls guarantees that code is run close to the data.
Otherwise, you could use ignite.compute().broadcast() and iterate over each affected cache's local entries. You don't get the aforementioned guarantee then, though.
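A rough sketch of the broadcast-plus-local-scan variant (this assumes Ignite's standard compute API; the cache name and the isExpired() check are illustrative):

```java
// Runs on every node; each node scans only the entries it currently
// holds as primary, so no entry is cleaned up twice. As noted above,
// entries may move if the topology changes mid-run.
ignite.compute().broadcast(() -> {
    IgniteCache<Long, Record> cache = Ignition.localIgnite().cache("records");
    for (Cache.Entry<Long, Record> e : cache.localEntries(CachePeekMode.PRIMARY)) {
        if (e.getValue().isExpired())  // isExpired() is a hypothetical check
            cache.remove(e.getKey());
    }
});
```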

Refresh redis cache on DB change

I've got a stored procedure that loads some data (about 59k items), and it takes 30 seconds. This SP must be called when the application starts. I was wondering if there's a reasonable way to invalidate the Redis cache entries via SQL. Any suggestions?
Thanks
Don't do it from your SQL; do the invalidation and (re)loading of Redis from your application.
The loading of this data into your application should be done by a separate component/service/module of your application. That part should have full responsibility for handling the data: (re)loading it into the app, invalidating and reloading it into Redis, and so on. Treat your Redis server as an extension of your application's cached data, not of your SQL Server data; that's why you should not couple your relational database to Redis. If you later change how you save this data into Redis, that should affect only the application, and really only the part of the application specialized for this, not the SQL side.
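A rough sketch of that separation in Java (all names are illustrative; the Redis client is stood in for by a plain Map so the example is self-contained, where a real application would wrap a client such as Jedis or Lettuce):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical loader component: it alone is responsible for pushing the
// slow stored-procedure result into the cache, so neither the SQL side
// nor the rest of the app knows how the cache is kept fresh.
class CatalogCacheLoader {
    private final Map<String, String> redis;  // stand-in for a Redis client

    CatalogCacheLoader(Map<String, String> redis) {
        this.redis = redis;
    }

    // Called at startup and again whenever the app knows the source data
    // changed: drop the stale entries, then repopulate in one place.
    void reload(Map<String, String> itemsFromStoredProcedure) {
        redis.clear();
        redis.putAll(itemsFromStoredProcedure);
    }
}
```

The stored procedure stays untouched; only this one component knows that its result ends up in Redis.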

NHibernate Second Level Cache with database change notification on desktop App

I am developing a WPF application using NHibernate to communicate with a PostgreSQL Database.
The only caching provider that works in a desktop app is Bamboo Prevalence (correct me if I am wrong). Given that every computer running my application has its own session factory, my application retrieves stale data from the cache.
My question is, how can I tell NHibernate/Prevalence to look at the timestamp of when the data was last updated, and if the cache is stale, refresh it?
Well, I found out that there is no way the second-level cache can know whether the database was changed outside NHibernate and its cache, so what I did was add a new 'Timestamp' column to all my tables.
In my queries, I first select the database's timestamp using Session.CacheMode(CacheMode.Ignore) and compare it with the timestamp from the cache. If the timestamps differ, I invalidate the cache for that query and run it again.
About SysCache: even knowing it 'can work' in a WPF desktop app, I was not keen to use System.Web.Cache, as my application would then need the complete .NET Framework instead of the Client Profile. I did a search and, to my delight, found that someone wrote an NHibernate cache provider built on System.Runtime.Caching, which is not an ASP.NET component. If anyone is interested, you can find the source at:
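The stale-check above can be sketched generically like this (a Java sketch with stand-in types, not NHibernate API): each cached result remembers the DB timestamp it was built from, and before reuse the timestamp is re-read directly from the DB and compared, evicting the entry on mismatch.

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for the second-level cache; 'query' keys and timestamps are
// illustrative. The caller is expected to obtain currentDbTimestamp via
// a query that bypasses the cache (the CacheMode.Ignore step above).
class TimestampedCache<V> {
    private static final class Entry<T> {
        final long dbTimestamp;
        final T value;
        Entry(long dbTimestamp, T value) {
            this.dbTimestamp = dbTimestamp;
            this.value = value;
        }
    }

    private final Map<String, Entry<V>> entries = new HashMap<>();

    void put(String query, long dbTimestamp, V value) {
        entries.put(query, new Entry<>(dbTimestamp, value));
    }

    // Returns the cached value only if the DB timestamp still matches;
    // otherwise evicts the entry and returns null so the caller re-queries.
    V getIfFresh(String query, long currentDbTimestamp) {
        Entry<V> e = entries.get(query);
        if (e == null || e.dbTimestamp != currentDbTimestamp) {
            entries.remove(query);  // stale: invalidate and force a re-query
            return null;
        }
        return e.value;
    }
}
```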
https://github.com/Leftyx/nhcontrib/tree/master/src/NHibernate.Caches/MemoryCache
That is a property you could set at the cache level so items expire according to your application's needs. NCache is a possible L2 cache provider for NHibernate; it ensures its cache is consistent across multiple servers and that all cache updates are synchronized correctly, so no data-integrity issues arise. To learn more, please visit:
http://www.alachisoft.com/ncache/nhibernate-l2cache-index.html

Is NHibernate Shards production ready?

At the company I work for, we have a single database schema, but each of our clients uses their own dedicated database, with one central database that stores client contact details and which database each client uses, so we can connect to the appropriate one. I've looked at using NHibernate Shards, but the project seems to have gone very quiet and doesn't look complete.
Does anyone know the status of this project? Has anyone used it in production?
If it's not yet at a point that is considered usable in production, what are the alternatives? The two main ones seem to be:
Create a session factory per database, with a wrapper that selects the appropriate factory to generate the correct session - this seems to leave redundant session factories around and doesn't look very efficient.
Create just one session factory, but pass an IDbConnection when calling OpenSession - which would allow the session to use a different database connection.
My concern with option 2 is how NHibernate will cope with the second-level cache, as I believe it is controlled by the session factory - the HiLo generator also uses the session factory, I believe. In that case, will having sessions attached to different DBs cause problems? For example, if we end up with a MyCompany.Model.User that has an id of 2 in both databases, will this cause conflicts within the cache?
You could have a look at Enzo SQL Shard, a sharding library for SQL Server. If you are already using NHibernate, there might be a few changes required in the code, though.
NHibernate Shards is up-to-date with the latest NHibernate API changes and now supports all query models of NHibernate, including LINQ. Complex scalar queries are currently not supported.
We use it in production for a multi-tenant environment, but there are a few things to be mindful of.
NHibernate Shards has a session factory for each shard, but only uses a single NHibernate Configuration instance to generate the session factories. This approach likely won't scale well to large numbers of shards.
Querying across shards does not play well with paging. It works, but can involve considerable client-side processing. It is best to keep result sets as small as possible and lock queries to single shards where feasible.
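The factory-per-shard layout (option 1 in the question, and what Shards does internally) can be sketched generically; the factory type, tenant names, and connection strings below are hypothetical stand-ins:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// One "session factory" per shard, built lazily and reused. Because each
// shard keeps its own factory, it also keeps its own second-level cache
// and HiLo state, so an id of 2 in two databases never collides.
class ShardedFactoryRegistry<F> {
    private final Map<String, String> tenantToConnString;  // the central-DB lookup
    private final Function<String, F> factoryBuilder;      // builds a factory from a conn string
    private final Map<String, F> factories = new ConcurrentHashMap<>();

    ShardedFactoryRegistry(Map<String, String> tenantToConnString,
                           Function<String, F> factoryBuilder) {
        this.tenantToConnString = tenantToConnString;
        this.factoryBuilder = factoryBuilder;
    }

    F factoryFor(String tenant) {
        String conn = tenantToConnString.get(tenant);
        if (conn == null)
            throw new IllegalArgumentException("Unknown tenant: " + tenant);
        // Built once per connection string, then reused for every tenant
        // that shares that database.
        return factories.computeIfAbsent(conn, factoryBuilder);
    }
}
```

The memory cost this answer warns about is visible here: every distinct connection string holds a live factory, which is why the approach won't scale to very large shard counts.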

SQLite/Fluent NHibernate integration test harness initialization not repeatable after large data session

In one of my main data integration test harnesses I create and use Fluent NHibernate's SingleConnectionSessionSourceForSQLiteInMemoryTesting, to get a fresh session for each test. After each test, I close the connection, session, and session factory, and throw out the nested StructureMap container they came from. This works for almost any simple data integration test I can think of, including ones that utilize Fluent NHib's PersistenceSpecification object.
The trouble starts when I test the application's lengthy database bootstrapping process, which creates and saves thousands of domain objects. It's not that the first setup and tear-down of the test harness fails; in fact, the test harness successfully bootstraps the in-memory database just as the application would bootstrap the real database in production. The problem occurs when the database is bootstrapped a second time on a new in-memory database, with a new session and session factory.
The error is:
NHibernate.StaleStateException : Unexpected row count: 0; expected: 1
The row count is indeed unexpected: the row that the application under test is looking for should be in the session. It's not that any data from the last integration test is sticking around; for some reason the session just stops working mid-database-bootstrap. I've looked everywhere for a place I might be holding on to an old session, and I can't find one.
I've searched through the code for static singleton objects, but there are none anywhere near the code in question. I have a couple of StructureMap InstanceScope singletons, but they are thrown out with each nested container that is discarded after every test teardown.
I've tried every possible variation on disposing and closing every object involved with each test teardown and it still fails on this lengthy database bootstrap. I've even messed around with current_session_context_class to no avail. But non-bootstrap related database tests appear to work fine. I'm starting to run out of options and I may have to surrender lengthy database integration tests in favor of WatiN-based acceptance tests.
Can anyone give me any clue about how to figure out why some of my SingleConnectionSessionSourceForSQLiteInMemoryTesting sessions aren't repeatable?
Or any advice at all about how to make an NHibernate SQLite database integration test harness repeatable for large data sessions?
Here is how we do it http://www.gears4.net/blog/archive/14/nhibernate-integration-testing
Hope it helps
I was able to solve this problem by not using an in-memory database, and instead saving a hard copy of the database file after initialization, once per test-suite run. Then, instead of reinitializing the database before each test, the file-based SQLite database is copied, and that new copy is used for testing.
Essentially, I only set up the initial database data once, save that database off to the side, and copy it for each test. There is a definite possibility that the problem is on my end, but I suspect there is an issue with large in-memory SQLite databases. So I recommend using the file mode of the database if you run into trouble with a large in-memory SQLite database.
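The copy-per-test step can be sketched like this (file names and paths are illustrative; the expensive bootstrap runs once to produce a template file, and each test then gets its own fresh copy):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

// Sketch of the copy-per-test approach: bootstrap the file-based database
// once at suite start, then hand every test a pristine copy of that file.
class TemplateDatabase {
    private final Path template;

    TemplateDatabase(Path template) {
        this.template = template;  // written once, by the suite-level bootstrap
    }

    // Called in each test's setup instead of re-running the bootstrap.
    Path freshCopyFor(String testName, Path workDir) throws IOException {
        Path copy = workDir.resolve(testName + ".db");
        return Files.copy(template, copy, StandardCopyOption.REPLACE_EXISTING);
    }
}
```

Each test then points its connection string at its own copy, so tests stay isolated without paying the bootstrap cost repeatedly.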