I am using the StackExchange.Redis client and I want to connect to multiple databases, not just one. How can I handle this?
https://github.com/StackExchange/StackExchange.Redis/blob/master/Docs/Basics.md
You have to pass the database number to the GetDatabase() method:
ConnectionMultiplexer redis = ConnectionMultiplexer.Connect("localhost");
IDatabase db = redis.GetDatabase(databaseNumber);
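To work with more than one database you can keep a single multiplexer and ask it for each database by number; a minimal sketch (the key names are just placeholders):
ConnectionMultiplexer redis = ConnectionMultiplexer.Connect("localhost");
IDatabase db0 = redis.GetDatabase(0); // database 0
IDatabase db1 = redis.GetDatabase(1); // database 1
db0.StringSet("example:key", "value stored in db 0");
db1.StringSet("example:key", "value stored in db 1");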
If you are using it in .NET Core, I have created a wrapper class which you can use like this:
var redisConnectionString = "{Your Redis Cache Connection String}";
var rest = new Restme(redisConnectionString);
//get cache data (support Generic cast)
var cacheResult = rest.Get("home:testKey");
var cacheResult2 = rest.Get<bool>("home:testKey2");
var cacheResult3 = rest.Get<ObjectType>("home:testKey3");
//set cache data
rest.Post("home:testKey","value");
rest.Post<bool>("home:testKey2",true);
It's actually a simple wrapper around StackExchange.Redis, so if you want to connect to multiple databases, simply instantiate multiple Restme() objects as separate variables, each holding a different Redis database connection.
The source code is on GitHub: https://github.com/oelite/RESTme
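As a rough sketch of that idea, assuming Restme accepts a standard StackExchange.Redis connection string (including the defaultDatabase option) and exposes the Get/Post methods shown above:
// hypothetical connection strings; each Restme instance targets a different Redis database
var cacheDb0 = new Restme("{Your Redis Cache Connection String},defaultDatabase=0");
var cacheDb1 = new Restme("{Your Redis Cache Connection String},defaultDatabase=1");
cacheDb0.Post("home:testKey", "value stored in db 0");
cacheDb1.Post("home:testKey", "value stored in db 1");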
I want to set up a connection to the database so that I can test two methods in two repositories: the GetAllGamesAsync() method, which returns a list of all entities from the database, and the GetGamesByNameAsync() method, which returns games by their names.
I am running the database in Docker and have populated rows with dummy data. I want to connect to it and run the tests, so the question is: what connection string should I configure so the code can talk to the SQL Server instance running in Docker?
The methods work fine; I have tested them using an in-memory DB to manually insert entities and test them against the methods using the unit test below. The unit test for Get_GamesByName looks like this:
public async Task Get_GamesByName()
{
    var options = new DbContextOptionsBuilder<GamesDbContext>()
        .UseSqlServer(Configuration.GetConnectionString("GamesDbContext"))
        .Options;
    using (var context = new GamesDbContext(options))
    {
        GamesRepository gamesRepository = new GamesRepository(context);
        var result = await gamesRepository.GetGamesByNameAsync("Witcher");
        Assert.Equal(2, result.Count);
    }
}
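For reference, a connection string for a SQL Server instance running in Docker with the default port published on localhost typically looks something like the sketch below (database name, credentials, and port depend on how the container was started):
// hypothetical connection string for a SQL Server container published on localhost:1433
var dockerConnectionString =
    "Server=localhost,1433;Database=GamesDb;User Id=sa;Password=<YourStrongPassword>;TrustServerCertificate=True";

var options = new DbContextOptionsBuilder<GamesDbContext>()
    .UseSqlServer(dockerConnectionString)
    .Options;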
I have created a custom processor which takes care of saving some records in a MySQL database. For the MySQL connection I am using a DBCPConnectionPool object in my custom processor, which saves data to the database tables correctly. But I am worried about the pooling mechanism: I am not closing this connection after my saving logic is completed. This works for 2 to 3 flowfiles, but will it still work correctly when I send many flowfiles?
DBCPService dbcpService = context.getProperty(DBCP_SERVICE).asControllerService(DBCPService.class);
Connection con = dbcpService.getConnection();
I am looking for clarification, as my current flow works correctly with a small number of flowfiles.
You should be returning it to the pool, most likely with a try-with-resources statement:
try (final Connection con = dbcpService.getConnection();
     final PreparedStatement st = con.prepareStatement(selectQuery)) {
    // use con and st here; when the block exits, st is closed and con is returned to the pool
}
You can always consult the standard processors to see what they do:
https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-standard-bundle/nifi-standard-processors/src/main/java/org/apache/nifi/processors/standard/AbstractExecuteSQL.java#L223
I am using a cloud backup/sync service (SpiderOak) which automatically Syncs folders across several computers / devices.
I am trying to figure out a way to automatically sync all my databases across my work computer and personal laptop, without actually needing to backup/restore from one instance to the other.
So what I am thinking of is to create a new sql instance on my laptop which is identical to my work desktop instance, then to pick both SQL Server directories in Program Files to sync with each other using SpiderOak (the whole root SQL Server folders).
Will this be enough for my two instances to sync with each other? Meaning, if I create a new database on my computer at work, will I see this database on my laptop when I open SQL Server Management Studio?
I am almost sure that if the databases already exist they will sync with each other (since the root folders contain the .mdf and .ldf files, but correct me if I am wrong). However, I am not sure whether a new database will be created if it doesn't already exist on one of the machines.
Are there any other folders that I need to sync other than the ones I have already specified?
You could use the SQL Sync Framework; you can download it here.
Some more reading material.
It works with SQL Server 2005.
Download and import references and include with the default ones:
using System.Data.Sql;
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServer;
using System.Diagnostics;
using System.Reflection;
using System.Net;
Then the actual code:
private void SyncTables()
{
SqlConnection ConStringOnline = new SqlConnection("connstring");
SqlConnection ConStringOffline = new SqlConnection("connString");
SyncOrchestrator sync = new SyncOrchestrator();
sync.Direction = SyncDirectionOrder.Download; //or DownloadAndUpload
//the 'scope1' is important, read more about it in the articles
var provider1 = new SqlSyncProvider("scope1", ConStringOnline);
var provider2 = new SqlSyncProvider("scope1", ConStringOffline);
PrepareServerForProvisioning(provider1);
PrepareClientForProvisioning(provider2, ConStringOnline);
sync.LocalProvider = provider2;
sync.RemoteProvider = provider1;
sync.Synchronize();
}
private static void PrepareServerForProvisioning(SqlSyncProvider provider)
{
SqlConnection connection = (SqlConnection)provider.Connection;
SqlSyncScopeProvisioning config = new SqlSyncScopeProvisioning(connection);
if (!config.ScopeExists(provider.ScopeName))
{
DbSyncScopeDescription scopeDesc = new DbSyncScopeDescription(provider.ScopeName);
scopeDesc.Tables.Add(SqlSyncDescriptionBuilder.GetDescriptionForTable("TABLENAME", connection));
config.PopulateFromScopeDescription(scopeDesc);
config.SetCreateTableDefault(DbSyncCreationOption.CreateOrUseExisting);
config.Apply();
}
}
private static void PrepareClientForProvisioning(SqlSyncProvider provider, SqlConnection sourceConnection)
{
SqlSyncScopeProvisioning config = new SqlSyncScopeProvisioning((SqlConnection)provider.Connection);
if (!config.ScopeExists(provider.ScopeName))
{
DbSyncScopeDescription scopeDesc = SqlSyncDescriptionBuilder.GetDescriptionForScope(provider.ScopeName, sourceConnection);
config.PopulateFromScopeDescription(scopeDesc);
config.Apply();
}
}
The downside of using the Sync Framework: it is a pain in the a** to add these prerequisites to your application before publishing. That's no problem if you just use the application yourself or within your company, but when you want to publish it online it is a bit harder. I already had a topic about that.
However, when using tools like InnoScript, you can install the prerequisites easily while installing the application. Here is how.
Now for the ScopeName: it is important that you don't use the same name twice, I believe. I had multiple tables, so I just named the scopes scope1, scope2, scope3, scope4. Apparently the Sync Framework does the rest of the work for you. It also automatically adds _tracking tables to your database; this is just metadata used to synchronize properly.
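As an illustration of that naming scheme, provisioning one uniquely named scope per table could look roughly like this (the table names and the connection variable are made up for the example):
// sketch: one scope per table, each with a unique name (table names are hypothetical)
string[] tables = { "Customers", "Orders", "Products", "Invoices" };
for (int i = 0; i < tables.Length; i++)
{
    string scopeName = "scope" + (i + 1);
    var scopeDesc = new DbSyncScopeDescription(scopeName);
    scopeDesc.Tables.Add(SqlSyncDescriptionBuilder.GetDescriptionForTable(tables[i], connection));
    var scopeConfig = new SqlSyncScopeProvisioning(connection, scopeDesc);
    if (!scopeConfig.ScopeExists(scopeName))
        scopeConfig.Apply();
}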
I'm trying to synchronize a SQL Server database with a SQL Azure database (please be patient, because I don't fully understand the Sync Framework). These are the requirements:
First: synchronize 1 table from SQL Azure to SQL Server.
Second: synchronize 13 other tables (including the table I mentioned in the first step) from SQL Server to Azure.
I've created a console application, and this is the code:
1. I create one scope with the 13 tables:
DbSyncScopeDescription myScope = new DbSyncScopeDescription("alltablesyncgroup");
DbSyncTableDescription table = SqlSyncDescriptionBuilder.GetDescriptionForTable("tablename", sqlServerConn);
myScope.Tables.Add(table); //repeated 13 times.
2. I provision both databases:
SqlSyncScopeProvisioning sqlAzureProv = new SqlSyncScopeProvisioning(sqlAzureConn,myScope);
if (!sqlAzureProv.ScopeExists("alltablesyncgroup"))
{
sqlAzureProv.Apply();
}
SqlSyncScopeProvisioning sqlServerProv = new SqlSyncScopeProvisioning(sqlServerConn, myScope);
if (!sqlServerProv.ScopeExists("alltablesyncgroup"))
{
sqlServerProv.Apply();
}
3. I create the SyncOrchestrator with SyncDirectionOrder.Download to sync the first table:
SqlConnection sqlServerConn = new SqlConnection(sqllocalConnectionString);
SqlConnection sqlAzureConn = new SqlConnection(sqlazureConnectionString);
SyncOrchestrator orch = new SyncOrchestrator
{
RemoteProvider = new SqlSyncProvider(scopeName, sqlAzureConn),
LocalProvider = new SqlSyncProvider(scopeName, sqlServerConn),
Direction = SyncDirectionOrder.Download
};
orch.Synchronize();
4. Later, I use the same function, only changing the direction to SyncDirectionOrder.Upload, to sync the 13 remaining tables:
SqlConnection sqlServerConn = new SqlConnection(sqllocalConnectionString);
SqlConnection sqlAzureConn = new SqlConnection(sqlazureConnectionString);
SyncOrchestrator orch = new SyncOrchestrator
{
RemoteProvider = new SqlSyncProvider(scopeName, sqlAzureConn),
LocalProvider = new SqlSyncProvider(scopeName, sqlServerConn),
Direction = SyncDirectionOrder.Upload
};
orch.Synchronize();
Now, here is the thing: obviously I'm doing something wrong, because when I download, the syncStats show that a lot of changes have been applied, BUT I can't see them reflected in either database, and when I try to execute the upload sync it seems to go into a loop because the upload process doesn't stop.
Thanks!!!
First, you mentioned you only want to sync one table from Azure to your SQL Server, but you're provisioning 13 tables in the scope. If you want one table, just provision a scope with one table (e.g. one scope for the download with that table, and one scope for the upload with the rest of the tables).
To find out why rows are not syncing, you can subscribe to the ApplyChangeFailed event on both providers and check whether conflicts or errors are being encountered.
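A minimal sketch of wiring up those handlers, reusing the connection and scope variables from the code above:
// sketch: log conflicts/errors on both providers before calling orch.Synchronize()
var localProvider = new SqlSyncProvider(scopeName, sqlServerConn);
var remoteProvider = new SqlSyncProvider(scopeName, sqlAzureConn);
localProvider.ApplyChangeFailed += (sender, e) =>
    Console.WriteLine("Local apply failed: " + e.Conflict.Type + " - " + e.Error);
remoteProvider.ApplyChangeFailed += (sender, e) =>
    Console.WriteLine("Remote apply failed: " + e.Conflict.Type + " - " + e.Error);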
Alternatively, you can enable Sync Framework tracing in verbose mode so you can see what's happening underneath.
When trying to run the following in Redis using BookSleeve:
using (var conn = new RedisConnection(server, port, -1, password))
{
var result = conn.Server.FlushDb(0);
result.Wait();
}
I get an error saying:
This command is not available unless the connection is created with
admin-commands enabled
I am not sure how to execute commands as admin. Do I need to create an account in the database with admin access and log in with that?
Updated answer for StackExchange.Redis:
var conn = ConnectionMultiplexer.Connect("localhost,allowAdmin=true");
Note also that the object created here should be created once per application and shared as a global singleton, per Marc:
Because the ConnectionMultiplexer does a lot, it is designed to be
shared and reused between callers. You should not create a
ConnectionMultiplexer per operation. It is fully thread-safe and ready
for this usage.
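A common way to follow that advice is to keep the multiplexer in a lazily initialized static and reuse it everywhere (the connection string is a placeholder):
// shared, lazily created multiplexer reused by the whole application
private static readonly Lazy<ConnectionMultiplexer> LazyConnection =
    new Lazy<ConnectionMultiplexer>(() =>
        ConnectionMultiplexer.Connect("localhost,allowAdmin=true"));

public static ConnectionMultiplexer Connection => LazyConnection.Value;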
Basically, the dangerous commands that you don't need in routine operations, but which can cause lots of problems if used inappropriately (i.e. the equivalent of DROP DATABASE in T-SQL, since your example is FlushDb), are protected by a "yes, I meant to do that..." flag:
using (var conn = new RedisConnection(server, port, -1, password,
allowAdmin: true)) <==== here
I will improve the error message to make this very clear and explicit.
You can also set this in C# when you're creating your multiplexer - set AllowAdmin = true
private ConnectionMultiplexer GetConnectionMultiplexer()
{
var options = ConfigurationOptions.Parse("localhost:6379");
options.ConnectRetry = 5;
options.AllowAdmin = true;
return ConnectionMultiplexer.Connect(options);
}
For those who like me faced the error:
StackExchange.Redis.RedisCommandException: This operation is not
available unless admin mode is enabled: ROLE
after upgrading StackExchange.Redis to version 2.2.4 with a Sentinel connection: it's a known bug; the workaround was either to downgrade the client or to add allowAdmin=true to the connection string and wait for the fix.
Starting from the 2.2.50 public release, the issue is fixed.