Migrating from TableServiceContext to CloudTable - azure-storage

I am migrating my Azure code from TableServiceContext to CloudTable because of the following warning (I am moving from an older version to the latest Azure SDK):
'Microsoft.WindowsAzure.Storage.Table.DataServices.TableServiceEntity' is obsolete: 'Support for accessing Windows Azure Tables via WCF Data Services is now obsolete. It's recommended that you use the Microsoft.WindowsAzure.Storage.Table namespace for working with tables.'
One of the problems is that with TableServiceContext I used the following:
_tableServiceContext.MergeOption = MergeOption.NoTracking;
_tableServiceContext.SaveChangesDefaultOptions = SaveChangesOptions.ReplaceOnUpdate;
_tableServiceContext.IgnoreResourceNotFoundException = true;
_tableServiceContext.Format.UseAtom();
_tableServiceContext.WritingEntity += JobRepository_WritingEntity;
What's the equivalent in the new SDK?

These properties are specific to the behavior of the WCF Data Services context and do not apply to the CloudTable class. The new table layer never tracks entities (so there is no MergeOption), a lookup that finds nothing reports it through the result rather than an exception, replace-on-update becomes an explicit per-request operation, Format.UseAtom has no counterpart (recent releases default to JSON), and serialization hooks such as WritingEntity are replaced by overriding WriteEntity/ReadEntity on the entity class.
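For illustration, here is a minimal sketch of those per-operation equivalents (JobEntity, the "jobs" table name, and the connection string are assumptions, not taken from your code):
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// JobEntity is an assumed entity type for illustration.
public class JobEntity : TableEntity
{
    public string Payload { get; set; }
}

public static JobEntity UpsertAndFetch(string connectionString, JobEntity jobEntity)
{
    CloudTable table = CloudStorageAccount.Parse(connectionString)
        .CreateCloudTableClient()
        .GetTableReference("jobs");
    table.CreateIfNotExists();

    // SaveChangesOptions.ReplaceOnUpdate -> an explicit InsertOrReplace (or Replace) operation
    table.Execute(TableOperation.InsertOrReplace(jobEntity));

    // IgnoreResourceNotFoundException -> a missed Retrieve yields a null Result (HTTP 404), no exception
    // MergeOption.NoTracking -> nothing to configure; CloudTable never tracks entities
    TableResult result = table.Execute(TableOperation.Retrieve<JobEntity>(jobEntity.PartitionKey, jobEntity.RowKey));
    return (JobEntity)result.Result; // null when the entity does not exist
}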

Assembly name changed after deployment in .NET Core Web API application

I am using VS 2019 to develop a .NET Core Web API. I am trying to read all the methods and parameters inside my controller, and I am using the repository pattern to build the API.
Below is the code from my repository:
var method = MethodBase.GetCurrentMethod();
_log4net.Info("Assembly Name : " + Assembly.GetCallingAssembly().FullName);
_log4net.Info("Method Name : " + method.Name);
_log4net.Info("Repository Name : " + method.ReflectedType.FullName);
var result = ((System.Reflection.TypeInfo)Assembly.GetCallingAssembly()
    .GetTypes()
    .Where(type => type.FullName.Contains("AsmeController"))
    .FirstOrDefault()).DeclaredMethods;
_log4net.Info(result);
Logs in Debug mode and after deployment in IIS: [screenshots not reproduced]
This code works as expected and returns the list of MethodInfo in Debug mode, but it stops working and returns null in Release mode once deployed to IIS.
As I observed in the logs, the assembly name changes from Demo.dll to “Anonymously Hosted DynamicMethods Assembly” after deployment.
Please give me suggestions to solve this problem.
As a workaround, I read the application DLL directly instead of the calling assembly, so that I can access all the information from there:
// Combine the directory of the executing assembly with the DLL name.
string assemblyFile = Path.Combine(Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location), "Demo.dll");
Assembly testAssembly = Assembly.LoadFile(assemblyFile);
var result = ((TypeInfo)testAssembly.GetTypes()
    .Where(type => type.FullName.Contains("AsmeController"))
    .FirstOrDefault()).DeclaredMethods;
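For what it's worth, Assembly.GetCallingAssembly() is documented as unreliable when the JIT compiler inlines the calling method (typical of Release builds), and calls arriving through dynamically generated code report themselves as “Anonymously Hosted DynamicMethods Assembly”. A sketch that sidesteps both the file path and GetCallingAssembly, assuming the repository project can reference the controller type:
using System.Linq;
using System.Reflection;

// Anchor on a type that is compiled into Demo.dll (AsmeController here),
// so neither file paths nor Assembly.GetCallingAssembly() are involved.
var controllerType = typeof(AsmeController).Assembly
    .GetTypes()
    .FirstOrDefault(t => t.FullName.Contains("AsmeController"));
var methods = controllerType?.GetMethods(
    BindingFlags.Public | BindingFlags.Instance | BindingFlags.DeclaredOnly);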

Domino XPages: Import PKCS12-SSL-Certificate into ID from ID-Vault

IBM introduced the dominoIDVaultBean in XPages. Is it possible to get the ID from the ID Vault, add a PKCS12 SSL certificate, and upload the ID to the vault again with pure XPages and no C API?
With the C API this is possible via:
SECidfGet
PKCS12_ImportFileToIDFile
SECidfPut
Actually that is quite an easy task if you use Domino 9.0.1 FP8. I solved the issue using the newly introduced IDVault class as well as calling the PKCS12_ImportFileToIDFile C API via JNA.
So basically your code (in a bean initiated by a REST call) could look like this:
// Minimal JNA mapping used below (declare it in its own file; the signature is
// assumed, and NOTES_LIB is "nnotes" on Windows / "notes" on Linux):
public interface nnotes extends com.sun.jna.Library {
    short PKCS12_ImportFileToIDFile(String p12File, String p12Password,
            String idFile, String idPassword, int reserved1, int reserved2, int reserved3);
}

Session session = DominoUtils.getCurrentSession();
IDVault idvault = session.getIDVault();
//****** DOWNLOAD ID *****************
idvault.getUserIDFile(this.getIdFilePath(), this.getUsername(), this.getIdPassword(), VAULT_SERVER);
//****** IMPORT P12 ******************
Native.setProtected(true);
nnotes lib = (nnotes) Native.loadLibrary(NOTES_LIB, nnotes.class);
short errorint = lib.PKCS12_ImportFileToIDFile(this.getP12Path(), this.getP12Pin(),
        this.getIdFilePath(), this.getIdPassword(), 0, 0, 0);
//****** SYNC ID BACK TO VAULT *******
idvault.syncUserIDFile(this.getIdFilePath(), this.getUsername(), this.getIdPassword(), VAULT_SERVER);
To call the C API from Java, this is a good reference to start with.

DocumentDB TransientFaultHandling for Core

I am trying to migrate my code to .NET Core.
I was using the DocumentDB TransientFaultHandling package, but I can't seem to find it as a .NET Core library.
Is it still best practice to use it, or are there other options for achieving the same results?
TIA
The current SDK (both .NET Core and full .NET Framework) already includes the fault handling that was part of the TransientFaultHandling package. It is not entirely the same, since you can't define custom exponential retry logic, but it covers the most common scenarios.
It's on the ConnectionPolicy settings:
var _dbClient = new DocumentClient(new Uri("Db_uri"), "Db_key", new ConnectionPolicy()
{
    MaxConnectionLimit = 100,
    ConnectionMode = ConnectionMode.Direct,
    ConnectionProtocol = Protocol.Tcp,
    RetryOptions = new RetryOptions() { MaxRetryAttemptsOnThrottledRequests = 3, MaxRetryWaitTimeInSeconds = 60 }
});
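If you do need true exponential back-off, a common pattern is to wrap the call yourself and honor the RetryAfter hint that comes back on throttled (429) requests. A minimal sketch, where ExecuteWithRetryAsync is a hypothetical helper and not an SDK API:
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Documents;

// Retries throttled (429) calls with exponential back-off, preferring
// the server-supplied RetryAfter delay when one is present.
static async Task<T> ExecuteWithRetryAsync<T>(Func<Task<T>> operation, int maxAttempts = 5)
{
    TimeSpan delay = TimeSpan.FromMilliseconds(200);
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            return await operation();
        }
        catch (DocumentClientException ex) when ((int?)ex.StatusCode == 429 && attempt < maxAttempts)
        {
            await Task.Delay(ex.RetryAfter > TimeSpan.Zero ? ex.RetryAfter : delay);
            delay = TimeSpan.FromMilliseconds(delay.TotalMilliseconds * 2); // double the wait each attempt
        }
    }
}
Usage would then be, for example, await ExecuteWithRetryAsync(() => _dbClient.ReadDocumentAsync(docUri)); with docUri being whatever document link you are reading.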

Microsoft sync framework 2.1 ServerSyncProviderProxy with client SQL Server Express

I've been following the tutorial here to configure a proxy service with WCF for synchronization. But in all the examples I've seen, the client provider is for SQL Server Compact Edition. Can it be done with the SqlSyncProvider for SQL Server Express?
For example my code is:
var svc = new ServiceForSyncClient();
ServerSyncProvider serverProvider = new ServerSyncProviderProxy(svc);

// create the sync orchestrator
var syncOrchestrator = new SyncOrchestrator
{
    LocalProvider = new SqlSyncProvider("ProductsScope", clientConn),
    RemoteProvider = serverProvider,
    Direction = SyncDirectionOrder.DownloadAndUpload
};
var syncStats = syncOrchestrator.Synchronize();
But when synchronizing, I get an exception:
An unhandled exception of type 'System.InvalidCastException' occurred
in Microsoft.Synchronization.dll
Additional information:
Microsoft.Synchronization.KnowledgeSyncProvider
You're using two completely different sync providers. Part of your code uses the older offline providers (DbServerSyncProvider and SqlCeClientSyncProvider), while you're trying to use SqlSyncProvider, which belongs to the newer knowledge-based/peer-to-peer provider family (SqlSyncProvider/SqlCeSyncProvider).
You can't mix and match the older and newer sync providers.
If you want to use SQL Express as the client, here is a sample for using it with WCF.
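For contrast, a matched knowledge-based pairing looks like the sketch below (2-tier, with assumed connection strings; in the WCF case the remote side must likewise be a KnowledgeSyncProvider-based proxy, as in the linked sample):
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data.SqlServer;

// Both providers come from the knowledge-based family; no ServerSyncProviderProxy.
var clientConn = new SqlConnection(@"Data Source=.\SQLEXPRESS;Initial Catalog=Products;Integrated Security=True"); // assumed
var serverConn = new SqlConnection(@"Data Source=ServerMachine;Initial Catalog=Products;Integrated Security=True"); // assumed

var syncOrchestrator = new SyncOrchestrator
{
    LocalProvider = new SqlSyncProvider("ProductsScope", clientConn),
    RemoteProvider = new SqlSyncProvider("ProductsScope", serverConn),
    Direction = SyncDirectionOrder.DownloadAndUpload
};
var syncStats = syncOrchestrator.Synchronize();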

Automatically Sync SQL Databases across two computers

I am using a cloud backup/sync service (SpiderOak) which automatically syncs folders across several computers/devices.
I am trying to figure out a way to automatically sync all my databases between my work computer and my personal laptop, without actually needing to backup/restore from one instance to the other.
So what I am thinking is to create a new SQL instance on my laptop, identical to my work desktop instance, and then have SpiderOak sync the two SQL Server directories in Program Files with each other (the whole root SQL Server folders).
Will this be enough for my two instances to Sync with each other? Meaning if I create a new database on my computer at work, will I see this database on my laptop when I open SQL Server Database Management Studio?
I am almost sure that if the databases already exist they will sync with each other (since the root folders contain the .mdf & .ldf files, but correct me if I am wrong). However, I am not sure whether a new database will be created if it doesn't already exist on one of the machines.
Are there any other folders I need to sync besides the ones I already specified?
You could use the Microsoft Sync Framework; you can download it here, and here is some more reading material.
It works with SQL Server 2005.
Download and import the references, and include them along with the default ones:
using System.Data.Sql;
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServer;
using System.Diagnostics;
using System.Reflection;
using System.Net;
Then the actual code:
private void SyncTables()
{
    SqlConnection ConStringOnline = new SqlConnection("connstring");
    SqlConnection ConStringOffline = new SqlConnection("connString");

    SyncOrchestrator sync = new SyncOrchestrator();
    sync.Direction = SyncDirectionOrder.Download; // or DownloadAndUpload

    // the 'scope1' is important, read more about it in the articles
    var provider1 = new SqlSyncProvider("scope1", ConStringOnline);
    var provider2 = new SqlSyncProvider("scope1", ConStringOffline);

    PrepareServerForProvisioning(provider1);
    PrepareClientForProvisioning(provider2, ConStringOnline);

    sync.LocalProvider = provider2;
    sync.RemoteProvider = provider1;
    sync.Synchronize();
}

private static void PrepareServerForProvisioning(SqlSyncProvider provider)
{
    SqlConnection connection = (SqlConnection)provider.Connection;
    SqlSyncScopeProvisioning config = new SqlSyncScopeProvisioning(connection);
    if (!config.ScopeExists(provider.ScopeName))
    {
        DbSyncScopeDescription scopeDesc = new DbSyncScopeDescription(provider.ScopeName);
        scopeDesc.Tables.Add(SqlSyncDescriptionBuilder.GetDescriptionForTable("TABLENAME", connection));
        config.PopulateFromScopeDescription(scopeDesc);
        config.SetCreateTableDefault(DbSyncCreationOption.CreateOrUseExisting);
        config.Apply();
    }
}

private static void PrepareClientForProvisioning(SqlSyncProvider provider, SqlConnection sourceConnection)
{
    SqlSyncScopeProvisioning config = new SqlSyncScopeProvisioning((SqlConnection)provider.Connection);
    if (!config.ScopeExists(provider.ScopeName))
    {
        DbSyncScopeDescription scopeDesc = SqlSyncDescriptionBuilder.GetDescriptionForScope(provider.ScopeName, sourceConnection);
        config.PopulateFromScopeDescription(scopeDesc);
        config.Apply();
    }
}
The downside of using the Sync Framework: it is a pain in the a** to add the prerequisites to your application before publishing. That's no problem if you just use the application yourself or within your company, but when you want to publish it online it is a bit harder. I already had a topic about that.
However, when using tools like InnoScript, you can install the prerequisites easily while installing the application. Here is how.
Now for the ScopeName: it is important that you don't use the same name twice, I believe. I had multiple tables, so I just named the scopes scope1, scope2, scope3, scope4; apparently the Sync Framework does the rest of the work for you. It also automatically adds _tracking tables to your database; this is just metadata used to store the information needed to synchronize properly.
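To illustrate the one-scope-per-table naming, here is a hypothetical variant of the code above with the table name parameterized (the table names are assumptions; it reuses the usings listed earlier):
private void SyncAllTables(SqlConnection online, SqlConnection offline)
{
    // Assumed table names; each table gets its own scope: scope1, scope2, ...
    string[] tables = { "Products", "Orders", "Customers" };
    for (int i = 0; i < tables.Length; i++)
    {
        string scopeName = "scope" + (i + 1);

        // Server side: describe this single table under its own scope.
        var serverConfig = new SqlSyncScopeProvisioning(online);
        if (!serverConfig.ScopeExists(scopeName))
        {
            var scopeDesc = new DbSyncScopeDescription(scopeName);
            scopeDesc.Tables.Add(SqlSyncDescriptionBuilder.GetDescriptionForTable(tables[i], online));
            serverConfig.PopulateFromScopeDescription(scopeDesc);
            serverConfig.SetCreateTableDefault(DbSyncCreationOption.CreateOrUseExisting);
            serverConfig.Apply();
        }

        // Client side: copy the scope description from the server.
        var clientConfig = new SqlSyncScopeProvisioning(offline);
        if (!clientConfig.ScopeExists(scopeName))
        {
            clientConfig.PopulateFromScopeDescription(
                SqlSyncDescriptionBuilder.GetDescriptionForScope(scopeName, online));
            clientConfig.Apply();
        }

        var sync = new SyncOrchestrator
        {
            LocalProvider = new SqlSyncProvider(scopeName, offline),
            RemoteProvider = new SqlSyncProvider(scopeName, online),
            Direction = SyncDirectionOrder.DownloadAndUpload
        };
        sync.Synchronize();
    }
}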