Dynamics CRM 365 plugin to external SQL Server - wcf

Currently we are using plugins to integrate records from Dynamics CRM 2011 (on-premises) to the back office (SQL Server), using WCF as a bridge. The process is the same for insert and update (Plugin > WCF > stored procedure).
However, we are due to upgrade to Dynamics CRM 365 on Azure and are wondering whether there are any better (new tech!) ways to do the same process.
I would really appreciate it if you could share your experience with a similar CRM-to-back-office sync.

We are using (and I recommend) the Data Export Service:
Data Export is an add-on service made available as a Microsoft Dynamics 365 (online) solution that adds the ability to replicate Dynamics 365 (online) data to a Microsoft Azure SQL Database store in a customer-owned Microsoft Azure subscription. The supported target destinations are Microsoft Azure SQL Database and Microsoft Azure SQL Server on Microsoft Azure virtual machines. Data Export intelligently synchronizes the entire Dynamics 365 schema and data initially and thereafter synchronizes on a continuous basis as changes occur (delta changes) in the Microsoft Dynamics 365 (online) system.
It's probably the easiest/cleanest way: just a managed solution import, enabling change tracking on the entities, setting up Azure SQL and Key Vault, a fairly friendly profile setup, and some sync-issue troubleshooting.
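To give a feel for what consuming the replicated data looks like, here is a minimal Python sketch that reads from the Azure SQL database the Data Export Service keeps in sync. The server, database, credentials and the account table/columns are placeholders for illustration; the service generally creates one table per synced entity plus a few metadata columns.

```python
# Minimal sketch (not production code): query the Azure SQL replica maintained
# by the Data Export Service. Server, database, login and the "account" table
# shown here are assumptions for illustration only.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=tcp:yourserver.database.windows.net,1433;"
    "DATABASE=CrmReplica;"
    "UID=integration_user;PWD=<password>;"
    "Encrypt=yes;TrustServerCertificate=no;"
)

conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
# The export service creates one table per synced entity (e.g. dbo.account)
# and adds metadata columns such as SinkModifiedOn for the replication time.
cursor.execute(
    "SELECT TOP 10 accountid, name, modifiedon FROM dbo.account "
    "ORDER BY modifiedon DESC"
)
for row in cursor.fetchall():
    print(row.accountid, row.name, row.modifiedon)
conn.close()
```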

If your WCF service is reachable over the internet, I do not see any reason to change a working integration (unless you really want to get rid of good old WCF :)).
If it is an intranet-only app, you will probably need to use other integration patterns and technologies, for example Service Bus, WebJobs, Logic Apps, etc.
All of them may work perfectly well; however, there are many different conditions that need to be considered during the decision process.
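For illustration, here is a minimal sketch of the Service Bus option mentioned above, using the azure-servicebus Python package. The queue name, connection string and message shape are assumptions rather than a prescribed design: the cloud side publishes a change message, and a worker running inside the corporate network consumes it and hands it to the existing stored procedure.

```python
# Minimal sketch of the Service Bus pattern: the cloud side publishes a change
# message to a queue, and an on-premises worker consumes it and calls the
# existing stored procedure. Queue name, connection string and message fields
# are illustrative placeholders (requires the azure-servicebus package).
import json
from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONNECTION_STRING = "<service-bus-connection-string>"
QUEUE_NAME = "crm-changes"

def publish_change(entity: str, record_id: str, operation: str) -> None:
    """Called on the cloud side whenever a record is created or updated."""
    payload = json.dumps({"entity": entity, "id": record_id, "operation": operation})
    with ServiceBusClient.from_connection_string(CONNECTION_STRING) as client:
        with client.get_queue_sender(queue_name=QUEUE_NAME) as sender:
            sender.send_messages(ServiceBusMessage(payload))

def drain_queue_once() -> None:
    """Runs on-premises; pulls pending messages and hands them to the back office."""
    with ServiceBusClient.from_connection_string(CONNECTION_STRING) as client:
        with client.get_queue_receiver(queue_name=QUEUE_NAME, max_wait_time=5) as receiver:
            for message in receiver:
                change = json.loads(str(message))
                # Here you would call the existing stored procedure, e.g. via pyodbc.
                print("would sync", change)
                receiver.complete_message(message)

if __name__ == "__main__":
    publish_change("account", "A-123", "update")
    drain_queue_once()
```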

Related

Query Azure SQL from Nintex 2013

I was wondering if it's possible to connect to an Azure SQL database from a Nintex workflow running on SharePoint 2013 on-premises. I have multiple workflows where I connect to an on-premises MS SQL Server, but I cannot get this to work with Azure SQL.
I've tried many different connection strings, but I keep getting "Unexpected SQL error occured". I know the account I use has access to the database, so the problem might be something else, such as firewall settings.
However, before I start messing with the firewall, can anyone tell me if it is even possible to connect to Azure SQL from Nintex 2013?
I found that you have posted the same question in the Nintex 2013 forum.
As of now, no one has replied, and there is no documentation that covers this, neither from Azure nor from Nintex.
So for now we may have to assume that on-premises Nintex 2013 does not support Azure SQL (or that Azure SQL Database does not support Nintex 2013).
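One thing worth doing before touching Nintex at all is to confirm, from the SharePoint server itself, that the Azure SQL database is reachable with plain ODBC credentials. A minimal Python check along these lines (server, database and login are placeholders) quickly shows whether the firewall is the blocker rather than Nintex:

```python
# Quick connectivity check, run from the SharePoint server, to confirm the Azure SQL
# firewall and credentials work independently of Nintex. Server, database and login
# are placeholders.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=tcp:yourserver.database.windows.net,1433;"
    "DATABASE=yourdb;UID=youruser;PWD=<password>;"
    "Encrypt=yes;Connection Timeout=30;"
)

try:
    conn = pyodbc.connect(conn_str)
    print("Connected:", conn.cursor().execute("SELECT @@VERSION").fetchone()[0])
    conn.close()
except pyodbc.Error as exc:
    # A firewall block typically surfaces as a timeout or a "Client with IP address
    # ... is not allowed to access the server" error from Azure SQL.
    print("Connection failed:", exc)
```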

How to move a Windows .NET runtime frontend application to the cloud (it uses a local SQL Server backend database)

I had an engineer design our .NET application back in 2009. My guess is that it was coded using Visual Studio, and all I have is the installer. We have been using it on our one or two local client machines very well for the past few years, but now I want to move this front end to the cloud instead of installing it as an application on our Windows 7 machines.
It is a very simple application used in our small warehouse that keeps track of cargo/shipments etc. It uses SQL Server 2008 Express as a backend, which is stored locally.
I know how to get the database into the cloud; there are many options for that, using Amazon or Azure. But how do I get the local client application into the cloud?
I don't have access to the Visual Studio source code; I just have the runtime executable.
I am sure there is no way to do this, and many SO users will say I need to rewrite the front end.
I have tried to contact the developer, but they have since closed down. Is there any way I can run this in the cloud?
I welcome all options and solutions!
Thanks.
I believe you have two options for hosting this application:
If you are able to configure the database connection string, you could host the database in the cloud, and distribute the application to your end users. However, you've already stated that you know how to move the database, so I assume this isn't an option.
The only alternative is to run the entire application on a cloud server, and send the user interface to a client using terminal services. This makes it appear as if the application is running locally on the user's computer, while it is actually running on the server.
For an off-the-shelf solution to achieve this, you could consider using Microsoft's Azure RemoteApp service. I'm sure there are other similar offerings available.

Migration to Windows Azure

In our organization we are using Hyper-V VMs. We are using a Progress database, and the apps run on the workstations.
For us to migrate to the Microsoft Azure cloud, do we have to migrate our existing Progress database to SQL and rewrite our apps?
No. You haven't given us much detail about your applications or architecture, but if I assume that you are using the embedded database product from Progress Software, then I see no reason it can't run on an Azure VM.

Importing database from an sFTP server in Windows Azure

I'm building a website that will surface data from a third-party system. The third party will provide a copy of all the data I need as a SQL restore file (*.bak) inside a .rar file on their SFTP server. The data changes every day, so my application will need to connect to the SFTP site, get the file, unzip it, and then restore it into my database server every night. I'm fairly comfortable scripting this in a standard Windows environment, but the customer would prefer the application to be built on the MS Azure cloud, which doesn't seem to support a common solution to the problem. It's possible we could abandon Azure, but I'd like to know what the best strategy would be for implementing this in Azure if it's possible.
This depends on whether you are trying to use Azure PaaS (cloud service and SQL Azure) or IaaS (VMs). If you are using VMs on Windows Azure, there is going to be no difference between Windows Azure and your familiar Windows environment - so yes, you can do this on Windows Azure.
This can't really be done in Azure cloud services and SQL Azure (SQL Azure cannot restore a .bak file). But your application doesn't seem to be the kind that would run as a cloud service anyway.
Stick to doing it on VMs and it will work the way you are used to.
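To make that concrete, here is a rough sketch of what the nightly job could look like on an Azure VM, assuming the paramiko, rarfile and pyodbc Python packages (plus an unrar tool) are installed on the VM. The host, paths, credentials and database name are placeholders, and a real restore may need WITH MOVE clauses for the logical file names:

```python
# Minimal sketch of the nightly job on an Azure VM: download the .rar from the
# third party's SFTP server, extract the .bak, and restore it to the VM's local
# SQL Server instance. All names and paths below are placeholders.
import paramiko
import rarfile
import pyodbc

SFTP_HOST = "sftp.thirdparty.example.com"
REMOTE_PATH = "/exports/latest.rar"
LOCAL_RAR = r"C:\nightly\latest.rar"
EXTRACT_DIR = r"C:\nightly"
BAK_FILE = r"C:\nightly\export.bak"

# 1. Download the archive over SFTP.
transport = paramiko.Transport((SFTP_HOST, 22))
transport.connect(username="user", password="<password>")
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.get(REMOTE_PATH, LOCAL_RAR)
sftp.close()
transport.close()

# 2. Extract the .bak file from the .rar archive (requires an unrar backend).
archive = rarfile.RarFile(LOCAL_RAR)
archive.extractall(EXTRACT_DIR)

# 3. Restore the backup on the local SQL Server (RESTORE needs autocommit,
#    since it cannot run inside a transaction).
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;Trusted_Connection=yes;",
    autocommit=True,
)
cursor = conn.cursor()
cursor.execute(
    f"RESTORE DATABASE [ThirdPartyData] FROM DISK = N'{BAK_FILE}' WITH REPLACE"
)
while cursor.nextset():  # drain any informational result sets before closing
    pass
conn.close()
```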

My SharePoint 2010 SSRS Data Sources are being disabled, and I can't figure out why

I am having a problem with SSRS data sources being mysteriously disabled in a SharePoint 2010, SSRS integrated-mode environment.
When I set up the data source, it is saved and I verify that it is enabled. I can also successfully execute a report that uses that data source. At some point later, it is disabled. I can't figure out why, and it's driving me crazy.
Has anyone else experienced this?
My environment:
SharePoint 2010 Farm
One WFE
Two SharePoint-integrated SSRS app servers
One database server
Read these two articles:
Sharepoint dropping link to Data Source for SSRS reports
http://social.msdn.microsoft.com/Forums/sqlserver/en-US/91016bff-20e7-446a-bb70-7be407096faf/sharepoint-dropping-link-to-data-source-for-ssrs-reports
Reporting Services Data Source are getting disabled when deployed on SSRS Integrated mode
http://connect.microsoft.com/SQLServer/feedback/details/742238/reporting-services-data-source-are-getting-disabled-when-deployed-on-ssrs-integrated-mode
In a nutshell:
It was related to some sort of problem with the encryption keys set up in Reporting Services Configuration Manager. SharePoint Integration was also showing as Not Configured (despite the fact that it was working in integrated mode).
When I put in the Windows account password under Windows Service Identity, it once again prompted for saving the encryption key to a file. Once this was done, SharePoint Integration showed as Configured, and the nightly problem (at 2 AM, when the Reporting Services daily maintenance programs run) went away.