We have a SQL Server 2014 database that is basically a set of views into a third-party remote SQL Server instance that we connect to as a linked server. They are upgrading to 2016 and said we will soon need a 2016 instance to connect to their system.
I set up a new instance on the same server as our SQL Server 2014 instance and created a new database there. In the 2014 instance, we have a snapshot of a few tables, which we refresh a few times each day. Pulling data from their system works fine with this setup, but we have a couple of spots where we update the third-party database, and those stopped working after the change.
I either receive the error
'OLE DB provider "SQLNCLI11" for linked server "" returned message "The transaction manager has disabled its support for remote/network transactions.".'
or an error about there being no transactions. The second error only occurred while I was troubleshooting the issue yesterday.
I tried adjusting the DTC settings on our server, but that just gave me the second error instead, and it seemed to cause an issue with remote connections to our database.
This is all that is in the update that is breaking:
UPDATE [2016Instance].[DBName].dbo.EmpPers
SET eepAddressEMail = #CurEmpEmail
WHERE eepEEID = #CurEEID
Is there something else that needs to be set up for this to work? I'm considering reworking this to run from the SQL Server 2016 instance instead, but I thought I would ask here first.
I ended up getting this to work by moving our stored procedures to the SQL 2016 instance, setting up a link back to the SQL 2014 instance, and then changing the stored procedures on the SQL 2014 instance to just execute the new versions on the SQL 2016 instance. This approach also doesn't require the DTC settings mentioned in the comments on the main question.
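A minimal sketch of that pass-through arrangement, assuming hypothetical procedure, parameter, and database names (only the table/column names come from the question):

```sql
-- On the 2016 instance: the real procedure. It updates the third-party
-- database from here, so no distributed transaction spans two instances.
CREATE PROCEDURE dbo.UpdateEmpEmail
    @EEID  CHAR(12),
    @Email VARCHAR(255)
AS
BEGIN
    UPDATE [DBName].dbo.EmpPers
    SET eepAddressEMail = @Email
    WHERE eepEEID = @EEID;
END
GO

-- On the 2014 instance: the old procedure becomes a thin wrapper that
-- just executes the new version over the link to the 2016 instance.
CREATE PROCEDURE dbo.UpdateEmpEmail
    @EEID  CHAR(12),
    @Email VARCHAR(255)
AS
BEGIN
    EXEC [2016Instance].[DBName].dbo.UpdateEmpEmail @EEID, @Email;
END
GO
```

Because the wrapper only calls a remote procedure rather than issuing a remote UPDATE inside a local transaction, MSDTC never needs to get involved.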
I have an on-premises SQL Server database that is the backend for our project management software, an Azure SQL table that contains limited data used for reporting with Power BI, and a linked server connecting the two. Both databases have a dedicated user/password account just for this, which is stored in the linked server. Here's the problem:
When I run a SQL Server Agent job to update the Azure table from the on-prem table using the linked server, everything works fine.
When I manually run a SQL UPDATE statement from an open window in SSMS to do the same, everything works fine.
When I use a workflow in the project management software to trigger a stored procedure that executes the same code (update Azure from the on-prem database), I get the following error:
The OLE DB provider "SQLNCLI11" for linked server "LinkedServerName" reported an error. One or more arguments were reported invalid by the provider.
The operation could not be performed because OLE DB provider "SQLNCLI11" for linked server "LinkedServerName" was unable to begin a distributed transaction.
OLE DB provider "SQLNCLI11" for linked server "LinkedServerName" returned message "The parameter is incorrect.". Error occurred in: STORED_PROCEDURE_NAME[CRLF]Error occurred on line 23
There's nothing on line 23, and as I mentioned, the same update statement works when run manually and when run by a SQL Server Agent job. Why does it fail when the code is executed by the project management software? Does anyone have experience with this?
This is the code to insert the data from on prem into Azure:
INSERT INTO [LinkedServerName].DatabaseName.SchemaName.TableName ([ProjectNumber],[CreateDate],[SyncDate])
I'm not sure about this with Azure, but I had a similar issue with a remote server and had to disable promotion of distributed transactions. It might not be the best thing to do in a production environment, so read up carefully on the implications before doing this.
I'm only suggesting this to narrow down what the real issue is.
Change this setting and test.
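The setting in question is the linked server's "remote proc transaction promotion" option. A sketch, assuming the linked server name from the error messages:

```sql
-- Stop SQL Server from promoting local transactions to distributed
-- (MSDTC) transactions when this linked server is involved.
-- Use for diagnosis first; understand the transactional implications
-- before leaving it off in production.
EXEC sp_serveroption
    @server   = N'LinkedServerName',
    @optname  = N'remote proc transaction promotion',
    @optvalue = N'false';
```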
I ended up taking a different strategy. We know that using a scheduled SQL Agent job to insert data into Azure works; it just wouldn't work in any script run by our software under the user it uses to access the on-prem database. So I created a stored procedure in the on-prem database that the software executes through a built-in workflow. The stored procedure saves the data to a staging table and then starts the SQL Agent job, which reads from the staging table and inserts the data into the Azure table.
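A sketch of that pattern, with hypothetical procedure, table, and job names (only the column names come from the INSERT shown earlier):

```sql
-- The procedure the software's workflow calls.
CREATE PROCEDURE dbo.QueueAzureSync
    @ProjectNumber VARCHAR(50),
    @CreateDate    DATETIME
AS
BEGIN
    -- 1) Stage the row locally. No linked-server call happens here,
    --    so no distributed transaction is started.
    INSERT INTO dbo.AzureSyncStaging (ProjectNumber, CreateDate, SyncDate)
    VALUES (@ProjectNumber, @CreateDate, GETDATE());

    -- 2) Kick off the Agent job that pushes staging rows to Azure
    --    over the linked server.
    EXEC msdb.dbo.sp_start_job @job_name = N'Sync staging to Azure';
END
```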
Everything worked in the testing environment, but when I replicated all the scripts into production I got a permissions error. After a lot of research and testing adjustments to the user, I got it working by adding the user to the TargetServersRole and db_ddladmin roles in the msdb database.
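Since sp_start_job lives in msdb, the calling user needs membership there. A sketch with a hypothetical user name:

```sql
USE msdb;
-- Grant the application's database user the msdb role memberships
-- that allowed sp_start_job to succeed in this case.
ALTER ROLE TargetServersRole ADD MEMBER [AppServiceUser];
ALTER ROLE db_ddladmin       ADD MEMBER [AppServiceUser];
```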
ssms screenshot
Below are the two articles that led me to this conclusion:
Article 1
Article 2
I have tried creating a sequence for my table in Microsoft SQL Server Management Studio 18. I get a syntax error for some reason (although I have checked the syntax multiple times), and I cannot even find the Sequences folder for my database. I have also tried omitting the schema name, but the same error appears. What might be the problem? Where am I going wrong?
Here is a screenshot of the problem
First note that SSMS (SQL Server Management Studio) is just a client application that talks to a connected SQL Server instance in the background. So SSMS only passes your SQL statements to the connected SQL Server and shows you the results that SQL Server returns. Nothing more.
What is the version of the SQL Server instance to which your SSMS is connected? (You can check it quickly by executing the SQL statement SELECT @@VERSION.)
You should be aware that the CREATE SEQUENCE statement is only supported by SQL Server 2012 and higher.
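For reference, a minimal sequence definition (hypothetical names) that parses only on SQL Server 2012 or later; on an older instance it fails with a syntax error exactly as described:

```sql
-- Create a simple integer sequence (SQL Server 2012+ only).
CREATE SEQUENCE dbo.OrderNumbers
    AS INT
    START WITH 1
    INCREMENT BY 1;

-- Consume the next value.
SELECT NEXT VALUE FOR dbo.OrderNumbers;
```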
I am working on a project migrating some legacy SQL Server 2000 instances to SQL Server 2012. As the word legacy suggests, these databases are used by VB-based desktop applications. The application has around 4,000+ users and is rated GOLD (meaning it has to be up 24x7).
Summary
Current state: Desktop-installed VB applications -> SQL Server 2000
Target state: Desktop-installed VB applications -> SQL Server 2012
The application uses a config file containing the details of the SQL Server it connects to. So once the data moves to the new SQL Server, this config file needs to be updated with the new server details.
I have been told that SQL Server 2000 can't be migrated directly; it should first go to SQL Server 2008 and then to SQL Server 2012. Please correct me if this understanding is wrong.
My problem is the implementation plan for this task in production. I can't move all users in one go; I would migrate 100 users first, then a few hundred more, and finally the rest. That means some users might start using SQL Server 2012 while others are still working with SQL Server 2000. I don't want to do everything in one go because it's too risky in case of any glitch, and because the application has to be up 24x7, it's not possible to bring the applications down and update the config file on each user's desktop.
But if I let 2000 and 2012 run together (say, for a week until all users have moved), the databases will drift out of sync, and I don't think they can be merged later because both databases may have assigned the same primary keys to different data.
I can't bring the application down and take a four-hour outage to move all users to the new databases in one shot, because the application has to be up 24x7.
Can anyone recommend an approach that companies generally take to migrate SQL Server without an outage, as described above, while keeping the data consistent?
The easiest way to handle this is to create a new 2012 instance and create a database from a restore of the 2000 database. Then set up replication between the two databases so that changes in either database are published to the other; that way your primary keys stay in sync. You will have to be down for a short period while you do the backup and restore to move the data, but assuming the two servers are co-located, it should only be a matter of minutes. Once all your users have been migrated, just turn off the 2000 server.
My SQL Server 2012 server, which has several linked servers, hangs when I try to add a linked server to an Access database using the ACE 12 provider. Once it hangs, SQL Server Management Studio stops displaying linked server catalog objects. I have enabled Allow InProcess on the ACE provider.
How do I fix this/add the Access DB as a linked server without this problem?
For context, I have an MS Access database (about 1 GB in size) with about a dozen tables that I want to load into corresponding tables in a SQL Server 2012 database. The data is loaded on demand when a user presses a button in an Excel worksheet. (The worksheet calls a stored procedure, and the stored procedure calls a DTS package.)
I want to change this process to remove the DTS package because users will no longer have permission to run DTS packages or xp_cmdshell.
I think the DTS package could easily be replaced with a linked server or even an OPENDATASOURCE query, but I'm having trouble with the linked server (it constantly hangs when I try to add it).
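For comparison, a sketch of both routes mentioned; the server name and file path are hypothetical, and OPENDATASOURCE additionally requires the "Ad Hoc Distributed Queries" server option to be enabled:

```sql
-- Let the ACE provider run inside the SQL Server process
-- (the "Allow InProcess" checkbox in SSMS).
EXEC master.dbo.sp_MSset_oledb_prop
    N'Microsoft.ACE.OLEDB.12.0', N'AllowInProcess', 1;

-- Route 1: register the Access file as a linked server.
EXEC sp_addlinkedserver
    @server     = N'AccessDB',
    @provider   = N'Microsoft.ACE.OLEDB.12.0',
    @srvproduct = N'Access',
    @datasrc    = N'C:\Data\Imports.accdb';

-- Route 2: skip the linked server with an ad hoc query
-- (Access has no catalog/schema, hence the elided name parts).
SELECT *
FROM OPENDATASOURCE(
    'Microsoft.ACE.OLEDB.12.0',
    'Data Source=C:\Data\Imports.accdb')...[TableName];
```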
I have a main SQL Server running SQL Server 2000, with two (in theory) subscribing servers, each running SQL Server 2005.
One of these subscribes fine, but the other always seems to fail, both when attempting to set up the subscription from the publisher (SQL 2000) to the subscriber (SQL 2005) and when trying to set it up from the subscriber to the publisher, both via SQL Server Management Studio 2008 and via SQL Enterprise Manager.
In both cases, the publication is created on the publisher, but a corresponding subscription is not created on the subscriber.
I then get an error message saying "The process could not connect to Subscriber [ServerName]", and no further sign of activity. There's no problem with logins, permissions, etc. The password for sa is the same on both machines, and is different on the 2005 machine that works.
Is this a problem anyone else has encountered?
EDIT: I've now tried adding dedicated dbSubscriber and dbPublisher accounts on each server so that they're not logging into each other as sa, but it doesn't seem to have made any difference.
EDIT2: Adding a push subscription does not create a Local Subscription on the subscribing server. Is this normal, or is this the point at which everything is falling to pieces?
Thanks for posting an update, always good to know how things turned out.
There are "complications" and intracacies involved when creating SQL Server Replication topologies incorporating different versions of SQL Server, as it sounds like you are discovering.
Keep in mind that Replication functionality is limited to that of the oldest version of SQL Server in your topology:
Using Multiple Versions of SQL Server in a Replication Topology
We never really understood what was going wrong, but we think the 2005 server was unable to accept the 2000 server as a push publisher.
We created four different Pull subscriptions on the 2005 server and the first three failed, while the fourth magically worked.
We are accepting this as a blessing from the God of Computers and will not question His benevolence.