SQL Server creating external data source - sql

I'm trying to create an external data source in SQL Server 2019, from one SQL Server instance to another.
I have done everything like in documentation (https://learn.microsoft.com/en-us/sql/relational-databases/polybase/polybase-configure-sql-server?view=sql-server-ver15).
To create the external data source I use the following command:
CREATE EXTERNAL DATA SOURCE SQLServerInstance
WITH (
    LOCATION = 'sqlserver://SQL2:port',
    PUSHDOWN = ON,
    CREDENTIAL = MyCredentials
);
But I keep getting the following error:
Msg 46721, Level 20, State 1, Line 1
Login failed. The login is from an untrusted domain and cannot be used with Integrated authentication.
What can I do to fix this?

Are you running the DDL from a local Windows account (i.e., non-domain joined machine)? There is a regression in SQL Server 2019 where you will get this error when trying to use PolyBase. We are in the process of fixing the issue in an upcoming CU.
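If the local-account regression isn't the cause, it's also worth double-checking the credential referenced in the DDL. A minimal sketch of that prerequisite setup as the linked documentation describes it (the identity, secret, and password below are placeholders):
-- Database master key, then a database scoped credential holding a SQL login
-- that exists on the remote instance (placeholder values)
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Str0ngP@ssword!';
CREATE DATABASE SCOPED CREDENTIAL MyCredentials
WITH IDENTITY = 'remote_sql_login',
     SECRET = 'remote_login_password';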


SQL Server & RStudio - SQL Connection Almost Working

I've been running into an issue in R Studio with a SQL connection.
We've had an on-prem SQL Server that's been upgraded over the years, and the colleague who set it up is no longer with the organization.
We also have an Azure server running SQL Server that was set up much more recently, before they departed.
We have a GUI program we're currently developing. One of the early steps is a SQL login connection for the user: the login variable (db_user) changes with their login, and the password is passed correctly via environment variables defined in .Renviron, as described on RStudio's site.
Our initial connection looks like this; this is the code that starts the connection, and where I believe the issue may lie:
db_conn_onprem <- DBI::dbConnect(odbc::odbc(),
                                 Driver = "SQL Server",
                                 Server = Sys.getenv("server"),
                                 Database = Sys.getenv("database"),
                                 UID = Sys.getenv("db_user"),
                                 PWD = Sys.getenv("PWD"))
When the Azure connection succeeds, it connects as dbo#Azure\Azure, versus the on-prem connection's guest#Server\Server.
(I can't post in-line screenshots yet)
On-Prem Connection Screenshot: https://i.ibb.co/PmbGt5y/RStudio-SQL.png
Azure Connection Screenshot: https://i.ibb.co/WFY3FqZ/azure1.png
I feel this is something dbo-related since that's where the connection drops.
(variable names anonymized)
Now for the issue:
Whenever we attempt to run a series of queries, our on-prem errors out with this:
Error: nanodbc/nanodbc.cpp:1655: 42000: [Microsoft][SQL Server][SQL Server]Cannot execute as the server principal because the principal "db_user" does not exist, this type of principal cannot be impersonated, or you do not have permission.
<SQL> 'EXECUTE AS LOGIN = 'db_user' SELECT name FROM master.sys.sysdatabases WHERE dbid > 4 AND HAS_DBACCESS(name) = 1 ORDER BY name ASC'
However, running the exact same procedure against the SQL Server in Azure, with essentially no extra configuration, succeeds.
Here's the SQL Code we run:
EXECUTE AS LOGIN = 'db_user'
SELECT name
FROM master.sys.sysdatabases
WHERE dbid > 4
AND HAS_DBACCESS(name) = 1
ORDER BY name ASC
I feel like I've exhausted my resources on this. At first I thought it was the initial R code or possibly the SQL drivers, but I don't believe that's the issue, since the driver pulls a list of names into RStudio's Connections pane and only bounces back the error when attempting to complete the query.
Whenever I search for references to this error, I see
Cannot execute as the server principal because the principal "dbo" does not exist, this type of principal cannot be impersonated, or you do not have permission.
listed as the most commonly related error to the one I'm experiencing. I've tried a number of those fixes (from blank DB ownership to unrelated solutions), but I've mostly hit a wall here.
Any assistance would be greatly appreciated.
I feel this is something dbo-related since that's where the connection drops, but I have no clue where to go next on this issue.
Yep.
This
EXECUTE AS LOGIN = 'db_user'
requires IMPERSONATE permission on the login, which is exactly what the error message is telling you. It's unclear why you want to impersonate that login instead of simply connecting as that login to begin with.
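If the impersonation really is needed, a minimal sketch of the grant (run by a sysadmin; 'app_login' is a placeholder for whatever login the R code actually connects as):
-- Allow the connecting login to impersonate db_user (placeholder grantee name)
GRANT IMPERSONATE ON LOGIN::[db_user] TO [app_login];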

Could not find server error when parsing the import query

I'm trying to create a new import SSIS package on my production server. I'm receiving an error that I didn't receive on my development server.
I'm using the SQL Server Import and Export Wizard launched from within SSMS. I right-clicked the database I want to import data into, chose Tasks, and then Import Data. I selected the data source using the SQL Server Native Client, then selected the destination, again using the SQL Server Native Client. On the next screen I chose to use a query. I imported the query that I used on my development system and just changed which database to look at. When I click on Parse I receive this message:
Deferred prepare could not be completed. Statement(s) could not be prepared. Could not find server '' in sys.servers. Verify that the correct server name was specified. If necessary, execute the stored procedure sp_addlinkedserver to add the server to sys.servers.
Source is on another machine, different SQL instance. Destination is the server that is showing up in the error message.
This is the query that is working in development, but not production:
DECLARE @lasttran_num INT;

SELECT @lasttran_num = last_tran
FROM DB01.ATR_App_plt2.dbo.lasttran_mst
WHERE lasttran_mst.lasttran_key = 5000;

SELECT ID, CAST(Coil AS nvarchar(15)) AS 'Lot', KgNetWt, TCode, TransactionDateTime
FROM Transactions trx
WHERE trx.ID > @lasttran_num;
I know this is working in development because I set up a job to run the SSIS package for two weeks. I checked it daily and it was, indeed, importing the new records.
The issue is the four-part name:
DB01.ATR_App_plt2.dbo.lasttran_mst
DB01 is the SQL Server instance name, and most likely it is different in the PROD environment.
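To confirm, you can check which server names the production instance actually knows about, and register the missing one if needed (a sketch; the provider and data source values below are assumptions):
-- List the servers this instance knows about
SELECT name FROM sys.servers;
-- Register the linked server if it is missing (placeholder provider and data source)
EXEC sp_addlinkedserver @server = N'DB01', @srvproduct = N'', @provider = N'SQLNCLI', @datasrc = N'DB01';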

SQL Azure Export Data-Tier Application & import into local SQL server

I have a SQL Azure database. I'm able to export the Database using Tasks > Export Data Tier Application. This is successful.
I then try to use Import Data Tier Application on my local SQL Server and I get the following error:
Could not import package.
Warning SQL0: A project which specifies Microsoft Azure SQL Database v12 as the target platform may experience compatibility issues with SQL Server 2008.
Warning SQL72012: The object [db_Data] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
Warning SQL72012: The object [db_Log] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
Error SQL72014: .Net SqlClient Data Provider: Msg 102, Level 15, State 1, Line 1 Incorrect syntax near 'CREDENTIAL'.
Error SQL72045: Script execution error. The executed script:
CREATE DATABASE SCOPED CREDENTIAL [databasenameAzureStorageCredential]
WITH IDENTITY = N'SHARED ACCESS SIGNATURE';
I have SQL Server Management Studio 14.0.17289.0 and everything is up to date.
I have read different posts on Stack Overflow and done some googling, but I'm unsure of the best way to move forward. How can I solve this?
It seems like there is a compatibility-level difference between your local SQL Server database and the Azure SQL database. Check your compatibility level; if it is mismatched, that is what needs to be addressed. The error occurred because you used SSMS version 'X' to generate the bacpac against Azure SQL version 'Y'. Try to generate the same bacpac using SSMS version 'Y'; that worked for me.
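If it helps, a quick way to check the level on both servers (a sketch; 'YourDb' is a placeholder for the database name):
-- Compare the compatibility level of the database on each server (placeholder name)
SELECT name, compatibility_level FROM sys.databases WHERE name = N'YourDb';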
Please download the latest version of SQL Server Management Studio to have the best user experience with Azure SQL Database. SSMS v14 is too old; the current version of SSMS is v17.9.
Remove (drop) the database scoped credential named "databasenameAzureStorageCredential" before exporting the database. The following query should give you a list of credentials created.
SELECT * FROM sys.database_scoped_credentials
In general, you need to remove references to external sources before exporting your database.
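For reference, removing that credential before the export would look roughly like this (the credential name is taken from the error message above):
-- Drop the credential that the import script chokes on
DROP DATABASE SCOPED CREDENTIAL [databasenameAzureStorageCredential];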

SQLVDI error when backing up to Azure Storage BLOB

I'm running a patched SQL Server 2014 and trying to back up a database to one of our Azure Storage blobs, using:
BACKUP DATABASE [DB]
TO URL = N'https://storage.blob.core.windows.net/server-mssqlserver/DB.bak'
WITH CREDENTIAL = N'AzureCredential'
,NOFORMAT
,NOINIT
,NAME = N'DBA_DB-Full Database Backup'
,NOSKIP
,NOREWIND
,NOUNLOAD
,COMPRESSION
,STATS = 5
GO
but the query throws the following error:
Msg 3292, Level 16, State 9, Line 1
A failure occurred while attempting to execute Backup or Restore with a URL device specified. Consult the Windows Event Log for details.
Msg 3013, Level 16, State 1, Line 1
BACKUP DATABASE is terminating abnormally.
Checking the server's Event Logs shows the actual error as:
SQLVDI: Loc=IdentifySQLServer. Desc=MSSQLSERVER. ErrorCode=(5)Access is denied.
. Process=4668. Thread=6596. Client. Instance=MSSQLSERVER. VD=.
I have made sure that the SQL Server Agent service's account has the Create global objects policy, and also made sure the SQL VSS Writer service is running under the Local System account. The error keeps happening!
Is there something I can do to fix it, or at least to log more detailed error messages than the "SQLVDI: Loc=IdentifySQLServer" one above?
Crikey, this is one of those "no idea how I fixed it" things.
Before I went to lunch, I could reliably generate the error in question by running the provided T-SQL, yet when I came back from lunch the BACKUP command completed fine!
The main thing I remember changing was on the Azure side, where I created a SAS (Shared Access Signature).
This is supposedly not required for SQL 2014, as it uses an actual Azure credential to connect to the storage instead. I actually created the SAS for an instance of SQL Server 2016 that I want to back up to the same container, and that may have opened up access to the container for SQL Server 2014 too!
Many thanks to Sean Gallardy for recommending ProcMon, which showed a whole ton more error log information than the "ErrorCode=(5)Access is denied" message did.
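For context, the access-key style credential that SQL Server 2014 backup-to-URL expects looks roughly like this (the account name and key below are placeholders):
-- Credential holding the storage account name and access key (placeholder values)
CREATE CREDENTIAL AzureCredential
WITH IDENTITY = 'mystorageaccount',
     SECRET = '<storage-account-access-key>';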

SQL Server 2012 Migrating Spatial data across a Linked server, Query timeout?

We are migrating from our old system (SQL Server 2008) to the new system (SQL Server 2012). The data sources are remote, so we have them configured as linked servers. The data we are migrating includes spatial data (geography type). We migrate the data per customer, so some customers have more data than others; we batch the data and use OPENQUERY to pull the spatial data across. For customers with less data the migration completes successfully, but for customers with more than a couple of million records in one table, the migration stops and gives mainly two errors.
This is what the error looks like:
OLE DB provider "yyy" for linked server "xxx" returned message "Query timeout expired".
Msg 7399, Level 16, State 1, Server nnn, Line 1
The OLE DB provider "yyy" for linked server "xxx" reported an error. Execution terminated by the provider because a resource limit was reached.
Msg 7320, Level 16, State 2, Server ttt , Line 1
Cannot execute the query "
select top (200000)
[row] = row_number () over ( order by t.[x])
, .....
, [Spatial] = cast(ts.[Spatial] as varbinary(max))
from [..].[..].[..] t
join [...].[..].[… ] s
on t.[..] = s.[...]
where (t.[x] > '00000000-0000-0000-0000-000000000000')
and v.[x] = x
order by t.[x]
" against OLE DB provider "yyy" for linked server "xxx".Build step 'Execute Windows batch command' marked build as failure
This problem also happened with one other table that doesn't have spatial data in it.
The approaches we have tried:
We increased the timeout of the query.
We dropped the batch size to 200,000 per batch.
The provider is in "in process" mode.
We only have a couple of linked servers, so the buffer size is more than acceptable.
We ran the migration using an admin role to make sure it's not a permissions problem.
We are thinking this might be a network problem, but it's not a load balancer issue; maybe it's something else.
The other error that comes frequently is
HResult 0x40, Level 16, State 1
TCP Provider: The specified network name is no longer available.
Any ideas about what the reason could be would be much appreciated.
Thank you,
Lsaif
I would say the "The specified network name is no longer available." error indicates no response from the remote server. Since SQL Server hasn't "heard" from the remote server in a while, it gives up. I would cut down the batch size to something really small and increase with success (rather than the other way around). That way you'll find a batch size that works. Also, this may vary between "customers" depending on your connection to them (i.e., type and size of line, traffic on the line, etc.).
Personally, I like the BCP OUT/BCP IN option as well because I know it works; however, you still have to consider how the data gets transferred from the remote server. If you have a robust enterprise MFT over a dedicated T1 or better, you probably won't have an issue.
SSIS offers more of a direct transfer but I imagine you'll run into the same network issues you're having now. That said, you can create a general BCP solution within SSIS.
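If you stay with the linked server approach, a minimal sketch of starting from a much smaller OPENQUERY batch (the linked server [xxx] comes from your error message; the table and column names here are placeholders):
-- Pull a small batch through OPENQUERY and grow the TOP value once it completes reliably
INSERT INTO dbo.Target_Staging (SomeKey, SpatialBytes)
SELECT SomeKey, SpatialBytes
FROM OPENQUERY([xxx],
    'SELECT TOP (10000)
            t.SomeKey,
            CAST(t.SpatialColumn AS varbinary(max)) AS SpatialBytes
     FROM dbo.SourceTable t
     WHERE t.SomeKey > ''00000000-0000-0000-0000-000000000000''
     ORDER BY t.SomeKey');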