Issue with SSIS package executing procedure that reads from a file share

I have an SSIS package that runs via a SQL job on an SSIS server (Server A) and executes a stored procedure on the database server (Server B). The stored procedure does a BULK INSERT from a file share located on the SSIS server (Server A). However, every time the stored procedure runs, it fails with the following error:
Execute Membership Process:Error: Executing the query "exec storedprocname ?, ?" failed with the following error: "Cannot bulk load because the file "\\ServerA\TestLoads\Membership\Processing\filename.csv" could not be opened. Operating system error code 5(Access is denied.).". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
I am pretty positive the issue is permissions-related. If I store the files on the database server (Server B), then the process works. It also works if I run the proc manually.
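For context, the bulk load inside the proc is essentially the following (a simplified sketch; the table name and options are placeholders, not the actual proc):

BULK INSERT dbo.Membership
FROM '\\ServerA\TestLoads\Membership\Processing\filename.csv'
WITH (
    FIELDTERMINATOR = ',',  -- column delimiter in the CSV
    ROWTERMINATOR = '\n',   -- row delimiter
    FIRSTROW = 2            -- skip the header row
);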
When I execute the job on Server A, it runs as that server's service account. That account has full access to both Server A and Server B (admin in SQL and on the servers). I believe the credentials get passed on the first hop, but they are not carried through once the stored proc runs. I ran Wireshark on Server A (the SSIS server) to see what was accessing the share and to get more information. What I found was that no account information was being passed at all; it was just blank.
I went through a lot of steps just to see if I could get it to work, such as granting Everyone access to the share, enabling the guest account, allowing anonymous users, etc. Not things I would want to keep, but I was trying to narrow down the issue. None of them worked.
I tried modifying the stored proc to use WITH EXECUTE AS OWNER. It still did not work, but I got a different error. I also tried a variety of other accounts to execute as and got the same error each time:
Execute Membership Process:Error: Executing the query "exec [storedprocname] ?, ?" failed with the following error: "Could not obtain information about Windows NT group/user '', error code 0x5.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
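For reference, the EXECUTE AS change was along these lines (a simplified sketch; the parameter names and body are placeholders):

ALTER PROCEDURE dbo.storedprocname
    @FileName NVARCHAR(260),   -- placeholder parameters matching "?, ?"
    @BatchId  INT
WITH EXECUTE AS OWNER          -- only this clause was added
AS
BEGIN
    -- the proc body is unchanged; the bulk load runs under the owner's context
    BULK INSERT dbo.Membership
    FROM '\\ServerA\TestLoads\Membership\Processing\filename.csv';
END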
I have tried a variety of solutions that I found online, and nothing so far has worked.
I understand this is not an ideal design. I was under the impression that the developers were using SSIS to load the file initially and then using SQL for the rest of the process, which would have worked. But because SQL Server has to touch the file system, it keeps failing, and at this stage rewriting the process is not an option. The process does work if we move the files to the database server (Server B), but that defeats a large part of the reason for having the SSIS server in the first place, which was to get file processing off of the database server.
Any ideas on how to get the current solution to work? Basically, I think I need a way, while the SSIS package is running, for the stored proc to pass credentials through to the file share.
We are using Windows Server 2012 R2 on both servers and SQL Server 2012 SP3 Developer Edition.
Thanks for the help!

I've had this issue before, and while I still don't fully understand Kerberos authentication, setting it up fixed it for me. It's something to do with the "double hop" of authentication, i.e. credentials going from SSIS, through SQL Server, to a network server.
Try setting up Kerberos Authentication for SQL Server. There are detailed step-by-step instructions with screenshots here => Setup Kerberos Authentication for SQL Server
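As a quick sanity check (a standard DMV query, not taken from the linked post), you can ask SQL Server which scheme a connection actually used; if it reports NTLM rather than KERBEROS, the credentials cannot make the second hop to the file share:

SELECT auth_scheme
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;  -- returns KERBEROS, NTLM, or SQL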
I understand this is like a "link-only" answer, but I don't want to copy-paste and plagiarize the author's original work (i.e. the blog post), hence the link.

Related

SQL BULK INSERT using UNC path

I have a developer PC "A" and a SQL Server "B".
My SQL Server is a Microsoft SQL Server 2019.
On server B, I have a database with a stored procedure, which bulk loads data from a text file (using the BULK INSERT command).
I have now created an SSIS project on server A, which calls the stored procedure on server B using the "Execute SQL Task". The connection on the task is pointing at the database on server B. I have tried using the OLE DB connection and ADO.
When I place my text file on server B and reference the file like D:\myFolder\myFile.txt, everything works fine.
When I place my text file on server A and reference the file like \\A\myShare\myFile.txt, it fails. The error I get is:
[Execute SQL Task] Error: Executing the query "exec BulkInsert '\\A\myShare\myFile.txt'" failed with the following error: "Cannot bulk load because the file "\\A\myShare\myFile.txt" could not be opened. Operating system error code 5(Access is denied.).". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
I have tried logging on to server B and opening the file via File Explorer using the above path. It works fine. And yes, I am logged in as the same user on both servers A and B.
I have also tried giving "Everyone" read/write access to the share, but still the same poor result.
I am only interested in knowing how to fix this problem, using the stored procedure call - I do not wish to rebuild its functionality in SSIS (the stored procedure is maintained by an external company and they may change it at their will, but we agree on how I can call it).
According to what I read from Microsoft, it should not be a problem to call BULK INSERT using a UNC path.

According to what I read from Microsoft, it should not be a problem to call BULK INSERT using a UNC path.
This is true: UNC paths are supported by the BULK INSERT command.
Based on the Microsoft documentation, BULK INSERT has three main requirements:
The server must have permission to access both the file and the destination database.
The server runs the Bulk Insert task. Therefore, any format file that the task uses must be located on the server.
The source file that the Bulk Insert task loads can be on the same server as the SQL Server database into which data is inserted, or on a remote server. If the file is on a remote server, you must specify the file name using the Universal Naming Convention (UNC) name in the path.
The first requirement means that you need to grant the SQL Server service account access to the UNC path, not the Windows account you are logged in with.
You should refer to the following articles to find the SQL Server service account name:
Configure File System Permissions for Database Engine Access
How to Find Service Account for SQL Server and SQL Server Agent?
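For example, on SQL Server 2008 R2 SP1 and later you can query the service account directly (this is a standard DMV, no special setup required):

SELECT servicename, service_account
FROM sys.dm_server_services;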
In addition, you can learn more about SQL Server service accounts and permissions in the following documentation:
Configure Windows Service Accounts and Permissions
Alternative - Mapping network drive
As an alternative, you can try mapping the network drive within SQL Server. You can check the following articles for more information:
Make Network Path Visible For SQL Server Backup and Restore in SSMS
How to Map Network Drive as Fixed Drive?
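A minimal sketch of the mapping approach, assuming xp_cmdshell is enabled (the drive letter is a placeholder, and the mapping exists only for the SQL Server service session, so it disappears after a service restart):

EXEC xp_cmdshell 'net use Z: \\A\myShare';
-- the stored procedure can then reference Z:\myFile.txt instead of the UNC path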

SSIS flat file folder permission error when NOT running from SQL Server Agent

Setup: A pretty standard data export SSIS package (SQL Server 2016 compatible), created in VS2019/Data Tools and deployed using the SSIS Project Deployment model to the Integration Services Catalog of a SQL Server 2016 instance. The package creates files in a network folder before sending the file out via FTP and putting a copy of the file in a Sent folder.
The project requirements include having the package running on a schedule using "default" parameter values, as well as allowing users to manually run the package using "non-default" parameter values from within a stand-alone application.
Current behavior: the package behaves correctly when run from a SQL Server Agent Job that is configured with a SQL proxy and credentials mapped to a domain login with the proper permissions for the network folder.
Problem: the Data Flow task fails to create the file with a "Cannot open the datafile" error when running the package directly using any of the following methods (even when the "current" session is using the same credentials as the SQL Server Credentials/Proxy used by the SQL Server Agent Job):
Using SSMS to right-click on the package and selecting Execute
Using the DTEXEC command-line utility
Using the SSISDB.catalog.start_execution SQL Server stored procedure
As far as I'm aware, these are the only methods capable of starting an SSIS package while changing the package's parameter values. I either need to get one of the latter two methods to work, find another option that allows changing the parameter values while launching the package, or use one of the two techniques I'm aware of (detailed below), which would add yet another failure point to the process as well as other potential issues.
Note: If the process is changed to initially create the file on the SQL Server's local hard drive, then the Data Flow task succeeds, but the later copy-to-Sent-folder task fails with a very similar permissions error.
Alternative #1: this technique requires creating a new table, loading the parameter values into the table, and changing the package to check the table and potentially set its parameters/variables based on what it finds. The package can then be launched using a SQL Server Agent Job (for which there are multiple ways to launch manually), and if the calling object has correctly populated the table, the package behaves as if its parameters were changed at runtime; otherwise it runs with the default values.
Alternative #2: Change all folders used by the package to point to folders local to the SQL Server instance and then create a separate scheduled task/application/whatever, with the valid credentials, that would synchronize or move the files to their proper network folders.
even when the "current" session is using the same credentials as the SQL Server Credentials/Proxy used by the SQL Server Agent Job
This is probably because the account is not logged on locally at the SQL Server, and so it's a Double-Hop Impersonation scenario, and would require Kerberos Constrained Delegation to be configured.
And you are correct in assessing the options. The general solution is to invoke catalog.start_execution from a session running on the SQL Server, and an Agent Job is the simplest built-in way to do this (the others being xp_cmdshell, Service Broker Activation, or SQL CLR).
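A minimal sketch of what such an Agent T-SQL job step could run (the folder, project, package, and parameter names are placeholders):

DECLARE @execution_id BIGINT;

-- create an execution for the deployed package
EXEC SSISDB.catalog.create_execution
    @folder_name     = N'MyFolder',
    @project_name    = N'MyProject',
    @package_name    = N'MyPackage.dtsx',
    @use32bitruntime = 0,
    @execution_id    = @execution_id OUTPUT;

-- override a package parameter with a non-default value
EXEC SSISDB.catalog.set_execution_parameter_value
    @execution_id,
    @object_type     = 30,              -- 30 = package-scoped parameter
    @parameter_name  = N'MyParam',
    @parameter_value = N'non-default value';

-- start the execution (asynchronous by default)
EXEC SSISDB.catalog.start_execution @execution_id;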

SQL xp_cmdshell copy files between servers

I am trying to move all .zip files in a specific folder to another folder. The source folder is located on another server. Currently I am using:
EXECUTE xp_cmdshell 'copy \\server1\e$\ETL\*.zip \\server2\e$\ETL\'
GO
This works if I am logged into both servers, but the goal is to automate the process via a SQL Server Agent job. I have tried:
EXECUTE sp_xp_cmdshell_proxy_account 'domain\useracc','pass'
GO
EXECUTE xp_cmdshell 'copy \\server1\e$\ETL\*.zip \\server2\e$\ETL\'
GO
but I am receiving the following error:
An error occurred during the execution of sp_xp_cmdshell_proxy_account. Possible reasons: the provided account was invalid or the '##xp_cmdshell_proxy_account##' credential could not be created. Error code: '0'.
I am also not sure whether this is the right solution. Please help with how I can achieve this. The file names on server1 change in name and quantity every day.
I would strongly advise: do not use xp_cmdshell. It opens up large security holes in your surface area and makes you vulnerable to attack. xp_cmdshell should be disabled!
Instead, if you want to automate this with SQL Server Agent, you have two options. My preference would be to write a simple SSIS package with a File System task and schedule that package with Server Agent. SSIS is underutilized for this kind of task but is actually pretty good at it.
Alternatively, re-write your script to use a Server Agent CmdExec job step. This does not require xp_cmdshell to be enabled and reduces the attack surface.
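A sketch of adding such a step to an existing Agent job (the job and step names are placeholders; the command is the same copy from the question):

EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Copy ETL zips',    -- assumes a job with this name already exists
    @step_name = N'Copy zip files',
    @subsystem = N'CmdExec',
    @command   = N'copy \\server1\e$\ETL\*.zip \\server2\e$\ETL\';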
I found that the following worked for me:
In the command prompt, type services.msc; this opens the list of all services on the server.
In the list of services, look for SQL Server Agent, right-click -> Properties, and go to the Log On tab.
Change the logon to a user with access on both servers, then re-write your script to use Server Agent CmdExec job steps (thank you Pete Carter).

Whose logon is being used in a SQL Server 2008 Agent Job

I have an SSIS package that transfers some tables to CSV files on a network drive; it runs fine when I run it manually from my computer. I store it on the server in the MSDB database and execute it from there, and it runs fine. But when I create a job with one step that runs the SSIS package from MSDB, it fails, saying it can't find the CSV file name.
I spent all day yesterday figuring out that this means a permissions issue with whatever logon credentials are being used by the job. The job owner shows as domain\myuserid, and the step properties show it is using Windows Authentication with my username. The problem is, I know I have access to this folder.
The first line of the error log says: "Executed as user: servername\SYSTEM". So I made sure the SYSTEM user has access to the network folder I want to load the files to, but I still get the same error.
The command line looks like @command=N'/SQL "\SSIS package name" /SERVER servername /CHECKPOINTING OFF /REPORTING E'
Edit: I found SQL Server agent job account issue, where someone asks who the job runs under, and marc_s says: "I can't seem to find any definitive answers on that one, really. Since my Jobs typically select and update stuff in the database, I am lead to assume that the 'Owner' account will be used by default, unless you specify some other account on a given step".
This also leads me to believe it is using my logon information, which does have access.
The best practice that we've been able to come up with here is to make a domain account for SSIS and then set up a proxy in SQL Server that is used to run the SSIS package in a SQL job.
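A sketch of that setup in T-SQL (the account, password, and object names are all placeholders):

-- 1. store the domain account as a server-level credential
USE master;
CREATE CREDENTIAL SSISProxyCredential
    WITH IDENTITY = N'DOMAIN\ssis_service',    -- placeholder domain account
    SECRET = N'ItsPassword';                   -- placeholder password

-- 2. create an Agent proxy on top of the credential
USE msdb;
EXEC dbo.sp_add_proxy
    @proxy_name      = N'SSISProxy',
    @credential_name = N'SSISProxyCredential',
    @enabled         = 1;

-- 3. allow the proxy to run SSIS package job steps
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name   = N'SSISProxy',
    @subsystem_id = 11;                        -- 11 = SSIS package execution

The job step is then pointed at the proxy via its "Run as" option (or sp_add_jobstep's @proxy_name parameter).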
I would say that the servername\SYSTEM account is a local account, and therefore won't have access to network folders on other servers.
You probably want to run this as a domain account of some sort, which does have access.
Typically this will be the SQL Server Agent service, so check the Services list in the Control Panel, see which account is running the Agent, and if necessary change it to the appropriate account.
This may have knock-on consequences though, so be careful about what other jobs are running.

Attaching catalog with SQL Authentication credentials attaches it as Read-Only

As part of our product's installation process, a database is attached to the server.
We use EXEC sp_attach_db in order to attach it to MSSQL.
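For reference, the attach call is essentially this (the database name and file paths are placeholders):

EXEC sp_attach_db
    @dbname    = N'OurProductDb',
    @filename1 = N'D:\Data\OurProductDb.mdf',
    @filename2 = N'D:\Data\OurProductDb_log.ldf';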
The problem occurs when we try to attach it using a "SQL Authentication" connection string: the database is attached to the server as read-only, thus preventing any write access.
This is driving us nuts... it works just fine with Windows Authentication, and the only difference is the connection string. I tried googling for it, but found no mention of such a scenario.
Any ideas anyone?
It's important to mention that the MDF/LDF physical files do not have the "Read-only" attribute set, so that is not the problem.
I'm curious why you can't attach the database with write permissions and then just make the whole thing read-only after the install?
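If it helps, a sketch of toggling the flag yourself at the end of the install (the database name is a placeholder):

-- clear the read-only flag in case the attach left it set
ALTER DATABASE OurProductDb SET READ_WRITE;

-- or, per the suggestion above, make it read-only once the install completes
ALTER DATABASE OurProductDb SET READ_ONLY;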