I'm working on SQL Server 2012 Enterprise and I have a set of SSIS package exports which push data out to text files on a shared network folder. The packages aren't complex and under most circumstances they work perfectly. The problem I'm facing is that they do not work when scheduled, despite reporting that they have succeeded.
Let me explain the scenarios:
1) When run manually from within BIDS, they work correctly: .txt files are created and populated with data.
2) When deployed to the SSISDB and run from the Agent job, they also work as expected: files are created and populated with data.
3) When the Agent job is scheduled to run in the evening, the job runs and reports success. The files are created but the data is not populated.
I've checked the reports in the Integration Services Catalogs and compared the messages line by line from OnInformation. Both runs report that the Flat File Destination wrote xxxx rows.
The data is there and the Agent account has the correct access. I cannot fathom why the job works when started manually but behaves differently when scheduled.
Has anyone seen anything similar? It feels like a very strange bug....
Kind Regards,
James
Make sure that the account you have set up as the proxy for the SSIS task has read/write access to the file.
IMX, when you run an SQL Agent job manually, it appears to use the context of the user who initiates it in some way. I always assumed it was a side effect of impersonation. It's only when it actually runs with the schedule that everything uses the assigned security rights.
Additionally, I think when the user starts the job, the user is impersonating the proxy, but when the job is run via the schedule, the agent's account is impersonating the proxy. Make sure the service account has the right to impersonate the proxy. Take a look at sp_grant_login_to_proxy and sp_enum_login_for_proxy.
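For reference, here's a minimal sketch of wiring up a credential and proxy in T-SQL; all of the credential, login, and proxy names below are placeholders:

    -- Create a credential from a domain account that has rights to the share
    -- (account name and password are placeholders).
    CREATE CREDENTIAL SsisFileCredential
        WITH IDENTITY = N'DOMAIN\SsisFileUser', SECRET = N'StrongPasswordHere';

    -- Create a proxy from that credential and allow it to run SSIS job steps.
    EXEC msdb.dbo.sp_add_proxy
        @proxy_name = N'SsisFileProxy',
        @credential_name = N'SsisFileCredential';

    EXEC msdb.dbo.sp_grant_proxy_to_subsystem
        @proxy_name = N'SsisFileProxy',
        @subsystem_name = N'SSIS';

    -- Let a specific login use the proxy, then verify who can use it.
    EXEC msdb.dbo.sp_grant_login_to_proxy
        @proxy_name = N'SsisFileProxy',
        @login_name = N'DOMAIN\JobOwner';

    EXEC msdb.dbo.sp_enum_login_for_proxy @proxy_name = N'SsisFileProxy';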
Here's a link that roughly goes through the process:
http://www.mssqltips.com/sqlservertip/2163/running-a-ssis-package-from-sql-server-agent-using-a-proxy-account/
I also recall this video being useful:
http://msdn.microsoft.com/en-us/library/dd440761(v=SQL.100).aspx
I had the same problem with Excel files. It was a permissions issue.
What worked for me was adding the SERVICE account on the folder's Security tab. Then the SQL Agent can access the files.
Setup: A pretty standard data export SSIS package (SQL Server 2016 compatible), created in VS2019/Data Tools and deployed using the SSIS Project Deployment model to the Integration Services Catalog of a SQL Server 2016 instance. The package creates files in a network folder before sending the file out via FTP and putting a copy of the file in a Sent folder.
The project requirements include having the package running on a schedule using "default" parameter values, as well as allowing users to manually run the package using "non-default" parameter values from within a stand-alone application.
Current behavior: the package behaves correctly when run from a SQL Server Agent Job that is configured with a SQL proxy and credentials mapped to a domain login with the proper permissions for the network folder.
Problem: the Data Flow task fails to create the file with a "Cannot open the datafile" error when running the package directly using any of the following methods (even when the "current" session is using the same credentials as the SQL Server Credentials/Proxy used by the SQL Server Agent Job):
Using SSMS to right-click on the package and selecting Execute
Using the DTEXEC command-line utility
Using the SSISDB.catalog.start_execution SQL Server stored procedure
As far as I'm aware, these are the only methods capable of starting an SSIS package and changing the package's parameter values. I either need to get one of the latter two methods to work, find another option that allows for changing the parameter values while launching the package, or use one of the two techniques I'm aware of (detailed below) that would add yet another failure point to the process as well as other potential issues.
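For reference, a minimal sketch of what the stored procedure route looks like; the folder, project, package, and parameter names are placeholders, and object_type 30 targets a package parameter (20 would target a project parameter):

    DECLARE @execution_id BIGINT;

    -- Create an execution instance for the deployed package.
    EXEC SSISDB.catalog.create_execution
        @folder_name = N'ExportFolder',
        @project_name = N'ExportProject',
        @package_name = N'Export.dtsx',
        @use32bitruntime = 0,
        @execution_id = @execution_id OUTPUT;

    -- Override a package parameter for this run only.
    EXEC SSISDB.catalog.set_execution_parameter_value
        @execution_id = @execution_id,
        @object_type = 30,
        @parameter_name = N'OutputFolder',
        @parameter_value = N'\\fileserver\exports';

    EXEC SSISDB.catalog.start_execution @execution_id;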
Note: If the process is changed to initially create the file on the SQL Server's local hard drive, the Data Flow task succeeds, but the later copy-to-Sent-folder task fails with a very similar permissions error.
Alternative #1: this technique requires creating a new table, loading the parameter values into the table, and changing the package to check the table and potentially set its parameters/variables based on what it finds. The package can then be launched using a SQL Server Agent Job (for which there are multiple methods of manual launch), and if the calling object has correctly populated the table, the package will behave as if its parameters were changed at runtime; otherwise it will run with the default values.
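A rough sketch of what that table-driven approach could look like; all object names here are hypothetical, and the package would read the newest unconsumed row at startup:

    -- Staging table for runtime parameter overrides (hypothetical names).
    CREATE TABLE dbo.PackageRunParameters (
        RunId        INT IDENTITY(1,1) PRIMARY KEY,
        OutputFolder NVARCHAR(260) NOT NULL,
        RequestedAt  DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME(),
        Consumed     BIT NOT NULL DEFAULT 0
    );

    -- The calling application stages its non-default values...
    INSERT INTO dbo.PackageRunParameters (OutputFolder)
    VALUES (N'\\fileserver\exports\adhoc');

    -- ...then launches the Agent job, which runs under the proxy.
    EXEC msdb.dbo.sp_start_job @job_name = N'Run Export Package';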
Alternative #2: Change all folders used by the package to point to folders local to the SQL Server instance and then create a separate scheduled task/application/whatever, with the valid credentials, that would synchronize or move the files to their proper network folders.
even when the "current" session is using the same credentials as the SQL Server Credentials/Proxy used by the SQL Server Agent Job
This is probably because the account is not logged on locally at the SQL Server, and so it's a Double-Hop Impersonation scenario, and would require Kerberos Constrained Delegation to be configured.
And you are correct in assessing the options. The general solution is to invoke catalog.start_execution from a session running on the SQL Server, and an Agent Job is the simplest built-in way to do this (the others being xp_cmdshell, Service Broker Activation, or SQL CLR).
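For completeness, a rough sketch of the xp_cmdshell route (assuming xp_cmdshell is acceptable in your environment; the package path and parameter are placeholders):

    -- Enable xp_cmdshell if it is not already on (requires sysadmin).
    EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
    EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;

    -- Run the package in a session local to the server; /Par overrides a parameter.
    EXEC master.dbo.xp_cmdshell
        'dtexec /ISServer "\SSISDB\ExportFolder\ExportProject\Export.dtsx" /Server "." /Par "$Package::OutputFolder";"\\fileserver\exports"';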
I have a job created in Spoon which runs without problems from the command line, but I would like to know if there is any software in which I can execute these jobs and watch the execution visually. The idea is to give the operations area a more pleasant way of running these tasks.
You have two solutions:
Carte:
Use the Carte server, which ships with PDI. Install PDI on any server, launch Carte (specifying the port), and then you can execute/view/stop/restart jobs/transformations from any browser. Documentation is here.
Of course, you can also launch a job/transformation from your own PDI. Just define a new Slave server in the left panel, View tab (default username/password = cluster/cluster). Then each time you run a job/transformation, choose the Carte server instead of Pentaho/local in the Run configuration.
Logging
If you just want to follow a job/transformation, you can use database logging: right-click anywhere, then Parameters, Logging, Job/Transformation, and define a database, a table, and a logging interval of 2 seconds.
Then, every two seconds, the line_read, line_written, errors, and log_field values are written to the database. That database can be read by an external process and displayed on screen or in a browser.
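For example, a monitoring process could poll the log table with a query along these lines (the table name and columns follow the PDI defaults; adjust to whatever you configured):

    -- Progress of the most recent batch, from the PDI transformation log table.
    SELECT TRANSNAME, STATUS, LINES_READ, LINES_WRITTEN, ERRORS, LOGDATE
    FROM TRANS_LOG
    WHERE ID_BATCH = (SELECT MAX(ID_BATCH) FROM TRANS_LOG);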
This method is used in the github/ETL-pilot project, which uses Tomcat (because you probably already have Tomcat running with a Pentaho server), but it can easily be adapted to Node.js or any other server. (If you do this and open-source it, please add a link to your work on our GitHub.)
I have an SSIS package with an Execute Process Task, which runs the 7zip exe to zip a file. This works fine when I run the SSIS package. But when I run it from the SQL Agent it hangs. I assume this has something to do with permissions. I have given full control to Network Services and sqlsvc on the folder which contains the zip exe and the folder it is extracting to. Still no luck. What should I do to make this SSIS package run from the SQL Agent?
I created a proxy account which has administrator privileges and changed the Job Step "Run As" property to the new proxy account instead of the SQL Agent Service Account. I think the SQL Agent Service account doesn't have the access to run the process. You can also change the SQL Agent Service Account group policies to make it work.
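If you'd rather script that change than click through the job properties, something along these lines should work (the job, step, and proxy names are placeholders):

    -- Point an existing job step at the proxy instead of the Agent service account.
    EXEC msdb.dbo.sp_update_jobstep
        @job_name = N'Zip Export Job',
        @step_id = 1,
        @proxy_name = N'MyZipProxy';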
I would change the WindowStyle property to Hidden - the SQL Server agent may be hanging when it tries to create a Windowed process.
I have an SSIS package which runs when I start it manually from Integration Services. But when I try to run it from a job, it runs but no data appears in the destination. There seems to be some permissions issue. Can somebody tell me what permissions are required for running a package from a SQL Server job?
State the error message.
If you are using a flat file connection manager, and that's where the error is occurring, click Start, then Computer, and check to make sure you are mapped to that drive. If not, use the tab in the upper right corner to map the drive; then, when you access the file through SSIS, you shouldn't get an error.
If the package runs successfully as a job using the SQL Server Agent then you have the permissions set right for the database side.
However make sure if you are accessing any external data such as flat files that the agent is able to access these locations. You may have permissions on your Windows account to access the locations when you run the package in Visual Studio but the agent service running the job requires those permissions too.
If this is not the case, can you clarify what your package does, and any messages you receive from the catalog reports, so I can help further?
I have an SSIS package that transfers some tables to CSV files on a network drive; it runs fine from my computer manually. I store it on the server in the MSDB database and execute it from there, and it runs fine, but when I create a job with one step that runs the SSIS package from MSDB, it fails, saying it can't find the CSV file name.
I spent all day yesterday figuring out that this is a permissions issue with whatever logon credentials are being used by the job. The job owner shows as domain\myuserid, and the step properties show they are using Windows authentication with my username. The problem is, I know I have access to this folder.
The first line of the error log says: "Executed as user: servername\SYSTEM". So I made sure user "SYSTEM" has access to the network folder I want to load the files on, but I still get the same error.
The command line looks like #command=N'/SQL "\SSIS package name" /SERVER servername /CHECKPOINTING OFF /REPORTING E'
edit: I found SQL Server agent job account issue, where someone asks who the job is run under, and marc_s says: "I can't seem to find any definitive answers on that one, really. Since my Jobs typically select and update stuff in the database, I am lead to assume that the "Owner" account will be used by default, unless you specify some other account on a given step".
Which also leads me to believe it is using my logon information, which has access.
The best practice that we've been able to come up with here is to make a domain account for SSIS and then set up a Proxy in SQL Server that is used to run the SSIS Package in a SQL Job.
I would say that the servername\SYSTEM account is a local account, and therefore won't have access to network folders on other servers.
You probably want to run this as a domain account of some sort, which does have access.
Typically this will be the SQL Server Agent service account, so check the Services list in the Control Panel to see what account is running the Agent, and if necessary change it to an appropriate account.
This may have knock-on consequences though, so be careful about what other jobs are running.
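As an aside, you can also check which accounts the services run under from T-SQL instead of the Control Panel (available from SQL Server 2008 R2 SP1 onward):

    -- Shows the account running each SQL Server-related service, including the Agent.
    SELECT servicename, service_account
    FROM sys.dm_server_services;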