I have configured a SQL job which backs up the databases and then transfers them to a remote location in another step. From the command prompt my command works fine, but when I schedule it as a job I get this error:
Executed as user: administrator. Logon failure: unknown user name or bad password. 0 File(s) copied. Process Exit Code 0. The step succeeded.
I want to solve this issue, and I also want the job to report failure if the files do not get transferred, but it doesn't show any such message.
In short: when no files get copied (i.e. "0 File(s) copied"), the job should be marked as failed.
Thanks
Nitesh Kumar
One way is to use a Script Task to check whether there is a file to copy. If there is one, the process can proceed; if not, the step can raise an error. You do this by adding
Dts.TaskResult = (int)ScriptResults.Failure;
to the end of the script task logic.
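For context, here is a minimal sketch of what the surrounding Script Task Main method could look like, in the same C# as the snippet above. The package variable User::SourceFolder and the *.bak pattern are assumptions for illustration; adjust them to your package:

public void Main()
{
    // Hypothetical package variable pointing at the folder the backups land in.
    string folder = Dts.Variables["User::SourceFolder"].Value.ToString();

    // Fail the step when there is nothing to copy, so the job reports failure.
    if (System.IO.Directory.GetFiles(folder, "*.bak").Length == 0)
        Dts.TaskResult = (int)ScriptResults.Failure;
    else
        Dts.TaskResult = (int)ScriptResults.Success;
}

Remember to add User::SourceFolder to the task's ReadOnlyVariables so the script can see it.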
Anyway, I don't know your package design, so there might be more suitable ways.
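If the transfer is a plain CmdExec job step rather than part of a package, another option is to copy with robocopy and translate its exit code: robocopy returns a bitmask where 1 means files were copied and 8 or above means failures, so both "nothing copied" and real errors can be turned into a failing step. A sketch, with the job name and paths as placeholders:

EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Backup transfer',   -- hypothetical job name
    @step_name = N'Copy backups to remote share',
    @subsystem = N'CmdExec',
    @command   = N'robocopy "D:\Backups" "\\RemoteServer\BackupShare" *.bak
if %ERRORLEVEL% GEQ 8 exit 1
if %ERRORLEVEL% EQU 0 exit 1
exit 0';

With this in place, the step fails when nothing was copied, and the job can notify you through its normal failure alerts.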
The issue has been solved. The remote folder was shared and accessible to everyone: my command worked fine from the command prompt, and any user was able to create files in that location and delete files from it.
The issue was related to the user. My job was being executed as servername\administrator, and the administrator password on the remote machine had been changed, which is why the bad-password error occurred. I told my IT team about the problem, they reset the password to the old one, and my job began to work fine.
I would still like to know how my SQL job authenticates against the remote server, as I went through the script of my job and found nothing helpful regarding authentication.
Can anyone explain it to me?
Thanks
Nitesh Kumar
One of my backups has failed, seemingly because the passphrase is corrupted.
I am attempting to re-register the server (I do not know if this will fix the problem, but I do know that the passphrase is entered during this process) and so need to re-download the Vault Credentials.
In the old Azure portal, the Vault Credentials download link was right there as soon as you went to the particular vault.
In the new Azure portal I cannot find it anywhere. I have looked and looked and Googled and Googled.
I get the feeling that one has to start the whole backup setup again for the server in order to get the credentials via Getting Started > Backup.
So I go into the vault that the server backs up to, go to Getting Started > Backup, and follow the steps, and I end up with a list of servers to choose from, but my server is not there because it says "VMs in same region as vault and not protected by another vault are shown...".
Anyway I am stuck.
Path to download the vault credential file: Home > Recovery Services vaults > (select your Recovery Services vault) > Properties > Backup Credentials
OK, I fixed my problem, but I did not resolve the "how to re-download the Vault Credentials" question.
In the Azure Backup app on the server I went to Actions > Change Properties, re-entered the passphrase, and tried to save it.
I got a message that nothing was saved because the passphrase had not changed.
So how the heck did it know that, if my initial problem was that the passphrase was corrupted???
I chanced my luck and tried a "Backup Now" and lo and behold it worked.
GO FIGURE AND THANK YOU TEAM AT MS FOR YET ANOTHER CONUNDRUM I HAD TO SOLVE WITH SMOKE AND MIRRORS.
Don't get me started...
I'm using Oracle SQL Developer to build my database, but now it has stopped working.
It shows this:
An error was encountered performing the requested operation:
Listener refused the connection with the following error: ORA-12514,
TNS:listener does not currently know of service requested in connect
descriptor
Vendor code 12514
I also tried to make a new connection, but it didn't help.
Please help me, I don't know what to do; I need this to do my homework.
When I tried to change the connection, it showed this:
An error was encountered performing the requested operation:
ORA-01031: insufficient privileges
01031. 00000 - "insufficient privileges"
*Cause: An attempt was made to change the current username or password
        without the appropriate privilege. This error also occurs if
        attempting to install a database without the necessary operating
        system privileges.
        When Trusted Oracle is configured in DBMS MAC, this error may occur
        if the user was granted the necessary privilege at a higher label
        than the current login.
*Action: Ask the database administrator to perform the operation or grant
        the required privileges.
        For Trusted Oracle users getting this error although granted the
        appropriate privilege at a higher label, ask the database
        administrator to regrant the privilege at the appropriate label.
Vendor code 1031
Please check that these services are running:
OracleOraDb11g_home1TNSListener and OracleServiceORCL.
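If both services are running, you can also ask the listener which services it currently knows about. From a command prompt on the database server, run the standard listener control utility:

lsnrctl status

ORA-12514 means the service name in your connect descriptor is missing from the "Services Summary" section of this output.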
You're going to have to do some troubleshooting; here are a couple of suggestions:
ORA-12514 Tips
ORA-12514 TNS:listener does not currently know of service requested in connect descriptor
How to resolve error: ORA-12514
The service name you are using should go in the SID field, not in the Service name box.
For example, if the service name provided to you is overflow, you have to enter it like this:
e.g. 10.171.1.24:1521:overflow
SID: overflow
Service name: leave blank
Test the connection
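A possible reason this workaround helps: SQL Developer connects through JDBC, where the SID and service-name forms of the connect string are written differently. Using the example values above:

jdbc:oracle:thin:@10.171.1.24:1521:overflow   (SID form, colon before the identifier)
jdbc:oracle:thin:@//10.171.1.24:1521/overflow   (service-name form, slash before the identifier)

If one form fails with ORA-12514, trying the other costs nothing.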
Go to Services
and check whether "OracleServiceORCL" is running.
If it is not running, right-click it and click Start.
This may solve your problem.
Happy coding :D
This worked for me: close SQL Developer and launch it again.
I had the same issue after restarting my PC and could not access my local database. This fixed it.
Go to Run > services.msc > Services (Local) > OracleService
Right-click on OracleService > Properties > Log On
Make sure Local System Account is selected, then press OK.
Stop the OracleService and restart it. It should be working now.
Mine was working fine till last night, but this morning I found the said problem. I checked that OracleOraDb11g_home1TNSListener and OracleServiceORCL were running and in "Started" status, but the issue was still there.
Just restarting both services solved my issue. I suggest trying this first to save time before exploring other solutions.
I'm working on SQL 2012 Enterprise and I have a set of SSIS package exports which push data out to text files on a shared network folder. The packages aren't complex and under most circumstances they work perfectly. The problem I'm facing is that they do not work when scheduled, despite reporting that they have succeeded.
Let me explain the scenarios:
1) When run manually from within BIDS, they work correctly, txt files are created and populated with data.
2) When deployed to the SSISDB and run from the Agent job they also work as expected - files are created and populated with data.
3) When the Agent job is scheduled to run in the evening, the job runs and reports success. The files are created but the data is not populated.
I've checked the reports in the Integration Services Catalogs and compared the messages line by line from the OnInformation events. Both runs report that the Flat File Destination wrote xxxx rows.
The data is there, and the Agent account has the correct access. I cannot fathom why the job works when started manually, but behaves differently when scheduled.
Has anyone seen anything similar? It feels like a very strange bug....
Kind Regards,
James
Make sure that the account you have set up as the proxy for the SSIS task has read/write access to the file.
In my experience, when you run a SQL Agent job manually, it appears to use the context of the user who initiates it in some way; I always assumed it was a side effect of impersonation. It's only when the job actually runs on its schedule that everything uses the assigned security rights.
Additionally, I think when the user starts the job, the user is impersonating the proxy, but when the job is run via the schedule, the agent's account is impersonating the proxy. Make sure the service account has the right to impersonate the proxy. Take a look at sp_grant_login_to_proxy and sp_enum_login_for_proxy.
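For reference, a minimal sketch of that setup in T-SQL; SsisCredential, SsisProxy, DOMAIN\ssis_user, and DOMAIN\job_owner are placeholder names:

-- Store the Windows account the packages should run as.
CREATE CREDENTIAL SsisCredential
    WITH IDENTITY = N'DOMAIN\ssis_user', SECRET = N'account-password-here';

-- Create a proxy over that credential and allow it to run SSIS job steps.
EXEC msdb.dbo.sp_add_proxy
    @proxy_name = N'SsisProxy',
    @credential_name = N'SsisCredential',
    @enabled = 1;

EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'SsisProxy',
    @subsystem_name = N'SSIS';

-- Let a specific login use the proxy in its job steps.
EXEC msdb.dbo.sp_grant_login_to_proxy
    @proxy_name = N'SsisProxy',
    @login_name = N'DOMAIN\job_owner';

Then, in the job step's properties, change "Run as" from the SQL Server Agent service account to SsisProxy.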
Here's a link that roughly goes through the process:
http://www.mssqltips.com/sqlservertip/2163/running-a-ssis-package-from-sql-server-agent-using-a-proxy-account/
I also recall this video being useful:
http://msdn.microsoft.com/en-us/library/dd440761(v=SQL.100).aspx
I had the same problem with Excel files. It was a permissions issue.
What worked for me was adding the SERVICE account on the folder's Security tab. After that, the SQL Agent could access the files.
I have an SSIS package which runs fine when I run it manually from Integration Services, but when I run it from a job, it runs and yet no data appears in the output. There seems to be some permissions issue. Can somebody tell me what permissions are required for running a package from a SQL Server job?
State the error message.
If you are using a flat file connection manager, and that's where the error is occurring, click Start, then Computer, and check that you are mapped to that drive. If not, use the option in the upper-right corner to map the drive; then, when you access the file through SSIS, you shouldn't get an error.
If the package runs successfully as a job using the SQL Server Agent then you have the permissions set right for the database side.
However make sure if you are accessing any external data such as flat files that the agent is able to access these locations. You may have permissions on your Windows account to access the locations when you run the package in Visual Studio but the agent service running the job requires those permissions too.
If this is not the case, can you clarify what your package does and what messages you receive from the catalog reports, so I can help further?
I have an SSIS package that transfers some tables to CSV files on a network drive; it runs fine from my computer manually. I store it on the server in the MSDB database and execute it from there, and it runs fine, but when I create a job with one step that runs the SSIS package from MSDB, it fails, saying it can't find the CSV file name.
I spent all day yesterday working out that this means a permissions issue with whatever logon credentials are being used by the job. The job owner shows as domain\myuserid, and the step properties show it is using Windows authentication with my username. The problem is, I know I have access to this folder.
The first line of the error log says: "Executed as user: servername\SYSTEM". So I made sure the user "SYSTEM" has access to the network folder I want to put the files on, but I still get the same error.
The command line looks like: @command=N'/SQL "\SSIS package name" /SERVER servername /CHECKPOINTING OFF /REPORTING E'
Edit: I found SQL Server agent job account issue, where someone asks who the job runs under, and marc_s says "I can't seem to find any definitive answers on that one, really. Since my Jobs typically select and update stuff in the database, I am lead to assume that the "Owner" account will be used by default, unless you specify some other account on a given step".
This also leads me to believe it is using my logon information, which has access.
The best practice that we've been able to come up with here is to make a domain account for SSIS and then set up a Proxy in SQL Server that is used to run the SSIS Package in a SQL Job.
I would say that the servername\SYSTEM account is a local account, and therefore won't have access to network folders on other servers.
You probably want to run this as a domain account of some sort, which does have access.
Typically this will be the SQL Server Agent service account, so check the Services list in Control Panel to see what account is running the Agent, and if necessary change it to an appropriate account.
This may have knock-on consequences though, so be careful about what other jobs are running.
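If you'd rather check from T-SQL than the Services list, one way on SQL Server 2008 R2 SP1 and later is to query sys.dm_server_services (this needs VIEW SERVER STATE permission):

-- Lists each SQL Server related service and the account it runs under,
-- including the SQL Server Agent service.
SELECT servicename, service_account
FROM sys.dm_server_services;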