Copy .bak file to another Windows server - sql

I have .bak files that are created by a job on SQL Server every night. I want the .bak file to be copied to another server. I created a job that runs the command
EXEC xp_cmdshell 'copy "G:\Source\folder\file.bak" "\\destination\Work\folder\"'
but the job creates a copy of the .bak file on the source machine, just in a different directory.
Any idea what I am doing wrong?
Thanks
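
One way to see what is going on is to capture xp_cmdshell's output in the job history and check which Windows account actually executes the command; that account (usually the SQL Server service account, not your own login) needs write access to the \\destination share. A minimal diagnostic sketch, reusing the paths from the question (the whoami call is just for reporting):

DECLARE @out TABLE (line NVARCHAR(4000));
INSERT INTO @out
EXEC xp_cmdshell 'copy "G:\Source\folder\file.bak" "\\destination\Work\folder\"';
-- The copy command's own messages (e.g. "Access is denied.") show up here.
SELECT line FROM @out WHERE line IS NOT NULL;
-- Report the Windows account the command runs under.
EXEC xp_cmdshell 'whoami';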

Related

Using AzCopy to copy .bak files to Azure Storage

I'm trying to copy the contents of my SQL Server backup .bak files to Azure Storage using AzCopy.
I'm using this command in a batch (.bat) file:
"C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe" ^
/Source:"I:\" /Dest:https://hmlrservicesstorage.****************************/ ^
/DestKey:********************************************** /Pattern:"*.bak" /s /XO
On the I drive there are two folders, one a normal folder and the other a network share.
When I run the .bat file from a cmd prompt it runs fine, copying all .bak files to the storage account. Yet when I run exactly the same .bat file through a SQL Server Agent job, it only copies the network share folder across, not the normal folder's contents.
I am copying from an Azure VM in the cloud to an Azure storage account.
SQL Server is installed on the VM in the Azure cloud.
I can't understand why and would greatly appreciate any help on this.
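
One thing worth checking: a SQL Server Agent job step runs under the Agent service account (or a proxy), which may see a different set of drives and permissions than your interactive cmd prompt; in particular, if the "normal" folder lives on a mapped or re-mounted drive, it may not exist in the service's session at all. A hedged sketch for gathering evidence, assuming the AzCopy 5.x syntax from the question (the account URL, key, and log paths are illustrative placeholders, and /V writes a verbose log if your AzCopy version supports it):

rem Record which account the job step actually runs as.
whoami > "C:\Temp\azcopy_context.txt"

"C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe" ^
/Source:"I:\" /Dest:https://<account>.blob.core.windows.net/<container>/ ^
/DestKey:<key> /Pattern:"*.bak" /S /XO /V:"C:\Temp\azcopy_verbose.log"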

SQL Server Agent Jobs and SSIS Project

My SSIS package writes .xls files to a folder. When I execute the project on my local machine, it works fine. When I execute it as a job, it fails, referring to a file that doesn't exist.
What could be the problem?
The job says:
Source: Send Mail Task
Description: Either the file "\\bpptvwdw0000001\DTSXs\SubscricaoDirComercial\CustomSubscriptions\dw_PropAprovadasAConcretizar__20200408.xls" does not exist or you do not have permissions to access the file.
That means the file does not exist or I do not have permissions to access it; but when I execute the project from my machine, it doesn't create that file.
Most likely the account running your job does not have rights to that folder.
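
A quick way to confirm which account that is (a sketch; xp_cmdshell must be enabled for the first query, and sys.dm_server_services requires VIEW SERVER STATE):

-- Windows account that commands launched from this instance run under.
EXEC xp_cmdshell 'whoami';

-- Service accounts for the engine and the Agent, without xp_cmdshell.
SELECT servicename, service_account
FROM sys.dm_server_services;

Grant that account read/write on the \\bpptvwdw0000001\DTSXs share, and the error should disappear if permissions are indeed the cause.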

AdventureWorks 2014 Download Issue With File Type

I am attempting to download AdventureWorks 2014. I downloaded it from CodePlex; the archive shows up in WinZip as a .bak file with Notepad as its default program. I unzipped the file and copied it over to Program Files...SQLServer...Data. However, I believe the Notepad association will cause an issue when attempting to import the database. I was expecting to see AdventureWorks files ending in .mdf and .ldf. Would I need to change the default program for .bak or .zip? Also, what would I need to change it to in order to successfully run the AdventureWorks import script?
Thanks!
The .bak file is a database backup file; the Notepad file association is irrelevant, since SQL Server reads the file directly rather than through a default program. Once you restore the database through SQL Server, you'll get your .mdf and .ldf files. Here's how to restore a database backup (your .bak file) for SQL Server 2014:
https://msdn.microsoft.com/en-us/library/ms177429(v=sql.120).aspx
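
A minimal T-SQL sketch of that restore, assuming the backup was saved to C:\Backups and the standard logical file names (check the real ones with RESTORE FILELISTONLY first):

-- List the logical file names contained in the backup.
RESTORE FILELISTONLY
FROM DISK = N'C:\Backups\AdventureWorks2014.bak';

-- Restore the database; this step is what creates the .mdf and .ldf files.
RESTORE DATABASE AdventureWorks2014
FROM DISK = N'C:\Backups\AdventureWorks2014.bak'
WITH MOVE N'AdventureWorks2014_Data' TO N'C:\SQLData\AdventureWorks2014.mdf',
     MOVE N'AdventureWorks2014_Log'  TO N'C:\SQLData\AdventureWorks2014_log.ldf';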

How can I create .MDF file based on existing Database in SQL Server?

I have a database sitting on my local PC, and I want to create a .mdf file based on it. I know how to create an empty .mdf file in Visual Studio, connect to it from SQL Server Management Studio, and run queries against it.
But I want to create the .mdf file from the existing database, which will save me the time of rebuilding the whole DB in Visual Studio and will also carry over all of the data in the tables.
I checked this link; it does not help me solve my question.
I was also trying to copy the .mdf file from C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA into my project's App_Data folder, but it throws an error: the copy cannot complete because the file is in use by SQL Server.
Thank you.
First close all programs that are using the database, and then stop the SQL Server service (the database engine itself holds the lock on the .mdf file, so stopping only the SQL Server Agent is not enough). You should now be able to copy the .mdf file. After copying the files, don't forget to start the SQL Server service again.
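
An alternative that avoids stopping the whole service is to detach the database, copy the file, and re-attach it. A sketch, assuming a database named MyDb and the DATA path from the question (adjust names and paths to your instance):

USE master;
-- Detaching releases SQL Server's lock on the .mdf/.ldf files.
EXEC sp_detach_db @dbname = N'MyDb';

-- ...copy the .mdf (and the matching .ldf) in Windows here...

-- Re-attach the original files so the database comes back online.
CREATE DATABASE MyDb
ON (FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\MyDb.mdf'),
   (FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\MyDb_log.ldf')
FOR ATTACH;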

FTP Concurrency issues using Ipswitch WS-FTP Pro

I think we have a problem in our FTP scripts that pull files from a remote server to a local machine. I couldn't find an answer in the Ipswitch knowledge base or scripting documentation.
We are doing an MGET *.* and then an MDELETE *.* immediately after it. I think what is happening is that, while we are copying files from the server, additional files arrive in the same directory, and then the delete command deletes everything from the server. So we end up deleting files we never copied down.
Is there a straightforward way to delete only the files that were copied, or is it going to be some sort of hack job where we generate a dynamic delete script based on what we actually copied down?
Answers that are product specific would be much appreciated!
Here are the options I came up with, and what I ended up doing.
1. Rename the extension on the server, copy the renamed files, and then delete the renamed files. This would not work because there is no FTP rename command that works with wildcards (the Windows rename command does, by the way).
2. Move the files to a subdirectory on the server, copy the files from that location, and then delete them from the remote location. This would not work because there is no FTP command to move files on the remote server.
3. Copy the files down in one script, then SHELL a batch file on the local side that dynamically builds a second script to connect to the server and delete the files that were copied down. This is the solution I ended up using, as sketched below.
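
The dynamically built delete script can stay quite small. A hedged batch sketch of that approach, assuming the MGET put the files in C:\ftp\inbox and that plain Windows ftp.exe is acceptable (the host, user, and password are placeholders, not values from the question):

@echo off
rem Build an FTP script that deletes only the files we actually downloaded.
echo open ftp.example.com> delete.ftp
echo user myuser mypass>> delete.ftp
for %%F in ("C:\ftp\inbox\*.*") do echo delete %%~nxF>> delete.ftp
echo quit>> delete.ftp

rem -n suppresses auto-login so the script's user line is used; -s: feeds the script.
ftp -n -s:delete.ftp

Because the script is generated from the local directory listing, files that arrived on the server after the MGET are left untouched.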