I'm trying to copy my SQL Server backup (.bak) files to Azure Storage using AzCopy.
I'm using this command in a batch (.bat) file:
"C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe" ^
/Source:"I:\" /Dest:https://hmlrservicesstorage.****************************/ ^
/DestKey:********************************************** /Pattern:"*.bak" /s /XO
On the I: drive there are two folders: one a normal folder and the other a network share.
When I run the .bat file from a cmd prompt it runs fine, copying all .bak files to the storage account. Yet when I run exactly the same .bat file through a SQL Server Agent job, it only copies the network share folder across, not the contents of the normal folder.
I am copying from an Azure VM in the cloud to an Azure storage account.
SQL Server is installed on the VM in the Azure cloud.
I can't understand why, and would greatly appreciate any help on this.
I want to move all the files that are in a file share on Azure. At the moment I do it the following way:
Use the "net" command to connect to the network drive and assign it a drive letter.
Then a .bat file uses the "move" command, e.g. move g:\files\* c:\files, to move the files; it runs every hour via Windows Task Scheduler to check whether there are files and move them (a sketch of this .bat is shown below).
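For illustration, a minimal sketch of the kind of .bat file described above, assuming a G: drive letter; the storage account, share name, key, and folder paths are all placeholders:
@echo off
rem Map the Azure file share to a drive letter (account, share, and key are placeholders).
net use G: \\mystorageaccount.file.core.windows.net\myshare /user:AZURE\mystorageaccount <storage-account-key>
rem Move the files (not folders) from the share down to the local folder.
move G:\files\* C:\files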
But I don't want to use this approach because:
The drive will be disconnected if the machine needs a restart, so the process is no longer automated; someone has to mount the drive again.
The "move" command doesn't move folders, only files.
Is there a better way of managing this? We don't want to install tools like AzCopy, but using PowerShell is feasible.
According to your description, I suggest you do it in one of the following ways:
1. Call AzCopy in PowerShell.
You could install AzCopy first; you can download the latest version from the link. AzCopy supports uploading a directory to an Azure file share. You could use the following command.
AzCopy /Source:C:\myfolder /Dest:https://myaccount.file.core.windows.net/myfileshare/ /DestKey:key /S
Then you could write a PowerShell script to call AzCopy; by default the AzCopy install directory is C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy.
cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
& .\AzCopy.exe /Source:C:\myfolder /Dest:https://myaccount.file.core.windows.net/myfileshare/ /DestKey:key /S
2. Use a PowerShell script; you could refer to this link. The script is written for ASM mode, so you would need to change some of the commands.
Based on my experience, using AzCopy is easier and simpler.
The issue was resolved using the following approach (a sketch of the resulting batch script follows the steps):
Delete any existing mapping for the drive letter we are using with the net use <drive_letter> /delete command. This is done to make sure the drive is detached since the last time the script ran.
Map the drive again using the net use command.
Copy all the files using robocopy.
After that, delete all the files using the del command.
Disconnect the drive using the net use <drive_letter> /delete command.
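A minimal sketch of what that batch script might look like, assuming a G: drive letter; the storage account, share name, key, and folder paths are hypothetical:
@echo off
rem 1. Drop any stale mapping so the script starts from a clean state.
net use G: /delete
rem 2. Map the drive again (account, share, and key are placeholders).
net use G: \\mystorageaccount.file.core.windows.net\myshare /user:AZURE\mystorageaccount <storage-account-key>
rem 3. Copy everything, including subfolders, to the local destination.
robocopy G:\ C:\files /E
rem 4. Delete the source files once they have been copied.
del /s /q G:\*.*
rem 5. Disconnect the drive.
net use G: /delete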
I am attempting to download AdventureWorks 2014. I downloaded it from CodePlex, and it now exists in my WinZip as a .bak file with Notepad as the default program. I unzipped the file and copied it over to Program Files...SQLServer...Data. However, I believe the Notepad association will cause an issue when attempting to import the database. I was expecting to see AdventureWorks files ending in .mdf and .ldf. Would I need to change the default program for .bak or .zip? Also, what would I need to change it to in order to successfully run the AdventureWorks import script?
Thanks!
The .bak file is a database backup file. Once you restore the database through SQL Server, you'll get your .mdf and .ldf files. Here's how to restore a database backup (your .bak file) for SQL Server 2014.
https://msdn.microsoft.com/en-us/library/ms177429(v=sql.120).aspx
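If it helps, here is a hedged sketch of that restore run from a command prompt with sqlcmd; the backup path, data folder, and logical file names are assumptions, so check them with RESTORE FILELISTONLY first:
rem List the logical file names contained in the backup (paths below are placeholders).
sqlcmd -S . -E -Q "RESTORE FILELISTONLY FROM DISK = N'C:\Backups\AdventureWorks2014.bak'"
rem Restore the database, writing the .mdf and .ldf to the data folder.
sqlcmd -S . -E -Q "RESTORE DATABASE AdventureWorks2014 FROM DISK = N'C:\Backups\AdventureWorks2014.bak' WITH MOVE 'AdventureWorks2014_Data' TO N'C:\SQLData\AdventureWorks2014.mdf', MOVE 'AdventureWorks2014_Log' TO N'C:\SQLData\AdventureWorks2014.ldf'"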
I have .bak files that are created by a job on SQL Server every night. I want the .bak file to be copied to another server. I created a job running the command
EXEC xp_cmdshell 'copy "G:\Source\folder\file.bak" "\\destination\Work\folder\"'
but the job creates a copy of the .bak file on the source machine, just in a different directory.
Any idea what I am doing wrong?
Thanks
I think we have a problem in our FTP scripts that pull files from a remote server to a local machine. I couldn't find an answer in their knowledge base, nor in the scripting documentation.
We are doing an MGET *.* and then an MDELETE *.* immediately after it. I think what is happening is that, while we are copying files from the server, additional files arrive in the same directory, and then the delete command deletes everything from the server. So we end up deleting files we never copied down.
Is there a straightforward way to delete only the files that were copied, or is it going to be some sort of hack job where we generate a dynamic delete script based on what we actually copied down?
Answers that are product specific would be much appreciated!
Here are the options I came up with and what I ended up doing.
Rename the extension on the server, copy the renamed files, and then delete the renamed files. This could not work because there is no FTP rename command that works with wildcards (the Windows rename command does, by the way).
Move the files to a subdirectory on the server, copy the files from that location, and then delete them from the remote location. This could not work because there is no FTP command to move files on the remote server.
Copy the files down in one script and SHELL a batch file on the local side that dynamically builds a script to connect to the server and delete the files that were copied down. This is the solution I ended up using; a sketch is shown below.
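For illustration, a minimal sketch of that approach as a batch file using the stock Windows ftp client; the host, credentials, and directories are placeholders:
@echo off
setlocal
set LOCALDIR=C:\ftp\incoming
set SCRIPT=%TEMP%\ftp_delete.txt
rem Build an FTP script that deletes only the files that actually landed locally.
> "%SCRIPT%" echo open ftp.example.com
>> "%SCRIPT%" echo user myuser mypassword
>> "%SCRIPT%" echo cd /outgoing
for %%F in ("%LOCALDIR%\*.*") do >> "%SCRIPT%" echo delete %%~nxF
>> "%SCRIPT%" echo bye
rem -n suppresses auto-login; -s: feeds the generated commands to the ftp client.
ftp -n -s:"%SCRIPT%"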