Move files from Azure storage to a local directory - azure-storage

I want to move all the files that are in a File share on Azure. At the moment I do it the following way:
Use the "net use" command to connect to the network share and assign it a drive letter.
Then a .bat file, scheduled to run every hour with Windows Task Scheduler, uses the "move" command (e.g. move g:\files\* c:\files) to check for files and move them.
But I don't want to use this approach because:
The drive is disconnected if the machine needs a restart, so the process is no longer automated: someone has to mount the drive again.
The "move" command doesn't move folders, it moves only files.
Is there a better way of managing this? We don't want to install tools like AzCopy, but using PowerShell is feasible.

Based on your description, I suggest you do it in one of the following ways:
1. Call AzCopy in PowerShell.
You could install AzCopy first; the latest version can be downloaded from the link. AzCopy supports uploading a directory to an Azure file share with the following command.
AzCopy /Source:C:\myfolder /Dest:https://myaccount.file.core.windows.net/myfileshare/ /DestKey:key /S
Then you could write a short PowerShell script to call AzCopy; by default the AzCopy install directory is C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy.
cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
& .\AzCopy.exe /Source:C:\myfolder /Dest:https://myaccount.file.core.windows.net/myfileshare/ /DestKey:key /S
2. Use a PowerShell script; you could refer to this link. The script was written for the ASM (classic) mode, so you would need to change some of the commands.
Based on my experience, using AzCopy is easier and simpler.
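If you prefer to stay in PowerShell without AzCopy, a minimal sketch with the Az.Storage cmdlets could look like the following (the account name, key, share name and local folder are placeholders, and the Az module is assumed to be installed):
# Upload every file in a local folder to an Azure file share (all names are placeholders)
$key = "<storage-account-key>"
$ctx = New-AzStorageContext -StorageAccountName "myaccount" -StorageAccountKey $key
Get-ChildItem -Path "C:\myfolder" -File | ForEach-Object {
    Set-AzStorageFileContent -ShareName "myfileshare" -Source $_.FullName -Path $_.Name -Context $ctx -Force
}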

The issue was resolved using the following approach (a sketch of the script follows the list):
Delete any existing mapping of the drive letter we use, with the net use <drive_letter> /delete command. This makes sure the drive is detached from the previous run of the script.
Map the drive again using the net use command.
Copy all the files using robocopy.
After that, delete all the files from the share using the del command.
Disconnect the drive using the net use <drive_letter> /delete command.
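A minimal PowerShell sketch of those steps (drive letter, share path, account name, key and local folder are placeholders; the net use user-name/key pattern for Azure file shares is assumed):
# Hourly job: remap the share, copy everything down, clear the share, disconnect
net use Z: /delete 2>$null                                             # detach any stale mapping from a previous run
net use Z: \\myaccount.file.core.windows.net\myfileshare /user:AZURE\myaccount "<storage-account-key>"
robocopy Z:\ C:\files /E                                               # /E also copies subfolders, including empty ones
Remove-Item -Path "Z:\*" -Recurse -Force                               # PowerShell equivalent of the del step
net use Z: /delete                                                     # disconnect the drive again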

Related

Using azcopy to copy .bak files to Azure storage

I'm trying to copy the contents of my SQL Server backup .bak files to Azure storage using AzCopy.
I'm using this command in a batch (.bat) file:
"C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe" ^
/Source:"I:\" /Dest:https://hmlrservicesstorage.****************************/ ^
/DestKey:********************************************** /Pattern:"*.bak" /s /XO
On the I drive there are two folders, one a normal folder and the other a network share.
When I run the .bat file from a cmd prompt it runs fine, copying all .bak files to the storage; yet when I run the very same .bat file through a SQL Server Agent job, it only copies the network share folder across and not the normal folder's contents.
I am copying from an Azure VM in the cloud to an Azure storage account.
SQL Server is installed on the VM in the Azure cloud.
I can't understand why and would greatly appreciate any help on this.

Unzip multiple files in multiple folders in ssis [duplicate]

I have a .tar.gz file. Now I need to unpack these files with an SSIS package. Previously I did the unzip-and-delete for .zip files with the help of a Foreach Loop container and a Script Task. I am not sure how to do it for .tar.gz files. Any help?
You can use an Execute Process Task to achieve this (or launch the process from a Script Task), but you have to install a ZIP application such as 7-Zip or WinZip and use its command line to zip or unzip archives; see the sketch after the links below.
Follow one of these links for more details:
Zip a folder using SSIS
7Zip command line examples
What command line parameters does WinZip support?
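For reference, a minimal PowerShell sketch of the two-pass extraction with 7-Zip, which an Execute Process Task or Script Task could run (the install path, archive name and target folders are placeholders):
# A .tar.gz is extracted in two passes: first the gzip layer, then the tar layer
$sevenZip = "C:\Program Files\7-Zip\7z.exe"                  # assumed default install path
& $sevenZip x "C:\input\archive.tar.gz" "-oC:\temp" -y       # produces C:\temp\archive.tar
& $sevenZip x "C:\temp\archive.tar" "-oC:\output" -y         # unpacks the actual files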

Programmatically access tfs build output

I'm trying to write a PowerShell script that lets a user specify a TFS build id (or alternatively a changeset id) and download the build output to the current directory. I have the build configured to copy the output to the server, which means only the most recent build output is accessible in that directory. However, from Visual Studio or from TFS Web Access, I can download the drop as a .zip file.
How can I access this .zip file programmatically (in PowerShell, or even VB code that I can convert to a PowerShell script)? Am I thinking about build output wrong, and there's an easier, more obvious way to handle this? Is the build output of older builds stored somewhere else on the server, or is it stored in the database? Should I be configuring the build to store each build in a separate folder rather than overwriting each build in a single folder?
You can access the download zip via a properly constructed URL. For example:
https://{AccountName}.visualstudio.com/DefaultCollection/{TeamProject}/_apis/build/builds/{BuildId}/artifacts/drop?%24format=zip
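A hedged PowerShell sketch of downloading that URL with a personal access token (the token and the bracketed URL segments are placeholders; PAT-based Basic authentication is one common way to call the REST API):
# Download the drop of a given build as a zip file
$pat     = "<personal-access-token>"
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":" + $pat)) }
$url     = "https://{AccountName}.visualstudio.com/DefaultCollection/{TeamProject}/_apis/build/builds/{BuildId}/artifacts/drop?%24format=zip"
Invoke-WebRequest -Uri $url -Headers $headers -OutFile "drop.zip"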

Yii2 archive installation symbolic link issue

I am trying to install the Yii framework from an archive file on Windows 7. After downloading the file I tried extracting it into a folder at C:\wamp\www\bid; however, I encounter an error saying this:
! C:\wamp\www\bid\yii-basic-app-2.0.2.tgz: Cannot create symbolic link C:\wamp\www\bid\basic\vendor\bin\markdown
A required privilege is not held by the client.
! C:\wamp\www\bid\yii-basic-app-2.0.2.tgz: Cannot create symbolic link C:\wamp\www\bid\basic\vendor\bin\yii
A required privilege is not held by the client.
I thought that perhaps WinRAR can't extract .tgz files, so I downloaded 7-Zip. Using 7-Zip it extracted into a tar file, and the tar file was extracted with no errors. For some reason, though, I do not seem to have the framework folder that appears in other people's directory structure. Moreover, after creating my own framework folder and running this command:
yiic webapp C:\wamp\www\bid
It states that yiic is not recognized as an internal or external command.
Can someone tell me what I'm doing wrong? I've tried setting this up numerous times and failed.
You need to run 7-Zip File Manager in administrator mode.
Right-click the icon of 7-Zip File Manager, and then click "Run as administrator".
You need to call the command with php, since it is actually an external command. I do it like this:
php ./yiic webapp NameOfApp
This way your terminal understands that yii is a program that runs with php.
Another solution is to add the yii path to your PATH environment variable.
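If you go the environment-variable route, a minimal PowerShell sketch (the framework path is a placeholder for wherever yiic lives on your machine):
# Append the Yii framework directory to the user PATH so yiic can be found
$yiiPath  = "C:\wamp\www\bid\framework"
$userPath = [Environment]::GetEnvironmentVariable("Path", "User")
[Environment]::SetEnvironmentVariable("Path", $userPath + ";" + $yiiPath, "User")
# Open a new command prompt afterwards so the updated PATH is picked up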

Backing up source files managed by source control software: TortoiseSVN

I am new to source control and I am confused by something I read on a webpage yesterday (I don't have the link). I have followed these instructions: "create folder structure", then "Start Repo-browser", then copy the source files into the trunk folder. Please see the screenshot below:
However, when I navigate to the folder using Windows Explorer I do not see this folder structure. I see this:
Therefore I am wondering: where are the files physically stored? The reason I ask is that I want to ensure that NetBackup (our corporate backup tool) backs up the correct directories.
To make sense of the repository structure you need to read the SVN documentation, but the preferred way to back up an SVN repository is with the command
svnadmin dump your_svn_repository_path > destination_filename_backup.svn
You could put this command in a scheduled task that runs some time before your corporate tool executes the full backup of your data, and include destination_filename_backup.svn in your backup job.
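A minimal PowerShell sketch of such a scheduled dump, with a dated file name (the repository and backup paths are placeholders):
# Dump the repository to a dated file before the NetBackup window
$repo = "C:\svn\myrepository"
$dump = "D:\backups\svn\repo_$(Get-Date -Format 'yyyyMMdd').svn"
# Use cmd's redirection so the dump stays byte-for-byte; PowerShell's own > would re-encode the stream
cmd /c "svnadmin dump `"$repo`" > `"$dump`""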
If you ever need to restore the backup (after recreating the repository) you could use the command
svnadmin load your_svn_repository_path < destination_filename_backup.svn