We are using AzCopy to transfer the contents of a web application to Azure File service in a cloud migration project. The issue is that it does not copy empty folders by default, so after the content transfer the application throws exceptions in many cases where it expects the folder structure to be present. My question is: is there any command in AzCopy to copy the empty folders as well?
I checked the documentation. It mentions that AzCopy does not upload or download empty folders; for copying, nothing is said. So we were looking for a command to do this but were unable to find one. Is this even possible with the tool?
Thanks in advance.
AzCopy does not copy empty folders, as you discovered, so you would need to recreate the folder structure yourself. A PowerShell command to help with this:
(gci C:\Scripts -r | ? {$_.PSIsContainer -eq $True}) | ? {$_.GetFiles().Count -eq 0} | select FullName
(PowerShell example from this article)
This PowerShell line gives you a list of all folders that contain no files. You can then pass that list to your folder-creation routine to solve your problem.
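If you would rather script the whole fix end to end, here is a minimal Python sketch (paths and the function name are placeholders, not from the original question) that walks the source tree and recreates any directory missing at the destination:

```python
import os

def replicate_dirs(source_root: str, dest_root: str) -> list:
    """Recreate under dest_root every directory found under source_root.

    AzCopy skips empty folders, so after the file copy we walk the
    source tree and mkdir any directory missing at the destination.
    Returns the list of directories that had to be created.
    """
    created = []
    for dirpath, _dirnames, _filenames in os.walk(source_root):
        rel = os.path.relpath(dirpath, source_root)
        target = dest_root if rel == "." else os.path.join(dest_root, rel)
        if not os.path.isdir(target):
            os.makedirs(target)
            created.append(target)
    return created
```

Run it once after the AzCopy transfer completes; directories that already exist at the destination are left untouched.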
Related
I am brand new to Azure.
I have created a Data Lake Gen2 storage account and a container inside it, and saved some files and folders in it. I want to list all the files and folders in an Azure Synapse notebook so that I can process a particular file. I am using this command:
mssparkutils.fs.ls("abfss://iogen2@demoadlsgen2.dfs.core.windows.net/first/")
but it is giving me only one entry:
[FileInfo(path=abfss://iogen2@demoadlsgen2.dfs.core.windows.net/first/stocks, name=stocks, size=0)]
I want my answer in a list like:
'abfss://iogen2@demoadlsgen2.dfs.core.windows.net/first/stocks/',
'abfss://iogen2@demoadlsgen2.dfs.core.windows.net/first/stocks/2022-03-06/',
'abfss://iogen2@demoadlsgen2.dfs.core.windows.net/first/stocks/2022-03-06/csv_files/',
'abfss://iogen2@demoadlsgen2.dfs.core.windows.net/first/stocks/2022-03-06/csv_files/demo.csv'
When I use os.listdir, it gives an error:
FileNotFoundError: [Errno 2] No such file or directory:
Can anyone please help me with this?
As per the repro from my end, listing the folder named sample shows all the files it contains.
If you want to use os.listdir, you need to use the file mount/unmount API in Synapse to mount the storage first.
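A small recursive helper can flatten the listing into the desired list of full paths. This is a hedged sketch: `Entry` and `is_dir` stand in for the FileInfo objects returned by `mssparkutils.fs.ls` (check which attribute your runtime exposes for the folder flag), and the listing function is injected so the same logic works anywhere:

```python
from typing import Callable, Iterable, List

class Entry:
    """Minimal stand-in for the FileInfo objects returned by mssparkutils.fs.ls."""
    def __init__(self, path: str, name: str, is_dir: bool):
        self.path, self.name, self.is_dir = path, name, is_dir

def deep_ls(path: str, ls: Callable[[str], Iterable[Entry]]) -> List[str]:
    """Recursively collect the full paths of everything under `path`.

    Folders get a trailing slash, matching the output the question asks for.
    """
    out: List[str] = []
    for e in ls(path):
        out.append(e.path + ("/" if e.is_dir else ""))
        if e.is_dir:
            out.extend(deep_ls(e.path, ls))
    return out
```

In a Synapse notebook you would pass the abfss root and `mssparkutils.fs.ls` (adapted to its actual FileInfo attributes) instead of a fake listing function.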
I can store files to the specified storage location via the GUI. I can see the files are in the storage location.
When I try to download them using the GUI, I get this every time.
{"error":{"code":3,"message":"Unauthorized request","class":"Directus\\Exception\\UnauthorizedException","file":"\/var\/www\/directus\/src\/helpers\/app.php","line":287}}
When I try the links from the File library, I get the same error.
I found some old topics concerning a "_" project. I do not see any "_" entries in my project.php configuration.
Everyone has read permissions for the storage directory.
The rest of the system appears to run without error.
Check the folder on the server and what has been set; the default should be something like:
Then, if you want to access the 300x300 version, the URL should look like:
domain.com/public/uploads/Directus/generated/w300,h300,fcrop,q80/file-name.jpg
I want to move all the files that are in a file share on Azure. At the moment I do it the following way:
Use the "net" command to connect to the network drive and assign it a drive letter.
Then a .bat file, run every hour by Windows Task Scheduler, uses the "move" command (e.g. move g:\files\* c:\files) to check for files and move them.
But I don't want to use this way because:
The drive will be disconnected if the machine needs a restart, so the process is no longer automated: someone has to mount the drive again.
The "move" command doesn't move folders, only files.
Is there a better way of managing this? We don't want to install tools like AzCopy, but using PowerShell is feasible.
Based on your description, I suggest the following approaches:
1. Call AzCopy from PowerShell.
First install AzCopy; you can download the latest version from the link. AzCopy supports uploading a directory to an Azure file share with the following command:
AzCopy /Source:C:\myfolder /Dest:https://myaccount.file.core.windows.net/myfileshare/ /DestKey:key /S
Then you could write a .bat script to call AzCopy; by default the AzCopy install directory is C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy.
cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
&.\AzCopy.exe /Source:C:\myfolder /Dest:https://myaccount.file.core.windows.net/myfileshare/ /DestKey:key /S
2. Use a PowerShell script; you could refer to this link. The script is written for ASM mode, so you would need to change some commands.
Based on my experience, using AzCopy is easier and simpler.
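If you end up scripting the call anyway, assembling the argument list in one place keeps the key out of the script body. A hypothetical Python sketch (account, share, key, and the AzCopy path are placeholders):

```python
import subprocess

def build_azcopy_args(exe: str, source: str, dest: str, dest_key: str) -> list:
    """Assemble the classic AzCopy argument list for a recursive (/S) upload."""
    return [exe, f"/Source:{source}", f"/Dest:{dest}",
            f"/DestKey:{dest_key}", "/S"]

def run_azcopy(args: list) -> int:
    """Invoke AzCopy and return its exit code (0 means success)."""
    return subprocess.run(args).returncode
```

Passing the arguments as a list avoids quoting problems with paths that contain spaces.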
The issue was resolved using the following approach:
Delete any existing mapping for the drive letter we are using with the net use <drive_letter> /delete command. This ensures the drive is detached since the last time the script ran.
Map the drive again using the net use command
Copy all the files using robocopy
After that, delete all the files using the del command
Disconnect the drive now using the net use <drive_letter> /delete command
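The resolved steps above can be captured as one command sequence. This is a sketch with a hypothetical drive letter, share, and destination, and credentials deliberately omitted:

```python
def build_move_commands(drive: str, share: str, dest: str) -> list:
    """Return the command sequence from the steps above:
    detach, remap, copy with robocopy, delete, detach again.
    drive, share, and dest are placeholders."""
    return [
        f"net use {drive} /delete",       # 1. drop any stale mapping
        f"net use {drive} {share}",       # 2. map the file share again
        f"robocopy {drive}\\ {dest} /E",  # 3. copy files and subfolders
        f"del /S /Q {drive}\\*",          # 4. delete the copied files
        f"net use {drive} /delete",       # 5. disconnect the drive
    ]
```

Running each command and stopping on a non-zero exit code keeps a failed mapping from cascading into the delete step.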
I am trying to zip the contents of a folder in SSIS. There are files and folders in the source folder and I need to zip them all individually. I can get the files to zip fine; my problem is the folders.
I have to use 7-Zip to create the zipped packages.
Can anyone point me to a good tutorial. I haven't been able to implement any of the samples that I have found.
Thanks
This is how I have configured it.
It's easy to configure, but the trick is in constructing the Arguments. Though you see the Arguments as static in the screenshot, they actually come from a variable, and that variable is set in the Arguments expression of the Execute Process Task.
I presume you will have this Execute Process Task in a Foreach File Enumerator with Traverse Subfolders checked.
Once you have this basic setup in place, all you need to do is work on building the arguments to do the zipping, how you want them. A good place to find all the command line arguments is here.
Finally, the only issue I ran into was not providing a working directory in the command-line arguments for 7-Zip. The package ran fine in my dev environment but failed when run on the server via a SQL job. This was because 7-Zip didn't have access to the 'Temp' folder on the SQL Server, which it uses by default as the working directory. I got round this problem by specifying the working directory at the end of the command-line arguments, using the -w switch:
For example:
a -t7z DestinationFile.7z SourceFile -wS:YourTempDirectoryToWhichTheSQLAgentHasRights
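To zip each file or folder individually inside the loop, the Arguments variable is built per item. A hypothetical sketch of that expression's logic in Python (all paths here are placeholders):

```python
def build_7z_args(item_path: str, dest_dir: str, work_dir: str) -> str:
    """Build the 7-Zip argument string for archiving one file or folder,
    with an explicit working directory (-w) so the package also runs
    under a SQL Agent account that cannot use the default Temp folder."""
    # Take the last path component whether the caller used \ or /.
    name = item_path.rstrip("\\/").replace("\\", "/").rsplit("/", 1)[-1]
    archive = f"{dest_dir}\\{name}.7z"
    return f'a -t7z "{archive}" "{item_path}" -w{work_dir}'
```

Quoting the archive and source paths keeps spaces in folder names from breaking the call.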
I am trying to set up a backup/restore using S3. The upload sync worked well using s3sync. However, next to each folder there is an empty file with a matching name. I read somewhere that this is created to define the folder structure, but I am not sure about that, as it doesn't happen if I create a folder using a different method (s3fox, etc.).
These empty files prevent me from restoring the directories/files. When I do s3cmd sync, I get the error message "can not make directory: File exists", as it first creates that empty file, which then fails when it tries to create the directory. Any ideas how I can solve this problem?