I'm configuring a remote file storage account (on Akamai), and there is an allow/do not allow option for "az2z zip uploads". I've searched through all their documentation, checked everywhere else I thought would be relevant, and searched for the term itself, but I can't find "az2z zip" defined anywhere. Anybody have an idea what this is?
Just from recall of using Akamai back in 2006/2007.
az2z was a command that would index zip files.
After indexing, the CDN would then be able to directly address files within the zip file.
The intention was that the files were not expanded after they were uploaded, but that the CDN could efficiently seek to the offset of the file of interest within the zip and expand just that file when it was requested.
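Just to illustrate the mechanism (this is not Akamai's actual tooling; the archive and entry names below are made up): a zip's central directory records where each member starts, so a single entry can be located and inflated without unpacking the whole archive. A minimal sketch in Java:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;

public class ZipMemberRead {
    public static void main(String[] args) throws IOException {
        // Open the archive without extracting it; ZipFile reads the central
        // directory, which records the offset of every member.
        try (ZipFile zip = new ZipFile("uploaded-package.zip")) {          // made-up archive name
            ZipEntry entry = zip.getEntry("docs/report.pdf");              // made-up member name
            if (entry != null) {
                // Inflate just this one member on demand.
                try (InputStream in = zip.getInputStream(entry)) {
                    Files.copy(in, Paths.get("report.pdf"));
                }
            }
        }
    }
}
```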
I'm trying to upload files (xlsx, docx, txt, pdf, img, etc.) saved in AWS cloud storage to OneDrive using the HTTP module (Get a file) in Integromat.
The contents of the file are uploaded to OneDrive without any corruption, but the file name is always "file.extension".
Every type of file gives the same result.
When I check the operation in Integromat, I can see that the file name is already "file.extension" in the output of the Get a file module.
I'm hoping you can point out what I'm doing wrong.
In addition, my colleague tried uploading a file with the same settings as above, and it was successfully uploaded with the correct file name.
Thank you.
One way would be to grab the file name from the URL. (I don't have enough reputation to embed images as yet, so hopefully the links will work)
Steps involved might look like this:
Step 1 through Step 4 (linked as screenshots of the Integromat scenario; the images are not reproduced here)
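The underlying idea, independent of Integromat: take the path portion of the source URL and keep whatever comes after the last "/". A quick sketch in Java, with a made-up URL:

```java
import java.net.URI;

public class FileNameFromUrl {
    public static void main(String[] args) {
        // Hypothetical pre-signed S3 URL; only the path part matters here.
        String url = "https://example-bucket.s3.amazonaws.com/reports/2021/summary.xlsx?X-Amz-Signature=abc123";
        String path = URI.create(url).getPath();                      // "/reports/2021/summary.xlsx"
        String fileName = path.substring(path.lastIndexOf('/') + 1);  // keep everything after the last "/"
        System.out.println(fileName);                                 // prints "summary.xlsx"
    }
}
```

Inside Integromat the same extraction would presumably be done on the URL field within the scenario, which is what the steps above walk through.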
I need to upload files and folders to the server while preserving the hierarchy. At the moment I am using a plugin, multiFileUpload, that allows you to upload multiple files at the same time, but it ignores the selected folders. I know that neither Vaadin nor HTML5 has a universal solution for uploading folders that works everywhere.
I'm ready to write my own solution, but after scouring the Internet I can't find a way to display a folder selection dialog (perhaps there is a JavaScript call for it). The main question, though: is it somehow possible to POST a request to Vaadin and upload the files in a way that recreates the subfolders they were in?
You can only upload files, not folders. It's simply not doable.
You can upload any number of files, but they won't be structured into folders.
I see two possibilities for how you could still achieve what you need if you really wanted to, even if it changes the user experience a bit:
Let the user upload a .zip file of their folder structure. When they upload it, you unzip it on the server side and then have access to all the files in the correct folder structure (see the sketch after these options).
Let the user upload all their files from within the folder structure. After all files have been uploaded, you display them in a TreeGrid where the user can recreate the original structure using drag-and-drop or similar.
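A minimal sketch of the first option, assuming the uploaded .zip arrives as an InputStream from your upload receiver (class, directory, and file names here are placeholders, not Vaadin API):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class ZipExpander {

    // Expands an uploaded zip stream under targetDir, recreating the folder
    // hierarchy that the user zipped on the client side.
    public static void expand(InputStream uploadStream, Path targetDir) throws IOException {
        try (ZipInputStream zip = new ZipInputStream(uploadStream)) {
            ZipEntry entry;
            while ((entry = zip.getNextEntry()) != null) {
                Path out = targetDir.resolve(entry.getName()).normalize();
                if (!out.startsWith(targetDir)) {
                    continue; // skip entries that would escape the target directory
                }
                if (entry.isDirectory()) {
                    Files.createDirectories(out);
                } else {
                    Files.createDirectories(out.getParent());
                    Files.copy(zip, out, StandardCopyOption.REPLACE_EXISTING);
                }
                zip.closeEntry();
            }
        }
    }
}
```

The entry names inside the zip carry the relative paths, which is what preserves the hierarchy; the startsWith check just guards against malicious entries like ../../etc/passwd.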
I have been given a program that uploads PDF files to an FTP server, which is something I have never done before. I've been asked what the behavior is when attempting to upload a duplicate filename. The program apparently doesn't check for duplicate filenames itself, but the command that uploads the file is My.Computer.Network.UploadFile, and I can't find anywhere what happens when a duplicate file is uploaded. Does it throw an exception or overwrite the file?
It looks like My.Computer.Network.UploadFile is a wrapper around WebClient.UploadFile, and the documentation for that states:
This method uses the STOR command to upload an FTP resource.
In the FTP RFC 959 it says (I highlighted the relevant part):
STORE (STOR)
This command causes the server-DTP to accept the data
transferred via the data connection and to store the data as
a file at the server site. If the file specified in the
pathname exists at the server site, then its contents shall
be replaced by the data being transferred. A new file is
created at the server site if the file specified in the
pathname does not already exist.
So, if everything is following standards (and that part of RFC 959 hasn't been replaced, I didn't dig further!), then it should replace the existing file. However, it is possible for the server to deny overwriting of existing files, so the behavior is not guaranteed.
Of course, the best thing to do would be to try it out in your environment and see what it does.
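If you want to script that test rather than click through a GUI client, here is a rough sketch using Apache Commons Net (not the VB.NET call from the question; host, credentials, and filenames are placeholders): upload two different local files under the same remote name and see whether the second STOR succeeds and what the server replies.

```java
import java.io.FileInputStream;

import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

public class OverwriteTest {
    public static void main(String[] args) throws Exception {
        FTPClient ftp = new FTPClient();
        ftp.connect("ftp.example.com");          // placeholder host
        ftp.login("user", "password");           // placeholder credentials
        ftp.enterLocalPassiveMode();
        ftp.setFileType(FTP.BINARY_FILE_TYPE);

        // storeFile issues STOR under the hood, so the second call should
        // replace the first upload unless the server is configured to refuse it.
        try (FileInputStream first = new FileInputStream("a.pdf");
             FileInputStream second = new FileInputStream("b.pdf")) {
            System.out.println("first upload ok:  " + ftp.storeFile("test.pdf", first));
            System.out.println("second upload ok: " + ftp.storeFile("test.pdf", second));
            System.out.println("server reply: " + ftp.getReplyString());
        }

        ftp.logout();
        ftp.disconnect();
    }
}
```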
So we have a client that creates "training packages" and then uploads them via ftp to their website. They create the training packages in PowerPoint, and then use some program to convert them into html/swf files and package them within a folder. When they upload, they use Filezilla, and just transfer the entire folder over. The folder is uniquely named, uses no spaces or special characters.
These files have uploaded fine for about a year. Recently, they've run into a problem. Whenever they try to upload a training package folder, they are immediately presented with the "This file already exists, do you want to overwrite?" message. Except... the folder they're moving is brand new, and the file it's asking to overwrite DOESN'T EXIST. When they choose "Overwrite", the file looks like it transfers, but the file size is wrong, and the training package doesn't work correctly.
This happens with every training package they try to upload. It's not just a badly outputted package. Also, it's always the same file that has the problem--it's the main "player" for the training package, and though it contains different content for every package, it is the same file name (cplayer.swf) every time.
Things they've tried without success:
-Re-uploading the file again by itself, and overwriting
-Deleting the "bad" file and re-uploading the single file - Get the overwrite message again, even though the file DOES NOT EXIST.
-Renaming the file on the server and re-uploading the single file - Get the overwrite message.
-Renaming the single file locally within the package and uploading/renaming it - Won't let us rename because the file already exists.
-Used another FTP client - Same results as above, so not a client specific problem.
-Used a different FTP login - Same results as above, so not a permissions problem.
Other things of note:
-The file is small--it's not a timeout problem. Plus, all other files upload fine, and some are a lot larger.
-They've emailed this file to me, and I've uploaded it successfully.
I am completely at my wits end. Does anyone have any ideas where I can at least troubleshoot a little further?
Thanks for the non-help, the downvote, and the general lack of response on what was a pretty serious issue for me.
In case anyone else has a similar problem, here's what was going on:
Antivirus software (specifically Malwarebytes) was blocking THIS ONE SINGLE FILE. All I had to do was exclude the folder that contained the file.
We're using Amazon S3 for file storage and recently found out that we need to keep some sort of directory structure. Since S3 doesn't allow that, we know we can name the files according to their structure for storage. For example...
abc/123/draft.doc
What I want to know is: if I want to provide a public link to this particular file, is there any way that the file can simply be draft.doc instead of abc/123/draft.doc?
I feel stupid. After some more investigation I realized that by creating a GET URL to the resource, I get exactly what I need.
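For anyone landing here later: one way to get that behavior from a GET URL (an assumption about how the above was done, since the details aren't given) is a pre-signed URL that overrides Content-Disposition, so the object keeps its abc/123/draft.doc key but the browser saves the download as draft.doc. With the AWS SDK for Java, for example:

```java
import java.net.URL;
import java.util.Date;

import com.amazonaws.HttpMethod;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;
import com.amazonaws.services.s3.model.ResponseHeaderOverrides;

public class PresignedLink {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Ask S3 to send the object back with a friendlier download name.
        ResponseHeaderOverrides headers = new ResponseHeaderOverrides()
                .withContentDisposition("attachment; filename=draft.doc");

        GeneratePresignedUrlRequest request =
                new GeneratePresignedUrlRequest("my-bucket", "abc/123/draft.doc") // placeholder bucket
                        .withMethod(HttpMethod.GET)
                        .withExpiration(new Date(System.currentTimeMillis() + 3_600_000L)) // 1 hour
                        .withResponseHeaders(headers);

        URL url = s3.generatePresignedUrl(request);
        System.out.println(url); // share this link; it downloads as draft.doc
    }
}
```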