The file name becomes "file.extension" whenever I upload a file to OneDrive using Integromat

I'm trying to upload files (xlsx, docx, txt, pdf, img, etc.) saved in AWS cloud storage to OneDrive using the HTTP module (Get a file) in Integromat.
The contents of the file are uploaded to OneDrive without any corruption, but the file name is always "file.extension".
Every file type produced the same result.
When I checked the operation details in Integromat, I found that the file name was already "file.extension" in the output of Get a file.
I'm hoping you can point out what I'm doing wrong.
In addition, my colleague tried uploading a file with the same settings as above, and it was successfully uploaded with the correct file name.
Thank you.

One way would be to grab the file name from the URL. (I don't have enough reputation to embed images as yet, so hopefully the links will work)
The steps involved might look like this (Steps 1 through 4 were linked as screenshots in the original answer); a sketch of the idea follows below.
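In essence, the file name is whatever comes after the last "/" in the URL path, before any query string; the screenshots presumably show that mapping being set up in Integromat. Purely to illustrate the string logic (the URL below is made up), a minimal Java sketch:

```java
import java.net.URI;
import java.net.URISyntaxException;

public class FileNameFromUrl {

    // Extracts the last path segment of a URL, e.g. "report.xlsx" from
    // "https://example-bucket.s3.amazonaws.com/docs/report.xlsx?X-Amz-Signature=..."
    static String fileNameFromUrl(String url) throws URISyntaxException {
        String path = new URI(url).getPath();            // "/docs/report.xlsx" (query string already stripped)
        return path.substring(path.lastIndexOf('/') + 1);
    }

    public static void main(String[] args) throws URISyntaxException {
        String url = "https://example-bucket.s3.amazonaws.com/docs/report.xlsx?X-Amz-Signature=abc";
        System.out.println(fileNameFromUrl(url));         // prints "report.xlsx"
    }
}
```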

Related

How to add the date to a file name while uploading a file to an S3 bucket using Alteryx

I have a workflow in Alteryx where I am downloading two files from two different URLs. After making the required modifications I want to upload them to an S3 bucket as well as save a copy locally, and I want to add the current date to the file name in both cases. I was able to use a Formula tool to rename the file saved locally, but I can't do the same for the copy being uploaded to S3. Can anyone help me with this? PS: Since it's company data I can't share a screenshot of the workflow.
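The renaming being asked about comes down to inserting the current date before the extension, the same logic the Formula tool applies to the local copy. A minimal Java sketch of that logic (the file and prefix names are made up):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DatedFileName {

    // Inserts today's date before the extension: "sales.csv" -> "sales_YYYY-MM-DD.csv"
    static String withDate(String fileName) {
        String date = LocalDate.now().format(DateTimeFormatter.ISO_LOCAL_DATE);
        int dot = fileName.lastIndexOf('.');
        if (dot < 0) {
            return fileName + "_" + date;
        }
        return fileName.substring(0, dot) + "_" + date + fileName.substring(dot);
    }

    public static void main(String[] args) {
        // Hypothetical object key for the S3 upload
        System.out.println("reports/" + withDate("sales.csv"));
    }
}
```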

Vaadin 8 multiupload with folder selection

I need to upload files and folders to the server while preserving the hierarchy. At the moment I am using the multiFileUpload plugin, which allows uploading multiple files at the same time, but it ignores selected folders. I know that neither Vaadin nor HTML5 has a universal, works-everywhere solution for uploading folders.
I'm ready to write my own solution, but I've scoured the Internet and can't find a way to display a folder-selection dialog (perhaps there is a JavaScript call for it). The main question, though: is it possible to somehow POST the request to Vaadin and upload the files in a way that recreates the subfolders they came from?
You can only upload files, not folders. It's simply not doable.
You can upload any number of files, but they won't be structured into folders.
I see two possibilities for how you could still achieve what you need, even if it changes the user experience a bit:
-Let the user upload a .zip file of their folder structure. When they upload it, you unzip it on the server side and then have access to all the files in the correct folder structure (a sketch follows after this list).
-Let the user upload all of their files; since the hierarchy is lost on upload, you then display all the files in a TreeGrid where the user can recreate the original structure using drag-and-drop or similar.
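A minimal sketch of the first option, assuming the received .zip has already been written to a temporary file on the server; java.util.zip preserves the folder hierarchy encoded in the entry names (the paths used here are hypothetical):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Enumeration;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;

public class ZipExtractor {

    // Unpacks an uploaded zip into targetDir, recreating the folder structure
    // stored in the entry names (e.g. "docs/2023/report.pdf").
    static void extract(Path zipPath, Path targetDir) throws IOException {
        try (ZipFile zip = new ZipFile(zipPath.toFile())) {
            Enumeration<? extends ZipEntry> entries = zip.entries();
            while (entries.hasMoreElements()) {
                ZipEntry entry = entries.nextElement();
                Path target = targetDir.resolve(entry.getName()).normalize();
                if (!target.startsWith(targetDir)) {
                    continue; // skip entries that try to escape the target dir ("zip slip")
                }
                if (entry.isDirectory()) {
                    Files.createDirectories(target);
                } else {
                    Files.createDirectories(target.getParent());
                    try (InputStream in = zip.getInputStream(entry)) {
                        Files.copy(in, target);
                    }
                }
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical paths: the zip the user uploaded and where to unpack it
        extract(Paths.get("/tmp/upload.zip"), Paths.get("/srv/uploads/job-42"));
    }
}
```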

FTP client sees a file that isn't there... How can I successfully delete/overwrite this "ghost" file?

So we have a client that creates "training packages" and then uploads them via ftp to their website. They create the training packages in PowerPoint, and then use some program to convert them into html/swf files and package them within a folder. When they upload, they use Filezilla, and just transfer the entire folder over. The folder is uniquely named, uses no spaces or special characters.
These files have uploaded fine for about a year. Recently, they've run into a problem. Whenever they try to upload a training package folder, they are immediately presented with the "This file already exists, do you want to overwrite?" message. Except... the folder they're moving is brand new, and the file it's asking to overwrite DOESN'T EXIST. When they choose "Overwrite", the file looks like it transfers, but the file size is wrong and the training package doesn't work correctly.
This happens with every training package they try to upload. It's not just a badly outputted package. Also, it's always the same file that has the problem--it's the main "player" for the training package, and though it contains different content for every package, it is the same file name (cplayer.swf) every time.
Things they've tried without success:
-Re-uploading the file again by itself, and overwriting
-Deleting the "bad" file and re-uploading the single file - Get the overwrite message again, even though the file DOES NOT EXIST.
-Renaming the file on the server and re-uploading the single file - Get the overwrite message.
-Renaming the single file locally within the package and uploading/renaming it - Won't let us rename because the file already exists.
-Used another FTP client - Same results as above, so not a client specific problem.
-Used a different FTP login - Same results as above, so not a permissions problem.
Other things of note:
-The file is small--it's not a time out problem. Plus, all other files upload fine, and some are a lot larger.
-They've emailed this file to me, and I've uploaded it successfully.
I am completely at my wits end. Does anyone have any ideas where I can at least troubleshoot a little further?
Thanks for the non-help, the downvote, and the general lack of response on what was a pretty serious issue for me.
In case anyone else has a similar problem, here's what was going on:
Antivirus software (specifically Malwarebytes) was blocking THIS ONE SINGLE FILE. All I had to do was exclude the folder that contained the file.

What is an az2z zip?

I'm configuring a remote file storage account (on Akamai), and there is an allow/do not allow setting for "az2z zip uploads". I've searched through all their documentation, checked everywhere else I thought would be relevant, and searched the web, but I can't find "az2z zip" defined anywhere. Does anybody have an idea what this is?
This is just from memory of using Akamai back in 2006/2007.
az2z was a command that would index zip files.
After indexing, the CDN would then be able to directly address files within the zip file.
The intention was that files were not expanded after they were uploaded; instead, when a file inside the zip was requested, the CDN could efficiently seek to its offset and expand just that file.
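To make the "offset and expand" idea concrete: a zip's central directory is an index of where each entry lives, so a single file can be located and decompressed without unpacking the whole archive. A small Java illustration (the archive and entry names are hypothetical, and java.util.zip stands in for whatever index az2z actually built):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;

public class SingleEntryFromZip {
    public static void main(String[] args) throws IOException {
        // Hypothetical archive and entry names
        try (ZipFile zip = new ZipFile("/var/storage/site-assets.zip")) {
            // Looked up via the zip's central directory, not by scanning the whole archive
            ZipEntry entry = zip.getEntry("images/logo.png");
            if (entry == null) {
                System.err.println("entry not found");
                return;
            }
            // Decompress just this entry and save it
            try (InputStream in = zip.getInputStream(entry)) {
                Files.copy(in, Paths.get("/tmp/logo.png"));
            }
        }
    }
}
```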

Orchard CMS 1.7 Media with S3 Storage

I want to use Orchard 1.7 with media storage on S3 (as I'm deploying to AppHarbor).
So far I'm looking at the S3 Storage provider, but it's a bit out of date.
Has anyone done this? Is there a better way to use S3 with the new media manager?
I've got images uploading to S3, but they don't display when I click the folder.
Here is the Gist of my updated S3Provider.
It's missing methods for creating a file, renaming a folder, getting a file, and getting the storage path; any help on how to complete these would be appreciated... However, stepping through the debugger in VS, this doesn't seem to be the root cause of the image display issue above.
Edit
Looks like the file is uploading to S3 but not to the database, due to the GetFile method throwing an error...
Edit 2
Added some code to the GetFile method and that now works (gist updated); I can upload images. However, the thumbnails are still not working; they just come back as empty tags... I think this is because the media manager is using the Open method on the file, which is supposed to open the file so you can write a stream to it. I don't know how to achieve this with S3... any ideas welcome.
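The provider in the Gist is C#, but the usual workaround for "open a file so you can write a stream to it" on S3 is to hand the caller an in-memory stream and push the buffered bytes to S3 when it is closed. A rough sketch of that idea in Java with the AWS SDK for Java v1 (the bucket and key below are hypothetical):

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectMetadata;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

// A writable "file" stream for S3: callers write into it like a local file,
// and the buffered bytes are pushed to S3 when the stream is closed.
public class S3OutputStream extends ByteArrayOutputStream {

    private final AmazonS3 s3;
    private final String bucket;
    private final String key;

    public S3OutputStream(AmazonS3 s3, String bucket, String key) {
        this.s3 = s3;
        this.bucket = bucket;
        this.key = key;
    }

    @Override
    public void close() throws IOException {
        super.close();
        ObjectMetadata meta = new ObjectMetadata();
        meta.setContentLength(size());
        // Upload everything that was written to this stream as one S3 object
        s3.putObject(bucket, key, new ByteArrayInputStream(toByteArray()), meta);
    }

    public static void main(String[] args) throws IOException {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        // Hypothetical bucket/key standing in for a thumbnail the media manager wants to write
        try (OutputStream out = new S3OutputStream(s3, "my-media-bucket", "Media/Default/thumbs/photo.jpg")) {
            out.write("...image bytes...".getBytes(StandardCharsets.UTF_8));
        }
    }
}
```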
As part of the AWSSDK NuGet package, version 1.5.28.3, you can access an S3FileInfo object. I've used this in my S3 storage file and updated the S3 storage provider.
This seems to work; I need to do a bit more testing on it.
NOTE: I had to add some code to the GetFile method to ensure the permissions were set correctly; otherwise, updating the thumbnails overwrote the permissions on the file... I'm sure there is a better way to do this.
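The note above refers to the C# provider, but the underlying fix is re-applying the object's ACL after the object is rewritten. Assuming the media files are meant to be public-read (an assumption, not stated above), the equivalent call in the AWS SDK for Java v1 looks roughly like this (bucket and key are hypothetical):

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.CannedAccessControlList;

public class RestorePublicRead {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Hypothetical bucket/key; re-apply public-read so the media file stays
        // publicly readable after a thumbnail update rewrites the object.
        s3.setObjectAcl("my-media-bucket", "Media/Default/images/photo.jpg",
                CannedAccessControlList.PublicRead);
    }
}
```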