Update a Word file that has been created in ACC

I need to update a Word file that has been created in ACC. I can download the file, but when I try to upload it again, I get the error: 'Only the bucket creator is allowed to access this api.'
It seems you can only upload files to buckets the application itself has created. Is this correct?
Note that I don't want to create a new version of the file.

It looks like you were uploading the new file directly to the bucket wip.dm.prod, which is owned by Autodesk cloud products such as BIM 360 Docs / Autodesk Docs (ACC Docs). The error is expected: since you are not the bucket owner, you cannot write to that bucket directly.
To upload a new version of a file to Autodesk cloud products, you need to do the following (a rough sketch of both calls follows the note below):
1) Request a storage location: https://forge.autodesk.com/en/docs/bim360/v1/tutorials/document-management/upload-document/#step-5-create-a-storage-object
2) Create an additional version of the file that points at the updated file: https://forge.autodesk.com/en/docs/bim360/v1/tutorials/document-management/upload-document/#step-5-create-a-storage-object
Note: the Forge Data Management API is forward compatible with ACC.
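
As a rough, untested sketch of those two calls (Node 18+ with the global fetch; the token, project ID, item ID, folder URN and the actual byte upload to the storage object are placeholders or elided, so check everything against the tutorial linked above):

const BASE = 'https://developer.api.autodesk.com/data/v1';

async function createNewVersion(token, projectId, itemId, folderUrn, fileName) {
  const headers = {
    'Authorization': 'Bearer ' + token,
    'Content-Type': 'application/vnd.api+json'
  };

  // Step 1: ask Docs for a storage location in its own bucket (wip.dm.prod).
  const storage = await (await fetch(BASE + '/projects/' + projectId + '/storage', {
    method: 'POST',
    headers: headers,
    body: JSON.stringify({
      jsonapi: { version: '1.0' },
      data: {
        type: 'objects',
        attributes: { name: fileName },
        relationships: { target: { data: { type: 'folders', id: folderUrn } } }
      }
    })
  })).json();
  const objectId = storage.data.id; // upload the new file's bytes to this object next (elided)

  // Step 2: register the uploaded object as an additional version of the existing item.
  return (await fetch(BASE + '/projects/' + projectId + '/versions', {
    method: 'POST',
    headers: headers,
    body: JSON.stringify({
      jsonapi: { version: '1.0' },
      data: {
        type: 'versions',
        attributes: {
          name: fileName,
          extension: { type: 'versions:autodesk.bim360:File', version: '1.0' }
        },
        relationships: {
          item: { data: { type: 'items', id: itemId } },
          storage: { data: { type: 'objects', id: objectId } }
        }
      }
    })
  })).json();
}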

Related

How to upload a 9GB file with the extension .sql.gz to BigQuery Sandbox

I want to upload a 9.6 GB file with the extension .sql.gz to my BigQuery Sandbox (free) account. I received a message that the file is too big and that I need to upload it from the cloud. When I try to upload it from the cloud, I am asked to create a bucket, and when I try to create a bucket I get the message: "billing must be enabled". Is there any alternative, specifically for a .sql.gz file?
As of now, there is no alternative but to upload the .gz file to a bucket in Cloud Storage and use the bq command-line tool to create a new table from it.
You can enable billing on your existing project to use Cloud Storage.
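
The answer above uses the bq command-line tool; for illustration, here is a minimal sketch of the same load step with the official Node.js client libraries instead, assuming the data has been re-exported as a gzipped CSV (BigQuery load jobs cannot ingest a raw SQL dump) and that billing is enabled. The dataset, table and bucket names are placeholders:

// Sketch only: assumes a gzipped CSV already uploaded to Cloud Storage.
const { BigQuery } = require('@google-cloud/bigquery');
const { Storage } = require('@google-cloud/storage');

async function loadFromBucket() {
  const bigquery = new BigQuery();
  const storage = new Storage();

  await bigquery
    .dataset('my_dataset')
    .table('my_table')
    .load(storage.bucket('my-bucket').file('dump.csv.gz'), {
      sourceFormat: 'CSV',
      autodetect: true // let BigQuery infer the schema from the file
    });
  console.log('Load complete');
}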

How can I set up a file upload function?

I am creating an SAPUI5 web app with a file upload function, following this example from SAPUI5 Explored: sap.m.sample.UploadCollection
I tried to set up the upload function (UploadCollection) with my trial account in SAP Web IDE.
The issue is that it does not allow me to upload a file to the project folder or to a local desktop folder.
If I upload a file it appears, but I can't open it and I get a 405 HTTP error.
Any ideas what the problem is?
As you can already see in the comments on your post, you need a backend for this task. The UploadCollection control is only usable with a backend in the background that receives the file transmitted by the control.
On https://sapui5.hana.ondemand.com/#/api/sap.m.UploadCollection you will find:
"This control allows you to upload single or multiple files from your devices (desktop, tablet or phone) and attach them to the application"
where you can read "application" as "receiving backend".
Independent of this, may I ask where you think the file should be uploaded if not to a backend system? When you choose a file from your local storage, it makes no sense to upload it back to your local storage.
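
For illustration, a minimal sketch of the point above, assuming a UI5 version where UploadCollection is still available; "/upload" is a placeholder for a backend endpoint that accepts the multipart POST (without one, you get exactly the kind of 405 described in the question):

// Sketch only: "/upload" must be implemented by a server; the control
// itself never stores files anywhere.
sap.ui.require(["sap/m/UploadCollection"], function (UploadCollection) {
  var oUploadCollection = new UploadCollection({
    uploadUrl: "/upload", // placeholder: backend endpoint that receives the file
    multiple: true,
    uploadComplete: function (oEvent) {
      // the backend's HTTP response tells you whether the upload succeeded
      console.log("Upload finished:", oEvent.getParameter("files"));
    }
  });
  oUploadCollection.placeAt("content");
});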

How can I list all uploads for a project?

I would like to access the list of all uploads that have been added to a given project on my company's GitLab server.
I don't mean versioned files; I mean attached files: binaries and other types of files that have been attached to issues, merge requests, etc.
It's OK if I have to use the API for that.
What I've tried
My first approach was through GET /projects/:id/repository/files/:file_path, but that is for versioned files.
Then, I found out about POST /projects/:id/uploads, but that's only for uploading and not for listing already uploaded files.
Is there a way to list all those uploaded files?
I believe this is not possible.
There is an open issue for retrieving specific files which has not received much attention:
https://gitlab.com/gitlab-org/gitlab-ce/issues/55520
Hopefully, there will eventually be an endpoint:
GET /projects/:id/uploads
I had the same question, and after getting in touch with GitLab support they confirmed that this is not currently implemented (as of November 2021). They forwarded me the following three feature requests:
API list all files on a project : https://gitlab.com/gitlab-org/gitlab/-/issues/197361
Attachment Manager : https://gitlab.com/gitlab-org/gitlab/-/issues/16229
Retrieve uploaded files using API : https://gitlab.com/gitlab-org/gitlab/-/issues/25838
A workaround is to export the whole project; you'll find the uploads in the export archive and will be able to list them there (see the sketch below).
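
A rough sketch of that workaround with the GitLab REST export endpoints (Node 18+ with the global fetch; the URL, project ID and token are placeholders):

const GITLAB_URL = 'https://gitlab.example.com'; // placeholder
const PROJECT_ID = 123;                          // placeholder
const TOKEN = 'your-private-token';              // placeholder

const base = GITLAB_URL + '/api/v4/projects/' + PROJECT_ID;
const headers = { 'PRIVATE-TOKEN': TOKEN };

async function downloadUploadsViaExport() {
  // 1. Schedule an export of the whole project.
  await fetch(base + '/export', { method: 'POST', headers });

  // 2. Poll until the export has finished.
  let status;
  do {
    await new Promise(r => setTimeout(r, 5000));
    const res = await (await fetch(base + '/export', { headers })).json();
    status = res.export_status;
  } while (status !== 'finished');

  // 3. Download the archive; the attachments should sit in an
  //    uploads/ directory inside it.
  return fetch(base + '/export/download', { headers });
}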

How to create an interaction between Google Drive and AWS S3?

I'm trying to set up a connection between a Google Drive folder and an S3 bucket, but I'm not sure where to start.
I've already created a sort of "Frankenstein process", but only I can use it easily, and sharing it with my co-workers is a pain.
I have a script that generates a plain text file and saves it into a Drive folder. To upload it, I installed Drive File Stream to sync the file to my Mac, then wrote a Python 3 script using the boto3 library that uploads the text file to different S3 buckets depending on the file name.
I was thinking that I could create a Lambda to process the file into the S3 buckets, but I cannot work out how to create the connection between Drive and S3. I would appreciate it if someone could give me some advice on how to start.
Thanks
If you simply want to connect Google Drive and AWS S3, there is a service called Zapier that provides this type of integration without writing a line of code:
https://zapier.com/apps/amazon-s3/integrations/google-drive
Check that link for more details.

How to receive an uploaded file using node.js formidable library and save it to Amazon S3 using knox?

I would like to upload a form from a web page and directly save the file to S3 without first saving it to disk. This node.js app will be deployed to Heroku, where there is no local disk to save the file to.
The node-formidable library provides a great way to upload files and save them to disk, but I am not sure how to stop formidable (or connect-form) from saving the file to disk first. The Knox library, on the other hand, provides a way to read a file from disk and save it to Amazon S3.
1) Is there a way to hook into formidable's events (the part's 'data' events) and send the stream to Knox, so that I can directly save the uploaded file to my Amazon S3 bucket?
2) Are there any libraries or code snippets that would let me take the uploaded file and save it directly to Amazon S3 using node.js?
There is a similar question here, but the answers there do not address NOT saving the file to disk.
It looks like there is no good way to do it. One reason might be that the node-formidable library saves the uploaded file to disk; I could not find any option to do otherwise. The knox library then takes the saved file from disk and uploads it to Amazon using your S3 credentials.
Since I cannot save files locally on Heroku, I ended up using the Transloadit service. Their authentication docs have some learning curve, but I found the service useful.
For those who want to use Transloadit with node.js, the following code sample may help (the Transloadit page only had Ruby and PHP examples):
// Sign the Transloadit request: HMAC-SHA1 of the request params,
// keyed with your auth secret, hex-encoded.
var crypto = require('crypto');

var signature = crypto.createHmac('sha1', 'auth secret')
  .update('some string') // replace with the params string to sign
  .digest('hex');

console.log(signature);
This is Andy, creator of AwsSum:
https://github.com/appsattic/node-awssum/
I just released v0.2.0 of this library. It uploads the files that were created by Express' bodyParser(), though as you say, this won't work on Heroku:
https://github.com/appsattic/connect-stream-s3
However, I shall be looking at adding the ability to stream from formidable directly to S3 in the next (v0.3.0) version. For the moment though, take a look and see if it can help. :)
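
For reference, the general idea being discussed here (intercepting formidable's parts before they are written to disk and piping them to S3 through knox's putStream) could be sketched as below. This is an untested, assumption-laden sketch, not a verified solution: knox's putStream needs a Content-Length up front, so the file size is assumed to arrive in a custom X-File-Size header sent by the client:

// Sketch only: untested idea; the X-File-Size header is an assumption,
// needed because S3 PUT requests require Content-Length.
var formidable = require('formidable');
var knox = require('knox');

var s3 = knox.createClient({
  key: 'AWS_KEY',       // placeholder credentials
  secret: 'AWS_SECRET',
  bucket: 'my-bucket'
});

function handleUpload(req, res) {
  var form = new formidable.IncomingForm();

  // Override onPart so file parts are streamed instead of written to disk.
  form.onPart = function (part) {
    if (!part.filename) {
      // Not a file: let formidable handle ordinary form fields as usual.
      return form.handlePart(part);
    }
    s3.putStream(part, '/' + part.filename, {
      'Content-Length': req.headers['x-file-size'], // assumed client-supplied
      'Content-Type': part.mime
    }, function (err, s3res) {
      if (err) {
        res.writeHead(500);
        return res.end('upload failed');
      }
      res.end('saved to S3');
    });
  };

  form.parse(req);
}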