Azure Static Websites feature (preview): uploading files - azure-storage

Testing the new Azure Static Websites feature: I tried it out, uploaded a file via HTML5 mobile-device camera access, and now I can't find it. Where do uploads go? Do I need to set up public access or similar? Is there "Kudu" access? Thanks for a tip.

Where do uploads go?
If you upload a file to your website, you can access it in the $web container in your storage account. The uploaded file can also be viewed in a web browser at the corresponding web endpoint, like https://yourstorage.zxx.web.core.windows.net/xxx.xxx.
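As an illustration, here is a minimal sketch of putting a file into the $web container from Node with the @azure/storage-blob SDK; the connection string, file name, and content type are placeholder assumptions:

// Minimal sketch, assuming the @azure/storage-blob Node SDK, a connection
// string in an environment variable, and a placeholder file name.
const { BlobServiceClient } = require("@azure/storage-blob");
const fs = require("fs");

async function uploadToStaticSite() {
  const service = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING
  );
  // Static-website content lives in the special $web container.
  const blob = service
    .getContainerClient("$web")
    .getBlockBlobClient("uploads/photo.jpg");

  const data = fs.readFileSync("./photo.jpg");
  await blob.upload(data, data.length, {
    // Set the content type so browsers render the file instead of downloading it.
    blobHTTPHeaders: { blobContentType: "image/jpeg" },
  });
  // The file is then reachable at the web endpoint, e.g.
  // https://<account>.zxx.web.core.windows.net/uploads/photo.jpg
}

uploadToStaticSite().catch(console.error);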
Do I need to set up a public access or similar?
No, you don't need to. The web service endpoint always allows anonymous read access, returns formatted HTML pages in response to service errors, and allows only object read operations.
Is there a "Kudu" access?
With the Azure Static Websites feature, your website is served directly from Azure Storage, so there is no Kudu access. I think you could try Storage Explorer (preview) in the portal instead.
For more details about Azure Static Websites feature, refer to this article.

Related

Service Account for Google Data Studio to Access HTML Files on Google Cloud Storage

I have some HTML files uploaded into a Google Cloud Storage bucket that I would like to embed through an iframe on my dashboard in Google Data Studio.
This works just fine when I open access to the world on the bucket (or resource) by setting the AllUsers permission.
However, I would prefer to only allow access through Google Data Studio. How can this be achieved?
I was thinking of adding a permission for the Service Account of Google Data Studio, but don't really know how to configure this correctly.
I don't believe this is possible right now.
However, a complex solution I can think of is to use a combination of Community Connectors and Community Viz:
Build a community connector that uses your own GCP service account to read the HTML files on GCS and send back the raw HTML content as data.
Build a community viz that can take the HTML data from the connector and render the HTML.
If you have multiple HTML files, you can set up filters in Data Studio so that each viz renders only one HTML file.
Code samples for Community Connector and viz are available here.
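Community Connectors themselves run as Apps Script, but the service-account read in the first step can be sketched in Node with the @google-cloud/storage package (the key file, bucket, and object names are placeholders):

// Minimal sketch, assuming a service-account key file and placeholder
// bucket/object names: read an HTML object from GCS server-side so the
// bucket itself can stay private.
const { Storage } = require("@google-cloud/storage");

async function readHtmlFromGcs() {
  const storage = new Storage({ keyFilename: "./service-account.json" });
  const [contents] = await storage
    .bucket("my-dashboard-assets") // placeholder bucket
    .file("report.html") // placeholder object
    .download();
  // The connector would return this string as raw HTML data for the viz.
  return contents.toString("utf8");
}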

How to access BigCommerce internalapi?

I am trying to download (backup) images that customers upload for products that take custom logos (these are typically JPG, PNG, PDF, etc.) These customer files are downloadable by clicking on a hyperlink in the BigCommerce admin page for the order in question. The link is not a link to the image path but instead, a link to a service that sends the file to the browser. In other words, you have to be authenticated into the admin site to download the file. The URL looks like this:
https://mystore.com/internalapi/v1/orders/383945/products/251438/attributes/561518/download
https://mystore.com/internalapi/v1/orders/{order id}/products/{lineItem id}/attributes/{option id}/download
These are easily constructed in the API itself for a given order. If I use the link in a browser tab while I'm logged into the admin site, the file downloads.
But I am trying to write an app to automatically download all the files (there are thousands). When I try to use this URL in an app, I get an authentication error. I first tried my regular API credentials, then the credentials I use to log into the admin site; both give an authentication error.
I could not find anything documented on this so-called "internalapi." Anyone ever try to use this "internal" API that is used by the admin site?
I believe authentication is cookie-based for that internal API, but there could be problems with using our non-publicly documented internal APIs in production, i.e. we may make future updates that would be breaking changes.
Images attached to orders through a file upload option also get copied to WebDAV, in the dav/product_images/configured_products folder. Another way to do this could be to use a WebDAV client library like easywebdav to connect and download the files.
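The answer names easywebdav for Python; a comparable sketch in Node with the webdav npm package might look like this (the store URL and credentials are placeholders; the folder path follows the answer):

// Sketch: list and download customer-uploaded files from the
// dav/product_images/configured_products folder over WebDAV.
const { createClient } = require("webdav");
const fs = require("fs");

async function backupConfiguredProductImages() {
  const client = createClient("https://mystore.com/dav", {
    username: process.env.DAV_USER,
    password: process.env.DAV_PASSWORD,
  });

  const entries = await client.getDirectoryContents(
    "/product_images/configured_products"
  );
  for (const entry of entries) {
    if (entry.type !== "file") continue;
    const data = await client.getFileContents(entry.filename, {
      format: "binary",
    });
    fs.writeFileSync(`./backup/${entry.basename}`, Buffer.from(data));
  }
}

backupConfiguredProductImages().catch(console.error);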

How to open documents with Office365 in my web application?

We have a cloud-based audit application. While performing an audit, a user typically uploads a lot of documents. Currently, in order to view the documents he has to download them. The business requirement is that on clicking a document it should directly open in another browser tab using Office 365, just like Dropbox/OneDrive. The user should be able to view, edit, save it on the server (without downloading), and close it. How do we achieve that in our application?
Our webapp is built using ReactJS, NodeJS & MongoDB. Whenever a user uploads a document it gets saved in an AWS S3 bucket.
I went through the Microsoft Graph API and the OneDrive REST APIs. It looks like the only solution is to use the OneDrive APIs to save files in OneDrive instead of S3; then it should allow you to use the Office 365 apps. Is this the right solution? Am I missing anything?
Is there any other solution?
While the easiest solution is indeed to store the documents in OneDrive, there's also another way. You can enroll in Microsoft's Cloud Storage Partners Program and implement the WOPI protocol on your service. This would allow the Office Online viewers/editors to integrate with your service's data directly.
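For a sense of what implementing WOPI involves, here is a minimal sketch of the two core endpoints a host exposes so Office Online can fetch metadata and file bytes; the Express server, S3 bucket name, and owner/user values are illustrative assumptions, not a complete implementation (a real host also needs access-token validation, locking, and a PutFile endpoint):

// Hypothetical WOPI host sketch: CheckFileInfo (metadata) and GetFile
// (content) backed by the existing S3 bucket.
const express = require("express");
const {
  S3Client,
  GetObjectCommand,
  HeadObjectCommand,
} = require("@aws-sdk/client-s3");

const s3 = new S3Client({ region: "us-east-1" });
const BUCKET = "audit-documents"; // placeholder bucket name

const app = express();

// CheckFileInfo: metadata Office Online requests before opening a file.
app.get("/wopi/files/:id", async (req, res) => {
  const head = await s3.send(
    new HeadObjectCommand({ Bucket: BUCKET, Key: req.params.id })
  );
  res.json({
    BaseFileName: req.params.id,
    Size: head.ContentLength,
    OwnerId: "audit-app", // placeholder; normally the file owner's id
    UserId: "demo-user", // placeholder; normally resolved from the access token
    Version: head.ETag,
  });
});

// GetFile: stream the raw document bytes from S3.
app.get("/wopi/files/:id/contents", async (req, res) => {
  const obj = await s3.send(
    new GetObjectCommand({ Bucket: BUCKET, Key: req.params.id })
  );
  obj.Body.pipe(res);
});

app.listen(3000);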
You need to use both the AWS and Office 365 APIs to reach a working solution. Try the following steps (PS: I have not tried this, but I have saved edited documents from Office 365 to AWS); a sketch follows the steps.
Read the uploaded document from S3 using the AWS APIs and upload it to OneDrive using the Office document APIs.
Edit the document using the Office apps.
Save the document back to S3.
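A sketch of the first and last steps, assuming a Microsoft Graph access token is already available and a placeholder S3 bucket (files under 4 MB can use Graph's simple upload endpoint; larger files need an upload session):

const {
  S3Client,
  GetObjectCommand,
  PutObjectCommand,
} = require("@aws-sdk/client-s3");

const s3 = new S3Client({ region: "us-east-1" });
const BUCKET = "audit-documents"; // placeholder bucket name

// Step 1: copy a document from S3 into OneDrive so Office 365 can open it.
async function copyS3ToOneDrive(fileName, graphToken) {
  const obj = await s3.send(
    new GetObjectCommand({ Bucket: BUCKET, Key: fileName })
  );
  const bytes = await obj.Body.transformToByteArray();

  const resp = await fetch(
    `https://graph.microsoft.com/v1.0/me/drive/root:/${fileName}:/content`,
    {
      method: "PUT",
      headers: { Authorization: `Bearer ${graphToken}` },
      body: bytes,
    }
  );
  const item = await resp.json();
  return item.webUrl; // open this in a new tab for viewing/editing
}

// Step 3: pull the (edited) document back out of OneDrive into S3.
async function saveOneDriveToS3(fileName, graphToken) {
  const resp = await fetch(
    `https://graph.microsoft.com/v1.0/me/drive/root:/${fileName}:/content`,
    { headers: { Authorization: `Bearer ${graphToken}` } }
  );
  await s3.send(
    new PutObjectCommand({
      Bucket: BUCKET,
      Key: fileName,
      Body: Buffer.from(await resp.arrayBuffer()),
    })
  );
}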

Uploading static files to Keystone.js

I'm evaluating potential content management systems I want to use for a project. Many of the users will need to upload static files and include links to them in their posts.
In the Admin UI I can only see the ability to upload an image in a post. Does anyone know if it is possible to upload files to Keystone through the Admin UI?
You could use their Amazon S3 storage adapter. Depending on which version of Keystone you're using (3 or 4), you'll have to do some different things. Either way, you need to create credentials for Amazon S3 and configure Keystone to work with it. From there, you can use Types.S3File to allow a certain part of your MongoDB model to be a reference to an S3 object. See this page for more info on the S3File type in Keystone.
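A minimal sketch of what the classic Types.S3File setup looks like (the bucket name and credential variables are placeholders; newer Keystone versions use a storage adapter instead):

const keystone = require("keystone");
const Types = keystone.Field.Types;

// S3File reads these settings (or the S3_KEY / S3_SECRET / S3_BUCKET env vars).
keystone.set("s3 config", {
  bucket: "my-keystone-uploads", // placeholder bucket
  key: process.env.S3_KEY,
  secret: process.env.S3_SECRET,
});

const Upload = new keystone.List("Upload");

Upload.add({
  name: { type: String, index: true },
  // Stores the file in S3 and keeps a reference on the Mongo document.
  file: { type: Types.S3File },
});

Upload.register();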

Allowing read and write access to Google Drive files to unauthenticated clients

We have been working on a web service (http://www.genomespace.org/) that allows computational biology tools that implement our REST API to, among other things, read and write files stored on "The Cloud".
Our default file storage is a common Amazon S3 bucket, but we now allow users to mount their own private S3 bucket as well as files on Dropbox.
We are now trying to enable similar functionality for Google Drive and have run into some problems unique to Google Drive that we have not encountered with S3 or Dropbox.
The only way to allow clients that are not Google-authenticated to read files unobtrusively is to make the files "Public". Our preference would be that once the user has authorized access to our application via OAuth2, the user files could remain "Private" in Google Drive.
However, even though the user has already authorized our web service for offline access to their "Private" files, we have not found a way to generate a URL that a client authorized by our system can use to GET the file directly without the client being logged into Google as well.
The closest we have come to this functionality has been to change the file permissions to "Anyone with Link", except that for files greater than 20MB Google insists on returning an intermediate web page warning that the file has not been scanned for viruses. In addition to having to mess with file permissions, this would break our existing clients. Only when the file is "Public" and we utilize URLs of the form https://googledrive.com/host/PARENT_FOLDER_ID/FILENAME can non-Google clients read the files without interference.
We have not found any way for clients that are not Google-authenticated to upload a file to Google Drive. Our API allows our authorized clients to PUT files directly to the backing file storage using URLs provided by our server. However, even if a folder is marked "Public", the client requires Google authentication credentials to save to Google Drive. We could deal with both of these issues with intermediate hops through our system (e.g., our web server would first download the file from Google Drive and then allow the client to GET it); however, this would be woefully inefficient and, hopefully, unnecessary. These problems have been discussed multiple times before on Stack Overflow (e.g. here and here); we have read the responses very carefully but have not seen any recent discussion.
The Google folks direct their API users to post on stackoverflow for support, so I am hoping for a fresh look from insiders.
The general answer is: don't make the Drive requests through the user's browser. Instead, do everything from your servers. You are the one holding the (refresh) tokens for users, so you should make all requests as a proxy between the user and Drive. Same for downloading: you download it and return it to the user. As long as you use each user's own token, there shouldn't be rate limit/quota issues.
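A minimal sketch of that proxy in Node with Express and the googleapis client, assuming your service already stores a refresh token per user (the token store, client ID/secret, and route shape are placeholders):

const express = require("express");
const { google } = require("googleapis");

const CLIENT_ID = process.env.GOOGLE_CLIENT_ID;
const CLIENT_SECRET = process.env.GOOGLE_CLIENT_SECRET;
const tokenStore = {}; // userId -> refresh token; persisted elsewhere in practice

const app = express();

// Download proxy: the client never needs Google credentials; the server
// authenticates to Drive with the user's stored refresh token and streams bytes.
app.get("/files/:userId/:fileId", async (req, res) => {
  const auth = new google.auth.OAuth2(CLIENT_ID, CLIENT_SECRET);
  auth.setCredentials({ refresh_token: tokenStore[req.params.userId] });

  const drive = google.drive({ version: "v3", auth });
  const file = await drive.files.get(
    { fileId: req.params.fileId, alt: "media" },
    { responseType: "stream" }
  );
  file.data.pipe(res);
});

app.listen(3000);

Uploads work the same way in reverse: the client PUTs to your server, which calls drive.files.create with the stored token, so the files can stay "Private" in Drive throughout.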