Is it possible to use Azure Blob Storage on a website that has no authentication? - file-upload

I need to create a way for anyone who visits my website to upload an image to an Azure Blob Container. The website will have input validations on the file.
I've considered using an Azure Function to write the validated file to the Blob Container, but I can't seem to find a way to do this without exposing the Function URL to the world (similar to this question).
I would use a system-assigned managed identity (SAMI) to authenticate the Function to the Storage account, but because the Function URL is exposed, anyone could take it, bypass the validations, and upload directly.
How is this done in the real world?

If I understand correctly, the user uploads a file via an HTTP POST call to your server, which validates it. You would like to use an Azure Function to then upload the validated file to Blob Storage.
In this case, you can restrict access to the Azure Function so that it can only be called from your server's IP address; that way, users cannot reach the Function directly. This can be configured in the networking settings and is available on all Azure Functions plans.
You could also consider implementing the validation logic within the Azure Function.
Finally (perhaps I should have started with this), if the only reason you are considering an Azure Function is to upload data to a Storage Account, you should perhaps first look at the Blob Service REST API, specifically the Put Blob operation. There are also official Storage SDKs for different languages/ecosystems that you could use to do this.
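For illustration, here is a minimal sketch using the Python azure-storage-blob SDK, which issues the Put Blob call under the hood. The container name, file name, and connection-string variable are placeholders, not values from the question:

```python
# Minimal sketch: upload a validated file with the azure-storage-blob SDK.
# The container "uploads" and AZURE_STORAGE_CONNECTION_STRING are assumptions.
import os
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
container = service.get_container_client("uploads")

# Upload the file the server has already validated
with open("validated-image.png", "rb") as data:
    container.upload_blob(name="validated-image.png", data=data, overwrite=True)
```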

• Since you are using the default generic Azure Functions URL on your website for uploading blobs with no authentication, I would suggest creating an 'A' host record for your function app. Given that you have a website, you presumably have a custom domain, and that domain's DNS records must be hosted on a public DNS server. On that same public DNS server, create an 'A' host record for the function app and assign it the public IP address shown for the app in Azure. This ensures your public DNS server can resolve the function app globally. Then create a 'CNAME' record that maps your custom hostname to the default generic Azure function app URL (a sketch of the records appears after the documentation link below).
In this way, whenever an anonymous visitor to your website tries to upload an image, they will see the function app URL as 'abc.xyz.com' rather than the generic Azure function app URL, which achieves your objective.
• Once the above has been done, publish the newly created 'CNAME' record on the public DNS server as your function app URL. This masks the generic Azure function app URL, and the endpoint remains HTTPS-protected because you upload an SSL/TLS certificate for the custom domain in the function app workspace itself.
For more information, refer to the documentation link below:
https://learn.microsoft.com/en-us/azure/dns/dns-custom-domain
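As a rough illustration, the records might look like this in a zone file. All hostnames and the IP address are placeholders, not values from the question, and note that a single name can carry either an A record or a CNAME, not both:

```
; Hypothetical zone entries for example.com
; Option 1: an A record pointing at the function app's inbound IP shown in Azure
upload.example.com.    IN  A      20.50.10.123
; Option 2: a CNAME mapping the custom hostname to the default function app URL
upload.example.com.    IN  CNAME  myfuncapp.azurewebsites.net.
```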

Related

Is there a full Nextcloud API accessible from outside?

I use Nextcloud as a normal user to store and share files.
I decided to use it as a backend for a web application I am developing so that I can store the files in Nextcloud while the frontend is done by me.
I spent some hours on the API docs
https://docs.nextcloud.com/server/latest/developer_manual/client_apis/WebDAV/index.html
and, with some disappointment, unless I have made a mistake, I realized that the only API that can be used from outside Nextcloud is the WebDAV API.
This is a minimalistic API that allows basic operations such as downloading or uploading a file by passing its full path. For example, this GET request (authenticated by basic auth, passing username and password in the headers):
GET https://nextcloud.example.com/remote.php/dav/files/username/FolderOne/SubFolderTwo/HelloWorld.txt
This will download the file located in /FolderOne/SubFolderTwo/HelloWorld.txt
With a PUT request, it is possible to overwrite the file by passing the file content in the raw request body.
This is very effective but minimalistic.
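For example, both calls sketched with the Python requests library; the host, credentials, and paths are placeholders:

```python
# Minimal sketch: Nextcloud WebDAV download and upload over basic auth.
# The base URL, username, and password below are assumptions.
import requests

BASE = "https://nextcloud.example.com/remote.php/dav/files/username"
AUTH = ("username", "app-password")

# Download /FolderOne/SubFolderTwo/HelloWorld.txt
resp = requests.get(f"{BASE}/FolderOne/SubFolderTwo/HelloWorld.txt", auth=AUTH)
resp.raise_for_status()
content = resp.content

# Overwrite (or create) the same file by sending raw bytes in a PUT
resp = requests.put(
    f"{BASE}/FolderOne/SubFolderTwo/HelloWorld.txt",
    data=b"Hello, world!\n",
    auth=AUTH,
)
resp.raise_for_status()
```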
I was expecting to have a full REST API to access more properties and perform complex operations.
Could you please tell me if I missed some important information?
There is the OCS API but it works only from inside Nextcloud.
Thanks.
A full REST API is available - https://docs.nextcloud.com/server/22/developer_manual/client_apis/OCS/ocs-api-overview.html
Create a Share - https://docs.nextcloud.com/server/latest/developer_manual/client_apis/OCS/ocs-share-api.html
The OwnCloud documentation also offers more examples
https://doc.owncloud.com/server/10.8/developer_manual/core/apis/ocs-share-api.html
You can register an app ID and use that to log in, or pass through a username and password in the authentication header.
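For instance, a sketch of creating a public link share through the OCS Share API; the host, credentials, and path are placeholders, and shareType 3 denotes a public link per the OCS docs:

```python
# Hypothetical sketch: create a public link share via the OCS Share API.
import requests

resp = requests.post(
    "https://nextcloud.example.com/ocs/v2.php/apps/files_sharing/api/v1/shares",
    headers={"OCS-APIRequest": "true"},  # required header for OCS calls
    auth=("username", "app-password"),
    data={"path": "/FolderOne/HelloWorld.txt", "shareType": 3},  # 3 = public link
)
resp.raise_for_status()
```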

Logic App: 403 when trying to connect to storage behind a firewall

I deployed a Logic App (Standard) to use VNet integration. In our scenario, we want to get an attachment from an email and store it in a Data Lake storage account. We are using the following connectors:
Office 365 and
Azure Blob Storage
The problem is that our storage account is behind a firewall and a private endpoint. If the storage account allows all networks, the flow works, but when it is behind the firewall we get a 403 (even though the Logic App has VNet integration, traffic goes over the internet, as I can see in Log Analytics).
I also followed the Microsoft docs, as well as this link, without success:
https://techcommunity.microsoft.com/t5/integrations-on-azure/deploying-standard-logic-app-to-storage-account-behind-firewall/ba-p/2626286
I also tried this, and it works:
https://learn.microsoft.com/en-us/azure/connectors/connectors-create-api-azureblobstorage?WT.mc_id=Portal-Microsoft_Azure_Support&tabs=single-tenant#access-blob-storage-with-managed-identities
but the file ends up corrupted, e.g. the body of another connector if it is a CSV, or, if the attachment is an Excel file, a corrupted spreadsheet.
Is there a way to use VNet integration with a storage private endpoint, or a way to take the attachment and save it as-is via the HTTP connector, regardless of the file extension?
A Standard Logic App with a private endpoint cannot access a storage account that also has a private endpoint when both resources are in the same region. The storage account can still be used as the Logic App's own storage, but accessing the storage account and its files from the Logic App is not possible in that setup.
To achieve this, create the Logic App and the storage account in two different regions and whitelist the IPs. Refer to the link below:
https://techcommunity.microsoft.com/t5/integrations-on-azure-blog/access-azure-blob-using-logic-app/ba-p/1451573

Connecting an ArcGIS application with a resource

I have the following dilemma:
Using ArcGIS Enterprise 10.8, I have added a new item – Application – to a user's content.
This generates an Application item with an App ID and App Secret, along with a defined app type and redirect URIs.
These can be used to generate an access token via the OAuth2 token endpoint:
https://<portal>/sharing/rest/oauth2/token
using the parameters:
client_id=APPID&
client_secret=APPSECRET&
grant_type=client_credentials
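For example, the token request might look like the following sketch; the portal URL and credentials are placeholders, and the f=json parameter simply asks ArcGIS for a JSON response:

```python
# Hypothetical sketch: app login via the client_credentials grant.
import requests

PORTAL = "https://portal.example.com"  # placeholder portal URL

resp = requests.post(
    f"{PORTAL}/sharing/rest/oauth2/token",
    data={
        "client_id": "APPID",
        "client_secret": "APPSECRET",
        "grant_type": "client_credentials",
        "f": "json",
    },
)
access_token = resp.json()["access_token"]
```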
ESRI states in their documentation:
“Successful authentication directly returns a JSON response containing the access token that allows the application to work with resources that are accessible to the application (that is, have been shared with the application). Use of the client_secret as previously described is mandatory.”
Question is: how do we share resources with the application?
The overall goal is to grant an external application (unknown user) access to portal resources (i.e. a layer item) via OAuth2 app login.
Do you have any suggestions?
This is certainly confusing documentation, but I have found it useful to review this page: Limitations of App Login.
Specifically:
Applications cannot create, update, share, modify, or delete items
(layers, files, services, maps) in ArcGIS Online or ArcGIS Enterprise.
... If you want to access private content within an organization or
content that has been shared with a user, you must use the named user
login pattern for authentication.
For what you want to do, you'll most likely want to create a non-expiring refresh token based on a specific user, and store that with your external application.
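As a sketch, the external application would then exchange the stored refresh token for short-lived access tokens on demand; the portal URL and values below are placeholders:

```python
# Hypothetical sketch: exchange a stored refresh token for an access token.
import requests

PORTAL = "https://portal.example.com"  # placeholder portal URL
STORED_REFRESH_TOKEN = "..."           # the non-expiring refresh token you saved

resp = requests.post(
    f"{PORTAL}/sharing/rest/oauth2/token",
    data={
        "client_id": "APPID",
        "grant_type": "refresh_token",
        "refresh_token": STORED_REFRESH_TOKEN,
        "f": "json",  # ask ArcGIS for a JSON response
    },
)
access_token = resp.json()["access_token"]
```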

Allow API users to run AWS Lambda using execution role from Cognito identity pool

I'm using AWS Amplify to create an app where users can upload images using either private or public file access levels, as described in the documentation. Besides this, I've implemented a Lambda function which, upon a request through API Gateway, modifies an image and returns a link to the modified image.
What I want is for a given user to be able to call the API and modify only his own images, but not those of other users; i.e. allow the AWS Lambda function to use the execution role of the Cognito user. If I allow the Lambda function to access all data in the S3 bucket, it works fine - but I don't want users to be able to access other users' images.
I've been at it for a while now, trying different things to no avail.
Now I've integrated the API with the user pool as described here:
https://docs.aws.amazon.com/apigateway/latest/developerguide/apigateway-enable-cognito-user-pool.html
And then I've tried to follow this guide:
https://aws.amazon.com/premiumsupport/knowledge-center/cognito-user-pool-group/
This does not work, since "cognito:roles" is not present in the event variable of the lambda_handler (presumably because there are no user pool groups?).
What would the right way be to go about this in an AWS Amplify app?
Use API Gateway request mapping and check permissions in Lambda itself:
Use API Gateway request mapping to pass context.identity.cognitoIdentityId to Lambda. It has to be a Lambda integration with a mapping template (not a proxy integration). Another limitation is that the API request should be a POST; GET is also possible if you map cognitoIdentityId to the query string.
Lambda has access to all files in S3
Implement the access control check in Lambda itself. Lambda can read all the permissions of the file in S3 and then check whether the owner is the Cognito user.
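One common Amplify-flavoured variant of that check is to verify that the requested key lives under the caller's own prefix, since Amplify's private access level stores objects under private/<identityId>/. A hypothetical sketch; the bucket name, event fields, and the mapping template that injects cognitoIdentityId are all assumptions:

```python
# Hypothetical sketch: allow a user to touch only objects under their own prefix.
import boto3

s3 = boto3.client("s3")
BUCKET = "my-app-bucket"  # placeholder bucket name

def lambda_handler(event, context):
    # Injected by the API Gateway mapping template (non-proxy integration)
    identity_id = event["cognitoIdentityId"]
    key = event["key"]  # e.g. "private/<identityId>/photo.jpg"

    # Reject requests for objects outside the caller's own prefix
    if not key.startswith(f"private/{identity_id}/"):
        return {"statusCode": 403, "body": "Forbidden"}

    obj = s3.get_object(Bucket=BUCKET, Key=key)
    # ... modify the image and return a link to the result ...
    return {"statusCode": 200, "body": "OK"}
```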

How do I access Google Drive Application Data from a remote server?

For my application I want the user to be able to store files on Google Drive and for my service to have access to these same files created with the application.
I created a Client ID for a web application and was able to upload/list/download files from JavaScript (client side) with the drive.appfolder scope. This is good; this is half of what I want to do.
Now I want to access the same files from Node.js (server side). I am lost as to how to do this. Do I create a new Client ID for the server? (If so, how will the user authenticate?) Do I pass the AuthToken my user got client-side and try to use that on the server? I don't think this will work, as the AuthToken is time-sensitive (and probably not intended to be used from multiple IPs).
Any direction or example server-side code will be helpful. Again, all I want is to access these same files the user created with my application, not any other files in the user's Google Drive.
CLARIFICATION: I think my question boils down to: "Is it possible to access the same Application Data on Google Drive both client-side and server-side?"
Do I create a new Client ID for the server?
Up to you. You don't need to, but you can. See below.
if so, how will the user authenticate?
Up to you. OAuth is about authorisation, not authentication.
Just in case you meant authorisation, the user authorises the Project, which may contain multiple client IDs.
Do I pass the AuthToken my user got client-side and try to use that on the server?
You can, but it's not a good idea for the reason you state. The preferred approach is to have a separate server Client ID and use that to request offline access, which (eventually) returns a Refresh Token, which you store on your server. You then use that Refresh Token to request Access Tokens whenever you need them.
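A minimal sketch of that flow on the server, using Python's google-auth and google-api-python-client; the client ID, secret, and stored refresh token are placeholders, and the appDataFolder space matches the application-data use case:

```python
# Hypothetical sketch: use a stored refresh token to access Drive app data.
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

SERVER_CLIENT_ID = "..."       # placeholder: the server's own Client ID
SERVER_CLIENT_SECRET = "..."   # placeholder
STORED_REFRESH_TOKEN = "..."   # placeholder: refresh token stored server-side

creds = Credentials(
    token=None,  # an access token is fetched automatically via the refresh token
    refresh_token=STORED_REFRESH_TOKEN,
    token_uri="https://oauth2.googleapis.com/token",
    client_id=SERVER_CLIENT_ID,
    client_secret=SERVER_CLIENT_SECRET,
    scopes=["https://www.googleapis.com/auth/drive.appdata"],
)

drive = build("drive", "v3", credentials=creds)
# List files stored in the application data folder
result = drive.files().list(spaces="appDataFolder").execute()
for f in result.get("files", []):
    print(f["name"], f["id"])
```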
AuthToken is ... (and probably not intended to be used from multiple IPs).
It is not bound to a specific IP address.
"Is it possible to access the same Application Data on Google Drive both client-side and server-side?"
Yes.
Most of what you need is at https://developers.google.com/accounts/docs/OAuth2WebServer