How to use Active Directory-based authentication in Power BI Service to query data from AWS Athena according to the user's entitlements

I have a requirement that users need to connect the Power BI cloud service to Athena (via an on-premises data gateway) and fetch data according to the user persona set up in AWS.
I understood from the MS documentation that I need the above setup: the on-premises Power BI gateway needs to be installed on a Windows EC2 instance to route calls from the Power BI cloud service to Athena.
My questions are:
How can Athena identify the end user who is querying the data from the Power BI cloud service?
The on-premises Power BI gateway uses a hard-coded, fixed service account to query the data from Athena (using the Athena connector + ODBC), proxies the user, and caches the data on the gateway. Is there any way to pass the user information to Athena, so that Athena can authorize the request and serve the data according to the user's persona (for example, not showing PII data to unauthorized users)?


Use Google Storage Transfer API to transfer data from external GCS into my GCS

I am working on a web application that comprises a ReactJS frontend and a Java Spring Boot backend. The application requires users to upload data from their own Google Cloud Storage into my Google Cloud Storage.
The application flow is as follows:
The frontend requests read access on the user's storage. For this I have used OAuth 2.0 access tokens as described here.
The generated OAuth token is passed to the backend.
The backend also has credentials for my service account, allowing it to access my Google Cloud APIs. I have created the service account with the required permissions and generated the key using the instructions from here.
The backend uses the generated access token and my service account credentials to transfer the data.
In the final step, I want to create a transfer job using the Google Storage Transfer API. I am using the Java API client provided here.
I am having difficulty providing the authentication credentials to the transfer API.
In my understanding, two different authentications are required: one for reading the user's bucket, and another for starting the transfer job and writing the data into my cloud storage. I haven't found any relevant documentation or working examples for my use case. All the available samples assume that the same service account credentials have access to both the source and sink buckets.
tl;dr
Does the Google Storage Transfer API allow setting different source and target credentials for GCS-to-GCS transfers? If yes, how does one provide these credentials in the transfer job specification?
Any help is appreciated. Thanks!
Unfortunately, this is not possible with the GCS Transfer API: for the transfer to work, the service account must have access to both the source and the sink buckets, as you mentioned.
You can open a feature request in Google's Issue Tracker so that Google's product team can consider such functionality for newer versions of the API. You could also mention that this subject is not covered in the documentation, so the documentation can be improved.
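For reference, a minimal sketch of the JSON body for a standard GCS-to-GCS transfer job (the `transferJobs.create` REST call on storagetransfer.googleapis.com), built with only the standard library. The project ID and bucket names are placeholders, and the single service account running the job must be able to read the source bucket and write to the sink bucket, which is exactly the limitation described above:

```python
import json

def build_transfer_job(project_id, source_bucket, sink_bucket):
    """Build the JSON body for a transferJobs.create request.

    One service account runs the job, so it needs read access on the
    source bucket and write access on the sink bucket.
    """
    return {
        "projectId": project_id,
        "status": "ENABLED",
        "transferSpec": {
            "gcsDataSource": {"bucketName": source_bucket},
            "gcsDataSink": {"bucketName": sink_bucket},
        },
    }

# Bucket and project names are placeholders:
job = build_transfer_job("my-project", "users-source-bucket", "my-sink-bucket")
print(json.dumps(job, indent=2))
```

The same structure is what the Java client's `TransferJob`/`TransferSpec` builders produce under the hood.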

SAPUI5 app with OData authentication + restriction by authorizations

I have some problems understanding the login flow of a SAPUI5 app connected via OData to an SAP server.
The idea is a time-recording app: I log in with a user ID or username and password, the app checks the credentials against the SAP backend system, and if they are valid it shows only the records for that user's staff number.
Am I supposed to do this with filters, or are there other ways?
It should be basic authentication: when I open the app, a popup asks for my SAP system username and password. But how can I turn this into a login page, and how can I get the parameters (every username has a staff number)?
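To illustrate the filter idea in protocol terms, here is a hedged standard-library sketch of an OData request that combines basic authentication with a `$filter` on the staff number. The service URL, entity set, and `StaffNumber` property are hypothetical; note that a client-side `$filter` alone is not a security measure, so the backend OData service should enforce the restriction as well:

```python
import base64
import urllib.parse
import urllib.request

def build_odata_request(service_url, entity_set, staff_number, username, password):
    # $filter narrows the result set to the logged-in user's staff number;
    # the server must still enforce this restriction for real security.
    query = urllib.parse.urlencode({
        "$filter": "StaffNumber eq '%s'" % staff_number,
        "$format": "json",
    })
    url = "%s/%s?%s" % (service_url.rstrip("/"), entity_set, query)
    req = urllib.request.Request(url)
    # Basic authentication: base64-encoded "username:password".
    token = base64.b64encode(("%s:%s" % (username, password)).encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    return req

# Hypothetical service and credentials:
req = build_odata_request(
    "https://sap.example.com/sap/opu/odata/sap/ZTIME_SRV",
    "TimeRecordSet", "0001234", "MYUSER", "secret")
```

In a SAPUI5 app the same `$filter` would typically be expressed with `sap.ui.model.Filter` objects on the OData model binding.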
First of all, I think you need to understand some basic concepts and architecture settings.
There are two ways to deploy SAPUI5 apps:
hosted on your SAP NW Gateway on-premises (no matter if central hub or not)
hosted on SAP Cloud Platform
Case A: deploying on SAP NW Gateway on-premises
In this case you could override the ICF classes to achieve a custom login and logout, as explained here: https://blogs.sap.com/2016/11/25/sapui5-application-with-custom-login-and-logout-option/ (I will not explain this in further depth).
Case B: hosting a SAPUI5 app on SAP Cloud Platform
In this case you need to understand the architecture to determine what you want.
If SAP Cloud Platform hosts the SAPUI5 application, you can read data from an SAP backend using the SAP Cloud Platform Destination service (that is, a destination that uses the SAP Cloud Connector to connect to your SAP system).
In your case (a time-recording app where you log in with a user ID or username and password, check it against the SAP backend, and show ONLY the records for the staff number), I would recommend the following:
If the app is hosted on SAP CP, you need an S-user to access it, or a corresponding single sign-on mechanism to replace the S-user; you will not be able to replace the Cloud Platform login with anything custom without breaking security terms. You then need an SAP NW Gateway OData service in your SAP backend, plus the SAP Cloud Connector paired with both your SAP system and your SAP CP instance. With both in place, create a destination pointing to your OData service. Once these pieces are in place, you can easily select the destination when creating an SAPUI5 application from a template.
Help Links:
SSO for SAP CP:
https://blogs.sap.com/2017/04/13/configure-saml-sso-for-sap-cloud-platform-using-an-external-identity-provider/
Destinations in SAP CP:
https://www.sap.com/developer/tutorials/hcp-create-destination.html
https://blogs.sap.com/2018/03/09/understanding-destination-types-available-in-sap-cloud-platform-mobile-service/
https://www.sap.com/developer/tutorials/teched-2016-3.html
Cloud Connector Setup:
https://www.sap.com/developer/tutorials/hcp-cloud-connector-setup.html
SAP NW GW OData Service
https://blogs.sap.com/2016/05/31/odata-service-development-with-sap-gateway-code-based-service-development/
Have fun

How to do user authentication with Amazon Athena?

How do I access Amazon Athena?
Is it possible to do user authentication with Amazon Athena?
How can we restrict users from querying S3 via Athena?
For authentication and authorization in Amazon Web Services, you use Identity and Access Management (IAM). Unlike relational database management systems (RDBMSs), including those managed via Amazon RDS, Athena does not come with its own access controls.
To set up permissions for Athena, see the following: AWS Documentation » Amazon Athena » User Guide » Security » Access Control Policies.
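As a sketch of the kind of identity-based policy the access-control documentation describes, the following restricts a user to running Athena queries in one workgroup and reading one S3 prefix. The account ID, region, workgroup, and bucket names are placeholders:

```python
import json

# Placeholder ARNs: replace account ID, workgroup, and bucket with real values.
athena_user_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Allow running and reading queries in a single workgroup.
            "Effect": "Allow",
            "Action": [
                "athena:StartQueryExecution",
                "athena:GetQueryExecution",
                "athena:GetQueryResults",
            ],
            "Resource": "arn:aws:athena:us-east-1:123456789012:workgroup/analysts",
        },
        {
            # Athena reads the underlying data directly from S3, so the same
            # principal also needs S3 permissions on the data location.
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::my-data-bucket",
                "arn:aws:s3:::my-data-bucket/allowed-prefix/*",
            ],
        },
    ],
}
print(json.dumps(athena_user_policy, indent=2))
```

Because Athena's access control is IAM plus S3 permissions, denying the S3 actions is what actually prevents a user from reading the data through Athena.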
Please go through the AWS Athena docs (https://aws.amazon.com/documentation/athena/); it depends on your usage. If you need daily analysis of your data, you can create data pipelines and query the data. Use the AWS-provided JDBC/ODBC driver to connect to Athena.

How can we use DynamoDB local to save the Cognito IDs of authenticated users?

Is there a specific way to create a Cognito identity pool in an Amazon DynamoDB local setup? I have already set up the JavaScript shell, created several tables, and queried them. I need to provide an authenticated-user login (Facebook, Amazon, Google) for my Node application. I found several tutorials on how to set this up with AWS DynamoDB, but I need to know how to create it using a local DynamoDB without accessing AWS DynamoDB.
Amazon DynamoDB local doesn't validate credentials, so it doesn't matter how you set up the Amazon Cognito identity pool or the roles for the pool. You will be able to interact with the CognitoCredentials object the same way whether you are using Amazon DynamoDB or DynamoDB local.
It is important to note, however, that you will not be able to validate fine-grained access control unless you use the full service, again because DynamoDB local doesn't validate credentials.
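To illustrate the "no credential validation" point, here is a standard-library sketch of a raw ListTables request against DynamoDB local, assuming it is running on the default port 8000. In practice you would simply point an AWS SDK at this endpoint with dummy credentials; my understanding (an assumption worth verifying) is that DynamoDB local reads the access key out of the Authorization header but does not check the signature:

```python
import urllib.request

DYNAMODB_LOCAL = "http://localhost:8000/"

def build_list_tables_request(access_key):
    # DynamoDB local does not validate the SigV4 signature, so a dummy
    # credential and signature in the expected header format are accepted
    # (the access key is only used to namespace the local data).
    return urllib.request.Request(
        DYNAMODB_LOCAL,
        data=b"{}",
        headers={
            "Content-Type": "application/x-amz-json-1.0",
            "X-Amz-Target": "DynamoDB_20120810.ListTables",
            "Authorization": (
                "AWS4-HMAC-SHA256 "
                "Credential=%s/20240101/us-east-1/dynamodb/aws4_request, "
                "SignedHeaders=host;x-amz-target, Signature=0" % access_key
            ),
        },
        method="POST",
    )

req = build_list_tables_request("fakeAccessKey")
# urllib.request.urlopen(req) would return the table list once DynamoDB
# local is running on port 8000.
```

This is exactly why fine-grained access control (per-item or per-attribute IAM conditions) cannot be tested locally: nothing on the local side ever evaluates the credentials.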

Utilize Azure Graph API for SharePoint Online User management

Let's say I have a SharePoint Online subscription, so I can manage my own *.onmicrosoft.com domain and the users/groups connected with it.
As far as I understand, the storage behind SPO where users and groups reside is Azure AD.
I had thought the only way to manage these users/groups remotely was the PowerShell module for Microsoft Online. Now I wonder whether the Azure Graph API can be used to retrieve users and group members from SPO. I have no Azure subscription; is there a way to use the Azure Graph API with only an SPO subscription?
OK, it turns out it is possible. Briefly, the steps are the following:
Create a service principal that will serve as the 'contact point' with your external application (here is a good starting point); I used symmetric key authorization.
Add the newly created service principal to the 'Company Administrator' role.
Look at azurecoder's article and check out his comprehensive example of using the Graph API: https://github.com/azurecoder/azure-activedirectory; this code correctly deals with the authentication parameters, such as constructing the proper service realm.
After that I was able to retrieve users, groups, and user membership information for the SPO instance without creating an Azure subscription.
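As a rough sketch of the final step, once the service principal has yielded an access token, listing the tenant's users through the legacy Azure AD Graph endpoint looks roughly like this (the tenant domain and token are placeholders):

```python
import urllib.parse
import urllib.request

def build_graph_users_request(tenant_domain, access_token):
    # Legacy Azure AD Graph endpoint; the token is the one obtained via the
    # service principal created in the steps above (values are placeholders).
    url = "https://graph.windows.net/%s/users?%s" % (
        tenant_domain,
        urllib.parse.urlencode({"api-version": "1.6"}),
    )
    req = urllib.request.Request(url)
    req.add_header("Authorization", "Bearer " + access_token)
    req.add_header("Accept", "application/json")
    return req

# Placeholder tenant and token:
req = build_graph_users_request("contoso.onmicrosoft.com", "<ACCESS_TOKEN>")
```

Swapping `users` for `groups` or `groups/{id}/members` retrieves the group and membership information mentioned above.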