User doesn't have access to the secret even after enabling access through a policy

I have created a policy so that a developer can access the secret, but when I try to access the secret as a developer it says I don't have permission to access it.
I tried changing the value and adding a new secret, but nothing helped.
path "kv/*" {
capabilities = ["list"]
}
path "kv/dev01"{
capabilities = ["read", "list"]
}
path "kv/dev_01_cred" {
capabilities = ["read", "list"]
}
The developer should be able to read the secrets at kv/dev01 and kv/dev_01_cred.
Currently, the developer can only see the list of secrets in kv.

Related

Keycloak Spring UMA denied

I am trying to implement authorization for this use case:
user can access only his own resources
admin can access everything
I am trying out Keycloak and its resource server. For testing purposes, and to understand these scopes and permissions, I have created a test client weather-api and one resource Weather with the URL /weatherforecast.
Then I have a scope weather:read and a policy that every user with the role weatherer can read that resource.
Now when I evaluate this for a user with that role, I get PERMIT, and another user without the role gets DENY, so I guess my policies and permissions are set correctly.
When I try to use this from my service with user-managed-access disabled, I get PERMIT too.
But when I enable user-managed-access, it fails.
I see in the debug log that it gets a permissions token:
{
  ...
  "permissions": [
    {
      "scopes": [
        "weather:read"
      ],
      "rsid": "ae5ac493-b7dc-481e-9204-a664d1558a51"
    }
  ],
  ...
}
but then the next message is
Policy enforcement result for path [http://192.168.0.9:5001/weatherforecast] is : DENIED
I tried to debug the Keycloak library and found something I don't really understand.
In KeycloakAdapterPolicyEnforcer, this part of the code:
@Override
protected boolean isAuthorized(PathConfig pathConfig, PolicyEnforcerConfig.MethodConfig methodConfig, AccessToken accessToken, OIDCHttpFacade httpFacade, Map<String, List<String>> claims) {
    AccessToken original = accessToken;
    if (super.isAuthorized(pathConfig, methodConfig, accessToken, httpFacade, claims)) {
        return true;
    }
    accessToken = requestAuthorizationToken(pathConfig, methodConfig, httpFacade, claims);
    if (accessToken == null) {
        return false;
    }
    ...
}

private AccessToken requestAuthorizationToken(PathConfig pathConfig, PolicyEnforcerConfig.MethodConfig methodConfig, OIDCHttpFacade httpFacade, Map<String, List<String>> claims) {
    if (getEnforcerConfig().getUserManagedAccess() != null) {
        return null;
    }
    ...
}
So when UserManagedAccess is not null, requestAuthorizationToken returns null, and then the enforcer acts as if the user is unauthorized and returns HTTP 401.
What am I missing here? Why does it work only without UMA?
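If I understand the UMA flow correctly, this behaviour may even be deliberate: with user-managed-access enabled, the enforcer answers with a 401 carrying a permission ticket, and the client application is then expected to exchange that ticket for an RPT and retry the request with it. A rough sketch of that client-side exchange using the keycloak-authz-client library (permissionTicket and userAccessToken are my placeholder names, and AuthzClient.create() assumes a keycloak.json on the classpath):

import org.keycloak.authorization.client.AuthzClient;
import org.keycloak.representations.idm.authorization.AuthorizationRequest;
import org.keycloak.representations.idm.authorization.AuthorizationResponse;

public static String exchangeTicketForRpt(String permissionTicket, String userAccessToken) {
    AuthzClient authzClient = AuthzClient.create(); // reads keycloak.json from the classpath

    // the ticket would come from the WWW-Authenticate header of the 401 response
    AuthorizationRequest request = new AuthorizationRequest();
    request.setTicket(permissionTicket);

    // exchange the ticket, on behalf of the user, for a requesting party token (RPT)
    AuthorizationResponse response = authzClient.authorization(userAccessToken).authorize(request);
    return response.getToken(); // send this as the bearer token when retrying the request
}

Is my service supposed to be doing this step itself?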
I have looked at these examples (app-authz-uma-photoz, devconf2019-authz) and haven't noticed what I am missing.
Except that they actually create some resources for users from the Java app, and I'm not (see the sketch below). But I guess it shouldn't matter whether I'm protecting user-created resources or a single "pre-made" URL, right? It should depend only on correct permissions, and since they evaluate to PERMIT, I don't see why this doesn't work.
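For reference, this is roughly what that per-user resource registration looks like in those examples, as far as I can tell (a sketch only; the resource name, owner id, and keycloak.json setup are assumptions on my part):

import org.keycloak.authorization.client.AuthzClient;
import org.keycloak.representations.idm.authorization.ResourceRepresentation;

public static void registerWeatherResource(String ownerUserId) {
    AuthzClient authzClient = AuthzClient.create(); // reads keycloak.json from the classpath

    ResourceRepresentation resource = new ResourceRepresentation();
    resource.setName("Weather-" + ownerUserId);  // hypothetical per-user name
    resource.addScope("weather:read");
    resource.setOwner(ownerUserId);              // ties the resource to a specific user
    resource.setOwnerManagedAccess(true);        // marks it as a UMA resource

    authzClient.protection().resource().create(resource);
}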
And one more question: isn't this UMA thing overkill for just the "user can access his own, admin can access everything" case, when there will never be any sharing between users? I was thinking about some simpler way that could work without creating user resources in Keycloak, but I couldn't think of anything; I believe I still need to connect a user ID with some resource ID to make this work.

Amplify "Unable to verify secret hash for client"

We have been using Amplify and Cognito to register our users for an Angular 6 application deployed to Lambda. The client wanted to transition from email to username as the primary user identification, so we created a new user pool / client. I don't have visibility into the configuration settings; I was simply given new user pool, identity pool, and client IDs. Then I changed the code for application signup to look like this:
return from(Auth.signUp({
  'username': username, // was email
  'password': password,
  attributes: { // added these
    'email': email,
    'phone_number': phone_number,
    'family_name': name,
    'birthdate': DOB,
    'custom:last_4_ssn': SSN // custom attribute
  }
}));
The response I'm getting, with no other changes made, is: Unable to verify secret hash for client. Googling suggests the problem is that secretAccess is currently an unsupported configuration, but the guy who has access to these services swears to me that secretAccess is not configured anywhere in our setup.
I apologize for not having access to the configuration, but is there any other possible reason to receive this error?
That error probably originates from the fact that the app client you are connecting to has an associated secret key. When you create a user pool app client, it generates a secret by default.
Right now, with React Native Amplify you have to use an app client that does not have a secret key generated. So when you create a new app client with your desired attributes, make sure the "Generate client secret" box is unchecked.
The solution is to pass a SECRET_HASH along with the AdminInitiateAuth request. To calculate the secret hash, you can use the following method:
public static String calculateSecretHash(String userPoolClientId, String userPoolClientSecret, String userName) {
    final String HMAC_SHA256_ALGORITHM = "HmacSHA256";
    SecretKeySpec signingKey = new SecretKeySpec(
            userPoolClientSecret.getBytes(StandardCharsets.UTF_8),
            HMAC_SHA256_ALGORITHM);
    try {
        // HMAC-SHA256 over (username + client id), keyed with the client secret
        Mac mac = Mac.getInstance(HMAC_SHA256_ALGORITHM);
        mac.init(signingKey);
        mac.update(userName.getBytes(StandardCharsets.UTF_8));
        byte[] rawHmac = mac.doFinal(userPoolClientId.getBytes(StandardCharsets.UTF_8));
        return Base64.getEncoder().encodeToString(rawHmac);
    } catch (Exception e) {
        throw new RuntimeException("Error while calculating secret hash", e);
    }
}
How to Pass Secret_Hash
Map<String, String> authParams = new HashMap<>(2);
authParams.put("USERNAME", <username>);
authParams.put("PASSWORD", <password>);
authParams.put("SECRET_HASH", calculateSecretHash(cognitoClientId, cognitoClientSecret, <username>));

AdminInitiateAuthRequest authRequest = new AdminInitiateAuthRequest()
        .withClientId(userPool.getClientId())
        .withUserPoolId(userPool.getUserPoolId())
        .withAuthFlow(AuthFlowType.ADMIN_NO_SRP_AUTH)
        .withAuthParameters(authParams);

AdminInitiateAuthResult result = cognito.adminInitiateAuth(authRequest);
auth = result.getAuthenticationResult();
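For the original question, which fails at sign-up rather than sign-in: if for some reason you have to keep the app client secret and can call Cognito from backend code, the user-facing SignUp call takes the same hash in its SecretHash field. A rough sketch with the AWS SDK for Java v1; the attribute value and the cognitoClientId / cognitoClientSecret variables are placeholders:

import com.amazonaws.services.cognitoidp.AWSCognitoIdentityProvider;
import com.amazonaws.services.cognitoidp.AWSCognitoIdentityProviderClientBuilder;
import com.amazonaws.services.cognitoidp.model.AttributeType;
import com.amazonaws.services.cognitoidp.model.SignUpRequest;
import com.amazonaws.services.cognitoidp.model.SignUpResult;

public static SignUpResult signUpWithSecret(String username, String password, String email) {
    AWSCognitoIdentityProvider cognito = AWSCognitoIdentityProviderClientBuilder.defaultClient();

    SignUpRequest request = new SignUpRequest()
            .withClientId(cognitoClientId)
            .withUsername(username)
            .withPassword(password)
            // same HMAC as above: username + client id, keyed with the client secret
            .withSecretHash(calculateSecretHash(cognitoClientId, cognitoClientSecret, username))
            .withUserAttributes(new AttributeType().withName("email").withValue(email));

    return cognito.signUp(request);
}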

How to use Firebase's email & password authentication method to connect with AWS to make Fine Uploader S3 work?

I decided to use Fine Uploader for my current AngularJS project (which is hosted on Firebase) because it has many core features that I will need in an uploader already built in, but I am having trouble understanding how to use Firebase's email & password authentication method to communicate with AWS (Amazon Web Services) to allow my users to use Fine Uploader S3 to upload content. Based on the Fine Uploader blog post Uploads without any server code, the workflow goes like this:
Authenticate your users with the help of an identity provider, such as Google
Use the temporary token from your ID provider to grab temporary access keys from AWS
Pass the keys on to Fine Uploader S3
Your users can now upload to your S3 bucket
The problem is that I won't be using OAuth 2.0 (which is used by Google, Facebook or Amazon to provide user identities) to allow my users to sign into my app and upload content. Instead I will be using Firebase's email & password authentication.
So how can I make Firebase's email & password authentication method create a temporary token to grab temporary access keys from AWS and pass those keys on to Fine Uploader S3 to allow my users to upload content to S3?
To connect AWS with an outside application, Cognito is going to be a good solution. It will let you generate an OpenID token using the AWS Node SDK and your secret keys in your backend, which you can then use with the AWS JavaScript SDK and WebIdentityCredentials in your client.
Note that I'm unfamiliar with your specific plugin/tool, but this much will at least get you the OpenID token, and in my work it does let me connect using WebIdentityCredentials, which I imagine is what they are using.
Configure Cognito on AWS
Setup on Cognito is fairly easy - it is more or less a walkthrough. It does involve configuring IAM rules on AWS, though. How to set this up is pretty project specific, so I think I need to point you to the official resources. They recently made some nice updates, but I am admittedly not up to speed on all the changes.
Through the configuration, you will want to set up a 'developer authenticated identity', and take note of the 'identity pool id' and the IAM role ARN set up by Cognito.
Set up a Node server that can handle incoming routes
There are a lot of materials out there on how to accomplish this, but you want to be sure to include and configure the AWS SDK. I also recommend using body-parser, as it will make reading your POST requests easier.
var express = require('express');
var bodyParser = require('body-parser');
var AWS = require('aws-sdk');

var app = express();
app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json());
Create POST Function to talk with Cognito
Once you have your server setup, you then reach out to Cognito using getOpenIdTokenForDeveloperIdentity. In my setup, I use authenticated users because I expect them to come back and want to be able to continue the associations, so that is why I send in a UserID in req.body.UserIDFromAngularApp.
This is my function, using express.Router().
.post(function(req, res) {
  if (req.body.UserIDFromAngularApp) {
    var cognitoidentity = new AWS.CognitoIdentity();
    var params = {
      IdentityPoolId: 'your_cognito_identity_pool_id',
      Logins: {
        'your_developer_authenticated_identity_name': req.body.UserIDFromAngularApp
      }
    };
    cognitoidentity.getOpenIdTokenForDeveloperIdentity(params, function(err, data) {
      if (err) {
        console.log(err, err.stack);
        res.json({ failure: 'Connection failure' });
      } else {
        console.log(data); // so you can see your result server side
        res.json(data);    // send it back
      }
    });
  } else {
    res.json({ failure: 'Connection failure' });
  }
});
If all goes well, that will return an OpenID Token back to you. You can then return that back to your Angular application.
POST from Angular, Collect from Promise
At the very least you need to post to your new node server and then collect the OpenID token out of the promise. Using this pattern, that will be found in data.Token.
It sounds like from there you may just need to pass that token on to your plugin/tool.
In case you need to handle authentication further, I have included code to handle the WebIdentityCredentials.
angular.module('yourApp').factory('AWSmaker', ['$http', function($http) {
  return {
    reachCognito: function(authData) {
      $http.post('http://localhost:8888/simpleapi/aws', {
        'UserIDFromAngularApp': authData.uid
      })
      .success(function(data, status, headers, config) {
        if (!data.failure) {
          var params = {
            RoleArn: your_role_arn_setup_by_cognito,
            WebIdentityToken: data.Token
          };
          AWS.config.credentials = new AWS.WebIdentityCredentials(params, function(err) {
            console.log(err, err.stack);
          });
        }
      });
    }
  };
}]);
This should get you on your way. Let me know if I can help further.
Each OAuth provider has a slightly unique way of handling things, and so the attributes available in your Firebase authenticated token vary slightly based on provider. For example, when utilizing Facebook, the Facebook auth token is stored at facebook.accessToken in the returned user object:
var ref = new Firebase(URL);
ref.authWithOAuthPopup("facebook", function(error, authData) {
  if (authData) {
    // the access token for Facebook
    console.log(authData.facebook.accessToken);
  }
}, {
  scope: "email" // the permissions requested
});
All of this is covered in the User Authentication section of the Web Guide.

Google Analytics integration in MVC4

try
{
    UserCredential credential;
    credential = GoogleWebAuthorizationBroker.AuthorizeAsync(
        new ClientSecrets { ClientId = ClientID, ClientSecret = ClientSecret },
        new[] { AnalyticsService.Scope.AnalyticsReadonly, AnalyticsService.Scope.AnalyticsEdit },
        "user",
        CancellationToken.None,
        new FileDataStore("Analytics.Auth.Store")).Result;
    return credential;
}
catch { return null; }
I am using the above code for a Google Console web application (Google Analytics), but it gives a redirect_uri mismatch error. How can I send the redirect_uri?
The redirect_uri is set up in the Google Developers Console -> APIs & auth -> Credentials.
Not sure if Sanaan C ever found an answer ... the reason that your code does not work in a web application is likely that the user who created the Analytics.Auth.Store entry in their %APPDATA% folder is NOT the one running your web application.
Does anyone have a solution to this - and please excuse that this question is appended to another ... I actually think this was the intended question in any event ...
===
One simple-minded solution could be to take the credentials created by a user who can respond to the redirect and put them in a folder, with appropriate access permissions, where the user under which the IIS service is being run can find them. Instantiate the FileDataStore with a full path to this folder ...

ArgumentException: Precondition failed.: !string.IsNullOrEmpty(authorization.RefreshToken) with Service Account for Google Admin SDK Directory access

I'm trying to access the Google Directory using a Service Account. I've fiddled with the DriveService example to get this code:
public static void Main(string[] args)
{
    var service = BuildDirectoryService();
    var results = service.Orgunits.List(customerID).Execute();
    Console.WriteLine("OrgUnits");
    foreach (var orgUnit in results.OrganizationUnits)
    {
        Console.WriteLine(orgUnit.Name);
    }
    Console.ReadKey();
}

static DirectoryService BuildDirectoryService()
{
    X509Certificate2 certificate = new X509Certificate2(SERVICE_ACCOUNT_PKCS12_FILE_PATH, "notasecret",
        X509KeyStorageFlags.Exportable);
    var provider = new AssertionFlowClient(GoogleAuthenticationServer.Description, certificate)
    {
        ServiceAccountId = SERVICE_ACCOUNT_EMAIL,
        Scope = DirectoryService.Scopes.AdminDirectoryOrgunit.GetStringValue()
    };
    var auth = new OAuth2Authenticator<AssertionFlowClient>(provider, AssertionFlowClient.GetState);
    return new DirectoryService(new BaseClientService.Initializer()
    {
        Authenticator = auth,
        ApplicationName = "TestProject1",
    });
}
When I run it, I get
ArgumentException: Precondition failed.: !string.IsNullOrEmpty(authorization.RefreshToken)
I'm going round in circles in the Google documentation. The only material I can find about refresh tokens seems to be for when an individual is authorizing the app and the app may need to work offline. Can anyone help out, or point me in the direction of documentation that will help, please?
Service Account authorization actually does not return a Refresh Token, so this error makes sense. Do you know where it is coming from?
I am not too familiar with the .NET client library, but having the full error trace would help.
As a long shot - the error might be a misleading one:
Can you confirm that you've enabled the Admin SDK in the APIs console for this project?
Can you confirm that you've whitelisted that Client ID for the service account in the domain you are testing with (along with the Admin SDK scopes)?
The above code will work if you replace the provider block with:
var provider = new AssertionFlowClient(GoogleAuthenticationServer.Description, certificate)
{
    ServiceAccountId = SERVICE_ACCOUNT_EMAIL,
    Scope = DirectoryService.Scopes.AdminDirectoryOrgunit.GetStringValue(),
    ServiceAccountUser = SERVICE_ACCOUNT_USER // "my.admin.account@my.domain.com"
};
I had seen this in another post and tried it with my standard user account and it didn't work. Then I read something that suggested everything had to be done with an admin account. So, I created a whole new project, using my admin account, including creating a new service account, and authorising it. When I tried it, it worked. So, then I put the old service account details back in but left the admin account in. That worked, too.