Unable to create google group with Terraform resource google_cloud_identity_group - permissions

The following resource is used to create a Google group using the Terraform google-beta provider, version 3.36:
resource "google_cloud_identity_group" "cloud_identity_group_basic" {
  provider = google-beta

  display_name = "aaa bbb"
  parent       = "customers/XXX"

  group_key {
    id = "aaa_bbb@evilcorp.com"
  }

  labels = {
    "cloudidentity.googleapis.com/groups.discussion_forum" = ""
  }
}
terraform plan tells me that it will create the resource, but performing apply results in an error (Actor does not have permission to create group). The Terraform service account already has a number of permissions, such as Organization Administrator, Google Cloud Managed Identities Admin, Google Cloud Managed Identities Domain Admin, ...
G Suite Domain-wide Delegation has also been tried, but I am unsure how this might help.
Terraform will perform the following actions:
# google_cloud_identity_group.cloud_identity_group_basic will be created
  # google_cloud_identity_group.cloud_identity_group_basic will be created
  + resource "google_cloud_identity_group" "cloud_identity_group_basic" {
      + create_time  = (known after apply)
      + display_name = "aaa bbb"
      + id           = (known after apply)
      + labels       = {
          + "cloudidentity.googleapis.com/groups.discussion_forum" = ""
        }
      + name         = (known after apply)
      + parent       = "customers/XXX"
      + update_time  = (known after apply)

      + group_key {
          + id = "aaa_bbb@evilcorp.com"
        }
    }
Plan: 1 to add, 0 to change, 0 to destroy.
Do you want to perform these actions?
Terraform will perform the actions described above.
Only 'yes' will be accepted to approve.
Enter a value: yes
google_cloud_identity_group.cloud_identity_group_basic: Creating...
Error: Error creating Group: googleapi: Error 403: Error(2015): Actor does not have permission to create group 'aaa_bbb@evilcorp.com'.
Details:
[
  {
    "@type": "type.googleapis.com/google.rpc.ResourceInfo",
    "description": "Error(2015): Actor does not have permission to create group 'aaa_bbb@evilcorp.com'.",
    "owner": "domain:cloudidentity.googleapis.com",
    "resourceType": "cloudidentity.googleapis.com/Group"
  }
]
on groups.tf line 1, in resource "google_cloud_identity_group" "cloud_identity_group_basic":
1: resource "google_cloud_identity_group" "cloud_identity_group_basic" {

It is now possible to use service accounts with the Google Groups APIs without domain-wide delegation.
See: Setting up the Groups API / Assigning an admin role to the service account. This enabled the Terraform service account to create/manage groups.

Building a bit on top of @Dag's answer:
It is also possible to do it through the Admin Console.
Actually, I didn't find any other way, as it seems impossible to obtain the uniqueId of the default Cloud Build service account.
1. Follow the previous link as a Workspace Super User.
2. Click on the Groups Admin role.
3. Click on the down arrow in the Admins section.
4. Finally, click on Assign service accounts; there you can paste the service account email (<YOUR-PROJECT-ID>@cloudbuild.gserviceaccount.com).
After doing this, it is actually possible to obtain the service account's uniqueId: just run the Try this API from the Directory API documentation with the roleId (you can get the roleId from the URL you are at after step 2) and the customer ID that you can obtain from the Account settings.
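If you later want to script the role assignment itself instead of using the Admin Console, the Admin SDK Directory API accepts a role-assignment body referencing the service account's uniqueId. A minimal sketch of that request body (the roleId and uniqueId values here are placeholders, obtained as described above):

```python
# Sketch: request body for the Admin SDK Directory API roleAssignments.insert
# call, assigning an admin role (e.g. Groups Admin) to a service account.
# role_id and unique_id are placeholder values you obtain as described above.
def role_assignment_body(role_id, unique_id):
    return {
        "roleId": role_id,        # from roles().list() or the Admin Console URL
        "assignedTo": unique_id,  # the service account's uniqueId
        "scopeType": "CUSTOMER",  # the role applies customer-wide
    }

body = role_assignment_body("12345678901234567", "109876543210987654321")
```

You would pass this body to `service.roleAssignments().insert(customer=..., body=body)` on an authorized Directory API client.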

Related

How to enable s3 Copy Bucket Permissions in Terraform statement

My goal is to copy the data from a set of S3 buckets into a main logging-account bucket. Every time I try to perform:
aws s3 cp s3://sub-account-cloudtrail s3://master-acccount-cloudtrail --profile=admin;
I get
(AccessDenied) when calling the CopyObject operation: Access Denied
I've looked at this post:
How to fix AccessDenied calling CopyObject
I am trying to add the bucket permissions to a Terraform data aws_iam_policy_document. The statement is written like so:
data "aws_iam_policy_document" "s3" {
  version = "2012-10-17"

  statement {
    sid    = "CopyObjectPermissions"
    effect = "Allow"

    principals {
      type        = "AWS"
      identifiers = ["arn:aws:iam::${data.aws_caller_identity.current.account_id}:role/ops-mgmt-admin"]
    }

    actions   = ["s3:GetObject", "s3:PutObject", "s3:PutObjectAcl"]
    resources = ["${aws_s3_bucket.nfcisbenchmark_cloudtrail.arn}/*"]
  }

  statement {
    sid     = "CopyBucketPermissions"
    actions = ["s3:ListBucket"]
    effect  = "Allow"

    principals {
      type        = "AWS"
      identifiers = ["arn:aws:iam::${data.aws_caller_identity.current.account_id}:role/ops-mgmt-admin"]
    }

    resources = ["${aws_s3_bucket.nfcisbenchmark_cloudtrail.arn}/*"]
  }
}
My goal is to restrict the permissions to the role that is assumed from the sub-account to the master account. My specific question is what permissions need to be added in order to enable copy permissions?
Expected:
Terraform plan runs successfully
Actual:
│ Error: Error putting S3 policy: MalformedPolicy: Action does not apply to any resource(s) in statement
How can I resolve this?
Two things to mention:
In your second statement the resource is wrong; this is why you get the MalformedPolicy error. It should be:
resources = [aws_s3_bucket.nfcisbenchmark_cloudtrail.arn]
Be careful with the identifier. At this point I'm not really sure if your buckets are in different accounts or not. If they are, the account_id in the identifier should reference the source account. data.aws_caller_identity.current.account_id returns the account ID to which Terraform is authenticated, which usually is the account where you are deploying resources (the destination account). If you are not doing cross-account copying, then it should be fine as it is.
Furthermore, in the case of cross-account access, the ops-mgmt-admin role should have a policy applied to it which gives access to get/list/upload objects to an S3 bucket.
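To make the fix concrete, here is a small sketch (in Python, building the rendered policy as a plain dict) of how object-level and bucket-level S3 actions pair with different resource ARNs. The bucket ARN and account ID are placeholders:

```python
# Sketch: pairing S3 actions with the resource level they apply to.
# Object-level actions need the "/*" suffix; bucket-level actions such as
# s3:ListBucket need the bare bucket ARN (the cause of the MalformedPolicy error).
bucket_arn = "arn:aws:s3:::example-cloudtrail-bucket"  # placeholder ARN

def make_statement(sid, actions, resource):
    """Build one policy statement for a fixed principal (placeholder account)."""
    return {
        "Sid": sid,
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::111122223333:role/ops-mgmt-admin"},
        "Action": actions,
        "Resource": resource,
    }

policy = {
    "Version": "2012-10-17",
    "Statement": [
        # Object-level actions -> bucket ARN + "/*"
        make_statement("CopyObjectPermissions",
                       ["s3:GetObject", "s3:PutObject", "s3:PutObjectAcl"],
                       bucket_arn + "/*"),
        # Bucket-level action -> bare bucket ARN
        make_statement("CopyBucketPermissions", ["s3:ListBucket"], bucket_arn),
    ],
}
```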

How do I grant access to a user/group to use BigQuery in GCP console?

I have a BigQuery dataset and I wish to grant read access on that dataset to a Google group called somegroup@myorg.com. I have granted that group the READER role on the dataset as proved by this command:
$ bq show myproject98765:mydataset
Dataset myproject98765:mydataset

   Last modified              ACLs                Labels
 ----------------- ------------------------- -------------
 28 Nov 12:57:34    Owners:
                      projectOwners
                    Readers:
                      somegroup@myorg.com
However, members of that group are not able to access the BigQuery interface at https://console.cloud.google.com/bigquery. When they visit https://console.cloud.google.com/ the project ("myproject98765") is not available in the project picker.
I assume I have to grant a role to that group that enables its members to access project myproject98765. What is the least permissive role I can grant that will allow those members to access https://console.cloud.google.com/bigquery and nothing else?
OK, figured this out. Members of that group do not need to be granted any permissions on myproject98765. If they have access to the dataset (which, as I said above, they do) then it's sufficient to log in to console.cloud.google.com and query that dataset from any project; they do not need explicit access to the project in which the dataset resides.
In case anyone cares, we deploy permissions using terraform. The terraform code in this case is:
resource "google_bigquery_dataset" "ds" {
  project    = "myproject98765"
  dataset_id = "mydataset"
  location   = "EU"

  access {
    role           = "READER"
    group_by_email = "somegroup@myorg.com"
  }
}

Wildcard at end of principal for s3 bucket

I want to allow roles within an account that have a shared prefix to be able to read from an S3 bucket. For example, we have a number of roles named RolePrefix1, RolePrefix2, etc, and may create more of these roles in the future. We want all roles in an account that begin with RolePrefix to be able to access the S3 bucket, without having to change the policy document in the future.
My terraform for bucket policy document is as below:
data "aws_iam_policy_document" "bucket_policy_document" {
  statement {
    effect  = "Allow"
    actions = ["s3:GetObject"]

    principals {
      type        = "AWS"
      identifiers = ["arn:aws:iam::111122223333:role/RolePrefix*"]
    }

    resources = ["${aws_s3_bucket.bucket.arn}/*"]
  }
}
This gives me the following error:
Error putting S3 policy: MalformedPolicy: Invalid principal in policy.
Is it possible to achieve this functionality in another way?
You cannot use a wildcard along with the ARN in the IAM principal field. You are allowed to use just "*".
https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_elements_principal.html
When you specify users in a Principal element, you cannot use a wildcard (*) to mean "all users". Principals must always name a specific user or users.
Workaround:
Keep "Principal":{"AWS":"*"} and create a condition based on ArnLike etc., since conditions accept role ARNs with wildcards.
Example:
https://aws.amazon.com/blogs/security/how-to-restrict-amazon-s3-bucket-access-to-a-specific-iam-role/
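The workaround above can be sketched as a rendered bucket policy (built here as a plain Python dict; the bucket name and account ID are placeholders): the Principal is the wildcard "*", and the role prefix match moves into an ArnLike condition on aws:PrincipalArn, which does accept wildcards.

```python
# Sketch of the workaround: Principal "*" plus an ArnLike condition,
# since IAM conditions (unlike the Principal element) accept wildcards
# in role ARNs. Bucket name and account ID are placeholders.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": "*"},
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-bucket/*",
        "Condition": {
            "ArnLike": {
                # Matches every role whose name starts with RolePrefix
                "aws:PrincipalArn": "arn:aws:iam::111122223333:role/RolePrefix*"
            }
        },
    }],
}
```

Note that "Principal": {"AWS": "*"} alone would open the bucket to everyone; the condition is what narrows access back down to the matching roles.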

Keycloak - how to allow linking accounts without registration

I am managing a Keycloak realm with only a single, fully-trusted external IdP added that is intended to be the default authentication mechanism for users.
I do not want to allow users to register, i.e. I want to manually create a local Keycloak user, and that user should then be allowed to link his external IdP account to the pre-existing Keycloak account, with the email address as the common identifier. Users with access to the external IdP but without an existing Keycloak account should not be allowed to connect.
I tried the following First Broker Login settings, but whenever a user tries to log in, he gets an error message (code: invalid_user_credentials).
Do you have any idea what my mistake might be?
Looks like they integrated this feature in version 4.5.0.
See automatic account link docs.
Basically you need to create a new flow and add 2 alternative executions:
Create User If Unique
Automatically Link Brokered Account
According to the doc: https://www.keycloak.org/docs/latest/server_admin/index.html#detect-existing-user-first-login-flow, you must create a new flow like this:
et voilà :)
As per this discussion:
https://keycloak.discourse.group/t/link-idp-to-existing-user/1094/5
It’s a bug in keycloak and they seem to be reluctant to fix it for
whatever reason. I have very few users so I solved it by manually
querying the idp for the information keycloak uses and then copying it
into the relevant fields in the UI. So there is no sign up process for
my users I just make them myself. Obviously that’s a poor solution
though, what we really need is someone to take over that PR and
persuade the maintainers to merge it.
This is the PR: https://github.com/keycloak/keycloak/pull/6282
As it is described in this GitHub issue response the solution is to use a JavaScript authenticator that handles this.
In order to do so, you need to do the following:
Enable custom authenticators using JavaScript in your server (https://www.keycloak.org/docs/latest/server_installation/#profiles) by creating a file profile.properties in your configuration directory that contains the following:
feature.scripts=enabled
Create the custom authenticator. You have to create a JAR file (essentially a ZIP file) with the following structure:
META-INF/keycloak-scripts.json
auth-user-must-exist.js
The content of the files are in this Gist, but I am including them here as well:
META-INF/keycloak-scripts.json:
{
  "authenticators": [
    {
      "name": "User must exist",
      "fileName": "auth-user-must-exist.js",
      "description": "User must exist"
    }
  ]
}
auth-user-must-exist.js:
var AuthenticationFlowError = Java.type("org.keycloak.authentication.AuthenticationFlowError")
var ServicesLogger = Java.type("org.keycloak.services.ServicesLogger")
var AbstractIdpAuthenticator = Java.type("org.keycloak.authentication.authenticators.broker.AbstractIdpAuthenticator")
var IdpCreateUserIfUniqueAuthenticator = Java.type("org.keycloak.authentication.authenticators.broker.IdpCreateUserIfUniqueAuthenticator")

var IdpUserMustExists = Java.extend(IdpCreateUserIfUniqueAuthenticator)

function authenticate(context) {
    var auth = new IdpUserMustExists() {
        authenticateImpl: function(context, serializedCtx, brokerContext) {
            var parent = Java.super(auth)
            var session = context.getSession()
            var realm = context.getRealm()
            var authSession = context.getAuthenticationSession()

            // A previous execution already detected the existing user: let the flow continue.
            if (authSession.getAuthNote(AbstractIdpAuthenticator.EXISTING_USER_INFO) != null) {
                context.attempted()
                return
            }

            var username = parent.getUsername(context, serializedCtx, brokerContext)
            if (username == null) {
                ServicesLogger.LOGGER.resetFlow(realm.isRegistrationEmailAsUsername() ? "Email" : "Username")
                authSession.setAuthNote(AbstractIdpAuthenticator.ENFORCE_UPDATE_PROFILE, "true")
                context.resetFlow()
                return
            }

            var duplication = parent.checkExistingUser(context, username, serializedCtx, brokerContext)
            if (duplication == null) {
                // No existing user found: fail instead of creating one.
                context.failure(AuthenticationFlowError.INVALID_USER)
                return
            } else {
                authSession.setAuthNote(AbstractIdpAuthenticator.EXISTING_USER_INFO, duplication.serialize())
                context.attempted()
            }
        }
    }
    auth.authenticate(context)
}
Then, you can define the flow executions as follows:
User Must Exist -> ALTERNATIVE
Automatically Set Existing User -> ALTERNATIVE
Honestly, I am surprised by the Keycloak auto-creating behavior. I tried to add a new authentication flow as described here: https://www.keycloak.org/docs/latest/server_admin/index.html#automatically-link-existing-first-login-flow
My flow:
1 - Create User If Unique [ALTERNATIVE]
2 - Automatically Link Brokered Account [ALTERNATIVE]
My use case: authenticating users from GitHub (GitHub as IdP).
Result: when a GitHub user logs on with an existing "username", Keycloak links the GitHub account to my local user (based on his username). I expected it to use his email instead of the username.

Google Analytics Management API - Insert method - Insufficient permissions HTTP 403

I am trying to add users to my Google Analytics account through the API but the code yields this error:
googleapiclient.errors.HttpError: https://www.googleapis.com/analytics/v3/management/accounts/**accountID**/entityUserLinks?alt=json returned "Insufficient Permission">
I have Admin rights to this account - MANAGE USERS. I can add or delete users through the Google Analytics Interface but not through the API. I have also added the service account email to GA as a user. Scope is set to analytics.manage.users
This is the code snippet I am using in my add_user function which has the same code as that provided in the API documentation.
def add_user(service):
    try:
        service.management().accountUserLinks().insert(
            accountId='XXXXX',
            body={
                'permissions': {
                    'local': [
                        'EDIT',
                    ]
                },
                'userRef': {
                    'email': 'ABC.DEF@gmail.com'
                }
            }
        ).execute()
    except TypeError as error:
        # Handle errors in constructing a query.
        print('There was an error in constructing your query : %s' % error)
        return None
Any help will be appreciated. Thank you!!
The problem was that I was using a service account when I should have been using an installed application. I did not need a service account since I had access using my own credentials. That did the trick for me!
Also remember that you have to specify the scope you would like to use. This example here (using the slightly altered example by Google) defines by default two scopes which would NOT allow inserting users (as they both grant read-only permissions) and would result in "Error 403 Forbidden" when trying to do so.
The required scope is given in the code below:
from apiclient.discovery import build
from googleapiclient.errors import HttpError
from oauth2client.service_account import ServiceAccountCredentials


def get_service(api_name, api_version, scopes, key_file_location):
    """Get a service that communicates to a Google API.

    Args:
        api_name: The name of the api to connect to.
        api_version: The api version to connect to.
        scopes: A list of auth scopes to authorize for the application.
        key_file_location: The path to a valid service account JSON key file.

    Returns:
        A service that is connected to the specified API.
    """
    credentials = ServiceAccountCredentials.from_json_keyfile_name(
        key_file_location, scopes=scopes)

    # Build the service object.
    service = build(api_name, api_version, credentials=credentials)

    return service


def get_first_profile_id(service):
    # Use the Analytics service object to get the first profile id.

    # Get a list of all Google Analytics accounts for this user
    accounts = service.management().accounts().list().execute()

    if accounts.get('items'):
        # Get the first Google Analytics account.
        account = accounts.get('items')[0].get('id')
        # Do something, e.g. get account users & insert new ones
        # ...


def main():
    # Define the auth scopes to request.
    # Add here
    #     https://www.googleapis.com/auth/analytics.manage.users
    # to be able to insert users as well:
    scopes = [
        'https://www.googleapis.com/auth/analytics.readonly',
        'https://www.googleapis.com/auth/analytics.manage.users.readonly',
    ]
    key_file_location = 'my_key_file.json'

    # Authenticate and construct service.
    service = get_service(
        api_name='analytics',
        api_version='v3',
        scopes=scopes,
        key_file_location=key_file_location)

    profile_id = get_first_profile_id(service)
    print_results(get_results(service, profile_id))


if __name__ == '__main__':
    main()
Regards,
HerrB92