What is the minimum role for a Google Analytics 4 BigQuery link?

I want to set up the GA4 BigQuery link on my client's GA4 property and export data to my GCP project.
As
https://support.google.com/analytics/answer/3416092?hl=en#zippy=%2Cin-this-article
mentions, I granted the project's "Editor" role to my client's account, my client created the GA4 link to my GCP project, and it works correctly.
But I think the "Editor" role grants too much. Is there a minimum set of permissions for the GA4 BigQuery link?
Thanks.
The project permissions held by the client user are:
bigquery.bireservations.get
bigquery.bireservations.update
bigquery.capacityCommitments.create
bigquery.capacityCommitments.delete
bigquery.capacityCommitments.get
bigquery.capacityCommitments.list
bigquery.capacityCommitments.update
bigquery.config.get
bigquery.config.update
bigquery.connections.create
bigquery.connections.delegate
bigquery.connections.delete
bigquery.connections.get
bigquery.connections.getIamPolicy
bigquery.connections.list
bigquery.connections.setIamPolicy
bigquery.connections.update
bigquery.connections.updateTag
bigquery.connections.use
bigquery.dataPolicies.create
bigquery.dataPolicies.delete
bigquery.dataPolicies.get
bigquery.dataPolicies.getIamPolicy
bigquery.dataPolicies.list
bigquery.dataPolicies.setIamPolicy
bigquery.dataPolicies.update
bigquery.datasets.create
bigquery.datasets.createTagBinding
bigquery.datasets.delete
bigquery.datasets.deleteTagBinding
bigquery.datasets.get
bigquery.datasets.getIamPolicy
bigquery.datasets.listTagBindings
bigquery.datasets.setIamPolicy
bigquery.datasets.update
bigquery.datasets.updateTag
bigquery.jobs.create
bigquery.jobs.delete
bigquery.jobs.get
bigquery.jobs.list
bigquery.jobs.listAll
bigquery.jobs.listExecutionMetadata
bigquery.jobs.update
bigquery.models.create
bigquery.models.delete
bigquery.models.export
bigquery.models.getData
bigquery.models.getMetadata
bigquery.models.list
bigquery.models.updateData
bigquery.models.updateMetadata
bigquery.models.updateTag
bigquery.readsessions.create
bigquery.readsessions.getData
bigquery.readsessions.update
bigquery.reservationAssignments.create
bigquery.reservationAssignments.delete
bigquery.reservationAssignments.list
bigquery.reservationAssignments.search
bigquery.reservations.create
bigquery.reservations.delete
bigquery.reservations.get
bigquery.reservations.list
bigquery.reservations.update
bigquery.routines.create
bigquery.routines.delete
bigquery.routines.get
bigquery.routines.list
bigquery.routines.update
bigquery.routines.updateTag
bigquery.rowAccessPolicies.create
bigquery.rowAccessPolicies.delete
bigquery.rowAccessPolicies.getIamPolicy
bigquery.rowAccessPolicies.list
bigquery.rowAccessPolicies.setIamPolicy
bigquery.rowAccessPolicies.update
bigquery.savedqueries.create
bigquery.savedqueries.delete
bigquery.savedqueries.get
bigquery.savedqueries.list
bigquery.savedqueries.update
bigquery.tables.create
bigquery.tables.createIndex
bigquery.tables.createSnapshot
bigquery.tables.delete
bigquery.tables.deleteIndex
bigquery.tables.deleteSnapshot
bigquery.tables.export
bigquery.tables.get
bigquery.tables.getData
bigquery.tables.getIamPolicy
bigquery.tables.list
bigquery.tables.restoreSnapshot
bigquery.tables.setCategory
bigquery.tables.setIamPolicy
bigquery.tables.update
bigquery.tables.updateData
bigquery.tables.updateTag
bigquery.transfers.get
bigquery.transfers.update
bigquerymigration.translation.translate
resourcemanager.projects.get
resourcemanager.projects.setIamPolicy
serviceusage.services.enable

You can grant one of the roles below (on the project or on the dataset):
BigQuery Admin (roles/bigquery.admin)
BigQuery Data Editor (roles/bigquery.dataEditor)
BigQuery Data Owner (roles/bigquery.dataOwner), only on the target dataset
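As a sketch of how such a grant could be made: assuming the client signs in as client@example.com and your project is my-gcp-project (both hypothetical names), a project-level grant can be issued with gcloud. The script below only prints the command so you can review it before running it; dataset-level Data Owner can instead be granted from the dataset's Sharing > Permissions panel in the console.

```shell
set -e
# Hypothetical identifiers -- replace with your real project and user.
PROJECT="my-gcp-project"
MEMBER="user:client@example.com"
ROLE="roles/bigquery.dataEditor"

# Print (rather than run) the project-level grant for review:
echo "gcloud projects add-iam-policy-binding ${PROJECT} --member=${MEMBER} --role=${ROLE}"
```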

User login history in the whole domain

I'm looking for a way to get the login history for a specific username.
I have tried these approaches, but they didn't work:
1. Event ID 4624
It only shows logons to the DC itself, not the entire domain. For example, when a user logs on to the DC, Event Viewer logs an event with ID 4624; but if a user logs on to another server (not the DC), nothing is logged in the DC's Event Viewer.
2. Event ID 4769
It covers the Kerberos service tickets that the DC creates and assigns, but it wasn't helpful.
So how can I get the login history of a user across the entire domain?
I reproduced your scenario and got the expected result.
Event ID 4624 - An account was successfully logged on.
This event records every successful attempt to log on to the local computer. It includes critical information about the logon type (e.g. interactive, RemoteInteractive, batch, network, or service), the SID, the username, network information, and more. By querying this event on every domain controller you can get a user login history report without having to manually crawl through the event logs.
Open the PowerShell ISE and run the following script, adjusting the timeframe:
# Find the DC list from Active Directory
$DCs = Get-ADDomainController -Filter *

# Define the reporting window (default is 1 day)
$startDate = (Get-Date).AddDays(-1)

# Collect successful logon events (ID 4624) from each DC's Security log,
# accumulating them into one array instead of overwriting per DC
$slogonevents = @()
foreach ($DC in $DCs) {
    $slogonevents += Get-EventLog -LogName Security -ComputerName $DC.Hostname -After $startDate |
        Where-Object { $_.EventID -eq 4624 }
}

# Crawl through events; print all logon history with type, date/time, status,
# account name, workstation, and IP address if the user logged on remotely
foreach ($e in $slogonevents) {
    # Local logon (Logon Type 2)
    if (($e.EventID -eq 4624) -and ($e.ReplacementStrings[8] -eq 2)) {
        Write-Host "Type: Local Logon`tDate: "$e.TimeGenerated "`tStatus: Success`tUser: "$e.ReplacementStrings[5] "`tWorkstation: "$e.ReplacementStrings[11]
    }
    # Remote logon (Logon Type 10)
    if (($e.EventID -eq 4624) -and ($e.ReplacementStrings[8] -eq 10)) {
        Write-Host "Type: Remote Logon`tDate: "$e.TimeGenerated "`tStatus: Success`tUser: "$e.ReplacementStrings[5] "`tWorkstation: "$e.ReplacementStrings[11] "`tIP Address: "$e.ReplacementStrings[18]
    }
}
Reference: Active Directory: How to Get User Login History using PowerShell - TechNet Wiki (microsoft.com)
You can also try an easier alternative: a tool like ADAudit Plus, which audits specific logon events as well as current and past logon activity to provide a list of all logon-related changes for a particular user.
Step 1: Download ADAudit Plus on your VM and install it.
Step 2: Add your server name, username, and password.
Step 3: Follow the product's user logon reports to get the logon details of a particular user.
Reference : https://www.manageengine.com/products/active-directory-audit/kb/ad-user-login-history-report.html

SAP HANA How to debug / fix "insufficient privilege" Error

In SAP HANA I am trying to call a StoredProcedure with a Table Type as input parameter.
Other Input parameters work just fine. But as soon as I use a Table Type I get the error:
Failed to execute action: InternalError: dberror($.hdb.Connection.executeProcedure): 258 - SQL error, server error code: 258. insufficient privilege: Not authorized at /sapmnt/ld7272/a/HDB/jenkins_prod/workspace/8uyiojyvla/s/ptime/query/checker/query_check.cc:4003
How to fix / debug this?
In the indexserver-trace is:
[19984]{315590}[100/100235487] 2018-08-22 10:07:13.949679 i TraceContext TraceContext.cpp(01028) : UserName=SAPDBCTRL, ApplicationUserName=SM_EFWK, ApplicationName=ABAP:AS2, ApplicationSource=CL_SQL_STATEMENT==============CP:304, Client=010, StatementHash=31c1e1f5ca72868a541d58fc5a77596b, EppRootContextId=0050560204981EE782C14A33A16BC68E, EppTransactionId=47BF1E2CEE9D05A0E005B7CF04FCF981, EppConnectionId=5B7C13CC22061B08E10000000A1807AF, EppConnectionCounter=1, EppComponentName=AS2/sapas2ci_AS2_01, EppAction=EFWK RESOURCE MANAGER
[19984]{315590}[100/100235487] 2018-08-22 10:07:13.949656 w SQLScriptExecuto se_eapi_proxy.cc(00144) : Error <exception 71000258: Not authorized
> in preparation of internal statement: delete from _SYS_STATISTICS.STATISTICS_PROPERTIES where key='internal.check.store_results'
[19984]{315590}[100/100235487] 2018-08-22 10:07:13.949904 e SQLScript trex_llvm.cc(00936) : Llang Runtime Error: Exception::SQLException258: insufficient privilege: Not authorized
at main (line 63) ("_SYS_STATISTICS"."SHARED_STORE_USED_VALUES": line 8 col 5 (at pos 456))
This seems rather straightforward:
The application user (the person using SAP NetWeaver) SM_EFWK, logged on in client 010, is trying to delete data from the SAP HANA statistics service table _SYS_STATISTICS.STATISTICS_PROPERTIES.
The NetWeaver/ABAP program uses a secondary database connection with the database user SAPDBCTRL.
The error Exception::SQLException258: insufficient privilege: Not authorized is thrown because this SAPDBCTRL database user does not have the DELETE privilege on this table (neither directly, nor via a schema or role privilege).
If the SQL command is part of an SAP standard program, then I'd check that the recommended setup has been implemented correctly.
If this command comes from a custom program, you may want to either assign the privilege or use a different technical user, as SAPDBCTRL is an SAP standard user that shouldn't be modified.
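If you do decide to assign the privilege, a minimal sketch of the object-level GRANT (issued via hdbsql; host, port, and admin user are placeholders, while the object and grantee come from the trace above) could look like this. The snippet only prints the command for review:

```shell
set -e
# Placeholders: <host>:<port> and <admin_user> must be replaced with real values.
# The object and grantee names are taken from the indexserver trace above.
STMT="GRANT DELETE ON _SYS_STATISTICS.STATISTICS_PROPERTIES TO SAPDBCTRL"
echo "hdbsql -n <host>:<port> -u <admin_user> \"${STMT};\""
```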

Schema violation exception when exporting to LDAP is enabled

I have configured Liferay to use an LDAP server, which works fine as long as Import is enabled.
As soon as I switch on the Export enabled option and a user tries to log in, it throws an exception. Strangely, the user is still exported from Liferay to the LDAP server.
Caused by: javax.naming.directory.SchemaViolationException: [LDAP:
error code 67 - NOT_ALLOWED_ON_RDN: failed for MessageType :
MODIFY_REQUEST_Message ID : 6_ Modify Request_ Object :
'cn=johndoe+mail=johndoeldap#liferay.com+sn=doe,dc=example,dc=com'_
Modification[0]_ Operation : replace_
Modification_sn: doe Modification1_
Operation : replace_ Modification_sn: doe
Modification2_ Operation : replace_
Modification_givenName: johndoe Modification3_
Operation : replace_ Modification_mail:
johndoeldap#liferay.com Modification[4]_
Operation : replace_ Modification_cn: doe
doeorg.apache.directory.api.ldap.model.message.ModifyRequestImpl#32d7606a:
ERR_62 Entry
cn=johndoe+mail=johndoeldap#liferay.com+sn=doe,dc=example,dc=com does
not have the cn attributeType, which is part of the RDN";]; remaining
name
'cn=johndoe+mail=johndoeldap#liferay.com+sn=doe,dc=example,dc=com'
[Sanitized]
After configuring LDAP on Liferay, I am able to connect to LDAP correctly and view users too.
(Screenshots of the user field mapping, the export and group mapping configuration, and the LDAP configuration were attached here.)
I got it sorted out by correcting the user field mapping.
While importing, data came in from LDAP without any exceptions. While exporting to LDAP, however, the 'cn' attribute was mapped more than once (to both Screen Name and Full Name) when it must be mapped uniquely. So even though the user data was exported from Liferay, this led to the SchemaViolationException and prevented the user from logging in to the portal.

Google Cloud Dataflow cross-project access for Bigtable

I want to run a Dataflow job to migrate data from google-project-1-table to google-project-2-table (read from one and write to the other). I am getting a permission issue while doing that. I have set GOOGLE_APPLICATION_CREDENTIALS to point to my credential file for project-1. In project-2, the following members from project-1 have these roles:
1) the service account (role: Editor)
2) -compute@developer.gserviceaccount.com (role: Editor)
3) @cloudservices.gserviceaccount.com (role: Editor)
Is there anything else I need to do to run the job?
Caused by: com.google.bigtable.repackaged.com.google.cloud.grpc.io.IOExceptionWithStatus: Error in response stream
at com.google.bigtable.repackaged.com.google.cloud.grpc.scanner.ResultQueueEntry$ExceptionResultQueueEntry.getResponseOrThrow(ResultQueueEntry.java:66)
at com.google.bigtable.repackaged.com.google.cloud.grpc.scanner.ResponseQueueReader.getNextMergedRow(ResponseQueueReader.java:55)
at com.google.bigtable.repackaged.com.google.cloud.grpc.scanner.StreamingBigtableResultScanner.next(StreamingBigtableResultScanner.java:42)
at com.google.bigtable.repackaged.com.google.cloud.grpc.scanner.StreamingBigtableResultScanner.next(StreamingBigtableResultScanner.java:27)
at com.google.bigtable.repackaged.com.google.cloud.grpc.scanner.ResumingStreamingResultScanner.next(ResumingStreamingResultScanner.java:89)
at com.google.bigtable.repackaged.com.google.cloud.grpc.scanner.ResumingStreamingResultScanner.next(ResumingStreamingResultScanner.java:45)
at com.google.cloud.bigtable.dataflow.CloudBigtableIO$1.next(CloudBigtableIO.java:221)
at com.google.cloud.bigtable.dataflow.CloudBigtableIO$1.next(CloudBigtableIO.java:216)
at com.google.cloud.bigtable.dataflow.CloudBigtableIO$Reader.advance(CloudBigtableIO.java:775)
at com.google.cloud.bigtable.dataflow.CloudBigtableIO$Reader.start(CloudBigtableIO.java:799)
at com.google.cloud.dataflow.sdk.io.Read$Bounded$1.evaluateReadHelper(Read.java:178)
... 18 more
Caused by: com.google.bigtable.repackaged.io.grpc.StatusRuntimeException: PERMISSION_DENIED: User can't access project: project-2
at com.google.bigtable.repackaged.io.grpc.Status.asRuntimeException(Status.java:431)
at com.google.bigtable.repackaged.com.google.cloud.grpc.scanner.StreamObserverAdapter.onClose(StreamObserverAdapter.java:48)
at com.google.bigtable.repackaged.io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$3.runInContext(ClientCallImpl.java:462)
at com.google.bigtable.repackaged.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:54)
at com.google.bigtable.repackaged.io.grpc.internal.SerializingExecutor$TaskRunner.run(SerializingExecutor.java:154)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
... 1 more
There are some instructions for this in the section "Accessing Cloud Platform Resources Across Multiple Cloud Platform Projects" of the Dataflow Security and Permissions guide.
Since that guide does not explicitly address Cloud Bigtable, I will try to state the requirements clearly here in terms of your question.
Using fake project id numbers, it seems you have:
A project project-1 with id 12345.
A project project-2 with id 9876.
A Bigtable table google-project-1-table in project-1.
A Bigtable table google-project-2-table in project-2.
A Dataflow pipeline that will run in project-1, which you want to:
read from google-project-1-table
write to google-project-2-table
Is that accurate?
Your Dataflow workers that write to Bigtable run as the Compute Engine default service account, that is, 12345-compute@developer.gserviceaccount.com. This account will need to be able to access project-2 and write to google-project-2-table.
Your error message implies that the permissions failure occurs at the coarsest granularity - the account cannot access project-2 at all.
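As a sketch of the fix (using the fake project id 12345 from above; the role choice is an assumption, since roles/bigtable.user covers reading and writing Bigtable data), the grant in project-2 could look like this. The script only prints the command so you can review it before running it:

```shell
set -e
# Fake ids from the discussion above -- replace with your real values.
SA="serviceAccount:12345-compute@developer.gserviceaccount.com"
PROJECT="project-2"
ROLE="roles/bigtable.user"   # assumed role: read/write access to Bigtable data

echo "gcloud projects add-iam-policy-binding ${PROJECT} --member=${SA} --role=${ROLE}"
```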

Problems deploying Grails application on cloudbees

I'm a new CloudBees user. I've succeeded in building my war file, but I can't deploy it: when I use the Deploy Now option and enter my application ID claxon/claxon, I always get the same error:
**"hudson.util.IOException2: Server.InternalError - Expected Application ID format: account/appname > claxon/"**
This is the log:
hudson.util.IOException2: Server.InternalError - Expected Application ID format: account/appname > claxon/
at com.cloudbees.plugins.deployer.deployables.Deployable.deployFile(Deployable.java:152)
at com.cloudbees.plugins.deployer.DeployNowRunAction$Deployer.perform(DeployNowRunAction.java:639)
at com.cloudbees.plugins.deployer.DeployNowRunAction.run(DeployNowRunAction.java:484)
at com.cloudbees.plugins.deployer.DeployNowTask$ExecutableImpl.run(DeployNowTask.java:157)
at hudson.model.ResourceController.execute(ResourceController.java:88)
at hudson.model.Executor.run(Executor.java:236)
Caused by: com.cloudbees.api.BeesClientException: Server.InternalError - Expected Application ID format: account/appname > claxon/
at com.cloudbees.api.BeesClient.readResponse(BeesClient.java:850)
at com.cloudbees.api.BeesClient.applicationDeployArchive(BeesClient.java:435)
at com.cloudbees.plugins.deployer.deployables.Deployable.deployFile(Deployable.java:124)
... 5 more
Duration: 4.1 sec
Finished: FAILURE
Can anybody help me?
Don't put the claxon/ prefix; the account part is handled for you when you choose the account from the account drop-down. Just put the id part.
If you expand the help text, you will see that it says not to prefix:
Note: I used my superuser rights to capture this screenshot... the only bit that might be confidential is your account name, which you had already exposed via the original question.