How do I get a current session value in Pentaho User Console 5.0 that we used to get with a Pentaho Action Sequence (.xaction)?
You can use Dashboards.context, and then:
Dashboards.context.user, Dashboards.context.roles, ...
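A minimal sketch of reading these values inside a CDF/CDE dashboard component (the alert is just for illustration):
var user  = Dashboards.context.user;   // id of the user logged into the PUC session
var roles = Dashboards.context.roles;  // that user's roles
alert(user + " has roles: " + roles);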
I am using an ASP.NET Core 2.0 web application.
I have a simple Stimulsoft report file.
My report has a parameter called @ID.
If I use the report without the parameter it works well, but when I send the parameter to the report it does not work and the report loads empty.
My code is:
var _Report = new Stimulsoft.Report.StiReport();
_Report.Load(@"C:\temp\report.mrt"); // load report
((StiSqlDatabase)_Report.Dictionary.Databases["Connection"]).ConnectionString = "new connectionString"; // change connection
_Report.DataSources["DataSource1"].Parameters["@ID"].ParameterValue = 5171; // set parameter value
_Report.Render();
Note: I am using ASP.NET Core version 2.0; this code works well in classic ASP.NET.
Please help me.
Thanks.
Wow, solved!
I should use Value instead of ParameterValue, like below:
_Report.DataSources["DataSource1"].Parameters["@ID"].Value = 5171;
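Putting it together, the working sequence from the question becomes (the path and connection string are the question's own placeholders):
var _Report = new Stimulsoft.Report.StiReport();
_Report.Load(@"C:\temp\report.mrt"); // load the report template
((StiSqlDatabase)_Report.Dictionary.Databases["Connection"]).ConnectionString = "new connectionString"; // change connection
_Report.DataSources["DataSource1"].Parameters["@ID"].Value = 5171; // Value, not ParameterValue
_Report.Render();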
We are using the Beta Scheduled query feature of BigQuery.
Details: https://cloud.google.com/bigquery/docs/scheduling-queries
We have a few ETL scheduled queries running overnight to optimize aggregation and reduce query cost. It works well and there haven't been many issues.
The problem arises when the person who scheduled the query using their own credentials leaves the organization. I know we can do "update credential" in such cases.
I read through the documentation and also gave it a try, but couldn't really find whether we can use a service account instead of individual accounts to schedule queries.
Service accounts are cleaner, tie into the rest of the IAM framework, and are not dependent on a single user.
So if you have any additional information regarding scheduled queries and service account please share.
Thanks for taking time to read the question and respond to it.
Regards
BigQuery scheduled queries now support creating a scheduled query with a service account and updating a scheduled query with a service account. Will these work for you?
While it's not supported in the BigQuery UI, it's possible to create a transfer (including a scheduled query) using the Python GCP SDK for DTS, or from the BQ CLI (see the sketch after the Python example below).
The following is an example using the Python SDK:
r"""Example of creating TransferConfig using service account.
Usage Example:
1. Install GCP BQ python client library.
2. If it has not been done, please grant the P4 service account the
   iam.serviceAccounts.getAccessToken permission on your project.
   $ gcloud projects add-iam-policy-binding {user_project_id} \
       --member='serviceAccount:service-{user_project_number}@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com' \
       --role='roles/iam.serviceAccountTokenCreator'
where {user_project_id} and {user_project_number} are the user project's
project id and project number, respectively. E.g.,
   $ gcloud projects add-iam-policy-binding my-test-proj \
       --member='serviceAccount:service-123456789@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com' \
       --role='roles/iam.serviceAccountTokenCreator'
3. Set the environment variable PROJECT_ID to your user project, and
   GOOGLE_APPLICATION_CREDENTIALS to the service account key path. E.g.,
   $ export PROJECT_ID='my_project_id'
   $ export GOOGLE_APPLICATION_CREDENTIALS='./serviceacct-creds.json'
4. $ python3 ./create_transfer_config.py
"""
import os
from google.cloud import bigquery_datatransfer
from google.oauth2 import service_account
from google.protobuf.struct_pb2 import Struct
PROJECT = os.environ["PROJECT_ID"]
SA_KEY_PATH = os.environ["GOOGLE_APPLICATION_CREDENTIALS"]
credentials = (
    service_account.Credentials.from_service_account_file(SA_KEY_PATH))
client = bigquery_datatransfer.DataTransferServiceClient(
    credentials=credentials)

# Get full path to project
parent_base = client.project_path(PROJECT)

params = Struct()
params["query"] = "SELECT CURRENT_DATE() as date, RAND() as val"

transfer_config = {
    "destination_dataset_id": "my_data_set",
    "display_name": "scheduled_query_test",
    "data_source_id": "scheduled_query",
    "params": params,
}

parent = parent_base + "/locations/us"
response = client.create_transfer_config(parent, transfer_config)
print(response)
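For the BQ CLI route mentioned above, a minimal sketch (the flags follow the BigQuery scheduled-query docs, and the service account email is a placeholder, so verify them against your CLI version):
$ bq mk --transfer_config \
    --target_dataset=my_data_set \
    --display_name='scheduled_query_test' \
    --data_source=scheduled_query \
    --service_account_name=my-sa@my-test-proj.iam.gserviceaccount.com \
    --params='{"query":"SELECT CURRENT_DATE() as date, RAND() as val"}'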
As far as I know, unfortunately you can't use a service account to directly schedule queries yet. Maybe a Googler will correct me, but the BigQuery docs implicitly state this:
https://cloud.google.com/bigquery/docs/scheduling-queries#quotas
A scheduled query is executed with the creator's credentials and project, as if you were executing the query yourself.
If you need to use a service account (which is great practice BTW), then there are a few workarounds listed here. I've raised a FR here for posterity.
This question is very old, but I came across this thread while searching for the same thing.
Yes, it is possible to use a service account to schedule BigQuery jobs.
While creating the scheduled query job, click on "Advanced options" and you will get the option to select a service account.
By default it uses the credentials of the requesting user.
(Screenshot: the BigQuery "create scheduled query" dialog)
I have the following statement in my Startup.cs:
Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Debug()
    .WriteTo.ColoredConsole()
    .WriteTo.MSSqlServer("Server=(localdb)\\MSSQLLocalDB;Database=myDb.Logging;Trusted_Connection=True;", "Logs", autoCreateSqlTable: true)
    .WriteTo.RollingFile(pathFormat: Path.Combine(logPath, "Log-{Date}.txt"))
    .CreateLogger();
And in my Configure method:
loggerFactory.AddSerilog();
When I start the application, the table is created so I know the connection works. I get logged output to the console and to the file, however, no output to the database table.
What am I failing to do?
Other information: using ASP.NET Core RC2-final, targeting net461, and using Serilog.Sinks.MSSqlServer 4.0.0-beta-100.
At first glance it doesn't look like you're missing anything. It's likely that an exception is being thrown by the SQL Server Sink when trying to write to the table.
Have you tried checking the output from Serilog's self log?
Serilog.Debugging.SelfLog.Enable(msg => Console.WriteLine(msg));
Update:
Looks like a permission issue with your SQL Server/LocalDB. The error message suggests the sink is trying to run an ALTER TABLE statement, and the user running the application doesn't have permission to execute ALTER TABLE statements.
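A hypothetical fix, assuming the app connects as a dedicated SQL login (the user name below is a placeholder):
-- Allow the application's user to alter tables in the dbo schema (placeholder login name).
GRANT ALTER ON SCHEMA::dbo TO [MyAppUser];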
Update 2: I suggest you write a simple console app using the full .NET Framework + Serilog v1.5.14 + Serilog.Sinks.MSSqlServer v3.0.98 to see if you get the same behavior, just to rule out the possibility that there's a problem with the .NET Core implementation or with the beta sink version you're using.
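A minimal repro sketch along those lines (connection string and table name are copied from the question; disposing the logger flushes buffered events before the process exits):
using System;
using Serilog;

class Program
{
    static void Main()
    {
        // Surface any errors thrown inside the sinks.
        Serilog.Debugging.SelfLog.Enable(msg => Console.WriteLine(msg));

        Log.Logger = new LoggerConfiguration()
            .MinimumLevel.Debug()
            .WriteTo.MSSqlServer("Server=(localdb)\\MSSQLLocalDB;Database=myDb.Logging;Trusted_Connection=True;", "Logs", autoCreateSqlTable: true)
            .CreateLogger();

        Log.Information("Test message {Value}", 42);

        // Flush pending writes; the SQL sink batches in the background.
        ((IDisposable)Log.Logger).Dispose();
    }
}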
I'm trying to determine the best approach for executing business logic in a push adapter. I've run the example PushAdapter (Module_07_04_nativeAPIForiOSPush) successfully from my local environment, but adding WL.Server.setActiveUser() throws an error.
I'm running the demo PushAdapter locally in Worklight Studio (6.0.0.201309171829), having added this as the first line of the adapter:
WL.Server.setActiveUser("PushAppRealm",userId);
...
Deployed the adapter change, run with same params and get this error in the Worklight console:
Can't find method com.worklight.integration.js.JavaScriptIntegrationLibraryImplementation.setUserIdentity(string,string). (/integration.js#36)
FWLSE0101E: Caused by: [project Module_07_04_nativeAPIForiOSPush]null
The adapter runs without any problems without this line. I'm trying to set the active user because I next want to get the user's preferences to determine, as business logic, whether to create the notification. Is there another approach?
I've also run this in a new workspace (after I applied Fix Pack 1 to WL Studio 6), but with the same result.
My questions are: 1) why am I getting this error? and 2) is this a valid approach?
Thanks.
The error occurs because WL.Server.setActiveUser() expects a user identity object, not a plain user id string. Try:
var userIdentity = {
    userId: "userid",
    displayName: "userid",
    attributes: {
        foo: "bar"
    }
};
WL.Server.setActiveUser("PushAppRealm", userIdentity);
This should work. However, for this sample you should not be explicitly setting the user identity: WL.Server.setActiveUser() is used to set the user identity in the case of adapter-based authentication, and this sample uses form-based authentication.
I am working on Enterprise Library 6 in a Windows Azure project. Based on the quick-start application, we can log messages and events to an XML file or SQL Server. The following code does this in the sample application:
this.fileListener = FlatFileLog.CreateListener("aExpense.DataAccess.log", formatter: new XmlEventTextFormatter(EventTextFormatting.Indented), isAsync: true);
fileListener.EnableEvents(AExpenseEvents.Log, EventLevel.LogAlways, AExpenseEvents.Keywords.DataAccess);
// Log informational UI events only, to a rolling file
this.rollingfileListener = RollingFlatFileLog.CreateListener("aExpense.UserInterface.log", rollSizeKB: 10, timestampPattern: "yyyy", rollFileExistsBehavior: RollFileExistsBehavior.Increment, rollInterval: RollInterval.Day, formatter: new JsonEventTextFormatter(EventTextFormatting.Indented), isAsync: true);
rollingfileListener.EnableEvents(AExpenseEvents.Log, EventLevel.Informational, AExpenseEvents.Keywords.UserInterface);
// Log all events to DB
this.dbListener = SqlDatabaseLog.CreateListener("aExpense", WebConfigurationManager.ConnectionStrings["Tracing"].ConnectionString, bufferingInterval: TimeSpan.FromSeconds(3), bufferingCount:10);
dbListener.EnableEvents(AExpenseEvents.Log, EventLevel.LogAlways, Keywords.All);
But I need to log all these events and exceptions to Azure Table Storage. Can Enterprise Library 6 support this? How do I do it?
Based on the code sample, it looks like you are using the Semantic Logging Application Block. There is a Windows Azure sink for the Semantic Logging Application Block which will log to table storage:
this.azuretableListener = WindowsAzureTableLog.CreateListener(
    RoleEnvironment.CurrentRoleInstance.Id,
    RoleEnvironment.GetConfigurationSettingValue("ConnectionString"));
azuretableListener.EnableEvents(AExpenseEvents.Log, EventLevel.LogAlways, Keywords.All);
The default table name is "SLABLogsTable" but you can specify another name.
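A hedged sketch of specifying a custom table name; the optional tableAddress parameter name is from memory of the sink's signature, so double-check it against your version:
this.azuretableListener = WindowsAzureTableLog.CreateListener(
    RoleEnvironment.CurrentRoleInstance.Id,
    RoleEnvironment.GetConfigurationSettingValue("ConnectionString"),
    tableAddress: "MyCustomLogsTable"); // assumption: the table-name parameter is called tableAddress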