Is it possible to link to a job in the bigquery console? - google-bigquery

If I execute a BigQuery job using the REST API (i.e. bigquery.googleapis.com) in the response I get back a selfLink that looks something like this:
https://bigquery.googleapis.com/bigquery/v2/projects/my-project/jobs/job_0123456789ABCDEF?location=EU
In the UI (i.e. console.cloud.google.com) I can see the very same job in the project's query history.
Is it possible to use the information within that API response to construct a URL that a person can visit in the browser and be taken directly to the information about that query in the UI? This would be really useful because we could log a message containing that URL, so anyone viewing the logs could jump straight to a user-friendly view of that job.
I suspect the answer is "no" but just thought I'd ask.

I believe you can share this link:
https://console.cloud.google.com/bigquery?project=<my-project>&j=bq:<location>:<job_id>&page=queryresults
For example: https://console.cloud.google.com/bigquery?project=my-project&j=bq:US:2846160a-9a13-4192-9bff-e691ff2adab6&page=queryresults
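For reference, here is a minimal Java sketch of assembling that console URL from the project, location and job id returned in the job's jobReference; the values passed in main() are placeholders for illustration, not taken from the original response:
// Minimal sketch: assemble a BigQuery console link from a job's project, location and id.
// The values passed in main() are placeholders for illustration.
public class BigQueryConsoleLink {

    static String consoleUrl(String projectId, String location, String jobId) {
        return String.format(
            "https://console.cloud.google.com/bigquery?project=%s&j=bq:%s:%s&page=queryresults",
            projectId, location, jobId);
    }

    public static void main(String[] args) {
        System.out.println(consoleUrl("my-project", "EU", "job_0123456789ABCDEF"));
    }
}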
If a user has the BQ Job List permission in that project, then when they open the link they will be able to see the query that was run in the UI, along with the job information.
But they can't see the query results, which is intended behavior. Instead they will get a warning:
Access Denied: User does not have permission to access results of another user's job.

Related

How to configure PagerDuty alerts in Splunk Cloud?

I've run into a few different issues with the PagerDuty integration in Splunk Cloud.
The documentation on PagerDuty's site is either outdated, not applicable to Splunk Cloud, or else there's something wrong with the way my Splunk Cloud account is configured (could be a permissions issue): https://www.pagerduty.com/docs/guides/splunk-integration-guide/. I don't see an Alert Actions page in Splunk Cloud, though I do have a Searches, Reports and Alerts page.
I've configured PD alerts in Splunk using the alert_logevent app, but it's not clear if I should instead be using some other app. These alerts do fire when there are search hits, but I'm seeing another issue (below). The alert_webhook app type seems like it might be appropriate, but I was unable to get it to work correctly. I cannot create an alert type using the pagerduty_incident app, although I can set it as a Trigger Action (I guess this is how it's supposed to work; I don't find the UI too intuitive here).
When my alerts fire and create incidents in PagerDuty, I do not see a way to set the PagerDuty incident severity.
Also, the PD incidents include a link back to Splunk, which I believe should open the query with the search hits that generated the alert. However, the link brings me to a page with a Page Not Found! error. It contains a link to "more information about my request" which brings up a Splunk query with no hits. This query looks like "index=_internal, host=SOME_HOST_ON_SPLUNK_CLOUD, source=*web_service.log, log_level=ERROR, requestid=A_REQUEST_ID". It is not clear to me if this is a config issue, a bug in Splunk Cloud, or possibly even a permissions issue for my account.
Any help is appreciated.
I'm also a Splunk Cloud + PagerDuty customer and ran into the same issue. The PagerDuty App for Splunk seems to create all incidents as Critical but you can set different severities with event rules.
One way to do this dynamically is to rename your Splunk alerts with the desired severity level and then create a PagerDuty event rule for each level that looks for the keyword in the Summary. For example...
If the following condition is met:
Summary contains "TEST"
Then perform the following actions:
Set severity = Info
[Screenshot: the example rule in the event rule edit screen]
It's a bit of a pain to rename your existing alerts in Splunk, but it works.
If your severity levels in Splunk are set programmatically, like in Enterprise Security, then another method would be to modify the PagerDuty App for Splunk to send the $alert.severity$ custom alert action token as a custom detail in the webhook payload and use that as the event rule condition instead of Summary... but that seems harder.
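If you do end up going the webhook route, note that PagerDuty's Events API v2 accepts a severity field directly in the event payload. Below is a rough Java sketch of such a call; this is not the PagerDuty App for Splunk itself, and the routing key, summary, severity and custom detail values are placeholders:
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Rough sketch: trigger a PagerDuty event with an explicit severity via the
// Events API v2. The routing key and payload values are placeholders.
public class PagerDutyEventExample {

    public static void main(String[] args) throws Exception {
        String json = """
            {
              "routing_key": "YOUR_INTEGRATION_ROUTING_KEY",
              "event_action": "trigger",
              "payload": {
                "summary": "TEST alert from Splunk search",
                "source": "splunk-cloud",
                "severity": "info",
                "custom_details": { "alert_severity": "3" }
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://events.pagerduty.com/v2/enqueue"))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(json))
            .build();

        HttpResponse<String> response =
            HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}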

How to log requests and user info like username automatically into log file to track user activity in liferay?

In Liferay 7 Enterprise Edition,
I want to automatically log user info (like user_name) to external log files on each request, in order to track user activity. How can I do that?
Without using the auditing plugin.
When I tried to log a POST request (for example, login), it didn't contain any info about the user.
This kind of thing is much harder than you might think...
Getting access to the current user is really easy. As Victor pointed out, you can use the ThemeDisplay object to get the current user. If you don't have the request around, you can use the PrincipalThreadLocal to find the current user id.
That gives you the who, but certainly not the "what is the user doing" aspect. Since the portal aggregates the HTML fragments of many portlets, from a servlet filter perspective it would be hard to glean which one of the available portlets on an incoming URL is actually being interacted with. You could try a portlet filter to narrow the field, but this will just tell you which portlet is being accessed, not what the user is doing with it.
Although you rejected the built-in audit functionality available in DXP, it really is the answer for tracking who did what in the portal, because it has the necessary touch points to get those two pieces and put them together.
Now if you rejected the built-in audit functionality because you want a file and not a database entry, that is easy to solve. Go to the System Settings control panel, find the Logging Audit Message Processor and enable it. It will write the audit events out to a file in CSV format, but you should have the source for modules/apps/foundation/portal-security-audit/portal-security-audit-router/src/main/java/com/liferay/portal/security/audit/router/internal/LoggingAuditMessageProcessor.java, so you can use it as a basis to write your own format.
Look at this code:
https://github.com/amusarra/liferay-portal-security-audit
in particular the portal-security-audit-capture-events module, which catches the login events.
This seems like a job for a filter; the user information is normally extracted from the ThemeDisplay, like in:
// The portal stores the ThemeDisplay in the request under WebKeys.THEME_DISPLAY
ThemeDisplay themeDisplay = (ThemeDisplay) request.getAttribute(WebKeys.THEME_DISPLAY);
long userId = themeDisplay.getRealUserId();
If you want to track specific portlets, an OSGi portlet filter would do the job.
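To make that last suggestion concrete, here is a rough sketch of an OSGi render filter that logs the current user whenever a given portlet renders. The javax.portlet.name value is a placeholder and the logged fields are just examples; a real module would target your own portlets:
import java.io.IOException;

import javax.portlet.PortletException;
import javax.portlet.RenderRequest;
import javax.portlet.RenderResponse;
import javax.portlet.filter.FilterChain;
import javax.portlet.filter.FilterConfig;
import javax.portlet.filter.PortletFilter;
import javax.portlet.filter.RenderFilter;

import org.osgi.service.component.annotations.Component;

import com.liferay.portal.kernel.log.Log;
import com.liferay.portal.kernel.log.LogFactoryUtil;
import com.liferay.portal.kernel.theme.ThemeDisplay;
import com.liferay.portal.kernel.util.WebKeys;

// Sketch of an OSGi portlet (render) filter that logs which user is rendering
// a given portlet. The javax.portlet.name property is a placeholder.
@Component(
    immediate = true,
    property = "javax.portlet.name=com_example_portlet_MyPortlet",
    service = PortletFilter.class
)
public class UserActivityRenderFilter implements RenderFilter {

    private static final Log _log = LogFactoryUtil.getLog(UserActivityRenderFilter.class);

    @Override
    public void doFilter(RenderRequest request, RenderResponse response, FilterChain chain)
        throws IOException, PortletException {

        ThemeDisplay themeDisplay =
            (ThemeDisplay) request.getAttribute(WebKeys.THEME_DISPLAY);

        if (themeDisplay != null) {
            _log.info(
                "userId=" + themeDisplay.getRealUserId() +
                " screenName=" + themeDisplay.getUser().getScreenName());
        }

        chain.doFilter(request, response);
    }

    @Override
    public void init(FilterConfig filterConfig) {
    }

    @Override
    public void destroy() {
    }
}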

Sitecore: Statistics on sitecore domain users

I'm working with Sitecore 8 Update 2.
I'm looking for a way to get some statistics on how many Sitecore users are logged in (over time, not just right now).
Mainly I would like to see if the actual number of people working in the system is close to the maximum number of users allowed by the license. Otherwise a company might be seriously overpaying.
Is there already an out-of-the-box solution or a module available for this?
If you visit this page:
<your-domain>/sitecore/client/Applications/LicenseOptions/KickUser
It will give you a list of all the currently logged in CMS users. That page is a SPEAK application and has a datasource called ActiveUsersDataSource. In the code, this datasource uses the DomainAccessGuard.Sessions property to get a list of all the sessions.
So if you just want a list, the KickUser page should be enough. If you want to run a report, you can use the DomainAccessGuard.Sessions property as a starting point. You could use it to write a report with Sitecore PowerShell Extensions pretty simply.
Log files usually contain audit information about login / logout. E.g.:
6140 13:57:33 INFO AUDIT (sitecore\djanjicek): Login
...
7512 14:02:57 INFO AUDIT (sitecore\djanjicek): Logout
With Sitecore Log Analyzer you should be able to filter your log files on the audit trails.
https://marketplace.sitecore.net/Modules/S/Sitecore_Log_Analyzer.aspx
If you need a web-based solution then you could write an admin page that reads all log files and outputs the required lines in chronological order.
Also, you can try this:
var x = Membership.Providers["sql"].GetNumberOfUsersOnline();
where "sql" is you membership provider name defined in configuration.

Accessing Metacritic API and/or Scraping

Does anybody know where the documentation for the Metacritic API is, or whether it still works? There used to be a Metacritic API at https://market.mashape.com/byroredux/metacritic-v2#get-user-details which disappeared today.
Otherwise I'm trying to scrape the site myself, but I keep getting blocked by a 429 Slow down response. I got data about 3 times this hour and haven't been able to get any more in the last 20 minutes, which is making testing difficult and the application possibly useless. Please let me know if there's anything else I could be doing to scrape that I don't know about.
I was using that API as well for an app I wrote a while ago. Looks like the creator removed it from Mashape. I just sent him an email to ask whether it'll be back up. I did find this scraper online. It only has a few endpoints but following the examples given you could easily add more. Let me know if you make any progress!
Edit: Looks like CBS requested it to be taken down. The ToS prohibits scraping:
[…] you agree not to do the following, or assist others to do the following:
Engage in unauthorized spidering, “scraping,” data mining or harvesting of Content, or use any other unauthorized automated means to gather data from or about the Services;
Though I was hoping for a Javascript way of doing this, the creator of the API also told me some info.
He says I was getting blocked for not having a User-Agent in the header, and that I should implement 429 handling, i.e. re-request with progressively longer pauses in between.
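For what it's worth, a minimal Java sketch of that approach might look like the following; the URL, User-Agent string and backoff schedule are placeholders/arbitrary:
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch: fetch a page with an explicit User-Agent and back off on 429 responses.
// The URL and User-Agent below are placeholders.
public class PoliteFetcher {

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://example.com/some-page"))
            .header("User-Agent", "Mozilla/5.0 (compatible; my-test-client)")
            .build();

        long pauseMillis = 2_000;
        for (int attempt = 1; attempt <= 5; attempt++) {
            HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

            if (response.statusCode() != 429) {
                System.out.println("Status: " + response.statusCode());
                break;
            }

            // Rate limited: wait progressively longer before retrying.
            System.out.println("Got 429, sleeping " + pauseMillis + " ms");
            Thread.sleep(pauseMillis);
            pauseMillis *= 2;
        }
    }
}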
A PHP plugin is available as well: http://datalinx.io/shop/metacritic-api/
I had to add a user agent like JCDJulian said and now it allows me to scrape. So for Ruby:
agent = Mechanize.new
agent.user_agent_alias = "Mac Firefox"
Then it stopped giving me the 403 Forbidden error.

Worklight api console request for Push

I need to get a list of users for a specific Push adapter/event source, and I'm trying to use the API console requests, whose documentation says the format is:
http://{hostname}:{port}/{context-root}/console/api/{api-context}/{action}/{parameters}
and I'm using:
http://192.168.1.106:10080/Module_07_04_nativeAPIForiOSPush/console/api/Push/get/PushAdapter/PushEventSource
to search the demo project, which has one subscriber. However, I get a 404 response from a browser request.
The first column of the docs is the "api-context", but it lists both "Push" and "Event Sources", which obviously seems invalid.
What is the correct format to find users subscribed to a push for a specific adapter/event source?
The WL server does not provide an API for listing subscribed users. By design you should maintain your own DB of subscribed users; this is why you have the onSubscribe and onUnsubscribe callbacks in the event source. As an alternative, you can look into WL's DB tables to find this info.
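As a rough sketch of that bookkeeping, independent of any Worklight API (the JDBC URL and table name are placeholders), the subscribe/unsubscribe handling could simply persist and remove rows in your own table:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Sketch: keep your own record of push subscriptions so you can list
// subscribed users per adapter/event source. JDBC URL and table are placeholders.
public class PushSubscriptionStore {

    private final String jdbcUrl;

    public PushSubscriptionStore(String jdbcUrl) {
        this.jdbcUrl = jdbcUrl;
    }

    // Call this from your subscribe handling.
    public void addSubscription(String userId, String adapter, String eventSource)
            throws SQLException {
        try (Connection conn = DriverManager.getConnection(jdbcUrl);
             PreparedStatement ps = conn.prepareStatement(
                 "INSERT INTO push_subscriptions (user_id, adapter, event_source) VALUES (?, ?, ?)")) {
            ps.setString(1, userId);
            ps.setString(2, adapter);
            ps.setString(3, eventSource);
            ps.executeUpdate();
        }
    }

    // Call this from your unsubscribe handling.
    public void removeSubscription(String userId, String adapter, String eventSource)
            throws SQLException {
        try (Connection conn = DriverManager.getConnection(jdbcUrl);
             PreparedStatement ps = conn.prepareStatement(
                 "DELETE FROM push_subscriptions WHERE user_id = ? AND adapter = ? AND event_source = ?")) {
            ps.setString(1, userId);
            ps.setString(2, adapter);
            ps.setString(3, eventSource);
            ps.executeUpdate();
        }
    }
}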