Tracking the Piwik installation itself

I am trying to track the Piwik installation itself (not Piwik campaigns), but so far I haven't had any luck.
Here is a list of a few things I did:
In config.ini.php, I have added the following lines:
[Debug]
track_visits_inside_piwik_ui = 999
where 999 is the site ID of the site I want to track.
It does not generate any error and does not track anything.
I modified piwik/plugins/CoreHome/templates/piwik_tag.tpl:
{if $piwikUrl == 'http://demo.piwik.org/' || $debugTrackVisitsInsidePiwikUI || 'http://yourDomain/piwik/'}
and changed var piwikTracker = Piwik.getTracker("piwik.php", 2);
This also does not throw an error, but it does not track anything either.
I have tried to put the Piwik tracking code into the individual pages, but there is no HTML where I can add it.
Any help will be very much appreciated.
Anjali.

With internal tracking you can only track visits to idSite = 1. The setting you are using is invalid, since track_visits_inside_piwik_ui is a boolean flag that only accepts the values 0 and 1.
[Debug]
track_visits_inside_piwik_ui = 1
This is the only valid setting. Note that this is a debugging/development feature and should not be used in production.

Related

Issue with Universal Forwarder forwarding logs to index

I have installed the Splunk Universal Forwarder on Windows. There is one static log file (JSON) on the system that needs to be monitored, and I have configured it in the inputs.conf file.
I see only the System/Application and Security logs being sent to the indexer, whereas the static log file is not.
I ran "splunk list inputstatus" and checked:
C:\Users\Administrator\Downloads\test\test.json
file position = 75256
file size = 75256
percent = 100.00
type = finished reading
So this means the file is being read properly.
What could be the issue that I don't see the test.json logs on the Splunk side? I tried checking index=_internal on the indexer but was not able to figure out what is causing the issue, and I checked a few blogs on the Internet as well. Can anyone please help with this?
inputs.conf stanza:
[monitor://C:\Users\Administrator\Downloads\data test\test.json]
disabled = 0
index = test_index
sourcetype = test_data

Scrapy spidermon exceptions

I'm trying to set up the basic suite of Spidermon monitors as described here. I did a quick Google search and also found this. So I made a quick monitors.py and copy-pasted the code in there.
I then proceeded to do this:
SPIDERMON_ENABLED = True
SPIDERMON_SPIDER_CLOSE_MONITORS = (
'spidermon.contrib.scrapy.monitors.SpiderCloseMonitorSuite',
)
in my settings.py in the scrapy project.
It keeps raising this error:
spidermon.exceptions.NotConfigured: You should specify a minimum number of items to check against
Which I believe I've done (SPIDERMON_MIN_ITEMS = 10 at the top of the file).
What am I doing wrong? I just want to set up the pre-defined monitors and then optimize them later.
Spidermon couldn't find a valid value for SPIDERMON_MIN_ITEMS in the settings. This must be an integer greater than zero, otherwise it throws the error you described. Setting SPIDERMON_ADD_FIELD_COVERAGE is also mandatory in order to use all the monitors available in this MonitorSuite.
In order to run the built-in close MonitorSuite SpiderCloseMonitorSuite from the Spidermon project, confirm that the settings.py file (located in the root directory of your Scrapy project) has the variables below:
EXTENSIONS = {
'spidermon.contrib.scrapy.extensions.Spidermon': 500,
}
SPIDERMON_ENABLED = True
SPIDERMON_MIN_ITEMS = 10
SPIDERMON_ADD_FIELD_COVERAGE = True
SPIDERMON_SPIDER_CLOSE_MONITORS = (
'spidermon.contrib.scrapy.monitors.SpiderCloseMonitorSuite',
)
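If you only want a subset of the pre-defined checks while you tune things (for example, to sidestep the SPIDERMON_ADD_FIELD_COVERAGE requirement for now), a rough sketch of a narrower close suite is shown below. The module path myproject.monitors and the exact built-in monitor classes are assumptions based on Spidermon's documented API, so verify the names against your installed version.
# myproject/monitors.py (assumed location inside your Scrapy project package)
from spidermon import MonitorSuite
from spidermon.contrib.scrapy.monitors import (
    ErrorCountMonitor,  # fails the suite if the spider logged errors
    ItemCountMonitor,   # fails the suite if fewer than SPIDERMON_MIN_ITEMS items were scraped
)

class BasicSpiderCloseMonitorSuite(MonitorSuite):
    # Only the checks needed for now; more built-in monitors can be added later.
    monitors = [
        ItemCountMonitor,
        ErrorCountMonitor,
    ]
You would then point SPIDERMON_SPIDER_CLOSE_MONITORS at 'myproject.monitors.BasicSpiderCloseMonitorSuite' instead of the full built-in suite.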

Issues pulling change log using python

I am trying to query and pull changelog details using Python.
The below code returns the list of issues in the project.
issued = jira.search_issues('project= proj_a', maxResults=5)
for issue in issued:
    print(issue)
I am trying to pass the values obtained in the loop above:
issues = jira.issue(issue,expand='changelog')
changelog = issues.changelog
projects = jira.project(project)
I get the below error on trying the above:
JIRAError: JiraError HTTP 404 url: https://abc.atlassian.net/rest/api/2/issue/issue?expand=changelog
text: Issue does not exist or you do not have permission to see it.
Could anyone advise where I am going wrong, or what permissions I need?
Please note that if I pass a specific issue_id in the above code it works just fine, but I am trying to pass a list of issue_ids.
You can already receive all the changelog data via the search_issues() method, so you don't have to fetch the changelog by iterating over each issue and making another API call per issue. Check out the code below for examples of how to work with the changelog.
issues = jira.search_issues('project= proj_a', maxResults=5, expand='changelog')
for issue in issues:
    print(f"Changes from issue: {issue.key} {issue.fields.summary}")
    print(f"Number of Changelog entries found: {issue.changelog.total}")  # number of changelog entries (careful, each entry can have multiple field changes)
    for history in issue.changelog.histories:
        print(f"Author: {history.author}")  # person who made the change
        print(f"Timestamp: {history.created}")  # when did the change happen?
        print("\nListing all items that changed:")
        for item in history.items:
            print(f"Field name: {item.field}")  # field to which the change happened
            print(f"Changed to: {item.toString}")  # new value; item.to might be better in some cases depending on your needs
            print(f"Changed from: {item.fromString}")  # old value; item.from might be better in some cases depending on your needs
        print()
    print()
Just to explain what you did wrong before when iterating over each issue: you have to use the issue.key, not the issue-resource itself. When you simply pass the issue, it won't be handled correctly as a parameter in jira.issue(). Instead, pass issue.key:
for issue in issues:
    print(issue.key)
    myIssue = jira.issue(issue.key, expand='changelog')
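If you want the change details as data rather than printed output, here is a minimal sketch that flattens everything into a list of dicts, using only the attributes shown above (the dict keys are just illustrative names):
# Collect every field change into a flat list of dicts for further processing.
changes = []
issues = jira.search_issues('project= proj_a', maxResults=5, expand='changelog')
for issue in issues:
    for history in issue.changelog.histories:
        for item in history.items:
            changes.append({
                "issue": issue.key,
                "author": str(history.author),
                "created": history.created,
                "field": item.field,
                "from": item.fromString,
                "to": item.toString,
            })
print(f"Collected {len(changes)} field changes")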

DokuWiki LDAP can't see any groups

We have just changed our domain after a protracted name change (the rename actually happened two years ago!) and our DokuWiki installation has stopped being able to see any groups or memberships.
The config has been updated to reflect the new server and DCs, and login is working correctly; it is only the groups that aren't working.
$conf['auth']['ldap']['server'] = 'ldap://MYDC.mydomain.co.uk:389';
$conf['auth']['ldap']['binddn'] = '%{user}@mydomain.co.uk';
$conf['auth']['ldap']['usertree'] = 'dc=mydomain,dc=co,dc=uk';
$conf['auth']['ldap']['userfilter'] = '(userPrincipalName=%{user}@mydomain.co.uk)';
$conf['auth']['ldap']['mapping']['name'] = 'displayname';
$conf['auth']['ldap']['mapping']['grps'] = 'array(\'memberof\' => \'/CN=(.+?),/i\')';
$conf['auth']['ldap']['grouptree'] = 'dc=mydomain,dc=co,dc=uk';
$conf['auth']['ldap']['groupfilter'] = '(&(cn=*)(Member=%{dn})(objectClass=group))';
$conf['auth']['ldap']['referrals'] = '0';
$conf['auth']['ldap']['version'] = '3';
$conf['auth']['ldap']['debug'] = 1;
Obviously I have edited the domain name there, but for the life of me I can't see what's wrong here. It all worked fine yesterday on the old domain.
I should also state that this is an old version of DokuWiki that for various reasons I can't actually update.
The debug setting gives me an "ldap search: success" line, but if I add "?do=check" onto any URL within the system I get "You are part of the groups" ... and nothing; it can't see any groups.
It's a massive pain as we have a pretty intricate ACL setup for the site, so it's not like I can just throw it open to all.
If anyone has any suggestions, no matter how obvious, please pass them on.
Solved it by changing the DokuWiki authentication plugin that was used; 'authad' is simpler to use and just works for what I'm doing.
As a side bonus it also means that I have finally been able to get the install upgraded to the current version.

API Client 1.3 (rev89) - Error 500 "No individual errors" when using Fields Filter

Today (10:00 AM GMT+2) code deployed in a production environment started throwing an increasing number of errors while requesting file lists from a Google Drive folder; the error was always 500 "No Individual Errors".
After two hours, all the requests failed.
The code regarding the file list request is the following:
'Search for a specific file name
oListReq.Q = "mimeType = 'application/vnd.google-apps.folder' and title = '" + ParentFolder + "' and trashed=false"
oListReq.Fields = "items/id" 'MSO - 20130621 - only ID is needed
oListReq.MaxResults = 10 'Max 10 results (more than enough; I expect only 1)
'Get the results
oFileList = oListReq.Fetch()
Testing the same requests with the API Explorer, there is no problem and only the ID is returned.
Going step by step to identify the problem, it turns out that all requests with the Fields property specified generated a 500 error (other requests in the code use "items(id,alternateLink)", but the result is the same as for the code above).
I temporarily fixed the code by commenting out those lines.
Could you please investigate why these filters are not working with the .NET Client Library anymore?
Sorry about that. This error has been reproduced and Google is investigating it. For now, please turn off the fields filter.
It seems the issue is now fixed. We had the same issue with one of our production applications and had to produce a hotfix, but I performed a test a few minutes ago and it looks like it works again.