Where does Apache Flume log its error messages?

I'm new to Apache Flume. I just want to know where Apache Flume logs its error messages and metadata information.
I searched the Apache Flume directory for captured error logs, but I didn't see any folder with the name log.
Could anyone help me with this? How do I configure logging in Apache Flume?

Flume logs are in /var/log/flume-ng. This location is specified in the logging configuration file /etc/flume-ng/conf/log4j.properties.
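For reference, the relevant lines in that log4j.properties file look roughly like the following; the level and paths here are illustrative, the packaged defaults on your install may differ:

    # root logger level and appender (console, LOGFILE, DAILY, ...)
    flume.root.logger=INFO,LOGFILE
    # directory and file name the LOGFILE appender writes to
    flume.log.dir=/var/log/flume-ng
    flume.log.file=flume.log

Changing flume.log.dir and flume.log.file (or the root logger level) and restarting the agent controls where and how verbosely Flume writes its log.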

Dmitry is right, the log file location is specified in FLUME_HOME/conf/log4j.properties.
I just wanted to add that, in Apache Flume 1.5, the default log location is:
FLUME_HOME/logs/flume.log
The log file may not be generated if Flume initialization fails - this usually means that Flume couldn't find Java, the configuration files, etc.
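If you start Flume from the command line rather than as a packaged service, you can also override the logger per run, which helps when debugging exactly those initialization failures. A typical invocation (the agent name and config file below are placeholders):

    bin/flume-ng agent --conf conf --conf-file conf/example.conf --name a1 \
        -Dflume.root.logger=DEBUG,console

With -Dflume.root.logger=DEBUG,console the output goes straight to the terminal instead of FLUME_HOME/logs/flume.log, so errors are visible even when the log file is never created.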

Related

Flume sink to Splunk?

Has anyone had success sinking data from Flume to Splunk?
I've tried the Thrift and Avro Flume sinks, but they have issues: the formats aren't great for Splunk, and Flume keeps retrying events over and over after they've already been sunk.
I'm looking into the Flume HTTP sink pointed at Splunk's HEC, but I can't see how to set the HEC token in the header. Has anyone configured the HEC token header for the Flume HTTP sink?
I'm considering just doing a file sink that is forwarded to Splunk, but would like to avoid the temporary files if possible.
Advice?
I ended up just making a rolling file sink and copying those files over to a directory monitored by a Splunk forwarder. Not ideal or performant, but good enough for our use case.
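For anyone taking the same route, a minimal sketch of that setup could look like the following; the agent/sink names, directory, and Splunk index are assumptions, not values from the poster's setup.

Flume agent configuration:

    # roll events into files in a local directory every 60 seconds
    a1.sinks.k1.type = file_roll
    a1.sinks.k1.sink.directory = /var/log/flume-events
    a1.sinks.k1.sink.rollInterval = 60
    a1.sinks.k1.channel = c1

Splunk Universal Forwarder inputs.conf:

    # monitor the rolled files and ship them to the indexer
    [monitor:///var/log/flume-events]
    sourcetype = flume_events
    index = main

The Splunk forwarder then takes care of shipping the rolled files to the indexer.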

AWS ebextensions creating httpd.conf with LogFormat change as described in the AWS documentation, but Apache logs are printing twice

I am using AWS Elastic Beanstalk for prod deployment.
I want to add the response time for each request to the Apache access log.
As per the AWS documentation, I created a configuration file in the source bundle at .ebextensions/httpd/conf/httpd.conf (I took the default file and modified the LogFormat; a sketch of that kind of change is below).
The response time does get logged, but each log statement is printed twice.
Thanks in advance.
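For context, a LogFormat change of the kind described - not the poster's exact file - could look like this; %D is the request service time in microseconds (%T gives whole seconds):

    # add %D (request time in microseconds) to the access log format
    LogFormat "%h %l %u %t \"%r\" %>s %b %D" main
    CustomLog "logs/access_log" main

Note that Apache writes one access-log entry per matching CustomLog directive, so if the modified httpd.conf declares a CustomLog for a file that another included conf already logs to, every request appears twice.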

Logs not updating in Kibana automatically using Filebeat

I have installed ELK and Filebeat on two different machines (CentOS).
My log file is on the Filebeat machine.
Every time I want the logs to update in Kibana, I need to run the following command: "systemctl restart filebeat".
How can I avoid this step, so that Filebeat reads the log file continuously and shows it in Kibana?
You're probably missing the scan_frequency option in the Filebeat config file.
https://www.elastic.co/guide/en/beats/filebeat/current/configuration-filebeat-options.html
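A minimal filebeat.yml illustrating that option might look like the following; the paths, host, and 5s value are assumptions, and older Filebeat releases use filebeat.prospectors instead of filebeat.inputs:

    filebeat.inputs:
      - type: log
        paths:
          - /var/log/myapp/*.log   # log file on the Filebeat machine (assumed path)
        scan_frequency: 5s         # how often Filebeat checks the paths for new files

    output.logstash:
      hosts: ["elk-host:5044"]     # Logstash on the ELK machine (assumed host/port)

After editing the config, restart Filebeat once ("systemctl restart filebeat"); from then on the harvester should pick up new lines as they are appended, without further restarts.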

Script for Monitoring Apache log file behaviour

I'm looking for a script which monitors the Apache log file and alerts the admin if a "GET 500" string appears, or if the Apache log file stops getting new entries from a particular range of IP addresses.
You can use M/Monit to monitor Apache logs (and any other logs you want). You can install and configure it on Ubuntu by following the tutorial How to monitor Symfony and Apache logs with M/Monit; just ignore the Symfony2 part and replace the if match "error" then alert rule with one matching your "GET 500" string.
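Adapted to this question, the relevant part of the Monit control file might look roughly like this; the log path and alert address are assumptions, and Monit also needs a mail server configured for alerts to actually be sent:

    set alert admin@example.com                  # who receives the alerts (assumed address)

    check file apache_access_log with path /var/log/apache2/access.log
        if match "GET 500" then alert            # pattern is a regex, so it can be extended
                                                 # to include the IP range of interest
        if timestamp > 10 minutes then alert     # file has stopped receiving new entries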

WAMP Server 3.0.0 cannot open Apache conf file

I've re-installed WAMP in order to put it into a different directory. This was mainly so everything is automatically backed up.
MySQL starts OK, but Apache fails with the following message in the event viewer:
The Apache service named reported the following error:
httpd.exe: Could not open configuration file bin/conf/httpd.conf: The system cannot find the path specified.
The file is in the correct place (C:\googledrive\wamp64\bin\apache\apache2.4.17\conf) and there aren't any other stray httpd.conf files that it might be picking up on the path or anywhere else.
Any ideas?
This turned out to be a permissions problem. It seems that the system account didn't have access to the directory containing the config file, but I guess it must have had enough access to start up Apache and give that helpful (not) message!
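If someone hits the same thing, one way to grant read access is with icacls from an elevated command prompt; this assumes the Apache service runs as Local System - adjust the account if your wampapache64 service runs as something else:

    icacls "C:\googledrive\wamp64" /grant "NT AUTHORITY\SYSTEM:(OI)(CI)RX" /T

(OI)(CI)RX grants read and execute permissions that are inherited by subfolders and files, and /T applies the change to the existing contents of the tree as well.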