I want to take various server logs and send them to the Splunk server. Is that possible?
I assume it has something to do with rsyslog.conf ... but I have no idea how to do it?
Of course you can do that; this is what Splunk is all about.
Install the universal forwarder and configure its inputs.conf file to gather the data from your log files.
http://docs.splunk.com/Documentation/Splunk/6.2.0/Forwarding/Deploymentoverview
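A minimal sketch of what the forwarder configuration could look like, assuming you want to monitor /var/log/messages and send to an indexer at splunk.example.com:9997 (both the path and the host:port are assumptions, not values from the question):

# inputs.conf on the universal forwarder -- watch a log file or directory
[monitor:///var/log/messages]
sourcetype = syslog
index = main

# outputs.conf on the universal forwarder -- where to send the events
[tcpout:primary_indexers]
server = splunk.example.com:9997

After editing both files, restart the forwarder so the changes take effect.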
I'm currently working on a bot for some anarchy servers, and it's a lot more reliable for the bot to read the server log than to parse plain chat. I need a way to access that log, but I don't know how. Is there even a way to do this without admin access?
No, there is actually no way to view this. Because it's a file located on the server, you can only view it if you have (direct) file access to the server. The only way to get the latest.log file is to contact the server owner, but I think that no server owner of e.g. Hypixel will give you this log file.
You can make a Minecraft plugin in Java that acts as an API server; you can then make it read the file and return its contents. Of course, you would want to protect it with some type of authorization. You could use an HTTP server; an example would be this, which allows commands to be executed, but you could easily work off that.
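As a minimal sketch of that idea (not the linked example), the following plain Java program serves logs/latest.log over HTTP behind a shared token; the port, path and token are assumptions, and in a real plugin you would start it from the plugin's startup hook rather than from main:

// Hypothetical sketch: expose logs/latest.log over HTTP behind a shared token.
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Paths;

public class LogApiServer {
    private static final String TOKEN = "change-me"; // assumed shared secret

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/log", exchange -> {
            String auth = exchange.getRequestHeaders().getFirst("Authorization");
            if (!("Bearer " + TOKEN).equals(auth)) {
                exchange.sendResponseHeaders(401, -1); // reject without a body
                exchange.close();
                return;
            }
            byte[] body = Files.readAllBytes(Paths.get("logs", "latest.log"));
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start();
    }
}

Your bot would then fetch http://<server>:8080/log with the matching Authorization header and parse the text it gets back.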
As MCTzOCK mentioned, you can't view it without asking the server owner for permission.
I put a password_persist.txt file on the client side, because the ESB runs as a Windows service. But we are concerned about the security of this password, because everyone can have access to that file.
They don't want to use a password_tmp file every time.
Is there any solution for hiding it or hashing it?
Thanks in advance
We run our ESB as a Windows service with the Log On As account set to a user we set up specifically to run our ESB and other WSO2 products (e.g. domain\wso2user). We then use a data security product, in our case Vormetric, to lock that file down so that only the user we have set up to run our Windows service has access to it. All other users are denied access to viewing that file.
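As a rough, hypothetical illustration of the same lock-down idea using only built-in NTFS permissions (not the Vormetric setup described above; the file path and account name are assumptions):

rem Remove inherited permissions, then grant read access to the service account only.
icacls "C:\wso2\password_persist.txt" /inheritance:r
icacls "C:\wso2\password_persist.txt" /grant:r "domain\wso2user:(R)"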
We worked with our security team to arrive at this solution, so I would recommend speaking to yours and seeing if they have a solution in place for this type of scenario. My only experience with this is using Vormetric, but I am sure there are countless other products out there that provide similar functionality.
Joe
Another solution would be to take the base64-encoded keystore password, decode it, and generate the password-tmp file at server startup using the server startup script. In Linux you could include
echo d3NvMmNhcmJvbgo= | base64 -d | tee $CARBON_HOME/password-tmp
at the start of the elif [ "$CMD" = "start" ]; then block (see the sketch below). You should be able to do something similar for Windows as well. It would be better to go with encryption, though.
I am using IBM HTTP Server 6.1 / Apache 2.0.47. I would like to pull a specific piece of data out of all requests coming through the HTTP server and, if it exists, log the found data along with the target URL. It needs to be as efficient as possible.
Is a filter appropriate or a handler?
Does a filter/handler exist that I can configure and use as-is, or do I need to write something? How do I configure or write this?
Thanks.
You could use the mod_security Apache module, which has a good audit logging tool, SecAuditLog (it logs all headers), that you can trigger based on HTTP status. You'll also find fine-grained filters there that may fit your needs.
And do not hesitate to ask the Server Fault gurus about that.
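A minimal sketch of that approach, assuming ModSecurity 2.x and a hypothetical X-Custom-Id request header as the "specific piece of data" (the header name, rule id and log path are assumptions):

# Log any request carrying the header, together with the request URI.
SecRuleEngine On
SecAuditEngine RelevantOnly
SecAuditLog logs/modsec_audit.log
SecRule REQUEST_HEADERS:X-Custom-Id ".+" \
    "id:100001,phase:1,pass,log,msg:'custom id seen',logdata:'%{MATCHED_VAR} %{REQUEST_URI}'"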
Does anyone know of a script or program that can be used for backing up multiple websites?
Ideally, I would like to have it set up on a server where the backups will be stored.
I would like to be able to add a website's login info, and have it connect and create a zip file (or similar) that would then be sent back to the remote server to be saved as a backup, etc.
It would also need to be able to run as a cron job so everything is backed up at least once a day.
I can find PC-to-server backup tools that are similar, but no server-to-server remote backup scripts.
It would be heavily used, and it needs to have a GUI so the less techy can use it too.
Does anyone know of anything similar to what we need?
HTTrack website mirroring utility.
Wget and scripts
rsync and an FTP login (or SFTP for security); see the one-liner after this list
Git can be used for backup and has security features and networking ability.
7Zip can be called from the command line to create a zip file.
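For the rsync option, a minimal sketch of a pull over SSH (the host, user and paths are assumptions):

# Mirror a site's document root onto the backup server, preserving permissions.
rsync -avz -e ssh backup@www1.example.com:/var/www/site1/ /backups/site1/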
In any case you will need to implement either secure FTP (SSH secured) OR a password-secured upload form. If you feel clever you might use WebDAV.
Here's what I would do:
Put a backup generator script on each website (outputting a ZIP)
Protect its access with a .htpasswd file
On the backup server, have a cron script download all the backups and store them (a sketch follows below)
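A minimal sketch of that cron script, assuming each site exposes its ZIP at a URL protected by the .htpasswd credentials (the list file, URL layout and paths are all assumptions):

#!/bin/sh
# Pull each site's backup ZIP and store it under a dated name.
BACKUP_DIR=/var/backups/websites
DATE=$(date +%F)

# /etc/website-backups.list holds one "name url user password" entry per line.
while read -r NAME URL USER PASS; do
    mkdir -p "$BACKUP_DIR/$NAME"
    curl -fsS --user "$USER:$PASS" "$URL" -o "$BACKUP_DIR/$NAME/$NAME-$DATE.zip"
done < /etc/website-backups.list

A crontab entry such as 30 2 * * * /usr/local/bin/backup-websites.sh would then run it nightly.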
How do I get a status report of all files currently being uploaded via HTTP form-based file upload on an Apache server?
I don't believe you can do this with Apache itself. The upload looks like nothing more than a POST as far as Apache is concerned. There are modules and other servers that do special processing on uploads, so you may have some luck there. It would probably be easier to keep track of it in your application.
Check out SWFUpload; it uses Flash (in a nice way) to assist with managing multiple uploads.
There are events you can monitor to see how many files of a set have been uploaded.