AWStats SQL Tool?

Has anyone ever heard of a tool/script/etc. that allows importing an AWStats-created logfile (e.g. as text or XML) into a SQL database?
I just want to figure out whether I really have to write a parser script myself...
thanks.

I guess you need the compiled information created by AWStats rather than the raw data from the initial server (e.g. Apache). The Drupal.org folk seem to have done a little work on parsing the AWStats output; is that any help?
If the raw data from e.g. Apache is enough, you can import that into the database; see, for example:
Writing Apache's Logs to MySQL
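To illustrate that second approach, here is a minimal Python sketch that parses Apache's "combined" log format and loads it into SQLite. The table name and columns are made up for the example; importing into MySQL would follow the same shape with a different driver.

```python
import re
import sqlite3

# Apache "combined" log line, e.g.:
# 127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326 "-" "Mozilla/4.08"
LINE_RE = re.compile(
    r'(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)'
)

def import_log(log_path, db_path="access_log.db"):
    """Parse an Apache access log and insert each line into a SQLite table."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS access_log "
        "(host TEXT, user TEXT, time TEXT, request TEXT, status INTEGER, size INTEGER)"
    )
    with open(log_path) as f:
        for line in f:
            m = LINE_RE.match(line)
            if not m:
                continue  # skip lines that don't match the expected format
            size = 0 if m.group("size") == "-" else int(m.group("size"))
            conn.execute(
                "INSERT INTO access_log VALUES (?, ?, ?, ?, ?, ?)",
                (m.group("host"), m.group("user"), m.group("time"),
                 m.group("request"), int(m.group("status")), size),
            )
    conn.commit()
    conn.close()
```

Once the data is in a table, the per-page and per-visitor reports AWStats computes become ordinary `GROUP BY` queries.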

Related

Parse Server: logging specific functions

Is there a way to turn off logging for a specific cloud function, but not for all of them? My issue is that the result of my cloud function contains a base64 image, and Parse Server by default outputs the base64 string to the log, which makes my daily log files extremely large and hard to maintain.
Setting logLevel = 'warn' instead of 'info' solves this problem, but then I lose the I/O information of the other cloud functions, which I also need.
thanks.
Can you open an issue in the repo with the specifics of your use case? We'll probably be able to slim down the logging for long base64 strings.

What is the significance of data-config.xml file in Solr?

And when should I use it? How is it configured? Can anyone please explain in detail?
The data-config.xml file is an example configuration file for how to use the DataImportHandler in Solr. It's one way of getting data into Solr, allowing one of the servers to connect through JDBC (or through a few other plugins) to a database server or a set of files and import them into Solr.
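For illustration, a minimal data-config.xml wiring a JDBC source to Solr fields might look like the following; the database URL, credentials, table, and field names are all hypothetical, and the file is referenced from a DataImportHandler request handler in solrconfig.xml:

```xml
<dataConfig>
  <dataSource type="JdbcDataSource"
              driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://localhost/mydb"
              user="db_user" password="db_pass"/>
  <document>
    <!-- one entity per SQL query; each <field> maps a result column to a Solr field -->
    <entity name="product" query="SELECT id, name, price FROM products">
      <field column="id"    name="id"/>
      <field column="name"  name="name"/>
      <field column="price" name="price"/>
    </entity>
  </document>
</dataConfig>
```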
DIH has a few issues (for example the non-distributed way it works), so it's usually suggested to write the indexing code yourself (and POST it to Solr from a suitable client, such as SolrJ, Solarium, SolrClient, MySolr, etc.)
It has been mentioned that the DIH functionality really should be moved into a separate application, but that hasn't happened yet as far as I know.
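As a rough sketch of that do-it-yourself route, the client just needs to POST JSON to Solr's update endpoint. The core name, host, and document fields below are hypothetical; this assumes a standard Solr JSON update handler.

```python
import json
import urllib.request

def build_update_request(docs, core="mycore", host="http://localhost:8983"):
    """Build the URL and JSON body for Solr's update endpoint."""
    url = f"{host}/solr/{core}/update?commit=true"
    body = json.dumps(docs).encode("utf-8")
    return url, body

def index_docs(docs, core="mycore", host="http://localhost:8983"):
    """Send a list of documents to Solr. Requires a running Solr instance."""
    url, body = build_update_request(docs, core, host)
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

In practice you would use a proper client (SolrJ on the JVM, Solarium in PHP, etc.), which adds batching, retries, and error handling on top of the same HTTP call.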

Small SQL Database for logs?

I'm thinking about using a DB for my logs instead of a normal txt file. Why? In a DB I could handle them much more easily than with a txt file. Actually, I don't have a big log txt; there are some exceptions and, for every single day, user logins and which client uploaded which file where. But even here a DB would make sense, no? What free (for non-commercial and commercial use) small DBs should I try? I could use a "real" DB like PostgreSQL, or NoSQL with a simple XML DB like BaseX; that's what I thought. Any suggestions? Thank you.
Edit: Oh sorry, I forgot: I'm using .NET, but maybe that's not so important.
What will you do with your logging information? If you are going to do regular complex analysis work on it (performance, trending, etc.) then a database would be very useful. If you just need a place to dump "this happened" type messages that will be used infrequently at best (post-crash analysis and the like), a simple text or XML file should be more than sufficient. (If you do that, cycle the files every day or week -- rename the current file, say with the date/time, and start a new "current" log file.)
Use SQLite. Really small footprint, cross-platform, a single file for the whole DB, and serverless (http://www.sqlite.org). Give it a try.
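To show how small that can be, here is a sketch of the idea using Python's built-in sqlite3 module; the table layout is made up for the example, and on .NET a SQLite provider such as System.Data.SQLite would look much the same:

```python
import sqlite3
from datetime import datetime, timezone

def open_log_db(path="app_log.db"):
    """Open (or create) a single-file SQLite log database."""
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS log "
        "(ts TEXT, level TEXT, user TEXT, message TEXT)"
    )
    return conn

def log(conn, level, user, message):
    """Append one log record with a UTC timestamp."""
    conn.execute(
        "INSERT INTO log VALUES (?, ?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), level, user, message),
    )
    conn.commit()

# Querying is then plain SQL, e.g. everything one user uploaded:
# conn.execute("SELECT ts, message FROM log WHERE user = ?", ("alice",))
```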
Using the Package Manager, you can install SqlServerCompact, which works within your solution.
Use the Package Manager Console and type the following command:
Install-Package SqlServerCompact

Dijit.tree - Listing a directory on webserver

I would like to know if there is an easy way to list all the files in a directory on a webserver using Dojo's dijit.tree.
I suppose I could populate a datastore of the files using PHP, but that seems like a major pain and something that could be done much easier, I just can't think of anything else.
Any ideas?
You'll need to find a way to get the data from the server; Dojo can only handle things client-side.
Check out dojox/data/demos/demo_FileStore_dojotree.html; that's one specific way, requiring PHP support on your server.
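Whatever server-side language you use, the endpoint only has to return the directory listing in a shape your client-side store can consume. A minimal sketch in Python (the JSON shape here is generic and hypothetical; the exact format depends on which Dojo store you pair with the tree):

```python
import json
import os

def list_dir(root):
    """Return a JSON-serializable listing of `root`, one entry per file/dir."""
    items = []
    for name in sorted(os.listdir(root)):
        full = os.path.join(root, name)
        items.append({
            "name": name,
            "isDir": os.path.isdir(full),
        })
    return items

# Behind a web endpoint you would emit it as JSON, e.g.:
# print(json.dumps(list_dir("/var/www/files")))
```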

Is there a web log analyser that can understand processing time and parse querystring?

Does anyone know of a web log analyser that can both report on the 'processing time' field that Apache can store (%D) and parse querystrings intelligently?
I've looked into some of the usual suspects, e.g. AWStats and Webalizer, but none of those I've looked at seems to offer either of these features.
Ideally, you'd be able to report on specific querystring parameters rather than simple 'page' requests, e.g. if my server showed hits to:
/someurl?blah=X&whatever=Y
/someurl?whatever=Y&blah=Z
I'd like it to be able to parse that intelligently, so that if I ask for a report on 'whatever=Y', both URLs would be grouped together, whereas if I report on 'blah=X', they would be counted separately.
Any suggestions of off-the-shelf tools that can do this? FOSS or otherwise.
Yes I realise that I can write some awk or sed scripts to parse this sort of thing myself but I'm looking for someone to have done that hard work for me and present it in a nice chart or what have you.
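For reference, absent an off-the-shelf tool, the grouping described above is straightforward with Python's standard library; this is a minimal illustration of the counting logic, not a full report generator:

```python
from collections import Counter
from urllib.parse import urlsplit, parse_qs

def count_by_param(urls, param):
    """Count hits grouped by the value of one querystring parameter."""
    counts = Counter()
    for url in urls:
        qs = parse_qs(urlsplit(url).query)
        for value in qs.get(param, []):
            counts[value] += 1
    return counts

hits = ["/someurl?blah=X&whatever=Y", "/someurl?whatever=Y&blah=Z"]
# Grouping on "whatever" merges both hits; grouping on "blah" keeps them apart.
```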
LogParser from Microsoft is a very versatile tool that can be used to parse server logs. It supports a SQL syntax, so you can write powerful queries. The only downside is that it is Windows-only at this point.