What log analysis and alerting systems are you using with Cloudflare, preferably on the Pro plan rather than Enterprise, given that Cloudflare only retains 4 hours' worth of access log data?
I am thinking along the lines of Loggly, Splunk, Sumo Logic, etc. I cannot seem to find anything that would work more or less out of the box, i.e. some form of existing integration.
I am aware of Cloudflare's work with Google on their Enterprise plan.
Thanks.
I am building a UWP app that targets the x86, x64, and ARM platforms. I want to replace the current implementation, which uses Azure for the backend (an App Service and a SQL Server database), because of the high price and because my Pay-As-You-Go subscription does not allow me to set a spending limit.
I thought about using a local database, but I don't know if that would work, since I want users to be able to have their data synced between, for example, a PC and a phone. I am also OK with giving up the idea of a structured database in favor of structured files (like XML) if I can find a way to keep them somewhere in the cloud (the client app could then read/write them directly, with no need for an App Service).
Are there any free, non-trial alternatives to Azure? Or should I look more into the file storage implementation? Thanks in advance.
Instead of Azure you could use another web hosting solution to publish your API. Azure also offers small free plans that might be sufficient.
An alternative would be to request access and store/sync data to the user's OneDrive. Every user logged in with a Microsoft account should have OneDrive storage available, so this is a good middle ground that is still free for you. A nice introduction to this can be found in this article.
UWP also offers a RoamingFolder where you can store small files that are synced across the user's devices. Unfortunately, this is less reliable, because you cannot control when the sync happens and cannot resolve conflicts.
I have successfully migrated to another cloud platform: Heroku. In my opinion, at least for small apps, Heroku offers the best solution both technology-wise and price-wise.
I am now able to have a web service hosted for free in the cloud, without worrying about traffic and the number of requests. Of course you can scale up if you want better performance, but you can start with a free plan. I also have a PostgreSQL database hosted in the cloud, likewise for free (up to 10,000 records; it is just $9/month to upgrade to 10 million). You will never find a free offer like that on Azure.
I had to learn a bit of Node.js (Heroku supports a lot of languages for backend services, but .NET is not one of them), but it was totally worth it!
Another option that is gaining more and more popularity is Firebase. I will certainly check that out for my future apps as well.
My team and I have built a site on Joomla (PHP, Apache, MySQL); it is basically an article publishing site with no user interaction, plus some JS modules.
The site is popular, and at peak times there are 2,500-3,500 requests hitting it; the site gets very slow, which makes for a bad user experience. Upload bandwidth is not the issue.
I need some suggestions on hardware requirements and on technologies I can use, such as Apache tuning, PHP modules, or any proxy/caching infrastructure.
Has anyone done a study or analysis on these topics? I can't find anything useful searching the web, and I'm all ears for any suggestion.
Thank you
We currently manage a few sites that are more or less similar. Here's the hardware from one of them:
16 GB RAM
500 GB Hard Disk (not SSD)
8 core processor
Note, however, that for high-traffic websites we do modify the Joomla core, and we also switch the tables from InnoDB to MyISAM (regardless of what others might think here, MyISAM is much faster than InnoDB). We only use first-level caching (we rarely use the "System - Cache" plugin).
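The InnoDB-to-MyISAM switch mentioned above is just one ALTER TABLE statement per table. A hedged Node.js sketch that generates those statements (the table names are examples, not the full Joomla schema; take a backup before running the output against your database):

```javascript
// Sketch: generate ALTER TABLE statements that switch a set of tables
// from their current engine to MyISAM. Table names below are examples.
const tables = ["jos_content", "jos_categories", "jos_menu"];

function alterStatements(names, engine = "MyISAM") {
  return names.map((name) => `ALTER TABLE \`${name}\` ENGINE=${engine};`);
}

alterStatements(tables).forEach((stmt) => console.log(stmt));
```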
I am new to web development, so my question might be silly.
I want to set up an Apache cluster. I have four hardware machines and I want to distribute the HTTP request load between them. Each machine currently runs Fedora.
For now it can be a simple load-balancing cluster without any recovery techniques (in case of a hardware failure on some server). And of course I need open-source software that is free for commercial use.
Any suggestions on what software/tutorials/books I should look at to learn how to set up an environment like this?
The O'Reilly book "Server Load Balancing" by Tony Bourke (now runs lbdigest.com) is the classic. Unfortunately it's a little dated. Maybe Tony will consider an update.
If you really want "no recovery techniques", basic DNS round-robin might work for you, but it's pretty crude. There's an open-source project called UltraMonkey, but I haven't had a chance to play with it. A lot of development has gone into commercial solutions that offer load balancing, high availability, etc., including the Coyote Point product (which I helped write). The appliance-based products are actually quite affordable today.
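DNS round-robin, as mentioned above, needs nothing more than multiple A records for the same hostname; resolvers rotate through them. A zone-file sketch (the addresses are placeholders for your four machines):

```
; Four A records for the same name: clients get spread across the machines.
www   IN  A  192.0.2.10
www   IN  A  192.0.2.11
www   IN  A  192.0.2.12
www   IN  A  192.0.2.13
```

The crudeness shows here: DNS has no health checks, so a dead machine keeps receiving its share of traffic until you remove its record and the TTL expires.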
I've got a Windows Server box running AD and a CentOS box running OpenLDAP in a mixed Windows/Linux network, and I want to keep the two in sync, preferably using free software or just some configuration changes. Does anyone know how to make these two authentication systems play nice? Any syncing would have to be done over SSL for security reasons.
I use a home-grown Perl script that syncs one-way from AD to LDAP over SSL. It is very custom and very rigid. I walked the same path six months back looking for sync tools, but none fit our needs. Actually, there isn't any that syncs without breaking something.
So my answer is: get a scripting guy and give him the requirements and a month's paycheck. Seriously, it is best done in-house rather than spending time looking for a tool and molding it to your needs.
Perl has good libraries and has worked very well for us. We migrated from OpenLDAP to 389-DS, which already has a windowsSync plugin (hope that tempts you to switch over). :)
I was wondering if anyone had some good general information on windows terminal services and how it works.
I'm wondering:
If a DLL is loaded into memory, is it available to all users or loaded again for each user? (Or does it depend on something else?)
A specification for an example server and how many concurrent users it supports under general use.
Any issues with them? I used one a while ago and we had sporadic freezes for a few seconds every so often. This could not be tracked down to a network issue; someone suggested it had something to do with remote printers attached to the system. It really annoyed users and seemed to happen most often when opening the Start menu.
Is 2008 a big upgrade over 2003 in terms of performance?
Terminal Services is essentially a server hosting multiple remote desktop sessions.
A DLL is not shared between sessions; each session loads it the way an ordinary process would.
You need special licensing on top of a standard Windows Server license.
I suggest removing all the printers when you use it (you can also disable printer redirection on the client side), but that's not a big issue.
2008 is far better in terms of performance and security, but you'll also need more recent RDP clients.