Apache restricting bandwidth

I've been looking around the web without much success.
I'm running a local XAMPP (1.7.0) installation and a web app I developed that backs up my files and sends them to an FTP server. The problem is that Apache seems to be using only a limited amount of my bandwidth, and I'm not sure why this is happening.
It usually doesn't get above 64KB/s, but I know my current broadband will allow over 1MB/s, which is a massive difference. Also, if I use my FTP program to log in to the server, it will let me download in excess of 500KB/s. Does anyone know how I can get around this? My backups are very big files and take hours to copy at 64KB/s.
Thanks Mic

You seem to be confusing download and upload speeds: "if i use my FTP program to login to the server it will let me download in excess of 500KB/s."
Your backup is an upload, and most home connections are asymmetric, with a much smaller upstream cap than downstream. Are you perhaps on a 1Mb/64Kb ADSL or cable connection?

Related

Disable the execute permission over SMB protocol on Synology NAS

I run a little NAS with access for a lot of people who aren't great with the basics.
To avoid saturating the NAS, I need a rule that prevents non-administrator users from executing files on the shared folder over the SMB protocol. It may seem a little restrictive, but with 10 people opening heavy files and working on them directly on the NAS, it gets really slow and often shuts down.
I'm running a Synology DS214se on DSM 7.1.1, and if anybody can help I'll be glad.
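For reference, plain Samba (which Synology's SMB service is built on) has a parameter that ties SMB "execute" rights to the unix execute bit. The share name and path below are placeholders, and whether DSM 7 exposes this setting directly is not confirmed here; this is a sketch of the underlying smb.conf mechanism only:

```
# Hedged sketch of a plain smb.conf share section -- not Synology-specific;
# the share name and path are placeholders.
[shared]
   path = /volume1/shared
   # With this parameter off, a file is only executable over SMB when its
   # unix execute bit is set, so clearing execute bits on the share's files
   # blocks launching programs directly from the share.
   acl allow execute always = no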

Continuously/automatically saving access logs for Apache Web Server on a Raspberry Pi

Does anybody know how to have the access logs for an Apache server on a Raspberry Pi saved before they are overwritten (or if that is even possible)?
I am using it to see who has accessed a website and need a way to store it regularly without losing any of the data.
I am new both to Raspberry Pi and Apache, so any help is much appreciated.
I have tried googling to find out if it is even possible, but I haven't had any luck so far. I found out how to configure the log files and all that, just not how to save the content of the file.
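On Debian-based systems (which includes Raspberry Pi OS) this is normally handled by logrotate rather than by Apache itself: old logs are renamed, compressed, and kept for a configurable number of cycles before deletion. A sketch of a rotation config, assuming the default Debian paths (adjust `rotate` to keep history as long as you need):

```
# /etc/logrotate.d/apache2 -- hedged sketch; paths assume a Debian-based
# install of Apache on the Pi.
/var/log/apache2/*.log {
    weekly
    rotate 52          # keep a year of compressed history
    compress
    delaycompress      # don't compress the most recent rotated file yet
    missingok
    notifempty
    create 640 root adm
    sharedscripts
    postrotate
        systemctl reload apache2 > /dev/null 2>&1 || true
    endscript
}
```

logrotate runs daily from cron or a systemd timer by default; `logrotate -d /etc/logrotate.d/apache2` dry-runs the config so you can check it without touching the logs.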

Profile a web request end to end

I just recently moved an app from a single-stack Linode configuration to a full Amazon configuration: load balancer, multiple app servers, and an RDS database instance.
My latency in the process went up by around 200-300ms. I understand that having the app server and database server on separate machines will increase latency somewhat.
How do I go about profiling a typical request to see where all the latency comes from, preferably with a nice breakdown? This will allow me to optimize our weaknesses. At the end of the day I want to be back around 100-150ms per request.
This particular project is a CodeIgniter project running on top of Apache and PHP-FPM.
I've had very good success using Blackfire Profiler (https://blackfire.io/ -- "Fire up your PHP Apps Performance").
It's easy to set up and currently requires the Chrome browser, but it will give you a breakdown of your entire application. It also supports SAPI and CLI, which is quite nice.
It's at least a tool to help you identify where you might have performance issues.
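Before reaching for a full profiler, curl's `--write-out` timing variables give a quick per-phase breakdown of a single request (DNS lookup, TCP connect, time to first byte, total), which separates network latency from app-server time. `TARGET` below is a placeholder pointed at a local file so the sketch runs anywhere; swap in your app's URL behind the load balancer:

```shell
# Hedged sketch: break one request into timing phases with curl.
# TARGET is a placeholder; for real measurements use something like
# TARGET="https://myapp.example.com/".
TARGET="file:///dev/null"   # local target so this example runs without a network

curl -s -o /dev/null "$TARGET" -w 'namelookup:    %{time_namelookup}s
connect:       %{time_connect}s
starttransfer: %{time_starttransfer}s
total:         %{time_total}s
' | tee /tmp/request-timing.txt
```

Against a real HTTPS endpoint, `%{time_appconnect}` (TLS handshake) is also worth adding; a large gap between `connect` and `starttransfer` usually points at the app/database rather than the network.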

Apache hangs/times out when backing up website with gzip or zip?

I'm running some websites on a dedicated Ubuntu web server. If I'm remembering correctly, it has 8 cores, 16GB of memory, and runs 64-bit Ubuntu. Content and files are delivered quickly to web browsers. Everything seems like a dream... until I run gzip or zip to back up an 8.6GB website.
When running gzip or zip, Apache stops delivering content. Internal server error messages are delivered until the compression process is complete. During the process, I can log in via ssh without delays and run the top command. I can see that the zip process is taking about 50% CPU (I'm guessing that's 50% of a single core, not all 8?).
At first I thought this could be a log issue, with Apache logs growing out of control and not wanting to be messed with. Log files are under 5MB though, and are rotated when they hit 5MB. Another current thought is that Apache only wants to run on one CPU and lets any other process take the lead. Not sure where to look to address that yet.
Any thoughts on how to troubleshoot this issue? Taking all my sites down while backups occur is not an option, and I can't seem to reproduce this on my local machines (granted, different hardware and configuration). My hope is that this question is not too vague. I'm happy to provide additional details as needed.
Thanks for your brains in advance!
I'd suggest running your backup script under the "ionice" command. It will help prevent starving httpd of I/O.
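A minimal sketch of that suggestion, with placeholder paths: the compression runs in ionice's "idle" I/O class (and under nice for CPU), so it only gets disk time when nothing else, such as httpd, wants it.

```shell
# Hedged sketch: back up a site under ionice/nice so Apache keeps priority.
# Paths are placeholders for demonstration.
mkdir -p /tmp/site && echo "hello" > /tmp/site/index.html

# -c 3 = idle scheduling class; ionice ships with util-linux, but fall back
# to plain nice if it isn't available.
if command -v ionice >/dev/null 2>&1; then
    ionice -c 3 nice -n 19 tar -czf /tmp/site-backup.tar.gz -C /tmp site
else
    nice -n 19 tar -czf /tmp/site-backup.tar.gz -C /tmp site
fi

ls -lh /tmp/site-backup.tar.gz
```

Note that the idle class helps with I/O starvation specifically; if the real bottleneck turns out to be something else (e.g. PHP sessions or locks held during the backup), ionice alone won't fix it.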

Open Source Web Service/WCF media streamer

Does anyone know of an open source web service/WCF service that can stream media content to clients? In particular, I'm looking for something that could access my music collection and stream it to a client (a browser, a Windows Mobile app, or even an iPhone application).
I guess it would have to be WCF-based, as I'm not sure plain web services do streaming really well. Also, Windows Media Streaming Services is not the best way to go, as the service should run on a Vista/XP machine (preferably).
If not, does anyone know the best way to start going about creating something like this? I'm not sure where to start with this one, although I can see many, many uses for such a service!
Even though it's not open source, Windows Server 2008 has a Streaming Media role that will do what you ask. Of course, you'll need to have a server to put it on.
I tried Orb and it is quite good, apart from the fact that it hijacked my tuner card so Media Center would no longer work. However, I'm going to try to create a home-grown version.
Orb (www.orb.com) will stream your media to just about anything with a web browser. I've been running it on an XP virtual machine for about a year. I love being able to stream my entire media collection to my phone while I'm working at a client's site.
While it isn't open source, it is free and relatively well supported. One of the best features is that the architecture is set up so that there are no special requirements for your firewall -- it just works.