How do I get Apache to access another drive in Ubuntu 15.04?

Basically I've nearly run out of space on the drive that Apache is installed on (my server is running Ubuntu desktop 15.04, 32-bit). I've tried to keep everything in the /www/html/ folder, but there's no more room. However, I have a spare drive that I can store stuff on. I've searched for ages and everyone's saying "you can't access other drives like that, be thankful you can't because -insert security issue-", which seems absurd, since I doubt every server in the world has only one drive per machine; naturally you need to distribute files across many disks. The main question is: how do I get my server to use my other drive as well? Do I make a shortcut to the drive, put it in /www/html/, and then specify the directory in my HTML files as "/www/html/drive2/..."?
Sorry if I sound stupid, I'm still new to running my own server (which is basically just a spare computer I had in my closet).
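For context, the approach being hinted at (mount the drive, then link or alias it into the web root) would look roughly like this; the device name /dev/sdb1, the mount point /mnt/drive2, and the default Ubuntu web root /var/www/html are assumptions to adjust for your setup:

sudo mkdir -p /mnt/drive2
sudo mount /dev/sdb1 /mnt/drive2                 # mount the spare drive's partition
sudo ln -s /mnt/drive2 /var/www/html/drive2      # option A: symlink it into the web root

The symlink only works if the web root's config has Options FollowSymLinks. Alternatively (option B), an Alias in the Apache site config serves the drive directly, with no symlink:

Alias /drive2 /mnt/drive2
<Directory /mnt/drive2>
    Require all granted
</Directory>

Require all granted is Apache 2.4 syntax, which is what Ubuntu 15.04 ships; also remember to add the mount to /etc/fstab so it survives reboots.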

Related

oVirt VM running normally but the vdisk does not exist

We have oVirt 4.3 running at my company. We received an error message from a backup job saying that the snapshot could not be generated. On the host where this VM is running, we can see the information about the vdisk, but the disk itself does not exist where it should be. When we look up the vdisk information in the oVirt Engine, the UUID is different from the one on the host where it is running.
Is it possible to force this "ghost" disk to be written to the storage?
I don't know if anyone has had this problem before.
Best regards

XAMPP Apache server error 403 access forbidden on Windows 10

I've installed XAMPP on two different PCs, both running Windows 10, and on both of them it gives me the same error. The Apache server runs correctly and the ports are dedicated to httpd as they're supposed to be, but when I try to access my website project's folder to test the HTML files, Google Chrome, Firefox, and Explorer show me a message saying: "Error 403 - access forbidden because you don't have the right permissions". I've tried everything in the other similar questions here and already gave permissions to all users on my folders, even to the whole C: drive. I also modified the text files inside the /apache/conf/ folder and it didn't work. I've seen that in some questions on the same matter, the httpd.conf file is a little bit different, with parts that I can't see in mine. Thank you so much.
Finally, I solved the problem by learning some more. I'm posting it here in case someone is starting out with it like me. The point is that when you install XAMPP, the program has a default folder where it lets you put your projects, called "htdocs", so you can only work with project folders that you create inside this "htdocs" folder. If you try to create a new folder on C:\ for example, the program will not let you access it, for security (otherwise anybody entering your server could reach all the folders on your computer). I hope this is of some help to someone else. Thank you, and sorry for the newbie question.
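That said, if you do need to serve a project folder outside htdocs, the standard way is to grant access to it in httpd.conf. A minimal sketch, assuming a hypothetical project folder C:/projects/mysite (XAMPP's config normally lives at C:\xampp\apache\conf\httpd.conf, and recent XAMPP ships Apache 2.4):

Alias /mysite "C:/projects/mysite"
<Directory "C:/projects/mysite">
    Require all granted
</Directory>

Restart Apache from the XAMPP control panel afterwards; without the Require all granted line, Apache 2.4 answers with exactly the 403 described above.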

Smart local copy of a remote directory

Currently I have a bunch of local copies of dev/production websites. Each copy contains a "files" directory, which holds files uploaded by site users. I use rsync to synchronize the directories' contents from the remote servers (via SSH).
There are some annoyances:
I have to run rsync manually each time I want fresh files (this could be automated, of course, but since I have a lot of website copies, it's not a good idea).
The rsync run takes some time.
Disk space on my laptop is running out.
I think all of this could be solved by some kind of software that works like a proxy:
When I list files, it requests the file list from the remote server and caches the result for some (configurable) time.
When I request a file's contents for the first time, it retrieves the remote file and saves it locally.
When I update a file, it only gets updated locally.
When I save a new file in the "files" directory, it does not go to the remote server.
Of course, the logic of such software would have to be much more complex, but I hope my idea is clear: don't waste disk space, download files on demand, push no changes to the remote.
Is there any software that works like that?
Map a network drive with NFS or sshfs. Make local copies if you really need a file.
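For the sshfs route, a minimal sketch (user, host, and paths are placeholders):

mkdir -p ~/mnt/site-files
sshfs user@example.com:/var/www/site/files ~/mnt/site-files -o reconnect    # mount the remote directory over SSH
fusermount -u ~/mnt/site-files                                              # unmount when done

Files are fetched over the network as they are read, so nothing takes up local space unless you copy it. Note that sshfs alone does not keep writes local: changes you make go straight to the remote, unlike the proxy behaviour described in the question.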
I did not mention it in the question, but I needed this for working with Drupal. And now I have found a Drupal-specific solution: the Stage File Proxy module.
It does exactly what I need: it downloads files from the remote server only when they are requested.
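For reference, a minimal setup sketch, assuming Drupal 7 and drush (the origin URL is a placeholder):

drush dl stage_file_proxy && drush en stage_file_proxy -y     # download and enable the module
drush vset stage_file_proxy_origin 'https://example.com'      # the remote site to fetch missing files from

After that, any file missing from the local files directory is fetched from the origin on first request.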

Storing and retrieving files stored separately from the codebase in ColdFusion

We currently have a site running ColdFusion 11. In an effort to improve some aspects of security, we would like to store all files uploaded by our users on a server separate from our codebase and DB servers.
I'm pretty much starting from scratch here, as I wasn't able to find much in my searches so far. What's the best practice for doing this, and which ColdFusion functions would work for storing and retrieving files from an external source?
I could use some more information to be more helpful, but let's say you have a separate server that stores all your user files on a Windows network. I would use CFContent to serve those files, with the file retrieved over a UNC path.
I'd recommend reading this blog entry of mine on Securely Serving Files via CFContent. Wil, also from CF Webtools, posted one here: Serving File Downloads with ColdFusion
We had a similar issue when we migrated to a Unix platform. Our solution was to mount the file server onto the web server. It's accessed programmatically by ColdFusion as if it were on the same server, but it's inaccessible from the web root (browser). It's worked very smoothly for us.
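As a rough sketch of that mount approach on Linux (the server name, export, and mount point are all hypothetical):

sudo mkdir -p /mnt/userfiles
sudo mount -t nfs fileserver.example.com:/exports/userfiles /mnt/userfiles    # mount the file server's share

Because /mnt/userfiles sits outside the web root, browsers cannot reach the uploads directly; ColdFusion reads them by filesystem path and streams them to the client (e.g., via cfcontent).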

Cannot connect to Compute Engine CentOS Virtual Machine

I am new to virtual machines and the CLI, so please bear with me.
I have CentOS 6.5 running on Compute Engine.
I ran yum update (without creating a snapshot of the disk first - yes, I am an idiot) and now I cannot connect to the machine using the IP address.
I tried the following steps.
Tried to connect through FileZilla - didn't work.
Tried through PuTTY - didn't work.
Tried the browser option given by the CE console - didn't work.
I even tried creating a snapshot and starting up another VM from the snapshot - didn't work.
If anyone knows how I can get the files and folders out of the previous disk, I can start up a new VM and transfer everything again.
I do not have the latest database and this is important.
Please help!
Thanks
Warren
The way to recover is to delete your VM without deleting the disk, then create another VM with its own boot disk, attach and mount the original disk, and recover any data that you need from it.
First things first: on the VM instances page, click on the instance name that is currently running with that disk, and uncheck the box "Delete boot disk when instance is deleted". Then delete the instance.
Now, create a new instance with its own boot disk. To differentiate this new disk from the original boot disk:
use a different OS (or OS version) for the new disk, e.g., if using Ubuntu, try a different version or use Debian; if using RHEL, try CentOS, or vice versa
check which disk is mounted at / (this should be the new disk)
Mount the original disk as read-only and recover any information you need. Once you have a backup of your data, you can remount it with read-write access and try to fix it (but back up the data first!).
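A rough command-line sketch of those steps with the gcloud CLI (the instance, disk, and zone names are placeholders):

gcloud compute instances create recovery-vm --zone us-central1-f                              # new VM with its own boot disk
gcloud compute instances attach-disk recovery-vm --disk original-boot-disk --zone us-central1-f   # attach the old boot disk

# then, on recovery-vm:
sudo mkdir -p /mnt/olddisk
sudo mount -o ro /dev/sdb1 /mnt/olddisk    # mount the old disk's first partition read-only

The attach step can also be done from the console's instance edit page under "Additional disks".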
I finally solved this problem; thanks to Misha for pointing me in the right direction.
The steps are below for anyone who has the same issue.
Problem:
After updating the CentOS server using yum update, I was unable to connect back to the server.
I tried all possible combinations, but no luck. This seems to be a known issue, as there was some material on the Compute Engine site about it.
Solution:
I followed the steps as Misha suggested. I started up another VM with its own boot disk and then attached the original disk with read-write access.
Note: I was unable to mount the disk as read-only.
The commands were
mkdir /mnt/sdb1              # create a mount point for the old disk
mount /dev/sdb1 /mnt/sdb1    # mount the old disk's first partition
Once I mounted the disk, I copied the files from the html folder on the sdb1 disk to the html folder on sda1 (the new boot disk), as sketched below.
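Assuming the default web root of /var/www/html on both disks (the paths may differ on your setup), that copy is a one-liner:

cp -a /mnt/sdb1/var/www/html/. /var/www/html/    # -a preserves ownership, permissions, and timestamps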
The database was a bit more challenging.
I tried quite a few times, but copying the files from /mnt/sdb1/var/lib/mysql into the new disk's mysql folder was not working.
I found some tutorials but nothing helped.
Finally, I downloaded the files from /mnt/sdb1/var/lib/mysql and put them in my local Windows MySQL installation, inside the data folder.
Remember you have to download everything, which includes ib_logfile0, ib_logfile1, and ibdata1, as well as the folder that holds the *.frm files.
Then I opened localhost/phpmyadmin and voila... the data was there.
The rest was pretty simple: exporting the databases and uploading the SQL scripts back to the server.
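For that last step, a minimal sketch (the database name and credentials are placeholders; phpMyAdmin's Export tab produces the same kind of dump):

mysqldump -u root -p mydb > mydb.sql    # export locally
mysql -u root -p mydb < mydb.sql        # import on the new server after uploading mydb.sql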
This took me about 12 hours to figure out.
Thanks again Misha.