Why is localhost (DocumentRoot) blocked from running on GoogleDrive, Dropbox or Tresorit? - apache

I am attempting to relocate my DocumentRoot (i.e. localhost) to a synchronised folder (such as Google Drive, Dropbox or Tresorit), but the attempt fails with a 403 error.
On Windows machines I can configure localhost to run from the D:/GoogleDrive/SitesG folder, and the local site runs perfectly.
On a Mac, however, localhost won't work when running out of a cloud-based storage folder such as Google Drive, Dropbox, Tresorit, etc.
Everything is fine when localhost is at Users/myname/Sites.
However, when I reconfigure the Mac to run from Users/myname/GoogleDrive/SitesG - e.g. by editing httpd.conf and the related config files - localhost is blocked.
Clearly the problem has to do with permissions on the parent folder (e.g. the Google Drive, Dropbox or Tresorit folder). The permissions on the various folders are as follows.
drwxr-xr-x 32 myname staff 1024 30 Apr 02:23 Sites
drwxr-xr-x 22 myname staff 704 30 May 21:01 SitesG
drwx------@ 61 myname staff 1952 30 May 17:47 GoogleDrive
So my question is: On a Mac (running High Sierra), is it possible to relocate the DocumentRoot to Google Drive? Or is there something intrinsic to Google Drive that prohibits localhost from being run from a Google Drive folder?

Locating an Apache virtual host in a cloud-based storage folder will create many file/folder permission problems.
Instead of relocating your DocumentRoot and changing a lot of settings and permissions, it is easier, for each cloud-stored project, to create a symlink in your Users/myname/Sites folder pointing to your Google Drive/Dropbox website folder.
Imagine you have a "websiteA" folder inside your Dropbox folder:
1) Go to your Users/myname/Sites folder and create such a symlink:
cd ~/Sites
ln -s ~/Dropbox/websiteA websiteA
As you can check by opening your ~/Sites folder in the Finder, you have created a folder with an arrow on it, pointing to the cloud-based "websiteA" folder.
2) Now, you just have to create a virtual host pointing to ~/Sites/websiteA.
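For example, a minimal virtual host for that symlinked project might look like the sketch below. The file location and the websitea.test hostname are assumptions (on a stock macOS Apache, /private/etc/apache2/other/*.conf is usually included by httpd.conf, but check yours); the important detail is that FollowSymLinks must be allowed so Apache will follow the link into the cloud folder.
# Hypothetical file: /etc/apache2/other/websiteA.conf (assumed include location)
<VirtualHost *:80>
    # Assumed local hostname; add "127.0.0.1 websitea.test" to /etc/hosts
    ServerName websitea.test
    DocumentRoot "/Users/myname/Sites/websiteA"
    <Directory "/Users/myname/Sites/websiteA">
        # FollowSymLinks lets Apache serve content through the symlink
        Options Indexes FollowSymLinks
        AllowOverride All
        # Apache 2.4 syntax; on 2.2 use "Order allow,deny" and "Allow from all"
        Require all granted
    </Directory>
</VirtualHost>
After saving, sudo apachectl configtest followed by sudo apachectl restart should pick it up.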
You could, instead, globally change your ~/Sites folder into a symlink pointing to your cloud-based folder, but the project-by-project approach is more flexible, as it allows you to manage both local and cloud-based projects.

Many thanks to @DrFred for the solution above, which I'm confident would work, though I have not had the chance to test it.
Here's the solution I devised before receiving any answers. It's very similar to Dr Fred's above, in that both solve the problem with symlinks. I add mine for completeness and extra detail.
As above, I develop on multiple devices (several Macs and Windows PCs, side by side), so my aim was to have a single localhost development folder that would sync almost instantly between devices, without needing to check files into/out of git and without running into the file-permission problems created when using Google Drive to sync code files.
The steps I used to achieve this aim were as follows.
Create a folder called /Users/myname/SitesNew on a Mac.
Inside Dropbox on the same Mac, create an identically named symlink pointing to that folder. You will then have two identical folders on the Mac:
/Users/myname/SitesNew <-- Real folder on Mac
/Users/myname/Dropbox/SitesNew <-- Symlink on Mac
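As a minimal sketch of those two steps (assuming Dropbox lives in the default ~/Dropbox location):
# Real folder on the Mac
mkdir -p /Users/myname/SitesNew
# Symlink inside Dropbox pointing at the real folder
ln -s /Users/myname/SitesNew ~/Dropbox/SitesNew
# Sanity check: both paths should list the same contents
ls -l ~/Dropbox/SitesNew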
Synchronise Dropbox on all devices (making sure to add the SitesNew folder if you are using selective sync on any device). The symlinked folder will now appear as a real folder in Dropbox in the cloud and on the Windows PCs. In my case the new Windows PC folder was at:
D:/Dropbox/SitesNew <-- Real folder on Windows
Update the Apache httpd.conf file on the Mac to recognise localhost at /Users/myname/SitesNew.
Update the Apache httpd.conf on the Windows PC to recognise localhost at D:/Dropbox/SitesNew.
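For reference, the relevant httpd.conf lines would look roughly like this on each machine (the Directory block layout is an assumption; adapt it to whatever your existing config already uses):
# Mac httpd.conf
DocumentRoot "/Users/myname/SitesNew"
<Directory "/Users/myname/SitesNew">
    Options Indexes FollowSymLinks
    AllowOverride All
    Require all granted
</Directory>

# Windows httpd.conf
DocumentRoot "D:/Dropbox/SitesNew"
<Directory "D:/Dropbox/SitesNew">
    Options Indexes FollowSymLinks
    AllowOverride All
    Require all granted
</Directory>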
From now on, any localhost development work (edit, add, delete) on one device will sync with the localhost on the other, even across different operating systems.
Note 1: This solution works with Dropbox but not with Google Drive, as Google Drive has problems with symlinks and also handles permissions differently, especially on a Mac.
Note 2: If any files were previously saved on Google Drive (e.g. originally my Windows sites folder was at D:\GoogleDrive\SitesOld), (a) work out the right permission values (e.g. see https://chmod-calculator.com) and (b) use chmod to convert the folders and files to those values.
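For (b), a sketch of the conversion, assuming the usual 755 for folders and 644 for files and the SitesNew path from above (double-check the values with the calculator first):
# Reset directories to 755 (rwxr-xr-x) and files to 644 (rw-r--r--)
find /Users/myname/SitesNew -type d -exec chmod 755 {} +
find /Users/myname/SitesNew -type f -exec chmod 644 {} +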

Related

SSH and FTP showing different files

I am using a hosting provider to try to deploy my Django site, but I am confused by SSH vs. FTP.
Background info:
I got the IP address, username and password for the VPS from my host.
I logged in with the same information via PuTTY and via WinSCP.
Both show me as having accessed root@[VPS IP Address].
Running ls in PuTTY shows nothing (no files or folders), so I created a file hello.txt.
WinSCP shows a lot of folders at the root, unlike PuTTY. I then searched all the folders for the hello.txt that I created and it's nowhere to be found.
Why would accessing the same VPS via two different methods show completely different things?
If you are indeed sure that you are logged into the same host with the same user account, you should check that you are in the same folder.
Using SSH you can issue the command pwd (print working directory) to view the directory you are currently in.
To change to another directory using the shell, use the cd command, for example:
cd .. # This moves up to the parent directory
cd /var/www/html
The WinSCP user interface should also show you which directory you are currently in.
Navigating to another directory in WinSCP should be fairly straightforward.
There's no reason to think these methods will put you in the same directory location at all.
When you SSH in using Putty, you will almost certainly be put in your home directory, and that will be where your hello.txt was created.
But the FTP service has presumably been configured to put you in the common area where your service's files are located, which is not under your home directory. Where it is will be specific to the configuration of that machine.
Using SSH you will probably be able to use cd to change directory to the FTP location, if you can find out what it is; however, the reverse is not true and you almost certainly won't be able to navigate to the home directory via FTP.
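If you want to find out where the FTP service puts you, a rough starting point over SSH is to check which FTP daemon is running and look for its configured root. The config paths and option names below are assumptions for the common vsftpd/ProFTPD daemons; adjust them to whatever is actually installed.
# Which FTP daemon is running?
ps aux | grep -iE 'vsftpd|proftpd|pure-ftpd' | grep -v grep
# Look for a configured FTP root in the usual config locations
grep -RinE 'local_root|DefaultRoot|anon_root' /etc/vsftpd* /etc/proftpd* 2>/dev/null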
(Note, this is not a question about Django, and should probably have been asked on ServerFault.)

Plex and Owncloud shared folders permission issues

I am setting up a multimedia server on Debian 8.
I installed both Plex and Owncloud. I have set up /var/media as my Owncloud default folder. I decided to create a folder Library at the root of Owncloud. So the folder path is:
/var/media/admin/files/Library
I changed the permissions of media with:
chmod 770 -R /var/media
On top of that, all the files in /var/media are owned by www-data:www-data.
In order to make Plex see my media, I have added the user plex to the group www-data. I would like to create a library watching my /var/media/admin/files/Library folder, but I have a problem: Plex sees neither the files nor the folders in /var/media.
Finally, I tried connecting to my server via SSH as the plex user, and it can see the files and folders inside /var/media.
What am I doing wrong? Maybe it is not a permission issue?
Thanks
Update
If I change the ownership of /var/media to plex:www-data, it works. But I can't understand why it doesn't work for www-data:www-data. So it is indeed a permissions issue.
If I launch id plex, I have:
uid=107(plex) gid=33(www-data) groups=33(www-data)
As a reminder, here are the permissions of the /var/media folder (full permissions for the group...):
drwxrwx--- 4 www-data www-data 4096 Oct 30 09:01 media
I assume from your post that Plex, the Linux OS and your media are all on the same machine and that there are no separate computing devices involved, as that would require additional steps.
In all likelihood, Plex won't be able to list your files because the directories themselves need to grant listing permission, even when the files inside are set more restrictively, e.g. 750. From what I can tell, your chmod command has set all the directory permissions to 770, which breaks the listing capability for Plex. As it happens, I wrote a guide just yesterday over on Tech-KnowHow that covers this; in it I describe how to set your folders and your files to different modes so that it works with Plex (and other systems, for that matter). I've essentially chosen the same solution as you, in that I use the group to assign the permissions and make sure the everyone/other mode grants no permissions.
There's a direct link to the article below; you'll need to click through to the implementation page and look for the find command under the 'Apply correct modes' heading. I've also included how to keep your ownership consistent through Samba, which is useful when copying new files across. Let me know how it goes in the comments and I'll help you out where I can while it's still fresh in my mind. Good luck!
https://www.tech-knowhow.com/2016/03/how-to-plex-permissions-linux/
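The exact modes are on the linked page, but purely as an illustrative sketch of the find-based approach (the 775/660 values here are my assumption, not copied from the guide): directories get the execute bit so their contents can be listed and traversed, while files stay closed to "other".
# Directories listable/traversable by owner and group; files group-writable, nothing for "other"
sudo find /var/media -type d -exec chmod 775 {} +
sudo find /var/media -type f -exec chmod 660 {} +
# Restart Plex so it picks up the changes (it reads via its www-data group membership)
sudo service plexmediaserver restart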
I know it is an old post, but I had the same issue and this was my solution:
After running
sudo service plexmediaserver status
I found the file used to launch the Plex service: /lib/systemd/system/plexmediaserver.service. This file contains the user and group that Plex runs as.
So you can change the line Group=plex to your group.
PS: do not forget to restart the plex service with
sudo service plexmediaserver restart
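For reference, the relevant part of /lib/systemd/system/plexmediaserver.service looks roughly like the excerpt below (exact contents vary between Plex versions, so treat it as a sketch). Because it is a systemd unit file, run sudo systemctl daemon-reload before the restart so the edit is picked up.
[Service]
User=plex
# Change this to the group that owns your media, e.g. www-data
Group=plex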

Using HGFS within VMware to run web server

I have VMware running Ubuntu 14.02 with a Windows 8 host. I've enabled shared folders and installed VMware Tools. Now what I want is to run the web server from /mnt/hgfs/ProjectName.
At this point I can access the shared folder from within Ubuntu. I do not have to use sudo to create new folders or files or to edit existing ones. The folder is not mounted as read-only and is not treated as read-only; however, when I try to change the read-only attribute from Windows it reverts afterwards. Is there any clue as to why the web server cannot read the folder? Even mounted read-only, the web server should still be able to read the files.
It turns out the best way to run this is to name the shared project folder html in VMware and then mount it at /var/www. Now edits can be made without problems and the server runs great, with access from both the host and the guest OS.
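For anyone reproducing this, mounting the single html share onto /var/www looks roughly like one of the lines below; which one applies depends on whether you have the classic VMware Tools vmhgfs kernel module or the newer vmhgfs-fuse client from open-vm-tools, so treat both as assumptions to verify against your setup.
# Classic VMware Tools kernel module
sudo mount -t vmhgfs .host:/html /var/www
# open-vm-tools FUSE client
sudo vmhgfs-fuse .host:/html /var/www -o allow_other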

Finding Dropbox directory ubuntu

I have a Dropbox account which syncs all my website folders, and it works well on Windows, where I use my Apache to test, because Apache can find the directory. I have another development computer running Ubuntu 13, and I changed the DocumentRoot in Apache to /home/jacques/dropbox, but it can't find the directory. So I opened my home folder and saw the directory there; then I tried to access it using the terminal, which said that the directory doesn't exist.
Right-clicking Dropbox tells me that the directory is in /home/dropbox and /home/jacques/dropbox.
Am I missing something important here?
There are a few things to check here.
First, on Ubuntu the default Dropbox directory is /home/username/Dropbox, not /home/username/dropbox. Note the capital 'D': Linux file systems are case-sensitive, so make sure you specify it with the capital D in the DocumentRoot declaration.
Second, check which user Apache is running as and make sure it has permission to view your Dropbox directory. On Ubuntu the default is www-data, so you might want to add yourself to the www-data group and change the group on the Dropbox folder to www-data.
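A sketch of that second option, assuming the user jacques and the default ~/Dropbox location:
# Add yourself to Apache's group (log out and back in for it to take effect)
sudo usermod -a -G www-data jacques
# Hand the Dropbox folder's group to www-data and let the group read/traverse it
sudo chgrp -R www-data /home/jacques/Dropbox
sudo chmod -R g+rX /home/jacques/Dropbox
# /home/jacques itself must also be traversable by www-data (the Ubuntu default 755 already is)
sudo service apache2 restart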
Alternatively, you can change the user and group that Apache runs as by editing the /etc/apache2/envvars file and making these edits:
export APACHE_RUN_USER=jacques
export APACHE_RUN_GROUP=jacques
You will need to restart Apache after this, and you may also need to change the owner of the /var/log/apache2 directory to your own user.
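Assuming the same username as above, that last step would be something like:
# Only needed if you switched APACHE_RUN_USER/APACHE_RUN_GROUP to jacques
sudo chown -R jacques:jacques /var/log/apache2
sudo service apache2 restart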

Parallels plesk permissions accessing through FTP

Our server runs CentOS 6 and is managed through Plesk Panel 10.4.4. The structure of folders and files is created by a PHP script. When accessing the server through FTP, we are then unable to modify the contents of the folders created this way. Access over Apache (the web user) works without exception, but not over FTP. Folders and files have 755 and 644 permissions respectively. How can we enable FTP access? Thank you
EDIT: the problem is that the file owner and the FTP user are not the same, but I do not know exactly how and where to fix that.
The owner of the files and folders is psacln (gid 502) and the group is apache (gid 503). The FTP users are not the same.
We added a login FTP user (also a system user) to "psacln", the group that owns the files and folders, using usermod -a -G psacln ftpusername. We did the same with the apache group, but the problem persists.
The problem here is probably that you run your site in mod_php mode. In this mode, scripts run with Apache's privileges, so all files and directories they create are owned by the Apache user. As a result, the files cannot be modified by your FTP user unless you set 777 or 666 permissions.
I think your options are:
Switch PHP to FastCGI mode. Depending on your Plesk account privileges, you can either do it yourself in the Plesk UI or ask your hosting provider to do it for you.
This way your scripts will run with your user's privileges (the same as the FTP user) and there will be no problems accessing these files through FTP. This option is also often considered more secure.
Make the PHP script set 777 permissions on your folders and 666 permissions on your files. This means you allow everyone (the so-called "others") to modify them, so the FTP user can modify these files as well. While this may sound insecure, in practice these files can already be accessed from any other site on that system (if it is a shared hosting server), so I don't think it will be any less secure than the current situation.
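If you also have shell (root) access, a one-off fix for the files that already exist would look like the sketch below; the example.com path follows the usual Plesk vhost layout and is an assumption, and the PHP script would then apply the same modes to anything it creates afterwards.
# Hypothetical document root - adjust to your domain's httpdocs directory
cd /var/www/vhosts/example.com/httpdocs
# Directories and files writable by everyone, as described in option 2
find . -type d -exec chmod 777 {} +
find . -type f -exec chmod 666 {} +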
Regards