Using HGFS within VMware to run a web server (Apache)

I have VMware running Ubuntu 14.04 with a Windows 8 host. I've enabled shared folders and installed VMware Tools. What I want now is to run the web server out of /mnt/hgfs/ProjectName
At this point I can access the shared folder from within Ubuntu. I do not have to use sudo to create new folders or files, or to edit existing ones. The folder is not mounted read-only and is not treated as read-only; however, when I try to change the read-only attribute from within Windows it reverts afterwards. Is there any clue as to why the web server cannot read the folder? Even mounted read-only, the web server should still be able to read the files.

It turns out the best way to run this is to name the shared project folder html in VMware and then mount it under /var/www. Edits now work without problems, and the server runs great, accessible from both the host and the guest OS.
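For reference, a minimal sketch of that setup, assuming the share is named html and VMware Tools mounts it under /mnt/hgfs (Ubuntu 14.04's default Apache DocumentRoot is /var/www/html):
sudo mount --bind /mnt/hgfs/html /var/www/html   # expose the share as the docroot
# to survive reboots, add this line to /etc/fstab:
# /mnt/hgfs/html  /var/www/html  none  bind  0  0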

Related

Why is localhost (DocumentRoot) blocked from running on Google Drive, Dropbox or Tresorit?

I am attempting to relocate my DocumentRoot (i.e. localhost) to a synchronised folder (such as Google Drive, Dropbox or Tresorit), but the attempt fails with a 403 error.
On Windows machines I can configure localhost to run from the D:/GoogleDrive/SitesG folder; the local site runs perfectly.
On a Mac, however, localhost won't work when running out of a cloud-based storage folder such as Google Drive, Dropbox, Tresorit, etc.
Everything is fine when localhost is at Users/myname/Sites.
However, when I reconfigure the Mac to run from Users/myname/GoogleDrive/SitesG - e.g. by editing httpd.conf and related files - localhost is blocked.
Clearly the problem has to do with permissions on the parent folder (e.g. the Google Drive or Dropbox or Tresorit folder). I can see that the permissions on the various folders are as follows.
drwxr-xr-x 32 myname staff 1024 30 Apr 02:23 Sites
drwxr-xr-x 22 myname staff 704 30 May 21:01 SitesG
drwx------# 61 myname staff 1952 30 May 17:47 GoogleDrive
So my question is: on a Mac (running High Sierra), is it possible to relocate the DocumentRoot to Google Drive? Or is there something intrinsic to Google Drive that prohibits localhost from being run from a Google Drive folder?
Locating an Apache virtual host in a cloud-based storage folder will create many file/folder permission problems.
Instead of relocating your DocumentRoot and changing a lot of settings and permissions, it is easier, for each cloud-stored project, to create a symlink in your Users/myname/Sites folder pointing to your Google Drive/Dropbox website folder.
Imagine you have a "websiteA" folder inside your Dropbox folder:
1) Go to your Users/myname/Sites folder and create the symlink:
cd ~/Sites
ln -s ~/Dropbox/websiteA websiteA
As you can check by opening your ~/Sites folder in the Finder, you have created a folder with an arrow on it, pointing to the "websiteA" cloud-based folder.
2) Now, you just have to create a virtual host pointing to ~/Sites/websiteA.
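A minimal sketch of such a vhost, assuming macOS's bundled Apache with the vhosts include enabled in /etc/apache2/httpd.conf (the ServerName is illustrative):
cat <<'EOF' | sudo tee -a /etc/apache2/extra/httpd-vhosts.conf
<VirtualHost *:80>
    ServerName websitea.localhost
    DocumentRoot "/Users/myname/Sites/websiteA"
    <Directory "/Users/myname/Sites/websiteA">
        # FollowSymLinks lets Apache traverse the link into the cloud folder
        Options Indexes FollowSymLinks
        Require all granted
    </Directory>
</VirtualHost>
EOF
sudo apachectl configtest && sudo apachectl restart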
You could instead globally change your ~/Sites folder to a symlink pointing to your cloud-based folder, but the project-by-project approach is more flexible, as it allows you to manage both local and cloud-based projects.
Many thanks to @DrFred for the solution above, which I'm confident would work, though I have not had the chance to test it.
Here's the solution I devised before receiving any answers. It's very similar to Dr Fred's above, in that both solve the problem with symlinks. I'm adding mine for completeness and extra detail.
As above, I develop on multiple devices (several Macs and Windows PCs, side by side), so my aim was to have a single localhost development folder that would synch almost instantly between different devices without the need to check files into/out of git and without running into the file permissions problems created when using Google Drive to synch code files.
The steps I used to achieve this aim were as follows.
Create a folder called /Users/myname/SitesNew on a Mac.
Create a symlink in Dropbox on the same Mac pointing to that folder, with an identical name. You will then have two identical folders on the Mac:
/Users/myname/SitesNew <-- real folder on the Mac
/Users/myname/Dropbox/SitesNew <-- symlink on the Mac
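A sketch of those two steps, assuming Dropbox lives at ~/Dropbox:
mkdir -p ~/SitesNew                   # the real folder
ln -s ~/SitesNew ~/Dropbox/SitesNew   # the Dropbox entry is just a symlink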
Synchronise Dropbox on all devices (making sure to add the SitesNew folder if you are using selective synch on any device). The symlinked folder will now appear as a real folder in Dropbox in the cloud and on the Windows PCs. In my case the new Windows folder was at:
D:/Dropbox/SitesNew <-- Real folder on Windows
Update the Apache httpd.conf file on the Mac to recognise localhost at /Users/myname/SitesNew.
Update the Apache httpd.conf on the Windows PC to recognise localhost at D:/Dropbox/SitesNew.
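The matching httpd.conf edits might look like this (a sketch; file locations vary by install, e.g. /etc/apache2/httpd.conf on the Mac and conf\httpd.conf under the WAMP Apache directory on Windows):
# Mac httpd.conf:
#     DocumentRoot "/Users/myname/SitesNew"
#     <Directory "/Users/myname/SitesNew">
#         Options Indexes FollowSymLinks
#         Require all granted
#     </Directory>
# Windows (WAMP) httpd.conf:
#     DocumentRoot "D:/Dropbox/SitesNew"
sudo apachectl configtest && sudo apachectl restart   # Mac: verify, then restart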
From now on, any localhost development work (edit, add, delete) on one device will synch with the localhost on the other, even across different operating systems.
Note 1: This solution works with Dropbox but not with Google Drive, as Google Drive has problems with symlinks and also messes with permissions in a different way, especially on a Mac.
Note 2: If any files were previously saved in Google Drive (e.g. originally my Windows sites folder was at D:\GoogleDrive\SitesOld), (a) determine the right permission values (e.g. see https://chmod-calculator.com), then (b) apply them to the folders and files with chmod.
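A sketch of that conversion using the usual web defaults (755 for directories, 644 for files; the path is illustrative):
find ~/SitesNew -type d -exec chmod 755 {} +   # directories: rwxr-xr-x
find ~/SitesNew -type f -exec chmod 644 {} +   # files: rw-r--r--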

ServerPilot on Ubuntu 14.04: accessing CGI scripts

Got an odd ServerPilot query, not specifically Craft related.
My client wants FTP access to a subfolder on the site. I'm loath to let them have full access, so I created a user with access to /home/FTPUSER and a symlink to it from my /srv/users/serverpilot/apps/APPNAME/public/ folder, so that they cannot access the site's core system files, etc. from a script.
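For context, the setup described amounts to something like this (a sketch; the link name is hypothetical):
sudo ln -s /home/FTPUSER /srv/users/serverpilot/apps/APPNAME/public/clientdir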
PHP files in this folder work fine, but the client now wants to be able to run CGI/Perl scripts from it. I have tried following the instructions at https://serverpilot.io/community/articles/how-to-create-a-cgi-bin-directory.html (updating the document root), but I can't get CGI or Perl scripts to run; they are returned as plain text instead.
Any thoughts?

How to move a server project to my localhost?

What I'm asking is not how to move localhost to a remote server.
I'm new to servers, so I don't know how to open or launch a project on my localhost.
I want to develop Magento on localhost, i.e. develop and test a Magento site on my own laptop.
But I'm having trouble moving my project, cloned from GitHub, to my localhost.
Install WAMP and start its services.
Clicking the WAMP icon in your system tray will allow you to access phpMyAdmin; use the Databases tab to create a new DB (for development at home I typically leave the DB user and password as 'root').
Inside WAMP's install directory is the 'www' folder. Create a subdirectory within it, download Magento and extract it into your newly created subdirectory.
If you unzipped your Magento files to c:/wamp/www/myproject, then point your browser at 127.0.0.1/myproject and you should see the Magento installation page. Follow it through until you finish installing.
Now when you go to 127.0.0.1 you should see the front page of a fresh Magento install. Appending /admin to the URL will let you log into the back end. Do this, then navigate to System -> Cache Management and disable all the caches.
You can now copy across any work you've already done on the project into the relevant Magento folders (presumably /app and /skin); a command-line sketch follows below.
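A sketch of those steps from Git Bash on Windows (the repository URL, database name and clone path are illustrative; the root/root credentials follow the phpMyAdmin note above):
git clone https://github.com/you/yourproject.git /c/projects/yourproject
mysql -u root -proot -e "CREATE DATABASE magento_dev;"             # or create the DB in phpMyAdmin
cp -r /c/projects/yourproject/app/.  /c/wamp/www/myproject/app/    # copy your work over
cp -r /c/projects/yourproject/skin/. /c/wamp/www/myproject/skin/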

What is the fastest way to upload big files to a server

I have a dedicated server and a file of about 4 GB to upload to it. What is the fastest and safest way to upload that file to the server?
FTP may create issues if the connection is broken.
SFTP will have the same issue as well.
Do you have your own computer available through a public internet IP as well?
In that case you may try to set up a simple HTTP server (if you have Windows, just set up IIS) and then use a download manager on the dedicated server (depending on its OS) to download the file over HTTP (it can use multiple streams), or do this through a torrent.
There are trackers, like http://openbittorrent.com/, which will allow you to keep the file on your computer and then use a torrent client to upload the file to the dedicated server.
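If IIS isn't an option, one quick stand-in (my own sketch, not part of the original suggestion) is Python's built-in HTTP server, run on the machine that holds the file:
cd /path/to/the/file
python3 -m http.server 8000   # serves the directory at http://<your-public-ip>:8000/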
I'm not sure what OS your remote server is running, but I would use wget; it has a --continue option. From the man page:
--continue
Continue getting a partially-downloaded file. This is useful when
you want to finish up a download started by a previous instance of
Wget, or by another program. For instance:
wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z
If there is a file named ls-lR.Z in the current directory, Wget
will assume that it is the first portion of the remote file, and
will ask the server to continue the retrieval from an offset equal
to the length of the local file.
wget binaries are available for GNU/Linux, Windows, macOS and DOS:
http://wget.addictivecode.org/FrequentlyAskedQuestions?action=show&redirect=Faq#download
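Putting the two suggestions together, the transfer might look like this on the dedicated server (IP, port and filename are illustrative):
wget -c http://203.0.113.5:8000/bigfile.tar.gz   # -c resumes if the connection drops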

Is it possible to have WAMP run httpd.exe as user [myself] instead of local SYSTEM?

I run a Django application over Apache with mod_wsgi, using WAMP.
A certain URL allows me to stream the content of image files, whose paths are stored in a database.
The files can be located either on the local machine or on a network drive (\\my\network\folder).
With the development server (manage.py runserver), I have no trouble at all reading and streaming the files.
With WAMP, and with files on the network drive, I get an IOError: obviously the httpd instance does not have read permission on said drive.
In the Task Manager, I see that httpd.exe is run by SYSTEM. I would like to tell WAMP to run the server as [myself], as I have read and write permissions on the shared folder. (Eventually, the production server should be run by a 'www-admin' user having the permissions.)
Mapping the network shared folder to a drive letter (Z:, for instance) does not solve this at all.
The User/Group directives in httpd.conf do not seem to have any kind of influence on Apache's behaviour.
I've also tried regedit: I duplicated the HKLM\[...]\wampapache registry key under HKEY_CURRENT_USER\ and renamed the original key, but then the new key does not seem to be found when I run
> httpd.exe -n wampapache -k start
or when I run WAMP.
I've run out of ideas :)
Has anybody ever had the same issue?
Win+R, services.msc
Edit wampapache and wampmysqld to log on as some user.
The tray icon is a convenient front end to "net start wampapache" and "net start wampmysqld".
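The same change can be scripted from an elevated command prompt (a sketch; the account name and password are placeholders, and the service name is taken from the question):
sc config wampapache obj= ".\myname" password= "secret"
net stop wampapache && net start wampapache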
The User/Group directives in httpd.conf do not seem to have any kind of influence on Apache's behaviour.
httpd.exe is started by the service's account (which is probably why you see it running under SYSTEM). The User and Group lines in httpd.conf determine what user the child processes (that httpd spawns) will run under. These forks are what actually handle page requests, etc., so it is possible that your configuration is already doing what you want; it is just unclear from looking at Task Manager.
You could also try using runas to start WAMP/Apache, though your mileage may vary.
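For example (a sketch; the path is illustrative for a typical WAMP install):
runas /user:myname "C:\wamp\bin\apache\apache2.4.9\bin\httpd.exe"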
I've just found that executing httpd.exe myself works for me... I just lose the funky WAMP tray icon and the "restart apache" menu item, which is really handy whenever I update my application code...
I'll have to make do with this for the moment...