Prohibit access to index.php on WAMP - Apache

WAMP is installed on a local server.
How can I prevent users from opening
http://<server-ip>:<port-no>/index.php
but still be able to log into the server myself and open index.php?
Basically, I am trying to stop users from seeing the list of apps deployed on WAMP.

Use either a .htaccess file or modify your Apache configuration to deny access, allowing only your own IP address (probably 127.0.0.1).
See this page for a good starting guide: http://www.htaccess-guide.com/deny-visitors-by-ip-address/
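For example, a minimal sketch (assuming Apache 2.4, which recent WAMP releases ship with; the Apache 2.2 equivalent is commented out below) that you could place in a .htaccess file in the document root, or in the matching <Directory> block of httpd.conf:

# Block index.php for everyone except the local machine (Apache 2.4)
<Files "index.php">
    Require ip 127.0.0.1 ::1
</Files>

# Apache 2.2 equivalent:
# <Files "index.php">
#     Order Deny,Allow
#     Deny from all
#     Allow from 127.0.0.1 ::1
# </Files>

Replace 127.0.0.1 with your own workstation's address if you browse from a machine other than the server itself.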

Related

Redirect Symfony URL

I have a Symfony3 installation on a server and I need to access the web files through the IP/game address.
So I created my "game" Symfony folder with everything in it, and for now, if I want to see my web content, I need to add /web/app.php (or app_dev.php), which is expected.
So here is my question: how can I access my app.php (or app_dev.php) just via IP/game and not IP/game/web/app.php (or app_dev.php)?
I can't configure any vhost to define a DocumentRoot, so I think I have to use a .htaccess file, but all my tests so far have failed :(
Thank you for your help and have a nice day!
The only way for this to work is to always access (IP or hostname)/(app_dev.php for dev, nothing for prod)/routes.
If you want to access the dev environment you will need to use app_dev.php after the hostname/IP. Everything after that is not related to the file structure but to the routing configured in the Symfony application (at least with the default web server configuration provided on the Symfony website).
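If no vhost can be touched, one common workaround (just a sketch, assuming mod_rewrite is enabled and the project lives in a folder called game) is a .htaccess file in the project root that forwards every request into web/, so IP/game/... works without typing /web/app.php:

# .htaccess in the Symfony project root (the "game" folder)
<IfModule mod_rewrite.c>
    RewriteEngine On
    # Push every request down into the web/ directory
    RewriteRule ^(.*)$ web/$1 [QSA,L]
</IfModule>

Symfony's own .htaccess inside web/ then takes care of routing the request to app.php.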

htaccess - simulating a local page as a web page

Is there a way to rewrite a URL in a local project so that it looks like a web page?
For example, I have a project with the URL
localhost/site
I'm trying to rewrite this to:
www.site.com
or
site.com
That project has subpages, and it would be good if they worked like
site.com/subpage.php
I've been trying for an hour, but I'm really a htaccess noob.
I work with VertrigoServ, and mod_rewrite works fine with some examples I tried in other projects.
The issue is not with rewriting. To make a local page appear to have a more conventional domain, you need to tell your browser that the conventional domain is found on your local web server.
The simplest way to do this is by editing the hosts file on your OS. On *nix-based systems it's usually /etc/hosts; on Windows it's usually C:\Windows\System32\drivers\etc\hosts. You will need elevated privileges to edit the file.
Add a line to the end of the file that maps the domain name to your local IP address, like this:
127.0.0.1 www.site.com
Close and reopen your browser and visit www.site.com; you should see that it loads the page from your local web server.
Chances are, you will still see the same page as when you browse to localhost. To make it load your code, you need to change the DocumentRoot in httpd.conf for your Apache install to match the directory where your code is.
A preferable solution may be to use name-based virtual hosting, which allows multiple web sites to share the same IP address. Searching for 'apache VirtualHost examples' should give you plenty of resources for this. Make sure that NameVirtualHost *:80 is also enabled in order for this to work.
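For instance, a minimal name-based vhost in httpd.conf could look like this (the DocumentRoot path is only an assumption for a VertrigoServ install; point it at wherever your project actually lives). The NameVirtualHost line is only needed on Apache 2.2 and is ignored on 2.4:

NameVirtualHost *:80

<VirtualHost *:80>
    ServerName www.site.com
    ServerAlias site.com
    # Assumed path; adjust to your real project directory
    DocumentRoot "C:/Program Files/VertrigoServ/www/site"
</VirtualHost>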

Disallow direct access to a subfolder from external IP addresses

In the process of moving an application from ColdFusion to PHP, I have a ColdFusion server running on CentOS using Apache. Despite a correct robots.txt disallowing the indexing of my application, it has come to my attention that some of the clients' files were indexed.
I need to know how to set up Apache to only allow access to the files from the server itself and NOT allow anyone to access them from the inter-google. So if you were to click the link directly it would deny access, but if you were to download the file from the application itself (using a download script) it would be allowed. Is this possible, and how?
LOVE that the search engine ignored my robots.txt. Thanks!
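One common approach (a sketch only, assuming Apache 2.2 and a hypothetical folder holding the client files) is to deny all HTTP access to that folder; a server-side download script reads the files from disk rather than over HTTP, so it can still stream them to authorised users.

# .htaccess inside the protected folder (folder name and location are hypothetical)
# Apache 2.2 syntax: refuse every direct HTTP request
Order Deny,Allow
Deny from all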

How do I secure my WAMP so only localhost can access root directory?

I've managed to set up my WAMP configuration so I can show my clients their websites while they're in development, but I want to secure the root directory so only I can access it.
As it stands now, anyone can simply go to the domain name and see all the other projects I'm working on.
For example, I want to be able to give my clients access to: http://example.com/customer1 but I don't want them to see http://example.com.
I know I have to configure something in my httpd.conf file but not really sure what to do.
Hope I explained this properly.
Order Deny,Allow
Deny from all
Allow from 127.0.0.1 ::1
It's been some time since I used Apache, but this should do it.
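Those directives belong in the <Directory> block for the document root in httpd.conf (or in a .htaccess file placed there). A fuller sketch, assuming WAMP's default c:/wamp/www path and a hypothetical customer1 folder; because subdirectories inherit the deny, each client project that should stay reachable needs its own override:

# Apache 2.2 syntax: restrict the document root to the local machine
<Directory "c:/wamp/www/">
    Order Deny,Allow
    Deny from all
    Allow from 127.0.0.1 ::1
</Directory>

# Re-open each client project that should remain publicly visible
<Directory "c:/wamp/www/customer1">
    Order Allow,Deny
    Allow from all
</Directory>

On Apache 2.4 the equivalents are Require local and Require all granted.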

ProFTPD home directory per user

I'm new to Linux (Ubuntu), moving over from Windows.
The problem I have is related to FTP. This is what I am trying to do.
My websites are located in /var/www. Each website has its own root, for example:
/var/www/site1
/var/www/site2
I would like to have a user for each site (directory) so that each one can access their own home directory through FTP.
It looks so easy, but I can't make it work. Any help or direction is appreciated.
If you want a user per site, you might need to create virtual hosts in Apache so that instead of the sites being located in /var/www they are located in each user's home directory.
You can then allow users to access their home folders via FTP. Instructions are available on the Ubuntu site; access them here.
Alternatively, you can add the users as FTP users with the right permissions on the folders located under /var/www/. This is documented well on the ProFTPD website.
There's an old post here that you might still find helpful
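As a rough sketch of the second route (the user name, group, and paths here are assumptions, not from the thread): create a system user whose home directory is the site folder, then chroot FTP logins to their home directories in the ProFTPD configuration.

# Create an FTP-only user whose home is the site's folder (names are examples)
sudo useradd -d /var/www/site1 -s /usr/sbin/nologin site1user
sudo passwd site1user
sudo chown -R site1user:www-data /var/www/site1

# /etc/proftpd/proftpd.conf
DefaultRoot ~            # jail each user inside their own home directory
RequireValidShell off    # allow the nologin shell assigned above

Restart ProFTPD afterwards (sudo service proftpd restart) for the changes to take effect.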