Is there any way to have a bunch of iframes on one HTML page that link to several different text files located elsewhere on the server?
For example, if my Apache server hosts its HTML pages in:
C:\Apache\WebDocs,
is there any way to link several different log files from different locations, like C:\game1\logs\log.txt and C:\GameServer\logs\console.txt, all into one webpage using iframes?
I suggest you create shortcuts in your current Apache document root and give the Apache user read permission to the original files. Personally, I haven't tested this on a Windows machine. Otherwise, you can use the Alias directive:
Alias /log1 "C:/game1/logs"
and you can then access the logs at http://localhost/log1/ (note that Apache on Windows expects forward slashes in configuration paths).
That should help as well.
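For completeness, a minimal sketch of how this could fit together, assuming Apache 2.4 syntax; the /log2 alias and the filenames come from the question, but the whole thing is illustrative and untested:

Alias /log1 "C:/game1/logs"
Alias /log2 "C:/GameServer/logs"
<Directory "C:/game1/logs">
    Require all granted
</Directory>
<Directory "C:/GameServer/logs">
    Require all granted
</Directory>

The page in C:\Apache\WebDocs then embeds one iframe per log:

<!DOCTYPE html>
<html>
<body>
<h1>Server logs</h1>
<!-- Each iframe loads a file exposed through an Alias above -->
<iframe src="/log1/log.txt" width="100%" height="300"></iframe>
<iframe src="/log2/console.txt" width="100%" height="300"></iframe>
</body>
</html>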
I took over a website which I'm supposed to administer, and somebody brought to my attention that certain indexes and files are accessible which shouldn't be. I will be using dummy names.
You were able to access example.com/intern before, but I changed a line in /etc/apache2/apache2.conf according to this answer: https://stackoverflow.com/a/31445273. This worked partly: I now get a 403 Forbidden when I navigate to example.com/intern, and that's basically what I want.
However, the directory intern contains a file called file.php.bak as well as file.php. When I navigate to example.com/intern/file.php I get a blank page. I am not sure, however, whether file.php can be accessed in some other way, because the page does load and I don't get a 403 like before. What is far worse, and the reason I am struggling with this: if I go to example.com/intern/file.php.bak, my browser (Firefox) offers to download file.php.bak, which I can then read in plaintext. I want all files in intern to be inaccessible via the website, but I have no idea how to do this. Can anybody help?
Things I've tried:
1) Removing the Indexes option from the apache2.conf file as mentioned above. It only puts the 403 on the directory itself, not recursively on the files in it.
2) Writing a .htaccess file as described here: https://fedingo.com/how-to-prevent-direct-file-download-in-apache-server/ and putting it in intern, with the same result as in 1).
3) Putting an empty index.html file in the intern directory. This removes the 403 on example.com/intern, but the download of example.com/intern/file.php.bak is still possible. I've also tried index.php, with the same result.
File System:
The application runs from /var/www/application, which also holds the /var/www/application/index.php I want to use. The /var/www/application/intern directory is also there. While it isn't browsable anymore, the files in it are still accessible. /var/www/application/intern/file.php can be reached via example.com/intern/file.php, but it seems it can't be downloaded or read, as it results in a blank page. /var/www/application/intern/file.php.bak, however, can be downloaded via example.com/intern/file.php.bak.
Let's say the Apache document root is set to DocumentRoot "/folder_one/folder_two".
Placing files in folder_one will prevent people from browsing your Apache server and requesting those files directly.
Place an index file in folder_two and include some code, such as PHP, to tell Apache to include whatever files you want from folder_one.
In this manner, Apache will still be able to serve whatever files you want from folder_one, and people will not be able to request the files directly, as they are located in a directory above the Apache document root.
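A minimal sketch of that index file, assuming the protected scripts were moved up to /folder_one/intern (the paths and filename are illustrative):

<?php
// /folder_one/folder_two/index.php - lives inside the document root.
// The protected files live one level above the document root, so
// Apache can never serve them directly, but PHP can still read them.
include __DIR__ . '/../intern/file.php';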
How do I customize the listing that the web server generates in the absence of an index.* file in the web root or one of its subdirectories, given that we do not put any index files in the web root directory and the directory has read permission?
You can set the page to show for a directory URL with the index directive; it doesn't need to point to something called index.*, it might just as well be whatever.html. See http://nginx.org/en/docs/http/ngx_http_index_module.html#index for details.
Or you can set autoindex on to get a generated file/directory listing; you can use autoindex_exact_size and autoindex_localtime to further customize that listing. See http://nginx.org/en/docs/http/ngx_http_autoindex_module.html for details.
A third option, if your nginx is compiled with it, is random_index; see http://nginx.org/en/docs/http/ngx_http_random_index_module.html for details.
NOTE: to find out whether your nginx is compiled with the required --with-http_random_index_module option, use the command nginx -V.
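A minimal sketch combining the first two options in a server block (the root path is illustrative):

server {
    listen 80;
    root /var/www/html;  # illustrative document root

    # Option 1: serve any named file for directory URLs
    # index whatever.html;

    # Option 2: generate a listing for directories without an index file
    location / {
        autoindex on;
        autoindex_exact_size off;  # show human-readable sizes
        autoindex_localtime on;    # timestamps in local time, not UTC
    }
}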
Well, it depends on what webserver you are using.
In the case of Apache, directory indexes are handled by a module called mod_autoindex.
When you want to customize the directory listing, you have to know that Apache uses three 'view' parts:
The header: by default automatically generated by Apache
The directory listing: always generated by Apache
The footer: referred to as the "Readme" file
The header and footer parts are written in plain HTML. The directory listing itself is generated by Apache, but you can apply CSS to it.
The whole thing is a rather long story, so what I can suggest is a well-written article with the details of this directory-listing customization:
Better Default Directory Views with .htaccess
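As a starting point, a hedged sketch of the relevant mod_autoindex directives in an .htaccess file (the fragment file names are illustrative):

# Allow the generated listing and render it as an HTML table
Options +Indexes
IndexOptions FancyIndexing HTMLTable

# Plain-HTML fragments shown above and below the listing
HeaderName header.html
ReadmeName footer.html

# Keep the fragments themselves out of the listing
IndexIgnore header.html footer.html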
I have a number of subdomains which use a crossdomain.xml file, and I'm looking for a simple way of managing them all, as they get semi-regularly updated. One way I've thought of is a PHP script which pushes and overwrites the XML file. The other, which I much prefer, is an Apache redirect on a single file.
So, the question is: how would I, across multiple domains, redirect an XML file on dom1.domain.com and dom2.wirewax.com to the same crossdomain.xml file without Flash getting upset about it, i.e. not a 302 HTTP redirect, but internal file fetching?
You can write a PHP script that fetches the content from a single location (a database or text file) and sends it as-is to Flash. Yes, the script itself needs to be copied to all hosts.
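A minimal sketch of such a script, assuming the shared policy file lives at a path readable by every host; the path and the internal rewrite are illustrative:

<?php
// crossdomain.php - copied to each host; serves the shared policy.
// An internal rewrite would map /crossdomain.xml to this script, e.g.:
//   RewriteRule ^crossdomain\.xml$ /crossdomain.php [L]
header('Content-Type: text/xml');
readfile('/path/to/shared/crossdomain.xml');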
If you have all the websites hosted on the same webserver, perhaps mod_alias could help:
Alias /crossdomain.xml /path/to/shared/crossdomain.xml
I have not personally tested this. The reference page includes instructions to set up the shared directory so that it can be read by multiple hosts.
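A hedged sketch of the complete configuration, including the access grant the shared directory needs (Apache 2.4 syntax; the path is illustrative):

# Map /crossdomain.xml on every vhost to one shared copy
Alias /crossdomain.xml /path/to/shared/crossdomain.xml
<Directory /path/to/shared>
    Require all granted
</Directory>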
I have a directory on my website specifically for JavaScript files. I want these JavaScript files to be hidden, so that if I type the URL to one it says Forbidden or disallows access, but my front-end website files can still access them to execute them when needed. Is there a way to do this through an FTP client?
Cheers,
Dan
You can't do this through an FTP client. It is the task of your webserver to forbid access to certain files.
If you change the permissions, the webserver won't have access to the files anymore, so this is not the way to go.
You must configure your webserver to restrict the access. If you're using Apache, you can use an .htaccess file. There are different ways of doing this; much depends on the way the webserver is configured.
The easiest is to put an .htaccess file in your scripts folder which contains only this one line:
deny from all
However, like peeter said, there's a good chance this will break your site, since the browser must access these files, so you can't restrict access.
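For reference, on Apache 2.4 or later the same one-line .htaccess would use the newer syntax:

Require all denied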
Put an .htaccess file in your scripts folder containing deny from all, but this will stop your pages from accessing the scripts as well (though not if you pass them through the PHP engine first).
You're trying to hide JavaScript files that are executed on the client side. If a client (browser) cannot access the files, none of your JavaScript code gets executed.
If I understood your question correctly, then you cannot achieve what you're trying to achieve.
I have our basic corporate static HTML website installed in our web root directory and our billing software installed in /portal. I have integrated the sites to look like a single site by including the /menu.tpl Smarty template file in the /portal/header.tpl file. However, if I use relative URLs, the menu system doesn't work, as the base URL for the billing script is /portal. That is, if I create a link to faq.php in menu.tpl and load a page on the portal site, the link in the menu back to the FAQ page becomes /portal/faq.php, whereas if I load a page off the root site the link is just /faq.php, as it should be.
The obvious answer is to just use absolute URLs, but I need the site to be portable, as I have many developers who need to install and test it.
I can't find any way to resolve this. Any ideas?
I ran into the same problem as you a while ago, and after trying a lot of dead ends, I finally ended up with the following solution:
For any URL that needs to be a chameleon, i.e. change its path depending on the environment, insert a PHP function that writes out the correct URL.
If you include the PHP function from a single central file, then you can change all of the URLs in the entire site automatically, based on a setting or some pre-detected switch such as the current domain name.
Example:
<?php print_base_url_plus("/menu.php"); ?>
... where print_base_url_plus() is a function which prefixes the given path with the base URL before printing it.
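A minimal sketch of such a helper, kept in one central include; the BASE_URL constant and its fallback are assumptions to be adapted per environment:

<?php
// urlhelper.php - included by every page from one central file.
// Assumption: each installation defines BASE_URL ('' for the web
// root, '/portal' for the billing install, a full URL for others).
function print_base_url_plus($path) {
    $base = defined('BASE_URL') ? BASE_URL : '';  // assumed setting
    echo $base . $path;
}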
You may find that you have to change some of the URLs to be .php, so they are preprocessed by the PHP engine, or you can alter the web settings so that standard .htm files are piped through the PHP engine, just like .php files.
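A hedged sketch of that last option, assuming mod_php (PHP-FPM setups differ):

# Treat plain .htm files as PHP so the helper calls get processed
<FilesMatch "\.htm$">
    SetHandler application/x-httpd-php
</FilesMatch>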