Enable the use of SSI - Apache

Using HostGator, I can't seem to get SSI to work on my server. I'm using Dreamweaver to build the site, and everything works just fine in the preview. But when I actually upload the pages to my server, any elements that come from include files don't appear. Does anyone know how I can enable SSI on my web server?

Your last comment gave me the information I needed. The issue is that footer.inc is not in the same directory as the file you're trying to include it from. Try this code:
<!--#include virtual="includes/footer.inc" -->
When using the file= parameter, the file you're including must be in the same directory as the page. If it isn't, you will have to use virtual=. See this page for more information: SSI: The Include Command.
And here, from the source, is pretty much the rule of thumb: Use file= when the included file is within the same directory as the page that wants it. Use virtual= when it isn't.
EDIT: I think I got it now. Copy and paste the above code and it should work for you. Make sure you follow these guidelines: after <!--, there is no space between the last - and #. Additionally, there is a space between the closing " and the first -. These rules must be adhered to. You can view more information here: Server Side Includes Not Working
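If the includes still aren't processed at all even with the correct syntax, SSI may simply not be enabled for the directory. A minimal .htaccess sketch (assuming HostGator allows these overrides and that your pages use the .shtml extension):
Options +Includes
AddType text/html .shtml
AddOutputFilter INCLUDES .shtml
If your pages are plain .html and you don't want to rename them, the INCLUDES filter can be applied to .html instead, at the cost of Apache parsing every HTML page.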


Archiving an old PHP website: will any webhost let me totally disable query string support?

I want to archive an old website which was built with PHP. Its URLs are full of .php extensions and query strings.
I don't want anything to actually change from the perspective of the visitor -- the URLs should remain the same. The only actual difference is that it will no longer be interactive or dynamic.
I ran wget --recursive to spider the site and grab all the static content. So now I have thousands of files such as page.php?param1=a&param2=b. I want to serve them up as they were before, so that means they'll mostly have Content-Type: text/html, and the webserver needs to treat ? and & in the URL as literal ? and & in the files it looks up on disk -- in other words it needs to not support query strings.
And ideally I'd like to host it for free.
My first thought was Netlify, but deployment on Netlify fails if any files have ? in their filename. I'm also concerned that I may not be able to tell it that most of these files are to be served as text/html (and one as application/rss+xml) even though there's no clue about that in their filenames.
I then considered https://surge.sh/, but hit exactly the same problems.
I then tried AWS S3. It's not free but it's pretty close. I got further here: I was able to attach metadata to the files I was uploading so each would have the correct content type, and it doesn't mind the files having ? and & in their filenames. However, its webserver interprets ?... as a query string, and it looks up and serves the file without that suffix. I can't find any way to disable query strings.
Did I miss anything -- is there a way to make any of the above hosts act the way I want them to?
Is there another host which will fit the bill?
If all else fails, I'll find a way to transform all the filenames and all the links between the files. I found how to get wget to transform ? to #, which may be good enough. It would be a shame to go this route, however, since then the URLs are all changing.
I found a solution with Netlify.
I added the wget options --adjust-extension and --restrict-file-names=windows.
The --adjust-extension part adds .html at the end of filenames which were served as HTML but didn't already have that extension, so now we have for example index.php.html. This was the simplest way to get Netlify to serve these files as HTML. It may be possible to skip this and manually specify the content types of these files.
The --restrict-file-names=windows alters filenames in a few ways, the most important of which is that it replaces ? with #. This is needed since Netlify doesn't let us deploy files with ? in the name. It's a bit of a hack; this is not really what this option is meant for.
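Putting the options together, the crawl looked roughly like this (a sketch: the original command isn't shown in full, the domain is a placeholder, and --page-requisites is only an assumption for pulling in CSS, JavaScript and images):
wget --recursive --page-requisites --adjust-extension --restrict-file-names=windows https://www.example.com/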
This gives static files with names like myfile.php#param1=value1&param2=value2.html and myfile.php.html.
I did some cleanup. For example, I needed to adjust a few link and resource paths to be absolute rather than relative due to how Netlify manages presence or lack of trailing slashes.
I wrote a _redirects file to define URL rewriting rules. As the Netlify redirect options documentation shows, we can test for specific query parameters and capture their values. We can use those values in the destinations, and we can specify a 200 code, which makes Netlify handle it as a rewrite rather than a redirection (i.e. the visitor still sees the original URL). An exclamation mark is needed after the 200 code if a "query-string-less" version (such as mypage.php.html) exists, to tell Netlify we are intentionally shadowing.
/mypage.php param1=:param1 param2=:param2 /mypage.php#param1=:param1&param2=:param2.html 200!
/mypage.php param1=:param1 /mypage.php#param1=:param1.html 200!
/mypage.php param2=:param2 /mypage.php#param2=:param2.html 200!
Of course, if not all query parameter combinations are actually used in the dumped files, not all of the redirect lines need to be included.
There's no need for a final /mypage.php /mypage.php.html 200 line, since Netlify automatically looks for a file with a .html extension added to the requested URL and serves it if found.
I wrote a _headers file to set the content type of my RSS file:
/rss.php
  Content-Type: application/rss+xml
I hope this helps somebody.

Securing GitLab Pages with Let's Encrypt gets 404

I am following this tutorial https://about.gitlab.com/2016/04/11/tutorial-securing-your-gitlab-pages-with-tls-and-letsencrypt/
The next step's instructions are:
Make sure your web server displays the following content at
http://YOURDOMAIN.org/.well-known/acme-challenge/5TBu788fW0tQ5EOwZMdu1Gv3e9C33gxjV58hVtWTbDM
before continuing:
5TBu788fW0tQ5EOwZMdu1Gv3e9C33gxjV58hVtWTbDM.ewlbSYgvIxVOqiP1lD2zeDKWBGEZMRfO_4kJyLRP_4U
#
# output omitted
#
Press ENTER to continue
The tutorial uses Jekyll, but I don't use a static HTML generator like Jekyll; my files are all static HTML. I created the exact path under the root folder: /.well-known/acme-challenge/PukY0bbiH3nRfciQ4IzwTDIXFn4G5sZ5I-LkMz3-KHE.html
But after the pipeline jobs are done, I am still getting a 404. What's the problem here?
I had the same problem yesterday and found the solution; I hope it's not too late to share it with you. According to this tutorial, the ".well-known" folder should be under the "public" folder.
Let's Encrypt needs to access an .html file at the following path via the browser:
http://YOURDOMAIN.org/.well-known/acme-challenge/5TBu788fW0tQ5EOwZMdu1Gv3e9C33gxjV58hVtWTbDM
To do this, you must create an "index.html" file at the path below inside your GitLab repository.
public/.well-known/acme-challenge/5TBu788fW0tQ5EOwZMdu1Gv3e9C33gxjV58hVtWTbDM/index.html
In the "index.html" file you should put only the following sentence:
5TBu788fW0tQ5EOwZMdu1Gv3e9C33gxjV58hVtWTbDM.ewlbSYgvIxVOqiP1lD2zeDKWBGEZMRfO_4kJyLRP_4U
Important: do not put in any HTML tags, just the plain text above.
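For example, from the root of the repository (a sketch that reuses the token shown above; your own token and authorization string will differ):
mkdir -p public/.well-known/acme-challenge/5TBu788fW0tQ5EOwZMdu1Gv3e9C33gxjV58hVtWTbDM
echo "5TBu788fW0tQ5EOwZMdu1Gv3e9C33gxjV58hVtWTbDM.ewlbSYgvIxVOqiP1lD2zeDKWBGEZMRfO_4kJyLRP_4U" > public/.well-known/acme-challenge/5TBu788fW0tQ5EOwZMdu1Gv3e9C33gxjV58hVtWTbDM/index.html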
Then just continue following the tutorial. Good luck.

Server side: detecting if a user is downloading (Save as...) or viewing a file in the browser

I'm writing an Apache2 module.
By default, when the file is viewed in a web browser, the module would only print the first lines of a large file and convert them to HTML.
If the user chooses 'Save as...', the whole raw file would be downloaded.
Is it possible to detect this choice on the server side? For example, is there a specific HTTP header set?
Note: I would like to avoid any parameter in the GET URL (e.g. "http://example.org/file?mode=raw").
Pierre
I added my own answer to close the question: as #alexeyten said, there is no difference. I ended up with JavaScript code that alters the index.html file generated by Apache.

Sitefinity: Need to make a PDF available on a very specific URL but can't do it

I have a website on SiteFinity 4.4. I need to make a document available on a very specific URL, i.e.
http://www.example.com/reports/the-report.pdf
If I just create a directory in the root of the site it does not work (503 error). Also when I try to use the 302Redirect.xml file to redirect the URL to the PDF it does not work either (same error). The link has already been published and has to be exactly as specified. How do I solve this?
Any help would be greatly appreciated.
Sitefinity wouldn't block a folder. Adding a physical folder and dropping the report in the proper place should work, so it probably means you'll have to check your server configuration.
Anyway, the fastest way outside Sitefinity would be to just create an IIS rewrite rule. Make http://www.example.com/reports/the-report.pdf the pattern and redirect it to the URL of the document in the Sitefinity library.
When you upload a document to the library in Sitefinity, it gives you a direct URL, something like /docs/defaultlibrary/document. You can verify the URL by going to Content >> Documents and Files and choosing "Embed link to this file". That gives you a pop-up with the URL.
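A sketch of such a rule in web.config (assuming the IIS URL Rewrite module is installed; /docs/defaultlibrary/the-report is a placeholder for whatever URL Sitefinity gives your document):
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="ReportPdf" stopProcessing="true">
          <!-- the published URL that must keep working -->
          <match url="^reports/the-report\.pdf$" />
          <!-- serve the document from the Sitefinity library instead -->
          <action type="Rewrite" url="/docs/defaultlibrary/the-report" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
Using a Rewrite action rather than a Redirect keeps /reports/the-report.pdf in the visitor's address bar, which matches the requirement that the published link stay exactly as specified.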

How to use relative URLs in a website with two base URLs

I have our basic corporate static HTML website installed in our web root directory and our billing software installed in /portal. I have integrated the websites to look like a single site by including the /menu.tpl Smarty template file in the /portal/header.tpl file. However, if I use relative URLs, the menu system doesn't work, because the base URL for the billing script is /portal. That is, if I create a link to faq.php in menu.tpl and load a page on the portal site, the link in the menu back to the FAQ page is now /portal/faq.php, whereas if I load a page off the root site the link is just /faq.php, as it should be.
The obvious answer is to just use absolute URLs, but I need the site to be portable, as I have many developers who need to install and test it.
I can't find any way to resolve this. Any ideas?
I ran into the same problem as you a while ago, and after trying a lot of dead ends, I finally ended up with the following solution:
For any URL that needs to be a chameleon, i.e. change its path depending on the environment, insert a PHP function that writes out the correct URL.
If you include the PHP function from a single central file, then you can change all of the URLs in the entire site automatically, based on a setting or some pre-detected switch such as the current domain name.
Example:
<?php print_base_url_plus("/menu.php"); ?>
... where print_base_url_plus() is a function which appends the base URL onto the output.
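A minimal sketch of such a central include (print_base_url_plus() is the only name taken from the answer above; the file name, constant and configuration approach are illustrative assumptions):
<?php
// url_helpers.php -- included once from a single central place.
// Configure the base URL per environment: production leaves it empty,
// developers point it at their own install path.
define('SITE_BASE_URL', '');   // e.g. '' in production, '/~alice/site' on a dev box

function print_base_url_plus($path) {
    // Prepend the environment's base URL to a site-root-relative path.
    echo SITE_BASE_URL . $path;
}
?>
Instead of a hard-coded constant, the value could equally be derived from a per-developer config file or from the current domain name, as suggested above.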
You may find that you have to change some of the pages to .php so they are preprocessed by the PHP engine, or you can alter the web server settings so that standard .htm files are piped through the PHP engine, just like .php files.
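On Apache, that last option is often a one-line .htaccess change, though the exact handler name depends on how PHP is installed on the host, so treat this as a sketch:
AddHandler application/x-httpd-php .htm .html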