OpenShift sets several environment variables that can be used; see the OPENSHIFT_* variables documented at https://developers.openshift.com/en/managing-environment-variables.html.
Has anyone used these variables in their .htaccess file or can anyone assist in doing so?
My use case is specifically a Mojolicious app, as getting it to work on OpenShift requires the following line in the .htaccess file, where [ID HERE] refers to the OPENSHIFT_APP_UUID value. I manually type the value there now, and in theory it should not change, but I'd like to allow the app to be deployed by different users without them needing to look up their ID and edit the .htaccess file.
PerlSetVar psgi_app /var/lib/openshift/[ID HERE]/app-root/runtime/repo/perl/index.pl
You might be best off using an action hook to write that .htaccess file post-deploy, so it dynamically includes the values of the environment variables that you want. Something like this in your .openshift/action_hooks/post_deploy:
echo "PassEnv ${OPENSHIFT_REPO_DIR}" >> ${OPENSHIFT_REPO_DIR}.htaccess
echo "PerlSetVar psgi_app ${OPENSHIFT_REPO_DIR}perl/index.pl" >> ${OPENSHIFT_REPO_DIR}.htaccess
I want to archive an old website which was built with PHP. Its URLs are full of .phps and query strings.
I don't want anything to actually change from the perspective of the visitor -- the URLs should remain the same. The only actual difference is that it will no longer be interactive or dynamic.
I ran wget --recursive to spider the site and grab all the static content. So now I have thousands of files such as page.php?param1=a&param2=b. I want to serve them up as they were before, so that means they'll mostly have Content-Type: text/html, and the webserver needs to treat ? and & in the URL as literal ? and & in the files it looks up on disk -- in other words it needs to not support query strings.
And ideally I'd like to host it for free.
My first thought was Netlify, but deployment on Netlify fails if any files have ? in their filename. I'm also concerned that I may not be able to tell it that most of these files are to be served as text/html (and one as application/rss+xml) even though there's no clue about that in their filenames.
I then considered https://surge.sh/, but hit exactly the same problems.
I then tried AWS S3. It's not free but it's pretty close. I got further here: I was able to attach metadata to the files I was uploading so each would have the correct content type, and it doesn't mind the files having ? and & in their filenames. However, its webserver interprets ?... as a query string, and it looks up and serves the file without that suffix. I can't find any way to disable query strings.
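Attaching the content type as metadata at upload time can be done with the AWS CLI, for example (the bucket name here is a placeholder):

aws s3 cp page.php s3://my-archived-site/page.php --content-type "text/html"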
Did I miss anything -- is there a way to make any of the above hosts act the way I want them to?
Is there another host which will fit the bill?
If all else fails, I'll find a way to transform all the filenames and all the links between the files. I found how to get wget to transform ? to #, which may be good enough. It would be a shame to go this route, however, since then the URLs are all changing.
I found a solution with Netlify.
I added the wget options --adjust-extension and --restrict-file-names=windows.
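The full crawl command ended up looking roughly like this (the domain is a placeholder, and the exact set of other flags will depend on the site):

wget --recursive --adjust-extension --restrict-file-names=windows https://old-site.example.com/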
The --adjust-extension part adds .html at the end of filenames which were served as HTML but didn't already have that extension, so now we have for example index.php.html. This was the simplest way to get Netlify to serve these files as HTML. It may be possible to skip this and manually specify the content types of these files.
The --restrict-file-names=windows alters filenames in a few ways, the most important of which is that it replaces ? with #. This is needed since Netlify doesn't let us deploy files with ? in the name. It's a bit of a hack; this is not really what this option is meant for.
This gives static files with names like myfile.php#param1=value1&param2=value2.html and myfile.php.html.
I did some cleanup. For example, I needed to adjust a few link and resource paths to be absolute rather than relative due to how Netlify manages presence or lack of trailing slashes.
I wrote a _redirects file to define URL rewriting rules. As the Netlify redirect options documentation shows, we can test for specific query parameters and capture their values. We can use those values in the destinations, and we can specify a 200 code, which makes Netlify handle it as a rewrite rather than a redirection (i.e. the visitor still sees the original URL). An exclamation mark is needed after the 200 code if a "query-string-less" version (such as mypage.php.html) exists, to tell Netlify we are intentionally shadowing.
/mypage.php param1=:param1 param2=:param2 /mypage.php#param1=:param1&param2=:param2.html 200!
/mypage.php param1=:param1 /mypage.php#param1=:param1.html 200!
/mypage.php param2=:param2 /mypage.php#param2=:param2.html 200!
Of course, if not all query parameter combinations are actually used in the dumped files, then not all of the redirect lines need to be included.
There's no need for a final /mypage.php /mypage.php.html 200 line, since Netlify automatically looks for a file with a .html extension added to the requested URL and serves it if found.
I wrote a _headers file to set the content type of my RSS file:
/rss.php
  Content-Type: application/rss+xml
I hope this helps somebody.
How can I change Apache's hard-coded error pages, instead of using the ErrorDocument directive? I do not want to place the ErrorDocument file inside the htdocs folder, as it creates some issues when the user visits the error page itself, such as the REDIRECT_URL environment variable not working, and things like that. I have tried looking in the Apache directory, but no luck; I cannot find anything that can be modified to change the hard-coded error pages themselves. Is there a way to do that at all?
You can edit them if you install Apache from source. Download the source from httpd.apache.org/download.cgi, find the files containing the error text with grep -rnw '/path/to/somewhere/' -e 'texttofind' (on Linux), modify them to your needs (carefully, of course), and then compile and use it.
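For example, to locate the canned 404 text in an extracted 2.4.x source tree (the directory name and search string here are just illustrative; the hard-coded responses live in modules/http/http_protocol.c, if I remember correctly):

# search the extracted source tree for the text of the default 404 page
grep -rn ./httpd-2.4.x -e 'was not found on this server'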
But I would suggest you just stick with the ErrorDocument directive, which is much simpler.
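For example, in your server configuration or an .htaccess file (the document paths here are just examples):

ErrorDocument 404 /errors/not_found.html
ErrorDocument 500 /errors/server_error.html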
Happy Cod3ing.
Using HostGator, I can't seem to get SSI to work on my server. I'm using Dreamweaver to build the site and everything works just fine in the preview. But when I actually upload the pages to my server, any elements that come from include files don't appear. Does anyone know how I can enable SSI on my web server?
Your last comment gave me the information I needed. The issue is that footer.inc is not in the same directory as the file you're trying to include it into. Try this code:
<!--#include virtual="includes/footer.inc" -->
When using the file= parameter, the file you're including must be in the same directory. If it is not in the same directory, then you will have to use virtual=. See this page for more information: SSI: The Include Command.
And here, from the source, is pretty much the rule of thumb: Use file= when the included file is within the same directory as the page that wants it. Use virtual= when it isn't.
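To illustrate that rule of thumb (the paths here are hypothetical):

<!--#include file="footer.inc" -->
<!--#include virtual="/includes/footer.inc" -->

The first form only works when footer.inc sits next to the page that includes it; the second resolves the path as a URL (here relative to the document root), so it works from anywhere.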
EDIT: I think I got it now. Copy and paste the above code and it should work for you. Make sure you follow these guidelines: after <!--, there is no space between the last - and the #. Additionally, there is a space between the closing " and the first - of -->. These rules must be adhered to. You can view more information here: Server Side Includes Not Working
Sorry about my English level.
I have researched a lot, and I found that I can use ".htaccess" to redirect to a subdomain folder, and that part is OK.
In Drupal I need to create a folder for each subdomain, such as "/sites/sub.example.com/", copy "default.settings.php" from the default folder ("/sites/default/default.settings.php") and rename it to "settings.php", then enable the "$databases" variable in that same file. Once that's done, I need to add a wildcard and modify the "hosts" file.
Well, I would like to "automate" all of this, but I don't know whether that is harder than it sounds, since it's important to keep the server safe with regard to write permissions, or whether I should try another way. Could someone advise me?
I'm working on OS X and Drupal 7.x (recent release).
Thank you very much.
For each site that should use a separate database, create its own sites/ directory with a settings.php. For example, if you want one database for example.com, another for sub1.example.com, and a third for sub2.example.com, all using the same code base, set up your files like this:
sites/example.com/settings.php
sites/sub1.example.com/settings.php
sites/sub2.example.com/settings.php
with each settings.php using different database credentials.
Read more here - https://drupal.org/documentation/install/multi-site
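If you do want to script the per-site setup described in the question, a rough sketch could look something like this (the site name, database name, and credentials are all placeholders):

#!/bin/bash
# create a sites/ directory for a new subdomain and give it its own database settings
set -e
SITE=sub1.example.com
DB_NAME=sub1_db
DB_USER=sub1_user
DB_PASS=changeme

mkdir -p "sites/$SITE"
cp sites/default/default.settings.php "sites/$SITE/settings.php"

# append a Drupal 7 $databases array pointing at this site's database
cat >> "sites/$SITE/settings.php" <<EOF
\$databases['default']['default'] = array(
  'driver' => 'mysql',
  'database' => '$DB_NAME',
  'username' => '$DB_USER',
  'password' => '$DB_PASS',
  'host' => 'localhost',
  'prefix' => '',
);
EOF

# keep settings.php readable but not writable
chmod 444 "sites/$SITE/settings.php"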
Also, if you want to automate this and there is a bigger number of sites to be managed, consider deploying Aegir - http://www.aegirproject.org.
I hope I understood your question correctly.
I'm using a script on several web hosting providers and just want to transfer the whole script to the server when a new version is released.
But every server has its own absolute path to the AuthFile (for directory protection): .htaccess needs an absolute path to the AuthFile, and this path is different on every server.
My first approach was to use one .htaccess file with several <Directory> directives, each with an absolute path, setting the AuthFile for the specific server.
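Roughly like this (the paths are just examples, and the other auth directives are omitted):

<Directory /var/www/vhosts/mysite/htdocs>
    AuthUserFile /var/www/vhosts/mysite/.htpasswd
</Directory>
<Directory /home/myuser/htdocs>
    AuthUserFile /home/myuser/protected/.htpasswd
</Directory>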
But I got a 500 internal server error: .htaccess: <Directory not allowed here
The second idea was to use SetEnv and <IfDefine>. But IfDefine is not able to read environment variables, as shown in this blog entry.
The specific paths and servers are known.
Is there a way to find out on which server the .htaccess is called and to set the specific path for the AuthFile?
I have now solved this problem by creating a script which parses the .htaccess files and checks whether the paths are set correctly for the specific server.
Not the solution I hoped for, but better than no solution :-/
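For reference, a rough sketch of the kind of script I mean, assuming the directive in question is AuthUserFile (the hostnames and paths are just examples):

#!/bin/bash
# check (and fix) the AuthUserFile path in .htaccess for the server this runs on
set -e
case "$(hostname -f)" in
  web1.hoster-a.example) AUTHFILE=/var/www/vhosts/mysite/.htpasswd ;;
  web2.hoster-b.example) AUTHFILE=/home/myuser/protected/.htpasswd ;;
  *) echo "unknown server: $(hostname -f)" >&2; exit 1 ;;
esac
if ! grep -q "^AuthUserFile $AUTHFILE" .htaccess; then
  echo "AuthUserFile is wrong for this server, rewriting it" >&2
  sed -i "s|^AuthUserFile .*|AuthUserFile $AUTHFILE|" .htaccess
fi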