How to force SSL for a checkout link for a webstore - Apache

I've browsed through several questions on the site and nothing quite matched what I want to do.
I found one question that could possibly work best, provided it only exempts that single page from SSL and doesn't disable SSL on everything else inside the folder. (Force redirect to SSL for all pages apart from one)
Basically, what I need is this:
I need the '.../store' link on my site to remain with SSL off, but I want to force SSL for everything else in the store, in particular '.../store/index.php?xlspg=checkout'.
The reason the /store link has to stay off SSL is that it conflicts with the admin panel login. That is the only direct link that cannot have SSL, so I'm also not sure of the best way to handle this.
The question I linked above seems like it would work, as I said, as long as it doesn't affect anything deeper in the store besides that sole page itself.
Any help would be greatly appreciated!

Can you instead construct your link to the admin panel so that it avoids SSL? For example, if your admin panel link is something like:
<a href="https://www.example.com/admin">Admin panel</a>
change it instead to:
<a href="http://www.example.com/admin">Admin panel</a>
Then you can use SSL by default, except for the admin panel.

Try this rule:
# any non-HTTPS request for store/index.php gets redirected to https
RewriteCond %{HTTPS} !=on
RewriteRule ^store/index\.php$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
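If you need the broader behaviour from the question - HTTPS for everything under /store except the bare /store page itself - a sketch along these lines may work (the paths come from the question; adjust them to your store):
RewriteEngine On
# force HTTPS for anything under /store except the /store landing page
RewriteCond %{HTTPS} !=on
RewriteCond %{REQUEST_URI} !^/store/?$
RewriteRule ^store/ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
# and, if required, send the bare /store page back to plain HTTP
RewriteCond %{HTTPS} =on
RewriteRule ^store/?$ http://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]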


How to move opencart site http to https

I have already installed an SSL certificate on my OpenCart site; some pages work fine with https, but category pages do not. Do I need to change all the URLs in the database as well? In the config file, I have already set https.
Some of these may not apply to your particular installation but in the interest of creating a comprehensive answer, I've tried to cover all the bases here:
Note: you might need to adjust the table names depending on your store's table prefix if they don't begin with oc_
Open config.php and admin/config.php and change all the constant URL declarations to https - make sure to include HTTP_SERVER and HTTP_CATALOG.
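For reference, on a stock OpenCart install those declarations look roughly like this (a sketch - the domain is a placeholder for your own):
<?php
// config.php - the same constants exist in admin/config.php, which also
// defines HTTP_CATALOG and HTTPS_CATALOG
define('HTTP_SERVER', 'https://www.example.com/');
define('HTTPS_SERVER', 'https://www.example.com/');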
In your admin panel go to System > Settings, click edit, and on the Server tab set Use SSL: to Yes.
In your database, update the store_url column in the oc_order table so that all links are https. This is important because updating orders can fail if the API attempts to access the http version of your site. You can use this query:
UPDATE oc_order SET store_url = REPLACE(store_url, 'http:', 'https:');
If you have any hard-coded images and links in your description tables you should replace those as well. SSL will still work, but the browser will show a mixed-content warning in the address bar. This includes oc_product_description, oc_category_description, and any other tables where you might have created html content.
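For example, for product and category descriptions (a sketch assuming the default oc_ prefix; swap in your actual domain):
UPDATE oc_product_description SET description = REPLACE(description, 'http://www.example.com', 'https://www.example.com');
UPDATE oc_category_description SET description = REPLACE(description, 'http://www.example.com', 'https://www.example.com');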
The same goes for your theme files. It's fairly common to find hard-coded http:// links and images in footer.tpl and header.tpl, for starters. You can simply browse your site to see if any of the pages are not showing the green lock icon in the browser and take it from there.
Another culprit that can break https is third-party extensions, which can exist both as files and, in OC2, as ocmods in the oc_modification table.
Finally, create a redirect in .htaccess to gracefully let traffic know that your pages can now be found on https. I've excluded robots.txt and any connections for the openbay routes because, based on experience, when I tried to redirect the ebay webhooks it broke things, and they seem to be http-only by default. I suspect this may be a shortcoming in how openbay handles those requests, or possibly a configuration issue, but I was unable to find a workaround that didn't break openbay, so for now I'd recommend leaving those requests untouched. I am using this in .htaccess:
# redirect everything to https, except robots.txt and the openbay/ebay routes
RewriteCond %{HTTPS} off
RewriteCond %{REQUEST_URI} !/robots\.txt$
RewriteCond %{QUERY_STRING} !^route=ebay/openbay
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
That should do it!

A .htaccess trick to allow only one page from a sub-domain to be opened

Here is the situation, and I need a little help with it:
I have a domain xxxxx.com and a sub-domain upload.xxxxx.com; both lead to the directory /www/xxxxx.com/. I am using the second domain for file uploading: since I am using Cloudflare's services, the second domain gets no performance optimization and therefore no trouble with the upload (I've created cookies that are valid for both domains).
But my point is: because I am using only one file for that second domain, upload.xxxxx.com/upload.php, the same file exists under xxxxx.com/upload.php. I am not really good with .htaccess, so how can I make upload.php the only page that can be opened from this subdomain, and redirect all the others to the main domain?
You can use this rule as the first rule in your DOCUMENT_ROOT/.htaccess file:
RewriteEngine On
# if upload.php is requested on any host other than upload.*, send the
# visitor back to the site root
RewriteCond %{HTTP_HOST} !^upload\. [NC]
RewriteRule ^upload\.php$ /? [NC,L,R]
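For the direction the question actually asks about - on the upload subdomain, allow only upload.php and push everything else back to the main domain - a sketch like this could sit right after the rule above (xxxxx.com stands in for the real domain):
# on upload.*, anything other than /upload.php goes back to the main domain
RewriteCond %{HTTP_HOST} ^upload\. [NC]
RewriteCond %{REQUEST_URI} !^/upload\.php$ [NC]
RewriteRule ^(.*)$ http://xxxxx.com/$1 [L,R=301]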
What I did in the end - I think it's good for users who are interested to know:
I added cross-domain support for the cookies and added the additional domain, upload.*
I left the upload.* domain only on redirect rules via Cloudflare (no optimization)
I redirected the upload form to upload.xxxxx.com/upload.php
I made the cookie cross-domain based on login, so when you log in on the main site you're logged in on the upload.* one too.
When you submit the form it uploads really fast via upload.* and redirects back to the preview page on the original domain.
Errors are handled via cookies passed from upload.* back to the normal domain.
Hope this helps :)

Prevent users from accessing files using non apache-rewritten urls

Maybe a noob question, but I'm just starting to play around with Apache and have not found a precise answer yet.
I am setting up a web app that uses url-rewriting heavily, to show nice urls like [mywebsite.com/product/x] instead of [mywebsite.com/app/controllers/product.php?id=x].
However, I can still access the required page by typing the url [mywebsite.com/app/controllers/product.php?id=x]. I'd like to make that impossible, i.e. redirect people to an error page if they do so, and allow them to access this page with the "rewritten" syntax only.
What would be the easiest way to do that? And do you think it is a necessary measure to secure an app?
In your PHP file, examine $_SERVER['REQUEST_URI'] and ensure the page is being accessed the way you want it to be.
There is no reason why this should be a security issue.
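A minimal sketch of that check, assuming the paths from the question (the prefix being tested is an assumption about your layout):
<?php
// at the top of app/controllers/product.php: if the browser hit this file
// directly, REQUEST_URI starts with /app/controllers/ instead of /product/
if (strpos($_SERVER['REQUEST_URI'], '/app/controllers/') === 0) {
    header('HTTP/1.1 404 Not Found');
    exit;
}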
# only redirect direct hits: REDIRECT_URL is empty on an external request,
# but is set by the internal rewrite performed in the last rule
RewriteCond %{REDIRECT_URL} ^$
RewriteCond %{QUERY_STRING} ^id=(.+)$
RewriteRule ^app/controllers/product\.php$ /product/%1? [R,L]
RewriteRule ^product/(.*)$ /app/controllers/product.php?id=$1 [L]
The first rule redirects any request for /app/controllers/product.php that arrives with no REDIRECT_URL variable set to the clean url (the trailing ? drops the old query string). The rewrite (last rule) sets this variable when calling the real page, so the internal request won't be redirected.

How to prevent a search engine from indexing a directory for a particular domain?

I have a web hosting package with two domains pointing to it. I've noticed on Google that it has indexed the directory of one of the domains under the other domain. Is there a way of preventing this from happening?
You could try the Robots exclusion standard, but it is no guarantee.
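Since both domains share one document root, one way to apply it per domain is to serve a different robots file per host via mod_rewrite (a sketch; the domain and file name are placeholders):
RewriteEngine On
# serve a dedicated robots file for the second domain only
RewriteCond %{HTTP_HOST} ^(www\.)?example2\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-example2.txt [L]
# robots-example2.txt would then contain:
#   User-agent: *
#   Disallow: /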
Redirect all pages of one of your domains to the other one. You can do that with .htaccess and modRewrite similar to this:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
This would perform a 301 redirect (Permanently moved) from example.com to www.example.com.
For SEO purposes you never want to have duplicate content (identical pages on different URLs), there should always be exactly one URL for your content, all other possible URLs should redirect to that one.
Updating your robots.txt will definitely solve the problem in the future, but I think the question you should be asking is: how did Google know those pages were there?
First, you should ensure that a user can't traverse your site's filesystem (if your server is *nix, your .htaccess should have something like Options -Indexes). And if you had a public link anywhere that joined the two sites on a single domain, that could be how Google found it. If you are careful to keep your site clean and never point to the files in the other docroot, there should be no problem hosting one domain off the subdirectory of another domain.
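For example, in the .htaccess at your document root (note that Options in .htaccess requires AllowOverride Options to be permitted by the server config):
# disable automatic directory listings
Options -Indexes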
You can clear Google's index of those pages by using their Webmaster Tools. In order to identify yourself as the site's owner, you'll need to install a unique file (they create it for you) in the root directory of your various document roots, then you can manually update the parts of your site that they've indexed. This applies only to Google.
If you've been indexed by other search engines (and you probably have been if Google indexed you), you should try to figure out how they got there, fix the problem, move the second site to another folder (causing the pages to report 404 Not Found on your main domain) and then get the search engines to reindex.
If you are using Linux, then some additions to your .htaccess file would probably work, but the specifics would depend on your site setup.

Multiple domains for one site: alias or redirect?

I'm setting up a number of sites right now and many of them have multiple domains. The question is: do I alias the domain (with ServerAlias) or do I Redirect the request?
Obviously ServerAlias is better/easier from a readability or scripting perspective. I have heard however that Google likes it better if everything redirects to one domain. Is this true? If so, what redirect code should be used?
Common vhost examples will have:
ServerName example.net
ServerAlias www.example.net
Is this wrong - should the www also be a redirect, in addition to example2.net and www.example2.net? Or is Google smart enough to know that all these sites (or at least the www) are the same site?
UPDATE: Part of the reasoning for wanting aliases is that they are much faster. A redirect for a dialup user just because they did (or didn't) use the www adds significantly to initial page load.
UPDATE and ANSWER: Thanks Paul for finding the Google link which instructs us to "help your fellow webmasters by not perpetuating the myth of duplicate content penalties". Note, however, this only applies to content ON THE SAME SITE, exemplified in the article with "www.example.com/skates.asp?color=black&brand=riedell or www.example.com/skates.asp?brand=riedell&color=black". In fact, the article explicitly says "Don't create multiple pages, subdomains, or domains with substantially duplicate content."
Redirecting is better: then there is always one canonical domain for your content. I hear Google penalises multiple domains hosting the same content, but I can't find a source for that at the moment (edit: here's one article, but from 2005, which is ancient history in Internet years!) (not correct, see edit below)
Here are some mod_rewrite rules to redirect to a canonical domain (this form, with the leading slash in the pattern, is for the server config/vhost context; in a .htaccess file drop the leading slash):
RewriteCond %{HTTP_HOST} !^www\.foobar\.com [NC]
RewriteCond %{HTTP_HOST} !^$
RewriteRule ^/(.*) http://www.foobar.com/$1 [L,R=permanent]
That checks that the host isn't the canonical domain (www.foobar.com) and checks that a domain has actually been specified, before deciding to redirect the request to the canonical domain.
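The same canonicalisation can also be done without mod_rewrite, by giving the extra domains a bare-bones vhost whose only job is to redirect (a sketch; the domains and DocumentRoot are placeholders):
<VirtualHost *:80>
    ServerName foobar.com
    ServerAlias example2.net www.example2.net
    # mod_alias: forward every request to the canonical host, keeping the path
    Redirect permanent / http://www.foobar.com/
</VirtualHost>

<VirtualHost *:80>
    ServerName www.foobar.com
    DocumentRoot /var/www/foobar
</VirtualHost>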
Further Edit: Here's an article straight from the horse's mouth - it seems it's not as big an issue as you might think. Please read the article CAREFULLY, as it distinguishes between duplicate content on the same site (as in "www.example.com/skates.asp?color=black&brand=riedell and www.example.com/skates.asp?brand=riedell&color=black") and specifically says "Don't create multiple pages, subdomains, or domains with substantially duplicate content."
SSL certificates can also be an issue (wildcard certs mitigate this but are more expensive).
So if the cert is only bound to www.example.com, it won't validate for example.com. If this circumstance applies to your case, then carefully handling redirects and hyperlink references in your HTML and JavaScript is very important.
If they are entirely different domain names, you will want to redirect, because otherwise cookies cannot be shared between the two. If a user logs into your website at example1.com, they will need to log in again if they visit example2.com.
If they are just different subdomains (example.com vs www.example.com) this won't matter.
Server aliasing can cause problems with CGI session continuity: since cookies are attached to the domain they were served from, CGI scripts have to be carefully written so that they are aware of the aliasing, or all links within and into the site have to be relative, or both. Otherwise it is very hard to avoid niggly little hard-to-debug problems caused by the browser serving you different cookies depending on whether the user last entered your site through name.tld or www.name.tld.
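For the subdomain case, the usual fix is to scope the cookie to the parent domain so that both hosts receive it (a PHP sketch; the cookie name and domain are placeholders):
<?php
// an illustrative opaque session id
$sessionId = bin2hex(random_bytes(16));
// the leading dot makes the cookie valid for name.tld and all of its
// subdomains, so name.tld and www.name.tld share the same session
setcookie('session_id', $sessionId, time() + 3600, '/', '.name.tld');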
Nowadays I doubt it matters. If you see both entries in Google, then you know you're doing it wrong.
If half the links to your site refer to one URL and half refer to another, each URL is only going to get half the PageRank. Even if Google doesn't penalize your rank for having duplicate content, you're going to suffer.