I have already installed an SSL certificate on my OpenCart site. Some pages work fine over https, but the category pages do not. Do I need to change all the URLs in the database as well? I have already set https in the config file.
Some of these may not apply to your particular installation but in the interest of creating a comprehensive answer, I've tried to cover all the bases here:
Note: you might need to adjust the table names below if your store's table prefix doesn't begin with oc_
Open config.php and admin/config.php and change all of the constant URL declarations to https; make sure to include HTTP_SERVER and HTTP_CATALOG.
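For example, after the change the declarations should look something like this (example.com here is a placeholder for your own domain):

// config.php
define('HTTP_SERVER', 'https://www.example.com/');
define('HTTPS_SERVER', 'https://www.example.com/');

// admin/config.php
define('HTTP_SERVER', 'https://www.example.com/admin/');
define('HTTPS_SERVER', 'https://www.example.com/admin/');
define('HTTP_CATALOG', 'https://www.example.com/');
define('HTTPS_CATALOG', 'https://www.example.com/');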
In your admin panel go to System > Settings, click Edit, and in the Server tab set Use SSL to Yes.
In your database, update the store_url column in the oc_order table so that all links are https. This is important because updating orders can fail if the API attempts to access the http version of your site. You can use this query:

UPDATE oc_order SET store_url = REPLACE(store_url, 'http:', 'https:');
If you have any hard-coded images or links in your description tables, you should replace those as well. SSL will still work, but the browser will show a mixed-content warning in the address bar. This includes oc_product_description, oc_category_description, and any other tables where you might have created HTML content.
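Back up first, then something along these lines (a sketch; extend it to whichever description tables you use) rewrites the embedded URLs in place. Note that it blindly converts every http:// reference, including links to external sites, so review the results afterwards:

UPDATE oc_product_description SET description = REPLACE(description, 'http://', 'https://');
UPDATE oc_category_description SET description = REPLACE(description, 'http://', 'https://');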
The same goes for your theme files. It's fairly common to find hard-coded http:// links and images in footer.tpl and header.tpl, for starters. You can simply browse your site, watch for pages that don't show the green lock icon in the browser, and take it from there.
Another culprit that can break https is third-party extensions, which can exist both as files and, in OC2, as ocmods in the oc_modification table.
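To track those down, a query along these lines (assuming the standard OC2 oc_modification schema) lists any ocmods whose XML injects plain http:// URLs:

SELECT modification_id, name
FROM oc_modification
WHERE xml LIKE '%http://%';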
Finally, create a redirect in .htaccess to gracefully let traffic know that your pages can now be found on https. I've excluded robots.txt and any connections for the OpenBay routes because, based on experience, redirecting the eBay webhooks broke things; they seem to be http-only by default. I suspect this is a shortcoming in how OpenBay handles those requests, or possibly a configuration issue, but I was unable to find a workaround that didn't break OpenBay, so for now I'd recommend leaving those requests untouched. I am using this in .htaccess:
# Force https, but leave robots.txt and the OpenBay/eBay routes untouched (see note above)
RewriteCond %{HTTPS} off
RewriteCond %{REQUEST_URI} !/robots\.txt$
RewriteCond %{QUERY_STRING} !^route=ebay/openbay/*
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
That should do it!
Here is the situation, and I need a little help with it:
I have a domain xxxxx.com and a subdomain upload.xxxxx.com, both pointing to the directory /www/xxxxx.com/. I use the second domain for file uploading because I am on Cloudflare's services, and with the second domain there is no performance optimization and thus no trouble with the uploads (I've created cookies that are valid for both domains).
But my point is that since both domains share one docroot, the only file I use on the second domain is upload.xxxxx.com/upload.php, and the same file also exists at xxxxx.com/upload.php. I am not really good with .htaccess, so how can I make upload.php the only page that can be opened from this subdomain, and redirect everything else to the main domain?
You can use this rule as the first rule in your DOCUMENT_ROOT/.htaccess file:
RewriteEngine On
# On any host other than the upload. subdomain, send requests for upload.php to the homepage
RewriteCond %{HTTP_HOST} !^upload\. [NC]
RewriteRule ^upload\.php$ /? [NC,L,R]
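That stops upload.php from being opened on the main domain. For the other direction, which the question also asks about, an untested sketch along these lines would let the subdomain serve only upload.php and push every other request back to the main domain:

# On the upload. subdomain, allow only upload.php; redirect everything else
RewriteCond %{HTTP_HOST} ^upload\. [NC]
RewriteCond %{REQUEST_URI} !^/upload\.php$ [NC]
RewriteRule ^ http://xxxxx.com%{REQUEST_URI} [L,R=301]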
What I did in the end, which I think is good for users who are interested to know:
I added cross-domain support for the cookies and added an additional domain, upload.*
I left the upload.* domain on redirect-only rules via Cloudflare (no optimization)
I redirected the upload form to upload.xxxxx.com/upload.php
I made the login cookie cross-domain, so when you log in on the main site you are logged in on the upload.* one too (see the sketch after this list)
When you submit the form, it uploads really fast via upload.* and redirects back to the preview page on the original domain
It passes errors via cookies from upload.* back to the normal domain
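For anyone wondering how the cross-domain cookie part works, here is a minimal PHP sketch, assuming the placeholder domain from the question and a hypothetical session_id cookie:

<?php
// Login code on xxxxx.com: the leading dot makes the cookie valid for the
// main domain and every subdomain, including upload.xxxxx.com
// ($sessionId is a stand-in for however you identify the logged-in user)
setcookie('session_id', $sessionId, time() + 3600, '/', '.xxxxx.com');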
Hope this helps :)
I am using Magento 1.6.2.0 on a shared host running the LiteSpeed web server, and I have begun investigating ways to speed up page loads. Currently I am using Pingdom to look at requests, and it appears that I lose an entire second from the get-go when I type my URL without the www. The browser redirects to the www page; it just takes so long. Is this something I can fix? I presume I can change Magento's base URL to not include the www, but then I presume I'll have the same delay when going to the www URL instead.
I took a look at the link you gave, and I indeed see a delay of about one second before I receive a 302 redirect to the URL with www. prepended. Not entirely coincidentally, the actual page HTML also takes quite a long time (about 1.7 seconds) to load.
This is a fairly common issue with large web applications: to return even a simple response like a redirect, the entire application must load and run its startup code. Couple this with a not-so-fast shared web server that isn't optimized for that one application, and you can get quite slow page load times. It's nothing unique to Magento; I've seen the same effect with MediaWiki myself, and I expect it happens with other applications too.
The obvious solution is just to avoid redirects: as long as you make sure all your URLs have the right hostname, the extra delay due to wrong hostnames will not appear. Magento itself will presumably take care of this for its own URLs, but if you have any other code (or static pages) that link to your Magento URLs, make sure they use the right hostname.
You can also sign up for Google Webmaster Tools (and similar tools for other search engines) and configure your preferred domain there (it's under Site configuration → Settings) so that Google will automatically prepend www. to any links to your site it indexes.
You can (and should) also try to reduce Magento's startup time in general. This will speed up not only redirects, but all other page loads as well. I'm not familiar enough with Magento to be able to give much detailed advice on this, but the obvious first step for any massive PHP application is to make sure you're using a PHP accelerator such as APC.
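As a quick sanity check, you can ask PHP whether the APC extension is loaded at all, for example:

<?php
// Prints bool(true) if the APC opcode cache extension is loaded
var_dump(extension_loaded('apc'));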
Finally, the fastest way to redirect visitors to the correct hostname is to make your webserver send the redirect directly without ever invoking Magento at all. The details on how to do this depend on the server software you're using, but apparently LiteSpeed supports the same RewriteRule syntax as Apache's mod_rewrite, so you should be able to do this just by adding the following lines to your main .htaccess file:
Options +FollowSymLinks
RewriteEngine On
RewriteBase /
# Redirect any host other than the canonical www hostname, preserving the path
RewriteCond %{HTTP_HOST} !^www\.mmmspeciosa\.com$ [NC]
RewriteRule ^(.*)$ http://www.mmmspeciosa.com/$1 [R=301,L]
(By the way, I'm using HTTP 301 permanent redirects here instead of the HTTP 302 temporary redirects Magento seems to be using. This is not only more appropriate according to the HTTP standard, but also works better with search engines, which treat a 301 redirect as an indication to index the target URL instead of the source of the redirect. If this redirect type is not configurable in Magento, I would consider it a bug. If it is configurable, you should set it to 301.)
I have a web hosting package with two domains pointing to it. I've noticed that Google has indexed one domain's directory under the other domain. Is there a way of preventing this from happening?
You could try the Robots exclusion standard, but it's no guarantee.
Redirect all pages of one of your domains to the other one. You can do that with .htaccess and mod_rewrite, similar to this:
RewriteEngine On
# Redirect the bare domain to the www host, preserving the path
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
This would perform a 301 redirect (Moved Permanently) from example.com to www.example.com.
For SEO purposes you never want duplicate content (identical pages on different URLs); there should always be exactly one URL for your content, and all other possible URLs should redirect to that one.
Updating your robots.txt will definitely solve the problem in the future, but I think the question you should be asking is: how did Google know those pages were there?
First, you should ensure that a user can't traverse your site's filesystem (if your server is *nix, your .htaccess should have something like Options -Indexes, shown below). And if you had a public link anywhere that joined the two sites on a single domain, that could be how Google found it. If you are careful to keep your site clean and never point to the files in the other docroot, there should be no problem hosting one domain off the subdirectory of another domain.
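For instance, this line in .htaccess turns off Apache's automatic directory listings:

# Do not generate directory listings for directories without an index file
Options -Indexes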
You can clear Google's index of those pages by using their Webmaster Tools. In order to identify yourself as the site's owner, you'll need to install a unique file (they create it for you) in the root directory of your various document roots, then you can manually update the parts of your site that they've indexed. This applies only to Google.
If you've been indexed by other search engines (and you probably have been if Google indexed you), you should try to figure out how they got there, fix the problem, move the second site to another folder (causing the pages to report 404 Page Not Found on your main domain), and then get the search engines to reindex.
If you are using Linux, then some additions to your .htaccess file would probably work, but the specifics would depend on your site setup.
My situation:
We have a mobile version of our website, and want to start redirecting mobile users to it. The plan is to do this in Apache httpd.conf or .htaccess, using something like this:
RewriteEngine On
# If the user agent looks like a mobile device (the ... is a placeholder; extend the list as needed)
RewriteCond %{HTTP_USER_AGENT} (iPhone|Blackberry|...)
RewriteRule (.*) mobile/$1
However, we want there to be a way for users to override our default action of redirecting them. One way we thought of doing this was to show a link on the mobile site pointing back to the regular site, and to store a cookie when they use that link.
Could the Apache configuration file check a cookie before redirecting?
Is there a better way?
The HTTP_COOKIE server variable contains the cookies passed from the client to the server. You can inspect it to find out which cookies have been set by a script or module.
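Putting that together with the rules from the question: the sketch below skips the mobile redirect whenever an opt-out cookie is present. The cookie name nomobile is an assumption; set it from the "view full site" link however you like:

RewriteEngine On
# Only redirect mobile user agents...
RewriteCond %{HTTP_USER_AGENT} (iPhone|Blackberry) [NC]
# ...that have not opted out via the nomobile cookie
RewriteCond %{HTTP_COOKIE} !nomobile=1 [NC]
RewriteRule (.*) mobile/$1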
I've browsed through several questions on the site and nothing quite matched what I want to do.
I found one question that could possibly work, as long as it leaves SSL off for all of the links within the specific folder I want to keep SSL off of (Force redirect to SSL for all pages apart from one).
Basically, what I need is this:
I need the link on my site for '.../store' to remain non-SSL, but I want to force SSL for everything else in the store, most specifically '.../store/index.php?xlspg=checkout'.
The reason I need the /store link to remain with SSL off is because it conflicts with the admin panel login. That is the only direct link that cannot have SSL, so I'm also not sure which would be the best way to handle this.
The question I linked above seems like it would work, like I said, as long as it doesn't affect anything deeper in the store beyond that single page itself.
Any help would be greatly appreciated!!!
Can you instead construct your link to the admin panel to avoid using SSL? For example, if your admin panel link is relative, something like this (the markup here is reconstructed; the exact path depends on your setup):

<a href="/store">Admin panel</a>

change it instead to an absolute http:// URL:

<a href="http://www.example.com/store">Admin panel</a>
Then you can use SSL by default, except for the admin panel.
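To enforce the SSL-by-default part in .htaccess, here is an untested sketch (the /store path is taken from the question; adjust it to your layout) that forces https everywhere except the bare /store page:

# Force https everywhere except the /store admin login page
RewriteCond %{HTTPS} off
RewriteCond %{REQUEST_URI} !^/store/?$ [NC]
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]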
Try this rule:
# Redirect plain-http requests for store/index.php (including the checkout page) to https;
# the original query string is preserved automatically on redirect
RewriteCond %{HTTPS} !=on
RewriteRule ^store/index\.php$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]