Will Google index a site which forces HTTPS or SSL?

We are having problems getting Google to index our site. We decided it was easiest to just use https for the entire site.
Do we need to change it so that the anonymous, "public", areas of the site are not encrypted for them to be indexed?

Yes, they'll index https just fine. Remember that https is just HTTP over an encrypted transport, there to ensure that folks in the middle of the wire can't eavesdrop (at least not as easily) or muck with the traffic.
They will not index password protected portions, of course.
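One quick way to confirm that the anonymous, "public" areas are reachable over HTTPS the same way Googlebot would see them is to fetch one and check that no login is required and nothing marks it noindex. A minimal sketch in Python; example.com is just a placeholder for your own domain:

    import urllib.request

    # Placeholder URL; substitute a public page of your HTTPS-only site.
    url = "https://www.example.com/"

    with urllib.request.urlopen(url) as response:
        body = response.read().decode("utf-8", errors="replace")
        # A 200 status means the page is reachable without authentication.
        print("Status:", response.status)
        # Indexing is only blocked by an explicit noindex signal,
        # not by the use of HTTPS itself.
        print("X-Robots-Tag header:", response.headers.get("X-Robots-Tag"))
        print("Meta noindex present:", "noindex" in body.lower())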

They do. Here's another reference: http://www.google.com/search?q=mdc+javascript (the first result is an https address).

Yes, they will.
Example search:
http://www.google.com/search?q=%22SSL+Certificates+-+Secure+Your+Data+%26+Transactions%22+site%3Agodaddy.com
You should see a result indicating the URL is "https://www.godaddy.com/gdshop/ssl/ssl.asp".

Related

Typolink across domain using SSL behind proxy

There are two page trees in my TYPO3 installation and I link between them. Both websites use SSL/TLS encryption for the frontend delivery and thus should only generate relative links (if on the same domain) or link to my second domain (which they do, but only using http and not https).
Now the reason for this seems clear: I've never told TYPO3 to only generate https links. The question is: how would I do that in the first place?
I've come across the possibility of working with config.absRefPrefix, but this doesn't work when linking across domains.
Use
config.typolinkCheckRootline = 1
This way, typolinks check whether the target page belongs to the current domain and, if not, generate a link to the target page's domain.
https://docs.typo3.org/typo3cms/TyposcriptReference/Setup/Config/Index.html#typolinkcheckrootline
Apparently, config.typolinkCheckRootline = 1, as well as any combination of config.absRefPrefix and config.baseURL, won't help if TYPO3 gets the wrong HTTP host and only receives the host via the HTTP_X_FORWARDED_HOST environment variable.

Http to Https redirection not happening automatically

I am trying to access a secure website using this kind of URL: https://securenet.someBank.com. Everything is good and I am shown the login page. Now when I just type
http://securenet.someBank.com (i.e. http instead of https), I expect to get back a page with https in the browser (e.g. when you type http://mail.yahoo.com, you get back https://mail.yahoo.com).
But in this case https:://securenet.someBank.com just says :Page cannot be displayed.
So what did the website developer do wrong in implementing security? I am just curious. I thought this kind of thing (http --> https redirection) was handled automatically by the web server and the website developer does not even need to do anything. But apparently it is not so.
The redirections from HTTP to HTTPS are merely a convenience for the user.
As I was saying in this answer on Webmasters.SE, only the end user can check whether HTTPS is used at all, and whether it's used correctly. A MITM attacker could otherwise prevent that initial redirection from happening at all.
These automatic redirections are only useful based on the assumption that there's no MITM performing such an attack. They're useful to get the user used to seeing HTTPS on pages that should be secure, but whatever happens, it will always be up to the user to check what they're connecting to. Therefore, I wouldn't necessarily call the absence of such a redirection a developer or sysadmin mistake.
As a user, you should always bookmark and use the https:// address for sites where you expect it should be used.
[...] https://securenet.someBank.com. Everything is good and I am shown the login page.
[...]
But in this case https:://securenet.someBank.com just says :Page cannot be displayed.
Here, assuming the double :: is a typo in your question, you seem to contradict yourself. If https://securenet.someBank.com just says "Page cannot be displayed", this would be a mistake indeed.
Besides the recommendation by Bruno above, I would recommend reading the following:
https://www.owasp.org/index.php/HTTP_Strict_Transport_Security
There are two things you could do:
1) Force HTTP Strict Transport Security
2) Do a permanent redirect, as described in the example on that page (a minimal sketch follows below).
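As an illustration of option 2, here is a minimal, hypothetical sketch of that behaviour in Python; in practice the redirect is normally configured in the web server itself (Apache, nginx, etc.), as the next answer notes.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class RedirectToHTTPS(BaseHTTPRequestHandler):
        """Answer every plain-HTTP request with a permanent redirect to HTTPS."""

        def do_GET(self):
            # Placeholder default host; normally taken from the request itself.
            host = self.headers.get("Host", "securenet.somebank.example")
            self.send_response(301)  # permanent redirect (option 2 above)
            self.send_header("Location", f"https://{host}{self.path}")
            self.end_headers()
            # Option 1 (HSTS) belongs on the HTTPS side: the secure site sends
            # "Strict-Transport-Security: max-age=31536000" so that browsers
            # skip the plain-HTTP step on future visits.

    if __name__ == "__main__":
        # Bind port 8080 for the demo; a production listener would use port 80.
        HTTPServer(("", 8080), RedirectToHTTPS).serve_forever()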
Any questions, just let me know.
Fabio
#fcerullo
Probably a wrong server configuration. For example, in Apache one must define a redirect in the httpd.conf file in order to automatically redirect to the https URL of the page.

Moving website from HTTP to fully HTTPS and SEO implications

Alright, you think this might be one of the most asked questions on the internet, and you're tired of reading the exact same answers. So let's focus on one of the most common answers, and forget about the others.
One of the common answers is:
"The https-site and the http-site are two completely different sites; it's a little bit like having a www version of the site and a non-www version. Make sure you have 301 redirects from the http URLs to the https ones." (source: http://www.seomoz.org/ugc/seo-for-https-with-s-like-secure)
So here's my question:
Why are people saying that https and http are two different websites? How different is https://www.mydomain.com from http://www.mydomain.com?
The URI is the same and the content is the same. Only the protocol changes.
Why would the protocol have any impact on SEO? Whether or not the content is encrypted from point A to point B, why would that matter, SEO-wise?
Thanks for your help!
-H
Http and https could technically be two different sites. You could configure your server to serve completely different content over each. They have two different URLs (the difference being the s).
That being said, almost all webmasters with both http and https serve nearly identical content whether the site is secure or not. Google recognizes this and allows you to run both at the same time without having to fear duplicate content penalties.
If you are moving from one to the other, you should treat it like any other URL change (a quick verification sketch follows at the end of this answer):
Put 301 redirects in place so that each page gets properly redirected to the same content at its new URL
Register both versions in Google Webmaster Tools
I have not personally done this switch, but it should be doable without problems. I have made other types of sitewide URL changes in the last couple of years without issues.
The other alternative would be to run both http and https at the same time and switch users over more gradually, as they log in, for example.
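To sanity-check the 301 step after such a move, a small script can confirm that each old http:// URL answers with a permanent redirect to its https:// counterpart. This is only a rough sketch; the hostname and URL list are placeholders:

    import http.client
    from urllib.parse import urlsplit

    # Placeholder list of legacy http:// URLs to verify after the migration.
    OLD_URLS = [
        "http://www.example.com/",
        "http://www.example.com/products",
    ]

    for old_url in OLD_URLS:
        expected = "https://" + old_url[len("http://"):]
        parts = urlsplit(old_url)
        conn = http.client.HTTPConnection(parts.netloc, timeout=10)
        # Issue the request without following redirects so the raw status
        # code and Location header can be inspected.
        conn.request("GET", parts.path or "/")
        resp = conn.getresponse()
        location = resp.getheader("Location", "")
        ok = resp.status == 301 and location == expected
        print(f"{old_url}: {resp.status} -> {location} {'OK' if ok else 'CHECK'}")
        conn.close()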
Update to the above answer: as of August 2014, Google has confirmed that sites secured by SSL will start getting a ranking boost. Check the official statement here: http://googlewebmastercentral.blogspot.in/2014/08/https-as-ranking-signal.html
Don't think about it in terms of protocol. Think about it in terms of what is possible from a search engine's point of view.
http://example.com and http://www.example.com can be completely different sites.
http://example.com/ and http://www.example.com/home can be completely different pages.
https://www.example.com and http://www.example.com can, again, be completely different sites.
In addition to this, https pages have a very hard time ranking in Google and other search engines.
If your entire site is https and presents an SSL certificate even for plain HTTP requests, Google views those pages as secure and assumes they're https for a reason. It's sometimes not very clever in this regard. If you have secure product or category pages, for instance, they simply will not rank as well as competitors' pages. I have seen this time and again.
In recent months, it has become very clear that Google will gently push webmasters to move to HTTPS.
Why are people saying that https and http are two different websites? How different is https://www.mydomain.com from http://www.mydomain.com?
Answer: Use the site: operator to find duplicate content. Go to a browser and type:
site:http://example-domain.com
and
site:https://example-domain.com
If you see both versions indexed in Google or other search engines, they are duplicates. You must redirect the HTTP version to the HTTPS version to avoid diluting your website's authority and a possible penalty from Google's Panda algorithm.
Why would the protocol have any impact on SEO?
Answer:
For ecommerce websites, Google will not rank them well without being secure. They do not want users to get their bank info etc stolen.
Google will be giving ranking boosts to sites that move to HTTPS in the future. Although it is not a large ranking signal now, it could become larger.
The guys at Google Chrome have submitted a proposal to dish out warnings to users for ALL websites not using HTTPS. Yes, I know it sounds crazy, but check this out.
Info taken from this guide on how to move to HTTPS without killing your rank.
More recently, if a site does not use SSL, the Firefox browser shows a warning. You should enable SSL and redirect the HTTP URL to HTTPS with a 301.

Can a public IP address be used as Google OAuth redirect URI?

I'm trying to set a web service that needs the user's Google Latitude info, so I'm using Google OAuth to get the user authorization stuff.
However, when trying to set the redirection URI in the Google APIs Console for a web application client ID I get a message error if I try to set it to 'http://PUBLIC_IP/'.
I need to test it with non local users (thus localhost can't be used), so I would like to know if having a web domain is mandatory in order to use Google's OAuth. If not, how can I solve this issue?
This is not currently supported. I filed a feature request and will update on progress.
Update: Essential app verification activities have continued to make support of IP address-based apps unlikely. These verification activities are necessary to provide protections against abuse of user accounts. In addition, the cost of setting up dedicated domains has been reduced significantly since this feature was requested. Please read other responses here about possible options.
You can use xip.io to work around it.
For example: '192.168.0.50.xip.io:3000' will resolve to '192.168.0.50:3000'
I ran into this issue too and so I entered a URL with a .com extension and also entered it into my /etc/hosts file. Works like a charm.
It totally sucks that my entire app now has to be developed on an apparently 'live' domain though.
I used my public hostname. It helps if you have a static IP address. I used http://www.displaymyhostname.com/ to get my hostname. I plugged it straight into the Authorized JavaScript origins field when I created a new Web Application Client ID.
P.S. My hostname looked something like this: 111.111.111.111.static.exetel.com.au
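If you'd rather not rely on a third-party page, the same reverse lookup can be done locally. A quick sketch; the IP address below is just a placeholder:

    import socket

    # Placeholder address; substitute your own static public IP.
    public_ip = "111.111.111.111"

    # gethostbyaddr performs a reverse (PTR) lookup; it raises socket.herror
    # if the address has no reverse record.
    hostname, aliases, addresses = socket.gethostbyaddr(public_ip)
    print(hostname)  # e.g. 111.111.111.111.static.exetel.com.au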
You can use a dynamic DNS service. I used ddns.net, which offers a free option. Basically, you enter an FQDN such as yourcompany.ddns.net as your domain; when that name is looked up, the dynamic DNS service finds your entry in its database and returns your IP. So mine looks like this: https://wigwam.ddns.net and everything works fine. You don't need to buy a domain, you can substitute your known IP, and Google is happy with that.
Your IP must be static, of course.
Yes, as of now you still need to have a domain name to use Google OAuth in your application. If you have a static public IP and don't want to buy a domain name, you could use a free subdomain from FreeDNS to link to your public IP. Seemed to work well enough for me with a Django app.
Echoing what Breno said in response to his earlier comment:
Apologies for the lack of updates here. Essential app verification activities have continued to make support of IP address-based apps unlikely. These verification activities are necessary to provide protections against abuse of user accounts. In addition, the cost of setting up dedicated domains has been reduced significantly since this feature was requested. Please read other responses here about possible options.
You can read more about Google's app verification requirements [1] and Google's policies requiring secure handling of data [2].
[1] https://support.google.com/cloud/answer/9110914?hl=en
[2] https://developers.google.com/identity/protocols/oauth2/policies#secure-response-handling.
xip.io is not working anymore; as an alternative you can use nip.io the same way. For example:
10.0.0.1.nip.io:8000 will resolve to 10.0.0.1:8000
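These wildcard DNS services simply embed the target address in the hostname, which an ordinary lookup can confirm (assuming the public nip.io nameservers are reachable from your machine):

    import socket

    # The IP to expose is encoded directly in the hostname.
    hostname = "10.0.0.1.nip.io"

    # A normal forward lookup should return the embedded address;
    # the port (e.g. :8000) is not part of DNS and is passed through as-is.
    resolved = socket.gethostbyname(hostname)
    print(f"{hostname} -> {resolved}")  # expected: 10.0.0.1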
It seems like xip.io is down, but there are alternatives such as sslip.io and nip.io. However, I couldn't get either of these to work.
I ended up hosting the main file server on the main machine, running it on a 192.168.1.xx IP address. I then ran servers on each of the test machines (including a second server on the main machine), all of which listened on the localhost address. Any requests that the localhost servers received were then passed off to the 192.168.1.xx server, which allowed testing on all of the devices.
This should also work with public facing IP addresses.