I would like to know how many different SubjectAlternativeNames a certificate may have and where this specification is published.
Why? Because Firefox only recognises the first five entries in the SubjectAlternativeName, and we have a certificate that picks up a number of common misspellings of one of our websites, all of which have DNS entries pointing to the correct domain.
We do not wish to wildcard this certificate.
Are you sure that
Firefox only recognises the first five entries in the SubjectAlternativeName
because if you go to this page, it has many entries in SubjectAlternativeName and it works fine. Firefox does not complain even if the URL is not in the first five entries, and when you click certificate details, it displays all of the entries correctly. I should mention that I'm using version 50.1.0 (but I have also tested in version 49.0.2).
Another example of a certificate with six entries can be found here (because previous one has wildcard entries first).
You are correct. Apparently I had previously accepted a permanent exception for the site I was visiting, and that certificate, used for testing, only had five SubjectAltNames. I cleared the exception and the proper certificate now lists all of the expected alternative names.
When I search something on google.com, I see interaction with the following IP address: 172.217.7.132
But when I attempt to reverse lookup the ip address, I get iad30s08-in-f132.1e100.net. and iad30s08-in-f4.1e100.net., not google.com.
What do I need to do in order to correctly identify that this IP address was resolved from google.com?
EDIT
Clarifying the question: my problem is not specific to google.com. I want to programmatically/logically arrive at google.com, because that's what my browser requested.
The same problem exists in the case of Amazon: a reverse DNS lookup of the IP address it resolves to gives me server-13-32-167-140.sea19.r.cloudfront.net. instead of amazon.com.
Code for performing reverse lookup:
from dns import reversename, resolver  # dnspython

def reverse_lookup(ip_address):
    # Convert the address to its reverse-map name, e.g. 132.7.217.172.in-addr.arpa.
    domain_address = reversename.from_address(ip_address)
    # Query the PTR record(s) and return them as text
    return [answer.to_text() for answer in resolver.query(domain_address, "PTR")]
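Called with the example address, it returns the PTR names (exact hosts vary as Google rotates addresses):

reverse_lookup("172.217.7.132")
# e.g. ['iad30s08-in-f132.1e100.net.', 'iad30s08-in-f4.1e100.net.']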
As others have mentioned, 1e100.net does, in fact, belong to Google. Their reverse DNS is going to resolve to whatever they want it to resolve to, and there's not much you can do about that.
Depending on your requirements, another alternative may be using a geolocation database to gather more information about an IP. You can find a demo of this here:
https://www.maxmind.com/en/geoip-demo
(enter your example address 172.217.7.132 in the form)
MaxMind has various products (some free, some commercial), so one of them may fit your needs of being able to look up this info programmatically.
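For instance, here is a minimal sketch of a programmatic lookup using MaxMind's geoip2 Python package against a downloaded GeoLite2 database (the package and the database filename are assumptions; see their docs for current options):

import geoip2.database

# Assumes a GeoLite2 City database file has been downloaded from MaxMind
with geoip2.database.Reader("GeoLite2-City.mmdb") as reader:
    response = reader.city("172.217.7.132")
    # Ownership/ISP details require one of their commercial databases
    print(response.country.name, response.city.name)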
A different possible solution would be to get access to a WHOIS API, such as:
https://hexillion.com/whois
Example results:
https://hexillion.com/samples/WhoisXML/?query=172.217.7.132&_accept=application%2Fvnd.hexillion.whois-v2%2Bjson
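A rough sketch of consuming such an API from Python with requests, using the sample endpoint above (the response schema and any API-key requirements are assumptions; check the API docs):

import requests

url = "https://hexillion.com/samples/WhoisXML/"
params = {
    "query": "172.217.7.132",
    "_accept": "application/vnd.hexillion.whois-v2+json",
}
response = requests.get(url, params=params, timeout=10)
response.raise_for_status()
# The owning organisation shows up in the WHOIS fields of the JSON payload
print(response.json())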
https://support.google.com/faqs/answer/174717
1e100.net is a Google-owned domain name used to identify the servers in our network.
Following standard industry practice, we make sure each IP address has a corresponding hostname. In October 2009, we started using a single domain name to identify our servers across all Google products, rather than use different product domains such as youtube.com, blogger.com, and google.com.
Typically, you will get a 1e100.net result when you do a reverse lookup on one of their IPs. Consider it as good as a google.com result would be - you've verified that the IP is controlled by Google if you see it.
One exception to this is the Googlebot crawler, which may return google.com or googlebot.com results. (I would expect this to eventually get moved over to 1e100.net.)
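If you want to verify ownership programmatically, the usual approach is forward-confirmed reverse DNS: do the PTR lookup, check the domain suffix, then resolve that hostname back and make sure it maps to the original IP. A minimal sketch with Python's standard library (the accepted suffixes below are taken from Google's guidance above; adjust as needed):

import socket

def is_google_ip(ip):
    # Reverse lookup: IP -> hostname
    host = socket.gethostbyaddr(ip)[0]
    if not host.endswith((".1e100.net", ".google.com", ".googlebot.com")):
        return False
    # Forward-confirm: the hostname must resolve back to the same IP
    return ip in socket.gethostbyname_ex(host)[2]

print(is_google_ip("172.217.7.132"))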
Assume I own example.com and I purchased a wildcard SSL cert that works for *.example.com. Management now wants to create "companion" sites to already existing sites. So if I have foo.example.com and bar.example.com they might also want me to create meow.foo.example.com and woof.bar.example.com.
The existing wildcard cert works for the sub-domain sites, as we've been doing all along. I just learned it does NOT work for sub-sub-domain sites. Is it possible to create a wildcard cert for *.*.example.com?
Would sub-sub-sub-domains require an additional wildcard cert? So if you want to protect X levels, do you need X certs?
Disclaimer: I spent a while researching this, but there are a lot of seemingly conflicting questions/answers on SO, so I feel bad about asking this again; still, I don't want to go through the hassle of purchasing only to find out later that I made a mistake.
While there may be some implementations which allow multiple wildcards, effectively the answer is no, multiple wildcards are not allowed.
RFC 6125 (section 6.4.3) tries to clarify the common de facto rules, and distilled down to the core MUSTs, the rules are effectively:
If the first character is not *, perform a literal match (non-wildcard).
If the second character is not ., then match nothing (invalid wildcard).
Find the first . in the match candidate, and literal-match starting with the second character in the dNSName value.
So *.*.example.com would not match a.b.example.com because .*.example.com != .b.example.com.
Of course, some clients may have implemented their matching logic differently. But counting on anything more lax than this interpretation will result in some clients saying it isn't a match when you were hoping it would.
(Okay RFC 6125 section 6.4.3 doesn't have any actual MUSTs; but if you respect the SHOULD NOTs and don't follow the MAY, but do support wildcard matching, you end up with the above.)
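To make these rules concrete, here is a small sketch of that matching logic in Python (an illustration of the distilled rules above, not a production validator; it only supports a single leading wildcard label):

def wildcard_match(pattern, hostname):
    # Rule 1: no leading * means a plain literal comparison
    if not pattern.startswith("*"):
        return pattern.lower() == hostname.lower()
    # Rule 2: the wildcard must be "*." -- anything else matches nothing
    if not pattern.startswith("*."):
        return False
    # Rule 3: the * covers exactly one leftmost label; literal-match the rest
    dot = hostname.find(".")
    return dot != -1 and pattern[1:].lower() == hostname[dot:].lower()

assert wildcard_match("*.example.com", "a.example.com")
assert not wildcard_match("*.*.example.com", "a.b.example.com")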
See https://security.stackexchange.com/a/10540/68042
You just need a separate certificate for each level of subdomain, with names:
example.com
*.example.com
*.*.example.com
*.*.*.example.com
In theory, a single certificate with all of those entries in its Subject Alt Name extension would do, but it may not work in some cases. Separate certificates are safer.
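As an illustration, here is a hedged sketch of building a CSR that carries several such names in its SAN extension, using Python's cryptography package (the package choice is an assumption, and note that public CAs generally refuse multi-wildcard names like *.*.example.com):

from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "example.com")]))
    .add_extension(
        x509.SubjectAlternativeName([
            x509.DNSName("example.com"),
            x509.DNSName("*.example.com"),
            x509.DNSName("*.sub1.example.com"),  # one per sub-level you actually use
        ]),
        critical=False,
    )
    .sign(key, hashes.SHA256())
)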
To use a single certificate (for *.example.com), you may use names like meow-foo.example.com instead of meow.foo.example.com.
You can go for a multi-domain wildcard SSL certificate; it will secure your sub-sub-domains. It can secure the wildcards below:
- *.mydomain.tld
- *.sub1.mydomain.tld
- *.sub2.mydomain.tld
- *.anydomain.com
We have a wildcard SSL certificate for *.domain.example, and have a website with sub1.sub2.domain.example.
Safari 4.0.4 on macOS pops up a certificate error (presumably because of wildcard interpretation), while Safari 4 on Windows does not.
Also, IE8 behavior is mixed at best: some IE8 installations display the certificate error and some do not.
What causes this strange behavior on Safari and IE?
A wildcard SSL certificate for *.example.net will match sub.example.net but not sub.sub.example.net.
From RFC 2818:
Matching is performed using the matching rules specified by
RFC2459. If more than one identity of a given type is present in
the certificate (e.g., more than one dNSName name, a match in any one
of the set is considered acceptable.) Names may contain the wildcard
character * which is considered to match any single domain name
component or component fragment. E.g., *.a.example matches foo.a.example but
not bar.foo.a.example. f*.example matches foo.example but not bar.example.
If you need a wildcard certificate that contains *.domain.example sites and also works with sub1.sub2.domain.example or another domain like *.domain2.example, you can solve that with a single wildcard certificate with what is called a subject alternative name (SAN) extension for each of the other sub-sub-domains. A SAN cert is not just for multiple specific host names; it can be created for wildcard entries as well.
For example, *.domain.example, sub1.sub2.domain.example, and *.domain2.example would have a Common Name of *.domain.example, and then you would attach subject alternative names of both *.domain2.example and *.sub2.domain.example. It might depend on the Certificate Authority as to how they would charge you (or not) for the certificate, but there are some out there where this offering is available. Also, SAN support is pretty widespread in the web browser space. The best real-world example of this use is Google's SSL cert. Go open Google and view its SSL certificate; you will see it works for *.google.com, *.youtube.com, *.gmail.com, and a bunch more, where they are listed as subject alternative names.
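You can inspect a live server's SAN entries yourself; here is a quick sketch using Python's standard library (the output naturally depends on the certificate currently being served):

import socket
import ssl

hostname = "www.google.com"
context = ssl.create_default_context()
with context.wrap_socket(socket.socket(), server_hostname=hostname) as sock:
    sock.connect((hostname, 443))
    cert = sock.getpeercert()

# subjectAltName is a tuple of (type, value) pairs, e.g. ('DNS', '*.google.com')
for name_type, value in cert.get("subjectAltName", ()):
    print(name_type, value)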
The wildcard is only applied to the first part (from the left) of your domain, so you'll need a certificate for *.sub2.domain.example.
If you meant that you have sub1.domain.example and sub2.domain.example, then it should work.
I'm getting a slightly different display of a website depending on which URL I use to access it (two different servers, both serving the same files). One looks "thinner" than the other in Firefox 3.0 (no discernible difference in IE).
For example, compare:
http://www.ece.ualberta.ca/support/
and
http://www1.ece.ualberta.ca/support/
This is not a major issue, but I just noticed this and am extremely curious as to what could cause it. Is it some kind of Firefox bug? I haven't yet tried the newest version.
EDIT: My bad for assuming those URLs were actually serving the same content (it's not my server, but I do attend that school). Comparing:
http://www.ece.ualberta.ca/~ecegsa/links.html (it seems this server is down atm) and http://www1.ece.ualberta.ca/~ecegsa/links.html
shows the same issue, but the HTML is identical according to a diff run on the saved files. I don't see the problem on anything other than FF 3.0 at my work, so I'm guessing it's some idiosyncrasy of that browser. Still curious though.
Looking briefly at those two URLs, they're serving different HTML!
For example, http://www.ece.ualberta.ca/support/ has this text:
Windows Vista/7 (volume license)
Activation
While http://www1.ece.ualberta.ca/support/ has this text:
Windows Vista (volume license)
Activation
I suspect that different HTML accounts for the difference you're seeing.
If these are actually the same servers hosting the same content, this kind of disparity could be caused by intermediate caches (e.g. proxies, CDNs, etc.) refreshing at different rates. For example, if www points to a load-balancing caching proxy and www1 points directly to the host, this may cause the difference. You might also be seeing a bug or lag in how content is updated across different servers in a load-balanced cluster.
This will require a little setup. Trust me that this is for a good cause.
The Background
A friend of mine has run a non-profit public interest website for two years. The site is designed to counteract misinformation about a certain public person. Of course, over the last two years those of us who support what he is doing have relentlessly linked to the site in order to boost it in Google so that it appears very highly when you search for this public person's name. (In fact it is the #2 result, right below the public person's own site). He does not have the support of this public person, but what he is doing is in the public interest and good.
The friend had a stroke recently. Coincidentally, the domain name came up for renewal right when he was in the hospital and his wife missed the email about it. A domain squatter snapped up the domain, and put up content diametrically opposed to his intent. This squatter is now benefitting from his Google placement and page rank.
Fortunately there were other domains he owned which were aliased to point to this domain, i.e. they used a DNS mapping or HTTP 301 redirect (I'm not sure which) to send people to the right site. We reconfigured one of the alias domains to point directly to the original content.
We have publicized this new name for the site and the community has now created thousands of links to the new domain, and is fixing all the old links. We can see from the cache that Google has in fact crawled the original site at the new address, and has re-crawled the imposter site.
The Problem
Even though Google has crawled both sites, you can't get the site to appear in relevant searches under the new URL!
It appears to me that Google remembers the old redirect between the two names (probably because someone linked to the new domain back when it was an alias). It is treating the two sites as if they are the same site in all results. The results for the site name, and using the "link:" operator to find sites that link to this site, are entirely consistent with Google being convinced they are the same site.
Keep in mind that we do not have control of the content of the old domain, and we do not have the cooperation of the person that these sites relate to.
How can we convince the Googlebot that domain "a" and domain "b" are now two different sites and should be treated as such in results?
EDIT: Forward was probably DNS, not HTTP based.
Google will detect the decrease in links to the old domain, and that will hurt its ranking.
Include some new interesting content on the new domain. This will encourage Google to crawl this domain.
The 301 redirects will be forgotten, in time. Perhaps several months. Note that they redirected one set of URLs to another set, not from one domain to another. Get some links to some new pages within the site, not just the homepage, as these URLs will not be in the old redirected set.
Set up Google Webmaster Tools and submit an XML sitemap. Thoroughly check everything in Webmaster Tools about once per week.
Good luck.
Time heals all wounds...
Losing control of the domain is a big blow, and it will take time to recover. It sounds like you're following all the correct procedures (getting people to change links, using 301s, etc.)
Has the content of the original site changed since being put up again? If not, you should probably make some changes. If Google re-crawls the page and finds it substantially identical to the one previously indexed, it might consider it a copy and that's why it's using the original URL.
Also, I believe that Google has a resolution process for just such situations. I'm not sure what the form to fill out is or who to contact, but surely some other SO citizens could help.
Good luck!