How do I get placemark icons to load over SSL?

I'm working on a web application that uses the Google Earth plugin. Recently, a new requirement was added that non-public users must log on, which meant that some users were now using the site over https. Among the things that broke in testing were the custom placemark icons (they were working over http).
The icons are hosted on the same server that serves the page.
Here are the URLs for each protocol.
http - http://localhost/Images/yellow.png
https - https://localhost/Images/yellow.png
I can follow that link and the image will appear as you would expect.
The image hrefs are declared as icon styles in dynamically generated KML.
I want to avoid loading the images over http because I think that will cause Internet Explorer to present the user with a mixed content warning.
How do I get the images to load properly while using https?
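For context, here is a rough sketch of the kind of KML generation involved (a hypothetical Python/Flask route with placeholder names, not the actual application code); one option it illustrates is building the icon href from the same scheme the KML was requested over, so an https page gets https icon URLs:

    from flask import Flask, Response, request

    app = Flask(__name__)

    # Placeholder KML; only the IconStyle/Icon/href element matters here.
    KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <Document>
        <Style id="yellowIcon">
          <IconStyle>
            <Icon>
              <href>{scheme}://{host}/Images/yellow.png</href>
            </Icon>
          </IconStyle>
        </Style>
      </Document>
    </kml>"""

    @app.route("/placemarks.kml")
    def placemarks():
        # request.scheme is "https" when the page was served over SSL, so the
        # icon URL uses the same protocol as the page itself.
        kml = KML_TEMPLATE.format(scheme=request.scheme, host=request.host)
        return Response(kml, mimetype="application/vnd.google-earth.kml+xml")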

I have been wrestling with this myself -- the short answer is that this won't work. If the content is served off of an HTTPS site that generates any kind of error/prompt (authentication, invalid certificate, etc.) the plugin will simply not load the content.
Interestingly, the desktop client works fine and prompts the user for credentials if necessary. However, neither client will allow content to be served off of a site with an untrusted certificate.
The only workaround that I have found is:
Use a trusted HTTPS certificate on the server hosting the content (either trust the certificate on the client systems or just use a real certificate).
Do not use basic auth over HTTPS, as that will always generate 401/challenge responses, which the web browser client will simply ignore.
If authentication is a requirement, use NTLM authentication and common (e.g., domain) logins. If you load the plugin in Internet Explorer (or in a .NET WebBrowserControl), the authentication will be handled properly and the images will show up.
I was at a Google Earth administrator's training last week and the trainer confirmed this "bug". It is supposed to be fixed in the next version of the plugin (it may actually be fixed already -- what version of the plugin are you using?)

Related

How can I whitelist OneDrive using "quintolabs qlproxy" for web filtering?

I am using quintolabs qlproxy for web filtering. How can I whitelist OneDrive so it stays synchronized? What are the URLs and IPs to whitelist?
It seems the issue is that the OneDrive application uses SSL pinning and thus does not accept the mimicked SSL certificate from your Squid proxy. A similar issue for Dropbox is explained at http://docs.diladele.com/faq/squid/dropbox.html.
The same error will be present in all SSL-inspecting web filters. For example, judging from a post on the Sophos (Astaro) UTM support forum, the list of domain names to exclude is quite large (see https://www.astaro.org/gateway-products/network-protection-firewall-nat-qos-ips/56579-microsoft-onedrive.html):
skyapi.live.net
storage.live.com
skydrive.live.com
shared.live.com
onedrive.live.com
Please note the list may not be complete. The best approach is to fire up Wireshark or (better) Microsoft Message Analyzer on the machine where OneDrive is installed and see which domain names are sent to the proxy when the OneDrive application starts. Then exclude these from SSL bump.
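If you also need IP addresses for the whitelist, keep in mind that the addresses behind these names may change, so it is safer to whitelist by domain name and resolve on demand. A small Python sketch (using the domain list quoted above) to check what they currently resolve to:

    import socket

    # Domain names listed above; resolve them on demand rather than
    # hard-coding IPs, since the addresses behind these names may change.
    domains = [
        "skyapi.live.net",
        "storage.live.com",
        "skydrive.live.com",
        "shared.live.com",
        "onedrive.live.com",
    ]

    for name in domains:
        try:
            ips = sorted({info[4][0] for info in socket.getaddrinfo(name, 443)})
            print(name, " ".join(ips))
        except socket.gaierror as err:
            print(name, "lookup failed:", err)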

Safari 9 disallowed running of insecure content?

After upgrading to Safari 9, I'm getting this error in the browser:
[Warning] [blocked] The page at https://localhost:8443/login was not allowed to run insecure content from http://localhost:8080/assets/static/script.js.
Does anyone know how to enable the running of insecure content in the new Safari?
According to the Apple support forums, Safari does not allow you to disable the block on mixed content.
Though this is frustrating for usability in legitimate cases like yours, it seems to be part of their effort to force secure content serving / content serving best practices.
As a solution, you can either upgrade the HTTP connection to HTTPS (which it seems you have done) or proxy your content through an HTTPS connection with an HTTPS-enabled service (or, in your case, port).
You can fix the HTTPS problem by using HTTPS locally with a self-signed SSL certificate. Heroku has a great how-to article about generating one.
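If the server handing out your assets happens to be a simple static file server, a minimal Python sketch of serving it over HTTPS with such a self-signed certificate might look like this (port 8080 and the file name localhost.pem are placeholders; the pem file is assumed to contain both the certificate and the key you generated):

    import http.server
    import ssl

    # Serve the current directory over HTTPS on port 8080 (the port that was
    # serving the assets over plain http in the question).
    server = http.server.HTTPServer(
        ("localhost", 8080), http.server.SimpleHTTPRequestHandler
    )
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    context.load_cert_chain("localhost.pem")  # self-signed certificate + key
    server.socket = context.wrap_socket(server.socket, server_side=True)
    server.serve_forever()

The page would then reference the script as https://localhost:8080/assets/static/script.js, so the mixed-content block no longer applies.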
After setting up SSL on all of your development servers, you will still get an error loading the resource in Safari, since an untrusted certificate is being used (self-signed SSL certificates are not trusted by browsers by default because they cannot be verified with a trusted authority). To fix this, you can load the problematic URL in a new tab in Safari and the browser will prompt you to allow access. If you click "Show Certificate" in the prompt, there will be a checkbox in the certificate details view to "Always allow content from localhost". Checking this before allowing access will store the setting in Safari for the future. After allowing access, just reload the page that was originally exhibiting the problem and you should be good to go.
This is a valid use case as a developer but please make sure you fully understand the security implications and risks you are adding to your system by making this change!
If like me you have
frontend on port1
backend on port2
want to load script http://localhost:port1/app.js from http://localhost:port2/backendPage
I have found an easy workaround: simply redirect, with an HTTP response, all http://localhost:port2/localFrontend/*path requests to http://localhost:port1/*path from your backend server configuration.
Then you could load your script directly from http://localhost:port2/localFrontend/app.js instead of the direct frontend URL (or you could configure a base URL for all your resources).
This way, Safari will be able to load content from another domain/port without needing any https setup.
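A rough sketch of that redirect, here as a hypothetical Flask backend (the frontend port and the /localFrontend/ prefix are just the placeholders used above, not anything from the actual setup):

    from flask import Flask, redirect

    app = Flask(__name__)
    FRONTEND = "http://localhost:3000"  # placeholder for port1, the frontend server

    @app.route("/localFrontend/<path:subpath>")
    def local_frontend(subpath):
        # Any request for /localFrontend/... on the backend is answered with an
        # HTTP redirect to the same path on the frontend server.
        return redirect(f"{FRONTEND}/{subpath}", code=302)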
For me, disabling website tracking (i.e., unchecking "Prevent cross-site tracking") worked.

Warning when using https on my website

I'm trying to secure my website with https. I managed to add the certificates and all that stuff, but in some parts of the website I get this message: "this website contains interactive content that isn't encrypted (such as scripts)". Any ideas on how to fix this?
The website runs on localhost.
I am using Apache on OS X Mavericks
You have resources on your website (JavaScript, for example) that aren't sent through an HTTPS request, but rather a regular HTTP request.
Try storing the resources on your own website instead of requesting them from a different one.
Yes. When you use HTTPS, ALL pictures, CSS, and JS files should be loaded from your machine.
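To track down which resources are still requested over plain HTTP, you can grep your templates for hard-coded http:// URLs. A rough Python sketch (the *.html glob is an assumption; adjust it to whatever template files your site actually uses):

    import pathlib
    import re
    import sys

    # Find src/href attributes pointing at plain http:// URLs, the usual cause
    # of "insecure content" warnings on an https page.
    pattern = re.compile(r'(?:src|href)\s*=\s*["\']http://[^"\']+["\']', re.IGNORECASE)

    root = pathlib.Path(sys.argv[1]) if len(sys.argv) > 1 else pathlib.Path(".")
    for path in root.rglob("*.html"):
        for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
            if pattern.search(line):
                print(f"{path}:{lineno}: {line.strip()}")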

How to fix Firefox defaulting to https for rails app on custom domain hosted on heroku

I have a Ruby on Rails 3 app hosted on Heroku with a custom domain. It uses OAuth to allow the user to log in through Facebook. After a user logs in through Facebook, the next time they type our domain into Firefox (tested on FF 15.0.1 on Mac), it automatically fills in https before the address (so the user is used to typing "example.com" into the address bar and pressing ENTER, but Firefox changes that to https://www.example.com). This of course shows the "This Connection is Untrusted" warning page (http://support.mozilla.org/en-US/kb/connection-untrusted-error-message) instead of loading our page, since we do not have an SSL certificate.
This only seems to happen with Firefox (tested on Chrome and Safari as well).
I've tried redirecting the Rails action that we point to for root to the http protocol version using this example (http://captico.com/securing-specific-routes-in-rails-3/2011/02), but that didn't work. I've also tried adding the ssl_requirement gem (https://github.com/bartt/ssl_requirement) and excluding the action that we point to for the root domain, but then I just got a bad URI error.
We're in money saving mode right now as we test out the site and slowly grow in users. I believe the best thing to do is to pay the money for our own SSL cert, as well as the $20/month to heroku to get SSL for our custom domain. But for now, we'd like to avoid having these extra costs.
Is there a way to fix this for free?
To fix it for free, use the *.herokuapp.com domain instead of a custom domain.

Images on SSL enabled site with Internet explorer

I have a problem with my site after implementing SSL: images do not appear. The scenario is that images come from images.domain.com (hosted on Amazon S3) and my certificate is for www.domain.com.
This problem only seems to happen in IE and not in any other browsers.
The issue is related to "mixed content" - HTTPS pages which have HTTP resources (images, scripts, etc.) embedded.
The point of using HTTPS is to ensure that only the originating server and the client have access to the secured page. However, in theory this security could be compromised if HTTP resources are embedded - an attacker in the middle might intercept an unsecured JavaScript file and inject code that alters the secured page on load.
Most browsers will indicate that a secure page has mixed content by altering the "secure lock" icon, either by showing the lock as open or broken, or by making the icon red (Chrome displayed a skull and crossbones for a short time, but they realised that this was a bit serious for the potential threat level).
Internet Explorer (depending on the version) will display a message either asking whether the insecure content should be shown (IE<=7), or whether only the secure content should be shown (IE>=8). It sounds like you have somehow disabled this message to always hide the insecure content, but that's not the default behaviour.
I think the best solution for you is to replace your S3 links with HTTPS versions.
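One low-effort way to do that is to rewrite the image URLs when rendering the page. A rough Python sketch (the bucket name and the path-style S3 endpoint are assumptions for illustration; verify them against your actual bucket setup and the certificate S3 presents):

    # Rough sketch: map an http:// image URL on the images subdomain to an
    # HTTPS S3 URL. "images.domain.com" is assumed to also be the bucket name.
    def https_image_url(url: str) -> str:
        prefix = "http://images.domain.com/"
        if url.startswith(prefix):
            key = url[len(prefix):]
            return f"https://s3.amazonaws.com/images.domain.com/{key}"
        return url

    print(https_image_url("http://images.domain.com/logo.png"))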
I am not a web developer, but someone who often deals with the crap experience that is IE. I am not sure what version you are using, but you do not have a wildcard SSL cert (i.e., *.domain.com), so could it have something to do with an old-school limitation on third-party images?
See here for what I allude to above and a very good explanation of how IE caches cross-domain HTTPS content, specifically images. I am not sure what the solution is, but I was curious so I researched a little myself and this might help.