SSL and W3 XHTML Validator

This may be a dumb newbie question, so apologies for that.
My website uses an SSL certificate. I also include the W3 validator link in each of my web pages as follows:
<img src="valid-xhtml1.png" alt="Valid XHTML 1.0 Strict" height="31" width="88" />
(Note: I copied the W3 validator image over to my own server so SSL wouldn't complain about insecure resources.)
When I do this and click on the image to validate the page, the validator returns an error about the request being made insecurely. So I tried changing the href of the <a> tag to use https for the validator, but then the page simply doesn't load (I guess because the validator doesn't support SSL).
Does anyone know a way around this? I am guessing there is no way to use the code as-is, but maybe there is a way to change uri=referer to uri=https://mysite.com/...? Is there a way to dynamically grab the URL of the current page?
Also, just for further reference, does SSL simply prevent the referer request header from being accessed?
Oh, and I know I can just go to my website using http instead of https, and the validator works. But I'd rather get it configured to work with https too.

As for the "validate icon" question:
This would usually lead to the browser displaying a message about "insecure items" (i.e. mixed http+https content). The validation icon is not officially supported in that constellation; a partial workaround is described here.
If you want to grab the URI dynamically, I suspect you will have to use JavaScript for that and then create/add the <a> element in the DOM...
As for the SSL/Referer question:
The standard says that a client (i.e. the browser) should not send the Referer header when navigating from a secure page to a non-secure destination - so yes, in the mixed case the Referer won't get sent to the non-secure URL.

OK, so it's not looking like there is a way to do this with just HTML, so I decided to use JavaScript to handle the issue.
I removed the <a> tag from around the W3 logo and added an onclick JavaScript function, validatePage(). So here is basically a template for an XHTML Strict page that still lets you include the validation icon.
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
  <head>
    <meta http-equiv="Content-Type" content="text/html;charset=utf-8" />
    <title>Title of document</title>
    <script type="text/javascript">
      function validatePage() {
        // Swap the page's "https" scheme for "http" so the validator can
        // fetch it, and URL-encode the address for the query string.
        var pageUrl = document.URL.replace(/^https:/, "http:");
        window.open("http://validator.w3.org/check?uri=" + encodeURIComponent(pageUrl));
      }
    </script>
  </head>
  <body>
    <h1>Test Template Page</h1>
    <p><img src="valid-xhtml1.png" alt="Valid XHTML 1.0 Strict" height="31" width="88" onclick="validatePage()" style="cursor: pointer;" /></p>
  </body>
</html>
Notice how validatePage() swaps the "https" scheme for "http" before handing the page's URL to the validator, so the HTTP Referer header isn't needed at all.
Hope this helps someone else.

Related

Why is the referer from my server always null?

I am trying to work out why the referrer from my server always seems to be blank. I have knocked together the following page to test it:
<html>
  <head>
    <meta http-equiv="Refresh" content="0; url='https://www.whatismyreferer.com/'" />
    <meta name="referrer" content="origin" />
  </head>
  <body>
  </body>
</html>
When I go to this page, the reported referrer comes back blank.
Is this something that is being set at the server level in Apache? I have a case where I need to pass the referrer, so finding out what is controlling this would be good.
The referrer header (with the famous "referer" misspelling) is sent by the browser. If the browser decides not to send it (e.g. for privacy reasons), it simply won't. You should never rely on the header being there. Even if you find configurations that currently work, the request is valid with or without this header, and browsers might change their opinion at any time (they did: the header used to be omnipresent; now it's less present).
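To make the "never rely on it" point concrete, here is a minimal sketch of defensive server-side handling, assuming a Python/Flask endpoint (the app and route are purely illustrative, not part of the original setup):
from flask import Flask, request

app = Flask(__name__)

@app.route("/landing")
def landing():
    # The Referer header may be absent: privacy settings, an HTTPS->HTTP
    # hop, or a referrer policy can all suppress it, so treat it as optional.
    referrer = request.headers.get("Referer", "(not sent)")
    return "Referrer: " + referrer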

How to capture JS redirects in Selenium?

Is there any way to capture all the redirects on a page performed in JS? For instance, let's take a look at this web page, which performs a redirect using window.location:
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8">
    <title>Redirect JS</title>
  </head>
  <body>
    <script>
      window.location = "http://www.example.com";
    </script>
  </body>
</html>
or meta tag
<meta http-equiv="refresh" content="0; url=http://example.com/">
I would like to render the web page and get all URLs the user has been redirected through. Is that possible? How can it be done in Selenium?
In Python (http://selenium-python.readthedocs.org/en/latest/api.html): the webdriver has a current_url property. After you driver.get() the page, I would assume current_url is the redirected URL. Is it not?
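For example, a minimal sketch with the Python bindings (the local test URL is hypothetical and stands in for a page like the one above):
from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait

driver = webdriver.Firefox()
start_url = "http://localhost:8000/redirect.html"  # page that sets window.location
driver.get(start_url)
# Give the in-page redirect a moment to fire before reading the URL.
WebDriverWait(driver, 10).until(lambda d: d.current_url != start_url)
print(driver.current_url)  # e.g. http://www.example.com/
driver.quit()
Note this reports only the final URL, not every hop along the way.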
Your requirement "in Selenium" will make this impossible. Selenium interacts with a browser as a human would - a human should generally not know or care about all the redirects. If you are willing to abandon Selenium for this purpose, then there are libraries such as HttpBuilder (in the Java world) and many others (for other languages) that allow you to manipulate and watch HTTP traffic, which is what you are after here.
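For the HTTP-traffic route, a minimal sketch with Python's requests library shows the idea for plain HTTP (3xx) redirects; note that, like any non-browser client, it will not execute JavaScript, so script-driven redirects stay invisible to it (the URL is hypothetical):
import requests

resp = requests.get("http://localhost:8000/old-page", allow_redirects=True)
for hop in resp.history:  # each 3xx response along the chain
    print(hop.status_code, hop.url)
print("final:", resp.url)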

neo4j REST API getting HTML response instead of JSON

I'm trying to use neo4j's REST API from an Apache Flex front-end. When my Flex app connects to the base URL (http://localhost:7474/db/data/) to discover other service URLs, it gets replies back in HTML rather than JSON format (just like if I enter the base URL into my browser).
In the Flex HTTP request, I've set the Content-Type and Accept headers both to "application/json" but it hasn't made a difference. I've also tried both GET and POST request methods.
I've verified neo4j is capable of sending JSON responses through a simple telnet window, so it must be "intelligently" formatting the reply based on something in the HTTP request. I'd thought the Content-Type and Accept headers would take care of it, though.
I realize the problem isn't technically in neo4j, but rather somewhere inside Flex's HTTPService (and supporting) classes, but I've been unsuccessful in working around the apparent bug/limitation.
Is there a way to simply force all such responses from neo4j to just be in JSON format?
Thanks,
Chris
* EDIT *
As requested below, here is the exact reply I'm getting in my Flex app:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<html><head><title>Root</title><meta content="text/html; charset=utf-8" http-equiv="Content-Type">
<link href='http://resthtml.neo4j.org/style/rest.css' rel='stylesheet' type='text/css'>
<script type='text/javascript' src='/webadmin/htmlbrowse.js'></script>
</head>
<body onload='javascript:neo4jHtmlBrowse.start();' id='root'>
<div id='content'><div id='header'><h1><a title='Neo4j REST interface' href='/'><span>Neo4j REST interface</span></a></h1></div>
<div id='page-body'>
<table class="root"><caption>Root</caption>
<tr class='odd'><th>relationship_index</th><td>http://localhost:7474/db/data/index/relationship</td></tr>
<tr><th>node_index</th><td>http://localhost:7474/db/data/index/node</td></tr>
</table>
<div class='break'> </div></div></div></body></html>
This is the same result I get if I just put the base URL in my web browser manually and retrieve it that way.
I figured it out. When I compiled and ran my Flex app as a browser-based app, it used the browser's native capability to request the URL, blowing away my customized Content-Type and Accept headers.
When I compiled and ran it as an Adobe AIR desktop app, it worked fine and I received the proper JSON response.
Likely this is a bug in Flash Player, as the documentation for the Flex HTTPService class doesn't mention any limitation on changing Content-Type or other headers when running in a browser vs. AIR.
-Chris
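For anyone who wants to sanity-check the content negotiation outside the browser, here is a minimal sketch assuming Python's requests library and a default local install (not part of the original Flex setup):
import requests

resp = requests.get(
    "http://localhost:7474/db/data/",
    headers={"Accept": "application/json"},  # ask for JSON, not the HTML browse view
)
print(resp.json())  # the service-root document with the discovery URLs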

SharePoint 2010 - Page Viewer

I would like to use the Page Viewer web part to display an HTML page with some JavaScript. That page is to be hosted as a stand-alone page within SharePoint (perhaps under the Shared Documents folder).
The problem is this: when I point the web part at the page, it prompts me to save the HTML file rather than displaying its content inside the web part.
I am following general rules to create the html file:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
  <head>
    <title>My Little SharePoint Page</title>
  </head>
  <body>
    <div id="PlayerName">
    </div>
    <div id="display">
    </div>
  </body>
</html>
So I just need the page to be displayed inside the Page Viewer web part - and not to be prompted to save it as a file.
Your web application configuration is forcing your users to download the file; this is by design and by default. You can change it in "Web Application General Settings" in Central Administration: look for "Browser File Handling" and make sure it's not set to "Strict". Check the TechNet page about this setting.
This can have security implications (like loading PDF files inside the browser window). There are other ways to accomplish this using JS.
You need to remove everything north and south of the body tags first; they are already provided by SharePoint. Then upload the file (with just the div content) to your document library and use the Page Viewer to point to that file.
I note that you are using PlayerName, implying that some type of Flash/video player content is to come later; depending on the object settings used, that can also trigger a "download file" prompt.

Setting X-UA-Compatible meta tag in ASP.NET 4.0 site doesn't work

As I understand it, you can tell IE8 (and I assume later versions) how best to render your page.
This is useful because the page may have been designed for IE7, quirks mode, or to target IE8 standards mode. As I understand it, the default behaviour for IE8 when it encounters a page is to render in IE8 standards mode (I'm not sure how it interprets the DOCTYPE, though). With this default, the user can change the rendering mode by clicking the "Compatibility View" button next to the refresh button.
This is nice for giving the user some control, but bad when you know your site only renders well in IE7 or whatever. In that case you don't want to let the user make the wrong choice, and that's where the ability for a website to tell an IE8+ browser how to render the page is very useful.
You simply have to provide the X-UA-Compatible meta tag within the head tag. There are loads of references on the web for how to do this and what values can be used. Remember to make it the first tag in the head.
<html xmlns="http://www.w3.org/1999/xhtml">
<head id="Head1" runat="server">
<meta http-equiv="X-UA-Compatible" content="IE=7" />
OK, so it's nothing new so far - however, it just doesn't work in my ASP.NET project. I've tried it on a couple of other projects I have, with the same problem.
Is there perhaps a scenario where, because I'm using developer tools like Visual Studio, IE has been configured to always show the "Compatibility View" button for debugging purposes? Grasping at straws here, I know.
I found out why this is happening.
It seems that ASP.NET's theming is interfering. Looking at the rendered output, there is a dynamically inserted <link> tag for each stylesheet in the theme.
The ASP.NET theming engine inserts these above the X-UA-Compatible meta tag, thus breaking IE's expectation of having it as the first tag in the head element.
So an ASP.NET site that has theming and the following in the source:
<html xmlns="http://www.w3.org/1999/xhtml">
<head id="Head1" runat="server">
<meta http-equiv="X-UA-Compatible" content="IE=7" />
will get rendered out as follows:
<html xmlns="http://www.w3.org/1999/xhtml">
<head id="ctl00_Head1">
<link href="App_Themes/White/Default.css" type="text/css" rel="stylesheet" />
<meta http-equiv="X-UA-Compatible" content="IE=7" />
This seems to be a bit of a bug. I'll create a MS Connect issue for it.
There's an interesting workaround for this here. I'll include the gist to make it easier:
The "styleSheetTheme" setting always places its CSS file in the header
at the top before anything else. To move the "X-UA-Compatible" before
it, you would have to do the following:
Make the meta tag accessible from the server code by giving it an ID and add the "runat" attribute:
...
Add the following pre-render event handler to your page (or master page):
protected void Page_PreRender(object sender, EventArgs e)
{
    // Find the meta tag by the ID it was given in the markup, then move it
    // to position 0 so it renders ahead of the theme's stylesheet links.
    Control MyFirstCtrl = Page.Header.FindControl("FirstCtrlID");
    Page.Header.Controls.Remove(MyFirstCtrl);
    Page.Header.Controls.AddAt(0, MyFirstCtrl);
}
You can move things around in the header this way for anything that you explicitly define in there.