Chrome developer tools header formats and view source

Below are two screenshots from the same version of Chrome. I would like to know when and why header names are sometimes displayed with different word capitalization, and when the view source/view parsed toggle is available. The developer tools documentation says nothing about it, and I have tried loading pages in different ways. The only pattern I suspect is content compression; could that be it?
Update: no, I have seen both versions on sites using gzip.

It seems that it happens only for resources served over HTTP/2 or SPDY (compare this image served over HTTP/2 with the same image served over HTTP/1.1). HTTP/2 requires header field names to be lowercase on the wire, so DevTools is simply showing the names as they were received. There is an old Chrome bug showing that HTTP/2/SPDY headers are handled differently. I reported this as a bug here.
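As a side note, in recent versions of Chrome you can confirm from the console which protocol actually delivered each resource (a quick sketch using the Resource Timing API):

    // Log every resource with the protocol that delivered it:
    // "h2" for HTTP/2, "spdy/3.1" for SPDY, "http/1.1" for plain HTTP.
    performance.getEntriesByType('resource').forEach(function (entry) {
        console.log(entry.nextHopProtocol, entry.name);
    });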

Related

Web fonts not loading on IE11 on Windows 10 Pro

I have a site that uses web fonts. The site used to function properly until the client's IT team rolled out Windows 10 Pro across the organization. After the Windows upgrade, the web fonts stopped working in IE 11 with the error "CSS3111: @font-face encountered unknown error". However, on all other major browsers, including the new Edge, the site works without any issues.
After some searching I came to know that it is caused by a recommended security feature called Untrusted Font Blocking, and that to disable it I would need to modify certain registry keys. In my case that is not an option, as the feature is recommended by Microsoft for security, and the change would need to be made on each local machine.
While googling, I noticed some people suggesting embedding the font file as a Base64-encoded string. In fact, I could see that workaround on many Q&A sites and forums (e.g. here, here). But I failed when I attempted it: on my Win10 + IE11 I still get "CSS3111: @font-face encountered unknown error" (screenshot).
Further googling led me to this SO question, which actually answers my problem. It says that even if I convert the font file to Base64, Win10 + IE11 will still block the font when it is converted and loaded into memory for execution.
Interestingly, major font and icon vendors don't seem to be attempting a workaround for this either, as none of their websites show up properly on Win10 + IE11. Even Microsoft's own site (outlook.office.com) has this issue.
Now my questions are:
Is there any workaround that can help me fix the issue?
If there is no workaround, is it a good idea to show a popup warning end users to switch to a better-supported browser, based on user-agent detection (see the sketch below)?
Thanks in advance.
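For reference, the user-agent detection I have in mind for question 2 would be something like this (the Trident check and the message wording are just a rough sketch, not a vetted recommendation; UA sniffing is brittle):

    // IE 11 is the only current browser whose UA string contains "Trident/".
    if (navigator.userAgent.indexOf('Trident/') !== -1) {
        // Hypothetical message text; adjust to your audience.
        alert('Web fonts are blocked on this system in IE 11. ' +
              'For the intended appearance, please use Edge, Chrome or Firefox.');
    }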

How can I use google web fonts with phantomjs

I'm using PhantomJS version 1.9.7, which I believe is supposed to support web fonts. I have included the font via Google Web Fonts, however it just displays my fallback font when I automate a screen grab. The web font displays fine in all my browsers. Are there any workarounds for this?
I have struggled with this issue for several hours. Well, there is a simple reason for this behavior: the user agent!
Some services, such as Google Fonts, return different CSS content based on the user agent. When you request a page that includes Google Fonts with the default PhantomJS user agent, Google returns the TTF version, which is supported by PhantomJS.
However, if you set a custom user agent (Chrome, Firefox, etc.), Google Fonts returns the .woff2 version. .woff2 is not supported by PhantomJS 2.x, so obviously the fonts are not going to load.
So, for users who test PhantomJS without setting a custom user agent, Google Fonts works. If they set, for example, the Google Chrome user agent, it does not.
So, you have two options:
Avoid setting a custom user agent, if possible (see the sketch below).
Avoid "smart" font providers like Google Fonts that do not output all font formats in the CSS and let the browser decide what it needs.
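As an illustration of the first option, a minimal PhantomJS capture script that simply leaves the default user agent in place (the URL and output filename are placeholders):

    var page = require('webpage').create();

    // Leaving page.settings.userAgent untouched keeps the default PhantomJS UA,
    // so Google Fonts serves TTF, which PhantomJS can render. Setting a Chrome
    // UA here would make Google serve .woff2 instead and break the fonts:
    // page.settings.userAgent = 'Mozilla/5.0 ... Chrome/40.0';

    page.open('http://example.com/', function (status) {
        // Give the web fonts a moment to download before rendering.
        setTimeout(function () {
            page.render('screenshot.png');
            phantom.exit();
        }, 1000);
    });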
I believe your belief is wrong :-)
The 1.9.x series of Phantom is still based on the old WebKit code, which people have been reporting web font problems with all along.
The good news is that a technical preview of PhantomJS 2.0 is apparently about to be released; https://github.com/ariya/phantomjs/wiki/PhantomJS-2 is intended to be the best place to follow its status. And, from memory, someone reported success with web fonts using it (but I may be wrong on that, as a quick search of the mailing list archives didn't turn up a definitive message saying web fonts work... but they definitely should).
BTW, SlimerJS is an almost drop-in replacement, based on Firefox's Gecko engine, and it does support web fonts (though with some problems in corner cases, IIRC).
I just had the same issue with PhantomJS 2.1.1.
In my case I was working behind a proxy that blocked PhantomJS from loading the font from Google. After connecting to an open network, it rendered correctly.
I tried all the fixes listed here to no avail, but here is a workaround. Either option resolved this for me.
First Option:
Install the missing web fonts on the local computer that is running PhantomJS. Most web font providers including Google allow you to easily download the fonts for local install. No change needed at the target URL.
Second Option:
For sites I have control over, I was able to resolve the issue by splitting up the link tags.
Change this:
<link href='//fonts.googleapis.com/css?family=Roboto:400,500,700|Open+Sans:400italic,700italic,400,700' rel='stylesheet' type='text/css'>
To:
<link href='//fonts.googleapis.com/css?family=Roboto:400,500,700' rel='stylesheet' type='text/css'>
<link href='//fonts.googleapis.com/css?family=Open+Sans:400italic,700italic,400,700' rel='stylesheet' type='text/css'>
I know this is not an ideal solution, but either one works. Which to use depends on whether you can modify the target URL.

ASP.net Ajax Partial Rendering using UpdatePanel not working in WebKit browsers

I am part of the developer team for quite a large online system using ASP.NET (4).
ASP.NET Ajax completely breaks down in WebKit browsers, and we get full page postbacks when we should be getting partial ones for the UpdatePanels.
I am starting to believe it has something to do with my application configuration, mainly for the following reasons:
If I move the Ajax-enabled controls to a new project, they work as expected in all browsers, including WebKit.
I created a static .aspx file with nothing but an UpdatePanel, a ScriptManager and a button making a literal visible on click.
I get no JavaScript errors from any browser, and I see an HTTP request for the ASP.NET Ajax script (ScriptResource.axd) in both Firebug and Chrome Developer Tools.
I tried ye olde Safari fix from this highly referenced thread.
Edit: After a bit more testing and HTTP sniffing, I noticed a major difference between the test application and the actual application. The test application generates two additional .axd files which are not generated by the actual application. These WebResource.axd files seem to contain data related to the async postback. However, this is only the case for WebKit browsers; the WebResource.axd files are generated for Firefox, as I can see them in Firebug.
What I am asking the community for is any ideas or suggestions as to what could be the cause of this problem, and whether I am correct to assume that the problem is probably on the server side.
Thanks for any help.
The problem was due to a deprecated config file, used to limit the content that bots/spiders/crawlers receive, which was being loaded by mistake thanks to our lovely in-house CMS.
In short, if you get behavior similar to my case, check your configs.
I was having a similar issue; however, my problem was with all browsers and not just WebKit. I ended up going through and tearing up the web.config file and found that the line <xhtmlConformance mode="Legacy"> was preventing WebResource.axd from working properly. The fix was to simply remove that line from my web.config file.
For a little more information on xhtmlConformance, visit http://technet.microsoft.com/en-us/librarY/ms228268(v=vs.85).aspx.
If you scroll all the way to the bottom, you'll notice it explicitly states that this setting causes issues with WebResource.axd and ScriptResource.axd.

External PDF highlighting with Safari browser not working!

We have been facing a weird problem with PDF documents displayed in Safari. The problem is reproducible on many of our machines, and it goes like this:
Adobe Reader supports hit highlighting in PDF documents when they are viewed in a browser. For example,
http://www.mysite.com/myfile.pdf#xml=http://www.somesite.com/words.txt
This URL should highlight the words specified in the words.txt file. But unfortunately, many of our Safari browsers (on Windows machines) don't highlight any text in the opened PDF file. The same URL works fine in the rest of the browsers (IE, Firefox and Chrome). I could not figure out where the problem is!
Can anybody please help me on this?
Thanks in advance,
Safari uses a built-in PDF plugin exclusive to Safari (even on iPhone).
This is different from Adobe's plugin, and the API is different too.
Most of the things that work with the Adobe plugin won't work with the one inside Safari.
You may be able to find information about forcing Safari to use the Adobe Reader plugin (Google is your friend), but that would be a per-user setting, something you cannot control on everybody's machine.
I'm also looking for help on controlling PDF files inside Safari using JavaScript.
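In the meantime, one way to at least detect which PDF viewer a browser will use is the classic plugin API (a rough sketch; the exact plugin name strings vary by version, so treat them as assumptions):

    // Ask the browser which plugin is registered for PDFs.
    var pdfMime = navigator.mimeTypes['application/pdf'];
    if (pdfMime && pdfMime.enabledPlugin) {
        // Typically something like "Adobe Acrobat" for the Adobe plugin,
        // versus Safari's own built-in viewer name.
        console.log('PDF handled by: ' + pdfMime.enabledPlugin.name);
    } else {
        console.log('No PDF plugin registered.');
    }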

Is there an HTTP proxy tool that can substitute browsed content?

What I'm looking for is some sort of proxy tool that will allow me to specify a local file to load instead of the one referenced by the web page being browsed. I have tried Burp Suite, which almost works: it lets us intercept a file and replace it by pasting the contents of the swapped-in file into an input field. The file content is compiled code (Flash content), so we are pasting in bytecode, but something isn't working.
The reason is that we are a third-party software developer without access to our client's development or testing environments. Our content must interact correctly with the rest of the content on their web page (there are elements on their page that communicate with our content), and testing any change we make takes several hours of turnaround to get our files uploaded to their servers. So what we need is some sort of hacking tool that lets us test our work against their web pages, hence the requirement to swap a file referenced in a web page with a local version.
The autoresponder feature in Fiddler Web Debugging Proxy might do what you need, if it's only static content.
I've been using HTTP::Proxy for a long time, and it has always helped me fiddle with things on the fly.
You might be able to do this with Greasemonkey, but I'm not sure if the tests will be totally reliable.
http://diveintogreasemonkey.org/patterns/replace-element.html
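As a sketch of that pattern applied to this case (all of the names here are hypothetical: the site URL, the SWF filename and the local server):

    // ==UserScript==
    // @name     Swap client SWF for local build
    // @include  http://client.example.com/*
    // ==/UserScript==

    // Point every matching <embed> at a locally served copy of the SWF.
    // Serve the replacement over HTTP (e.g. http://localhost:8000/),
    // since file:// URLs are often blocked from http pages.
    var embeds = document.getElementsByTagName('embed');
    for (var i = 0; i < embeds.length; i++) {
        if (/ourContent\.swf$/.test(embeds[i].src)) {
            embeds[i].src = 'http://localhost:8000/ourContent.swf';
        }
    }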
And if Greasemonkey seems plain wrong for you, I would take it as the perfect excuse to try out mouseHole. Now, I have to admit that I've never tried it, but since _why also made Hpricot, I expect it to be fun, productive, and different.