I am currently trying to develop an Apple Watch-optimized email, but none of the existing approaches, such as using the headers:
Content-Type: text/watch-html; charset="utf-8"
Content-Type: text/X-watch-html; charset="utf-8"
works.
I also tried the tricks described in this Litmus article: https://www.litmus.com/blog/how-to-send-hidden-version-email-apple-watch/
but they had no effect either. The Apple Watch still displays the standard text/html version of the email. Perhaps someone has a working solution, or an explanation for this behavior.
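For context, what I'm building is a multipart/alternative message with an extra text/watch-html part, roughly along these lines (addresses, boundary, and body content are just placeholders; parts are ordered least to most preferred, so text/html stays last):

From: sender@example.com
To: recipient@example.com
Subject: Apple Watch test
MIME-Version: 1.0
Content-Type: multipart/alternative; boundary="boundary-main"

--boundary-main
Content-Type: text/plain; charset="utf-8"

Plain-text fallback.

--boundary-main
Content-Type: text/watch-html; charset="utf-8"

<b>Watch-only content.</b>

--boundary-main
Content-Type: text/html; charset="utf-8"

<html><body>Full desktop/mobile HTML version.</body></html>

--boundary-main--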
I'm trying to get and print the HTTP protocol version of the target in requests made by vue-resource in a Vue.js component. I have no problem getting headers such as Date or Server (since I'm on localhost, I use a proxy to bypass the CORS limitations), but I can't figure out how to do the same for the protocol version. As far as I know, both the Chrome and Firefox developer tools show the HTTP protocol version on the request (not on the response), so using response.headers.get("foo"); as explained in the official documentation doesn't work, and I don't have a request variable to inspect. I just need to show whether the target uses HTTP 1.x or 2.x, in a string like HTTP/1.1 200 OK. I can't tell whether Vue.http.interceptors could help, and if so, how. I guess it shouldn't be that hard... thanks in advance!
EDIT: This isn't related to Vue.js itself, but resource.nextHopProtocol did the trick, at least on Firefox. Since the underlying spec is still a Candidate Recommendation, it doesn't work in all browsers.
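In case it helps anyone, here's roughly what I ended up with, using the Resource Timing API rather than anything vue-resource-specific (the URL is just a placeholder):

// Find the timing entry for a resource we've already requested and read the
// negotiated protocol (e.g. "http/1.1" or "h2").
// Note: for cross-origin resources nextHopProtocol can be an empty string
// unless the server sends a Timing-Allow-Origin header.
var targetUrl = "https://example.com/api/data"; // placeholder
var entry = performance.getEntriesByType("resource")
  .filter(function (e) { return e.name === targetUrl; })[0];

if (entry && entry.nextHopProtocol) {
  console.log("Negotiated protocol: " + entry.nextHopProtocol);
} else {
  console.log("No timing entry, or protocol not exposed for this resource.");
}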
I'm trying to use neo4j's REST API from an Apache Flex front-end. When my Flex app connects to the base URL (http://localhost:7474/db/data/) to discover other service URLs, it gets replies back in HTML rather than JSON format (just like if I enter the base URL into my browser).
In the Flex HTTP request, I've set the Content-Type and Accept headers both to "application/json" but it hasn't made a difference. I've also tried both GET and POST request methods.
I've verified neo4j is capable of sending JSON responses through a simple telnet window, so it must be "intelligently" formatting the reply based on something in the HTTP request. I'd thought the Content-Type and Accept headers would take care of it, though.
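For reference, a minimal request that gets JSON back from that endpoint looks something like this (host and port are from the default local install):

GET /db/data/ HTTP/1.1
Host: localhost:7474
Accept: application/json

With that Accept header the server replies with the JSON service map instead of the HTML browser page.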
I realize the problem isn't technically in neo4j, but rather somewhere inside Flex's HTTPService (and supporting) classes, but I've been unsuccessful in working around the apparent bug/limitation.
Is there a way to simply force all such responses from neo4j to just be in JSON format?
Thanks,
Chris
* EDIT *
As requested below, here is the exact reply I'm getting in my Flex app:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<html><head><title>Root</title><meta content="text/html; charset=utf-8" http-equiv="Content-Type">
<link href='http://resthtml.neo4j.org/style/rest.css' rel='stylesheet' type='text/css'>
<script type='text/javascript' src='/webadmin/htmlbrowse.js'></script>
</head>
<body onload='javascript:neo4jHtmlBrowse.start();' id='root'>
<div id='content'><div id='header'><h1><a title='Neo4j REST interface' href='/'><span>Neo4j REST interface</span></a></h1></div>
<div id='page-body'>
<table class="root"><caption>Root</caption>
<tr class='odd'><th>relationship_index</th><td>http://localhost:7474/db/data/index/relationship</td></tr>
<tr><th>node_index</th><td>http://localhost:7474/db/data/index/node</td></tr>
</table>
<div class='break'> </div></div></div></body></html>
This is the same result I get if I just put the base URL in my web browser manually and retrieve it that way.
I figured it out. When I compiled and ran my Flex app as a browser-based app, it used the browser's native capability to request the URL, blowing away my customized Content-Type and Accept headers.
When I compiled and ran as an Adobe Air desktop app, it worked fine and I received the proper JSON response.
Likely this is a bug in Flash Player, as the documentation for the Flex HTTPService class doesn't mention any limitation on changing Content-Type or other headers when running in a browser vs. AIR.
-Chris
I have my website configured to serve static content using gzip compression, like so:
<link rel='stylesheet' href='http://cdn-domain.com/css/style.css.gzip?ver=0.9' type='text/css' media='all' />
I don't see any other websites doing anything similar, so the question is: what's wrong with this? Should I expect shortcomings?
To be precise, as I understand it, most websites are configured to serve normal static files (.css, .js, etc.) and gzipped content (.css.gz, .js.gz, etc.) only if the request comes with an Accept-Encoding: gzip header. Why should they do this when all browsers support gzip just the same?
PS: I am not seeing any performance issues at all because all the static content is gzipped prior to uploading it to the CDN which then simply serves the gzipped files. Therefore, there's no stress/strain on my server.
Just in case it's helpful, here's the HTTP Response Header information for the gzipped CSS file:
And this for the gzipped favicon.ico file:
Supporting Content-Encoding: gzip isn't a requirement of any current HTTP specification; that's why there is a trigger in the form of the request header.
In practice? If your audience is using a web browser and you are only worried about legitimate users, there is very, very slim to no chance that anyone will actually be affected by only having preprocessed gzipped versions available; the negotiation is largely a remnant of a bygone age. Browsers these days should handle being force-fed gzipped content even if they didn't request it, as long as you also send the correct headers describing the content being given to them. It's important to realise that an HTTP request/response is a conversation, and that most of the headers in a request are just that: a request. For the most part, the server on the other end is under no obligation to honor any particular header, and as long as it returns a valid response that makes sense, the client on the other end should do its best to make sense of what was returned. This includes handling gzip if the server responds that it has used it.
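For example, a pre-compressed stylesheet served that way would need a response along these lines (values illustrative):

HTTP/1.1 200 OK
Content-Type: text/css
Content-Encoding: gzip
Vary: Accept-Encoding
Cache-Control: public, max-age=31536000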
If your target is machine consumption, however, then be a little wary. People still think it's a smart idea to write their own HTTP/SMTP/etc. parsers, even though the topic has been done to death in libraries for pretty much every language out there. Those libraries should all support gzip just fine, but hand-rolled parsers usually won't.
I'm tearing my hair out over Internet Explorer 9's caching.
I set a series of cookies from a perl script depending on a query string value. These cookies hold information about various things on the page like banners and colours.
The problem I'm having is that in IE9 it will always, ALWAYS, use the cache instead of using the new values. The sequence of events runs like this:
1. Visit www.example.com/?color=blue
2. Perl script sets cookies, I am redirected back to www.example.com
3. Colours are blue, everything is as expected.
4. Visit www.example.com/?color=red
5. Cookies set, redirected, colours set to red, all is normal.
6. Re-visit www.example.com/?color=blue
7. Perl script runs, cookies are re-set (I have confirmed this) but! IE9 retrieves all resources from the cache, so on redirect all my colours stay red.
So, every time I visit a new URL it gets the resources fresh, but each time I visit a previously visited URL it retrieves them from the cache.
The following meta tags are in the <head> of example.com, which I thought would prevent the cache from being used:
<META HTTP-EQUIV="CACHE-CONTROL" CONTENT="NO-CACHE">
<META HTTP-EQUIV="PRAGMA" CONTENT="NO-CACHE">
<META HTTP-EQUIV="EXPIRES" CONTENT="0">
For what it's worth - I've also tried <META HTTP-EQUIV="EXPIRES" CONTENT="-1">
IE9 seems to ignore ALL of these directives. The only time I've had any success so far in that browser is by using the developer tools and ensuring that it is manually set to "Always refresh from server".
Why is IE ignoring my headers, and how can I force it to check the server each time?
Those are not headers. They are <meta> elements, which are an extremely poor substitute for HTTP headers. I suggest you read Mark Nottingham's caching tutorial; it goes into detail about this and about which caching directives are appropriate to use.
Also, ignore anybody telling you to set the caching to private. That enables caching in the browser - it says "this is okay to cache as long as you don't forward it on to another client".
Try sending the following as HTTP Headers (not meta tags):
Cache-Control: private, must-revalidate, max-age=0
Expires: Thu, 01 Jan 1970 00:00:00 GMT
I don't know if this will be useful to anybody, but I had a similar problem on my movies website (crosstastemovies.com). Whenever I clicked on the button "get more movies" (which retrieves a new random batch of movies to rate) IE9 would return the exact same page and ignore the server's response... :P
I had to add a random variable to the URL in order to keep IE9 from doing this. So instead of calling "index.php?location=rate_movies" I changed it to "index.php?location=rate_movies&rand=RANDOMSTRING".
Everything is ok now.
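If you're building the URL in JavaScript, the cache-buster can be as simple as this (parameter name and value are arbitrary):

// Append a throwaway query parameter so IE9 treats every request as a new URL.
var url = "index.php?location=rate_movies&rand=" + new Date().getTime();
window.location.href = url;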
Cheers
I'll just mention that I had a problem that looked very much like this, but when I tried IE9 on a different computer there was no issue. Going to Internet Options -> General -> Delete and deleting everything restored correct behaviour; deleting just the cache was not sufficient.
The only items that HTML5 specifies are content-type, default-style and refresh. See the spec.
Anything else that seems to work is only by the grace of the browser and you can't depend on it.
johnstok is correct. Typing in that code will allow content to update from the server and not just refresh the page.
<meta http-equiv="Content-Type" content="text/html; charset=utf-8; Cache-Control: no-cache" />
Put this line of code into your <head> section if you need to have it in your ASP code, and it should work.
Does there exist a website service or set of scripts that will tell you whether your web page is badly configured, if your goal is to be internationally friendly?
To be more precise, I'm wondering if something like this exists:
Checking URL: http://www.example.com
GET / HTTP/1.0
Accept-Charset: utf8
...
HTTP/1.0 200 OK
Charset: iso-8859-1
..<?xml version="1.0" charset="utf8" ?>
WARNING: Header document conflict, your server claims to return iso-8859-1, but
includes octet values outside the legal range. This can happen when your documents
are saved with a different character set than your web server is configured to serve.
From my understanding, it's unlikely that this will help me make a website that will allow people to post in Japanese or Hebrew, but it might help my English websites reach a larger international audience.
I believe the W3C validator does it, but maybe not to the extent you are looking for...
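If nothing off-the-shelf covers it, a rough homemade version of that check isn't much code. A sketch in Node-style JavaScript (needs Node 18+ for the built-in fetch; the URL is a placeholder):

// Fetch a page and compare the charset advertised in the Content-Type header
// with the charset/encoding declared inside the document itself.
async function checkCharset(url) {
  const res = await fetch(url);
  const contentType = res.headers.get("content-type") || "";
  const headerMatch = contentType.match(/charset=([^;]+)/i);
  const headerCharset = headerMatch ? headerMatch[1].trim().toLowerCase() : null;

  const body = await res.text();
  const docMatch = body.match(/<meta[^>]+charset=["']?([\w-]+)/i) ||
                   body.match(/<\?xml[^>]+encoding=["']([\w-]+)/i);
  const docCharset = docMatch ? docMatch[1].toLowerCase() : null;

  if (headerCharset && docCharset && headerCharset !== docCharset) {
    console.warn("WARNING: header says " + headerCharset +
                 " but the document declares " + docCharset);
  } else {
    console.log("Header charset: " + headerCharset +
                ", document charset: " + docCharset);
  }
}

checkCharset("http://www.example.com/"); // placeholder URL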