OneDrive REST API download returns 200 instead of 302

I am accessing OneDrive from a C++ program using the WinHttp* functions. What absolutely baffles me is the REST API download command. I am downloading a small test picture with
GET /v1.0/drive/items/FF306293D40F9529!158/content
When I do this in the console, I get a response exactly as described in the docs, that is, a 302 redirect to the actual content. However, when I send the same request with WinHttpSendRequest, I receive the actual file contents instead of a redirect:
HTTP/1.1 200 OK
Cache-Control: public
Date: Wed, 06 Jan 2016 10:22:36 GMT
Content-Length: 161796
Content-Type: image/jpeg
Content-Location: https://public-bn1306.files.1drv.com/blahblahbla
Expires: Tue, 05 Apr 2016 10:22:36 GMT
Last-Modified: Wed, 06 Jan 2016 10:22:35 GMT
Accept-Ranges: bytes
ETag: aRkYzMDYyOTNENDBGOTUyOSExNTguMw
P3P: CP="BUS CUR CONo FIN IVDo ONL OUR PHY SAMo TELo"
Server: Microsoft-IIS/8.5
X-MSNSERVER: BN1306____PAP099
Strict-Transport-Security: max-age=31536000; includeSubDomains
X-SqlDataOrigin: S
CTag: aYzpGRjMwNjI5M0Q0MEY5NTI5ITE1OC4yNTc
X-PreAuthInfo: rv;poba;
Content-Disposition: attachment; filename="Test.jpg"
X-Content-Type-Options: nosniff
X-StreamOrigin: X
X-AsmVersion: UNKNOWN; 19.33.0.0
X-MSEdge-Ref: Ref A: 44F40B6FB83547EFBC895911207BDB42 Ref B: 96336B0FC1B608598B549A1CB70C8C59 Ref C: Wed Jan 06 02:22:36 2016 PST
First I thought that maybe it was because the file was small, but trying to download a 250 MB binary file changed nothing.
I'm not complaining, actually; this makes the program a bit simpler, but I would certainly like to know why the same request behaves differently with WinHttp*. I suppose I'm doing something wrong, but what? (Banging my head against my desk.)

There are HTTP libraries that follow redirects automatically: you send a request, the library receives the 302, decides on its own to go to the redirect location, and hands you the 200 it gets back from there.
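WinHttp is one of them: by default it follows redirects for you, which is why WinHttpReceiveResponse presents the 200 from the files.1drv.com host instead of the 302. If you want to see the redirect yourself, you can switch the behaviour off with WINHTTP_OPTION_REDIRECT_POLICY. A minimal sketch, assuming hRequest is the handle from your existing WinHttpOpenRequest call:
// Disable automatic redirect handling so the 302 (and its Location header)
// is returned to the caller instead of being followed.
DWORD policy = WINHTTP_OPTION_REDIRECT_POLICY_NEVER;
WinHttpSetOption(hRequest, WINHTTP_OPTION_REDIRECT_POLICY, &policy, sizeof(policy));

// After WinHttpSendRequest / WinHttpReceiveResponse, read the redirect target:
wchar_t location[2048];
DWORD size = sizeof(location);
if (WinHttpQueryHeaders(hRequest, WINHTTP_QUERY_LOCATION,
                        WINHTTP_HEADER_NAME_BY_INDEX, location, &size,
                        WINHTTP_NO_HEADER_INDEX))
{
    // 'location' now holds the pre-authenticated download URL to fetch next.
}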

Related

How to debug a browser caching issue?

I have configured Apache httpd with some cache settings. However, it does not seem to work properly and the browser seems to keep fetching the file as if it were not cached.
I tried to debug it by fetching the header.
$ curl -skI https://whatever-url-it-is/some-script.js
HTTP/1.1 200 OK
Date: Sun, 15 Sep 2019 05:35:08 GMT
Server: Apache/2.4.39 (Amazon) OpenSSL/1.0.2k-fips PHP/5.6.40
Last-Modified: Sun, 02 Jul 2017 10:52:25 GMT
ETag: "152ba-5535372162c40"
Accept-Ranges: bytes
Content-Length: 86714
Cache-Control: max-age=2592000, public
Expires: Tue, 17 Sep 2019 05:35:08 GMT
Vary: Accept-Encoding,User-Agent
Content-Type: text/javascript
The Cache-Control header seems to be correct. The Expires header looks a little odd, but according to this page (https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Expires) Expires will be ignored given that Cache-Control is there.
What else can I do to debug this issue? Any idea is very welcome. Thanks.
Never mind. I didn't realize that "Disable Cache" was turned on in Chrome. User error :-)
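For anyone debugging a similar setup, another quick check (independent of the browser) is to replay the request as a conditional request and see whether the server answers 304; a sketch in the same curl style as above, reusing the ETag from the headers shown:
# First request: note the ETag / Last-Modified values in the response.
$ curl -skI https://whatever-url-it-is/some-script.js

# Replay with If-None-Match; if validation works, the server should answer
# "HTTP/1.1 304 Not Modified" instead of resending the 86714-byte body.
$ curl -skI -H 'If-None-Match: "152ba-5535372162c40"' https://whatever-url-it-is/some-script.js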

Forcing Google Cloud Storage to respond with Content-Encoding

I would like to serve some Unreal-generated HTML5 files from Google Cloud Storage. Some of the files are gzip-encoded, and one of the JavaScript files checks that every file ending in .gz is returned with Content-Encoding: gzip. I don't want to change these files.
In the GCS UI, I set the Content-Encoding (and Content-Type) fields in the file metadata. To prevent decompressive transcoding I also set Cache-Control: no-transform. Despite that, the Content-Encoding header is still missing from the response (full headers below). Is there anything else I can do to make Google Cloud Storage respond with Content-Encoding?
HTTP/1.0 200 OK
X-GUploader-UploadID: some hash
Date: Sun, 24 Feb 2019 01:06:05 GMT
Cache-Control: no-transform
Expires: Mon, 24 Feb 2020 01:06:05 GMT
Last-Modified: Sun, 24 Feb 2019 00:52:29 GMT
x-goog-generation: 1550969549021835
x-goog-metageneration: 3
x-goog-stored-content-encoding: gzip
x-goog-stored-content-length: 99750206
Content-Type: application/octet-stream
x-goog-hash: crc32c=oobjmg==
x-goog-hash: md5=fN6srk43sVaYDVNgpXtdLQ==
x-goog-storage-class: REGIONAL
Accept-Ranges: none
Server: UploadServer
Vary: Accept-Encoding
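For reference, the metadata described above can also be set (and inspected) from the command line with gsutil; a sketch, where the bucket and object names are placeholders rather than the asker's real ones:
# Apply the metadata to an existing object ("my-bucket" and the path are placeholders).
gsutil setmeta \
  -h "Content-Encoding:gzip" \
  -h "Content-Type:application/octet-stream" \
  -h "Cache-Control:no-transform" \
  gs://my-bucket/HTML5/game.js.gz

# Confirm what is actually stored for the object:
gsutil stat gs://my-bucket/HTML5/game.js.gz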

PDF not showing in visitors' browsers

I have a webpage (http://optiswissopen2015.ch/page/noticeboard) with some PDFs on it. All of them are linked the same way, but in some browsers (IE8 for sure) they are shown as text instead of opening in a PDF viewer.
<a href=" /files/Noticeboard/1436883318_sism2015.pdf"
download runat="server" class="button color3">
My first thought was that they might have a problem in the header, but converting them to .ps and back doesn't help.
What can I do so that they open correctly in all browsers?
As a last resort, I could ZIP them :-(
Yes, the issue is the content type returned for some PDF files on your server. To verify it, use curl -i [url], wget -S [url], or an online header-checking tool.
For example, 1436883318_sism2015.pdf returns (incorrect):
HTTP/1.1 200 OK
ETag: "[omitted]"
Last-Modified: Tue, 14 Jul 2015 14:15:18 GMT
Content-Type: text/plain
Content-Length: 1188394
Date: Mon, 27 Jul 2015 17:19:21 GMT
Accept-Ranges: bytes
Server: LiteSpeed
Connection: close
And anmedleguide.pdf returns the correct header:
HTTP/1.1 200 OK
Date: Mon, 27 Jul 2015 17:21:25 GMT
Server: Apache/2.4.10 (Unix)
Last-Modified: Fri, 17 Jul 2015 13:07:01 GMT
ETag: "[omitted]"
Accept-Ranges: bytes
Content-Length: 932698
Content-Type: application/pdf
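Assuming the host honours Apache-style per-directory configuration (the incorrect response comes from a LiteSpeed server, which reads .htaccess files, but whether overrides are enabled here is an assumption), the wrong type could be corrected with something like:
# .htaccess in the directory that serves the PDFs (e.g. /files/Noticeboard/)
# Map the .pdf extension to the correct MIME type instead of text/plain.
AddType application/pdf .pdf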
Why don't you try integrating pdf.js?
I opened your page in Firefox and all the PDFs are displayed nicely (Firefox opens PDFs in pdf.js by default). So you can integrate it into your website and display the PDFs in the same manner irrespective of the browser.
In the link below I have explained how to set up pdf.js easily. (Just use your actual server instead of the localhost mentioned there.)
Look at this link.
It will take only a few minutes to set it up, and the PDF issue will vanish for sure.
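For example, once the pre-built pdf.js viewer is deployed somewhere on the site, the existing links could point at the viewer instead of the raw file; the /pdfjs/web/viewer.html path below is a placeholder for wherever it is installed:
<!-- the viewer path is hypothetical; ?file= takes the URL-encoded path of the PDF -->
<a href="/pdfjs/web/viewer.html?file=%2Ffiles%2FNoticeboard%2F1436883318_sism2015.pdf"
   class="button color3">Open notice (PDF)</a>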

G-WAN 3.12.26 64-bit adds a duplicate HTTP header

I use G-WAN for image generation, so I need to set the correct content type, but after some load G-WAN 3.12.26 adds its own header block with content type text/html and returns the page with two HTTP header blocks.
How to reproduce this:
Use the setheaders.c servlet from the G-WAN package, start G-WAN and open the page, say http://localhost/?setheaders.c, and you will get this (correct response):
HTTP/1.1 200 OK
Date: Sat, 29 Dec 2012 20:37:52 GMT
Last-Modified: Sat, 29 Dec 2012 20:37:52 GMT
Content-type: text/html
Content-Length: 371
Connection: close
<!DOCTYPE HTML><html lang="en"><head><title>Setting response headers</title><meta http-equiv="Content-Type" content="text/html; charset=utf-8"><link href="imgs/style.css" rel="stylesheet" type="text/css"></head><body style="margin:16px;"><h1>Setting response headers</h1><br>This reply was made with custom HTTP headers, look at the servlet source code.<br></body></html>
Now run ApacheBench: ab -n 1000 'http://localhost/?setheaders.c' (1000 requests were enough on my system).
DO NOT RESTART G-WAN. Open http://localhost/?setheaders.c again, and this is what you should get (incorrect response, two HTTP header blocks):
HTTP/1.1 200 OK
Server: G-WAN
Date: Sat, 29 Dec 2012 20:43:34 GMT
Last-Modified: Fri, 16 Jan 1970 16:53:33 GMT
ETag: "be86ada7-14b40d-16f"
Vary: Accept-Encoding
Accept-Ranges: bytes
Content-Type: text/html; charset=UTF-8
Content-Length: 367
Content-Encoding: gzip
Connection: close
HTTP/1.1 200 OK
Date: Sat, 29 Dec 2012 20:43:34 GMT
Last-Modified: Sat, 29 Dec 2012 20:43:34 GMT
Content-type: text/html
Content-Length: 371
Connection: close
<!DOCTYPE HTML><html lang="en"><head><title>Setting response headers</title><meta http-equiv="Content-Type" content="text/html; charset=utf-8"><link href="imgs/style.css" rel="stylesheet" type="text/css"></head><body style="margin:16px;"><h1>Setting response headers</h1><br>This reply was made with custom HTTP headers, look at the servlet source code.<br></body></html>
G-WAN returns the correct response if gzip and x-gzip are not listed as acceptable encodings in the request header (Accept-Encoding: gzip, x-gzip).
Is it possible to solve this by modifying just the servlet? If yes, then how?
Are you setting the MIME type as shown in fractal.c?
// -------------------------------------------------------------------------
// specify a MIME type so we don't have to build custom HTTP headers
// -------------------------------------------------------------------------
char *mime = (char*)get_env(argv, REPLY_MIME_TYPE);
// note that we set up the FILE EXTENSION, not the MIME type:
mime[0] = '.'; mime[1] = 'g'; mime[2] = 'i'; mime[3] = 'f'; mime[4] = 0;
If you do so, then there's no way to confuse the automatic-headers feature.
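For context, a minimal servlet built around that fragment might look like the sketch below. The entry point and helpers (main(), get_env(), get_reply(), xbuf_cat()) follow the servlet samples shipped with G-WAN; the placeholder payload stands in for your real image-generation code:
// minimal G-WAN servlet sketch; gwan.h declares get_env(), get_reply(), xbuf_cat()
#include "gwan.h"

int main(int argc, char *argv[])
{
   // give G-WAN the file extension of the reply so it builds matching HTTP
   // headers and does not fall back to its default text/html header block
   char *mime = (char*)get_env(argv, REPLY_MIME_TYPE);
   mime[0] = '.'; mime[1] = 'g'; mime[2] = 'i'; mime[3] = 'f'; mime[4] = 0;

   // write the payload into the reply buffer (placeholder data here;
   // a real servlet would generate the image bytes instead)
   xbuf_cat(get_reply(argv), "GIF89a placeholder");

   return 200; // HTTP status code sent to the client
}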
Other than that, v3.12 has had many instability problems (file-time failures, pthread failures, signal failures, etc.) due to our direct syscalls and GLIBC wrappers - an effort initially intended to make the program run on all versions of Linux.
We have found (thanks to the many reports like yours) that rather than trying to fix those issues one by one (pointlessly fighting GLIBC, a moving target with many different releases each having its own bugs and specificities) a much safer path is to ditch GLIBC.
That's what G-WAN v4 will do, just a few days from now.

Image Not Caching?

I'm convinced that certain images on my site are not being cached properly. I have set the headers as best I can, but it still seems like they are downloaded again every time I hit the refresh button.
For example, a particular image always takes a bit over 1 second to download, even after it should already be cached. Here are the response headers:
HTTP/1.1 200 OK
Date: Sun, 06 Mar 2011 12:51:52 GMT
Server: Apache/2.2.17 (Unix) mod_ssl/2.2.17 OpenSSL/0.9.7a mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635 PHP/5.2.16
Last-Modified: Thu, 01 Jan 1970 00:00:00 GMT
Accept-Ranges: bytes
Content-Length: 19211
Cache-Control: max-age=630323456, public
Expires: Wed, 03 Mar 2021 12:51:52 GMT
Keep-Alive: timeout=5, max=98
Connection: Keep-Alive
Content-Type: image/png
Is there anything wrong with this? Thanks.
UPDATE
<FilesMatch "\.(htm|html|php)$">
Header set Expires "Thu, 01 Jan 1970 00:00:00 GMT"
</FilesMatch>
Your Last-Modified says 1970, and your max-age is 630323456 seconds (19 years). So the file has been 'expired' since 1989, and must be re-downloaded. The browser is doing what it should be doing.
Solution:
Change the Last-Modified to the real Last-Modified (probably some time in the past few years)
Change the max-age to a sensible value (at most about a year, i.e. 31536000 seconds, in the spirit of RFC2616's guidance on Expires)
Remove the Expires header; it is overridden when you also have max-age. See RFC2616 section 14.9.3. Alternatively, delete the Cache-Control header and keep only the Expires header. Either one is fine, but only use one, not both of them.
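To put the max-age/Expires points into the same .htaccess style as the UPDATE above, a sketch (the one-week value is just an example, and it assumes mod_headers is available, as the existing Header directive suggests):
# Cache images for one week (604800 seconds); the Expires header is dropped
# so only Cache-Control governs freshness.
<FilesMatch "\.(png|jpg|jpeg|gif)$">
Header set Cache-Control "max-age=604800, public"
Header unset Expires
</FilesMatch>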