Rails send_file not sending data when called through a web service - ruby-on-rails-3

I am in a strange situation where send_file is unable to send a file correctly. Here is the situation:
Rails version: 3.0.10 and 3.1.0 [two different branches for testing]
Ruby: 1.9.2 on RVM
Webserver: Apache with Passenger
My client has a document management system; I worked on upgrading it from Rails 2 to Rails 3 (and now Rails 3.1). We mostly redeveloped the system, as the earlier one was quite old. All the features are working except one. The application allows users to download documents assigned to them: when users log in, they can see which documents are assigned to them and download them. It works perfectly fine. Here is the code that works:
send_file(document.file[:path],
          :type => document.file[:content_type],
          :x_sendfile => true,
          :filename => document.name)
There is one client for whom they (my client) previously made their solution send the document when requested through a .NET-based web service (or whatever it may be called). The web service authenticates as a user and is then forwarded to the document download path. I am able to make the web service authenticate and then redirect to the controller action which downloads the files, but it does not work. The server log says everything is OK:
Started GET "/download/12234" for 12.123.12.123 at 2011-09-20 23:21:24 -0400
Processing by DocumentController#download as HTML
Parameters: {"id"=>"12234"}
Sent file /yyy/zzz/abc/12234 (0.1ms)
Completed 200 OK in 138ms
I have changed the specific names and IPs. Note that the IP (12.123.12.123) is for the server which hosts web service.
I was on a call with the developer who built the .NET web service, and he says he is getting all the headers correctly, except that the content length is -1 and he is receiving no content. He said all other headers are correct.
To solve the problem, I tried multiple variations of send_file, setting all possible options (x_sendfile, stream, disposition, etc.). I also tried setting these headers:
response.headers["Cache-Control"] = "no-cache, no-store, max-age=0, must-revalidate"
response.headers["Pragma"] = "no-cache"
response.headers["Expires"] = "Fri, 01 Jan 1990 00:00:00 GMT"
But nothing works when I use the web service to download the file. However, the same method works directly in the browser [I tested by bypassing authentication in the code].
I also tried send_data, but it does not work either:
File.open(document.file[:path], 'r') do |f|
  send_data f.read, :type => document.file[:content_type], :filename => document.name, :disposition => 'inline'
end
As a workaround I tried redirect_to instead of send_file, using a test file in the public folder, and it works. Although insecure, this seems to work fine. The only problem is that the browser now opens the document instead of downloading it.
Please help me.
Update: the problem was that Rails now sends chunked content, while the web service was expecting a content length.

It was not a problem with send_file; it was how the .NET web service was programmed. It was expecting a content length.
There is a change (between Rails 2 and Rails 3) in the default behaviour when sending content. It is now chunked (Transfer-Encoding: chunked), so there cannot be a Content-Length header.
The .NET developer changed his code and everything is now working fine! Hope this helps somebody.
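If fixing the client were not an option, one possible Rails-side workaround (a sketch, not part of the original answer) is to compute the file size up front and set Content-Length explicitly before calling send_file, so the response is not chunked. The snippet below only demonstrates the size computation using a throwaway temp file standing in for document.file[:path]; the header names are standard HTTP, and in a controller the assignment would be to response.headers:

```ruby
require "tempfile"

# Throwaway file standing in for document.file[:path].
file = Tempfile.new("doc")
file.write("0123456789")   # 10 bytes of stand-in content
file.flush

# Compute the length before streaming, the way a controller could do
# right before send_file to avoid a chunked response:
#   response.headers["Content-Length"] = File.size(path).to_s
headers = {
  "Content-Type"        => "application/octet-stream",
  "Content-Disposition" => "attachment; filename=\"report.bin\"",
  "Content-Length"      => File.size(file.path).to_s,
}

puts headers["Content-Length"]  # prints 10
```

Note that whether this helps depends on the web server setup: with :x_sendfile => true, Apache (mod_xsendfile) serves the file itself and normally sets Content-Length on its own.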

Related

IIS return 404 error but the files are exist

I have a website that implemented with .net core 2.1 and angular js.
I published this website on windows server that worked with IIS.
The problem is that sometimes the response to an HTTP request for some file (or an AJAX call) is a 404 error, and sometimes it works correctly. I am sure the file exists, because if a user who hits the error refreshes the page, it loads correctly!
I attached photos comparing the same requests.
After many days I still have this problem. I enabled IIS logging and attached one of the XML log files here.

PWA Caching Issue

I have a PWA which has been developed in ASP.net Core and hosted on an Azure App Service (Linux).
When a new version of the PWA was released, I found that devices failed to update without clearing the browser cache.
To resolve this, I discovered a tag helper called asp-append-version that busts the cache for a specific file. I also discovered that I can append a version query string to the src attribute that specifies the URL of a file, to trigger the browser to retrieve the latest file. For example, src="/scripts/pwa.js?v=1". Each time I update the pwa.js file I also change the version, i.e. v=2.
I've now discovered that my PWA is caching other JavaScript files in my application, which results in the app not working on devices that have updated to the new version but failed to clear the cache for specific files.
I believed that if I didn't specify any cache control headers, such as Cache-Control, the browser would not cache any files; however, this appears not to be the case.
To resolve this issue, is the recommended approach to add the appropriate cache headers (Cache-Control, Pragma, and Expires) to prevent browser caching, or should I only add the asp-append-version tag helper to, for example, script tags to bust the cache for those specific files?
I would prefer the browser to store, for example, images rather than going to the server each time to retrieve them. I believe setting the header Cache-Control: no-cache would work, as this would check whether the file has changed before retrieving the updated version?
Thanks.
Thanks @SteveSandersonMS for your insights. If your web server returns correct HTTP cache control headers, browsers will know not to re-use cached resources.
Refer to link 1 & link 2 for cache control headers on a Linux App Service.
For example, if you use the "ASP.NET Core hosted" version of the Blazor WebAssembly template, the server will return Cache-Control: no-cache headers, which means the browser will always check with the server whether updated content is present (this uses ETags, so the server returns 304, meaning "keep using your cached content", if nothing has changed since the browser last fetched the content).
If you use a different web server or service, you need to configure the web server to return correct caching headers. Blazor WebAssembly can't control or even influence that.
Refer here
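The no-cache-plus-ETag revalidation described above looks roughly like this on the wire (the path and tag value here are illustrative, not taken from the question):

```
GET /scripts/pwa.js HTTP/1.1
If-None-Match: "abc123"

HTTP/1.1 304 Not Modified
ETag: "abc123"
Cache-Control: no-cache
```

With no-cache the browser may keep a copy, but must revalidate it with the server before each use; only when the ETag no longer matches does the server send a full 200 with the new body.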

Can't render iframe on another domain using rack-cors

I'm trying to render an iframe of App A within App B.
App A is a local Rails 5.0 app and is using https.
App B is hosted on Heroku and is using https.
I've tried implementing the rack-cors gem but with no success, and I've tried all the suggestions I can find on StackOverflow.
My cors.rb file, within App A:
Rails.application.config.middleware.insert_before 0, Rack::Cors do
  allow do
    origins 'https://app-b.herokuapp.com'
    resource '/url/on/app_a/*',
      headers: :any,
      methods: :any
  end
end
My config.ru file (I've tried with and without this):
# This file is used by Rack-based servers to start the application.
require ::File.expand_path('../config/environment', __FILE__)
run Rails.application

require 'rack/cors'
use Rack::Cors do
  allow do
    origins 'https://app-b.herokuapp.com'
    resource '/url/on/app_a/*',
      headers: :any,
      methods: :any
  end
end
The error I get is: Refused to display 'https://app-a.com/' in a frame because it set 'X-Frame-Options' to 'sameorigin'.
I am not sure if this is specific to rack-cors, but I do know that the header 'X-Frame-Options' is intentionally set to 'sameorigin' by default in at least Rails 5, most likely to prevent developers from unintentionally allowing someone to wrap their site in an iframe.
According to the docs, if the server does not set this response header to 'sameorigin', the browser will allow the page to be framed. So what we need is to remove that header. Chris Peters does a great job at this post. To save a click:
class SomeController < ApplicationController
  after_action :allow_iframe

  private

  def allow_iframe
    response.headers.except! 'X-Frame-Options'
  end
end
To apply this to all endpoints, simply place the after_action line and the method in ApplicationController, but I would suggest limiting this to specific pages/controllers.
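A narrower alternative (a sketch, not part of the answer above): rather than deleting X-Frame-Options outright, replace it with a Content-Security-Policy frame-ancestors directive that whitelists only the embedding origin. The hash below stands in for response.headers in a controller; the App B origin is taken from the question:

```ruby
# Stand-in for response.headers in a Rails controller.
headers = { "X-Frame-Options" => "sameorigin" }

# Allow framing only from the given origin: drop the blanket
# X-Frame-Options header and set a frame-ancestors policy instead.
def allow_embedding(headers, origin)
  headers.delete("X-Frame-Options")
  headers["Content-Security-Policy"] = "frame-ancestors #{origin}"
  headers
end

allow_embedding(headers, "https://app-b.herokuapp.com")
puts headers["Content-Security-Policy"]
# prints: frame-ancestors https://app-b.herokuapp.com
```

In a controller this would live in the same kind of after_action shown above; modern browsers honour frame-ancestors over X-Frame-Options, so only App B can embed the page rather than any site at all.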

Celerity cannot follow a Devise redirect, because Celerity doesn't send an accept header and Devise responds with plain text

We've got a Rails application just upgraded to Rails3 using Devise's Rails3 gem for authentication. We've been using Capybara with Celerity backend to test some of the pages.
When accessing the application in a browser, Devise responds with a redirect to the login page when a user is trying to access a protected page/controller.
In the Rails 2.x version of Devise this used to work even if the incoming request had a blank or wildcard accept header (*/*).
In the Rails 3 version, Devise responds with a plain text string when the accept header is blank.
The reason the blank accept header thing matters is because we're using the Celerity backend of Capybara to test some of the pages, and apparently Celerity sends a blank accept header, and thus doesn't get redirected by Devise. This behavior has changed from Devise for Rails 2.x to Devise for Rails 3.
Celerity fails with an UnexpectedPageException and the server log reports that the request was made with */* as accept header.
When using the Selenium/Webdriver backend on the exact same test suite, the problem goes away.
There are two ways to tackle this:
Tell Devise to somehow always assume text/html as accept header and respond accordingly. How could that be done? Do we have to override the controllers?
"Fix" Celerity to sent text/html as accept header. How can this be done?
Is this an HTMLUnit problem/bug?
To me #2 looks like the "right" way to fix this, but I'm not sure if Celerity/HTMLUnit's lack of accept header is a bug or a feature. Thoughts?
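Option #1 could be sketched on the Rails side as a small helper that falls back to text/html whenever the incoming Accept header is blank or a bare wildcard (the helper name is illustrative; it is not part of Devise):

```ruby
# Fall back to text/html when the Accept header is missing, empty,
# or just the wildcard that HTMLUnit/Celerity appears to send.
def effective_accept(accept_header)
  header = accept_header.to_s.strip
  (header.empty? || header == "*/*") ? "text/html" : header
end

puts effective_accept(nil)         # prints text/html
puts effective_accept("*/*")       # prints text/html
puts effective_accept("text/xml")  # prints text/xml
```

In a Rails 3 app this kind of logic would typically sit in a before_filter that sets request.format to :html, so Devise's failure app responds with the normal HTML redirect.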
It turns out to be an issue with HTMLUnit. I've made a patch to the Celerity gem which you can find here: https://github.com/jarib/celerity/pull/49
It will set the default accept header to "text/html" but also adds an optional parameter to override it.

FLV's not being served by web server (302 response)

I have a web app. IIS 6. .NET 3.5.
I have 2 websites on the web server. One of which is already correctly serving FLVs. The newer one is not.
I have added the MIME type information to the HTTP Headers in the website properties ['.flv', 'video/x-flv'] (as FLV is not an extension IIS recognises by default).
When I go to the URL, Firefox goes black and displays "Waiting for video". It stays like this. I have checked the IIS logs and found the GET request and the HTTP status associated with it, which is 302. This is a "Moved Temporarily" status code, and I don't understand why it would be returned. All other content on this site (currently consisting of web pages and images) is returned fine.
I have tried the same video in the older site, and just pointing Firefox at the URL, it plays correctly.
Any help as to why I can't do this would be much appreciated, thank you.