"config.force_ssl = true" not forcing HTTPS - ssl

I am running Rails 3.1 and have tried putting the above line in development.rb and in application.rb (not both at the same time), but it doesn't seem to do anything; my requests still go through over plain HTTP. Isn't this meant to force all requests to use HTTPS? I'm sure I've missed something very obvious here, but I can't for the life of me think of what; being a newbie doesn't help either.
Any help would be greatly appreciated.
Cheers,
Dany.

It won't work locally; have you deployed it?
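For reference, on a deployed app the setting usually lives in the production environment config rather than development.rb; a minimal sketch (MyApp is a placeholder for your application's module name):

# config/environments/production.rb
MyApp::Application.configure do
  # Redirect all HTTP requests to HTTPS (and flag cookies as secure).
  config.force_ssl = true
end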

It will work locally, provided that your local server (WEBrick?) is configured to use SSL. Below is one way to do that via script/rails:
#!/usr/bin/env ruby.exe
# This command will automatically be run when you run "rails" with Rails 3
# gems installed from the root of your application.

# require 'rubygems'
require 'rails/commands/server'
require 'rack'
require 'webrick'
require 'webrick/https'

module Rails
  class Server < ::Rack::Server
    def default_options
      super.merge({
        :Port            => 3000,
        :SSLEnable       => false, # set to true to automatically generate SSL cert
        :SSLVerifyClient => OpenSSL::SSL::VERIFY_NONE,
        #:SSLCertificate => OpenSSL::X509::Certificate.new(File.open("ssl.crt").read),
        :SSLCertName     => [["CN", WEBrick::Utils::getservername]]
      })
    end
  end
end

APP_PATH = File.expand_path('../../config/application', __FILE__)
require File.expand_path('../../config/boot', __FILE__)
require 'rails/commands'
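Once :SSLEnable is flipped to true, starting the app with rails server from the project root should serve https://localhost:3000 using an automatically generated self-signed certificate (the browser will warn about it), and config.force_ssl then has an HTTPS endpoint it can redirect to.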

Related

Rails - Method not allowed when executing POST methods (error 405)

I come from this post:
Rails using puma, change localhost:3000 to localhost:3000/example
I have fixed that issue, but now I receive "Method not allowed" when I do a POST request. I have been reading and tried this solution:
Post returns 405 method not allowed
I know where the problem is: if I put lines 1- and 2- in application.rb, all assets are served correctly but POST methods stop working. If I comment these lines out, POST methods work but the assets don't.
Application.rb:
class Application < Rails::Application
  # Initialize configuration defaults for originally generated Rails version.
  config.load_defaults 5.1

  config.exceptions_app = ->(env) { ExceptionController.action(:show).call(env) }
  config.action_dispatch.rescue_responses["BadTaste"] = :bad_request

  1- config.action_controller.asset_host = "https://www.sevilla.org"
  2- config.assets.prefix = '/autorizaciones-movilidad'

  # Settings in config/environments/* take precedence over those specified here.
  # Application configuration should go into files in config/initializers
  # -- all .rb files in that directory are automatically loaded.
end
Routes:
Rails.application.routes.draw do
  #resources :assets, path: '/autorizaciones-movilidad'
  scope "/autorizaciones-movilidad" do
    get 'vehicles/new'
    get 'vehicles/create'
    ...
    get 'vehicles/update'
  end
end
Controller structure:
I don't know how to solve it. The app is deployed behind a proxy server (on localhost it was working fine).

Select SSL Routes serving up rails 4 static pages via highvoltage gem

I have several static ERB pages being served up in a Ruby on Rails 4 site via the HighVoltage gem:
get '/about' => 'high_voltage/pages#show', id: 'about'
get '/contact' => 'high_voltage/pages#show', id: 'contact', :protocol => "https"
get '/privacy' => 'high_voltage/pages#show', id: 'privacy'
This all works well and good, except that the /contact route doesn't redirect or force SSL; it is happy with whatever protocol is used.
I host the site on Engine Yard; attempting to put :force_ssl (or variants) in the route line resulted in failed deployments. HighVoltage uses a slightly different set of arguments than normal routes, so I suspect there is a conflict somewhere.
Has anyone used HighVoltage and SSL with Rails 4 for select static pages (not the whole site)? An example routes line would be appreciated.
You can achieve this by overriding HighVoltage::PagesController; see the override section of the documentation.
It might look something like this:
class PagesController < ApplicationController
  include HighVoltage::StaticPage

  before_filter :ensure_secure_page

  private

  def ensure_secure_page
    if params[:id] == 'contact'
      # Check to make sure SSL is being used. Redirect to the secure page if not.
    end
  end
end
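As a sketch of what that check could look like (this part is not from the HighVoltage docs), request.ssl? can be used to redirect the contact page to the HTTPS version of the same URL:

def ensure_secure_page
  # request.ssl? is true for HTTPS requests, including those behind a proxy
  # that sets X-Forwarded-Proto. Redirect the contact page to HTTPS otherwise.
  if params[:id] == 'contact' && !request.ssl?
    redirect_to request.url.sub(/\Ahttp:/, 'https:')
  end
end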
Next disable the routes that HighVoltage provides:
# config/initializers/high_voltage.rb
HighVoltage.routes = false
Then in your application's routes file you'll need to set up a new route:
# config/routes.rb
get "/pages/*id" => 'pages#show', as: :page, format: false

Secure session cookies for rails application

I have the following configuration in my session_store.rb
Fuel::Application.config.session_store :cookie_store,
  :key    => "_secure_session",
  :secure => !(Rails.env.development? || Rails.env.test?),
  :domain => :all
In application_controller.rb
def default_url_options
  return { :only_path => false, :port => 443, :protocol => 'https' }
end
I am using Devise and my Rails 3 server is running behind HAProxy. HAProxy terminates the HTTPS traffic and passes plain HTTP requests to Rails. My problem is that when I turn on :secure => true in session_store.rb, the user is redirected back to the sign-in page with the message "Unauthorized". I have tried debugging it a lot, but I'm not sure how to get it working.
It's a situation where HAProxy is the reverse proxy terminating all the secure traffic and passing non-secure traffic to Rails. When Rails sets the cookie as secure, somehow Rails itself is not able to read it back.
For your normal session cookie, you're doing this correctly. You should see the '_secure_session' cookie properly set as secure in your browser. For the Devise "remember me" cookie you'll need to set that in the Devise config. In config/initializers/devise.rb you'll find a line somewhere around line 133 that looks like
# Options to be passed to the created cookie. For instance, you can set
# :secure => true in order to force SSL only cookies.
# config.cookie_options = {}
I changed that to:
config.rememberable_options = {:secure => Rails.env.production?}
If Set-Cookie is not being sent to the browser on initial authentication, then it sounds like a Devise problem.
If Set-Cookie is going to the browser, but not being sent back on the next https:// request, then it's probably a mismatch on the :secure setting.
If the cookie is sent by the browser, but not passed along by HAProxy, then it's an HAProxy configuration problem.
If the cookie is in the Ruby environment, but being ignored due to policy, then it's a problem somewhere in Ruby code - at a guess, around secure/non-secure cookie matching.
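To narrow down the last two cases, it can help to log what Rails itself sees behind HAProxy. A diagnostic sketch (the filter name is made up; request.ssl? relies on the X-Forwarded-Proto header when SSL is terminated upstream):

# app/controllers/application_controller.rb
class ApplicationController < ActionController::Base
  before_filter :log_ssl_state

  private

  def log_ssl_state
    # If HAProxy does not add "X-Forwarded-Proto: https", request.ssl? stays
    # false, which is the usual reason a :secure session cookie never gets
    # established behind a TLS-terminating proxy.
    Rails.logger.info "ssl?=#{request.ssl?} forwarded_proto=#{request.headers['X-Forwarded-Proto'].inspect}"
  end
end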

Subdomain constraint (Rails 3) makes local server (thin) SO SLOW

I recently added a subdomain constraint to my Rails routes file
constraints(:subdomain => 'new') do
  devise_for :customers do
    get  "/customers/sign_up" => "registrations#new"
    post "/customers"         => "registrations#create"
    put  "/customers/:id"     => "registrations#update"
  end

  match '/' => 'roxy#index'

  namespace :roxy, :path => '/' do
    resources :customers
    resources :surveys
  end
end
In order to test the subdomain routing constraint locally, I added this line to my hosts file.
127.0.0.1 new.localhost.local
Now, I test my app in my browser at the URL new.localhost.local:3000. It takes about 10 - 15 seconds to load every page, which is unreasonably slow. If I remove the subdomain constraint and just go to 127.0.0.1:3000, everything is zippy and fast again.
What am I doing wrong? I'm new to Rails, so please tell me if there is a better way to do subdomain routing in rails, or if there is a setting I need to configure.
Figured it out. It's nothing to do with Rails, subdomains, or thin. It turns out that, unlike other Unix-like systems, OS X reserves the .local TLD for mDNS functionality. For every page, DNS resolution was timing out before the request ever reached my app. So I just changed my /etc/hosts entry to
127.0.0.1 new.localhost.dev
and everything's working great now.
Read more: http://www.justincarmony.com/blog/2011/07/27/mac-os-x-lion-etc-hosts-bugs-and-dns-resolution/

Phusion Passenger - route exists but is not matched (gives 404 instead)

I'm running a rails 3 app at the root level in a phusion passenger environment (CentOS, apache) and having difficulty getting passenger to find some routes, although rake routes shows the routes correctly. Everything works fine in development (i.e. using rails server instead of phusion passenger in apache).
I have an admin section to my app with a login page. The main part of the app works, but everything under the admin section is inaccessible because I get a 404 instead of the login page (when I disable login I can access the admin pages). My apache config is
<VirtualHost *:80>
  ServerName foo.bar.com
  DocumentRoot /var/www/apps/myapp/current/public
  <Directory /var/www/apps/myapp/current/public>
    Allow from all
    Options -MultiViews
  </Directory>
</VirtualHost>
My login process is implemented as a before_filter in an admin controller:
class Admin::AdminController < ApplicationController
  # login disabled for testing
  before_filter :require_login

  def require_login
    @current_user ||= User.find_by_id(session[:user_id])
    redirect_to admin_login_path unless @current_user
  end
end
My routes file has
Mpf::Application.routes.draw do
  secure_protocol = "https://"
  ...
  namespace "admin" do
    ...
    match "login" => "user_sessions#new", :as => :login, :constraints => { :protocol => secure_protocol }
    ...
  end
  ...
end
and when I run rake routes I get
admin_login /admin/login(.:format) {:protocol=>"http://", :action=>"new", :controller=>"admin/user_sessions"}
BUT when I try to access http://foo.bar.com/admin I get a 404 and the log shows
Started GET "/admin/login" for iii.iii.iii.iii at 2011-07-13 07:20:41 -0400
ActionController::RoutingError (No route matches "/admin/login"):
As far as I can tell it should be working... except for the fact that it's not. Any help would be greatly appreciated!
Have you tried accessing it with https://? It looks like you provided a constraint that prevents access over http://, and the URL you posted uses http://.
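If the admin pages only need to be HTTPS-only where a certificate actually exists, one option (a sketch, not part of the answer above) is to derive the constraint from the environment instead of hard-coding the protocol:

Mpf::Application.routes.draw do
  # Only insist on HTTPS in production; plain HTTP still matches elsewhere.
  secure_protocol = Rails.env.production? ? "https://" : "http://"

  namespace "admin" do
    # A request whose protocol does not satisfy the constraint simply does not
    # match this route, which is what produces the RoutingError in the log above.
    match "login" => "user_sessions#new", :as => :login, :constraints => { :protocol => secure_protocol }
  end
end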