CarrierWave and Amazon S3 - ruby-on-rails-3

I am using the carrierwave gem to manage file uploads in my Rails 3 app; however, I am not able to connect to my Amazon S3 bucket.
I have followed the instructions on the wiki, but they are not quite detailed enough. For example, where do I store my S3 credentials?

Put something like this in an initializer:
CarrierWave.configure do |config|
  config.storage = :fog
  config.fog_directory = 'your_bucket'
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => 'your_access_key',
    :aws_secret_access_key => 'your_secret_key',
    :region                => 'your_region'
  }
end
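The uploader itself then just needs fog storage and has to be mounted on a model. A minimal sketch (ImageUploader, User, and the avatar column are placeholder names I'm assuming, not part of the question):

# app/uploaders/image_uploader.rb
class ImageUploader < CarrierWave::Uploader::Base
  storage :fog   # files go to the bucket configured in the initializer above

  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end
end

# app/models/user.rb
class User < ActiveRecord::Base
  mount_uploader :avatar, ImageUploader   # assumes a string avatar column on users
end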
You can store your credentials right in that file if you want (and the code is private), or load them from a separate file or the database; it's up to you. The following loads a config file and allows different configurations based on the Rails environment.
# some module in your app (YourApp is your Rails application module)
module YourApp::AWS
  CONFIG_PATH = File.join(Rails.root, 'config/aws.yml')

  def self.config
    @_config ||= YAML.load_file(CONFIG_PATH)[Rails.env]
  end
end
# config/aws.yml
base: &base
  secret_access_key: "your_secret_access_key"
  access_key_id: "your_access_key_id"
  region: your_region

development:
  <<: *base
  bucket_name: your_dev_bucket

production:
  <<: *base
  bucket_name: your_production_bucket
# back in the initializer
config.fog_directory = YourApp::AWS.config['bucket_name']
# ...
config.fog_credentials = {
  :provider              => 'AWS',
  :aws_access_key_id     => YourApp::AWS.config['access_key_id'],
  :aws_secret_access_key => YourApp::AWS.config['secret_access_key'],
  :region                => YourApp::AWS.config['region']
}
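Since config/aws.yml now holds real keys, you'll probably want to keep it out of version control. A common approach (my suggestion, not something the answer requires) is to ignore the real file and check in a placeholder template:

# .gitignore
/config/aws.yml

# config/aws.yml.example -- checked in; copy to config/aws.yml on each machine
base: &base
  secret_access_key: "CHANGE_ME"
  access_key_id: "CHANGE_ME"
  region: us-east-1

development:
  <<: *base
  bucket_name: your_dev_bucket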

Check out this quick blog post I wrote about how to do it. Basically there are a few steps, each of which is pretty involved:
Configuring API keys (allowing you to connect to the Amazon S3 API).
Connecting the API keys to your account (if you're using a public repo, make sure the credentials are not checked into GitHub; see the sketch after this list).
Deploying the changes.
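One common way to keep the credentials out of a public repo is to read them from environment variables; a minimal sketch (the variable names are just a convention I'm assuming, the same one used in the Heroku question further down):

# config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.storage       = :fog
  config.fog_directory = ENV['AWS_S3_BUCKET']
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
  }
end

On Heroku the same variables can be set with heroku config:set AWS_ACCESS_KEY_ID=... AWS_SECRET_ACCESS_KEY=... AWS_S3_BUCKET=..., and locally with a shell export or a gem like dotenv.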
Hope this helps!

Related

Carrierwave Fog (S3) Heroku

I am struggling with this setup, have read through the Carrierwave docs, and I'm still pulling my hair out.
I'm getting an error when I try to start the server, even in dev mode:
Exiting
/Users/my-machine/.rvm/gems/ruby-1.9.3-p327/gems/carrierwave-0.7.1/lib/carrierwave/uploader/configuration.rb:73:in `eval': can't convert nil into String (TypeError)
Here is my setup.
config/initializers/carrierwave.rb
S3_CONFIG = YAML.load_file(Rails.root.join('config', 'amazon_s3.yml'))[Rails.env]
CarrierWave.configure do |config|
  config.storage = :s3
  config.s3_access_policy = :public_read
  config.s3_access_key_id = S3_CONFIG['access_key_id']
  config.s3_secret_access_key = S3_CONFIG['secret_access_key']
  config.s3_bucket = S3_CONFIG['bucket']
  config.s3_region = 'us-east-1'
end
config/amazon_s3.yml
development:
  access_key_id: xxxxxxxxxxxxxxxxxxx
  secret_access_key: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
  bucket: dev-bucket

test:
  access_key_id: xxxxxxxxxxxxxxxxxxx
  secret_access_key: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
  bucket: test-bucket

production:
  access_key_id: xxxxxxxxxxxxxxxxxxx
  secret_access_key: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
  bucket: prod-bucket
app/uploaders/image_uploader.rb
class ImageUploader < CarrierWave::Uploader::Base
  # Choose what kind of storage to use for this uploader:
  # storage :file
  storage :fog

  # Override the directory where uploaded files will be stored.
  # This is a sensible default for uploaders that are meant to be mounted:
  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end

  # fix for Heroku; unfortunately, it disables caching,
  # see: https://github.com/jnicklas/carrierwave/wiki/How-to%3A-Make-Carrierwave-work-on-Heroku
  def cache_dir
    "#{Rails.root}/tmp/uploads"
  end
end
How are you establishing your S3_CONFIG values in your development environment? It seems as though they're missing.
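If I remember right, carrierwave 0.7 dropped the built-in :s3 storage in favour of fog, so config.storage = :s3 points at a storage engine that no longer exists and CarrierWave ends up eval'ing nil at boot, which would match the TypeError in configuration.rb. A sketch of what a fog-based equivalent of your initializer might look like (same amazon_s3.yml, with fog_public standing in for the :public_read policy; this is my guess at the fix, not tested against your app):

# config/initializers/carrierwave.rb
S3_CONFIG = YAML.load_file(Rails.root.join('config', 'amazon_s3.yml'))[Rails.env]

CarrierWave.configure do |config|
  config.storage = :fog
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => S3_CONFIG['access_key_id'],
    :aws_secret_access_key => S3_CONFIG['secret_access_key']
  }
  config.fog_directory = S3_CONFIG['bucket']
  config.fog_public    = true
end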

Issues setting up S3 with carrierwave and fog on Heroku

I've been trying to upload files to Amazon S3 through a form, but I'm getting some errors I do not know how to fix.
Has anybody experienced this type of error?
Heroku log errors:
response => #<Excon::Response:0x00000007294a30 @body="<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message>
Excon::Errors::Forbidden (Expected(200) <=> Actual(403 Forbidden))
config/initializers/carrierwave.rb
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => "AWS",
    :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
  }
  config.fog_directory = ENV['AWS_S3_BUCKET']
end
app/uploaders/file_uploader.rb
class FileUploader < CarrierWave::Uploader::Base
  storage :fog

  include CarrierWave::MimeTypes
  process :set_content_type

  def extension_white_list
    %w(jpg jpeg gif png pdf)
  end

  # Override the directory where uploaded files will be stored.
  # This is a sensible default for uploaders that are meant to be mounted:
  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end
end
I have set the env variables on Heroku.
I'm using the US Standard region for the bucket.
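Since SignatureDoesNotMatch means the request is reaching S3 but the signature computed from your secret key doesn't match, it's worth re-checking what Heroku actually has stored; these are the standard Heroku CLI commands for that (not from the question):

heroku config                                             # list every config var currently set
heroku config:get AWS_SECRET_ACCESS_KEY                   # print one value to spot truncation or stray whitespace
heroku config:set AWS_SECRET_ACCESS_KEY=your_secret_key   # re-set it if it looks wrong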

Rails + paperclip + S3 + OSX = OpenSSL error

I'm trying to post to S3 using AWS in development, but it can't find my SSL bundle. I have it installed for OAuth, and once I tell it where it is, that works fine. I can't seem to configure AWS to see it properly, though.
OpenSSL::SSL::SSLError:
SSL_connect returned=1 errno=0 state=SSLv3 read server certificate B: certificate verify failed
Here's my config from my model:
has_attached_file :image,
  :styles => { ... },
  :storage => :s3,
  :s3_credentials => {
    :access_key_id     => ACCESS_KEY,
    :secret_access_key => SECRET_KEY,
    :bucket            => BUCKET,
    :ssl_ca_file       => '/opt/local/share/curl/curl-ca-bundle.crt'
  }
I have attempted to add :ssl_verify_peer => false and :use_ssl => false, neither of which works, which makes me think I'm configuring the AWS gem in the wrong place. Any suggestions on where/how I should be doing this?
I'm using paperclip 2.4.0 and aws-sdk 1.3.8.
I should also mention that the error occurs when testing with RSpec.
Figured it out with help from the github aws-sdk page: https://github.com/amazonwebservices/aws-sdk-for-ruby
In short, I had to create a specific config/initializers/aws.rb that looks like...
# load the libraries
require 'aws'

# log requests using the default rails logger
AWS.config(:logger => Rails.logger)

# load credentials from a file
config_path = File.expand_path(File.dirname(__FILE__) + "/../aws.yml")
AWS.config(YAML.load(File.read(config_path)))
All I had to do then was move my config/s3.yml file to config/aws.yml, and then change my model to use that yml file...
has_attached_file :image,
  :styles => { ... },
  :storage => :s3,
  :s3_credentials => "#{Rails.root.to_s}/config/aws.yml"
And that took care of it. As I suspected, setting the SSL properties via paperclip's s3_credentials didn't work because the AWS object had already been loaded.
Just for completeness, here's the yml file...
development:
  access_key_id: ...
  secret_access_key: ...
  bucket: bucket_name
  ssl_ca_file: /opt/local/share/curl/curl-ca-bundle.crt

test:
  access_key_id: ...
  secret_access_key: ...
  bucket: bucket_name
  ssl_ca_file: /opt/local/share/curl/curl-ca-bundle.crt

production:
  access_key_id: ...
  secret_access_key: ...
  bucket: bucket_name
What's your bucket name?
If you use something like foo.domain.com as the bucket, paperclip will use that as a prefix for the host name (foo.domain.com.s3.amazonaws.com), which will cause problems with SSL verification.
Try using a bucket name that doesn't resemble a host name, like mydomain-photos
The code for determining the URL is in fog.rb:
if fog_credentials[:provider] == 'AWS'
  if @options[:fog_directory].to_s =~ Fog::AWS_BUCKET_SUBDOMAIN_RESTRICTON_REGEX
    "https://#{@options[:fog_directory]}.s3.amazonaws.com/#{path(style)}"
  else
    # directory is not a valid subdomain, so use path style for access
    "https://s3.amazonaws.com/#{@options[:fog_directory]}/#{path(style)}"
  end
else
  directory.files.new(:key => path(style)).public_url
end
and that regex is:
AWS_BUCKET_SUBDOMAIN_RESTRICTON_REGEX = /^(?:[a-z]|\d(?!\d{0,2}(?:\.\d{1,3}){3}$))(?:[a-z0-9]|\.(?![\.\-])|\-(?![\.])){1,61}[a-z0-9]$/
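In other words, a dotted name like foo.domain.com still matches that regex, so fog builds https://foo.domain.com.s3.amazonaws.com/..., and Amazon's wildcard certificate for *.s3.amazonaws.com doesn't cover hostnames with extra dots, so verification fails. A standalone snippet to see which style a given name gets (my illustration, not code from paperclip or fog):

# which_style.rb -- run with plain ruby; uses the regex quoted above
SUBDOMAIN_REGEX = /^(?:[a-z]|\d(?!\d{0,2}(?:\.\d{1,3}){3}$))(?:[a-z0-9]|\.(?![\.\-])|\-(?![\.])){1,61}[a-z0-9]$/

['foo.domain.com', 'mydomain-photos', 'MyBucket'].each do |bucket|
  if bucket =~ SUBDOMAIN_REGEX
    # virtual-host style: the bucket becomes part of the hostname, so a dotted
    # name no longer matches the *.s3.amazonaws.com wildcard certificate
    puts "https://#{bucket}.s3.amazonaws.com/some/key"
  else
    # path style: the hostname stays s3.amazonaws.com and SSL verifies fine
    puts "https://s3.amazonaws.com/#{bucket}/some/key"
  end
end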

Rails aws-s3 delete file throws AWS::S3::PermanentRedirect error - EU bucket problem?

I'm building a Rails 3 app on Heroku, and I'm using the aws-s3 gem to manipulate files stored in an Amazon S3 EU bucket.
When I try to perform an AWS::S3::S3Object.delete filename, 'mybucketname' command, I get the following error:
AWS::S3::PermanentRedirect (The bucket you are attempting to access
must be addressed using the specified endpoint. Please send all future
requests to this endpoint.):
I have added the following to my application.rb file:
AWS::S3::Base.establish_connection!(
  :access_key_id     => "myAccessKey",
  :secret_access_key => "mySecretAccessKey"
)
and the following code to my controller:
def destroy
  song = tape.songs.find(params[:id])
  AWS::S3::S3Object.delete song.filename, 'mybucket'
  song.destroy
  respond_to do |format|
    format.js { render :nothing => true }
  end
end
I found a proposed solution somewhere that suggested adding AWS_CALLING_FORMAT: SUBDOMAIN to my amazon_s3.yml file, as supposedly aws-s3 handles EU buckets differently from US ones.
However, this did not work; I get the same error.
Could you please provide any assistance? Thank you very much for your help.
The problem is that you need to type SUBDOMAIN as an uppercase string in the config; try that out.
You can specify a custom endpoint when initializing the connection:
AWS::S3::Base.establish_connection!(
  :access_key_id     => 'myAccessKey',
  :secret_access_key => 'mySecretAccessKey',
  :server            => 's3-website-us-west-1.amazonaws.com'
)
You can find the actual endpoint through the AWS console.
The full list of valid options is here: https://github.com/marcel/aws-s3/blob/master/lib/aws/s3/connection.rb#L252
VALID_OPTIONS = [:access_key_id, :secret_access_key, :server, :port, :use_ssl, :persistent, :proxy].freeze
My solution is to set the constant to the actual service endpoint at initialization time.
In config/initializers/aws_s3.rb:
AWS::S3::DEFAULT_HOST = "s3-ap-northeast-1.amazonaws.com"

AWS::S3::Base.establish_connection!(
  :access_key_id     => 'access_key_id',
  :secret_access_key => 'secret_access_key'
)

Carrierwave and s3 with heroku error undefined method `fog_credentials='

I'm trying to set up carrierwave and S3 with Heroku. I'm following the carrierwave docs exactly: https://github.com/jnicklas/carrierwave
I've set up a bucket named testbucket in AWS, then I installed fog and created a new initializer with this inside:
CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',                        # required
    :aws_access_key_id     => 'my_key_inside_here',         # required
    :aws_secret_access_key => 'my_secret_access_key_here',  # required
    :region                => 'eu-west-1'                   # optional, defaults to 'us-east-1'
  }
  config.fog_directory = 'testbucket' # required
end
Then inside my image_uploader.rb I set
storage :fog
Is there something else I am missing? Thanks for any help.
If you're using carrierwave 0.5.2, you have to look at the docs within the gem; they are different from what you see on GitHub. Specifically, check out this file in the gem: lib/carrierwave/storage/s3.rb
Also set storage to :s3, not :fog.
You'll see this section:
# CarrierWave.configure do |config|
#   config.s3_access_key_id = "xxxxxx"
#   config.s3_secret_access_key = "xxxxxx"
#   config.s3_bucket = "my_bucket_name"
# end
#
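Following that commented example, a working setup for carrierwave 0.5.2 might look like the sketch below; the keys and bucket are placeholders from the question, and the uploader switches from :fog to :s3:

# config/initializers/carrierwave.rb -- sketch for carrierwave 0.5.2's built-in S3 storage
CarrierWave.configure do |config|
  config.s3_access_key_id     = 'my_key_inside_here'
  config.s3_secret_access_key = 'my_secret_access_key_here'
  config.s3_bucket            = 'testbucket'
end

# app/uploaders/image_uploader.rb
class ImageUploader < CarrierWave::Uploader::Base
  storage :s3   # 0.5.2 predates fog storage, hence the undefined fog_credentials= error
end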