I'm getting this error when uploading a file:
LoadError (no such file to load -- aws-sdk (You may need to install the aws-sdk gem)):
app/controllers/uploaded_files_controller.rb:19:in `create'
I am using Mongo and Paperclip. I can upload files fine without using s3. However, our production server is on Heroku and so I have to use Amazon to store the files.
I've read other Stack Overflow posts about this but none address my specific issue.
I have restarted my server several times; that's not it.
I am indeed requiring the Amazon gem in my Gemfile.
I have done a bundle install after adding the gem (I know it's obvious, but I still had to state this).
I am NOT using ImageMagick. These uploads are simple text file uploads.
I know that my Amazon bucket name and auth stuff is correct because I use this app to connect to other Amazon resources in a different capacity.
Can anyone help with this? Here is my code:
class UploadedFile
  include Mongoid::Document
  include Mongoid::Paperclip
  require "aws/s3"

  has_mongoid_attached_file :file,
    :storage => :s3,
    :bucket_name => 'my-uploads',
    :path => ':attachment/:id/:style.:extension',
    :s3_credentials => File.join(Rails.root, 'config', 'amazon_s3.yml')
end
OK, I've found the answer: the gem needs to be updated. Paperclip now requires the aws-sdk gem instead of the aws-s3 gem.
gem 'aws-s3', :require => "aws/s3"
should instead be
gem 'aws-sdk', :require => "aws-sdk"
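For reference, here is roughly how the Gemfile and the model from the question line up after the switch. This is a sketch, not a guaranteed drop-in; note that Paperclip's S3 storage normally spells the bucket option `:bucket`:

```ruby
# Gemfile: Paperclip now drives S3 through the aws-sdk gem
gem 'aws-sdk', :require => "aws-sdk"

# app/models/uploaded_file.rb -- the require "aws/s3" line is gone
class UploadedFile
  include Mongoid::Document
  include Mongoid::Paperclip

  has_mongoid_attached_file :file,
    :storage => :s3,
    :bucket => 'my-uploads',   # Paperclip's usual option name is :bucket
    :path => ':attachment/:id/:style.:extension',
    :s3_credentials => File.join(Rails.root, 'config', 'amazon_s3.yml')
end
```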
Related
Both Rails gems DragonFly and Paperclip use public/system folder to store uploaded files.
As far as I know this folder is accessible by everybody - at least the root files, 404.html or others.
How can I protect these uploaded files? Is there any configuration options available?
I need to process transcoded video files which are generated from user uploads and make them available with some permission checks.
Are there any recommendations?
Not sure about Dragonfly, but you can change the Paperclip file upload options. Could you use Amazon S3 instead?
Add the following to your environment config:
config.paperclip_defaults = {
  :storage => :s3,
  :s3_credentials => {
    :bucket => ENV['AWS_BUCKET'],
    :access_key_id => ENV['AWS_ACCESS_KEY_ID'],
    :secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
  }
}
Make sure you've got the aws-sdk gem installed by adding this to your Gemfile:
gem 'aws-sdk'
Does that help, or do you need to keep the files on your server for processing?
-- edit --
Apparently Heroku recommends using a temp file if you can't use S3. Have a read of this post:
How can I change the upload directory for paperclip on heroku to /tmp?
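If the files do need a permission check (as in the original question), the S3 objects can stay private and the app can hand out short-lived URLs instead. A rough sketch using Paperclip's expiring_url; the Upload model, the file attachment name, and the can_view? check are assumptions standing in for whatever authorization you already have:

```ruby
class UploadsController < ApplicationController
  # The bucket itself stays private; we only reveal a URL that expires
  # quickly, and only after our own authorization check has passed.
  def show
    upload = Upload.find(params[:id])
    head :forbidden and return unless current_user.can_view?(upload)
    # expiring_url is provided by Paperclip's S3 storage adapter
    redirect_to upload.file.expiring_url(60) # link valid for 60 seconds
  end
end
```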
I have a pretty straightforward model and attachment
has_attached_file :upload,
  :storage => :s3,
  :bucket => 'bestofbauer',
  :s3_credentials => {
    :access_key_id => ENV['MyAccessKEY'],
    :secret_access_key => ENV['MySecretKey']
  }
I have a bucket setup with s3 called bestofbauer.
I know I could refactor the credentials into an initializer but I haven't gotten this to save an attachment yet so I haven't worried about it.
When I save the object and its attachment, I get:
RuntimeError in RecommendationsController#create
Missing credentials
I have pored over Credentials missing when uploading photos with Paperclip and Amazon s3, but that didn't resolve my issue.
I am using the following gems:
gem "paperclip"
gem "aws-sdk"
gem 'aws-s3'
Any other ideas?
You need to set your environment variables. Here are two different ways to do it:
Every time you run rails server, or any other command that accesses your S3 account, you need to include your keys:
$ MyAccessKEY=ACCESS_KEY MySecretKEY=SECRET_KEY rails server
I'm assuming you're using bash, so edit your ~/.bashrc or ~/.bash_profile to set your environment variables:
export MyAccessKEY=ACCESS_KEY
export MySecretKEY=SECRET_KEY
Then open a new terminal window and double-check that they're set:
$ echo $MyAccessKEY
> ACCESS KEY PRINTS OUT HERE
If you're deploying to Heroku then you'll want to provide your environment variables there as well:
$ heroku config:add MyAccessKEY=ACCESS_KEY MySecretKEY=SECRET_KEY
You can review your Heroku config:
$ heroku config
It will list out all of the config variables you have for that app.
You'll probably want to put your S3 bucket name in an ENV setting as well so you don't mess up your bucket when testing locally.
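As a sanity check before booting the app, a plain-Ruby guard can tell you exactly which keys are absent instead of letting Paperclip fail later with "Missing credentials". A sketch (the key names are copied from the model above):

```ruby
# Build the Paperclip-style credentials hash from the environment,
# or raise naming exactly which variables are missing or blank.
def s3_credentials_from_env(env = ENV)
  required = %w[MyAccessKEY MySecretKey]
  missing = required.select { |k| env[k].nil? || env[k].empty? }
  raise "Missing credentials: #{missing.join(', ')}" unless missing.empty?

  { :access_key_id => env['MyAccessKEY'],
    :secret_access_key => env['MySecretKey'] }
end
```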
I have a Resque worker which works great but is just too slow. The main reason is that I'm using ActiveRecord and have to load the entire Rails environment, which takes at least 10-20 seconds (I don't keep a worker running at all times because I'm on Heroku and pay for the time the worker runs). The worker grabs and parses data from an external website and then dumps the data into my database.
My question is whether I should rewrite the method to avoid Rails and instead use DataMapper, or something else that loads faster than ActiveRecord. Or should I extract the code (using ActiveRecord) that figures out what to do with the external data, and move it out of the worker and back into the app?
Hope that makes sense.
I had the same problem. You could set up your environment in the resque:setup rake task. I tried this, assuming the rake task lives in lib/tasks/resque.rake:
require "resque/tasks"

task "resque:setup" do
  root_path = "#{File.dirname(__FILE__)}/../.."
  db_config = YAML::load(File.open(File.join(root_path, 'config', 'database.yml')))["development"]
  ActiveRecord::Base.establish_connection(db_config)
  require "#{root_path}/app/workers/photo_downloader.rb" # workers
  # Dir.glob("#{root_path}/app/models/*").each { |r| puts r; require r } # require all models
  require "#{root_path}/app/models/photo.rb" # require each model individually
end
I haven't had complete success, because I use the Paperclip gem, which requires the Rails environment.
Rails' bootstrap is really slow; it is intended to be kept running and restarted only occasionally (to clear out whatever memory leaks there most likely are, since no software is bug-free), not to be launched for one request and then shut down.
That kind of usage more resembles a script. If you need to trigger it from a browser, you can easily use something like Erubis to render the page and use ActiveRecord in the script (it is usable outside of Rails), or a similar abstraction layer. Myself, for small tasks, I just use Mysql2.
Use Bundler to get ActiveRecord and the other gems without a Rails application.
require 'rubygems'
require 'logger'
require 'yaml'
require 'active_record'
require 'bundler'
require 'active_support'
require 'spreadsheet'
require 'net/ping'
require 'net/http'

Bundler.setup
Bundler.require(:default) if defined?(Bundler)

$config_logger = Logger.new("./log/dev.log")

class Dbconnect
  def initialize
    @settings  = YAML.load_file('./config/database.yml')["development"]
    @adapter   = @settings["adapter"]   if @settings["adapter"]
    @database  = @settings["database"]  if @settings["database"]
    @reconnect = @settings["reconnect"] if @settings["reconnect"]
    @pool      = @settings["pool"]      if @settings["pool"]
    @timeout   = @settings["timeout"]   if @settings["timeout"]
  end

  def connect_to_db
    ActiveRecord::Base.establish_connection(
      :adapter   => @adapter,
      :database  => @database,
      :reconnect => @reconnect,
      :pool      => @pool,
      :timeout   => @timeout)
    $config_logger.info "\n db Connected: to => #{@database} "
  end
end
Example Gemfile:
source "http://rubygems.org"
gem 'mail'
gem "escape_utils"
gem 'json',:require => "json"
gem 'json_pure'
gem 'resque'
gem 'resque-scheduler'
gem 'redis-namespace'
gem 'resque-status'
gem 'rake'
gem 'em-udns'
gem 'sqlite3'
gem 'spreadsheet'
gem 'activerecord', '3.2.1', :require => "active_record"
gem 'net-scp', :require => 'net/scp'
gem 'net-sftp', :require => 'net/sftp'
gem 'net-ssh', :require => 'net/ssh'
gem 'dir'
gem 'amatch'
gem 'haml'
gem 'net-ping'
Then install Bundler with gem install bundler and run bundle install.
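With the class above and that Gemfile in place, a standalone worker script would connect along these lines (the file name is an assumption for where Dbconnect was saved):

```ruby
# Top of the standalone worker script: load the connection helper,
# open the ActiveRecord connection, then use models as usual.
require_relative 'dbconnect'   # the file defining Dbconnect above

Dbconnect.new.connect_to_db    # establishes the ActiveRecord connection
```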
I must use RightAWS for certain things. However, I can only get Paperclip uploads to S3 working when RightAWS is nowhere in my Gemfile. Since v2.3.11, Paperclip has used AWS-S3, switching from RightAWS.
RightAWS allows me to check the existence of an object without downloading the entire object via the head? method. It also allows me to stream massive amounts of files from a bucket in 1,000 unit chunks with its incrementally_list_bucket method. I haven't found a way to duplicate this functionality in AWS-S3. I do not have the time currently to implement this and contribute it either.
Can anyone tell me if there is a way to load both of these AWS gems in a Rails 3 project without Paperclip raising the "wrong number of arguments (4 for 5)" error?
Ta dahhh. Changing my Gemfile from:
gem 'aws-s3'
to:
gem 'aws-s3', :require => 'aws/s3'
fixed the problem!
We have already built a Rails app that has several users and an image for each of them. Doing all of the dev work on our localhost, we have working seeds for users & photos... but now that we are trying to use S3 for image storage, we are running into errors, always during the "seed" step of the migrations, when doing this:
rake db:migrate:reset
Apologies for the question, but we have been banging our heads on this for 11 hours, having gone through every related Stack question on the subject. A lot of similar posts had a NoSuchBucket error or other issues, but none of the suggested changes have fixed ours... maybe it's related to the newest versions of the gems we are using?
We are using Rails 3.0.4, Ruby 1.8.7, Paperclip 2.3.8, aws-s3 0.6.2
We are adding seeds for initial users and a photo for each user using our seeds.rb file in the /migrate/ folder. This always worked fine when storing files and images on the local machine (using Paperclip, but not S3). We have also tested removing the seeds file and simply creating a new user with the working app, and got the same error:
Credentials are not a path, file, or hash
For the user model, we have tested setting the following S3 keys both (a) through the yml file and (b) directly in the user model.
access_key_id: 'secret'
secret_access_key: 'secret'
We have tried doing this from our localhost (not yet live on heroku), and we have also tried running this through Heroku.
We have tried seemingly every permutation of the layout of those keys, but the error we most frequently get is this:
can't convert Module into Hash
Googling this error message returns zero results, so we don't know what's happening there. This was the most frustrating part: seemingly every attempt got us back to this error.
We also tried both:
(1) hardcoding the access keys in the user model, like this:
:access_key_id => ENV['accesskeyid'],
:secret_access_key => ENV['secretaccesskey'],
In this case, we often got this error:
You did not provide both required access keys. Please provide the access_key_id and the secret_access_key.
Frustrating, because we always had both items listed; we tested with and without quotes, changing up the order, etc. We tried it both (a) with the ENV['accesskeyid'] lookup and (b) without it, i.e. simply blahblah => 'accesskeyid'.
and (2) putting the keys into the yml file, like this:
has_attached_file :photo,
  :storage => :s3,
  :s3_credentials => "#{Rails.root}/config/s3.yml",
  :path => "/:photo/:filename"
with this in the yml file:
development:
  access_key_id: accesskeyid
  secret_access_key: secretaccesskey
  bucket: ourbucketname
production:
  access_key_id: accesskeyid
  secret_access_key: secretaccesskey
  bucket: ourbucketname
We tried this with single quotes around the keys, and without.
We also tried defining the bucket in the model, rather than in the yml file, and got the same error.
and (3), setting it up this way:
if Rails.env == "production"
  S3_CREDENTIALS = { :access_key_id => ENV['S3_KEY'], :secret_access_key => ENV['S3_SECRET'], :bucket => "ourbucket" }
else
  S3_CREDENTIALS = Rails.root.join("config/s3.yml")
end

has_attached_file :photo,
  :storage => :s3,
  :styles => { :small => "50x50>", :thumb => "75x75>", :medium => "400x400>" },
  :path => "/:photo/:filename"
With the same contents in our yml file.
This gave us this error:
credentials are not a file, path, or hash
Naturally, we quadruple-checked that we had the correct access keys (from our AWS account) and tested several different ways of setting up the hash, but never got what we wanted.
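To rule out a YAML problem, the environment-scoped lookup itself can be checked in isolation; here is a plain-Ruby sketch of how an s3.yml like ours should resolve (file contents inlined for illustration):

```ruby
require 'yaml'

# Mimic how an environment-scoped s3.yml resolves: parse the file,
# then pick the section named after the current Rails environment.
# The yml contents below are copied from the question.
S3_YML = <<~YAML
  development:
    access_key_id: accesskeyid
    secret_access_key: secretaccesskey
    bucket: ourbucketname
  production:
    access_key_id: accesskeyid
    secret_access_key: secretaccesskey
    bucket: ourbucketname
YAML

def s3_section_for(environment, raw_yaml)
  sections = YAML.safe_load(raw_yaml)
  sections.fetch(environment) { raise "no #{environment} section in s3.yml" }
end
```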
Here is the relevant portion of Gemfile:
gem 'aws-s3', :require => 'aws/s3' #For Storing Images on Amazon
gem 'paperclip'
As another attempt, we tried using the right_aws gem instead in the Gemfile, but this resulted in this error:
no such file to load -- aws/s3 (You may need to install the aws-s3 gem)
Note, we have been doing all of this and hitting all of these errors while running migrations from localhost, not from the live Heroku app, but we couldn't even get past this simple 'seed users' step.
Currently, our bucket is titled media.oururl.com. Is there some issue with having periods in the bucket name?
Going to ask the Heroku guys about this as well, but considering how amazing this community is, I am hoping one of you knows what we're doing wrong here.
MUCH appreciated - and hopefully this helps others who follow behind us.
Excellent question. I spent quite some time with a similar issue a while ago. The primary issue is that you need to move the following code into its own initializer file:
if Rails.env == "production"
  S3_CREDENTIALS = { :access_key_id => ENV['S3_KEY'], :secret_access_key => ENV['S3_SECRET'], :bucket => "ourbucket" }
else
  S3_CREDENTIALS = Rails.root.join("config/s3.yml")
end
Then add the following line to the model where you have has_attached_file :photo:
:s3_credentials => S3_CREDENTIALS,
This is what you were missing before.
Also, when you declare your bucket name, make sure the bucket is in the US Standard region. If you use one of the other locations, you'll have to update the path appropriately.
Hope this helps!
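Putting the answer's two pieces together, the model declaration ends up roughly like this (styles and path copied from the question; treat it as a sketch):

```ruby
# config/initializers/s3_credentials.rb defines the S3_CREDENTIALS
# constant per environment; the model then just references it:
has_attached_file :photo,
  :storage => :s3,
  :s3_credentials => S3_CREDENTIALS,
  :styles => { :small => "50x50>", :thumb => "75x75>", :medium => "400x400>" },
  :path => "/:photo/:filename"
```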