Heroku and CarrierWave don't load my S3 configuration - ruby-on-rails-3

I have an application on Heroku that uses the CarrierWave gem to upload images to S3.
I have set the S3 configuration in an initializer called carrierwave.rb:
CarrierWave.configure do |config|
  config.s3_access_key_id = 'XXXXXXXXXXXXXXXXXXXX'
  config.s3_secret_access_key = 'XXXXXXXXXXXXXXXXX'
  config.s3_bucket = 'XXXXX'
  config.storage = :s3
end
This works fine in development on my local machine; however, once I deploy to Heroku I get the following error:
A Errno::EACCES occurred in events#update:
Permission denied - /app/public/uploads
/usr/ruby1.8.7/lib/ruby/1.8/fileutils.rb:243:in `mkdir'
Obviously it's trying to write to the Heroku filesystem, which is read-only, and not picking up my S3 settings.
Does anyone know how I can get Heroku to send my files to S3?

From the CarrierWave wiki:
Heroku has a read-only filesystem, so uploads must be stored on S3 and cannot be cached in the public directory.
You can work around this by setting the cache_dir in your Uploader classes to the tmp directory:
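A minimal sketch of that override (the uploader class name here is just an example):

class ImageUploader < CarrierWave::Uploader::Base
  # Heroku dynos can write under tmp/ but not under public/,
  # so cache uploads there before they are sent to S3.
  def cache_dir
    "#{Rails.root}/tmp/uploads"
  end
end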
Check out https://github.com/jnicklas/carrierwave/wiki and scroll to the bottom section labeled "CarrierWave on Heroku" to see how they set this up. Hope this helps someone.

Have you looked at this demo app?
In particular, the uploader class here.

Related

Images uploaded through ActiveStorage disappear after Dokku deployment

I successfully deployed my Rails application to my DigitalOcean droplet through Dokku. After deploying it, I started uploading images to my site. After pushing a new version and redeploying the app, the uploaded images disappeared.
Now, I've already read that Dokku uses ephemeral storage. I tried following a guide to set up persistent storage, but with no success.
This is the command that I tried:
dokku storage:mount underlords /var/lib/dokku/data/storage:/storage
After redeployment, it still didn't work.
If you are using persistent storage, note that the second path is an absolute path within your app container. It is not relative to the /app directory, but relative to the root path. This means that you should be saving your files to /storage and not /app/storage.
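To sanity-check the mount from inside the app container (a sketch; the probe path is illustrative and assumes the mount command above), something like the following can be run via dokku run underlords rails runner:

# Confirms the volume is writable at its absolute path inside the container.
probe = "/storage/.write_probe"   # note: /storage, not /app/storage
File.write(probe, Time.now.to_s)  # raises Errno::ENOENT or Errno::EACCES if the mount is missing
puts File.read(probe)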

Rails 4 images in public folder are not loading on Apache development

I am new to Rails. I am working on a sample application for social networking. I have managed to upload users' profile pictures manually (by copying the uploaded image from /tmp/image to the public folder, public/images/tmp/image) and saved the path to the db as avatar_url.
In the profile view I used
<%= image_tag(@userinfo.avatar_url, :alt => "Avatar image") %>
and I get the picture when running on the Rails server.
But after that I deployed the app on Apache with Passenger in the development environment by setting RailsEnv development. Since then, the images are not loading. I tried going to myip:80/public/images/tmp/image, and it gives a Routing Error.
After searching the web, I found that adding config.serve_static_assets = true in production.rb would solve the problem in production. But that is of no use to me, because it is also stated that static files are served in development by default. To confirm the problem again, I started the Rails server and opened localhost:3000/profile; the image is there, but I am not getting the image at myip:80/profile.
So do I need to add any other config? Or am I not supposed to do it this way?
Finally, I got the solution to my problem. Just sharing it here.
The problem was actually a permissions issue. The picture is created in a root temp directory on form submission, and I then copied the image from the temp folder to the public folder, so it kept the restrictive permissions it was created with and the web server could not read it. After I deployed, requests for the image returned a 403 Forbidden error.
I used,
FileUtils.chmod 0775, target
to set the permission. After that it worked well.
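For anyone reproducing that manual copy step, a rough sketch (the paths here are illustrative; note the mode is octal, hence the leading zero):

require "fileutils"

source = "/tmp/uploads/avatar.png"                                      # wherever the upload landed
target = Rails.root.join("public", "images", "tmp", "avatar.png").to_s  # the path saved as avatar_url

FileUtils.mkdir_p(File.dirname(target))  # ensure the destination directory exists
FileUtils.cp(source, target)             # copy the file out of the temp directory
FileUtils.chmod(0775, target)            # octal mode, so the web server user can read it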
The option config.serve_static_assets = true tells Rails to serve the static assets for your application, but that job should really be left to Apache.
Your issue sounds more related to your Apache configuration than to Rails.
I would take a look at a tutorial on how to configure Apache and Passenger to make sure your environment is setup correctly.
Anything in the public folder should be served by the web server. However, myip:80/public/images/tmp/image is not a valid path: public is the document root, so it does not appear in the URL, and you would also need a filename with an extension at the end (for example, a file at public/images/tmp/photo.png would be served at myip:80/images/tmp/photo.png).

Rails Upload with Carrierwave, Fog to S3 - HTTP vs HTTPS

I've been following the excellent RailsCast by Ryan Bates on uploading files to S3 (episode 383). Things work fine - but...
I'd like to use the images' HTTP URL instead of HTTPS.
Tried looking in the Carrierwave documentation, but could not find if this was an option.
Tried to see if this was an S3 setting, but by default it seems to support HTTP and HTTPS.
Any help would be appreciated.
Thank you.
You can do this by setting the asset_host config parameter:
CarrierWave.configure do |config|
  ...
  config.fog_directory = 'yourbucket'
  # Forcing use of HTTP
  config.asset_host = "http://#{config.fog_directory}.s3.amazonaws.com"
  ...
end
If your bucket is in a region other than US Standard you might need to add that part to the host as well.
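For example, for a bucket in eu-west-1 (the region shown is only an illustration), the region-specific endpoint goes into the host:

# Bucket outside US Standard; substitute your bucket's region
config.asset_host = "http://#{config.fog_directory}.s3-eu-west-1.amazonaws.com"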
CarrierWave 0.9.0 added a configuration parameter, fog_use_ssl_for_aws, to disable SSL for public_url:
CarrierWave.configure do |config|
  ...
  config.fog_use_ssl_for_aws = false
  ...
end
Not sure if this is what you are looking for, but if you want to allow users to download files from your S3 bucket, you will need to grant everyone permission to list and download files.
That can be done in your S3 bucket's configuration panel, under the "Permissions" tab. By default, S3 files are private, so you would need an authenticated URL to access them.

Zip::ZipFile File not found error with aws s3 in rails

I am working on a Rails application hosted on Heroku and trying to unzip files. I am storing the zip file on Amazon S3 using Paperclip.
zip_file.rb
class ZipFile < ActiveRecord::Base
  has_attached_file :attachment, {}.merge(PAPERCLIP_STORAGE_OPTIONS)
end
My files are successfully stored on Amazon; when I open the attachment URL in a browser, it downloads the zip file. But in my console, when I try to unzip the file:
u = ZipFile.last.attachment.url
Zip::ZipFile.open(u)
I get the error:
Zip::ZipError: File #{file_url} not found
I also tried zipfile.attachment.path to access the file, but it returns the same error.
What is the issue? Please help.
Many thanks.
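One thing worth noting (a sketch, not from the original thread, and it assumes the attachment is publicly readable): Zip::ZipFile.open expects a path on the local filesystem, so a remote S3 URL has to be downloaded first, for example to a Tempfile:

require "open-uri"
require "tempfile"
require "zip/zip"  # old rubyzip API that provides Zip::ZipFile

url = ZipFile.last.attachment.url

tmp = Tempfile.new(["attachment", ".zip"])
tmp.binmode
tmp.write(open(url).read)  # open-uri fetches the remote zip
tmp.close

Zip::ZipFile.open(tmp.path) do |zip|
  zip.each { |entry| puts entry.name }  # list the archive's entries
end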

broken pipe error with rails 3 while trying to upload data to AWS-S3

I am trying to upload some static data to my AWS S3 account.
I am using the aws/s3 gem for this purpose.
I have a simple upload button on my webpage which hits the controller, where it creates the AWS connection and tries to upload the data to S3.
The connection to AWS is successful; however, while trying to store data in S3, I always get the following error: Errno::EPIPE: Broken pipe.
I tried running the same piece of code from s3sh (the S3 shell) and I am able to execute all calls properly.
Am I missing something here? It's been quite some time now that I've been facing this issue.
My configuration: Ruby 1.8, Rails 3, Mongrel, S3 bucket region US.
Any help would be great.
I think the broken pipe error could mean a lot of things. I was experiencing it just now and it was because the bucket name in my s3.yml configuration file didn't match the name of the bucket I created on Amazon (typo).
So for people running into this answer in the future, it could be something as silly and simple as that.
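If you want to rule that out quickly, here is a small sketch with the aws/s3 gem (the credentials and bucket name are placeholders) that fails loudly on a missing or misspelled bucket instead of with a vague broken pipe:

require 'aws/s3'

AWS::S3::Base.establish_connection!(
  :access_key_id     => 'YOUR_ACCESS_KEY',
  :secret_access_key => 'YOUR_SECRET_KEY'
)

# Raises AWS::S3::NoSuchBucket if the name in s3.yml has a typo
bucket = AWS::S3::Bucket.find('your-bucket-name')
puts "Found bucket: #{bucket.name}"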
In my case the problem was with the file size. S3 puts a limit of 5GB on single file uploads. Chopping up the file into several 500MB files worked for me.
I also had this issue uploading my application.css, which had a compiled file size > 1.1 MB. I set the fog region with:
config.fog_region = 'us-west-2'
and that seems to have fixed the issue for me...