I am working on a Rails application hosted on Heroku and trying to unzip files. I am storing the zip file on Amazon S3 using Paperclip.
zip_file.rb
class ZipFile < ActiveRecord::Base
  has_attached_file :attachment, {}.merge(PAPERCLIP_STORAGE_OPTIONS)
end
My files are successfully stored on Amazon. When I open the attachment URL in a browser it downloads the zip file. But when I try to unzip the file in my console:
u = ZipFile.last.attachment.url
Zip::ZipFile.open(u)
I get the error:
Zip::ZipError: File #{file_url} not found
I also tried zip_file.attachment.path to access the file, but it returns the same error.
What is the issue? Please help.
Many thanks.
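For what it's worth, rubyzip opens archives from a path on the local filesystem, not from an HTTP(S) URL, which is consistent with the "File ... not found" error above. Below is a minimal sketch of downloading the attachment to a tempfile first, assuming a recent Ruby and rubyzip (older versions use Kernel#open and Zip::ZipFile instead of URI.open and Zip::File):

require 'open-uri'
require 'tempfile'
require 'zip'

zip_record = ZipFile.last

# Download the S3 attachment to a local tempfile first.
tmp = Tempfile.new(['attachment', '.zip'])
tmp.binmode
tmp.write(URI.open(zip_record.attachment.url).read)
tmp.rewind

# rubyzip can open the archive now that it exists on the local filesystem.
Zip::File.open(tmp.path) do |zip|
  zip.each { |entry| puts entry.name }  # list entries as a smoke test
end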
Related
I am trying to deploy a static website using the @nuxt/content module.
After I uploaded the files to an S3 bucket and enabled the static hosting feature, I get an error message that says
Document not found, overwrite this content with #not-found slot in
Anyone familiar with AWS, please save my day!
This is the procedure that produces the error:
1. npx nuxi init content-app -t content
2. npm run generate, which creates the .output/public/ directory
3. Upload all files under the public directory to the S3 bucket
4. In the AWS S3 console, open the bucket's permissions and enable the static website hosting feature
5. Access the S3 website URL, and I get the error.
Versions:
@nuxt/content: ^2.0.0
nuxt: 3.0.0-rc.3
Thank you for reading!
When I upload files to S3 from my hosted environment, everything works. I can see the file in the bucket, and the Server-side encryption settings on that file show encryption applied as expected.
I can also successfully visit the file via its Object URL.
When I upload files from my local environment, I can still see the files in the bucket, but their Server-side encryption settings show an error.
When I visit the Object URL for these files, it returns XML saying 'Access Denied'.
Has anyone encountered this error before?
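The upload code isn't shown in the question; purely for illustration, this is roughly what an upload with an explicit server-side-encryption setting looks like with the Ruby aws-sdk-s3 gem (the region, bucket, key, and KMS alias below are placeholders, not taken from the question):

require 'aws-sdk-s3'

# Illustrative sketch only; names are placeholders.
s3  = Aws::S3::Resource.new(region: 'us-east-1')
obj = s3.bucket('my-bucket').object('uploads/photo.png')

# Ask S3 to encrypt the object with a KMS key on upload.
obj.upload_file('tmp/photo.png',
                server_side_encryption: 'aws:kms',
                ssekms_key_id: 'alias/my-key')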
I'm trying to upload documents to my CloudSearch domain through the AWS CLI using the following command:
aws cloudsearchdomain upload-documents \
  --endpoint-url http://doc-domainname-id.region.cloudsearch.amazonaws.com/2013-01-01/documents/batch \
  --content-type application/json \
  --documents documents-batch.json
My access policies allow everyone to search and update, but I'm still getting an exception every time I try to upload a batch of documents:
An error occurred (CloudSearchException) when calling the
UploadDocuments operation: Request forbidden by administrative rules.
I've uploaded files before using the same command and everything was fine. Now I'm getting this issue.
Any help would be welcome. Thank you.
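For reference, the same batch upload can be attempted from Ruby with the aws-sdk-cloudsearchdomain gem; this is only a sketch, reusing the endpoint and file name from the command above (CloudSearch document endpoints are normally addressed over https):

require 'aws-sdk-cloudsearchdomain'

# Sketch only; the endpoint mirrors the CLI command above.
client = Aws::CloudSearchDomain::Client.new(
  endpoint: 'https://doc-domainname-id.region.cloudsearch.amazonaws.com'
)

resp = client.upload_documents(
  documents:    File.read('documents-batch.json'),
  content_type: 'application/json'
)
puts resp.status  # 'success' when the batch is accepted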
I am trying to upload some static data to my AWS S3 account.
I am using the aws/s3 gem for this purpose.
I have a simple upload button on my webpage that hits the controller, which creates the AWS connection and tries to upload the data to S3.
The connection to AWS is successful; however, while trying to store the data in S3, I always get the following error: Errno::EPIPE: Broken pipe.
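The controller code isn't included in the question; with the aws/s3 gem, the call described presumably looks roughly like this (the key, bucket, and params names below are guesses for illustration):

require 'aws/s3'

# Rough sketch of the kind of controller call described above.
AWS::S3::Base.establish_connection!(
  :access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
  :secret_access_key => ENV['AWS_SECRET_ACCESS_KEY']
)

AWS::S3::S3Object.store(
  'static/data.csv',      # object key (placeholder)
  params[:file].read,     # uploaded file from the form
  'my-bucket'             # bucket name (placeholder)
)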
I tried running the same piece of code from s3sh (the S3 shell) and I am able to execute all calls properly.
Am I missing something here? I have been facing this issue for quite some time now.
My configuration: Ruby 1.8, Rails 3, Mongrel, S3 bucket region US.
Any help will be great.
I think the broken pipe error could mean a lot of things. I was experiencing it just now and it was because the bucket name in my s3.yml configuration file didn't match the name of the bucket I created on Amazon (typo).
So for people running into this answer in the future, it could be something as silly and simple as that.
In my case the problem was with the file size. S3 puts a limit of 5GB on single file uploads. Chopping up the file into several 500MB files worked for me.
I also had this issue uploading my application.css, whose compiled size was over 1.1 MB. I set the fog region with:
config.fog_region = 'us-west-2'
and that seems to have fixed the issue for me...
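For reference, the region can also be supplied inside fog_credentials in the CarrierWave initializer; a sketch with placeholder credentials and bucket name:

CarrierWave.configure do |config|
  config.fog_credentials = {
    :provider              => 'AWS',
    :aws_access_key_id     => ENV['AWS_ACCESS_KEY_ID'],
    :aws_secret_access_key => ENV['AWS_SECRET_ACCESS_KEY'],
    :region                => 'us-west-2'   # same region as above
  }
  config.fog_directory = 'my-bucket'         # placeholder bucket name
end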
I have an application on Heroku that uses the CarrierWave gem to upload images to S3.
I have set the S3 configuration in an initializer called carrierwave.rb:
CarrierWave.configure do |config|
  config.s3_access_key_id = 'XXXXXXXXXXXXXXXXXXXX'
  config.s3_secret_access_key = 'XXXXXXXXXXXXXXXXX'
  config.s3_bucket = 'XXXXX'
  config.storage = :s3
end
This works fine in development on my local machine; however, once I deploy to Heroku I get the following error:
A Errno::EACCES occurred in events#update:
Permission denied - /app/public/uploads
/usr/ruby1.8.7/lib/ruby/1.8/fileutils.rb:243:in `mkdir'
Obviously it's trying to write to the Heroku filesystem, which is read-only, and not picking up my S3 settings.
Does anyone know how I can get Heroku to send my files to S3?
From the CarrierWave wiki:
Heroku has a read-only filesystem, so uploads must be stored on S3 and cannot be cached in the public directory.
You can work around this by setting the cache_dir in your Uploader classes to the tmp directory:
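Something along these lines (the uploader class name here is just an example):

class AttachmentUploader < CarrierWave::Uploader::Base
  # Heroku's dyno filesystem is read-only apart from tmp,
  # so cache uploads there instead of public/uploads.
  def cache_dir
    "#{Rails.root}/tmp/uploads"
  end
end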
Check out https://github.com/jnicklas/carrierwave/wiki and scroll to the bottom section labeled "CarrierWave on Heroku" to see how they set this up. Hope this helps someone.
Have you looked at this demo app?
In particular, the uploader class here.