How to deploy a Nuxt static web project using the content module to Amazon S3

I am trying to deploy a static website that uses the @nuxt/content module.
After I uploaded the files to an S3 bucket and enabled the static hosting feature, I get an error message that says:
Document not found, overwrite this content with #not-found slot in
Anyone familiar with AWS, please save my day!
This is the procedure that produces the error:
npx nuxi init content-app -t content
npm run generate, after which the .output/public/ directory is created
Upload all files under the public directory to the S3 bucket
In the AWS S3 console, open the bucket's access permissions and enable the static website hosting feature
Access the S3 URL; I get the error.
The versions are:
@nuxt/content: ^2.0.0
nuxt: 3.0.0-rc.3
Thank you for reading!
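For reference, the steps above can be sketched as a short script. This is a hedged dry run, not a definitive deployment: "content-app" and "my-bucket" are placeholder names, and the commands are echoed so the sequence can be reviewed before actually running it (drop the echo prefixes to execute for real; the last two steps require the AWS CLI).

```shell
# Dry-run sketch of the deploy steps above; app and bucket names are placeholders.
# Remove the "echo" prefixes to run the commands for real.
deploy_static_site() {
  bucket="$1"
  echo npx nuxi init content-app -t content
  echo npm run generate   # emits the .output/public/ directory
  echo aws s3 sync .output/public "s3://$bucket/" --delete
  echo aws s3 website "s3://$bucket/" --index-document index.html --error-document 404.html
}
deploy_static_site my-bucket
```

Depending on what npm run generate actually emitted, you may need to point the error document at a different fallback page so client-side routes resolve.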

Related

Configure CORS in IBM Cloud Object Storage Bucket using CLI

I am trying to configure CORS for my IBM Cloud Object Storage bucket. I don't see any option to do that from the bucket configuration in the UI, and it seems it can only be done through the CLI. The command looks similar to how it's done in the AWS CLI. This is the command to configure CORS:
ibmcloud cos bucket-cors-put --bucket BUCKET_NAME [--cors-configuration STRUCTURE] [--region REGION] [--output FORMAT]
It expects the CORS configuration STRUCTURE in JSON format from a file, passed as --cors-configuration file://<filename.json>. I created a configuration file named cors.json and saved it on my Desktop. But when I provide the path to that file and run the command, I get this error:
The value in flag '--cors-configuration' is invalid
I am providing the file path like this: --cors-configuration file:///C:/Users/KirtiJha/Desktop/cors.json
I am new to the Cloud CLI. Am I doing something wrong here? Any help is much appreciated.
You can configure CORS via the CLI or via the API and SDKs. On the CLI, you can use the IBM Cloud COS plugin's bucket-cors-put command, as you mentioned.
The file URI looks valid to me. You could try wrapping it in quotes ("file:///..."). Also, try copying the file into your current directory and then testing with --cors-configuration file://cors.json.
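In case the file contents are the problem rather than the path, here is a minimal sketch of the S3-compatible JSON shape that CORS configuration files use; the origin, methods, and MY_BUCKET are placeholder values, and the snippet writes cors.json into the current directory so the file://cors.json form works:

```shell
# Hedged example: minimal CORS configuration in the S3-compatible JSON shape.
# The allowed origin, methods, and MY_BUCKET below are placeholders.
cat > cors.json <<'EOF'
{
  "CORSRules": [
    {
      "AllowedOrigins": ["https://example.com"],
      "AllowedMethods": ["GET", "PUT"],
      "AllowedHeaders": ["*"],
      "MaxAgeSeconds": 3000
    }
  ]
}
EOF
python3 -m json.tool cors.json > /dev/null && echo "cors.json parses as JSON"
# Then, from the same directory:
# ibmcloud cos bucket-cors-put --bucket MY_BUCKET --cors-configuration file://cors.json
```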

Access issue accessing S3 selectively for an application running under ECS, but it works from the CLI

When I access S3 from my code running under ECS, I can access only a few files; for the others I get an access denied exception. I have access to the bucket, and the objects are not restricted at the object level. I am using DefaultAWSCredentialsProviderChain.
But the same files can be downloaded using the AWS CLI under the same assumed role.
Can anyone help me understand what the issue might be?

GoReplay - Upload to S3 does not work

I am trying to capture all incoming traffic on a specific port using GoReplay and to upload it directly to S3 servers.
I am running a simple file server on port 8000 and a gor instance using the (simple) command
gor --input-raw :8000 --output-file s3://<MyBucket>/%Y_%m_%d_%H_%M_%S.log
It does create a temporary file under /tmp/, but other than that, it does not upload anything to S3.
Additional information :
The OS is Ubuntu 14.04
The AWS CLI is installed.
The AWS credentials are defined within the environment.
It seems the information you are providing, or the scenario you explained, is not complete. However, uploading a file from your EC2 machine to S3 is as simple as the command below:
aws s3 cp yourSourceFile s3://yourBucket
To verify, you can see your file by using the command below:
aws s3 ls s3://yourBucket
However, S3 is object storage, so you can't use it to upload files that are continually being edited, appended to, or updated.
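Since GoReplay's S3 output goes through the AWS SDK, one thing worth checking is that the credentials are exported in the environment of the gor process itself, not just in your login shell. A hedged sketch with placeholder values (the bucket name is also a placeholder, and the gor line is commented out so the snippet is safe to source while reviewing):

```shell
# Placeholders: substitute real credentials and region before use.
export AWS_ACCESS_KEY_ID="YOUR_KEY_ID"
export AWS_SECRET_ACCESS_KEY="YOUR_SECRET_ACCESS_KEY"
export AWS_DEFAULT_REGION="us-east-1"
# With those set, run gor from the same shell:
# gor --input-raw :8000 --output-file "s3://my-bucket/%Y_%m_%d_%H_%M_%S.log"
```

If gor runs under a supervisor (systemd, upstart, cron), the variables must be set in that service's environment as well.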

ASP.NET Core 2.0 site running on LightSail Ubuntu can't access AWS S3

I have a website that I've built with ASP.NET Core 2.0. The website gets a list of files sitting in my AWS S3 bucket and displays them as links for authenticated users. When I run the website locally, I have no issues and am able to access S3 to generate pre-signed URLs. When I deploy the web app to LightSail Ubuntu, I get this incredibly useful error message: AmazonS3Exception: Access Denied.
At first I thought it was a region issue. I changed my S3 buckets to use the same region as my LightSail Ubuntu instance (East #2). Then I thought it might be a CORS issue and made sure that my buckets allowed CORS.
I'm kind of stuck at the moment.
I had exactly the same issue, and I solved it by creating environment variables. To add environment variables permanently in Ubuntu, open the environment file with the following command:
sudo vi /etc/environment
then add your credentials like this:
AWS_ACCESS_KEY_ID=YOUR_KEY_ID
AWS_SECRET_ACCESS_KEY=YOUR_SECRET_ACCESS_KEY
Then save the environment file and restart your ASP.NET Core app.

Unable to open file from WinSCP (EC2) which was mounted to AWS S3

I have mounted an S3 bucket on my EC2 instance using the steps described at https://winscp.net/eng/docs/guide_amazon_s3_sftp. I also updated the bucket policy.
From WinSCP, I can see the list of files, add new files, delete files uploaded via the S3 web console, and open files uploaded by me via WinSCP.
However, I cannot open files from WinSCP that I uploaded from the S3 web console. I get the following error:
Permission denied to read s3 object
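One common cause of this kind of symptom is a bucket policy that grants listing (s3:ListBucket on the bucket ARN) but not object reads (s3:GetObject on the bucket ARN followed by /*), since the two actions need different Resource values. A hedged sketch of a policy granting both; the bucket name and principal ARN are placeholders, and this may not be the cause in your specific setup:

```shell
# Hedged sketch: bucket policy granting both bucket listing and object reads.
# "my-bucket" and the principal ARN are placeholders.
cat > bucket-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::123456789012:user/winscp-user"},
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::my-bucket"
    },
    {
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::123456789012:user/winscp-user"},
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
EOF
python3 -m json.tool bucket-policy.json > /dev/null && echo "policy parses as JSON"
# Apply with (placeholder bucket name):
# aws s3api put-bucket-policy --bucket my-bucket --policy file://bucket-policy.json
```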