I have the following Amazon S3 object:
I cannot delete it. I have tried the AWS CLI, 3Hub, and Amazon's Management Console. When I try the AWS CLI or 3Hub, I get a "key does not exist" error. When I try Amazon's Management Console, the object always reappears with the same last-modified date.
I have noticed that the object has that %0A (a linefeed?) at the end of the link and suspect that this is part of the problem.
How can I delete this object?
I have also opened a thread in the AWS forums here: https://forums.aws.amazon.com/thread.jspa?threadID=142946&tstart=0, and created a private support ticket, which is getting good attention from Amazon.
Update
Other things I am trying:
Using the s3curl tool (didn't work)
Using the AWS S3 CLI rm tool (didn't work)
Using the fixbucket command from s3cmd (didn't work)
Using a lifecycle rule (this worked after about 24 hours):
S3's lifecycle rules unfortunately do not accept wildcards, so you will have to fill in the ** in 'media/**/' yourself. You do not need the * after 'Icon', however, since lifecycle rules take a prefix, which means that all keys beginning with what you supply will be deleted.
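For reference, a minimal sketch of such a rule using boto3 in Python (the bucket name and prefix here are assumptions; substitute your own):

import boto3

s3 = boto3.client("s3")

# Expire every key that begins with the given prefix after one day.
# Because lifecycle filters are plain prefixes, the trailing %0A in the
# key does not matter: "media/path/Icon" matches "media/path/Icon%0A" too.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-bucket",  # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "delete-stuck-object",
                "Filter": {"Prefix": "media/path/Icon"},  # fill in your real path
                "Status": "Enabled",
                "Expiration": {"Days": 1},
            }
        ]
    },
)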
In Terraform, to read an object from an S3 bucket at deployment time, I can use a data source:
data "aws_s3_bucket_object" "example" {}
Is there a similar concept in CDK? I've seen various methods of uploading assets to S3, as well as importing an existing bucket, but nothing for getting an object from the bucket. I need to read a configuration file from the bucket that will affect further deployment.
It's important to remember that CDK itself is not a deployment tool. It can deploy, but the code you write in a CDK stack is the definition of your resources, not a method of deployment.
So, you can do one of a few things.
Use the SDK for your language to call the S3 bucket and load the data directly. This is a perfectly acceptable and well-understood way to gather information you need before deployment: each time the stack synthesizes (which it does before every cdk deploy), that code will run and pull your data.
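For example, a minimal sketch of this first option in Python with boto3 (the bucket and key names are assumptions):

import json
import boto3

# Runs at synth time, i.e. on every cdk synth / cdk deploy,
# before any CloudFormation template is produced.
s3 = boto3.client("s3")
obj = s3.get_object(Bucket="my-config-bucket", Key="deploy-config.json")  # hypothetical names
config = json.loads(obj["Body"].read())
# `config` can now drive which resources the stack defines.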
Use CodePipeline to set up a proper pipeline, and give it two sources: one your version control repo and the other your S3 bucket:
https://docs.aws.amazon.com/codebuild/latest/userguide/sample-multi-in-out.html
The preferred way: drop the JSON file and use Parameter Store instead. CDK contains modules that will create a token version of the parameter at synth time, and when it deploys it will resolve that token against Systems Manager Parameter Store:
https://docs.aws.amazon.com/cdk/v2/guide/get_ssm_value.html
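A minimal sketch of the Parameter Store approach in CDK (Python flavor; the parameter name is an assumption):

from aws_cdk import Stack
from aws_cdk import aws_ssm as ssm
from constructs import Construct

class ConfigStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # At synth time this is only a token; CloudFormation resolves it
        # against Parameter Store when the stack actually deploys.
        config_value = ssm.StringParameter.value_for_string_parameter(
            self, "/myapp/deploy-config"  # hypothetical parameter name
        )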
If your parameters change after deployment, you can handle that as part of your CDK stack pretty easily (using CloudFormation outputs). If they change in the middle of a deployment, you really need to be using CodePipeline to manage those steps instead of just CDK.
Because remember: cdk deploy is just a convenience. It executes everything and has no way to pause in the middle and run specific steps (other than very basic resource dependencies).
I want to auto-sync my local folder with an S3 bucket. That is, when a file changes in S3, it should automatically be updated in the local folder.
I tried using a scheduled task and the AWS CLI, but I think there is a better way to do that.
Do you know of an app or a better solution?
Hope you can help me.
@mgg, you can mount the S3 bucket on the local server using s3fs; this way you can sync your local changes to the S3 bucket.
You could execute code (Lambda functions) that responds to events in a given bucket (such as a file being changed, deleted, or created), so you could have a simple HTTP service that receives a POST or GET request from that Lambda and updates your local data accordingly. See the sketch after the links below.
Read more:
Tutorial, Using AWS Lambda with Amazon S3
Working with Lambda Functions
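As a rough sketch, a Python handler for such a Lambda could look like this (the endpoint URL is an assumption, and your local service must be reachable from Lambda, e.g. via a public address or a tunnel):

import json
import urllib.request

ENDPOINT = "https://example.com/s3-events"  # hypothetical local-service endpoint

def handler(event, context):
    # S3 invokes the function with a batch of records describing
    # the objects that were created or removed.
    for record in event["Records"]:
        payload = {
            "eventName": record["eventName"],
            "bucket": record["s3"]["bucket"]["name"],
            "key": record["s3"]["object"]["key"],
        }
        req = urllib.request.Request(
            ENDPOINT,
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)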
The other approach (I don't recommend it) is to have some code polling for changes in the bucket and then reflecting those changes locally. At first glance it looks easier to implement, but it gets complicated when you try to handle more than just creation events.
And of course, on each polling cycle you have to check every element in your local directory against every element in the bucket: it is a performance-killing approach!
Is there an AWS CLI command to restore versioned files?
I've been developing a web server using Django.
One day I found that image files were being randomly deleted from S3.
I think Django's sorl-thumbnail is deleting them; I tried to fix it, but failed.
So I thought of a temporary solution.
AWS S3 offers versioning, and I use it to recover the files manually every day.
This is very annoying to do, so I am writing a script.
But I could not find a way to restore a file that has a delete marker.
Does anyone know about the situation above?
Thank you!
Recovering objects is a bit tricky in S3. Per the AWS documentation, http://docs.aws.amazon.com/AmazonS3/latest/dev/DeletingObjects.html:
When you delete an object from a versioned bucket, S3 creates a new object called a delete marker, which has its own, new version ID.
If you delete that "version" of the object (the delete marker itself), your object becomes visible again.
You can use this command:
aws s3api delete-object --bucket <bucket> --key <key> --version-id <version_id_of_delete_marker>
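Since you mentioned writing a script, here is a minimal sketch that automates this with boto3 (the bucket name is an assumption): it lists the delete markers and removes each one that is the latest version, which restores the underlying object.

import boto3

BUCKET = "my-bucket"  # hypothetical bucket name
s3 = boto3.client("s3")

paginator = s3.get_paginator("list_object_versions")
for page in paginator.paginate(Bucket=BUCKET):
    for marker in page.get("DeleteMarkers", []):
        # An object is hidden only if its *latest* version is a delete marker.
        if marker["IsLatest"]:
            s3.delete_object(
                Bucket=BUCKET,
                Key=marker["Key"],
                VersionId=marker["VersionId"],
            )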
I have enabled publishing of logs from AWS Elastic Beanstalk to AWS S3 by following these instructions: http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/using-features.loggingS3.title.html
This is working fine. My question is: how do I automate the deletion of old logs from S3, say those over one week old? Ideally I'd like a way to configure this within AWS, but I can't find the option. I have considered using logrotate, but was wondering if there is a better way. Any help is much appreciated.
I eventually discovered how to do this. You can create an S3 Lifecycle rule to delete particular files, or all files in a folder, that are more than N days old. Note: you can also archive instead of deleting, or archive for a while before deleting, among other things; it's a great feature.
Reference: http://docs.aws.amazon.com/AmazonS3/latest/dev/ObjectExpiration.html
and http://docs.aws.amazon.com/AmazonS3/latest/dev/manage-lifecycle-using-console.html
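A minimal sketch of setting such a rule with boto3, assuming the logs live under a "logs/" prefix in your bucket (both names are assumptions):

import boto3

s3 = boto3.client("s3")

# Delete everything under logs/ once it is more than a week old.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-elasticbeanstalk-logs",  # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-old-logs",
                "Filter": {"Prefix": "logs/"},  # hypothetical folder
                "Status": "Enabled",
                "Expiration": {"Days": 7},
            }
        ]
    },
)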
On this page:
https://developers.google.com/storage/docs/gsutil_install?hl=ja#install
The gsutil install instructions recommend, right after installing, running
gsutil update
which returns
CommandException: Invalid command "update".
Am I just seeing incorrect documentation? Is there some other way to update?
Checking the usage output doesn't show any update command:
Usage: gsutil [-d][-D] [-h header]... [-m] [command [opts...] args...] [-q]
Available commands:
acl Get, set, or change bucket and/or object ACLs
cat Concatenate object content to stdout
compose Concatenate a sequence of objects into a new composite object.
config Obtain credentials and create configuration file
cors Set a CORS XML document for one or more buckets
cp Copy files and objects
defacl Get, set, or change default ACL on buckets
du Display object size usage
help Get help about commands and topics
lifecycle Get or set lifecycle configuration for a bucket
logging Configure or retrieve logging on buckets
ls List providers, buckets, or objects
mb Make buckets
mv Move/rename objects and/or subdirectories
notification Configure object change notification
perfdiag Run performance diagnostic
rb Remove buckets
rm Remove objects
setmeta Set metadata on already uploaded objects
stat Display object status
test Run gsutil tests
version Print version info about gsutil
versioning Enable or suspend versioning for one or more buckets
web Set a main page and/or error page for one or more buckets
Additional help topics:
acls Working With Access Control Lists
anon Accessing Public Data Without Credentials
crc32c CRC32C and Installing crcmod
creds Credential Types Supporting Various Use Cases
dev Contributing Code to gsutil
metadata Working With Object Metadata
naming Object and Bucket Naming
options Top-Level Command-Line Options
prod Scripting Production Transfers
projects Working With Projects
subdirs How Subdirectories Work
support Google Cloud Storage Support
versions Object Versioning and Concurrency Control
wildcards Wildcard Names
Use gsutil help for detailed help.
EDIT:
It is gsutil version 3.42
Maybe you have a very old version?
Try gsutil version to see yours.
You can check the release notes here:
https://github.com/GoogleCloudPlatform/gsutil/blob/master/CHANGES.md