How to upload/update a leased blob with azcopy? - azure-storage

Is there a way using azcopy v10 to upload or update a blob that has a lease on it? I checked the docs, and also ran azcopy copy -h | grep lease but nothing showed up.

No, azcopy v10 does not support uploading to or updating a blob that has an active lease. All of the supported parameters are listed here. You can raise a feature request on GitHub, like this one for custom headers.
As an alternative, you can try the Azure CLI, whose blob commands support a --lease-id parameter.
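For example, a sketch with the Azure CLI (account, container, and blob names are placeholders; it assumes you already hold the active lease ID):

```shell
# Upload/overwrite a leased blob by passing the current lease ID.
# All names below are placeholders; <lease-id> must be the active lease.
az storage blob upload \
  --account-name mystorageaccount \
  --container-name mycontainer \
  --name myblob.txt \
  --file ./myblob.txt \
  --overwrite \
  --lease-id <lease-id>
```

Without the matching lease ID the service rejects the write with a lease conflict, so acquire or look up the lease first.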

Related

Configure CORS in IBM Cloud Object Storage Bucket using CLI

I am trying to configure CORS on my IBM Cloud Object Storage bucket. I don't see any option to do that from the bucket configuration in the UI, and it seems it can only be done through the CLI. The command looks similar to how it's done in the AWS CLI. This is the command to configure CORS:
ibmcloud cos bucket-cors-put --bucket BUCKET_NAME [--cors-configuration STRUCTURE] [--region REGION] [--output FORMAT]
It expects the CORS configuration STRUCTURE in JSON format from a file, passed as --cors-configuration file://<filename.json>. I have created a configuration file named cors.json and saved it on my Desktop. But when I provide the path to that file and run the command, I get this error:
The value in flag '--cors-configuration' is invalid
I am providing file path like this - --cors-configuration file:///C:/Users/KirtiJha/Desktop/cors.json
I am new to the Cloud CLI. Am I doing something wrong here? Any help is much appreciated.
You can configure CORS in the CLI or via API and SDKs. On the CLI, you can use the IBM Cloud COS plugin in the bucket-cors-put command as you mentioned.
The file URI seems valid to me. You could try wrapping it in quotes ("file:///..."). Also, try copying the file into your current directory and then testing with --cors-configuration file://cors.json.
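For reference, a minimal cors.json of the shape the command expects might look like this (a sketch; the field names follow the S3-style CORS schema that IBM COS uses, and the origins/methods here are placeholders to adjust for your case):

```json
{
  "CORSRules": [
    {
      "AllowedOrigins": ["*"],
      "AllowedMethods": ["GET", "PUT", "POST"],
      "AllowedHeaders": ["*"]
    }
  ]
}
```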

s3cmd to set correct content type for image/jpeg

I have uploaded a number of directories that contain images. What is the correct way to change the Content-Type to "image/jpeg" without having to re-upload the images?
I tried:
s3cmd modify s3://ccc-public/catalog/cropped/EP01L.jpg --add-header="Cache-Control:max-age=1296000" --mime-type="image/jpeg" --recursive -vvv
but the Content-Type always comes back as binary/octet-stream.
Any advice is much appreciated.
These days, it is recommended to use the official Amazon Command Line Interface (CLI).
The aws s3 cp command includes options for specifying mime-type, but by default the mime type of a file is guessed when it is uploaded.
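One way to fix already-uploaded objects with the AWS CLI is to copy each object onto itself with a replaced Content-Type. A sketch, using the bucket path from the question:

```shell
# Re-copy objects in place, forcing the Content-Type.
# --metadata-directive REPLACE is required; otherwise S3 just
# carries the old metadata over unchanged.
aws s3 cp s3://ccc-public/catalog/cropped/ s3://ccc-public/catalog/cropped/ \
  --recursive \
  --content-type "image/jpeg" \
  --metadata-directive REPLACE \
  --cache-control "max-age=1296000"
```

This rewrites the object metadata server-side without downloading or re-uploading the image data.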
s3cmd has recently (as in, this past weekend) fixed this, but the code is not yet upstream. Please try with this branch:
https://github.com/mdomsch/s3cmd/tree/bug/content-type
With a little more testing, this will get merged into upstream.
Then, your command should work exactly as you expect.
-mdomsch, s3cmd maintainer
Ran into this recently as well and used some of the info here: How can I change AWS S3 content type only for audio files.
For us, it came down to ensuring that you change the content-type with the following
--add-header='content-type':'image/png'
I want to emphasize using the lower case content-type vs Content-Type here. That's what ended up making a difference.
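Put together, a sketch of the full command with the lowercase header (the bucket path is a placeholder, and the image type is switched to jpeg to match the question):

```shell
# Note the lowercase 'content-type'; --recursive covers a whole prefix.
s3cmd modify --recursive \
  --add-header='content-type':'image/jpeg' \
  s3://ccc-public/catalog/cropped/
```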
--add-header did not help for me, but the command below worked:
s3cmd put style.css -m 'text/css' s3://mybucket

Google Cloud gsutil instructions say update but there is no update command

On this page:
https://developers.google.com/storage/docs/gsutil_install?hl=ja#install
The gsutil install recommends, right after install, running
gsutil update
which returns
CommandException: Invalid command "update".
Am I just seeing incorrect documentation? Is there some other way to update?
Checking 'usage' doesn't mention any update command:
Usage: gsutil [-d][-D] [-h header]... [-m] [command [opts...] args...] [-q]
Available commands:
acl Get, set, or change bucket and/or object ACLs
cat Concatenate object content to stdout
compose Concatenate a sequence of objects into a new composite object.
config Obtain credentials and create configuration file
cors Set a CORS XML document for one or more buckets
cp Copy files and objects
defacl Get, set, or change default ACL on buckets
du Display object size usage
help Get help about commands and topics
lifecycle Get or set lifecycle configuration for a bucket
logging Configure or retrieve logging on buckets
ls List providers, buckets, or objects
mb Make buckets
mv Move/rename objects and/or subdirectories
notification Configure object change notification
perfdiag Run performance diagnostic
rb Remove buckets
rm Remove objects
setmeta Set metadata on already uploaded objects
stat Display object status
test Run gsutil tests
version Print version info about gsutil
versioning Enable or suspend versioning for one or more buckets
web Set a main page and/or error page for one or more buckets
Additional help topics:
acls Working With Access Control Lists
anon Accessing Public Data Without Credentials
crc32c CRC32C and Installing crcmod
creds Credential Types Supporting Various Use Cases
dev Contributing Code to gsutil
metadata Working With Object Metadata
naming Object and Bucket Naming
options Top-Level Command-Line Options
prod Scripting Production Transfers
projects Working With Projects
subdirs How Subdirectories Work
support Google Cloud Storage Support
versions Object Versioning and Concurrency Control
wildcards Wildcard Names
Use gsutil help for detailed help.
EDIT:
It is gsutil version 3.42
Maybe you have a very old version?
Try gsutil version to see yours.
You can check the release notes here:
https://github.com/GoogleCloudPlatform/gsutil/blob/master/CHANGES.md
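A sketch of the check-and-update sequence. Note the assumption here: standalone installs of gsutil update themselves, while installs bundled with the Google Cloud SDK are updated through gcloud instead:

```shell
# First, see which version you actually have.
gsutil version

# Standalone gsutil installs (which include the update command):
gsutil update

# If gsutil came as part of the Google Cloud SDK, update via gcloud:
gcloud components update
```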

Cannot delete Amazon S3 object

I have the following Amazon S3 object:
I cannot delete it. I have tried aws cli, 3Hub, and Amazon's Management Console. When I try using aws cli or 3Hub, I get a key does not exist error. When I try Amazon's Management Console, the object always reappears with the same last modified date.
I have noticed that the object has that %0A (linefeed?) on the end of the link and suspect that this is part of the problem.
How can I delete this object?
I have also opened a thread in the AWS forums here: https://forums.aws.amazon.com/thread.jspa?threadID=142946&tstart=0. I have also created a private support ticket -- which is getting good Amazon attention.
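To test the %0A theory locally, you can rebuild the key with a literal trailing linefeed in bash and then pass that exact string to the CLI. A sketch; the path below is a hypothetical stand-in for the real object key:

```shell
# Reconstruct the key with a real trailing linefeed (what %0A encodes);
# 'media/catalog/Icon' is a hypothetical stand-in for the actual key.
key="media/catalog/Icon"$'\n'

# Show the final byte of the key; it should print hex 0a (linefeed).
printf '%s' "$key" | tail -c 1 | od -An -tx1
```

If the final byte checks out, try the delete with the same variable, e.g. aws s3 rm "s3://my-bucket/${key}", so the newline survives shell quoting.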
Update
Other things I am trying:
Using the s3curl tool (didn't work)
Using the AWS S3 CLI rm tool (didn't work)
Using the fixbucket command from s3cmd (didn't work)
Using a lifecycle rule (this worked after about 24 hours):
S3's lifecycle rules unfortunately do not accept wildcards, so you will have to fill in the actual path where the ** appears in 'media/**/'. You do not need the * after 'Icon', however, since lifecycle rules accept a prefix, which means that all keys beginning with the prefix you supply will be deleted.
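For illustration, an expiration rule of that shape might look like the following (a sketch in the JSON form used by aws s3api put-bucket-lifecycle-configuration; the prefix and rule ID are placeholders, and older configurations use this top-level Prefix field rather than a Filter):

```json
{
  "Rules": [
    {
      "ID": "delete-stuck-icon-object",
      "Prefix": "media/catalog/Icon",
      "Status": "Enabled",
      "Expiration": { "Days": 1 }
    }
  ]
}
```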

Unable to make list blob request (windows azure)?

I am unable to make a list blobs request using the following URI (replacing myaccount with my storage account name):
http://myaccount.blob.core.windows.net/mycontainer?restype=container&comp=list
Are you sure that your blob container is public?
You can check using CloudBerry Explorer, a great free tool for managing blobs. You can download it here: http://www.cloudberrylab.com/free-microsoft-azure-explorer.aspx
Once the application is installed, go to the container and right-click. Check in Properties whether the security is set to public.
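If the container turns out to be private, one way to make it anonymously listable is with the Azure CLI. A sketch; the account and container names are placeholders, and note that 'container'-level access exposes both blob reads and blob listing:

```shell
# Grant anonymous read + list access at the container level so that
# the ?restype=container&comp=list request succeeds without auth.
az storage container set-permission \
  --account-name myaccount \
  --name mycontainer \
  --public-access container
```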