How to rename a file in blob storage by using the Azure Data Lake Gen2 REST API - azure-storage

I've tried to follow the instructions in this document: LINK
I used SAS authentication and added the "x-ms-rename-source" header to the request, but I kept getting the error "403 - AuthorizationPermissionMismatch". Everything works fine with all the other API methods, but this one seems really tricky. Has anyone successfully renamed a file or directory with it?

Instead of using SAS authentication, I used shared key authorization headers. You can check it here.
My request headers :
DateTime now = DateTime.UtcNow;
requestMessage.Headers.Add("x-ms-date", now.ToString("R", CultureInfo.InvariantCulture));
requestMessage.Headers.Add("x-ms-version", "2018-11-09");
// The source path of the file or directory you want to rename
requestMessage.Headers.Add("x-ms-rename-source", renameSourcePath);
// The rename operation only accepts shared key authorization passed via the Authorization header
requestMessage.Headers.Authorization = AzureStorageAuthenticationHelper.GetAuthorizationHeader(
    StorageGen2AccountName, StorageGen2AccountKey, now, requestMessage);
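If you can use an SDK instead of hand-signing the REST call, the azure-storage-file-datalake Python package wraps the same rename operation and handles the shared key signing for you. A minimal sketch, with placeholder account, file system, and path names:

# Sketch: rename a path with the azure-storage-file-datalake package,
# which signs the request with the shared key for you.
# Account name, key, file system, and paths are placeholders.
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<account-name>.dfs.core.windows.net",
    credential="<account-key>",
)
file_system = service.get_file_system_client("my-filesystem")
file_client = file_system.get_file_client("folder/old-name.txt")

# The new name must include the target file system: "<filesystem>/<new path>"
file_client.rename_file("my-filesystem/folder/new-name.txt")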

You can try to rename the file in Blob Storage by using the Storage Explorer tool.
Kindly let us know if the above helps or you need further assistance on this issue.

Related

Using a service account and a JSON key sent to you to upload data into Google Cloud Storage

I wrote a Python script that uploads files from a local folder into Google Cloud Storage.
I also created a service account with sufficient permissions and tested the script on my computer using that service account's JSON key, and it worked.
Now I've sent the code and the JSON key to someone else to run, but the authentication fails on her side.
Are we missing any authentication step in the GCP UI?
def config_gcloud():
    subprocess.run(
        [
            shutil.which("gcloud"),
            "auth",
            "activate-service-account",
            "--key-file",
            CREDENTIALS_LOCATION,
        ]
    )
    storage_client = storage.Client.from_service_account_json(CREDENTIALS_LOCATION)
    return storage_client

def file_upload(bucket, source, destination):
    storage_client = config_gcloud()
    ...
The error happens in config_gcloud and says it is expecting str, path, ... but gets NoneType.
As I said, the code is fine and works on my computer. How can another person use it with the JSON key I sent her? She stored the JSON locally, and the path to the JSON is in the code.
CREDENTIALS_LOCATION is None instead of the correct path, hence the complaint about it being NoneType instead of str | Path.
Also, you don't need that gcloud call; that only matters for gcloud/gsutil commands, not the Python client library.
And please post the actual stack trace of the error next time, not just a rough paraphrase of it.
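For what it's worth, a minimal sketch of what her copy of the script needs (the key path below is a placeholder for wherever she saved the JSON file), with the gcloud activation step dropped entirely:

# Sketch: build the client straight from the service-account JSON key.
# The key path is a placeholder; point it at the local copy of the key.
from pathlib import Path
from google.cloud import storage

CREDENTIALS_LOCATION = Path.home() / "keys" / "my-service-account.json"

def config_gcloud():
    # No gcloud call needed; the Python client only needs the key file.
    return storage.Client.from_service_account_json(str(CREDENTIALS_LOCATION))

def file_upload(bucket_name, source, destination):
    storage_client = config_gcloud()
    bucket = storage_client.bucket(bucket_name)
    bucket.blob(destination).upload_from_filename(source)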

Github Enterprise Raw URL Gist Unable to Download

I'm able to get a list of gists and their files from https://api.git.mygithub.net/users/myuser/gists?per_page=100&page=1, which I found using the docs here: https://docs.github.com/en/free-pro-team#latest/rest/reference/gists#get-a-gist
The files on the gist object have a raw_url. If I fetch the raw_url with the same token, it fails asking me to authenticate. If I add the header Accept: application/vnd.github.v3.raw, it returns 406 Not Acceptable. I've seen references to that header around.
I'm not sure what the scope on the token should be. It seems like it would be the same one I used to access the API. In the UI, if you click the raw file, a token gets appended to the URL. That token doesn't look like one of the personal access tokens mentioned here: https://docs.github.com/en/free-pro-team#latest/github/authenticating-to-github/creating-a-personal-access-token
So what is the format of the HTTP request to download the raw gist?
The raw URL needs its hostname changed from gist. to raw., and the URL path needs to start with /gist/.
Example code in Go fixing it:
url := gistFile.RawUrl
url = strings.Replace(url, "gist.", "raw.", 1)
url = strings.Replace(url, ".net/", ".net/gist/", 1)
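The same fix as a rough Python sketch, sending the token as an Authorization header (the raw_url value and the token are placeholders; this assumes the same personal access token used for the API also works against the raw host):

# Sketch: download a raw gist file from GitHub Enterprise with the API token.
# The raw_url value and the token are placeholders.
import requests

raw_url = "https://gist.git.mygithub.net/myuser/<gist-id>/raw/<sha>/example.txt"
raw_url = raw_url.replace("gist.", "raw.", 1).replace(".net/", ".net/gist/", 1)

resp = requests.get(raw_url, headers={"Authorization": "token <personal-access-token>"})
resp.raise_for_status()
print(resp.text)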

Can't connect Azure Table Storage to PowerBI (415 Unsupported Media Type)

I'm getting the error below while connecting to Azure Table Storage.
Details:
"AzureTables: Request failed: The remote server returned an error: (415) Unsupported Media Type. (None of the provided media types are supported)"
The one thing I noticed is that if I fill in only the account name, it automatically appends the rest of the URL, which is ".table.core.windows.net", whereas in the portal it is table.cosmosdb.azure.com.
With core.windows.net I'm getting the error "AzureTables: Request failed: The remote name could not be resolved", but it might be messing up some headers when using table.cosmosdb.azure.com.
Please advise.
Thank you.
You should be able to connect to your Azure Table storage/Cosmos DB account from Power BI using the following link structure: https://STORAGEACCOUNTNAME.table.core.windows.net/ for Table storage, or https://yourcosmosdbname.documents.azure.com:443/ for Cosmos DB.
You can get the correct link from the Portal: go to Storage accounts (or Cosmos DB) > click on Tables > find the table link you would like to connect to Power BI > remove the last table name after the "/", then use it to connect in Power BI; it will later allow you to select the specific table in Power BI:
These are screenshots from testing for CosmosDB:
415 errors:
These errors can be caused by the cache, which can be cleared as follows:
In Power BI Desktop, go to "File" and select "Options". Under "Data Load" you have the option to clear the cache. After doing this you can use "Get Data" and "OData feed" as normal and the URL won't return the 415 error.
Check the following link for additional suggestions:
It's not clear how you consume the Table service API, but here is the solution that worked for me with a React SPA and the fetch API.
The request header must contain:
"Content-Type": "application/json"
It was failing for me with single quotes and worked with double quotes.
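Outside the browser, the same idea as a rough Python sketch (the account name, table name, and SAS token are placeholders; the Accept and x-ms-version headers are my additions, and it assumes SAS authorization so no shared key signing is needed):

# Sketch: query the Table service REST API with JSON content negotiation.
# Account name, table name, and SAS token are placeholders.
import requests

url = "https://<account-name>.table.core.windows.net/<table-name>()?<sas-token>"
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json;odata=nometadata",
    "x-ms-version": "2019-02-02",
}
resp = requests.get(url, headers=headers)
resp.raise_for_status()
rows = resp.json()["value"]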

Setting metadata on S3 multipart upload

I'd like to upload a file to S3 in parts, and set some metadata on the file. I'm using boto to interact with S3. I'm able to set metadata with single-operation uploads like so:
Is there a way to set metadata with a multipart upload? I've tried this method of copying the key to change the metadata, but it fails with the error: InvalidRequest: The specified copy source is larger than the maximum allowable size for a copy source: <size>
I've also tried doing the following:
key = bucket.create_key(key_name)
key.set_metadata('some-key', 'value')
<multipart upload>
...but the multipart upload overwrites the metadata.
I'm using code similar to this to do the multipart upload.
Sorry, I just found the answer:
Per the docs:
If you want to provide any metadata describing the object being uploaded, you must provide it in the request to initiate multipart upload.
So in boto, the metadata can be set in the initiate_multipart_upload call. Docs here.
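For reference, a minimal sketch of that approach with boto 2 (bucket name, key name, and part file below are placeholders); the metadata dict goes into initiate_multipart_upload rather than onto the key afterwards:

# Sketch: pass metadata when the multipart upload is initiated (boto 2).
# Bucket name, key name, and part file are placeholders.
import boto

conn = boto.connect_s3()
bucket = conn.get_bucket("my-bucket")

mp = bucket.initiate_multipart_upload("my-key", metadata={"some-key": "value"})
with open("part-1.bin", "rb") as fp:
    mp.upload_part_from_file(fp, part_num=1)
mp.complete_upload()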
I faced this issue earlier today and discovered that there is no information on how to do this right.
A code example of how we solved the issue is provided below.
$uploader = new MultipartUploader($client, $source, [
    'bucket' => $bucketName,
    'key' => $filename,
    'before_initiate' => function (\Aws\Command $command) {
        $command['ContentType'] = 'application/octet-stream';
        $command['ContentDisposition'] = 'attachment';
    },
]);
Unfortunately, the documentation at https://docs.aws.amazon.com/aws-sdk-php/v3/guide/service/s3-multipart-upload.html#customizing-a-multipart-upload doesn't make it clear that if you want to provide alternative metadata with a multipart upload, you have to go this way.
I hope that will help.

azure blob uploadfile bad request

Hi, I am new to Azure. I am trying to upload a file to an Azure container using:
static void UploadBlobFromFile(Uri blobEndpoint, string accountName, string accountKey)
{
    // Create a service client for credentialed access to the Blob service.
    CloudBlobClient blobClient =
        new CloudBlobClient(blobEndpoint,
            new StorageCredentials(accountName, accountKey));
    // Get a reference to a container, which may or may not exist.
    CloudBlobContainer container = blobClient.GetContainerReference("StackOverflowAnalysis");
    // Create a new container, if it does not exist.
    //container.CreateIfNotExists();
    // Get a reference to a blob, which may or may not exist.
    CloudBlockBlob blob = container.GetBlockBlobReference("QueryResults.csv");
    // Upload content to the blob, which will create the blob if it does not already exist.
    using (var filst = System.IO.File.OpenRead(@"c:\users\hmohamed\Downloads\QueryResults.csv"))
    { blob.UploadFromStream(filst); }
}
I am getting the error "Bad Request (400)". I am trying this in an MVC app; I have also tried it with a console application, where I got the error "the process cannot access the file because it is being used by another process". Responses to similar posts advise running the netstat command to fix the problem, but I do not know how to use it or what parameters to supply; can someone please help?
All letters in a container name must be lowercase. So, please use "stackoverflowanalysis" as your container name.
For more information on naming, please refer to Naming and Referencing Containers, Blobs, and Metadata.
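If it's easier to verify outside the MVC app, here is the same idea as a rough Python sketch with the current azure-storage-blob package (the connection string is a placeholder); note the all-lowercase container name:

# Sketch: upload a local file to a container whose name is all lowercase.
# The connection string is a placeholder.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("stackoverflowanalysis")  # must be lowercase

with open(r"c:\users\hmohamed\Downloads\QueryResults.csv", "rb") as data:
    container.upload_blob(name="QueryResults.csv", data=data, overwrite=True)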