Azure IoT Python SDK: how to set the content type on uploaded images - azure-iot-hub

When uploading an image via the Python SDK, the Content-Type of the resulting blob in Azure Blob Storage is always "application/x-www-form-urlencoded".
Certain applications, such as Twilio, require a correct content type when accessing blob files in order to render an MMS message.
I would like to request the ability to set the Content-Type at upload time, rather than having to fix it afterwards in code.
(For others) As a workaround, I am using the code below (add the WindowsAzure.Storage package from NuGet):
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

static void ChangeContentType(string URI)
{
    // Parse the connection string for the storage account.
    const string ConnectionString = "DefaultEndpointsProtocol=https;AccountName=accountName;AccountKey=AccountKey";
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConnectionString);

    // Create the service client object for credentialed access to the Blob service.
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

    ICloudBlob imageFile = blobClient.GetBlobReferenceFromServer(new Uri(URI));
    imageFile.Properties.ContentType = "image/jpeg";
    imageFile.SetProperties();
}
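Since the question is about the Python SDK, here is the same workaround as a minimal Python sketch, assuming the azure-storage-blob v12 package (the connection string, container, and blob names are placeholders):

from azure.storage.blob import BlobClient, ContentSettings

CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=accountName;AccountKey=AccountKey"

def change_content_type(container: str, blob_name: str) -> None:
    # Get a client for the already-uploaded blob.
    blob = BlobClient.from_connection_string(
        CONNECTION_STRING, container_name=container, blob_name=blob_name
    )
    # Overwrite only the HTTP headers; the blob contents are untouched.
    blob.set_http_headers(content_settings=ContentSettings(content_type="image/jpeg"))

If you control the upload itself, the same ContentSettings object can instead be passed to upload_blob so the type is correct from the start.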

Related

Google Cloud Storage report download no access

I am running Node to download the sales report from Google Cloud Storage.
I have the credentials.json file. The problem is that every time I run my application I get ""xxxxxxx@gmail.com" does not have storage.objects.get access to the Google Cloud Storage object".
Yes, this email is nowhere registered on Google Cloud Storage or given any rights, but it should work with the credentials alone, no?
The credentials are directly from Google Cloud Storage and contain this information:
client_secret, project_id, redirect_uri, client_id...
My sample Code:
// Imports the Google Cloud client library.
const {Storage} = require('@google-cloud/storage');

const projectId = 'xxxxxxxxxxxxxxxxxx';
const key = 'credentials.json';
const bucketName = 'pubsite.......';
const destFileName = './test';
const fileName = 'salesreport_2020.zip';

// Creates a client.
const storage = new Storage({projectId, key});

async function downloadFile() {
  const options = {
    destination: destFileName,
  };
  // Downloads the file.
  await storage.bucket(bucketName).file(fileName).download(options);
  console.log(
    `gs://${bucketName}/${fileName} downloaded to ${destFileName}.`
  );
}

downloadFile().catch(console.error);
You are using the wrong type of credentials file.
Your code is written to use a service account JSON key file. You mention that the credentials file contains client_secret. That means you are trying to use OAuth 2.0 Client IDs.
Look in the file credentials.json. It should contain "type": "service_account". If you see {"installed": or {"web": at the start of the file, then you have the wrong credentials.
Creating and managing service account keys
Also, you are specifying the parameters incorrectly in this line:
const storage = new Storage({projectId, key});
Replace with:
const storage = new Storage({projectId: projectId, keyFilename: key});
Because you are seeing the random gmail address, that likely means the storage client is using Application Default Credentials instead of the ones you intend. There are two paths forward:
Embrace Application Default Credentials. Remove the options you are passing to the Storage constructor, and instead set the GOOGLE_APPLICATION_CREDENTIALS environment variable to your service account JSON file.
Fix the Storage constructor to pass in credentials properly. The issue may be something as simple as needing to pass the full path to the credentials file (i.e. /a/b/c/credentials.json). Possibly the storage options are not being processed correctly; try being explicit, like:
const storage = new Storage({projectId: 'your-project-id', keyFilename: '/path/to/keyfile.json'});
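For comparison, a minimal sketch of both paths in Python with the google-cloud-storage package (the bucket name and key path here are placeholders):

from google.cloud import storage

# Path 1: rely on Application Default Credentials
# (export GOOGLE_APPLICATION_CREDENTIALS=/a/b/c/credentials.json beforehand).
client = storage.Client()

# Path 2: pass the service account key file explicitly.
# client = storage.Client.from_service_account_json("/a/b/c/credentials.json")

client.bucket("your-bucket").blob("salesreport_2020.zip").download_to_filename("./test")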

How to read an HTML file from Azure Storage Explorer

How do I read an .html file from Azure Storage (Storage Explorer)?
I can access the blob through the connection string.
string Template = bloburl + "file.html";
Yes, you could create the HTML file and save it to the container you created.
You could make the URI for this blob public and then access it via your custom URL -- you'd have to create a CNAME for the storage container first.
Here is a good resource on how to use Blob Storage from .NET:
http://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-blobs/
var url = Template; // the blob URL built above
var httpClient = new HttpClient();
var html = await httpClient.GetStringAsync(url);
I have tried using GetStringAsync and it is working fine.
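If making the blob public is not an option, you can read it with a credentialed client instead of HttpClient. A minimal sketch in Python (rather than .NET), assuming the azure-storage-blob v12 package; the connection string, container, and file names are placeholders:

from azure.storage.blob import BlobClient

CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=accountName;AccountKey=AccountKey"

def read_html(container: str, blob_name: str) -> str:
    blob = BlobClient.from_connection_string(
        CONNECTION_STRING, container_name=container, blob_name=blob_name
    )
    # Download the blob and decode it as text.
    return blob.download_blob().readall().decode("utf-8")

html = read_html("templates", "file.html")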

Uploading file from Blazor WebAssembly App directly to Blob storage

I've been trying to develop a Blazor WebAssembly app (with both .NET Standard 2.1 and .NET 5.0), and my goal is to allow the user to select a file using InputFile and for that file to be uploaded to an Azure Blob Storage container. I've been looking around a lot and following different guides here and there, but with no success. The issues I ran into were usually related to security, for example CORS (although it had been fully set up), authorization failures, and System.PlatformNotSupportedException: System.Security.Cryptography.Algorithms is not supported on this platform.
Regardless of whether it is good practice or not: is it possible to upload directly to blob storage from a Blazor app? One method I tried was via a SAS token. It works via a console app but not a Blazor app.
<label for="browseData"><b>Browse File</b></label>
<p><InputFile id="browseData" OnChange="@OnInputFileChange" /></p>

private async Task OnInputFileChange(InputFileChangeEventArgs e)
{
    var maxAllowedFiles = 1;
    var inputFile = e.GetMultipleFiles(maxAllowedFiles).First();
    var stream = inputFile.OpenReadStream();
    await StorageService.UploadFileToStorage(stream, "sftp-server", inputFile.Name);
}
Storage Service
using System;
using System.IO;
using System.Threading.Tasks;
using Azure;
using Azure.Storage.Blobs;

public class AzureStorageService
{
    private readonly IAzureStorageKeyService _azureStorageKeyService;

    public AzureStorageService(IAzureStorageKeyService azureStorageKeyService)
    {
        _azureStorageKeyService = azureStorageKeyService;
    }

    public async Task<Uri> UploadFileToStorage(Stream stream, string container, string fileName)
    {
        try
        {
            const string REPLACE_THIS_ACCOUNT = "test";
            var blobUri = new Uri("https://"
                + REPLACE_THIS_ACCOUNT +
                ".blob.core.windows.net/" +
                container + "/" + fileName);

            // Create the blob client.
            AzureSasCredential azureSasCredential = new AzureSasCredential(
                "?sv=2019-12-12&ss=bfqt&srt=sco&sp=rwdlacupx&se=2021-01-20T04:21:45Z&st=2021-01-19T20:21:45Z&spr=https&sig=OIkLePYDcF2AChtYUKs0VxUajs4KmwSyOXpQkFLvN2M%3D");
            var blobClient = new BlobClient(blobUri, azureSasCredential);

            // Upload the file.
            var response = await blobClient.UploadAsync(stream, true);
            return blobUri;
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex);
            return null;
        }
    }
}
As I mentioned, this works via a console app but not a Blazor app due to CORS. Is this a security feature that just cannot be bypassed, so it has to be done server-side through a function -> blob?
If you want to upload files directly to an Azure blob in a Blazor WebAssembly application, you need to configure CORS for the Azure storage account. Regarding how to configure it, please refer to here.
For example:
Configure CORS
Allowed origins: *
Allowed verbs: DELETE,GET,HEAD,MERGE,POST,OPTIONS,PUT
Allowed headers: *
Exposed headers: *
Maximum age (seconds): 3600
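These settings can also be applied programmatically. A sketch, assuming the azure-storage-blob Python package and a storage account connection string (the connection string is a placeholder):

from azure.storage.blob import BlobServiceClient, CorsRule

CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=accountName;AccountKey=AccountKey"

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)

# Mirror the portal settings listed above.
rule = CorsRule(
    allowed_origins=["*"],
    allowed_methods=["DELETE", "GET", "HEAD", "MERGE", "POST", "OPTIONS", "PUT"],
    allowed_headers=["*"],
    exposed_headers=["*"],
    max_age_in_seconds=3600,
)
service.set_service_properties(cors=[rule])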
Create Account SAS.
My upload file component:
@page "/fileupload"
@using System.ComponentModel.DataAnnotations
@using System.IO
@using System.Linq
@using System.Threading
@using Azure.Storage.Blobs
@using Azure.Storage.Blobs.Models
@using Azure.Storage
@using Azure

<h3>File Upload Component</h3>

<label for="browseData"><b>Browse File</b></label>
<p><InputFile id="browseData" OnChange="@OnInputFileChange" /></p>

@{
    var progressCss = "progress " + (displayProgress ? "" : "d-none");
}

<div class="@progressCss">
    <div class="progress-bar" role="progressbar" style="@($"width: { progressBar }%")" aria-valuenow="@progressBar" aria-valuemin="0" aria-valuemax="100"></div>
</div>

@code {
    private bool displayProgress = false;
    private string result = string.Empty;
    private string progressBar;
    private int maxAllowedSize = 10 * 1024 * 1024;

    private async Task OnInputFileChange(InputFileChangeEventArgs e)
    {
        var maxAllowedFiles = 1;
        var inputFile = e.GetMultipleFiles(maxAllowedFiles).First();
        var blobUri = new Uri("https://"
            + "23storage23" +
            ".blob.core.windows.net/" +
            "upload" + "/" + inputFile.Name);
        AzureSasCredential credential = new AzureSasCredential(""); // SAS token omitted here
        BlobClient blobClient = new BlobClient(blobUri, credential, new BlobClientOptions());
        displayProgress = true;
        var res = await blobClient.UploadAsync(inputFile.OpenReadStream(maxAllowedSize), new BlobUploadOptions
        {
            // Preserve the original content type on the uploaded blob.
            HttpHeaders = new BlobHttpHeaders { ContentType = inputFile.ContentType },
            TransferOptions = new StorageTransferOptions
            {
                InitialTransferSize = 1024 * 1024,
                MaximumConcurrency = 10
            },
            // Report upload progress as a percentage of the file size.
            ProgressHandler = new Progress<long>((progress) =>
            {
                progressBar = (100.0 * progress / inputFile.Size).ToString("0");
            })
        });
    }
}
Besides, you can also download my sample to do a test.
I was able to upload a file with your code; it worked like a charm, so there is no error in your source code. I used the Microsoft.NET.Sdk.BlazorWebAssembly SDK with net5.0 as the target framework. It is probably a configuration issue with the storage account.
The CORS policy allows the host of the application (https://localhost:5001) to access the resources with any verb and any header (*). The response can include the content-length header. Is your WASM self-hosted or hosted within an ASP.NET Core application? You can easily spot which origin is sent to Azure by inspecting the request.
Based on your context, only a few verbs like PUT might be enough.
The configuration of the SAS key can also be more specific.
Using SAS from Blazor WASM
Even if not strictly part of your question, it is worth mentioning: if you want to allow a direct upload into your blob storage - and there are good reasons for it - then before uploading a file, your WASM app should send a request to your API, and the response should contain a SAS key that can only be used for this one operation. The SAS key itself should be a user delegation SAS key, as sketched below.
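A server-side sketch of issuing such a short-lived user delegation SAS, assuming Python with the azure-storage-blob and azure-identity packages (the account URL, container, and blob names are placeholders; your API would return the resulting URL to the WASM client):

from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.storage.blob import (
    BlobSasPermissions,
    BlobServiceClient,
    generate_blob_sas,
)

ACCOUNT_URL = "https://accountname.blob.core.windows.net"

def issue_upload_sas(container: str, blob_name: str) -> str:
    service = BlobServiceClient(ACCOUNT_URL, credential=DefaultAzureCredential())
    now = datetime.now(timezone.utc)
    # The delegation key is obtained with the API's own AAD identity.
    delegation_key = service.get_user_delegation_key(now, now + timedelta(minutes=10))
    sas = generate_blob_sas(
        account_name=service.account_name,
        container_name=container,
        blob_name=blob_name,
        user_delegation_key=delegation_key,
        permission=BlobSasPermissions(create=True, write=True),
        expiry=now + timedelta(minutes=10),
    )
    # One short-lived, single-blob upload URL for the client.
    return f"{ACCOUNT_URL}/{container}/{blob_name}?{sas}"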

How to download a file from a private S3 bucket without the AWS CLI

Is it possible to download a file from AWS S3 without the AWS CLI? On my production server I need to download a config file that is in an S3 bucket.
I was thinking of having AWS Systems Manager run a script that would download the config (YAML files) from S3. But we do not want to install the AWS CLI on the production machines. How can I go about this?
You would need some sort of program to call the Amazon S3 API to retrieve the object. For example, a PowerShell script (using AWS Tools for Windows PowerShell) or a Python script that uses the AWS SDK.
You could alternatively generate an Amazon S3 pre-signed URL, which would allow a private object to be downloaded from Amazon S3 via a normal HTTPS call (e.g. curl). This can be done easily using the AWS SDK for Python, or you could code it yourself without using libraries (it's a bit more complex).
In all examples above, you would need to provide the script/program with a set of IAM Credentials for authenticating with AWS.
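For example, a minimal sketch using boto3 (the AWS SDK for Python); the bucket, key, and local path are placeholders, and credentials come from your usual IAM configuration:

import boto3

s3 = boto3.client("s3")

# Option 1: download the object directly with the SDK.
s3.download_file("my-config-bucket", "config/app.yaml", "/etc/myapp/app.yaml")

# Option 2: generate a pre-signed URL that plain curl can fetch for the next hour.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-config-bucket", "Key": "config/app.yaml"},
    ExpiresIn=3600,
)
print(url)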
Just adding notes for any C# lovers who want to solve this problem with .NET.
First, write C# code to download the private file as a string:
using System;
using System.IO;
using Amazon;
using Amazon.S3;
using Amazon.S3.Transfer;

public string DownloadPrivateFileS3(string fileKey)
{
    string accessKey = "YOURVALUE";
    string accessSecret = "YOURVALUE";
    string bucket = "YOURVALUE";

    // NOTE: assuming "YOURVALUE" here is the bucket's region, e.g. "us-east-1".
    using (var s3Client = new AmazonS3Client(accessKey, accessSecret, RegionEndpoint.GetBySystemName("YOURVALUE")))
    {
        var folderPath = "AppData/Websites/Cases";
        var fileTransferUtility = new TransferUtility(s3Client);
        Stream stream = fileTransferUtility.OpenStream(bucket, folderPath + "/" + fileKey);
        using (var memoryStream = new MemoryStream())
        {
            stream.CopyTo(memoryStream);
            var response = memoryStream.ToArray();
            return Convert.ToBase64String(response);
        }
    }
}
Second, write jQuery code to download the string as Base64:
function downloadPrivateFile(fileName) {
    // The query parameter must match the C# parameter name (fileKey).
    $.ajax({url: 'DownloadPrivateFileS3?fileKey=' + fileName, success: function (result) {
        var link = document.createElement('a');
        link.download = fileName;
        link.href = "data:application/octet-stream;base64," + result;
        document.body.appendChild(link);
        link.click();
        document.body.removeChild(link);
    }});
}
Call the downloadPrivateFile method from anywhere in your HTML/C#/jQuery.
Happy coding!

'Error 403: The account for the specified project has been disabled' while accessing the Google Cloud Storage API from Java

I created multiple accounts, and every time $1 was charged to my credit card. I am then able to create a bucket in https://console.cloud.google.com/, but after I start accessing the bucket from my Java code (below), the account gets blocked. I have tried multiple times.
Java code:
Creating the credentials:
HttpTransport httpTransport = new NetHttpTransport();
JsonFactory jsonFactory = new JacksonFactory();

List<String> scopes = new ArrayList<String>();
scopes.add(StorageScopes.DEVSTORAGE_FULL_CONTROL);

Credential credential = new GoogleCredential.Builder()
        .setTransport(httpTransport)
        .setJsonFactory(jsonFactory)
        .setServiceAccountId(propsReaderUtil.getValue(ACCOUNT_ID_PROPERTY))
        .setServiceAccountPrivateKeyFromP12File(new File(getClass().getClassLoader()
                .getResource(propsReaderUtil.getValue(PRIVATE_KEY_PATH_PROPERTY)).getFile()))
        .setServiceAccountScopes(scopes)
        .build();

storage = new Storage.Builder(httpTransport, jsonFactory, credential)
        .setApplicationName(propsReaderUtil.getValue(APPLICATION_NAME_PROPERTY))
        .build();
Uploading a stream:
Storage storage = getStorage();

StorageObject object = new StorageObject();
object.setBucket(bucketName);

InputStream stream = file.getInputStream();
try {
    String contentType = URLConnection.guessContentTypeFromStream(stream);
    InputStreamContent content = new InputStreamContent(contentType, stream);
    Storage.Objects.Insert insert = storage.objects().insert(bucketName, null, content);
    insert.setName(file.getName());
    insert.execute();
} finally {
    stream.close();
}
Please let me know if I am doing something wrong, or suggest the best way to do this.
Any suggestions appreciated. Thanks in advance.
Error 403 is an example of an error response you receive if you try to list the buckets of a non-existent project or one in which you don't have permission to list buckets.
The account associated with the project that owns the bucket or object has been disabled. Check the Google Cloud Platform Console to see if there is a problem with billing, and if not, contact account support.
More information can be found in HTTP Status and Error Codes.