I'm looking for a way to find any expired or expiring SAS signatures on an Azure storage account.
Using C#, I have examined all the public properties and methods of the CloudStorageAccount class. I have also looked at this class in ILSpy and Azure Resource Explorer, but I just can't see a way to retrieve the SAS expiry date/time.
void Main()
{
    CloudStorageAccount account = new CloudStorageAccount(new StorageCredentials(GetName(), GetKey()), true);
    account.Dump();
    CloudBlobClient client = account.CreateCloudBlobClient();
    foreach (CloudBlobContainer container in client.ListContainers())
    {
        var sabp = new SharedAccessBlobPolicy();
        var sas = container.GetSharedAccessSignature(sabp);
        Console.WriteLine(container.Name);
        Console.WriteLine(sas);
        Console.WriteLine();
    }
}
internal string GetName() { return @"myaccountname"; }
internal string GetKey() { return @"myaccountkey"; }
There is no error, but also no way (that I can see) to get the account-level SAS.
Note: I do not want a blob SAS, but the SAS set against the container.
Thanks
If you want to know whether your account-level SAS is expired or expiring, then based on this doc you can just check the SignedExpiry parameter; in the SAS token its name is se.
Try the code below to get an account-level SAS with blob object read permission and a one-day lifetime:
static void Main(string[] args)
{
CloudStorageAccount account = new CloudStorageAccount(new StorageCredentials("<storage account name>", "<storage key>"), true);
var accesspolicy = new SharedAccessAccountPolicy()
{
Permissions = SharedAccessAccountPermissions.Read,
Services = SharedAccessAccountServices.Blob,
SharedAccessExpiryTime = DateTime.UtcNow.AddDays(1),
ResourceTypes = SharedAccessAccountResourceTypes.Object
};
var accountSAS = account.GetSharedAccessSignature(accesspolicy);
Console.WriteLine(accountSAS);
Console.ReadKey();
}
Result:
As you can see, the se param is there and indicates this SAS will expire after 1 day.
So you can use this SAS to access your blobs.
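For completeness, here is a minimal sketch (not part of the original answer) of how you might check the se parameter programmatically. It assumes the SAS is an ordinary query string such as ?sv=...&sp=r&se=2019-06-12T07%3A00%3A00Z&sig=...:
static bool SasIsExpired(string sasToken)
{
    // ParseQueryString URL-decodes the values (reference System.Web)
    var query = System.Web.HttpUtility.ParseQueryString(sasToken.TrimStart('?'));
    string se = query["se"]; // SignedExpiry
    if (string.IsNullOrEmpty(se))
        throw new ArgumentException("No 'se' (SignedExpiry) parameter found in the SAS token.");
    return DateTimeOffset.Parse(se) <= DateTimeOffset.UtcNow;
}
Also note that an ad-hoc SAS is never stored by the service, so there is nothing to enumerate after the fact. Stored access policies set against a container (closer to what the question asks about) do live server-side, and their expiry times can be listed, for example:
// Reusing the CloudBlobClient from the question's code
foreach (CloudBlobContainer container in client.ListContainers())
{
    // GetPermissions returns the container's stored access policies,
    // each of which carries its SharedAccessExpiryTime
    BlobContainerPermissions permissions = container.GetPermissions();
    foreach (var policy in permissions.SharedAccessPolicies)
    {
        Console.WriteLine("{0}/{1} expires {2}", container.Name, policy.Key, policy.Value.SharedAccessExpiryTime);
    }
}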
I am new to Azure Data Factory and have an interesting requirement.
I need to move files from Azure Blob storage to Amazon S3, ideally using Azure Data Factory.
However, S3 isn't supported as a sink:
https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-overview
I also understand from a variety of comments I've read on here that you cannot directly copy from Blob Storage to S3 - you would need to download the file locally and then upload it to S3.
Does anyone know of any examples, in Data Factory, SSIS or an Azure runbook, that can do such a thing? I suppose an option would be to write an Azure Logic App or Function that is called from Data Factory.
Managed to get something working on this - it might be useful for someone else.
I decided to write an Azure Function that uses an HTTP request as a trigger.
These two posts helped me a lot;
How can I use NuGet packages in my Azure Functions?
Copy from Azure Blob to AWS S3 using C#
Please note my answer on the NuGet packages question if you are using Azure Functions 2.x.
Here is the code - you can modify this as a basis for your needs.
I return a JSON-serialized object because Azure Data Factory requires this as the response to an HTTP request sent from a pipeline:
#r "Microsoft.WindowsAzure.Storage"
#r "Newtonsoft.Json"
#r "System.Net.Http"
using System.Net;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Primitives;
using Newtonsoft.Json;
using Microsoft.WindowsAzure.Storage.Blob;
using System.Net.Http;
using Amazon.S3;
using Amazon.S3.Model;
using Amazon.S3.Transfer;
using Amazon.S3.Util;
public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
{
log.LogInformation("Example Function has recieved a HTTP Request");
// get Params from query string
string blobUri = req.Query["blobUri"];
string bucketName = req.Query["bucketName"];
// Validate query string
if (String.IsNullOrEmpty(blobUri) || String.IsNullOrEmpty(bucketName)) {
Result outcome = new Result("Invalid Parameters Passed to Function",false,"blobUri or bucketName is null or empty");
return new BadRequestObjectResult(outcome.ConvertResultToJson());
}
// cast the blob to its type
Uri blobAbsoluteUri = new Uri(blobUri);
CloudBlockBlob blob = new CloudBlockBlob(blobAbsoluteUri);
// Do the Copy
bool resultBool = await CopyBlob(blob, bucketName, log);
if (resultBool) {
Result outcome = new Result("Copy Completed",true,"Blob: " + blobUri + " Copied to Bucket: " + bucketName);
return (ActionResult)new OkObjectResult(outcome.ConvertResultToJson());
}
else {
Result outcome = new Result("ERROR",false,"Copy was not successful Please review Application Logs");
return new BadRequestObjectResult(outcome.ConvertResultToJson());
}
}
static async Task<bool> CopyBlob(CloudBlockBlob blob, string existingBucket, ILogger log) {
var accessKey = "myAwsKey";
var secretKey = "myAwsSecret";
var keyName = blob.Name;
// Make the client
AmazonS3Client myClient = new AmazonS3Client(accessKey, secretKey, Amazon.RegionEndpoint.EUWest1);
// Check the Target Bucket Exists;
bool bucketExists = await AmazonS3Util.DoesS3BucketExistAsync(myClient, existingBucket);
if (!bucketExists) {
log.LogInformation("Bucket: " + existingBucket + " does not exist or is inaccessible to the application");
return false;
}
// Set up the Transfer Utility
TransferUtility fileTransferUtility = new TransferUtility(myClient);
// Stream the file
try {
log.LogInformation("Starting Copy");
using (var stream = await blob.OpenReadAsync()) {
// Note: the source blob must be publicly readable (or the URI must include a SAS token), since no credentials are attached here
log.LogInformation("Streaming");
await fileTransferUtility.UploadAsync(stream,existingBucket,keyName);
log.LogInformation("Streaming Done");
}
log.LogInformation("Copy completed");
}
catch (AmazonS3Exception e) {
log.LogInformation("Error encountered on server. Message:'{0}' when writing an object", e.Message);
return false;
}
catch (Exception e) {
log.LogInformation("Unknown encountered on server. Message:'{0}' when writing an object", e.Message);
return false;
}
return true;
}
public class Result {
public string result;
public bool outcome;
public string UTCtime;
public string details;
public Result(string msg, bool outcomeBool, string fullMsg){
result=msg;
UTCtime=DateTime.UtcNow.ToString("yyyy-MM-dd h:mm:ss tt");
outcome=outcomeBool;
details=fullMsg;
}
public string ConvertResultToJson() {
return JsonConvert.SerializeObject(this);
}
}
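For reference, here is a hypothetical way to exercise the function from C#; the host name and function route are placeholders, and an Azure Data Factory pipeline would issue the equivalent HTTP request (e.g. from a Web activity):
using System;
using System.Net.Http;
using System.Threading.Tasks;
static async Task InvokeCopyFunctionAsync()
{
    using (var http = new HttpClient())
    {
        // blobUri must be reachable by the function (public, or carrying a SAS token)
        string url = "https://myfunctionapp.azurewebsites.net/api/CopyBlobToS3"
            + "?blobUri=" + Uri.EscapeDataString("https://myaccount.blob.core.windows.net/mycontainer/myfile.csv")
            + "&bucketName=my-target-bucket";
        HttpResponseMessage response = await http.GetAsync(url);
        // The body is the serialized Result object, e.g.
        // {"result":"Copy Completed","outcome":true,"UTCtime":"...","details":"..."}
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}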
You can use Skyplane to copy data across clouds (110X speedup over CLI tools, with automatic compression to save on egress). To transfer from Azure Blob Storage to S3 you can run one of these commands:
skyplane cp -r az://azure-bucket-name/ s3://aws-bucket-name/
skyplane sync -r az://azure-bucket-name/ s3://aws-bucket-name/
ADF now includes SFTP as a sink. From the same link provided in the question (sink support is shown in the far-right column):
Using the AWS Transfer Family you can set up an SFTP server and add a user with an SSH public key, then use that configuration to set up an SFTP connection from ADF that will connect directly to an S3 bucket.
Download files from Azure Storage using AzCopy into a temporary local directory
You can download the files from Azure Storage to your local system with the command below; use the recursive flag (/S) to copy all the files:
azcopy /Source:[source_container_url] /Dest:[local_file_path] /SourceKey:[source_storage_account_access_key] /S
Upload the local files to Amazon S3 using the aws s3 cp command
aws s3 cp local_file_path s3://my-bucket/ --recursive
I'm trying to build a custom activity in Azure Data Factory that gets a blob as its input dataset, and I would like to pass this blob's path with a SAS token to an API that requires this type of path.
Is there any way to get the blob's path with the SAS token in the custom activity?
I figured out a way to do it. Part of the custom activity in ADF v1 is the Execute method, which has a context parameter. From that context you can get the connection string to the blob storage and the path of the blob, and then you can generate the SAS token like this:
public override IDictionary<string, string> Execute(
AOMDotNetActivityContext context,
IActivityLogger logger)
{
string blobConnectionString = context.ConnectionString;
CloudStorageAccount inputStorageAccount = CloudStorageAccount.Parse(blobConnectionString);
var blob = new CloudBlob(new Uri(inputStorageAccount.BlobEndpoint, Path.Combine(context.FolderPath, context.FileName)), inputStorageAccount.Credentials);
SharedAccessBlobPolicy adHocSAS = new SharedAccessBlobPolicy()
{
SharedAccessExpiryTime = DateTime.UtcNow.AddHours(48),
Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Delete
};
string sasBlobToken = blob.GetSharedAccessSignature(adHocSAS);
string fullUri = new Uri(blob.Uri, sasBlobToken).ToString();
// Return the SAS URI to the pipeline (the "sasUri" key name is just an example)
return new Dictionary<string, string> { { "sasUri", fullUri } };
}
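The resulting fullUri is simply the blob URI with the SAS query string appended (something like https://&lt;account&gt;.blob.core.windows.net/&lt;folder&gt;/&lt;file&gt;?sv=...&se=...&sig=...), so it can be handed straight to an API that expects a SAS path. Note that GetSharedAccessSignature computes the signature locally from the storage credentials; no call to the storage service is made.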
I am trying to get the NAS storage list. I tested two ways: one using the brand (BAP) id, the other using the account id directly.
First:
Using the brand id, get the account list.
Using each account id, get the NAS storage list.
==> I did not get the NAS storage list.
Second:
Using the account id directly, get the NAS storage list.
==> Successfully got the NAS storage list.
I don't understand the difference between these two ways.
I attached the first test code; the getNasNetworkStorageCount method returns the NAS storage count, but getNasNetworkStorage returns null.
public void test() {
String userId = "IBMxxxxx";
String apiKey = "xxxxx";
ApiClient client = new RestApiClient().withCredentials(userId, apiKey).withLoggingEnabled();
Account.Service accountService = Account.service(client);
List<Brand> brandList = accountService.getOwnedBrands();
for (Brand brand : brandList) {
Brand.Service brandService = brand.asService(client);
Account.Mask mask = new Account.Mask();
mask.id();
mask.companyName();
mask.accountStatusId();
mask.email();
mask.hardwareCount();
mask.hardware();
mask.virtualGuestCount();
mask.virtualGuests();
mask.nasNetworkStorage();
mask.nasNetworkStorageCount();
brandService.clearMask();
brandService.setMask(mask);
List<Account> accountList = accountList = brandService.getOwnedAccounts();
for (Account account : accountList) {
if(account.getNasNetworkStorageCount() != 0){
System.out.print(account.getNasNetworkStorageCount() + " == ");
System.out.println(account.getNasNetworkStorage().size());
}
}
System.out.println(accountList.size());
}
}
You might get those results because the SoftLayer_Brand::getOwnedAccounts method only returns the accounts for the current user (i.e. the user calling the API).
You can run this Java example and see that the brand effectively retrieves the right accounts for the calling user, and then all the NAS network storage that belongs to them.
package SoftLayer_Java_Scripts.Examples;
import com.google.gson.Gson;
import com.softlayer.api.*;
import com.softlayer.api.service.Account;
import com.softlayer.api.service.Brand;
import com.softlayer.api.service.network.Storage;
import java.util.List;
public class GetNasNetworkStorage
{
public static void main( String[] args )
{
String user = "set me";
String apiKey = "set me";
long brandId = 2L;
ApiClient client = new RestApiClient().withCredentials(user, apiKey);
Brand.Service brandService = Brand.service(client, brandId);
try
{
List<Account> accountsList = brandService.getOwnedAccounts();
Gson gson = new Gson();
for (Account account : accountsList) {
Account.Service accountService = account.asService(client);
List<Storage> nasStorageList = accountService.getNasNetworkStorage();
for (Storage storage : nasStorageList) {
System.out.println(gson.toJson(storage));
}
}
}
catch(Exception e)
{
System.out.println("Script failed, review the next message for further details: " + e.getMessage());
}
}
}
The difference is that the Brand service is for managing brand accounts, whilst using the Account service directly is for managing all the information about a particular account.
There may currently be an issue with the object mask you are using, but the underlying problem with the Brand service is that it was designed only to display basic information about all the accounts belonging to the brand; it was not designed to expose all the information of the related accounts (even if you use object masks). I am going to report the object-mask issue to SoftLayer (the one where nasNetworkStorage returns null), but I have reported similar issues before and they were not fixed, because, as I said, that is not the purpose of the service.
You can also try setting the object mask as a string; maybe that works, e.g.:
brandService.setMask("mask[id,companyName,accountStatusId,email,hardwareCount,hardware,virtualGuestCount,virtualGuests,nasNetworkStorage,nasNetworkStorageCount]");
Anyway, the most reliable way to get that information for the accounts associated with the brand is to use the master user of each account, i.e. the Account service; even the SoftLayer agent portal uses the master account to get more information about a particular account in your brand.
Let me know if you have more questions
Regards
I have researched how to export BLOBs to image files. A DB has an IMAGE column storing several thousand images. I thought of exporting the table, but I get a BLOB file error in EMS SQL Manager for InterBase and Firebird.
There have been good posts, but I have still not been able to succeed.
SQL scripts to insert File to BLOB field and export BLOB to File
This example has appeared on numerous pages, including Microsoft's site. I am using InterBase (Firebird). I have not found anything related to enabling xp_shell for Firebird, or for EMS SQL Manager for InterBase and Firebird (which I have also installed). My guess is: it's not possible. I also tried installing SQL Server Express, SQL Server 2008, and SQL Server 2012. I am at a dead end without having even connected to the server, the reason being I have not managed to start the server. I followed the guide at technet.microsoft ("How to: Start SQL Server Agent"), but no services appear in the right pane for me.
PHP file to download entire column (may not post link due to rep limitation).
It has a MySQL connect section that daunts me. The DB sits on my computer as a GDB file, and I also have XAMPP, so I could figure out a way to use this in a localhost environment. I hope this makes sense.
The last solution is to use bcp, an idea posted on Stack Overflow under the title "fastest way to export blobs from table into individual files". I read the documentation and installed it, but I cannot connect to the server. I use -S PC-PC -U xxx -P xxx (the server must be wrong), but the information I find all uses -T (Windows authentication).
Summing up: I am using Firebird, with EMS SQL Manager. I am trying to extract all images from the images table into individual files. Both tools have SQL script screens, but they appear to work in conjunction with xp_shell. What would you suggest? Am I using the wrong SQL manager to accomplish this?
There are several ways:
Use the isql command BLOBDUMP to write a blob to a file,
Use a client library (e.g. Jaybird for Java, the Firebird .NET provider for C#) to retrieve the data,
With PHP you can use ibase_blob_get in a loop to get bytes from the blob, and write those to a file.
I neither use nor know EMS SQL Manager, so I don't know if (and how) you can export a blob with that.
The example you link to, and almost all the tools you mention, are for Microsoft SQL Server, not for Firebird, so it is no wonder they don't work.
Example in Java
A basic example to save blobs to disk using Java 8 (might also work on Java 7) would be:
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
/**
* Example to save images to disk from a Firebird database.
* <p>
* Code assumes a table with the following structure:
* <pre>
* CREATE TABLE imagestorage (
* filename VARCHAR(255),
* filedata BLOB SUB_TYPE BINARY
* );
* </pre>
* </p>
*/
public class StoreImages {
// Replace testdatabase with alias or path of database
private static final String URL = "jdbc:firebirdsql://localhost/testdatabase?charSet=utf-8";
private static final String USER = "sysdba";
private static final String PASSWORD = "masterkey";
private static final String DEFAULT_FOLDER = "D:\\Temp\\target";
private final Path targetFolder;
public StoreImages(String targetFolder) {
this.targetFolder = Paths.get(targetFolder);
}
public static void main(String[] args) throws IOException, SQLException {
final String targetFolder = args.length == 0 ? DEFAULT_FOLDER : args[0];
final StoreImages storeImages = new StoreImages(targetFolder);
storeImages.store();
}
private void store() throws IOException, SQLException {
if (!Files.isDirectory(targetFolder)) {
throw new FileNotFoundException(String.format("The folder %s does not exist", targetFolder));
}
try (
Connection connection = DriverManager.getConnection(URL, USER, PASSWORD);
Statement stmt = connection.createStatement();
ResultSet rs = stmt.executeQuery("SELECT filename, filedata FROM imagestorage")
) {
while (rs.next()) {
final Path targetFile = targetFolder.resolve(rs.getString("FILENAME"));
if (Files.exists(targetFile)) {
System.out.printf("File %s already exists%n", targetFile);
continue;
}
try (InputStream data = rs.getBinaryStream("FILEDATA")) {
Files.copy(data, targetFile);
}
}
}
}
}
Example in C#
Below is an example in C#, it is similar to the code above.
using System;
using System.IO;
using FirebirdSql.Data.FirebirdClient;
class StoreImages
{
private const string DEFAULT_FOLDER = @"D:\Temp\target";
private const string DATABASE = @"D:\Data\db\fb3\fb3testdatabase.fdb";
private const string USER = "sysdba";
private const string PASSWORD = "masterkey";
private readonly string targetFolder;
private readonly string connectionString;
public StoreImages(string targetFolder)
{
this.targetFolder = targetFolder;
connectionString = new FbConnectionStringBuilder
{
Database = DATABASE,
UserID = USER,
Password = PASSWORD
}.ToString();
}
static void Main(string[] args)
{
string targetFolder = args.Length == 0 ? DEFAULT_FOLDER : args[0];
var storeImages = new StoreImages(targetFolder);
storeImages.store();
}
private void store()
{
if (!Directory.Exists(targetFolder))
{
throw new FileNotFoundException(string.Format("The folder {0} does not exist", targetFolder), targetFolder);
}
using (var connection = new FbConnection(connectionString))
{
connection.Open();
using (var command = new FbCommand("SELECT filename, filedata FROM imagestorage", connection))
using (var reader = command.ExecuteReader())
{
while (reader.Read())
{
string targetFile = Path.Combine(targetFolder, reader["FILENAME"].ToString());
if (File.Exists(targetFile))
{
Console.WriteLine("File {0} already exists", targetFile);
continue;
}
using (var fs = new FileStream(targetFile, FileMode.Create))
{
byte[] filedata = (byte[]) reader["FILEDATA"];
fs.Write(filedata, 0, filedata.Length);
}
}
}
}
}
}
I am having trouble with facebook authentication for Mobile Services in Azure.
To be more specific, I already have an application that is using the Facebook C# SDK and it works fine. I can log on, fetch the list of my friends and so on. I want to keep using this SDK, but I also want to authenticate for Azure Mobile Services.
So my plan was: log on with the Facebook C# SDK (as I already do today), get the authentication token, and pass it to the MobileServiceClient.LoginAsync() function. That way, I can still have all the nice features of the Facebook C# SDK and also use the built-in authentication system in Mobile Services for Azure.
var client = new FacebookClient();
dynamic parameters = new ExpandoObject();
parameters.client_id = App.FacebookAppId;
parameters.redirect_uri = "https://www.facebook.com/connect/login_success.html";
parameters.response_type = "token";
parameters.display = "popup";
var loginUrl = client.GetLoginUrl(parameters);
WebView.Navigate(loginUrl);
When the load is complete, the following is executed:
FacebookOAuthResult oauthResult;
if (client.TryParseOAuthCallbackUrl(e.Uri, out oauthResult) && oauthResult.IsSuccess)
{
var accessToken = oauthResult.AccessToken;
var json = JsonObject.Parse("{\"authenticationToken\" : \"" + accessToken + "\"}");
var user = await App.MobileService.LoginAsync(MobileServiceAuthenticationProvider.Facebook, json);
}
However, I get this exception when I call the last line of code above:
MobileServiceInvalidOperationException, "Error: The POST Facebook login request must specify the access token in the body of the request."
I cannot find any information on how to format the access token. I have tried a lot of different keys (instead of "authenticationToken", as you see in my sample). I have also tried just passing the access token string, but nothing seems to work.
Also, if I use MobileServiceClient.LoginAsync() to make a brand new login, it works just fine, but it seems silly to force users to log on twice.
Any help is greatly appreciated!
The format expected for the object is {"access_token": "the-actual-access-token"}. Once the login is completed using the Facebook SDK, the token is returned in the URI fragment under that name, so that's what the Azure Mobile Service expects.
BTW, here is code I wrote, based on your snippet, which works. It should handle failure cases better, but for the token format this should be enough:
private void btnLoginFacebookToken_Click_1(object sender, RoutedEventArgs e)
{
var client = new Facebook.FacebookClient();
dynamic parameters = new ExpandoObject();
parameters.client_id = "MY_APPLICATION_CLIENT_ID";
parameters.redirect_uri = "https://www.facebook.com/connect/login_success.html";
parameters.response_type = "token";
parameters.display = "popup";
var uri = client.GetLoginUrl(parameters);
this.webView.LoadCompleted += webView_LoadCompleted;
this.webView.Visibility = Windows.UI.Xaml.Visibility.Visible;
this.webView.Navigate(uri);
}
async void webView_LoadCompleted(object sender, NavigationEventArgs e)
{
AddToDebug("NavigationMode: {0}", e.NavigationMode);
AddToDebug("Uri: {0}", e.Uri);
string redirect_uri = "https://www.facebook.com/connect/login_success.html";
bool close = (e.Uri.ToString().StartsWith(redirect_uri));
if (close)
{
this.webView.LoadCompleted -= webView_LoadCompleted;
this.webView.Visibility = Windows.UI.Xaml.Visibility.Collapsed;
string fragment = e.Uri.Fragment;
string accessToken = fragment.Substring("#access_token=".Length);
accessToken = accessToken.Substring(0, accessToken.IndexOf('&'));
JsonObject token = new JsonObject();
token.Add("access_token", JsonValue.CreateStringValue(accessToken));
try
{
var user = await MobileService.LoginAsync(MobileServiceAuthenticationProvider.Facebook, token);
AddToDebug("Logged in: {0}", user.UserId);
}
catch (Exception ex)
{
AddToDebug("Error: {0}", ex);
}
}
}