Protocol used by Azure Storage Explorer to connect to Azure Storage - azure-storage

Can someone please help me find the protocol used by Azure Storage Explorer to connect to Azure Storage?
Is it SMB or REST?

Azure Storage Explorer (ASE) is a wrapper around the AzCopy command-line tool, and AzCopy internally uses the REST API.
To capture all the outgoing REST API calls, you can also use the Fiddler tool.
Follow the instructions in the link below and you should be able to see them.
https://learn.microsoft.com/en-us/power-query/web-connection-fiddler
So the chain is: ASE -> AzCopy -> REST API.
Alternatively, you can find the per-session AzCopy logs at "%USERPROFILE%\.azcopy".
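To make the REST layer concrete, here is a minimal Python sketch of the kind of request you would see in Fiddler — the List Blobs operation of the Blob service REST API. The account name, container name, and SAS token are placeholders you would substitute with your own:

```python
import requests

# Placeholders: substitute your own storage account, container, and SAS token.
account = "mystorageaccount"
container = "mycontainer"
sas_token = "sv=...&sig=..."  # a valid SAS query string for the container

# List Blobs operation of the Blob service REST API.
url = (
    f"https://{account}.blob.core.windows.net/{container}"
    f"?restype=container&comp=list&{sas_token}"
)
resp = requests.get(url)
resp.raise_for_status()
print(resp.text)  # XML listing of blobs, as returned by the service
```

Requests of this shape (plain HTTPS GET/PUT against *.blob.core.windows.net) are what Storage Explorer and AzCopy ultimately emit.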

It is REST.
Storage Explorer uses the Azure Storage SDKs for JavaScript, which are wrappers over the REST API.

Related

What are the steps to export a VM using the VMware vCenter 7 REST API?

I'm attempting to build some custom automation to handle the import/export of VMs to/from an on-prem VMware cluster.
So far I have authenticated against the REST API and can get a VM's info, but I cannot work out how to approach exporting the selected VM.
I believe I'll need to create a download session and iterate through its files, saving them to disk one by one while keeping the download session alive, but the documentation seems to skirt around the concept of exporting a VM and focuses predominantly on deploying.
Does anyone have an example / list of steps required to export a VM via the REST API?
As of 7.0U2, that functionality doesn't exist in the vSphere Automation (REST) API. Here are the top-level VM functions: link
If you're open to using the vSphere Web Services (SOAP) API, there's an ExportVm function available: link
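If you do go the SOAP route, here is a rough pyvmomi sketch of the flow described in the question (create an export lease, download each device file, keep the lease alive). The vCenter address, credentials, and VM name are placeholders, and error handling is omitted:

```python
import ssl
import time
import urllib.request
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

# Placeholders: vCenter address, credentials, and VM name are assumptions.
ctx = ssl._create_unverified_context()  # lab use only; verify certs in production
si = SmartConnect(host="vcenter.example.com", user="administrator@vsphere.local",
                  pwd="secret", sslContext=ctx)
content = si.RetrieveContent()

# Locate the VM by name via a container view.
view = content.viewManager.CreateContainerView(
    content.rootFolder, [vim.VirtualMachine], True)
vm = next(v for v in view.view if v.name == "my-vm")

# ExportVm returns an HttpNfcLease; wait until it is ready.
lease = vm.ExportVm()
while lease.state == vim.HttpNfcLease.State.initializing:
    time.sleep(1)

# Download each device file, ticking the lease so it stays alive.
for dev in lease.info.deviceUrl:
    url = dev.url.replace("*", "vcenter.example.com")  # '*' is a host placeholder
    filename = dev.targetId or url.rsplit("/", 1)[-1]
    with urllib.request.urlopen(url, context=ctx) as src, open(filename, "wb") as dst:
        dst.write(src.read())
    lease.HttpNfcLeaseProgress(50)  # report real percentages in production

lease.HttpNfcLeaseComplete()
Disconnect(si)
```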
If you want to automate VM import/export, I recommend using OVF Tool / PowerCLI.
I'll leave you a KB with an example: https://kb.vmware.com/s/article/1038709

BigQuery: retrieve data from SFTP

I have an internet-facing SFTP server with regularly updated CSV files. Is there any command to have BigQuery retrieve data from this SFTP server and put it into tables? Alternatively, is there any API or Python library that supports this?
As for BigQuery - there's no integration I know of with SFTP.
You'll need to either:
Create/find a script that reads from SFTP and pushes to GCS (a minimal sketch follows this list).
Add an HTTPS service to the SFTP server, so your data can be read with the GCS Storage Transfer Service (https://cloud.google.com/storage-transfer/docs/)
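For the first option, a minimal sketch might look like the following. The SFTP host, credentials, bucket, and dataset/table names are all placeholders, and it assumes the paramiko and Google Cloud client libraries are installed:

```python
import paramiko
from google.cloud import bigquery, storage

# Placeholders: SFTP host/credentials, bucket, and table names are assumptions.
transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="user", password="secret")
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.get("/outgoing/data.csv", "/tmp/data.csv")  # download the CSV locally
sftp.close()
transport.close()

# Stage the file in GCS, then load it into BigQuery.
storage.Client().bucket("my-staging-bucket").blob("data.csv") \
    .upload_from_filename("/tmp/data.csv")

bq = bigquery.Client()
job = bq.load_table_from_uri(
    "gs://my-staging-bucket/data.csv",
    "my_dataset.my_table",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    ),
)
job.result()  # wait for the load job to finish
```

Run on a schedule (Cloud Scheduler + Cloud Functions, cron, etc.), this covers the "script that reads from SFTP and pushes to GCS" approach end to end.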
Yet another third-party tool supporting (S)FTP in and out of GCP is Magnus - Workflow Automator, part of the Potens.io suite. It supports BigQuery, Cloud Storage, and most Google APIs, as well as many simple utility-type tasks (BigQuery Task, Export to Storage Task, Loop Task, and many more), along with advanced scheduling, triggering, etc. It is also available in the Marketplace.
The FTP-to-GCS Task accepts a source FTP URI and can transfer single or multiple files, based on input, to a destination location in Google Cloud Storage. The resulting list of objects uploaded to Google Cloud Storage is saved to a parameter for later use within the workflow. The source FTP can be of type SFTP, FTP, or FTPS.
See here for more
Disclosure: I am a GDE for Google Cloud, the creator of these tools, and a leader of the Potens team.

What are the advantages of the Isilon OneFS File System Access API over accessing the file system using SMB or NFS?

I want to create a utility that reads/writes files with permissions (ACLs) from/to an Isilon server. This utility will access the server over either LAN or VPN. My main concern is also performance, for file/folder enumeration and for copying file data together with attributes/ACLs/timestamps.
As far as I know, you can access the file storage using SMB if the server is a Windows server, or NFS if the server is Unix/Linux.
So I want some basic information on the scenarios in which the OneFS APIs are better than accessing the file system directly over NFS/SMB.
I'm an Isilon admin at a big company. TL;DR: it's just another way to access your files.
Most of my client systems access their files using SMB, and a smaller number use NFSv3. Generally, NFS is best suited to Unix clients and SMB to Windows, but you can mount SMB on Linux and you can run an NFS client on Windows. Probably the biggest advantage of NFS/SMB is that they are commonly used protocols that most IT admins are familiar with.
API access would be the best approach if you are already writing custom software, or implementing a web framework geared toward REST API integration. If you are implementing against a REST API anyway, then Isilon's API access might be the easiest choice.
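For illustration, here is a rough sketch of what file access through OneFS's RESTful Access to the Namespace (RAN) API can look like. The cluster address, credentials, and paths are placeholders, and the exact endpoints and query arguments may vary by OneFS version:

```python
import requests

# Assumptions: cluster address, credentials, and paths are placeholders; the
# OneFS API is typically exposed on port 8080 of the cluster.
session = requests.Session()
session.auth = ("apiuser", "secret")
session.verify = False  # self-signed certs are common on lab clusters

base = "https://isilon.example.com:8080/namespace"

# List a directory through the namespace API.
resp = session.get(f"{base}/ifs/data/projects")
resp.raise_for_status()
print(resp.json())

# Read a file's ACL (the ?acl query argument is part of the RAN API).
acl = session.get(f"{base}/ifs/data/projects/report.csv?acl")
print(acl.json())
```

One HTTPS round trip per operation like this is the trade-off to weigh against SMB/NFS for bulk enumeration and copying.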

AWS Cognito substitute on Pivotal Cloud Foundry (PCF)

I'm exploring Pivotal Cloud Foundry (PCF)'s PAS for moving our organisation's on-premise application onto a private cloud. Going by the documentation, I'm unable to find whether PCF has any offering for end-user authentication and authorisation like Cognito on AWS.
If not, are there any other external services that can be used in conjunction with PCF for this purpose?
Thanks for your help.
I don't know if it'll be an exact replacement, but Pivotal Cloud Foundry has a single sign-on option available.
https://docs.pivotal.io/p-identity/1-6/index.html
It uses the same UAA that comes with the platform and controls access to the platform, but it is completely separate and allows you to provide similar access controls for your applications running on top of Pivotal Cloud Foundry.
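Since UAA is a standard OAuth2 provider, obtaining a token for your application is an ordinary OAuth2 exchange. Here is a minimal sketch assuming a client-credentials grant; the UAA URL, client ID, and client secret are placeholders you would get from your SSO service instance:

```python
import requests

# Placeholders: UAA URL and client credentials come from your SSO/UAA setup.
resp = requests.post(
    "https://uaa.example.com/oauth/token",
    data={"grant_type": "client_credentials"},
    auth=("my-client-id", "my-client-secret"),  # HTTP Basic auth per OAuth2
)
resp.raise_for_status()
token = resp.json()["access_token"]  # bearer token for calling your APIs
```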

Azure IoT SDK - upload blob metadata?

It's great that Azure IoT Hub and the SDK support blob uploading. However, it seems uploading blob metadata is not supported. Any plan to add the feature? It's a very handy feature for small projects.
Or is it already supported and I have missed something?
That's correct. There is no support for uploading blob metadata in the Azure IoT Device Client SDK.
However, the following workaround can be used for this feature. It's based on uploading files with IoT Hub using REST API calls.
Step 1: The device asks Azure IoT Hub for upload references.
Step 2: Upload the blob and its metadata.
Step 3: Complete the device upload process.
A rough sketch of these three steps is shown below.
This feature is tracked by https://github.com/Azure/azure-iot-sdk-csharp/issues/165 for the .NET Azure IoT SDK.
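For reference, the three steps with plain REST calls might look like this. The hub name, device ID, SAS token, and api-version value are placeholders/assumptions:

```python
import requests

# Placeholders: hub name, device ID, and device SAS token are assumptions;
# the api-version value may differ on your hub.
HUB = "myhub.azure-devices.net"
DEVICE = "mydevice"
SAS = "SharedAccessSignature sr=...&sig=...&se=..."  # device-scoped SAS token
headers = {"Authorization": SAS}

# Step 1: ask IoT Hub for blob upload references (storage host, container, SAS).
init = requests.post(
    f"https://{HUB}/devices/{DEVICE}/files?api-version=2020-03-13",
    json={"blobName": "sensor-data.json"},
    headers=headers,
).json()

# Step 2: upload the blob together with its metadata (x-ms-meta-* headers).
blob_url = (f"https://{init['hostName']}/{init['containerName']}/"
            f"{init['blobName']}{init['sasToken']}")
requests.put(
    blob_url,
    data=b'{"temperature": 21.5}',
    headers={"x-ms-blob-type": "BlockBlob",
             "x-ms-meta-deviceId": DEVICE,           # blob metadata
             "x-ms-meta-capturedAt": "2021-06-01"},  # blob metadata
)

# Step 3: tell IoT Hub the upload is complete.
requests.post(
    f"https://{HUB}/devices/{DEVICE}/files/notifications?api-version=2020-03-13",
    json={"correlationId": init["correlationId"], "isSuccess": True,
          "statusCode": 200, "statusDescription": "metadata upload ok"},
    headers=headers,
)
```

The key point is Step 2: because the blob upload itself is a plain Blob service PUT, you are free to attach x-ms-meta-* headers even though the device SDK exposes no metadata parameter.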