File sharing app via cloud storage - Dropbox

I am looking for a file-sharing cloud storage solution.
My requirements are below:
1. Should be able to share content with a limited set of users (or make it publicly available)
2. Users should be able to download the content from the shared folder to their own private cloud storage.
Many cloud storage services like Dropbox, Google Cloud Storage, Amazon, and Microsoft Azure allow public folders, but is there an option to copy the publicly shared content to a user's private cloud storage area without copying it through a third-party server?
Also, is it possible to copy content from one service provider to another without a third-party server? (e.g., from Dropbox to Google Drive)

Have you looked at CloudBerry Explorer? It lets you copy from public to public cloud (e.g. Amazon to Constant) or public to private cloud (e.g. Amazon to Cloudian). It is fairly intuitive and inexpensive.

Related

Use Google Storage Transfer API to transfer data from external GCS into my GCS

I am working on a web application that comprises a ReactJS frontend and a Java Spring Boot backend. This application requires users to upload data from their own Google Cloud Storage into my Google Cloud Storage.
The application flow is as follows:
The frontend requests read access on the user's storage. For this I have used OAuth 2.0 access tokens, as described here.
The generated OAuth token will be passed to the backend.
The backend will also have credentials for my service account to allow it to access my Google Cloud APIs. I have created the service account with required permissions and generated the key using the instructions from here
The backend will use the generated access token and my service account credentials to transfer the data.
In the final step, I want to create a transfer job using the Google Storage Transfer API. I am using the Java API client provided here for this.
I am having difficulty providing the authentication credentials to the Transfer API.
In my understanding, there are two different authentications required: one for reading the user's bucket and another for starting the transfer job and writing the data into my cloud storage. I haven't found any relevant documentation or working examples for my use case. In all the given samples, it is always assumed that the same service account credentials have access to both the source and sink buckets.
tl;dr
Does the Google Storage Transfer API allow setting different source and target credentials for GCS-to-GCS transfers? If yes, how does one provide these credentials in the transfer job specification?
Any help is appreciated. Thanks!
Unfortunately, this is not allowed by the GCS Transfer API: for a transfer to work, a single service account must have access to both the source and the sink buckets, as you mentioned.
You can open a feature request in Google's Issue Tracker so that Google's product team can consider such functionality for newer versions of the API. You could also mention that this subject is not covered in the documentation, so that can be improved as well.
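Given that constraint, the workable path is for the user to grant read access on the source bucket to the single identity that runs the transfer, and then create an ordinary GCS-to-GCS job. Below is a minimal sketch using the v1 Java client (google-api-services-storagetransfer); the project ID, bucket names, and run date are placeholders, not values from the question.

    import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
    import com.google.api.client.json.gson.GsonFactory;
    import com.google.api.services.storagetransfer.v1.Storagetransfer;
    import com.google.api.services.storagetransfer.v1.model.Date;
    import com.google.api.services.storagetransfer.v1.model.GcsData;
    import com.google.api.services.storagetransfer.v1.model.Schedule;
    import com.google.api.services.storagetransfer.v1.model.TransferJob;
    import com.google.api.services.storagetransfer.v1.model.TransferSpec;
    import com.google.auth.http.HttpCredentialsAdapter;
    import com.google.auth.oauth2.GoogleCredentials;
    import java.util.Collections;

    public class GcsToGcsTransfer {
        public static void main(String[] args) throws Exception {
            // Application Default Credentials: the service account running this code.
            GoogleCredentials creds = GoogleCredentials.getApplicationDefault()
                    .createScoped(Collections.singleton(
                            "https://www.googleapis.com/auth/cloud-platform"));
            Storagetransfer client = new Storagetransfer.Builder(
                    GoogleNetHttpTransport.newTrustedTransport(),
                    GsonFactory.getDefaultInstance(),
                    new HttpCredentialsAdapter(creds))
                    .setApplicationName("gcs-to-gcs-transfer")
                    .build();

            // One-off job: start and end date are the same day.
            Date runDay = new Date().setYear(2023).setMonth(1).setDay(15);
            TransferJob job = new TransferJob()
                    .setDescription("Copy user's bucket into my bucket") // placeholder
                    .setProjectId("my-project-id")                       // placeholder
                    .setTransferSpec(new TransferSpec()
                            .setGcsDataSource(new GcsData().setBucketName("users-source-bucket"))
                            .setGcsDataSink(new GcsData().setBucketName("my-sink-bucket")))
                    .setSchedule(new Schedule()
                            .setScheduleStartDate(runDay)
                            .setScheduleEndDate(runDay))
                    .setStatus("ENABLED");

            TransferJob created = client.transferJobs().create(job).execute();
            System.out.println("Created transfer job: " + created.getName());
        }
    }

Note that for GCS-to-GCS jobs the actual reads and writes are typically performed by your project's Google-managed Storage Transfer service agent, so that is the account the source bucket's owner needs to grant access to.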

Can I use AWS S3 with Google Speech-to-Text for larger files?

I tried to use Google Cloud Speech-to-Text in my Node.js project. It works fine with smaller files that I have on my disk, but I want to process longer files that are stored in AWS S3. Is that possible, or do I need to use Google Cloud Storage?
Google Cloud Storage's XML API is interoperable with tools and libraries written for Amazon S3, which you can use from your Node.js code to move your S3 data into Cloud Storage:
"The Cloud Storage XML API is interoperable with some cloud storage tools and libraries that work with services such as Amazon Simple Storage Service (Amazon S3) and Eucalyptus Systems, Inc. To use these tools and libraries, change the request endpoint (URI) that the tool or library uses so it points to the Cloud Storage URI (https://storage.googleapis.com), and configure the tool or library to use your Cloud Storage HMAC keys." For more information, please check the Google documentation.
For longer audio files, you can only use files in Google Cloud Storage. You can't use audio files stored in AWS S3. https://cloud.google.com/speech-to-text/docs/reference/rest/v1/RecognitionAudio
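To illustrate that constraint, here is a minimal sketch of long-audio recognition with the Java client library (the Node.js client mirrors the same shape); the bucket, object name, and encoding settings are placeholders:

    import com.google.cloud.speech.v1.LongRunningRecognizeResponse;
    import com.google.cloud.speech.v1.RecognitionAudio;
    import com.google.cloud.speech.v1.RecognitionConfig;
    import com.google.cloud.speech.v1.SpeechClient;
    import com.google.cloud.speech.v1.SpeechRecognitionResult;

    public class LongAudioTranscribe {
        public static void main(String[] args) throws Exception {
            try (SpeechClient speech = SpeechClient.create()) {
                RecognitionConfig config = RecognitionConfig.newBuilder()
                        .setEncoding(RecognitionConfig.AudioEncoding.LINEAR16) // placeholder
                        .setSampleRateHertz(16000)
                        .setLanguageCode("en-US")
                        .build();
                // Long audio must be referenced by a gs:// URI, not raw bytes.
                RecognitionAudio audio = RecognitionAudio.newBuilder()
                        .setUri("gs://my-bucket/long-audio.wav") // placeholder
                        .build();
                LongRunningRecognizeResponse response =
                        speech.longRunningRecognizeAsync(config, audio).get();
                for (SpeechRecognitionResult result : response.getResultsList()) {
                    System.out.println(result.getAlternatives(0).getTranscript());
                }
            }
        }
    }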

Setting Bluemix object storage permissions

I am currently working on a project in Data Science Experience that involves writing CSV files to Bluemix Object Storage.
There are several colleagues involved in this project and I want to know how to set Bluemix permissions so that everyone can access the Object Storage container for the project through their respective Bluemix account.
The Bluemix Object Storage service does not support access control for Bluemix platform users.

Set permission for static website on Azure Blob Storage

I have some static websites hosted on Azure Blob Storage, and I want to grant access to those websites only to authenticated users of an ASP.NET MVC application.
I can't make the Blob Storage public.
I don't think I can use Shared Access Signatures, considering that the website uses lots of JavaScript and CSS files that are downloaded automatically by the main .htm page.
What's the best solution in this case?
If your application's permissions must be checked, you can build a controller in your application that acts as a proxy between the client and Blob Storage. No Shared Access Signatures, only the regular blob account key on the server and your regular authentication for users.
The action takes the URL relative to your blob container as an argument. You can add a custom route so it nicely handles the links inside your static website, as in the sketch below.
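A minimal sketch of this proxy idea, shown here with Spring MVC and the Azure Blob Storage SDK for Java rather than ASP.NET MVC (the pattern translates directly); the container name and route are assumptions:

    import com.azure.storage.blob.BlobClient;
    import com.azure.storage.blob.BlobContainerClient;
    import com.azure.storage.blob.BlobServiceClientBuilder;
    import org.springframework.http.HttpStatus;
    import org.springframework.http.MediaType;
    import org.springframework.http.ResponseEntity;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.PathVariable;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class StaticSiteProxyController {

        // The account key stays server-side; the container itself stays private.
        private final BlobContainerClient container = new BlobServiceClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .buildClient()
                .getBlobContainerClient("static-site"); // hypothetical container name

        // Catch-all route ({*path} needs Spring's PathPattern matching) so the
        // relative links inside the static website keep working.
        @GetMapping("/site/{*path}")
        public ResponseEntity<byte[]> serve(@PathVariable String path) {
            // Your regular authentication/authorization check goes here.
            String blobName = path.startsWith("/") ? path.substring(1) : path;
            BlobClient blob = container.getBlobClient(blobName);
            if (!blob.exists()) {
                return ResponseEntity.status(HttpStatus.NOT_FOUND).build();
            }
            String contentType = blob.getProperties().getContentType();
            return ResponseEntity.ok()
                    .contentType(MediaType.parseMediaType(
                            contentType != null ? contentType : "application/octet-stream"))
                    .body(blob.downloadContent().toBytes());
        }
    }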

Account key vs shared access signature

I'm looking for guidance on how to securely use Azure Storage in a public-facing production environment.
My simplest scenario is multiple Windows 8 Store clients uploading images to Azure. The account key is stored in app.config.
Is it OK to distribute the account key as part of my mobile application?
Or should I have a backend service that creates Shared Access Signatures for the container/blob?
Thanks in advance.
Sharing your account key in your mobile application is not desirable because clients would get complete access to your account and could view or modify other data. Shared Access Signatures are useful in such cases, as you can delegate access to specific storage account resources. You can grant access to a resource for a specified period of time, with a specified set of permissions. In your case, you want to give clients access only to write blob content. You can find more details about SAS and how to use it here: http://msdn.microsoft.com/en-us/library/windowsazure/ee395415.aspx
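A minimal sketch of such a backend endpoint, using the current Azure Blob Storage SDK for Java rather than the .NET SDK of the question's era (the pattern is the same); the connection-string variable, container and blob names, and the 15-minute expiry are assumptions:

    import com.azure.storage.blob.BlobClient;
    import com.azure.storage.blob.BlobServiceClientBuilder;
    import com.azure.storage.blob.sas.BlobSasPermission;
    import com.azure.storage.blob.sas.BlobServiceSasSignatureValues;
    import java.time.OffsetDateTime;

    public class SasIssuer {
        // Returns a full blob URL with a short-lived, write-only SAS appended.
        public static String writeOnlyBlobSas(String containerName, String blobName) {
            BlobClient blob = new BlobServiceClientBuilder()
                    .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                    .buildClient()
                    .getBlobContainerClient(containerName)
                    .getBlobClient(blobName);

            // Create/write only, valid 15 minutes: enough to upload one image,
            // with no ability to read or list anything else in the account.
            BlobSasPermission perms = new BlobSasPermission()
                    .setCreatePermission(true)
                    .setWritePermission(true);
            BlobServiceSasSignatureValues values = new BlobServiceSasSignatureValues(
                    OffsetDateTime.now().plusMinutes(15), perms);

            return blob.getBlobUrl() + "?" + blob.generateSas(values);
        }
    }

The mobile client would call this authenticated endpoint, receive the URL, and PUT the image directly to blob storage; the account key itself never leaves the server.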