Best way to upload and manage media files with Azure Media Services and ASP.NET Core - asp.net-core

I'm very new to Azure and Azure Media Services, so I have a lot of questions about uploading, storing, and managing media. This is the scenario:
I'm creating an API with ASP.NET Core that allows users to upload video, audio, and images, so that each user can see a list of the uploaded videos, images, and audio with a thumbnail. For now I'm only working on video uploading. When a video is uploaded, I create a thumbnail that is stored in a container associated with an Asset (using a custom TransformOutput). I'm following the official documentation: https://learn.microsoft.com/it-it/azure/media-services/latest/stream-files-tutorial-with-api .
So, every time a user uploads a video, an Asset and a container are created. The container holds the thumbnail, and since the container is private, you cannot access the files without a SAS key. And this is where I have some doubts:
Server side, I have to build a list of thumbnails. Every thumbnail is stored in a private container together with the video file, metadata, etc., but I can't switch the entire container to public because I don't want to allow direct access to the videos, only to the thumbnails.
Maybe the best solution is to separate the thumbnails into another, public container?
My original idea was to create four containers for every user:
User_X_VideoContainer (private)
User_X_AudioContainer (private)
User_X_ThumbnailsContainer (public)
User_X_ImagesContainer (public)
So when a user uploads a video, I can store the video in User_X_VideoContainer and the thumbnail in User_X_ThumbnailsContainer, and use a different access level for each container. But I don't know whether this is good practice, because Azure Media Services first creates an Asset and therefore a container with everything inside. So which is the best way to store and manage user files that need different access levels?
Thanks!
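
For illustration, the four-container layout described above could be set up roughly like this with the Azure.Storage.Blobs SDK (a minimal sketch, not part of the original question; note that actual container names must be lowercase, and the connection string is a placeholder):

using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

var serviceClient = new BlobServiceClient("<storage-connection-string>");

// Private containers: no anonymous access, so blobs are reachable only via a SAS URL.
await serviceClient.CreateBlobContainerAsync("user-x-videos", PublicAccessType.None);
await serviceClient.CreateBlobContainerAsync("user-x-audio", PublicAccessType.None);

// Public (blob-level) containers: thumbnails and images can be referenced
// directly by URL in the client's list view.
await serviceClient.CreateBlobContainerAsync("user-x-thumbnails", PublicAccessType.Blob);
await serviceClient.CreateBlobContainerAsync("user-x-images", PublicAccessType.Blob);

Whether this plays well with the containers that AMS creates for its Assets is what the answer below addresses.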

I would recommend storing the public data in another storage account for better security (and using Azure CDN to cache the content). This storage account can be attached to your AMS account, so you can specify it as the destination for the thumbnail output of the job based on your AMS transform.
Thumbnail output documentation.
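A rough sketch of what that could look like with the Microsoft.Azure.Management.Media (v3) SDK used in the linked tutorial, assuming a secondary storage account (here called publicmediastorage) has already been attached to the Media Services account; all names are placeholders:

using System.Threading.Tasks;
using Microsoft.Azure.Management.Media;
using Microsoft.Azure.Management.Media.Models;

// 'client' is the IAzureMediaServicesClient created in the tutorial's authentication code.
static async Task<Asset> CreateThumbnailOutputAssetAsync(
    IAzureMediaServicesClient client, string resourceGroup, string accountName, string assetName)
{
    // Creating the output asset in the attached secondary storage account keeps the
    // public thumbnails out of the storage account that holds the source videos.
    var asset = new Asset(storageAccountName: "publicmediastorage");
    return await client.Assets.CreateOrUpdateAsync(resourceGroup, accountName, assetName, asset);
}

// The job based on the transform then writes the thumbnail into that asset, e.g.:
// Outputs = new JobOutput[] { new JobOutputAsset(thumbnailAssetName) }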

Related

React Native & Firebase Cloud Storage - Create buckets dynamically

I am looking at creating an app that uses cloud storage for image storing, and according to this it seems like it is smart to create a bucket per user. However, this seems like a bad idea when you think about scalability, because you'd end up with millions of buckets. My question is: for an app that uses storage buckets to store images, is it better to create a per-user bucket, or to use a single bucket, name files uniquely per user (e.g. by user email), and limit access to the files inside to each user?
It seems like every doc I visit mentions creating the bucket either in the console or with gsutil, but I am looking to see whether there is a way to do it from the react-native client side. That way, when a user creates a new account, a new bucket can be allocated to them. I have looked into the Google Cloud JSON API too.

Can we use SDKs directly in Suitelet?

I'm implementing a requirement to store images in an AWS S3 bucket instead of NetSuite. Since the bucket is private, I have to upload the files and generate the URLs in the backend/Suitelet.
I tried to include the AWS SDK in the Suitelet by adding it to the define statement, but that doesn't work.
I want to know whether we can use/include SDKs inside a Suitelet.
How can I implement this without using any third-party solutions?
How are permissions for the links managed? Can you make them publicly viewable? Remember that unless the links you generate are time-limited (e.g. pre-signed), anyone with the link can get to the image.
In terms of uploading the images, check out https://github.com/DeepChannel/netsuite-savedsearch-s3
If you need each image to have a magic link, you could use a Heroku app or an AWS Lambda. The app would check a hash based on the link parameters and proxy the image if the hash is valid. If your images are supposed to be private to a customer, this would be the way to go.
If you are using the images generally on a website, then just make the bucket publicly readable and use the API to upload.
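
Outside of the Suitelet itself, the "time-limited link" idea corresponds to S3 pre-signed URLs. A minimal sketch with the AWS SDK for .NET, just to illustrate the concept (bucket, key, and region are placeholders; the equivalent call exists in the other AWS SDKs):

using System;
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;

// Generate a pre-signed GET link for a private object, so the bucket itself never
// has to be public. The link stops working after the expiry time.
var s3 = new AmazonS3Client(RegionEndpoint.USEast1);
var request = new GetPreSignedUrlRequest
{
    BucketName = "my-private-images",
    Key = "customer-123/photo.png",
    Verb = HttpVerb.GET,
    Expires = DateTime.UtcNow.AddMinutes(15)
};
string url = s3.GetPreSignedURL(request);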

How to store images for mobile development

I decided to use back4app to easily create my backend and to have a built-in hosting solution.
I'm quite a newbie with this tool, so my question may seem "simple":
I was wondering how I will store the images of my mobile application. As far as I know they use AWS, so I thought the service would provide some kind of interface to upload images to an S3 bucket...
Should I create a personal bucket, or does the service offer that kind of feature?
The idea is then to store the absolute URL of the image in my model. For example, each class has a cover field of type string.
You're right, Back4App uses AWS.
Back4App prepares the backend for you. For example, if you save a file directly from your Parse Dashboard, you can access the image and you already have an absolute URL. You can configure the column with the File type, like below:
Add a column with the File type.
After that, upload the file.
After uploading the file, you can access it by clicking on the box :)

Can I use Autodesk viewing API to render local DWG (2D) files to my browser?

The main goal of my project is to read AutoCAD (DWG) drawings from my local server and output them in a web browser (Chrome).
I managed to do it with the View and Data API in Java, using buckets, keys, etc., but when it comes to reading offline files with this sample code from https://github.com/Developer-Autodesk/view-and-data-offline-sample, the DWG format did not work.
Do you have a suggestion or a clue on how to use the offline API with DWG files?
The Autodesk View & Data API (developer.autodesk.com) allows you to display a DWG on your website using a zero-client (WebGL) viewer. You need to upload the DWG to the Autodesk server, translate it, and then either download the translation to store on your local server (as demonstrated on extract.autodesk.io) or keep it on the Autodesk server. Downloading it can be advantageous because then you don't need to implement the OAuth code on your server.
Buckets on the Autodesk server can only be accessed using the accesstoken created from your API keys, so it is secure in that only someone with your accesstoken and who knows the URN can access your translated file. However, for the viewer on your client-page to access the file, you need to provide it with your accesstoken. This does mean that someone could separately access your translated file by grabbing the accesstoken and URN from your webpage. But if you're serving up the model on a public page, then you presumably don't care about that.
There is a 'list' API available, but this is white-listed (available on request), so getting your accesstoken and urn for one file doesn't automatically give access to your other files - unless someone can guess the other filenames (or iterate to find them).
If you use a non-permanent bucket, then your original (untranslated) file becomes unavailable when the bucket expires; alternatively, you can explicitly delete the untranslated file (using the delete API).
Files translated via the View & Data API are not accessible via A360. They are stored in a separate area. (But I wouldn't be at all surprised if an A360 file access API became available in the near future :-).
Finally, unless you want to interact with the displayed file via the viewer's JavaScript API, you may prefer just to upload your files to A360, share the translated model, and then embed it in your webpage with an iframe.

Can the uploadcare widget be used without the uploadcare service?

Can uploadcare-widget be used without using the Uploadcare service?
The goal:
Use the widget (specifically to allow users to upload files from their Google Drive/Dropbox accounts).
Instead of using Uploadcare's backend, use your own backend, e.g. Node.js/AWS S3.
Yes, it can. It's open source!
Although you will have to either replicate or get rid of functionality that relies on Uploadcare infrastructure:
uploads (this is the easiest part)
fetching files from social networks and cloud storage services
image preview and cropping that relies on Uploadcare CDN
So unless you're moving enormous amounts of files, the most cost-efficient way is to use Uploadcare as it is. BTW, you can use your own S3 storage and even upload directly to your S3 buckets.