Can we use a SharePoint directory as the destination for my file sink? - rabbitmq

We have a Spring Cloud Data Flow stream which reads an Excel file from an SFTP source location, processes it, and generates an output Excel file that has to be copied to a common SharePoint folder.
Is there any way we can do this...?
We could write to an SFTP sink, but our requirement is to create the output in a SharePoint folder.

There is no Microsoft SharePoint Sink implementation. You either have to use some other intermediary to store the file and then transfer it with another Microsoft tool, or implement such a SharePoint Sink yourself: https://medium.com/xebia-engineering/java-use-microsoft-graph-api-to-access-sharepoint-sites-1a26427c9b83
What SharePoint is: 
SharePoint is a customizable web application that can integrate with other Microsoft services and applications, such as Outlook, Teams, Microsoft Viva, OneDrive for Business, and more. 
So, as far as I know, OneDrive should be enough here: use the File Sink to write the file into a locally synced OneDrive folder, let OneDrive sync it to your cloud account, and from there configure your SharePoint library to sync with that OneDrive folder.
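If you would rather implement the SharePoint option yourself and call Microsoft Graph directly (the route the linked article covers from Java), the upload itself is a single authenticated PUT. A rough Python sketch, where the access token, site id and file name are placeholders and the simple-upload endpoint only handles files under 4 MB:
import requests  # assumption: a plain REST call against Microsoft Graph

ACCESS_TOKEN = "<azure-ad-access-token>"  # placeholder, acquired via Azure AD / MSAL
SITE_ID = "<sharepoint-site-id>"          # placeholder, e.g. from GET /sites/{hostname}:/{site-path}

# Simple upload (< 4 MB): PUT the bytes straight into the site's default document library
with open("output.xlsx", "rb") as excel:
    response = requests.put(
        f"https://graph.microsoft.com/v1.0/sites/{SITE_ID}/drive/root:/output.xlsx:/content",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/octet-stream",
        },
        data=excel,
    )
response.raise_for_status()
print(response.json()["webUrl"])  # where the file ended up in SharePoint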

Related

Create Blob storage UNC path in cloud

I have used Blob storage as a file storage account in a .NET Core web application hosted on an Azure App Service (PaaS).
My requirement is to create .zip files and then attach them to emails, where the attachment requires a UNC path.
One option I have is to use the App Service's local storage for temporary file creation and use that for the attachment.
I am looking for another option, such as mapping Blob storage to a virtual drive in the cloud and getting its UNC path. Is there any other option?
Also, can you please suggest the possible options to map an Azure Blob storage drive on the network? I know of the following: App Service local storage, a VM, and a local machine network drive.
First of all, you need to understand the concept of a UNC path. An Azure Web App is essentially a virtual machine, and Azure Blob storage can be regarded as another, separate machine; there is no UNC path from the Web App into Blob storage, so it is not feasible to attach files to mail directly from Azure Blob.
Suggestion:
1. From what I can find, you can try Azure Files to store the files and use them from there.
2. Download the file into the project directory. I think this is the fastest way, without using other Azure products: create a temporary folder, such as MailTempFolder, download the file from the blob into that folder, and then you have a local path to use for the mail attachment.
After the send succeeds, just delete the file so it does not take up too much space on the Azure Web App; even if the send fails, you still have the zip file locally without having to download it again.
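The question is about a .NET Core app, but the temp-folder flow in option 2 looks the same in any SDK. A rough sketch with the Python Azure Storage Blob library, where the connection string, container and blob names are all placeholders:
from pathlib import Path
from azure.storage.blob import BlobClient  # assumption: azure-storage-blob v12 SDK

CONNECTION_STRING = "<storage-connection-string>"  # placeholder

# 1. Create the temporary folder next to the app (the MailTempFolder from above)
temp_dir = Path("MailTempFolder")
temp_dir.mkdir(exist_ok=True)
local_zip = temp_dir / "report.zip"

# 2. Download the blob to a real local path; that path (not a UNC path into
#    blob storage) is what the mail attachment needs
blob = BlobClient.from_connection_string(
    CONNECTION_STRING, container_name="attachments", blob_name="report.zip"
)
with open(local_zip, "wb") as f:
    f.write(blob.download_blob().readall())

# 3. ... attach local_zip to the email and send it here ...

# 4. Remove the local copy once the mail has gone out successfully
local_zip.unlink()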

Copy files between cloud storage providers

I need to upload a large number of files to one cloud storage provider and then copy those files to another cloud storage provider using software that I will write. I have looked at several cloud storage providers and I don't see an easy way to do what I need unless I first download the files and then upload them to the second storage provider. I want to copy directly using the cloud storage providers' APIs. Any suggestions or links to storage providers that have APIs that allow copying from one provider to another would be most welcome.
There are several options you could choose. First, use a cloud transfer service such as MultCloud. I've used it to transfer from AWS S3 and Egnyte to Google Drive.
MultCloud (https://www.multcloud.com), which is free for 30 GB of data traffic per month.
Mountain Duck (https://mountainduck.io/): if connectors are available, you can mount each cloud service as a local drive and move files easily.
I hope this helps.
If you want to write code for it, use Google's gsutil:
The gsutil cp command allows you to copy data between your local file system and the cloud, copy data within the cloud, and copy data between cloud storage providers.
You will find detailed info in this link:
https://cloud.google.com/storage/docs/gsutil/commands/cp
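For example, once your AWS credentials are in gsutil's boto configuration, a provider-to-provider copy is a single command such as gsutil -m cp -r s3://my-source-bucket/* gs://my-destination-bucket (the bucket names here are only placeholders).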
If you want a ready-made tool, use MultCloud: https://www.multcloud.com/
It can download directly from the web, and it can also transfer files from one cloud storage service, like Dropbox, to another, like Google Drive.
cloudHQ, also available as a Chrome extension, is one of the best solutions to sync your data between clouds. You can check it out.

How can I transfer Dropbox file data to SQL Table?

I'm getting files four days a week through my Dropbox folder and I need to add that data to my SQL Server.
In the past I've used FTP to transfer files, but I'm not sure whether FTP will work with Dropbox, and I don't know how to do it.
I've had some experience with SSIS in the past and I'm pretty sure that SSIS could do this task, but I'm not able to add the Integration Services extension to my SQL Server.
Does anyone have any idea what would be the easiest way to transfer these files to the database?
There are some third-party components that allow you to read from Dropbox:
KingswaySoft SSIS Dropbox Source Component
CData Dropbox SSIS Components
Or you can use an HTTP connection manager to download the file using the Dropbox API:
http://www.sqlis.com/post/Downloading-a-file-over-HTTP-the-SSIS-way.aspx
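If the third-party components are not an option, the download step described above is a single authenticated call against the Dropbox v2 API. A rough standalone Python sketch of that call, where the access token and file path are placeholders; once the file is on disk you can bulk-load it into SQL Server:
import requests  # assumption: calling the Dropbox v2 HTTP API directly

ACCESS_TOKEN = "<dropbox-access-token>"  # placeholder

# Content-download endpoints take the file path in the Dropbox-API-Arg header
resp = requests.post(
    "https://content.dropboxapi.com/2/files/download",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Dropbox-API-Arg": '{"path": "/Reports/daily.csv"}',  # placeholder path
    },
)
resp.raise_for_status()

# Save locally, then load the file into SQL Server (BULK INSERT, bcp, an SSIS data flow, ...)
with open("daily.csv", "wb") as f:
    f.write(resp.content)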

Azure Data Factory with Integration Runtime - Delete (or move) file after copy

I have an on-premises server with the Microsoft Integration Runtime installed.
In Azure Data Factory V2 I created a pipeline that copies files from the on-premises server to blob storage.
After a successful transfer I need to delete the files on the on-premises server. I am not able to find a solution for this in the documentation. How can this be achieved?
Recently, Azure Data Factory introduced a Delete Activity to delete files or folders from on-premises or cloud storage stores.
You also have the option to call Azure Automation using webhooks, with the Web activity. In Azure Automation you can program a PowerShell or Python script on a Hybrid Runbook Worker to delete the file from the on-premises server. You can read more on this here: https://learn.microsoft.com/en-us/azure/automation/automation-hybrid-runbook-worker
Another, easier option is to schedule a script on the server with the Windows Task Scheduler that deletes the files. Make sure the script runs after Data Factory has copied the files to the blob, and that's it!
Hope this helped!
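Whichever script-based option you pick (the Automation runbook or the Task Scheduler job), the script itself can stay very small. A rough Python sketch, where the folder and file pattern are placeholders and the assumption is that the pipeline has already copied these files to blob storage:
from pathlib import Path

# Placeholder: the on-premises folder the pipeline copies to blob storage
SOURCE_DIR = Path(r"D:\exports\processed")

# Delete every file the copy activity has finished with (placeholder pattern)
for file in SOURCE_DIR.glob("*.csv"):
    file.unlink()
    print(f"deleted {file}")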
If you are simply moving the file then you can use a Binary dataset in a copy activity. This combination makes a checkbox setting visible that when enabled will automatically delete the file once the copy operation completes. This is a little nicer as you do not need the extra delete activity and the file is "moved" only if the copy operation is a success.

How to allow Silverlight to read any file on the file system, not just My Documents?

I've noticed that, by default, Silverlight 4 applications only have read access to My Documents.
Is there any way to trust a Silverlight application so that it can open a file from any location on the file system?
I can't expect my users to first copy files into the My Documents folder before uploading; is there a way to fully trust a particular Silverlight app?
Directly, no. Silverlight doesn't provide its own API to access the file system outside My Documents. But you can always use COM in elevated-trust applications to access any file on the system.
// Only works in an elevated-trust (trusted) application, where AutomationFactory is available
dynamic fso = AutomationFactory.CreateObject("Scripting.FileSystemObject");
fso.CreateFolder("D:\\SilverFolder");
http://msdn.microsoft.com/en-us/library/system.runtime.interopservices.automation.automationfactory(VS.95).aspx
http://msdn.microsoft.com/en-us/library/ee721083(VS.96).aspx