Mule managed file transfer synchronization

I am looking for a solution that involves transferring files across two SFTP locations. I have a requirement to move a file from a DMZ to an internal app server.
I have designed two flows: one reads the file from the DMZ and moves it to a quarantine zone; the second picks it up from quarantine and moves it to the app server. What do experts recommend for building an event- or sync-based model to fit this requirement? I am seeing errors like "no such file" during polling when the file is large and still in transfer.
Please advise.
Thanks
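One common way to avoid "no such file" errors on in-flight files is to accept a file only once its size has stopped changing between two polls. Here is a minimal sketch of that check in Java using the JSch SFTP client (host, credentials, paths, and the poll interval are all placeholder assumptions; in a real Mule flow the equivalent logic would sit behind the poll):

import com.jcraft.jsch.*;

public class StableFilePoller {
    public static void main(String[] args) throws Exception {
        JSch jsch = new JSch();
        Session session = jsch.getSession("user", "dmz-host", 22); // placeholders
        session.setPassword("secret");
        session.setConfig("StrictHostKeyChecking", "no"); // for the sketch only
        session.connect();
        ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
        sftp.connect();

        String path = "/dmz/inbound/bigfile.dat"; // placeholder
        long first = sftp.lstat(path).getSize();
        Thread.sleep(5000); // wait roughly one poll interval
        long second = sftp.lstat(path).getSize();

        if (first == second) {
            // Size is stable: safe to move the file to the quarantine zone
            sftp.rename(path, "/quarantine/bigfile.dat");
        } // otherwise skip this poll; the file is still being written

        sftp.disconnect();
        session.disconnect();
    }
}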

If you are looking for a software package, I recommend GoAnywhere; in its workflows you can automate a lot of options, including the scenario you describe.
You can check it out at goanywhere.com (Managed File Transfer - MFT).
P.S. I'm not a vendor; I just did a POC for a problem I had and saw that it could help you.

Related

Send a very large file (>> 2 GB) via browser

I have a task to do. I need to build a WCF service that allows a client to import a file into a database using the server backend. In order to do this, I need to communicate to the server the settings, the events needed to start and configure the import, and, most importantly, the file to import. The problem is that these files can be extremely large (much bigger than 2 GB), so it's not possible to send them via browser as they are. The only thing that comes to mind is to split these files and send the pieces one by one to the server.
I also have another requirement: I need to be 100% sure that these files are not corrupted, so I also need to implement some sort of policy for error detection and, ideally, recovery.
Do you know of an API or DLL that can help me achieve my goals, or is it better to write the code myself? And in that case, what would be the optimal size of the packets?
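There is no single API that covers this, but the chunk-and-verify pattern itself is small. Below is a sketch of it in Java (the question is about WCF/C#, so treat this as pseudocode for the pattern; sendChunk is a hypothetical transport call, and the 4 MB chunk size is just an assumption to tune against your network):

import java.io.*;
import java.security.MessageDigest;
import java.util.Arrays;

public class ChunkedUploader {
    static final int CHUNK_SIZE = 4 * 1024 * 1024; // assumed; tune to your network

    public static void main(String[] args) throws Exception {
        File file = new File("huge-import.dat"); // placeholder
        MessageDigest whole = MessageDigest.getInstance("SHA-256");
        byte[] buf = new byte[CHUNK_SIZE];

        try (InputStream in = new BufferedInputStream(new FileInputStream(file))) {
            int n, index = 0;
            while ((n = in.read(buf)) > 0) {
                // Hash each chunk so the server can verify it and request a resend on mismatch
                byte[] chunkDigest = MessageDigest.getInstance("SHA-256")
                        .digest(Arrays.copyOf(buf, n));
                sendChunk(index++, buf, n, chunkDigest); // hypothetical transport call
                whole.update(buf, 0, n);
            }
        }
        // A final whole-file hash lets the server confirm nothing was lost or reordered
        System.out.println("file sha-256: " + toHex(whole.digest()));
    }

    static void sendChunk(int index, byte[] data, int len, byte[] digest) {
        // send index, len, digest, and the first `len` bytes of data; retry on mismatch
    }

    static String toHex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) sb.append(String.format("%02x", b));
        return sb.toString();
    }
}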

How to separate the latest file from Multiple files in Mule

I have 5000 files in a folder, and new files keep getting loaded into the same folder on a daily basis. I need to pick up only the latest file among all of them each day.
Is it possible to achieve this scenario in Mule out of the box?
I tried keeping a file component inside a Poll component (to make use of the watermark), but it isn't working.
Is there any way we can achieve this? If not, please suggest the best approach (any helpful links).
Mule Studio: 5.3, Runtime: 3.7.2.
Thanks in advance
Short answer: there isn't really any extremely quick out-of-the-box solution, but there are other ways. I'm not saying this is the right or only way of solving it, but I've implemented a similar scenario like this:
A normal file inbound endpoint with a database table as a file log. Each time a new file is processed, a component checks whether its name appears in the table. Via a choice router or filter I only continue if it isn't in there already, and after processing I add the filename to the table.
This is quite a "heavy" solution, though. A simpler approach would be to use an idempotent filter with an object store, for example a Redis server: https://github.com/mulesoft/redis-connector/blob/master/src/test/resources/redis-objectstore-tests-config.xml
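To make the file-log idea concrete, here is a minimal sketch in plain Java using a text file as the store (the class and file names are made up; a real Mule flow would use the idempotent filter or a database table as described above):

import java.io.IOException;
import java.nio.file.*;
import java.util.HashSet;
import java.util.Set;

public class FileLogFilter {
    private final Path log;
    private final Set<String> processed = new HashSet<>();

    FileLogFilter(Path log) throws IOException {
        this.log = log;
        if (Files.exists(log)) processed.addAll(Files.readAllLines(log));
    }

    // Returns true (and records the name) only the first time a filename is seen
    synchronized boolean accept(String filename) throws IOException {
        if (!processed.add(filename)) return false; // already handled: filter it out
        Files.write(log, (filename + System.lineSeparator()).getBytes(),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
        return true;
    }
}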
It is actually very simple if your incoming filename contains a timestamp: you can configure the file inbound connector by setting file:filename-regex-filter pattern="myfilename_#[function:timestamp].csv". I hope this helps.
Maybe you can use a Quartz scheduler (specify the time in a cron expression), followed by a Groovy script in which you start the file connector. Keep the file connector in another flow.
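Whichever trigger you use (poll or Quartz), the selection step itself boils down to picking the newest file in the folder. A sketch of that step in plain Java, assuming the last-modified timestamps are reliable (in Mule 3 the equivalent logic could live in a Groovy or Java component):

import java.io.File;
import java.util.Arrays;
import java.util.Comparator;

public class LatestFile {
    public static void main(String[] args) {
        File dir = new File("/data/inbound"); // placeholder folder
        File[] files = dir.listFiles(File::isFile);
        if (files == null || files.length == 0) return;

        // Pick the file with the newest modification time
        File latest = Arrays.stream(files)
                .max(Comparator.comparingLong(File::lastModified))
                .get();
        System.out.println("latest: " + latest.getName());
    }
}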

How to track a process's status in TIBCO?

I hope you can show me a solution for my case.
When I define many processes, how do I get status tracking data for those processes? In other words, I want to get the process history. My purpose is to show it to my client for checking.
I have defined a process that communicates with 3 applications, and I deployed it to a client. Unfortunately, my client would like to add one more application (up to 4 apps) in the future. I wonder how to do that? Perhaps I open the process again and edit it. Is there a way to create a dynamic process?
Thanks very much.
PVA.
You get a very limited "history" in TIBCO Administrator (more or less which process instances completed with success/failure; in case of failure it will also provide the exception and where in the process it failed). However, that doesn't show you any tracking of the individual steps/activities that the process passed through. For this, you'd either have to put lots of logging steps into your process (and build something that parses this information from the log files), or you could use BusinessWorks ProcessMonitoring, which gives you a full history trail for each process automatically. However, it is not included with BW and you'll probably need a separate license.
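To illustrate the logging route, here is a minimal sketch that rebuilds a per-instance step history from log lines (the log format is entirely assumed; you would emit something like it yourself from Log activities in the process):

import java.io.IOException;
import java.nio.file.*;
import java.util.*;

public class ProcessHistory {
    public static void main(String[] args) throws IOException {
        // Assumed format per line: "<timestamp> <processInstanceId> <activityName> <status>"
        Map<String, List<String>> history = new LinkedHashMap<>();
        for (String line : Files.readAllLines(Paths.get("bw-process.log"))) { // placeholder
            String[] parts = line.split("\\s+", 4);
            if (parts.length < 4) continue; // skip lines that don't match the format
            history.computeIfAbsent(parts[1], k -> new ArrayList<>())
                   .add(parts[0] + " " + parts[2] + " " + parts[3]);
        }
        history.forEach((id, steps) -> {
            System.out.println("instance " + id + ":");
            steps.forEach(s -> System.out.println("  " + s));
        });
    }
}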
Change the process in TIBCO Designer, build a new EAR file, and re-deploy it in TIBCO Administrator.

Dropbox API - Using Dropbox as a server

I want to use a file sharing service to keep certain files up to date and consistent across multiple instances of my application on multiple computers. For example, imagine a multiplayer game that stores all the players' positions in a text file and uses something like Dropbox to keep that text file consistent across all the running applications: each instance updates the file with its own player's position, and the rest of the applications update accordingly. This is only an example and is not what I actually intend to build. What I want to do does not rely on sharing data quickly, only on periodically downloading and updating the text file.
I was wondering how I might do this using the Dropbox API for Objective-C without prompting the user for any Dropbox username/password: just store a single Dropbox account's login information, log into it automatically, and update/download the file stored on it.
From what I have found by experimenting, Dropbox prompts users for their passwords via a web browser and is designed to accommodate multiple accounts, whereas I only need to accommodate the 'server' account.
So, is there any way to do this sort of thing with the Dropbox API, or should I use something else? Or do I need to work out how to write my own server? Using some sort of file sharing API seems a lot easier to me than writing an actual server.
Thanks for any help,
Ben
You might think about using Google App Engine (GAE). I had a similar requirement recently and I'm thinking this is a good option when you want centralized data. Plus you can do the no-browser account login by using your own custom authentication, or I think it's even possible via OAuth? Depends on how sensitive the data is I guess. I just rolled my own.
From my research I found that using Dropbox as a server has some issues with scalability, since you'll be limited to maybe 5,000 calls per day (source). It's built on Amazon S3, so you could also look at using that directly.
GAE lifts that limit to 675,000 calls per day, which can be increased up to 91 million for free:
https://developers.google.com/appengine/docs/quotas
I did find an open-source project for doing this with Java; alternatively, you could look at a Python example.
I've written a daemon that continuously checks for updated files and syncs them. I wrote it for my own file manager iOS app. You can find the implementation here:
https://github.com/H2CO3/MyFile/tree/master/DropboxDaemon
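The daemon above is Objective-C, but the poll-and-sync loop it implements is language-neutral. A minimal sketch in Java, with a hypothetical RemoteStore interface standing in for the Dropbox client and an assumed 30-second interval:

import java.util.concurrent.*;

public class SyncDaemon {
    // Hypothetical stand-in for whatever file-store client you use (Dropbox, S3, ...)
    interface RemoteStore {
        String revision(String path);  // cheap metadata call
        byte[] download(String path);  // full fetch, only done when the revision changes
    }

    private final RemoteStore store;
    private volatile String lastRev = "";

    SyncDaemon(RemoteStore store) { this.store = store; }

    void start(String path) {
        ScheduledExecutorService exec = Executors.newSingleThreadScheduledExecutor();
        // Poll periodically; only download when the remote revision has changed
        exec.scheduleAtFixedRate(() -> {
            String rev = store.revision(path);
            if (!rev.equals(lastRev)) {
                byte[] data = store.download(path);
                lastRev = rev;
                // apply `data` to the local copy / application state here
            }
        }, 0, 30, TimeUnit.SECONDS);
    }
}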
I'm personally not an iOS developer but I came across this question while looking for something else and thought I would offer up another potential solution to the OP's question.
Microsoft just released something called Azure Mobile Services which supports iOS development (among other platforms). It's basically a convenient way to set up a back end system complete with push notifications, authentication, etc. without rolling your own. You don't need to know anything about Azure or servers as the setup process walks you through most of it. It is new so keep that in mind, but it looks promising for situations like this.
Here's a 10 minute video explaining how to use it with an iOS developed app along with links to more documentation:
http://channel9.msdn.com/posts/iOS-Support-in-Windows-Azure-Mobile-Services/
Hope this helps.

How do I distribute updates to an Access database front end?

I've got an Access 2007 database that I developed which connects to SQL Server for the actual data storage. I used the Package Solution Wizard to create a distributable installer which included access runtime (with an ACCDE file) which I went around and installed on 15 or so PCs. Anyway, my question is, what is the best way to distribute updates to this database? Right now I'd need to go around and remove and reinstall. That's not a problem... I was just wondering if there was another way.
I've tried leaving the front end on a network share, but it seems that most people suggest storing the front end on the local machine, which makes sense. The problems I've run into when I leave it on a network share (at least with Access 2003 MDBs) are that I find myself needing to compact and repair often, and I also have to kill the open sessions (users who have the file open) when upgrading. I would imagine it could also hypothetically create an unnecessary bottleneck if a user were not on the local network.
Automating front-end distribution is trivial. It's a problem that has been solved repeatedly. Tony Toews's http://autofeupdater.com is one such solution that is extremely easy to implement and completely transparent to the end user.
We developed a VBScript 'launcher' for our Access apps. That is what is linked to on the Start menu of users' PCs, and it does the following:
It checks a version.txt file located on a network share to see whether it contains different text to a locally stored copy.
If the text is different, it copies the Access MDB and the new version.txt to the user's hard drive.
Finally, it runs the MDB in Access.
In order to distribute an update to the users' PCs, all that is required is to change the text in version.txt on the network share (a sketch of this pattern follows).
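For illustration, here is a minimal sketch of that launcher pattern in Java rather than VBScript (version.txt and the Office12 path come from this thread; the share and local paths are made-up placeholders):

import java.io.IOException;
import java.nio.file.*;

public class FrontEndLauncher {
    public static void main(String[] args) throws IOException {
        Path shareVersion = Paths.get("\\\\server\\share\\version.txt"); // master version marker
        Path localVersion = Paths.get("C:\\User\\version.txt");          // locally cached marker
        Path shareDb = Paths.get("\\\\server\\share\\frontend.mdb");
        Path localDb = Paths.get("C:\\User\\frontend.mdb");

        String remote = new String(Files.readAllBytes(shareVersion));
        String local = Files.exists(localVersion)
                ? new String(Files.readAllBytes(localVersion)) : "";

        // Copy the front end down only when the version text differs
        if (!remote.equals(local)) {
            Files.copy(shareDb, localDb, StandardCopyOption.REPLACE_EXISTING);
            Files.copy(shareVersion, localVersion, StandardCopyOption.REPLACE_EXISTING);
        }

        // Launch Access on the local copy
        new ProcessBuilder("C:\\Program Files\\Microsoft Office\\Office12\\MSACCESS.EXE",
                localDb.toString()).start();
    }
}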
Perhaps you can implement something similar to this:
Make a batch file on the server (network drive).
Create a shortcut link to that batch file.
Copy the shortcut to User's Desktop.
When user double-clicks on shortcut, it will copy a fresh copy from network to local.
Replace the old database.adp on the server drive when you release a new version.
Each user gets a copy of database.adp on their machine.
Instructions for removing the security warning when opening a file from a network share are here.
Batch File
#ECHO OFF
REM copy from network drive to local
xcopy "Your_Network_Drive\database.adp" "C:\User\database.adp" /Y /R /F
REM call your database file - Access 2007
"C:\Program Files\Microsoft Office\Office12\MSAccess.EXE" "C:\User\database.adp"
This is a very old post. I used the AutoFEUpdater until it stopped working, so I wrote one of my own, and it has evolved over the last few years into something that I have used with many clients. It's so simple to use and there is no interface: just an EXE and a very simple config file.
Please check it out here. I can also help with custom solutions if none of the configurations work for your needs. http://www.dafran.ca/MS-Access-Front-End-Loader.aspx
After trying all of the solutions above (not exactly these solutions but these are the common suggestions in the Access community), I developed a system entirely within Access using VBA that allows an admin DB to create and publish objects to client DBs without the need for user intervention or management of multiple DB files.
This approach has several benefits:
1. It simplifies the development process by having a dedicated environment (admin DB) for development and testing totally separate from the client DBs.
2. It simplifies the update/distribution process by allowing a developer to push out updates in real time that client DBs can implement in the background, without involving users. Can also allow devs to roll back to previous versions if desired.
3. It could be used as a kind of change management system within Access for developers who want to commit multiple changes to objects and modules and retain past changes.
4. It allows for easier user access control by allowing an admin to easily assign certain objects to specific users/roles without needing to maintain multiple versions of the DB.
I will hopefully post the code to GitHub soon; I just have to get clearance from my workplace to release it. I will edit this post to include the link when I have.
We have usually kept the Access front ends on network drives, and just put up with the need to compact and repair on a regular basis. You will probably find you need to do that even when they are installed locally, anyway.
If you must have it installed locally, there are various tools which will enable you to "push out" software updates, and the guys over on ServerFault would have more information on those. Assuming such tools aren't available, the only other option I can think of is to write a small loader program that checks the local .MDB against a master copy on the server, and re-copies it across if they are different, before then launching the MDB.