How can we read files from a Samba server in Mule?

I want to read some files from a Samba server, but it looks like SMB is not one of the protocols supported by Mule components such as FTP, SFTP, etc. Can anyone share some ideas, including an example implementation, for doing this? Let us assume we want to read a CSV file and put its contents into the payload so that it can be passed to a DataWeave Transform component.

Typically, you would mount the share via the operating system and then start the Mule runtime process in the context of a user/group that has the proper permissions to read and write it.
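For example, assuming the share is mounted at /mnt/smbshare (the mount point, flow name and paths below are made up for illustration), a minimal Mule 3 sketch could poll that directory with the standard File connector and hand the CSV contents to a Transform Message (DataWeave) component:
<flow name="readCsvFromMountedShareFlow">
    <!-- Poll the OS-mounted Samba share for CSV files every 10 seconds -->
    <file:inbound-endpoint path="/mnt/smbshare/in"
                           moveToDirectory="/mnt/smbshare/processed"
                           pollingFrequency="10000"
                           mimeType="application/csv">
        <file:filename-wildcard-filter pattern="*.csv"/>
    </file:inbound-endpoint>
    <!-- The payload now carries the CSV content and can be passed straight to DataWeave -->
    <dw:transform-message>
        <dw:set-payload><![CDATA[%dw 1.0
%output application/java
---
payload]]></dw:set-payload>
    </dw:transform-message>
</flow>
Because the mounted share looks like a local directory to the runtime, no SMB-aware connector is needed; the usual File connector options (wildcard filters, move-to directories, polling frequency) apply as normal.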

Related

Export file from NetSuite's FileCabinet to FTP

File resides in the NetSuite file cabinet and needs to be placed on an FTP server each day.
I'm not sure how to handle this via Suitelet/RESTlet, or if it's even possible, but I would prefer not to use an external source/application.
My current (and hopefully temporary) workaround is a local scheduled task that runs a script to pull files from NetSuite and upload them to the FTP server.
In SuiteScript 2.0, unsecured FTP is still not supported, but SS 2.0 does have the capability to do SFTP. See http://www.upilioconsulting.com/blog/netsuite-2016-2-sftp-suitescript-2-0/
In SuiteScript 1.0, it's not supported. The workaround is to write middleware code (e.g. in PHP) and let the middleware do the FTP transfer.
NetSuite doesn't interact with FTP.
You need a bridge server of some sort that runs a web app (full-blown Apache or nginx running PHP, or just a simple Node service).
Just get a server, install some web server/web service, and POST your files to it (nlapiRequestURL from a Scheduled Script). Have the web app on the bridge server send the files to the FTP server. If you are using NetSuite, you can afford the cost of the bridge server.
One possible solution is to create a saved search on Documents to list all the files in NetSuite, filtering by createdate or lastmodifieddate. Create a scheduler to fetch only the new files and save them locally wherever you want.
Note that all the files will be base64-encoded strings; you need to decode them to obtain the actual file contents.
As bknights said, NetSuite doesn't support FTP. You need a web server (any server-side language will do; I have written one in Node.js) to receive the files.
The content of a text file will arrive as plain text, so no decode logic is required for text files. However, binary/PDF/image and other files will be in base64 format, as NetSuite's JavaScript has no way of handling binary data, so make sure you decode them before you create the file on your FTP server.

Can I use the File connector in Mule CloudHub for FTPS?

Is there a way to configure a File connector for use in CloudHub, specifically for reading a file over FTPS and writing it to a file before beginning the actual processing of its contents?
Clarification:
I'm in CloudHub, which does not provide a filesystem in the same sense that a local/on-prem Mule setup has. One standard practice when dealing with streams (FTPS or similar), in order to avoid processing over the open stream, is to take the incoming stream, use the File connector (outbound in this case) to write it to a file, and then use that file for your flow processing. How is this managed in CloudHub?
The File connector reads files from paths on the server the runtime itself runs on; it cannot be used to read from remote servers.
In case you want to start your flow from a file, try something along these lines (host, credentials and paths are placeholders):
<flow name="ftp_reader_flow">
<ftp: inbound> Read from the remote directory
...
<file:outbound> to a local directory
</flow>
<flow name="actual_processing_flow">
<file:inbound> read from the local directory.
... Continue with the processing
.....
</flow>
Hope this helps.
You can use the File connector for temporary data with the /tmp directory.
From the MuleSoft Documentation:
Disk Persistence
CloudHub does not guarantee that writing to disk survives hardware failures. Instead, you must use an external storage mechanism to store information. For small amounts of data, you can use the Object Store. For applications that have large data storage requirements, we recommend use of a cloud service such as Amazon S3. For temporary storage, the File connector is still available and can be used with the /tmp directory.
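Concretely, that means the two-flow pattern shown earlier can target /tmp on the CloudHub worker. The sketch below uses made-up flow names and an /tmp/inbound staging directory, assumes the inbound endpoint sets an originalFilename property (as the FTP and File transports do), and anything written under /tmp must be treated as disposable:
<flow name="ftps_to_tmp_flow">
    <!-- ... FTPS (or similar) inbound endpoint producing a stream ... -->
    <!-- Stage the incoming stream in the worker's temporary directory -->
    <file:outbound-endpoint path="/tmp/inbound"
                            outputPattern="#[message.inboundProperties['originalFilename']]"/>
</flow>
<flow name="process_from_tmp_flow">
    <!-- Pick the staged file back up from /tmp and continue processing -->
    <file:inbound-endpoint path="/tmp/inbound" pollingFrequency="10000"/>
    <!-- ... actual processing ... -->
</flow>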
You can use the File connector in CloudHub as well, but make sure you are reading or writing files from the classpath only, i.e. src/main/resources or another folder on the project classpath.

Writing a file using a different account in Spring Integration

Using Spring Integration's file:outbound-channel-adapter, is there a way to specify which user account to use when writing the file? We need to write files from one domain to another. We would like to be able to write them just using file shares, but to do this we need to be able to log in to the remote box with an account in the remote domain.
We can get around this with FTP, but would like to use file writing.
Thanks
I assume you are talking about Windows domains/shares.
There are SMB adapters in the Spring Integration Extensions repository.
It includes a sample configuration file.
You can build it from GitHub, or there's a snapshot in the Spring snapshot repository.
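A rough sketch of what that configuration might look like with the spring-integration-smb extension is below; the host, domain, share, channel and credential values are placeholders, and the exact attribute names may vary between extension versions, so treat the sample configuration file in the repository as the reference:
<!-- requires xmlns:int-smb="http://www.springframework.org/schema/integration/smb" -->
<bean id="smbSessionFactory"
      class="org.springframework.integration.smb.session.SmbSessionFactory">
    <property name="host" value="fileserver.otherdomain.example"/>
    <property name="port" value="445"/>
    <property name="domain" value="OTHERDOMAIN"/>
    <property name="username" value="svc-writer"/>
    <property name="password" value="${smb.password}"/>
    <property name="shareAndDir" value="share/outbound/"/>
</bean>

<int-smb:outbound-channel-adapter id="smbOutboundAdapter"
                                  channel="filesOut"
                                  session-factory="smbSessionFactory"
                                  remote-directory="."
                                  auto-create-directory="true"/>
Messages sent to the filesOut channel are then written to the share using the remote-domain credentials held by the session factory, which is exactly the "log in as a different account" behaviour the plain file:outbound-channel-adapter cannot provide.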

How to access system properties from a Tomcat app deployed on CloudBees?

I want to run a Tomcat app on CloudBees. This app accesses some private and confidential properties from the file system. How can I access a file system on CloudBees? Please note that it should be highly protected, e.g. with 700 permissions or similar.
Regards,
Marco
The RUN@cloud platform doesn't provide a persistent (or distributed) filesystem, so you can't use it as the canonical store for those files. You need to use an external file store that matches your security requirements and copy the files into the directory given by the java.io.tmpdir system property as the application starts (or lazy-load them). Once the files are on RUN@cloud there is no security issue, as your server instance is fully isolated, and the files will be deleted after the application is undeployed/passivated.
So you can use Amazon S3 or a comparable service to store the files.
Another option is to attach the properties to your RUN@cloud instance as configuration parameters and access them as system properties. See http://wiki.cloudbees.com/bin/view/RUN/Configuration+Parameters
If the data is modest in size, you could consider using properties. Using the CLI you can set them with
bees config:set propertyName=value
You can then access the value as a system property (for example) in your application. The properties themselves are stored encrypted by CloudBees.
I've actually moved to OpenShift since then and I solved the problem. Thank you for your answers.

Accessing a resource file from a filesystem plugin on SymbianOS

I cannot use the Resource File API from within a file system plugin due to a PlatSec issue:
*PlatSec* ERROR - Capability check failed - Can't load filesystemplugin.PXT because it links to bafl.dll which has the following capabilities missing: TCB
My understanding of the issue is that:
File system plugins are DLLs which are executed within the context of the file system process. Therefore all file system plugins must have the TCB PlatSec capability, which in turn means they cannot link against a DLL that is not in the TCB.
Is there a way around this (without resorting to a text file or an intermediate server)? I suspect not - but it would be good to get a definitive answer.
The Symbian file server has the following capabilities:
TCB ProtServ DiskAdmin AllFiles PowerMgmt CommDD
So any DLL being loaded into the file server process must have at least these capabilities. There is no way around this, short of writing a new proxy process as you allude to.
However, there is a more fundamental reason why you shouldn't be using bafl.dll from within a file server plugin: this DLL provides utility functions which interface to the file server's client API. Attempting to use it from within the file server will not work; at best, it will lead to the file server deadlocking as it attempts to connect to itself.
I'd suggest rethinking what you're trying to do, and investigating an internal file-server API to achieve it instead.
Using RFs/RFile/RDir APIs from within a file server plugin is not safe and can potentially lead to deadlock if you're not very careful.
Symbian 9.5 will introduce new APIs (RFilePlugin, RFsPlugin and RDirPlugin) which should be used instead.
There's a proper mechanism for communicating with plugins: RPlugin.
Do not use RFile. I'm not even sure that it would work, as the path is checked in the Initialise step of the RFile functions, which is called before the plugin stack.
Tell us what kind of data you are storing in the resource file.
Things that usually go into resource files have no place in a file server plugin, even if that means hardcoding a few values.
Technically, you can send data to a file server plugin using RFile.Write(), but that's not a great solution (intercept RFile.Open("invalid file name that only your plugin understands") in the plugin).
EDIT: Someone indicated that using an invalid file name will not let you send data to the plugin. Hey, I didn't like that solution either. For the sake of completeness, I should clarify: make up a filename that looks OK enough to get through to your plugin, like one using a drive letter that doesn't have a real drive attached to it (but will still be considered correct by filename-parsing code).
Writing code to parse the resource file binary format in the plugin, while theoretically possible, isn't a great solution either.