Export a file from NetSuite's File Cabinet to FTP

A file resides in the NetSuite File Cabinet and needs to be placed on an FTP server each day.
I'm not sure how to handle this via a Suitelet/RESTlet, or whether it's even possible, but I would prefer not to use an external source/application.
My current, and hopefully temporary, workaround is a local scheduled task that runs a script to pull files from NetSuite and upload them to the FTP server.

In SuiteScript 2.0, unsecured FTP is still not supported, but SS2.0 does have the capability to do SFTP. See http://www.upilioconsulting.com/blog/netsuite-2016-2-sftp-suitescript-2-0/
In SuiteScript 1.0, it's not supported. The workaround is to write middleware (e.g. in PHP) and let the middleware do the FTP transfer.
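As a rough sketch of the SS2.0 SFTP route with the N/sftp module (the file id, password GUID, host key, and paths below are all placeholders you must provision yourself):

    /**
     * @NApiVersion 2.x
     * @NScriptType ScheduledScript
     */
    define(['N/sftp', 'N/file'], function (sftp, file) {
        function execute(context) {
            // Load the file to send from the File Cabinet (id is a placeholder)
            var cabinetFile = file.load({ id: 1234 });

            // passwordGuid and hostKey must be set up ahead of time
            var connection = sftp.createConnection({
                username: 'ftpuser',
                passwordGuid: 'YOUR_PASSWORD_GUID',
                url: 'sftp.example.com',
                port: 22,
                directory: '/outbound',
                hostKey: 'AAAA...'
            });

            // Push the file to the SFTP server, overwriting any existing copy
            connection.upload({
                filename: cabinetFile.name,
                file: cabinetFile,
                replaceExisting: true
            });
        }
        return { execute: execute };
    });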

NetSuite doesn't interact with FTP.
You need a bridge server of some sort that runs a web app (full-blown Apache or nginx running PHP, or just a simple Node service).
Just get a server, install some web server/web service, and POST your files to it (nlapiRequestURL from a scheduled script), as in the sketch below. Have the web app on the bridge server send the files on to the FTP server. If you are using NetSuite, you can afford the cost of the bridge server.
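A minimal SS1.0 sketch of that scheduled-script side, assuming a hypothetical bridge endpoint at https://bridge.example.com/upload and a hard-coded file id:

    // SuiteScript 1.0 scheduled script: POST a File Cabinet file to the bridge
    function sendFileToBridge() {
        var f = nlapiLoadFile(1234); // placeholder internal id
        var payload = JSON.stringify({
            name: f.getName(),
            // getValue() returns plain text for text files, base64 for binary
            // ones; send the type along so the receiver knows whether to decode
            type: f.getType(),
            content: f.getValue()
        });
        var response = nlapiRequestURL(
            'https://bridge.example.com/upload',   // hypothetical bridge URL
            payload,
            { 'Content-Type': 'application/json' } // POST is implied by the payload
        );
        nlapiLogExecution('AUDIT', 'Bridge response', response.getCode());
    }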

One possible solution is to create a saved search on Documents that lists all the files in NetSuite, filtering by createdate or lastmodifieddate. Create a scheduled script to fetch only the new files and save them locally wherever you want.
Note that the file contents will be base64-encoded strings; you need to decode them to obtain the original files.
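In SS1.0 terms, the search half of that might look something like this (the filter and column names for the file search are assumptions; verify them against the Records Browser):

    // Find files modified since yesterday
    var filters = [new nlobjSearchFilter('modified', null, 'onorafter', 'yesterday')];
    var columns = [new nlobjSearchColumn('name'), new nlobjSearchColumn('internalid')];
    var results = nlapiSearchRecord('file', null, filters, columns) || [];

    for (var i = 0; i < results.length; i++) {
        var f = nlapiLoadFile(results[i].getValue('internalid'));
        // f.getValue() is base64 for binary files; decode it when you save locally
    }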

As bknights said, NetSuite doesn't support FTP. You need a web server (any server-side language will do; I have written one in Node.js) to receive the files.
The content of a text file arrives as plain text, so no decoding logic is required there. However, binary/PDF/image and other files will be in base64 format, as NetSuite's JavaScript has no way of handling binary data. So make sure you decode them before you create the files on your FTP server.
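As an illustration, a minimal Node.js receiver for that decode step (Express is assumed here; the payload shape matches the scheduled-script sketch above, and the text-type list is partial):

    const express = require('express');
    const fs = require('fs');
    const path = require('path');

    const app = express();
    app.use(express.json({ limit: '20mb' }));

    app.post('/upload', (req, res) => {
        const { name, type, content } = req.body;
        // NetSuite sends binary files base64-encoded; plain-text types arrive as-is
        const isText = ['PLAINTEXT', 'CSV', 'HTMLDOC'].includes(type); // partial list
        const data = isText ? content : Buffer.from(content, 'base64');
        // path.basename() guards against path traversal in the file name
        fs.writeFileSync(path.join('/tmp', path.basename(name)), data);
        res.sendStatus(200); // hand the decoded file to your FTP client from here
    });

    app.listen(3000);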

Related

How can I transfer Dropbox file data to SQL Table?

I'm getting files four days a week through my Dropbox folder, and I need to add that data to my SQL Server.
In the past I've used FTP to transfer files, but I'm not sure whether FTP will work with Dropbox, and I don't know how to go about it.
I've had some experience with SSIS in the past, and I'm pretty sure SSIS could do this task, but I'm not able to add the Integration Services extension to my SQL Server.
Does anyone have any idea what would be the easiest way to transfer these files to the database?
There are some third-party components that allow you to read from Dropbox:
Kingswaysoft SSIS Dropbox Source Component
CDATA - Dropbox SSIS Components
Or you can use an HTTP connection manager to download the file using the Dropbox API:
http://www.sqlis.com/post/Downloading-a-file-over-HTTP-the-SSIS-way.aspx
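Outside SSIS, the relevant Dropbox API call is a plain HTTPS POST, so any scripting runtime can stage the file for your database load. A sketch in Node 18+ (the access token and paths are placeholders):

    const fs = require('fs');

    async function download(dropboxPath, localPath) {
        // files/download returns the raw file bytes in the response body
        const res = await fetch('https://content.dropboxapi.com/2/files/download', {
            method: 'POST',
            headers: {
                'Authorization': 'Bearer ' + process.env.DROPBOX_TOKEN,
                'Dropbox-API-Arg': JSON.stringify({ path: dropboxPath })
            }
        });
        fs.writeFileSync(localPath, Buffer.from(await res.arrayBuffer()));
    }

    download('/reports/monday.csv', './monday.csv');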

How to upload all the files in a folder to an FTP server from FlowForce Server using the system ftp store method?

I have set up FlowForce Server on my local PC and was able to run a couple of sample jobs. Then I tried setting up an FTP store job by providing the required details, and I was able to upload a specific file to the FTP server as well.
But now I need to know whether there is a way to upload all the files in a particular folder to a target folder on the FTP server. I'd appreciate it if someone with knowledge of FlowForce Server could share their thoughts on this.

Is it possible to rename a file on an FTP server programmatically using Cocoa?

My basic requirement is to add a ".temp" suffix to a file while it is being uploaded to the FTP server (the suffix should remain only until the file is fully uploaded).
As per my understanding, this could be achieved as follows: add the suffix to the file on the local machine, upload it, and after the upload is complete, rename the file on the server to remove the suffix.
But the other problem is that I could not find a way to rename a file on an FTP server using Cocoa. I know renaming a file is feasible in Java and other languages, but I want to achieve the same in Objective-C.
Please tell me whether the above task is feasible, and if it is, what approach I should follow.
FYI: I know how to upload and download a file on an FTP server using NSInputStream and NSOutputStream.
Thanks. I'd appreciate any help.
Take a look at the CFNetwork FTP docs. They do not mention your exact case, but there may be a way to send RNFR and RNTO commands over the stream.
https://developer.apple.com/library/ios/#documentation/Networking/Conceptual/CFNetwork/CFFTPTasks/CFFTPTasks.html#//apple_ref/doc/uid/TP30001132-CH9-SW1
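Whatever the client library, the rename itself is just two commands on the FTP control connection (file names here are illustrative):

    RNFR /incoming/report.csv.temp
    350 Requested file action pending further information.
    RNTO /incoming/report.csv
    250 Rename successful.

So even if a Cocoa wrapper doesn't expose rename directly, writing RNFR followed by RNTO to the control stream achieves the same thing.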
There is also a sample project you can look into:
https://developer.apple.com/library/ios/samplecode/SimpleFTPSample/Listings/Read_Me_About_SimpleFTPSample_txt.html
This library claims to do what you want (rename remote files):
http://www.chilkatsoft.com/ftp-objc.asp

Uploading multiple files given only relative local path

Say I have a user, and that user has an XML file which, among other things, includes the relative (to the XML file) path to one or more images stored on their local machine. I want them to be able to upload this XML file to a web server, and automatically upload the images.
So my XML file might contain:
<tag>Images\img_20120905_015463548.jpg</tag>
and I want to upload both the XML file and img_20120905_015463548.jpg in one operation.
The problem is that, as best I can tell, I can't get a local web page to grab the images automatically using JS/jQuery, due to the pesky web browser security model that won't allow me to upload arbitrary files off the local computer, or even know the real path of the XML file. After bashing my head against a brick wall for a few hours, I've come up with two possible solutions:
1) Upload the XML file; the server strips out the image file addresses and asks the user to locate each one. While it would get the job done, it's ugly and error-prone.
2) Use a batch file (or similar) to copy the XML file and images to a public-facing web server that the user can access on the local network, and then supply the public address of the XML file to my web server. It can then grab the images off the local public server. Problem: my IT department is too competent to allow users file access to public-facing servers. :)
Is there any solution out there I might have missed, that allows the user to upload multiple files given filenames only specified as a relative path?
Thanks in advance. :)
If you are not restricted to a web-only solution, this would be achievable using a plugin or desktop application. For instance, a desktop .NET or Java Web Start application, or a signed and therefore trusted Java applet, would be able to access the local XML file and any associated image files, then upload them to the web server using a POST, web services, or WebDAV.
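As a sketch of that desktop-application route in Node 18+ (the endpoint, the naive <tag> parsing, and the FormData/Blob/fetch globals are all assumptions for illustration):

    const fs = require('fs');
    const path = require('path');

    async function uploadBundle(xmlPath) {
        const xml = fs.readFileSync(xmlPath, 'utf8');
        // Naive extraction of the <tag>...</tag> image paths from the question
        const images = [...xml.matchAll(/<tag>(.*?)<\/tag>/g)].map(m => m[1]);

        const form = new FormData();
        form.append('xml', new Blob([xml]), path.basename(xmlPath));
        for (const rel of images) {
            // Relative paths in the XML are resolved against the XML file's folder
            const abs = path.resolve(path.dirname(xmlPath), rel);
            form.append('images', new Blob([fs.readFileSync(abs)]), path.basename(abs));
        }
        // One multipart request carries the XML plus every referenced image
        await fetch('https://example.com/upload', { method: 'POST', body: form });
    }

    uploadBundle('./bundle.xml');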

How to upload files directly to Amazon S3 from a remote server?

Is it possible to upload a file to S3 from a remote server?
The remote server is basically a URL-based file server. For example, it serves the image at http://example.com/1.jpg. It doesn't do anything else, and I can't run code on this server.
Is it possible to have another server tell S3 to upload a file from http://example.com/1.jpg?
upload from http://example.com/1.jpg
server -------------------------------------------> S3 <-----> example.com
If you can't run code on the server or execute requests, then no, you can't do this. You will have to download the file to a server or computer that you own and upload it from there.
You can see the operations you can perform on Amazon S3 at http://docs.amazonwebservices.com/AmazonS3/latest/API/APIRest.html
Checking the operations for both the REST and SOAP APIs, you'll see there's no way to give Amazon S3 a remote URL and have it grab the object for you. All of the PUT requests require the object's data to be provided as part of the request, meaning the server or computer initiating the web request needs to have the data.
I had a similar problem in the past where I wanted to download my users' Facebook thumbnails and upload them to S3 for use on my site. The way I did it was to download the image from Facebook into memory on my server, then upload it to Amazon S3; the whole round trip took under 2 seconds. After the upload to S3 was complete, I wrote the bucket/key to a database.
Unfortunately there's no other way to do it.
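For what it's worth, that download-then-upload relay is only a few lines with the AWS SDK v3 and Node 18+ (bucket, key, and region below are placeholders):

    const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');

    async function relay(sourceUrl, bucket, key) {
        // Pull the file into memory from the URL-only server...
        const res = await fetch(sourceUrl);
        const body = Buffer.from(await res.arrayBuffer());

        // ...then push it to S3 from the machine we control
        const s3 = new S3Client({ region: 'us-east-1' });
        await s3.send(new PutObjectCommand({ Bucket: bucket, Key: key, Body: body }));
    }

    relay('http://example.com/1.jpg', 'my-bucket', 'images/1.jpg');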
I think the suggestion provided is quite good: you can SCP the file to an AWS instance and push it to the S3 bucket from there. Supplying the .pem file gives you passwordless authentication, and a PHP script can validate the extensions and pass the file as an argument to the scp command.
The only problem with this solution is that you must have your instance in AWS. You can't use this solution if your website is hosted with another hosting provider and you are trying to upload files straight to an S3 bucket.
Technically it's possible using AWS Signature Version 4. Assuming your remote server is the customer in the image below, you could prepare a form on the main server and send the form fields to the remote server for it to curl. Detailed example here.
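A sketch of that idea with the AWS SDK v3 presigned-POST helper (bucket, key, and expiry are placeholders; the remote side then only needs curl):

    const { S3Client } = require('@aws-sdk/client-s3');
    const { createPresignedPost } = require('@aws-sdk/s3-presigned-post');

    async function makeUploadForm() {
        const client = new S3Client({ region: 'us-east-1' });
        const { url, fields } = await createPresignedPost(client, {
            Bucket: 'my-bucket',
            Key: 'uploads/1.jpg',
            Expires: 600 // seconds the form stays valid
        });
        // Ship url + fields to the remote machine; it can then post the file with
        //   curl <url> -F key=... -F policy=... -F x-amz-signature=... -F file=@1.jpg
        console.log(url, fields);
    }

    makeUploadForm();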
You can use the scp command from a terminal.
1) Using the terminal, go to the directory containing the file you want to transfer to the server.
2) Type this:
scp -i yourAmazonKeypairPath.pem fileNameThatYouWantToTransfer.php ec2-user@ec2-00-000-000-15.us-west-2.compute.amazonaws.com:
N.B. Add "ec2-user@" before the ec2-... host name you got from the EC2 website! Forgetting it is such an easy mistake to make!
3) Your file will be uploaded and the progress will be shown. When it reaches 100%, you are done!