Downloading only new files in GoodData

How can I use the "Download File" component to only download new files or files that have been updated remotely?
Consider a graph whose File Download component reads from ${S3_OR_DATA_DIR_LOCATION}. I have many *.csv files in that location, and I'm adding one more every day.
How can I make sure GoodData only downloads new files AND files that have been updated? Would making the option "Overwrite existing files" False do it? Or would that only download new files and not update existing files that have been updated?

The File Download CloudConnect component by itself cannot download only the new files that have appeared in the source folder, because it has no mechanism for remembering previous state. However, since it has an input port, you can implement such a mechanism yourself using the File List CloudConnect component, with a little help from the Reformat, Joiner, and CSV Writer components. This way you can determine the contents of the source folder and write that listing to a plain text file. The mechanism can be designed so that the next run reads the state file from the previous run, determines which files are new, and then sends the list of new files to the File Download component's input port.
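Outside CloudConnect, the core of that state-remembering mechanism is just a diff between the current folder listing and the listing saved on the previous run. Here is a minimal C# sketch of the idea; the folder path and state-file name are hypothetical, and in the actual graph this logic is carried by the File List, Joiner, and CSV Writer components:

```csharp
// Illustrative sketch only, not CloudConnect code: diff the current folder
// listing against the listing recorded by the previous run.
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

class NewFileDetector
{
    static void Main()
    {
        const string sourceDir = "/data/source";    // hypothetical source folder
        const string stateFile = "/data/state.txt"; // files seen on the last run

        // Read the file names recorded by the previous run (empty on first run).
        var previouslySeen = File.Exists(stateFile)
            ? new HashSet<string>(File.ReadAllLines(stateFile))
            : new HashSet<string>();

        // Anything present now that is not in the state file is "new".
        var current = Directory.GetFiles(sourceDir, "*.csv")
                               .Select(Path.GetFileName)
                               .ToList();
        foreach (var name in current.Where(f => !previouslySeen.Contains(f)))
            Console.WriteLine($"new file to download: {name}");

        // Persist the current listing so the next run can diff against it.
        File.WriteAllLines(stateFile, current);
    }
}
```

Note that a name-only state file catches new files but not files updated in place; to catch updates as well, the state file would also need to record something like each file's last-modified timestamp, so that a changed timestamp marks the file for re-download.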
Another way to process only new files, which is much simpler than the mechanism described above and therefore commonly used, is based on taking advantage of folder structure in the source location: one dedicated folder holds new files, and another dedicated folder holds already-processed files. The CloudConnect ETL process reads new files from the dedicated source folder, and the last stage of the process uses the File Copy/Move CloudConnect component to transfer the files it has just processed from that folder into the folder containing all previously processed files.
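Sketched outside CloudConnect (with hypothetical paths), the dedicated-folders pattern looks like this; in the actual graph, the final move is done by the File Copy/Move component:

```csharp
// The "incoming/processed" pattern: the ETL only ever sees unprocessed files,
// because each handled file is moved out of the incoming folder afterwards.
using System.IO;

class MoveProcessed
{
    static void Main()
    {
        const string incoming  = "/data/incoming";  // new files land here
        const string processed = "/data/processed"; // handled files end up here

        foreach (var path in Directory.GetFiles(incoming, "*.csv"))
        {
            // ... process 'path' here ...
            File.Move(path, Path.Combine(processed, Path.GetFileName(path)));
        }
    }
}
```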

Related

How to use "Copy Files to:" task two time in same pipeline in Azure Devops

I have created an Azure pipeline in which I added two "Copy Files to" tasks to copy reports from two different folders, but the second one is not working; it only copies what the previous task copied.
First, check the build log to confirm that the report files are generated and which directory they are located in.
If the Copy Files task failed to copy the files, most likely the Source Folder or the file patterns in the Contents field are wrongly specified, which causes the files not to be found.
So check that the files you want to copy reside in the Source Folder you specified, and make sure the file paths match the patterns you specified in the Contents field. See the example below.
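As a rough sketch, two independent CopyFiles tasks in pipeline YAML might look like this; the folder names and the *.html pattern are placeholders, and the point is that each task needs its own SourceFolder, Contents, and TargetFolder:

```yaml
steps:
# First report folder: its own SourceFolder, Contents pattern, and TargetFolder.
- task: CopyFiles@2
  displayName: Copy reports from folder A
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/ReportsA'
    Contents: '**/*.html'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/ReportsA'

# Second report folder: configured independently of the first task.
- task: CopyFiles@2
  displayName: Copy reports from folder B
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/ReportsB'
    Contents: '**/*.html'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/ReportsB'
```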
See the Copy files task documentation to learn more about its usage.

How to overwrite Files for Update Procedure

I'm trying to create an updater for my app in VB.NET. No, I do not want to use ClickOnce; it's a pain because I have to deal with managing self-signed certs, etc.
I know the code to check for new update files:
http://pastebin.com/ZjYBWABu
I also know the code for specifying where those files download to. The issue is that I don't want to download just one .exe; I want to download all the latest build files, which I would have uploaded to my server from the Bin\release folder of my project.
Then, when the updater has downloaded the files to a directory, it would go to the application's directory and somehow overwrite/replace all the files that have changed, maybe by using a hash or something?
I do not know how to proceed with this. What I do know is this:
The updater and the main app would have to be separate, so that the updater can do the replacing while the app is closed and doesn't hit file-in-use errors. After the updater has finished, it would then start the main app from the new exe.
Would appreciate help here, thank you.
I am currently working on a project for which I have to implement a similar approach to updates. The project is lengthy and will take some time to finish, but this is how I have planned to apply the updates:
There will be two main parts of the application: the Launcher (the main application program) and the Updater (which downloads files from the server, replaces the old ones with the new ones, and then launches the new file).
The application will have the option to check for updates manually and also to check for updates on startup.
If an update is available, it asks the user to apply the update now or later.
If the user selects to apply the update now, the Updater application is executed in a separate process, and then the Launcher application is closed from within the Launcher's own code. I have the following approaches in mind for launching another program from within the first one and then exiting:
Execute the Updater directly from within the Launcher using Process.Start
If that causes problems, then as a second approach, launch a command prompt from Process.Start, execute the other program (the Updater) from the command prompt, close the command prompt, and then exit the Launcher.
The Updater application then downloads all the relevant files from the server, and upon completion the old application files are replaced with the new ones.
The update-availability information from the server will include the new Version_No of the application. To provide all files for the update, I will compress (zip) them into a single file named Application.Version_No (as given by the server).
Upon download completion, decompress (unzip) them into a folder with that same Application.Version_No name.
After decompressing, all the files in this (Application.Version_No) folder will be copied to the application's Bin folder.
The new application Launcher file is executed in a separate process, and the Updater application is closed from within the Updater's own code.
I have NOT yet tried this scenario, as my focus is currently on completing the main application, but surely this must work.
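A minimal C# sketch of the hand-off described above, assuming hypothetical file names (Updater.exe, Launcher.exe, the downloaded build zip); the question is VB.NET, but the Process.Start pattern is the same:

```csharp
// Launcher -> Updater -> new Launcher hand-off. Error handling omitted.
using System;
using System.Diagnostics;
using System.IO;
using System.IO.Compression;

class Handoff
{
    // In the Launcher: start the Updater, then exit so none of the
    // application's files are in use while they are being replaced.
    static void LaunchUpdaterAndExit()
    {
        Process.Start("Updater.exe");
        Environment.Exit(0);
    }

    // In the Updater: unzip the downloaded Application.Version_No archive
    // over the Bin folder, then start the new Launcher and exit.
    static void ApplyUpdateAndRelaunch(string zipPath, string binFolder)
    {
        // The overwrite overload needs .NET Core 2.0+/.NET 5+; on older
        // frameworks, extract to a temp folder and copy over instead.
        ZipFile.ExtractToDirectory(zipPath, binFolder, overwriteFiles: true);
        Process.Start(Path.Combine(binFolder, "Launcher.exe"));
        Environment.Exit(0);
    }
}
```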
UPDATE:
Another approach to checking for updates is to use a bootstrapper-like application entry point. It is the main entry point of the program: upon execution it checks for updates, and if there are none, the Launcher is executed; otherwise it downloads the files, replaces the old ones, and then executes the new/updated Launcher.
For copying / overwriting the files
One approach is to include in the compressed (zip) file only those files that need to replace old ones; after the download completes, either decompress them directly into the Bin folder, or decompress them into a designated folder and then copy them all to the Bin folder.
Another approach, which is somewhat lengthier, is to prepare an additional helper file (XML, text, or any other format) for the download.
This helper file contains information about the updated files, such as the version number of each file and the location where it is to be copied.
The files may be downloaded to a specific folder named after the new application version.
After downloading all the required files to that folder, process each file mentioned in the helper file: compare the version of every old file with the newly downloaded one, and if the downloaded one is later, replace the old file in the folder specified by the helper file.
As an additional step in between, all downloads may be verified prior to copying and replacing.
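A sketch of that helper-file comparison in C#, assuming a made-up manifest format of one "relativePath|version" entry per line; the real helper file could just as well be XML:

```csharp
// Replace an installed file only when the downloaded copy carries a
// higher version, as recorded in the helper (manifest) file.
using System;
using System.Diagnostics;
using System.IO;

class ManifestUpdater
{
    static void Apply(string manifestPath, string downloadDir, string binDir)
    {
        foreach (var line in File.ReadAllLines(manifestPath))
        {
            var parts = line.Split('|');          // "relativePath|version"
            var relPath = parts[0];
            var newVersion = Version.Parse(parts[1]);

            var installed = Path.Combine(binDir, relPath);
            var downloaded = Path.Combine(downloadDir, relPath);

            // Treat a missing or unversioned installed file as 0.0.0.0.
            var info = File.Exists(installed)
                ? FileVersionInfo.GetVersionInfo(installed).FileVersion
                : null;
            var currentVersion = info != null
                ? Version.Parse(info)
                : new Version(0, 0, 0, 0);

            if (newVersion > currentVersion)
                File.Copy(downloaded, installed, overwrite: true);
        }
    }
}
```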
I built an updater that ships with a daemon. Main project here:
https://github.com/UVLabs/dotNetUpdatify
There should be a way to eliminate the use of the daemon; if I figure it out, I will update this answer.

Flowgear access to files on the local file system

I am creating a Flowgear workflow that needs to process a raft of XML data.
I have the XML data contained in a set of .xml files (approximately 400 of them) in a folder on my local machine's hard drive, and I want to read them into a workflow, run an XSLT transform, and then write the resulting XML to another folder on the same local hard drive.
How do I get the flowgear workflow to read these files?
It depends on the use case. The File Enumerator works exceptionally well to loop (as in a for-each) through each file. Sometimes one wants to get a list of files in a particular folder and check whether a file has been found or not; for this, I would recommend a C# script that gets the list of files with:
Directory.GetFiles(@"{FilePath}", "*.{extension}", SearchOption.TopDirectoryOnly);
Further on, use the File node to read, write, or delete files from a file directory.
NB! You will need to install a DropPoint on the PC/server to allow access to the files. For more information regarding DropPoints, see the Flowgear documentation.
You can use a File Enumerator or File Watcher to read the files in. The difference is that a File Enumerator enumerates all files in a folder once, while the File Watcher watches a folder indefinitely and provides new files to the workflow as they are copied into the folder.
You can then use the File node to write the files back to the file system.
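For reference, the enumerate / transform / write loop the question describes looks roughly like this as plain C# (the paths and stylesheet name are placeholders); inside Flowgear, the File Enumerator and File nodes plus an XSLT step do the same work:

```csharp
// Enumerate the input XML files, run one compiled XSLT over each,
// and write the results to the output folder.
using System.IO;
using System.Xml.Xsl;

class TransformAll
{
    static void Main()
    {
        var xslt = new XslCompiledTransform();
        xslt.Load(@"C:\transforms\my-transform.xslt"); // compiled once, reused

        foreach (var input in Directory.GetFiles(@"C:\xml\in", "*.xml"))
        {
            var output = Path.Combine(@"C:\xml\out", Path.GetFileName(input));
            xslt.Transform(input, output); // read, transform, write
        }
    }
}
```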

How to upload files to the "SuiteScripts" folder from Netsuite IDE (Eclipse)

I am fairly new to NetSuite and NetSuite scripting. My company has several dozen script files already in the NetSuite File Cabinet, under the default SuiteScripts folder. Also, I am using the SuiteCloud IDE, which is just basically Eclipse with a NetSuite plugin. This way I can download all of the scripts into a single SuiteCloud IDE project, work on them locally, and then upload them back to the server for testing.
When you create a new NetSuite project, one of the project settings is File Cabinet Folder. This defaults to a subdirectory under "SuiteScripts" with the same name as your project. For example, if your project is called "MyScripts", the default will be SuiteScripts/MyScripts. You can of course change this, but it is impossible to just specify the SuiteScripts folder alone, as I get an error saying "File Cabinet folder must have 2 segments." However, the existing scripts all live under SuiteScripts (no subdirectory). Any file that I upload to the server, whether it be a new file that I created locally or even a previously downloaded file that already exists in the File Cabinet, will end up in SuiteScripts/MyScripts. This can be hugely problematic, causing dupes and all kinds of other nastiness. Anyone have any experience with this? Thanks.
Yes, NetSuite has decided to limit the uploading functionality to subfolders of SuiteScripts. If I had to guess, their intention there is to force you to place your scripting projects in their own folders so that the SuiteScripts folder itself does not get cluttered with scripts.
You can specify a subfolder of SuiteScripts with any name; it does not have to be the name of your Eclipse project. You have a couple options, depending on how you want your files to be organized in Eclipse and in the File Cabinet.
The way we typically do it is to create a single folder that will house all of our scripts; call it SuiteScripts/Projects/. In the File Cabinet, we create this Projects folder under SuiteScripts. In Eclipse's NetSuite Project Settings, we map our Eclipse project to SuiteScripts/Projects. In our Eclipse project, we group related source files logically into folders, like iPad Integration or Approval Process. Then we upload to the File Cabinet, and now we have a nice folder structure of organized scripts, something like:
SuiteScripts
    Projects
        iPad Application
            iPadScript.js
            iPadRESTlet.js
        Approval Process
            SalesOrderApproval.js
            PurchaseOrderApproval.js
We have much more detailed naming standards for our files, but you get the picture.
My recommendation is to create a new folder in your SuiteScripts folder and move all existing scripts into there using the File Cabinet's "Move" button. Then, map your SuiteCloud Project to that new folder and upload/download as needed.
I agree with erictgrubaugh's solution, and I've been following Stoic Software's tutorials. But @steavepoll, if you want to change it for only one script, you can follow these steps:
Create a new SuiteCloud Project under the same folder you are targeting
Edit the manifest.xml file (right click -> NetSuite -> Add Dependency References to Manifest)
Validate Project against Account
Deploy
It worked in my case

How to store files in an EXE file via vb 2010 app

I need to store files in an EXE file via a VB 2010 app. I mean, let's say I made a program called setmaker and one called setup.exe. I want setmaker to store some files that you choose inside setup.exe, and then when you run setup.exe, it reads the files stored in it and extracts them to a location you specify.
The following post, How to copy file From Resources?, discusses a method for copying a file that is an embedded resource in an EXE to a folder on disk.
Hope that helps.
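As a rough C# sketch of that embedded-resource approach (the question is VB 2010, but the same framework calls exist there), with a hypothetical resource name that must match the project's default namespace plus the embedded file's name:

```csharp
// Copy a file that was embedded in the EXE as an Embedded Resource
// out to a location on disk.
using System.IO;
using System.Reflection;

class ResourceExtractor
{
    static void Extract(string resourceName, string destinationPath)
    {
        var asm = Assembly.GetExecutingAssembly();
        using (var input = asm.GetManifestResourceStream(resourceName))
        using (var output = File.Create(destinationPath))
        {
            input.CopyTo(output); // write the embedded bytes to disk
        }
    }

    static void Main()
    {
        // "Setup.Payload.dat" is hypothetical: default namespace + file name.
        Extract("Setup.Payload.dat", @"C:\Extracted\Payload.dat");
    }
}
```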