Couldn't find an example anywhere, but can you add conditionals inside a cloud-init file, to do this if a particular value is set, else that?
While developing transformations locally, I set my transformation paths to the target folders on my local PC. Once testing is completed locally, I move the transformation to the server repository to schedule it from the server environment, but every time I have to change the paths to point at the server folders. I believe this can be done by creating a dynamic path or some variable, but I am unable to work it out. Is this option available in Pentaho? If yes, can you please help me set up the dynamic path?
In this answer there is a link to a described solution, and in that answer I have a sample KTR that should help.
You can also use the Pentaho properties file to handle different environments: you use the same variable in both environments, say ${path}, but give it a different value in each one.
kettle.properties can be found in your user folder, e.g. C:\Users\YourUser\.kettle
The standard way to handle environments in Kettle is with variables.
In the home directory there is a (hidden) folder named .kettle which contains everything that should stay local: your preferences, your shared connections, your cache and, most of all, THE kettle.properties file.
You can define variables in it, like ${myPath}. To do this, use the menu Edit / Edit the kettle.properties file, add a variable named myPath and give it your preferred path as a value, with an optional description.
Then, when you see a blue diamond with a $ on the right of a field in a step window (which is the case for almost any field you'll need), you can press Ctrl+Enter in the field and choose any variable defined in your kettle.properties. Alternatively, you may type or copy/paste ${your-variable-name} into the field.
Then, when launching Spoon, it will not use a hard-coded path, but the content of the variable from kettle.properties.
And nothing prevents you from having a different kettle.properties on your dev PC and on the prod server.
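For illustration, a minimal sketch of the two files, using the hypothetical ${myPath} variable from above (the folder values are made up):

# kettle.properties on the dev PC
myPath=C:/dev/pentaho/data

# kettle.properties on the prod server
myPath=/srv/etl/data

A step field such as Filename can then be set to ${myPath}/customers.csv, and it resolves to the right folder on each machine without touching the transformation.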
While we are there, a few useful tricks.
There is a predefined ${Internal.Job.Filename.Directory} variable containing the path of the current job (with an equivalent ${Internal.Transformation.Filename.Directory} for transformations), which can be used to build relative paths. For example, ${Internal.Job.Filename.Directory}/../myDir/myFile.ext.
If you right-click anywhere on the canvas and go to Properties/Parameters, you may also define your variables there.
You may also redefine these variables in the Run Options window that annoys you each time you run a transformation (yes, there was a reason for it).
Finally, you can pass these variables from jobs to other jobs and transformations.
I want to use a property ('currentId') which has a certain start value. For each test case the value should be increased by 1. I can do that by adding an extra test step to each test case which increases the value, but that would mean a lot of copy-and-paste. The code for that would be (see reference):
def uniqueUserPortion = testRunner.testCase.testSuite.project.getPropertyValue("currentId")
// convert it to an Integer, and increment
def uniqueUserPortionInc = uniqueUserPortion.toInteger() + 1
// set the property back as string
testRunner.testCase.testSuite.project.setPropertyValue("currentId", uniqueUserPortionInc.toString())
To avoid that copy-and-paste I've added the code above to the Load Script of the project, but it doesn't work:
testSuite.testCases.each {
    // code above
}
What can I do to use the code in one script/call for all test cases?
I could define the property as the start value plus the test case ID, but that would again mean a definition in each test case, since I cannot reference #TestCase#ID in a project-level property.
Issue with what you are trying:
The Load Script of the project is executed once, when you import the project into the SoapUI workspace. So this approach does not work.
As you rightly mentioned, you either need to have it as a separate step in each test case, or you can add the same code as a Setup Script to each of them. Yes, either way it is copy-paste.
This is easy to achieve in SoapUI NG Pro using the Event feature.
Then your next question may be: how to do it in the open source edition of SoapUI?
Here is a soapuiExtensions library which I wrote some time ago and which allows you to do the same in the open source edition, without having to copy-paste into each test case.
All you need to do is put your Groovy script into a specific file called 'TestCaseBeforeRun.groovy'. That means the script is executed before each test case is run.
For more details refer to the README.
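For this question, the increment logic from above would go into that file. A minimal sketch, assuming the extension exposes the same testRunner binding that a test case Setup Script has:

// TestCaseBeforeRun.groovy
// Assumes a testRunner binding, as in a Setup Script.
// Read the project-level counter, increment it, and store it back as a string,
// so every test case that is about to run gets a fresh id.
def project = testRunner.testCase.testSuite.project
def currentId = project.getPropertyValue("currentId").toInteger() + 1
project.setPropertyValue("currentId", currentId.toString())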
This soapuiExtensions library gives users some additional functionality in SoapUI (free edition), similar to what SoapUI Pro offers for doing something before or after certain events.
For example, a user may want to do something before or after running a test case, by implementing the appropriate Groovy script. Allow me to add an example here: a user may want to add credentials to a request step automatically; see the script samples/scripts/TestSuiteTestStepAdded.groovy.
How to use this library:
Set the SOAPUI_HOME environment variable.
Copy the lib/SoapUIExtListeners.jar file into the $SOAPUI_HOME/bin/ext directory.
Copy the samples/listeners/custom-listeners.xml file into the $SOAPUI_HOME/bin/listeners directory.
Copy the samples/scripts directory under $SOAPUI_HOME (see the layout sketch below).
Then implement the appropriate Groovy script available under $SOAPUI_HOME/scripts; refer to the Mappings file to see which Groovy script to implement.
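For orientation, after these steps the layout should look roughly like this (a sketch; TestCaseBeforeRun.groovy stands in for whichever scripts you implement):

$SOAPUI_HOME/
    bin/ext/SoapUIExtListeners.jar
    bin/listeners/custom-listeners.xml
    scripts/TestCaseBeforeRun.groovy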
Note: Windows users may need to check %SOAPUI_HOME%\bin\soapui.bat, which actually overwrites SOAPUI_HOME; fix the soapui.bat script if required.
Uses JDK 7, SoapUI 4.5.1 and Groovy 1.8.9.
Dependency
log4j
UPDATE: this is related to the note above.
As mentioned in the note, soapui.bat overrides the SOAPUI_HOME environment variable on Windows, so it needs to be tweaked a bit. Alternatively, you may want to copy the Groovy file under %SOAPUI_HOME%\bin\scripts (this works without tweaking soapui.bat) and retry. If your machine is Linux, it should work if you copy the Groovy file under the $SOAPUI_HOME/scripts directory.
I have a situation where I would like to ignore specific folders inside the location where Flyway looks for migration files.
Example
/db/Migration
    2.0-newBase.sql
    /oldScripts
        1.1-base.sql
        1.2-foo.sql
I want to ignore everything inside the 'oldScripts' subfolder. Is there a flag I can set in the Flyway configs, like ignoreFolder=SOME_FOLDER or scanRecursive=false?
An example of why I would do this: say I have 1000 scripts in my migration folder. If we onboard a new member, instead of having them run the migration on 1000 files, they could just run the one script (the new base) and proceed from there. The alternative would be to never sync those files in the first place, but then people would need to remember to check source control for prior migrations instead of just looking at their local drive.
This is not currently supported directly. You could put both directories at the same level in the hierarchy (without nesting them) and selectively configure flyway.locations to achieve the same thing.
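A sketch of what that could look like for the layout in the question (folder names are hypothetical; note that Flyway scans each configured location recursively, so the old scripts must not sit underneath the scanned folder):

/db/Migration
    2.0-newBase.sql
/db/oldScripts
    1.1-base.sql
    1.2-foo.sql

# flyway.conf: only the current folder is scanned
flyway.locations=filesystem:db/Migration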
Since Flyway 6.4.0, wildcards are supported in flyway.locations. Examples:
db/**/test (** matches any number of intermediate directories)
db/release1.* (* matches zero or more characters)
db/release1.? (? matches exactly one character)
More info at https://flywaydb.org/blog/organising-your-migrations
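For completeness, a hedged example of how such a pattern sits in a config file, reusing the first pattern from the list above (the path itself is made up):

# flyway.conf: pick up any 'test' folder at any depth under db/
flyway.locations=filesystem:db/**/test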
When we run a *.vbs file, we see "wscript.exe" among the processes. We can change this "wscript.exe" to a custom name by creating a shortcut and executing the shortcut.
Is it possible to display the current *.vbs file name in the process list, without using shortcuts?
No. Your script is running in an interpreter, and it's the interpreter executable name that is being displayed in the process list.
While it's not impossible to change the process name, you'd need admin privileges to be able to do this, and you'd need to rewrite the interpreter (wscript.exe) to actually do it. See this answer to a similar question.
Okay, I'll try to explain as well as I can... quite a particular case.
Tools: SSIS 2008
We have a control flow that now needs to be triggered by an event: the presence of one or multiple files (1, 2 or 3).
The variables used:
BO_FileLocation_1
BO_FileLocation_2
BO_FileLocation_3
BO_FileName_1
BO_FileName_2
BO_FileName_3
There can be one, two or three files, defined in the variables above. When they are filled in,
they should be processed. When they are empty (meaning there is just one file), the process should ignore them and jump to the next (file watcher?) task.
For example:
BO_FileLocation_1= "C:\"
BO_FileLocation_2 NULL
BO_FileLocation_3 NULL
BO_FileName_1= "test.csv"
BO_FileName_2 NULL
BO_FileName_3 NULL
The report only needs one file.
I need a generic concept that checks for the presence of these files; it may need to be more generic than my SSIS knowledge can handle right now, for example to cope with a 4th file in the future. I was also thinking of using a single script to hold all the logic.
Thanks in advance
If all you want is to trigger the Copy Source File task when one or more of the files is present, just use the OR constraint in your flow:
First connect all tasks to the destination.
Then click one of the green arrows. This will make its properties window pop up; select Logical OR instead of Logical AND.
If everything went well, you should now see the connections drawn as dashed lines.
There are several possible solutions:
Create a sequence container and include all the file imports in the sequence container. Add int variables RowCountFile1, RowCountFile2 and RowCountFile3 and leave their value at 0 (the default when you create an int variable). Add a Row Count transformation to each of the data flows. Create a precedence constraint from the sequence container to the "Do something" task, set the precedence constraint to Success and Expression, and set the expression to @[User::RowCountFile1] > 0 || @[User::RowCountFile2] > 0 || @[User::RowCountFile3] > 0. The advantage of this approach is that you can take an action as soon as the files are detected, you import all available files, and you only take the action after all the files have been imported. You could then schedule this SSIS package as a SQL Server Agent job step and run it as frequently as you want.
A variant on solution 1 is to use Foreach Loop containers with a file enumerator inside the sequence container. This would be useful if you don't know the exact name of the file and you expect to import more than one under some circumstances. For instance, if you get a file every few minutes with a timestamp in its file name and your process doesn't run for some reason, then you may have to process multiple files to get caught up and then take an action once that has been done.
You could use the File Watcher task as you outlined in your question. The only problem I have with the File Watcher task is that the package has to be in a constantly running state. This makes it hard to troubleshoot problems and performance. It can also introduce other problems, since I remember having some issues with the File Watcher task years ago when it first came out. It may well be a totally stable task now, but I prefer other methods after having been burned previously. If you really want the package to run continuously instead of having it be called by a job, then you could always use a Script Task to check for the file, sleep the thread if it is not found, check again, and so on. I'm sure that's what the File Watcher task does, but I would trust my own C# over the task. Power to anyone who has had better experiences than me with the File Watcher...
Use PowerShell. If you just want to take an action when a file appears and you aren't importing the data, then a PowerShell script could do this just as well as an SSIS package. The drawbacks are that you have to learn some basic PowerShell, it may be hard to maintain in the future since PowerShell is probably not your bread-and-butter language, and you may have to rewrite the code as an SSIS package anyway if you later want to import the data. You would probably call the PowerShell script from a SQL Server Agent job step, so scheduling can be handled pretty easily.
There are more options than what I listed, so let me know if you still want more suggestions.