How to execute a jar file from Pentaho ETL

I need to run a jar file from Pentaho ETL. I have placed my Testvmarguments.jar file in
server/data-integration-server/tomcat/lib
and created a job with a Shell script step to execute the jar file.
Below is my log:
INFO 02-01 17:05:59,002 - ImageImporter - Start of job execution
INFO 02-01 17:05:59,007 - ImageImporter - Starting entry [Shell]
INFO 02-01 17:05:59,008 - Shell - Running on platform : Linux
INFO 02-01 17:05:59,008 - Shell - Executing command : /home/Myname/MyFolder/dummy.txt
INFO 02-01 17:05:59,014 - Shell - (stderr) Unable to access jarfile Testvmarguments.jar
INFO 02-01 17:05:59,015 - ImageImporter - Finished job entry [Shell] (result=[false])
INFO 02-01 17:05:59,015 - ImageImporter - Job execution finished
INFO 02-01 17:05:59,017 - Kitchen - Finished!
ERROR 02-01 17:05:59,017 - Kitchen - Finished with errors
INFO 02-01 17:05:59,017 - Kitchen - Start=2014/01/02 17:05:56.504, Stop=2014/01/02 17:05:59.017
INFO 02-01 17:05:59,017 - Kitchen - Processing ended after 2 seconds.
Can someone help me overcome the above error and run the jar file in the DI environment?
Please find the attachment at link
The reason for using a shell script is to execute the jar with runtime parameters:
java -Dfilepath=/home/Myfolder/Myname/Test -Dname=Myname -jar Testvmarguments.jar
Here is my piece of code:
package com.alliance.test;

import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;

public class TestCommandLine {
    public static void main(String[] args) throws Exception {
        String filename = null;
        String employeeName = null;
        // Read the values passed as -D JVM arguments
        if (System.getProperty("filepath") != null) {
            filename = System.getProperty("filepath");
        }
        if (System.getProperty("name") != null) {
            employeeName = System.getProperty("name");
        }
        // Write both values to <filepath>Test.txt
        File file = new File(filename + "Test.txt");
        if (!file.exists()) {
            file.createNewFile();
        }
        FileWriter fw = new FileWriter(file);
        BufferedWriter out = new BufferedWriter(fw);
        out.write(filename + "\n");
        out.write(employeeName);
        out.close();
        fw.close();
    }
}
Thanks,
Surya

Don't call it as an external process, as then you're starting a new JVM unnecessarily. Instead, just add the jar to the lib folder and call it from a JavaScript step or a UDJC (User Defined Java Class), whichever you prefer.

Try it after putting the jar inside the data-integration directory (server/data-integration-server/tomcat).

I copied the jar to the libext folder and used the following statement in the JS step:
var testval2 = org.wtc.pentaho.PentahoSample.testSample("surya");
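Following that approach, the logic from TestCommandLine above could be exposed as a plain static method so the JavaScript step (or a UDJC) can pass the parameters directly instead of relying on -D JVM arguments. Below is a minimal sketch, assuming the jar is on Kettle's classpath; the class name TestCommandLineHelper is hypothetical and not part of the original code.
package com.alliance.test;

import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

// Hypothetical wrapper around the TestCommandLine logic so it can be called
// in-process from a Pentaho JavaScript step or UDJC instead of via "java -jar".
public class TestCommandLineHelper {

    public static String writeTestFile(String filepath, String employeeName) throws IOException {
        File file = new File(filepath + "Test.txt");
        if (!file.exists()) {
            file.createNewFile();
        }
        BufferedWriter out = new BufferedWriter(new FileWriter(file));
        try {
            out.write(filepath + "\n");
            out.write(employeeName);
        } finally {
            out.close();
        }
        return file.getAbsolutePath();
    }
}
In the JavaScript step it could then be called the same way as the testSample example, for instance var written = com.alliance.test.TestCommandLineHelper.writeTestFile('/home/Myfolder/Myname/Test', 'Myname'); (the values come from the command shown in the question).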


Karate: How to test multipart form-data endpoint? [duplicate]

This question already has an answer here:
How to upload CSV file as a post request in Karate 0.9.0 version?
(1 answer)
Closed 2 years ago.
I have a file upload endpoint (/document) in a controller defined as follows:
@RestController
public class FileUploadController {

    @Autowired
    private PersonCSVReaderService personCSVReaderService;

    @PostMapping(value = "/document", consumes = {MediaType.MULTIPART_FORM_DATA_VALUE})
    public void handleFileUpload3(@RequestPart("file") MultipartFile file, @RequestPart("metadata") DocumentMetadata metadata) {
        System.out.println(String.format("uploading file %s of %s bytes", file.getOriginalFilename(), file.getSize()));
        personCSVReaderService.readPersonCSV(file, metadata);
    }
}
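The DocumentMetadata type bound to the "metadata" part is not shown in the question. For reference, a minimal sketch consistent with the JSON used in the tests below ({ name: "joe", description: "stuff" }) could look like this; the field names are inferred, not confirmed by the question.
// Hypothetical DTO for the "metadata" request part; fields inferred from the
// JSON { name, description } used in the Karate scenarios below.
public class DocumentMetadata {
    private String name;
    private String description;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public String getDescription() { return description; }
    public void setDescription(String description) { this.description = description; }

    @Override
    public String toString() {
        return "DocumentMetadata{name='" + name + "', description='" + description + "'}";
    }
}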
I can test this endpoint using Advanced Rest Client (ARC) or Postman by defining the "file" part referencing the people.csv file and a text part specifying some sample metadata JSON.
Everything works fine and I get a 200 status back with the people.csv file contents being written to the console output by the service method:
uploading file people.csv of 256 bytes
{Address=1, City=2, Date of Birth=6, Name=0, Phone Number=5, State=3, Zipcode=4}
Person{name='John Brown', address='123 Main St.', city='Scottsdale', state='AZ', zipcode='85259', phoneNumber='555-1212', dateOfBirth='1965-01-01'}
Person{name='Jan Black', address='456 University Dr.', city='Atlanta', state='GA', zipcode='30306', phoneNumber='800-1111', dateOfBirth='1971-02-02'}
Person{name='Mary White', address='789 Possum Rd.', city='Nashville', state='TN', zipcode='37204', phoneNumber='888-2222', dateOfBirth='1980-03-03'}
Now, I want to run this as an automated Karate test. I have specified a MockConfig:
@Configuration
@EnableAutoConfiguration
@PropertySource("classpath:application.properties")
public class MockConfig {

    // Services ...
    @Bean
    public PersonCSVReaderService personCSVReaderService() {
        return new PersonCSVReaderService();
    }

    // Controllers ...
    @Bean
    public FileUploadController fileUploadController() {
        return new FileUploadController();
    }
}
I also have a MockSpringMvcServlet in the classpath and my karate-config.js is:
function fn() {
    var env = karate.env; // get system property 'karate.env'
    if (!env) {
        env = 'dev';
    }
    karate.log('karate.env system property was:', env);
    var config = {
        env: env,
        myVarName: 'someValue',
        baseUrl: 'http://localhost:8080'
    }
    if (env == 'dev') {
        var Factory = Java.type('MockSpringMvcServlet');
        karate.configure('httpClientInstance', Factory.getMock());
        //var result = karate.callSingle('classpath:demo/headers/common-noheaders.feature', config);
    } else if (env == 'stg') {
        // customize
    } else if (env == 'prod') {
        // customize
    }
    return config;
}
Other Karate tests run fine using the mock servlet.
However, when I run this test against the /document endpoint:
Feature: file upload end-point
Background:
  * url baseUrl
  * configure lowerCaseResponseHeaders = true
Scenario: upload file
  Given path '/document'
  And header Content-Type = 'multipart/form-data'
  And multipart file file = { read: 'people.csv', filename: 'people.csv', contentType: 'text/csv' }
  And multipart field metadata = { name: "joe", description: "stuff" }
  When method post
  Then status 200
I get this error:
16:14:42.674 [main] INFO com.intuit.karate - karate.env system property was: dev
16:14:42.718 [main] INFO o.s.mock.web.MockServletContext - Initializing Spring DispatcherServlet ''
16:14:42.719 [main] INFO o.s.web.servlet.DispatcherServlet - Initializing Servlet ''
16:14:43.668 [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration' of type [org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration$$EnhancerBySpringCGLIB$$a4c7d08f] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
16:14:43.910 [main] INFO o.h.validator.internal.util.Version - HV000001: Hibernate Validator 6.0.14.Final
16:14:44.483 [main] INFO o.s.s.c.ThreadPoolTaskExecutor - Initializing ExecutorService 'applicationTaskExecutor'
16:14:44.968 [main] INFO o.s.b.a.e.web.EndpointLinksResolver - Exposing 2 endpoint(s) beneath base path '/actuator'
16:14:45.006 [main] INFO o.s.web.servlet.DispatcherServlet - Completed initialization in 2287 ms
16:14:45.066 [main] INFO c.i.k.mock.servlet.MockHttpClient - making mock http client request: POST - http://localhost:8080/document
16:14:45.085 [main] DEBUG c.i.k.mock.servlet.MockHttpClient -
1 > POST http://localhost:8080/document
1 > Content-Type: multipart/form-data
16:14:45.095 [main] ERROR com.intuit.karate - http request failed: null
file-upload.feature:13 - null
HTML report: (paste into browser to view) | Karate version: 0.9.2
I can only assume that the arguments did not conform to what my endpoint was expecting - I never entered the endpoint in debug mode.
I tried this:
And multipart file file = read('people.csv')
And multipart field metadata = { name: "joe", description: "stuff" }
But that was a non-starter as well.
What am I doing wrong? The people.csv file is in the same folder as fileupload.feature, so I am assuming it is finding the file. I also looked at the upload.feature file in the Karate demo project given here:
Karate demo project upload.feature
But I could not make it work. Any help is appreciated. Thanks in advance.
EDIT UPDATE:
I could not get the suggested answer to work.
Here is the feature file:
Feature: file upload
Background:
  * url baseUrl
  * configure lowerCaseResponseHeaders = true
Scenario: upload file
  Given path '/document'
  And header Content-Type = 'multipart/form-data'
  * def temp = karate.readAsString('people.csv')
  * print temp
  And multipart file file = { value: '#(temp)', filename: 'people.csv', contentType: 'text/csv' }
  And multipart field metadata = { value: {name: 'joe', description: 'stuff'}, contentType: 'application/json' }
  When method post
  Then status 200
And here is the console output from running that test:
09:27:22.051 [main] INFO com.intuit.karate - found scenario at line: 7 - ^upload file$
09:27:22.156 [main] INFO com.intuit.karate - karate.env system property was: dev
09:27:22.190 [main] INFO o.s.mock.web.MockServletContext - Initializing Spring DispatcherServlet ''
09:27:22.190 [main] INFO o.s.web.servlet.DispatcherServlet - Initializing Servlet ''
09:27:23.084 [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration' of type [org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration$$EnhancerBySpringCGLIB$$a4c7d08f] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
09:27:23.327 [main] INFO o.h.validator.internal.util.Version - HV000001: Hibernate Validator 6.0.14.Final
09:27:23.896 [main] INFO o.s.s.c.ThreadPoolTaskExecutor - Initializing ExecutorService 'applicationTaskExecutor'
09:27:24.381 [main] INFO o.s.b.a.e.web.EndpointLinksResolver - Exposing 2 endpoint(s) beneath base path '/actuator'
09:27:24.418 [main] INFO o.s.web.servlet.DispatcherServlet - Completed initialization in 2228 ms
09:27:24.435 [main] INFO com.intuit.karate - [print] Name,Address,City,State,Zipcode,Phone Number,Date of Birth
John Brown,123 Main St.,Scottsdale,AZ,85259,555-1212,1965-01-01
Jan Black,456 University Dr.,Atlanta,GA,30306,800-1111,1971-02-02
Mary White,789 Possum Rd.,Nashville,TN,37204,888-2222,1980-03-03
09:27:24.482 [main] INFO c.i.k.mock.servlet.MockHttpClient - making mock http client request: POST - http://localhost:8080/document
09:27:24.500 [main] DEBUG c.i.k.mock.servlet.MockHttpClient -
1 > POST http://localhost:8080/document
1 > Content-Type: multipart/form-data
09:27:24.510 [main] ERROR com.intuit.karate - http request failed: null
file-upload.feature:14 - null
HTML report: (paste into browser to view) | Karate version: 0.9.2
Note: the people.csv file reads successfully and prints to the console.
Refer to this part of the docs: https://github.com/intuit/karate#read-file-as-string
So make this change:
* def temp = karate.readAsString('people.csv')
And multipart file file = { value: '#(temp)', filename: 'people.csv', contentType: 'text/csv' }
EDIT: my bad, also refer: https://github.com/intuit/karate#multipart-file
Feature: upload csv
Background:
  And def admin = read('classpath:com/project/data/adminLogin.json')
Scenario:
  Given url baseUrl
  And header Authorization = admin.token
  And multipart file residentDetails = { read:'classpath:com/project/data/ResidentApp_Details.csv', filename: 'ResidentApp_Details.csv' }
  When method POST
  Then status 200
Note: the only extra line to add is: And multipart file residentDetails = { read:'classpath:com/project/data/ResidentApp_Details.csv', filename: 'ResidentApp_Details.csv' }

mbrola binary for Linux CentOS

I am trying to use the mbrola binary on a CentOS box. I tried many of the binaries listed on the page below, but none of them works:
http://www.tcts.fpms.ac.be/synthesis/mbrola/mbrcopybin.html
I am getting the following error:
Processing Utterance: com.sun.speech.freetts.ProcessException: Cannot start mbrola program:
I believe the binary is most likely incompatible with CentOS.
Can you please tell me if there is a binary available for CentOS?
Code:
public static void createAudioFile(String text, String fileName) {
    AudioPlayer audioPlayer = null;
    //System.setProperty("freetts.voices", "com.sun.speech.freetts.en.us.cmu_time_awb.AlanVoiceDirectory");
    System.setProperty("mbrola.base", Constants.mbrolaDiskPath);
    Voice voice;
    VoiceManager vm = VoiceManager.getInstance();
    voice = vm.getVoice("mbrola_us1");
    voice.allocate();
    try {
        String directoryPath = audioDir + fileName;
        audioPlayer = new SingleFileAudioPlayer(directoryPath, Type.WAVE);
        voice.setAudioPlayer(audioPlayer);
        voice.speak(text);
        voice.deallocate();
        audioPlayer.close();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
I found an mbrola binary for CentOS at the following location:
http://rpm.pbone.net/index.php3/stat/4/idpl/30430620/dir/centos_7/com/mbrola-301h-7.1.x86_64.rpm.html#content
Steps to follow:
1. Download the rpm (ftp.gwdg.de mbrola-301h-7.1.x86_64.rpm).
2. Run rpm -ivh mbrola-301h-7.1.x86_64.rpm. This will install the mbrola binary into /usr/bin.
3. Copy /usr/bin/mbrola to your preferred location and point mbrola.base to it: System.setProperty("mbrola.base", Constants.mbrolaDiskPath);
Done.
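Since the original error means FreeTTS could not start the external mbrola process, it can also help to confirm that the installed binary launches at all from Java, the same way FreeTTS will invoke it. Below is a minimal, hypothetical check (not from the original post); the path /usr/bin/mbrola comes from the rpm install step above, so adjust it if the binary was copied elsewhere (e.g. to Constants.mbrolaDiskPath).
import java.io.File;
import java.io.IOException;

// Sanity check: can the mbrola binary that FreeTTS will launch actually be executed here?
public class MbrolaCheck {
    public static void main(String[] args) throws InterruptedException {
        File mbrola = new File("/usr/bin/mbrola"); // adjust to your install/copy location
        System.out.println("exists=" + mbrola.exists() + " canExecute=" + mbrola.canExecute());
        try {
            // mbrola with no arguments should just print its usage text and exit.
            Process p = new ProcessBuilder(mbrola.getAbsolutePath())
                    .inheritIO()
                    .start();
            System.out.println("exit code: " + p.waitFor());
        } catch (IOException e) {
            // Typical failures: "No such file or directory", or an exec format
            // error when the binary was built for a different architecture.
            e.printStackTrace();
        }
    }
}
If this fails with an exec format error, the binary does not match the machine's architecture; if it runs and prints usage text, the problem is more likely the mbrola.base path or the voice database setup.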

How do you include multiple activities in the Gradle Liquibase plugin's runList field?

I’m using Gradle 2.7 on Mac Yosemite with Java 8. I’m using the Liquibase 1.1.1 plugin and would like to use it to run a couple of activities (build a test database and build my normal database). So I have:
liquibase {
    activities {
        main {
            File propsFile = new File("${project.rootDir}/src/main/resources/liquibase.properties")
            Properties properties = new Properties()
            properties.load(new FileInputStream(propsFile))
            changeLogFile 'src/main/resources/db.changelog-master.xml'
            url properties['url']
            username properties['username']
            password properties['password']
        }
        test {
            url 'jdbc:h2:file:target/testdb'
            username 'sa'
        }
        runList = (
            "test"
            "main"
        )
    }
}
But I can’t figure out the proper syntax for runList. I get the following error when running the above:
* Where:
Build file '/Users/myuser/Dropbox/cb_workspace/cbmyproject/build.gradle' line: 163
* What went wrong:
Could not compile build file '/Users/myuser/Dropbox/cb_workspace/cbmyproject/build.gradle'.
> startup failed:
build file '/Users/myuser/Dropbox/cb_workspace/cbmyproject/build.gradle': 163: expecting ')', found 'main' # line 163, column 2.
"main"
^
1 error
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
BUILD FAILED
According to one of their examples, the runList must be placed after the activities block:
liquibase {
    activities {
        main {
            File propsFile = new File("${project.rootDir}/src/main/resources/liquibase.properties")
            Properties properties = new Properties()
            properties.load(new FileInputStream(propsFile))
            changeLogFile 'src/main/resources/db.changelog-master.xml'
            url properties['url']
            username properties['username']
            password properties['password']
        }
        test {
            url 'jdbc:h2:file:target/testdb'
            username 'sa'
        }
    }
    runList = 'test, main'
}
See the example here.
Hope this helps.

Puppet - Error: Could not retrieve catalog; skipping run

When I try to connect the Puppet agent with puppet agent --test, I get this error:
Info: Retrieving plugin
Error: Could not retrieve catalog from remote server: Error 400 on SERVER :Could not find class <my_module> for <my_agent> on node <my_agent>
Warning: Not using cache on failed catalog
Error: Could not retrieve catalog; skipping run
I import nodes in sites.pp and include <my_module> in nodes.pp.
--edit--
Content of sites.pp:
import "nodes"
filebucket { main: server => "<my_master>" }
File { backup => main }
Exec { path => "/usr/bin:/usr/sbon:/bin:/sbin" }
Content of nodes.pp:
node "<my_agent>" {
    include <my_module>
}
--edit--
What is the real problem?
Thanks
I have created another VM, and it's working now! =)
Maybe I made a mistake in the network configuration.

IncludeDirectory generates the error "Invalid file name for file monitoring"

Code:
var styleBundle = new StyleBundle("~/Content/Common") { Orderer = new FileBundleOrderer(server.MapPath("/Content/bundle.txt")) }
    .IncludeDirectory("~/Content", "*.css", false);
Error message:
Invalid file name for file monitoring: '{ProjectPath}\Content'. Common reasons for failure include:
- The filename is not a valid Win32 file name.
- The filename is not an absolute path.
- The filename contains wildcard characters.
- The file specified is a directory.
- Access denied.
'{ProjectPath}\Content'
The Content directory exists in the project directory!
Package version:
Microsoft.AspNet.Web.Optimi... 1.1.0