How to set a starting point in a job? - pentaho

The transformation file, ending in .ktr, runs with no problem:
2018/07/10 15:40:54 - Spoon - Transformation opened.
2018/07/10 15:40:54 - Spoon - Launching transformation [FullCoverageTestCaseSalesMappingToReconciledSchema]...
2018/07/10 15:40:54 - Spoon - Started the transformation execution.
2018/07/10 15:40:54 - FullCoverageTestCaseSalesMappingToReconciledSchema - Dispatching started for transformation [FullCoverageTestCaseSalesMappingToReconciledSchema]
2018/07/10 15:40:54 - Reconciled.Weather.0 - Connected to database [Reconciled.Weather] (commit=1000)
2018/07/10 15:40:54 - Reconciled.Sale.0 - Connected to database [ReconciledSchema] (commit=1000)
2018/07/10 15:40:54 - Reconciled.IceCream.0 - Connected to database [ReconciledSchema] (commit=1000)
2018/07/10 15:40:54 - WeatherMapping.0 - Finished reading query, closing connection.
2018/07/10 15:40:54 - IceCreamMapping.0 - Finished reading query, closing connection.
2018/07/10 15:40:54 - WeatherMapping.0 - Finished processing (I=8, O=0, R=0, W=8, U=0, E=0)
2018/07/10 15:40:54 - IceCreamMapping.0 - Finished processing (I=7, O=0, R=0, W=7, U=0, E=0)
2018/07/10 15:40:54 - Split Weather Text.0 - Finished processing (I=0, O=0, R=8, W=8, U=0, E=0)
2018/07/10 15:40:54 - Reconciled.Mapping.0 - Finished reading query, closing connection.
2018/07/10 15:40:54 - Reconciled.Mapping.0 - Finished processing (I=6, O=0, R=0, W=6, U=0, E=0)
2018/07/10 15:40:54 - Reconciled.IceCream.0 - Finished processing (I=0, O=7, R=7, W=7, U=0, E=0)
2018/07/10 15:40:54 - Reconciled.Sale.0 - Finished processing (I=0, O=6, R=6, W=6, U=0, E=0)
2018/07/10 15:40:54 - Reconciled.Weather.0 - Finished processing (I=0, O=8, R=8, W=8, U=0, E=0)
2018/07/10 15:40:54 - Spoon - The transformation has finished!!
The job file, ending in .kjb, has only a reference to the above transformation file. It fails:
2018/07/10 15:34:35 - Spoon - Starting job...
2018/07/10 15:34:35 - Full Coverage Test Case - Start of job execution
2018/07/10 15:34:35 - Full Coverage Test Case - ERROR (version 7.0.0.0-25, build 1 from 2016-11-05 15.35.36 by buildguy) : A serious error occurred during job execution:
2018/07/10 15:34:35 - Full Coverage Test Case - Couldn't find starting point in this job.
2018/07/10 15:34:35 - Full Coverage Test Case - ERROR (version 7.0.0.0-25, build 1 from 2016-11-05 15.35.36 by buildguy) : org.pentaho.di.core.exception.KettleJobException:
2018/07/10 15:34:35 - Full Coverage Test Case - Couldn't find starting point in this job.
2018/07/10 15:34:35 - Full Coverage Test Case -
2018/07/10 15:34:35 - Full Coverage Test Case - at org.pentaho.di.job.Job.execute(Job.java:532)
2018/07/10 15:34:35 - Full Coverage Test Case - at org.pentaho.di.job.Job.run(Job.java:436)
2018/07/10 15:34:35 - Spoon - Job has ended.

I added a Start job entry and connected it to the transformation entry. That fixed it: a Kettle job refuses to run unless it contains a Start entry with a hop leading to the first real entry.
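The same requirement shows up if you build a job through the Kettle Java API. Below is a minimal, untested sketch (assuming the PDI 7 API; the class name is made up and the transformation filename is taken from the question): the job only finds its starting point because a START entry exists and is hopped to the transformation entry, which is exactly what Job.execute() (Job.java:532 in the stack trace) looks for.

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.ObjectLocationSpecificationMethod;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobHopMeta;
import org.pentaho.di.job.JobMeta;
import org.pentaho.di.job.entries.special.JobEntrySpecial;
import org.pentaho.di.job.entries.trans.JobEntryTrans;
import org.pentaho.di.job.entry.JobEntryCopy;

public class JobWithStartEntry {
  public static void main(String[] args) throws Exception {
    KettleEnvironment.init();
    JobMeta jobMeta = new JobMeta();
    jobMeta.setName("Full Coverage Test Case");

    // The START entry that Job.execute() searches for when the job begins
    JobEntryCopy start = new JobEntryCopy(new JobEntrySpecial("START", true, false));
    jobMeta.addJobEntry(start);

    // The entry that runs the transformation file
    JobEntryTrans entry = new JobEntryTrans("Run transformation");
    entry.setSpecificationMethod(ObjectLocationSpecificationMethod.FILENAME);
    entry.setFileName("FullCoverageTestCaseSalesMappingToReconciledSchema.ktr");
    JobEntryCopy trans = new JobEntryCopy(entry);
    jobMeta.addJobEntry(trans);

    // Without this hop from START, the job fails with "Couldn't find starting point"
    jobMeta.addJobHop(new JobHopMeta(start, trans));

    Job job = new Job(null, jobMeta);
    job.start();
    job.waitUntilFinished();
  }
}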

Related

Hangfire is stopping with a "caught stopping signal" log

I have configured Hangfire for my Web API solution, but Hangfire stops working after some time, logging a stopping signal. However, if the API is hit again, it loads and works normally. The log produced by Hangfire before stopping is below. What should I do to prevent this?
[137] INFO Hangfire.Server.BackgroundServerProcess - Server xyz:26756:c4da0558 caught stopping signal...
[ServerWatchdog #1] DEBUG Hangfire.Processing.BackgroundExecution - Execution loop ServerWatchdog:32e9252b stopped in 2.2406 ms
[ExpirationManager #1] DEBUG Hangfire.Processing.BackgroundExecution - Execution loop ExpirationManager:802eb676 stopped in 2.2681 ms
[Worker #3] DEBUG Hangfire.Processing.BackgroundExecution - Execution loop Worker:3d798436 stopped in 2.4496 ms
[CountersAggregator #1] DEBUG Hangfire.Processing.BackgroundExecution - Execution loop CountersAggregator:f6091560 stopped in 3.1411 ms
[Worker #2] DEBUG Hangfire.Processing.BackgroundExecution - Execution loop Worker:8b960c88 stopped in 3.7784 ms
[Worker #4] DEBUG Hangfire.Processing.BackgroundExecution - Execution loop Worker:8e0e3cc8 stopped in 4.2624 ms
[DelayedJobScheduler #1] DEBUG Hangfire.Processing.BackgroundExecution - Execution loop DelayedJobScheduler:066c3859 stopped in 4.5509 ms
[136] INFO Hangfire.Server.BackgroundServerProcess - Server xyz:26756:c4da0558 caught stopped signal...
[Worker #1] DEBUG Hangfire.Processing.BackgroundExecution - Execution loop Worker:dfcd3314 stopped in 5.7981 ms
[RecurringJobScheduler #1] DEBUG Hangfire.Processing.BackgroundExecution - Execution loop RecurringJobScheduler:2b8e4675 stopped in 6.1803 ms
[ServerJobCancellationWatcher #1] DEBUG Hangfire.Processing.BackgroundExecution - Execution loop ServerJobCancellationWatcher:554692cc stopped in 6.216 ms
[BackgroundServerProcess #1] INFO Hangfire.Server.BackgroundServerProcess - Server xyz:26756:c4da0558 All dispatchers stopped
[ServerHeartbeatProcess #1] DEBUG Hangfire.Processing.BackgroundExecution - Execution loop ServerHeartbeatProcess:f3cc4077 stopped in 8.3409 ms
[BackgroundServerProcess #1] DEBUG Hangfire.Server.BackgroundServerProcess - Server xyz:26756:c4da0558 is reporting itself as stopped...
[BackgroundServerProcess #1] INFO Hangfire.Server.BackgroundServerProcess - Server xyz:26756:c4da0558 successfully reported itself as stopped in 199.8624 ms
[BackgroundServerProcess #1] DEBUG Hangfire.Processing.BackgroundExecution - Execution loop BackgroundServerProcess:dd1eaf65 stopped in 214.6647 ms
[BackgroundServerProcess #1] INFO Hangfire.Server.BackgroundServerProcess - Server virtuadeskdev04:26756:c4da0558 has been stopped in total 212.7921 ms
This was because the application pool's idle timeout was set to 20 minutes. I changed the application pool's start mode to Always Running, and now Hangfire does not stop.
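For reference, the same change can be scripted with IIS's appcmd tool. This is a sketch under the assumption that the API runs in an IIS application pool on IIS 7.5 or later (the pool name "MyApiPool" is hypothetical):

%windir%\System32\inetsrv\appcmd.exe set apppool "MyApiPool" /startMode:AlwaysRunning
%windir%\System32\inetsrv\appcmd.exe set apppool "MyApiPool" /processModel.idleTimeout:00:00:00

Disabling the idle timeout (00:00:00) addresses the same root cause as Always Running: IIS tearing down the idle worker process, and the in-process Hangfire server with it.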

Pentaho Reporting Output component using PDI is giving an error

I'm trying to generate a PDF file using PDI by passing a parameter to a .prpt via the Pentaho Reporting Output component.
However, when I try to run the transformation, it gives the error below.
Please help me out with this.
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - ERROR (version 8.0.0.0-28, build 8.0.0.0-28 from 2017-11-05 07.27.50 by buildguy) : Unexpected error
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - ERROR (version 8.0.0.0-28, build 8.0.0.0-28 from 2017-11-05 07.27.50 by buildguy) : org.pentaho.di.core.exception.KettleException:
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - There was an unexpected error processing report 'C:\Users\ramcharan.gottipati\Desktop\Task' to produce file 'C:\Users\ramcharan.gottipati\Task.CSV' with processor: CSV.
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - Failed to open URL connection
2018/09/25 13:48:01 - Pentaho Reporting Output.0 -
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processReport(PentahoReportingOutput.java:317)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processRow(PentahoReportingOutput.java:126)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at java.lang.Thread.run(Unknown Source)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - Caused by: org.pentaho.reporting.libraries.resourceloader.ResourceLoadingException: Failed to open URL connection
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.resourceloader.loader.URLResourceData.getResourceAsStream(URLResourceData.java:153)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.resourceloader.loader.AbstractResourceData.getResource(AbstractResourceData.java:83)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.docbundle.bundleloader.ZipResourceBundleLoader.loadBundle(ZipResourceBundleLoader.java:71)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.resourceloader.DefaultResourceManagerBackend.loadResourceBundle(DefaultResourceManagerBackend.java:321)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.resourceloader.ResourceManager.loadResourceBundle(ResourceManager.java:248)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.resourceloader.ResourceManager.load(ResourceManager.java:264)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.resourceloader.ResourceManager.create(ResourceManager.java:362)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.resourceloader.ResourceManager.create(ResourceManager.java:334)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.resourceloader.ResourceManager.createDirectly(ResourceManager.java:200)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.loadMasterReport(PentahoReportingOutput.java:164)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processReport(PentahoReportingOutput.java:176)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - ... 3 more
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - Caused by: java.io.FileNotFoundException: C:\Users\ramcharan.gottipati\Desktop\Task (The system cannot find the file specified)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at java.io.FileInputStream.open0(Native Method)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at java.io.FileInputStream.open(Unknown Source)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at java.io.FileInputStream.<init>(Unknown Source)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at java.io.FileInputStream.<init>(Unknown Source)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at sun.net.www.protocol.file.FileURLConnection.connect(Unknown Source)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.resourceloader.loader.URLResourceData.getResourceAsStream(URLResourceData.java:147)
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - ... 13 more
2018/09/25 13:48:01 - Pentaho Reporting Output.0 - Finished processing (I=0, O=0, R=1, W=0, U=0, E=1)
2018/09/25 13:48:01 - tr_Pentaho_reporting_output_8.0 - ERROR (version 8.0.0.0-28, build 8.0.0.0-28 from 2017-11-05 07.27.50 by buildguy) : Errors detected!
2018/09/25 13:48:01 - Spoon - The transformation has finished!!
2018/09/25 13:48:01 - tr_Pentaho_reporting_output_8.0 - ERROR (version 8.0.0.0-28, build 8.0.0.0-28 from 2017-11-05 07.27.50 by buildguy) : Errors detected!
2018/09/25 13:48:02 - tr_Pentaho_reporting_output_8.0 - ERROR (version 8.0.0.0-28, build 8.0.0.0-28 from 2017-11-05 07.27.50 by buildguy) : Errors detected!
2018/09/25 13:48:02 - tr_Pentaho_reporting_output_8.0 - Transformation detected one or more steps with errors.
2018/09/25 13:48:02 - tr_Pentaho_reporting_output_8.0 - Transformation is killing the other steps!
2018/09/25 13:48:56 - Spoon - Save as...
2018/09/25 13:49:08 - Spoon - Using legacy execution engine
2018/09/25 13:49:08 - Spoon - Transformation opened.
2018/09/25 13:49:08 - Spoon - Launching transformation [tr_PRO]...
2018/09/25 13:49:08 - Spoon - Started the transformation execution.
2018/09/25 13:49:09 - Spoon - The transformation has finished!!
I am using PDI v8.0 and Pentaho Reporting v8.0. Kindly let me know if I need to add any plugins or make any modifications.
You have configured the step to open 'C:\Users\ramcharan.gottipati\Desktop\Task' as a file, while I assume you should provide a complete file name. The 'Report definition file' field should be something like
"C:\Users\ramcharan.gottipati\Desktop\Task.prpt"
and the 'Output file' field should be the path to the resulting PDF, like
"C:\Users\ramcharan.gottipati\Desktop\Task.pdf"
What happens during execution? PDI feeds its input stream of data to the Pentaho report processor. The report (.prpt) must already be configured to use input fields whose names match the field names in the incoming Kettle stream; names and data types must match, or the report may fail to process.
If all of that is in order, using this step is equivalent to running the Pentaho Reporting engine on Kettle input. Anyone can run the Pentaho Reporting engine even without Kettle at all.
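To illustrate that last point, here is a minimal standalone sketch of driving the reporting engine directly from Java. The file paths are taken from the question; the classes are the reporting engine's standard embedding API, but treat this as an untested outline rather than a drop-in program:

import java.io.File;
import java.io.FileOutputStream;
import org.pentaho.reporting.engine.classic.core.ClassicEngineBoot;
import org.pentaho.reporting.engine.classic.core.MasterReport;
import org.pentaho.reporting.engine.classic.core.modules.output.pageable.pdf.PdfReportUtil;
import org.pentaho.reporting.libraries.resourceloader.ResourceManager;

public class RunReport {
  public static void main(String[] args) throws Exception {
    ClassicEngineBoot.getInstance().start();   // boot the reporting engine once per JVM

    // Load the report definition: the full .prpt file, not a folder
    ResourceManager manager = new ResourceManager();
    manager.registerDefaults();
    File prpt = new File("C:/Users/ramcharan.gottipati/Desktop/Task.prpt");
    MasterReport report =
        (MasterReport) manager.createDirectly(prpt, MasterReport.class).getResource();

    // Render the report to the output file
    try (FileOutputStream out =
        new FileOutputStream("C:/Users/ramcharan.gottipati/Desktop/Task.pdf")) {
      PdfReportUtil.createPDF(report, out);
    }
  }
}

Note that the FileNotFoundException in the log points at 'C:\Users\ramcharan.gottipati\Desktop\Task' with no extension, which is consistent with a folder or incomplete name being passed where the .prpt path belongs.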
Hope it helps.

Does the Pentaho Reporting Output component of PDI work without parameters?

[screenshot of the transformation]
Below is my requirement, with an example.
1. The base report gets its data from this query.
Base report query: select * from Animals;
I used the above query in the .prpt for the base report.
2. I want to generate 2 separate reports from the above data, using the same .prpt as my report design. I don't want to create the same .prpt again for Report_1_query and Report_2_query as mentioned below.
Report_1_query: select * from Animals where AnimalType="Dog";
Report_2_query: select * from Animals where AnimalType="Cat";
I tried this using the Pentaho Reporting Output component with parameters, and it works fine. But I don't want to use parameters.
3. Now, as you can see in the screenshot, I have taken all the data in a Table Input step, applied a Java Filter to select particular records, and passed those records to the Pentaho Reporting Output component. As I understand it, this will not work. If it is not possible with this approach, please suggest any other way to achieve this.
The transformation screenshot is attached for reference. I get the error below while running the attached transformation.
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - ERROR (version 8.0.0.0-28, build 8.0.0.0-28 from 2017-11-05 07.27.50 by buildguy) : Unexpected error
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - ERROR (version 8.0.0.0-28, build 8.0.0.0-28 from 2017-11-05 07.27.50 by buildguy) : org.pentaho.di.core.exception.KettleException:
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - There was an unexpected error processing report '' to produce file 'B' with processor: PDF.
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - Unable to parse the document: ResourceKey{schema=org.pentaho.reporting.libraries.resourceloader.loader.URLResourceLoader, identifier=file:/C:/Pentaho/design-tools/data-integration, factoryParameters={}, parent=null}
2018/04/20 15:29:12 - Pentaho Reporting Output.0 -
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processReport(PentahoReportingOutput.java:317)
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processRow(PentahoReportingOutput.java:126)
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - at java.lang.Thread.run(Unknown Source)
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - Caused by: org.pentaho.reporting.libraries.resourceloader.ResourceCreationException: Unable to parse the document: ResourceKey{schema=org.pentaho.reporting.libraries.resourceloader.loader.URLResourceLoader, identifier=file:/C:/Pentaho/design-tools/data-integration, factoryParameters={}, parent=null}
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.xmlns.parser.AbstractXmlResourceFactory.create(AbstractXmlResourceFactory.java:214)
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.resourceloader.DefaultResourceManagerBackend.create(DefaultResourceManagerBackend.java:225)
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.resourceloader.ResourceManager.create(ResourceManager.java:382)
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.resourceloader.ResourceManager.create(ResourceManager.java:334)
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.resourceloader.ResourceManager.createDirectly(ResourceManager.java:200)
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.loadMasterReport(PentahoReportingOutput.java:164)
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processReport(PentahoReportingOutput.java:176)
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - ... 3 more
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - Caused by: org.xml.sax.SAXParseException; systemId: file:/C:/Pentaho/design-tools/data-integration; lineNumber: 1; columnNumber: 1; Content is not allowed in prolog.
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source)
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - at org.pentaho.reporting.libraries.xmlns.parser.AbstractXmlResourceFactory.create(AbstractXmlResourceFactory.java:205)
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - ... 9 more
2018/04/20 15:29:12 - Pentaho Reporting Output.0 - Finished processing (I=0, O=0, R=2, W=1, U=0, E=1)
2018/04/20 15:29:12 - TradeBlotterAnother - ERROR (version 8.0.0.0-28, build 8.0.0.0-28 from 2017-11-05 07.27.50 by buildguy) : Errors detected!
2018/04/20 15:29:12 - Spoon - The transformation has finished!!
2018/04/20 15:29:12 - TradeBlotterAnother - ERROR (version 8.0.0.0-28, build 8.0.0.0-28 from 2017-11-05 07.27.50 by buildguy) : Errors detected!
2018/04/20 15:29:12 - TradeBlotterAnother - ERROR (version 8.0.0.0-28, build 8.0.0.0-28 from 2017-11-05 07.27.50 by buildguy) : Errors detected!
2018/04/20 15:29:12 - TradeBlotterAnother - Transformation detected one or more steps with errors.
2018/04/20 15:29:12 - TradeBlotterAnother - Transformation is killing the other steps!

Pentaho Data Integration dynamic connection (read connection from database)

Pentaho Data Integration: CE 6.1.0.1-196
I am a newbie in Pentaho Data Integration.
I need to run the same query in multiple databases.
I created a table in the master database to store the connection information from other databases that need to be consulted.
Below the table structure.
SQL> desc database_connection;
Name          Type          Nullable Default Comments
------------- ------------- -------- ------- --------
DATABASE_NAME VARCHAR2(32)  Y
JDBC_URL      VARCHAR2(512) Y
USERNAME      VARCHAR2(32)  Y
PASSWORD      VARCHAR2(32)  Y
ENABLED       VARCHAR2(1)   Y
Sample Data
DATABASE_NAME: XPTO
JDBC_URL: (DESCRIPTION = (ADDRESS = (PROTOCOL = TCP)(HOST = xptosrv.xyz.com)(PORT = 1521))(LOAD_BALANCE = ON)(FAILOVER = ON)(CONNECT_DATA = (SERVER = DEDICATED)(SERVICE_NAME = XPTO.XYZ.COM)(FAILOVER_MODE = (TYPE = SELECT)(METHOD = BASIC)(RETRIES = 180)(DELAY = 5))))
USERNAME: SYSTEM
PASSWORD: blablabla
ENABLED: Y
My .ktr files:
(set_variables.ktr)
Table Input ---> Copy rows to result
The query associated with the Table Input runs against the master database:
select database_name, jdbc_url, username, password from database_connection where enabled = 'Y'
(db_query.ktr)
Table Input ---> Table output
The query associated with the Table Input runs against the (multiple) remote databases, and the data is stored via the Table Output in the master database.
My .kjb files:
(run_for_each_row.kjb)
Start ---> Transformation ---> Success
Transformation filename: ${Internal.Job.Filename.Directory}/db_query.ktr
Job Properties Parameters:
DATABASE_NAME
JDBC_URL
PASSWORD
USERNAME
(master_job.kjb)
Start ---> Transformation ---> Job for each row ---> Success
Transformation filename: ${Internal.Job.Filename.Directory}/set_variables.ktr
Job for each row filename: ${Internal.Job.Filename.Directory}/run_for_each_row.kjb
Job for each row ... Advanced tab
Copy previous results to parameters -> checked
Execute for every input row -> checked
Job for each row ... Parameters: DATABASE_NAME, JDBC_URL, PASSWORD, USERNAME
Execution log:
2016/10/06 10:36:15 - Spoon - Starting job...
2016/10/06 10:36:15 - master_job - Start of job execution
2016/10/06 10:36:15 - master_job - Starting entry [Transformation]
2016/10/06 10:36:15 - Transformation - Loading transformation from XML file [file:///D:/pdi/set_variables.ktr]
2016/10/06 10:36:15 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/10/06 10:36:15 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/10/06 10:36:15 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/10/06 10:36:15 - set_variables - Dispatching started for transformation [set_variables]
2016/10/06 10:36:15 - Table input.0 - Finished reading query, closing connection.
2016/10/06 10:36:15 - Copy rows to result.0 - Finished processing (I=0, O=0, R=6, W=6, U=0, E=0)
2016/10/06 10:36:15 - Table input.0 - Finished processing (I=6, O=0, R=0, W=6, U=0, E=0)
2016/10/06 10:36:15 - master_job - Starting entry [Job for each row]
2016/10/06 10:36:15 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/10/06 10:36:15 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/10/06 10:36:15 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/10/06 10:36:15 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/10/06 10:36:15 - slave_job - Starting entry [Transformation]
2016/10/06 10:36:15 - Transformation - Loading transformation from XML file [file:///D:/pdi/db_query.ktr]
2016/10/06 10:36:15 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/10/06 10:36:15 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/10/06 10:36:15 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016/10/06 10:36:15 - db_query - Dispatching started for transformation [db_query]
2016/10/06 10:36:15 - Table input.0 - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : An error occurred, processing will be stopped:
2016/10/06 10:36:15 - Table input.0 - Error occurred while trying to connect to the database
2016/10/06 10:36:15 - Table input.0 -
2016/10/06 10:36:15 - Table input.0 - Error connecting to database: (using class oracle.jdbc.driver.OracleDriver)
2016/10/06 10:36:15 - Table input.0 - IO Error: Connect identifier was empty.
2016/10/06 10:36:15 - Table input.0 - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Error initializing step [Table input]
2016/10/06 10:36:15 - Table output.0 - Connected to database [REPORT] (commit=1000)
2016/10/06 10:36:15 - db_query - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Step [Table input.0] failed to initialize!
2016/10/06 10:36:15 - Table input.0 - Finished reading query, closing connection.
2016/10/06 10:36:15 - Transformation - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : Unable to prepare for execution of the transformation
2016/10/06 10:36:15 - Transformation - ERROR (version 6.1.0.1-196, build 1 from 2016-04-07 12.08.49 by buildguy) : org.pentaho.di.core.exception.KettleException:
2016/10/06 10:36:15 - Transformation - We failed to initialize at least one step. Execution can not begin!
2016/10/06 10:36:15 - Transformation -
2016/10/06 10:36:15 - Transformation -
2016/10/06 10:36:15 - Transformation - at org.pentaho.di.trans.Trans.prepareExecution(Trans.java:1142)
2016/10/06 10:36:15 - Transformation - at org.pentaho.di.trans.Trans.execute(Trans.java:612)
2016/10/06 10:36:15 - Transformation - at org.pentaho.di.job.entries.trans.JobEntryTrans.execute(JobEntryTrans.java:1097)
2016/10/06 10:36:15 - Transformation - at org.pentaho.di.job.Job.execute(Job.java:723)
2016/10/06 10:36:15 - Transformation - at org.pentaho.di.job.Job.execute(Job.java:864)
2016/10/06 10:36:15 - Transformation - at org.pentaho.di.job.Job.execute(Job.java:608)
2016/10/06 10:36:15 - Transformation - at org.pentaho.di.job.entries.job.JobEntryJobRunner.run(JobEntryJobRunner.java:69)
2016/10/06 10:36:15 - Transformation - at java.lang.Thread.run(Thread.java:745)
2016/10/06 10:36:15 - slave_job - Finished job entry [Transformation] (result=[false])
2016/10/06 10:36:15 - master_job - Finished job entry [Job for each row] (result=[false])
2016/10/06 10:36:15 - master_job - Finished job entry [Transformation] (result=[false])
2016/10/06 10:36:15 - master_job - Job execution finished
2016/10/06 10:36:15 - Spoon - The job has finished.
The data from the database_connection table is being read:
2016/10/06 10:36:15 - set_variables - Dispatching started for transformation [set_variables]
2016/10/06 10:36:15 - Table input.0 - Finished reading query, closing connection.
2016/10/06 10:36:15 - Copy rows to result.0 - Finished processing (I=0, O=0, R=6, W=6, U=0, E=0)
2016/10/06 10:36:15 - Table input.0 - Finished processing (I=6, O=0, R=0, W=6, U=0, E=0)
But I do not know what I am doing wrong that prevents these values from being passed as parameters.
I appreciate any help, because I have been stuck on this problem for a few days now.
The examples I found here on Stack Overflow and on the Pentaho forum did not help me much.
Project files: https://github.com/scarlosantos/pdi
Thank you.
This exact use case is nicely explained in the FAQ Beginner Section.
To make it short:
0) Check you have all the drivers.
1) Do not forget to declare the names of these variables (right-click anywhere, Properties, Parameters) on the transformations and on the job, and make sure they are defined at job scope level.
2) IMPORTANT: go to the View pane (on the left; you are most probably on Design) and share the connection, so that PDI knows your connection in any transformation/job.
3) Edit the connection, and in the Host Name, Database Name, ... boxes write ${HOST}, ${DATABASE_NAME}, ... or whatever names you gave the variables. If you did step (1), just press Ctrl-Space and pick from the drop-down menu.
4) Edit the file C:\Users\yourname\.kettle\shared.xml with a text editor. It helps to keep a copy of the last working version. And, if you are brave enough, you can even generate this file with PDI.
You are also raising an interesting question: you seem to connect with the JDBC URL, which you can do in PDI (with the Generic Database Connection); however, with that method PDI does not know which SQL dialect you are using. So if you hit some odd error along the flow, make sure you SELECT *, do NOT use lazy conversion, and check the types with Right-click / Output Fields.
Use a Set Variables step instead of Copy rows to result in your set_variables.ktr, and use those variables in your connection properties; PDI will substitute them at run time and you will have a dynamic DB connection.
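For illustration, the shared connection in ~/.kettle/shared.xml would end up looking roughly like the fragment below. This is a sketch, not taken from the linked project: the element names follow a typical Kettle shared.xml layout for a Generic database connection, so verify against a connection you have actually shared from Spoon before hand-editing.

<connection>
  <name>dynamic_connection</name>
  <type>GENERIC</type>
  <access>Native</access>
  <username>${USERNAME}</username>
  <password>${PASSWORD}</password>
  <attributes>
    <attribute><code>CUSTOM_DRIVER_CLASS</code><attribute>oracle.jdbc.driver.OracleDriver</attribute></attribute>
    <attribute><code>CUSTOM_URL</code><attribute>jdbc:oracle:thin:@${JDBC_URL}</attribute></attribute>
  </attributes>
</connection>

Since ${JDBC_URL} holds the full TNS descriptor from the database_connection table, the thin-driver URL simply wraps it as jdbc:oracle:thin:@<descriptor>.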

Error while executing a Pentaho transformation (talking to SQL Server) through a Java program

I have a Kettle (5.4.0.1) transformation which executes a query against MS SQL Server 2008 R2 in a Table Input step.
The transformation executes successfully in the Spoon UI, directly or through a job.
I want to execute the same transformation through a Java program, for which I have this code:
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

KettleEnvironment.init();                  // initialize the Kettle environment (plugins, JNDI, ...)
TransMeta metaData = new TransMeta("first_transformation.ktr");
Trans trans = new Trans(metaData);
trans.execute(null);                       // run with no arguments
trans.waitUntilFinished();
But on executing it in a Java program I get this error:
2015/07/30 20:08:34 - TestTransformation - Dispatching started for transformation [TestTransformation]
2015/07/30 20:08:34 - XML Output.0 - Opening output stream in encoding: UTF-8
2015/07/30 20:08:34 - Table input.0 - ERROR (version 5.4.0.1-130, build 1 from 2015-06-14_12-34-55 by buildguy) : An error occurred, processing will be stopped:
2015/07/30 20:08:34 - Table input.0 - Error occurred while trying to connect to the database
2015/07/30 20:08:34 - Table input.0 - java.io.File parameter must be a directory. [C:\Root\EclipseWorkSpace\TestProject\simple-jndi]
2015/07/30 20:08:34 - Table input.0 - Finished reading query, closing connection.
2015/07/30 20:08:34 - Table input.0 - ERROR (version 5.4.0.1-130, build 1 from 2015-06-14_12-34-55 by buildguy) : Error initializing step [Table input]
2015/07/30 20:08:34 - TestTransformation - ERROR (version 5.4.0.1-130, build 1 from 2015-06-14_12-34-55 by buildguy) : Step [Table input.0] failed to initialize!
org.pentaho.di.core.exception.KettleException: We failed to initialize at least one step. Execution can not begin!
    at org.pentaho.di.trans.Trans.prepareExecution(Trans.java:1149)
    at org.pentaho.di.trans.Trans.execute(Trans.java:607)
    at TestKettle.main(TestKettle.java:24)
What could be the issue here, given that the database connection succeeds when executing through the UI, and it is the same KTR file I am trying to execute in the above code?
It seems that the Table Input step in the .ktr is not able to read the data.
Firstly, if you are calling a .ktr file from Java code, make sure the database driver jar files are properly imported/built into the Java project; that is, import the SQL Server JDBC jar file into the Java project.
You may also try reading this blog. I have used Maven to handle the jar dependencies (my DB was PostgreSQL), and it works :)
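If you go the Maven route, the dependencies would look something like the fragment below. Treat it as a sketch: the version shown simply mirrors the 5.4.0.1-130 build in the log, and you need the Pentaho public Maven repository configured in your POM for the pentaho-kettle artifacts to resolve.

<dependency>
  <groupId>pentaho-kettle</groupId>
  <artifactId>kettle-core</artifactId>
  <version>5.4.0.1-130</version>
</dependency>
<dependency>
  <groupId>pentaho-kettle</groupId>
  <artifactId>kettle-engine</artifactId>
  <version>5.4.0.1-130</version>
</dependency>
<!-- plus your JDBC driver, e.g. Microsoft's sqljdbc jar for SQL Server -->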
Hope this helps :)
For anyone else facing this issue: the problem was that the simple-jndi folder from the Pentaho installation directory had to be placed in the project folder. This is not obvious from the error message above, although the path is mentioned in the error:
2015/07/30 20:08:34 - Table input.0 - java.io.File parameter must be a directory. [C:\Root\EclipseWorkSpace\TestProject\simple-jndi]
After this, I got two other errors that were clearer, about files missing from the project folder: ESAPI.properties and validation.properties. I downloaded them from the link given here. Then the program ran successfully.
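As an alternative to copying the folder, Kettle's simple-jndi bootstrap can usually be pointed at the existing folder inside the PDI installation. This is a sketch under the assumption that Kettle honours the simple-jndi root system property when it is set before initialization (the path below is hypothetical; adjust it to your installation):

// Point Kettle's simple-jndi at the PDI installation instead of the working directory.
// Assumption: this property is read during KettleEnvironment.init().
System.setProperty("org.osjava.sj.root", "C:/Pentaho/data-integration/simple-jndi");
KettleEnvironment.init();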