Pentaho - Carte Download a file

I'm trying to use the Pentaho Carte service to execute an ETL and return the generated file. How can I do this? Does anyone know whether this is possible, or whether there is another service that can do it?
Thanks!!!
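For reference, Carte exposes HTTP endpoints for this kind of thing. A minimal sketch of executing a transformation over HTTP, assuming a Carte instance on localhost:8081 with the default cluster/cluster credentials (host, credentials and paths are illustrative):

# Run a transformation that exists on the Carte host's filesystem
curl -u cluster:cluster "http://localhost:8081/kettle/executeTrans/?trans=/path/on/server/my_etl.ktr"

# Poll the status and log of that transformation afterwards
curl -u cluster:cluster "http://localhost:8081/kettle/transStatus/?name=my_etl&xml=Y"

Note that this starts the ETL but does not itself return the generated file; if I remember correctly, output steps such as Text file output have a "Pass output to servlet" option that, combined with the runTrans endpoint, streams the step's output back in the HTTP response, which may be closer to what you are after.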

Related

Log Analytics configuration via az cli / PowerShell?

Is there a way to programmatically (az cli, PowerShell) retrieve the Log Analytics workspace configuration and its Event Log settings?
For anyone ever needing to achieve the above: refer to Get-AzOperationalInsightsWorkspace and Get-AzOperationalInsightsDataSource. I wrote a simple PowerShell script that outputs the Log Analytics workspace plus the Event Log settings in tabular format.
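For the az cli half of the question, a minimal sketch of the workspace part (resource group and workspace names are placeholders; the Event Log data-source settings are what the PowerShell cmdlets above cover):

# List Log Analytics workspaces in the current subscription, in tabular form
az monitor log-analytics workspace list --query "[].{Name:name, ResourceGroup:resourceGroup}" -o table

# Show the full configuration of one workspace
az monitor log-analytics workspace show --resource-group my-rg --workspace-name my-workspace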

How do I pass variables to a FastLoad script using a wrapper shell script?

I'm working on Teradata FastLoad scripts and I'm using the LOGON command to establish a connection with the database:
LOGON DBC_ip/username,password;
But for security purposes I would like to get the password from a vault-like application using a shell script.
Initially I was trying to create a wrapper shell script that would get the password from the vault and use it in the FastLoad script:
wrapper_shell_script --> fetch_password($password) --> execute FastLoad script using $password
Example: LOGON DBC_ip/username,$password;
My question:
Is it possible to use external variables in FastLoad scripts? If yes, can it be done using this process?
Could anyone please tell me whether this is possible, or whether there is a better way to implement it?
Let me know if you need more details.
Thank you in advance!!
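For what it's worth, the wrapper approach sketched in the question can work, because the shell expands variables into a here-document before fastload parses the script. A minimal sketch, with vault-cli standing in for whatever vault client is actually used (hypothetical command and secret path):

#!/bin/sh
# Fetch the password from the vault; vault-cli and the secret path are placeholders
PASSWORD=$(vault-cli get secret/teradata/password)

# Feed the FastLoad script to fastload as a here-document; the shell
# substitutes $PASSWORD before fastload ever sees the script
fastload <<EOF
LOGON DBC_ip/username,$PASSWORD;
/* ... rest of the FastLoad script ... */
LOGOFF;
EOF

One detail that matters: the here-document delimiter must be unquoted (EOF, not 'EOF'), otherwise the shell will not perform the $PASSWORD substitution.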

simple-jndi/jdbc.properties is ignored in pentaho-server 8.2

If we need to perform queries in Pentaho Data Integration (the IDE), we need to manually add the datasource in simple-jndi/jdbc.properties:
MyDatabase/type=javax.sql.DataSource
MyDatabase/driver=org.hsqldb.jdbcDriver
MyDatabase/url=jdbc:hsqldb:hsql://localhost/sampledata
MyDatabase/user=pentaho_admin
MyDatabase/password=password
This works as expected in the IDE, known as Pentaho Data Integration, Spoon, or Kettle.
But the same thing does not work in Pentaho Server 8.2.
Steps to reproduce the error:
Deploy or upload the transformation (.ktr) to pentaho-server 8.2.
Manually add the datasource on the server in /../pentaho-server/pentaho-solutions/system/simple-jndi/jdbc.properties.
Execute the transformation using the Pentaho Server web console (run in background or schedule options).
Error: datasource not found.
Alternative
Create the datasource manually using the web console of Pentaho Server instead of manually modifying the file /../pentaho-server/pentaho-solutions/system/simple-jndi/jdbc.properties.
Question
Does simple-jndi/jdbc.properties work on the server, or is it just for development purposes (PDI)?
Are the settings available in the jdbc.properties file on the server? You can verify this by referring to the pentaho data integrator > simple-jndi folder.
Also, are you able to connect to the database from the server, perhaps using a database client, to confirm?
NOTE: whenever you work with databases, make sure you have the relevant library files (drivers) to connect to the respective database.
From my personal experience, I was not able to make the server pick up JNDI connection definitions from the simple-jndi/jdbc.properties file when the Pentaho Server was running on Tomcat.
I was only able to use JNDI on the Pentaho Server by defining JNDI datasources in the Tomcat configuration files.
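For completeness, such a Tomcat-level definition is usually a <Resource> element in the webapp's context.xml. A sketch reusing the sampledata values from the question (attribute values, pool sizes and the exact file to edit depend on your installation):

<Resource name="jdbc/MyDatabase" auth="Container" type="javax.sql.DataSource"
          factory="org.apache.tomcat.jdbc.pool.DataSourceFactory"
          driverClassName="org.hsqldb.jdbcDriver"
          url="jdbc:hsqldb:hsql://localhost/sampledata"
          username="pentaho_admin" password="password"
          maxActive="20" maxIdle="5" maxWait="10000"/>

As noted in the other answer, the matching JDBC driver jar also has to be on the server's classpath (typically tomcat/lib) for the datasource to resolve.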

Pentaho report which contains more than 7 connections is not working in Pentaho DI server

I am new to Pentaho. I recently created a job which generates a report. I am using JNDI connections in the report (.prpt). If I use more than 7 connections inside a report, it fails to generate the report, but it works smoothly with reports that have fewer connections.
Note: If I execute it using Spoon, it works fine even when the report has more JNDI connections, but my requirement is to execute the job using the Pentaho Kettle API, like this:
https://address/pentaho-di/kettle/executeJob/?job=/home/pentaho/Test/main.kjb&level=Rowlevel
Seeking help from Pentaho experts.
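For reference, that Kettle API call can be exercised from the command line like this (credentials and host are placeholders):

# Run the job through the DI server's Kettle API
curl -u admin:password "https://address/pentaho-di/kettle/executeJob/?job=/home/pentaho/Test/main.kjb&level=Rowlevel"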
It was actually due to an issue with the datasource factory.
Steps to do:
Stop your Pentaho DI Server if it is already running.
Navigate to server\data-integration-server\tomcat\webapps\pentaho\META-INF and edit the context.xml file.
In your jdbc/mart JNDI connection, change the factory class from
factory="org.apache.commons.dbcp.BasicDataSourceFactory"
to
factory="org.apache.tomcat.jdbc.pool.DataSourceFactory"
Save and close the file.
Clear the tomcat\work and tomcat\temp directories.
Restart your PDI server.
Execute the transformation.
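On Linux, the stop/clear/restart part of those steps might look like this (a sketch assuming the stock start/stop scripts shipped in the server directory):

cd /path/to/data-integration-server    # placeholder path
./stop-pentaho.sh                      # stop the DI server
rm -rf tomcat/work/* tomcat/temp/*     # clear Tomcat's work and temp directories
./start-pentaho.sh                     # restart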

Run a ktr remotely on Pentaho BI server

I'm using Pentaho BI Server 7 EE.
I have uploaded a simple transformation (.ktr) and run it successfully through PUC.
But when I try to call it through the REST service (http://localhost:8080/pentaho/kettle/runTrans?trans=%3Apublic%3Ademo%3AsimplePdiJob.ktr), it fails with the following response:
<webresult>
<result>ERROR</result>
<message>!RunTransServlet.Error.UnexpectedError!</message>
<id/>
</webresult>
I didn't find any exception in the log.
What could be the cause of this problem?
Thanks
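One thing worth ruling out is authentication: in PUC you are already logged in, whereas a bare REST call has to carry credentials itself. A sketch of the same call with basic auth (credentials are placeholders):

# Same call from the command line, with explicit credentials
curl -u admin:password "http://localhost:8080/pentaho/kettle/runTrans?trans=%3Apublic%3Ademo%3AsimplePdiJob.ktr"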
Hi, I do not know if you still have this problem, but I managed to solve it recently by creating an xaction and inserting into its XML code a call to a PHP script; that script invoked my .ktr file and uploaded to the server the CSV file with which I would perform my ETL. I know that is not the best solution, but in my case there was no way, even in version 7 of the BI server, to upload CSV files or files with any other extension.