I have set up Pentaho Data Integration 7.0 on an AWS server. But when I try to connect to a repository in order to create a new one, PDI 7.0 doesn't even show the option to connect to a repository.
How can I get this option working?
I am working with Pentaho products: Pentaho Data Integration (PDI) and Pentaho Server.
From PDI, I can connect to the Pentaho Repository running on a Pentaho Server. I have already created some jobs and transformations, stored them in the repository, and they all execute correctly.
However, when I configure the input/output sources of the jobs/transformations, I can only use data files from my local machine. Is there any way to store data files in the repository and configure the jobs/transformations to read data from them?
Thank you so much.
P.S.: I am working with PDI v9.3 CE and Pentaho Server v9.3 CE.
This is one solution I found.
For each transformation/job stored in the Pentaho Repository, there are three run configurations: Pentaho local, Pentaho server, and Pentaho slave.
Pentaho local: executes the transformation/job on the machine being used. That means it can only read and write data in local storage.
Pentaho server: executes the transformation/job on the machine hosting the Pentaho Server (and therefore the Pentaho Repository). With this run configuration, the transformation/job can read and write data in the server's storage.
Pentaho slave: executes the transformation/job using a slave server's resources.
So, in my case, the solution is to switch the run configuration of the transformation/job to the Pentaho server configuration.
I made an API in Integration Studio on my local computer. Now I want to deploy the API to a WSO2 Micro Integrator (MI) on a remote server. I added a server, chose "WSO2 Remote Server", and set the other configurations, but when the server is started and the CompositeExporter is synced, nothing happens on the MI and the API is not there.
I do not know what is wrong.
Can anyone help me?
Thanks
Please refer to the documentation at https://ei.docs.wso2.com/en/7.2.0/micro-integrator/develop/using-remote-micro-integrator/ to add a remote EI or MI server to Integration Studio. Once you have added the server, follow the steps below.
Open the Servers view (Window > Show View > Servers).
Right-click the added server and select the Add and Remove... option.
Select the Composite Exporter module to be deployed.
My project uses published Tableau data sources.
These data sources were created using Tableau Desktop.
All of them connect to a Hive database using the native Hortonworks Hadoop Hive connector.
We have a database user and a Tableau user with publish rights.
Database credentials are embedded in the extract, which is then published to Tableau Server.
The reports fetch data from these published data sources.
The Hive database is now being Kerberized and switched to SSL.
Will my existing published data sources still be usable?
Do I have to re-create all the extracts and publish them to Tableau Server again?
What would be the best plan to migrate all these data sources to the new Kerberized environment?
Regards
Please see the link below from the Tableau community forum; the versions may differ, but people there were able to solve the Kerberos Hive connectivity issue.
https://community.tableau.com/thread/149383
I am new to Apache Ignite. I created an Ignite cluster and connected my Node.js thin client to it. It is working fine, but it only creates the caches specified in the JS file. Now I want to sync my SQL Server data with Ignite. Any idea how to do that?
I tried to connect with GridGain, but it does not allow me to create a free cluster.
Please refer to the 3rd Party Persistence documentation regarding RDBMS integration.
GridGain Web Console can help you set up the database integration by generating a Maven project corresponding to your RDBMS data model.
GridGain Community Edition is free to use as long as you deploy it yourself, but this functionality is also supported by stock Apache Ignite.
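As a rough sketch of what such an integration looks like with the cache-store approach, a cache can be configured with Ignite's `CacheJdbcPojoStoreFactory` so that it reads through to, and writes through to, a SQL Server table. The data source bean, schema, table, and type names below are hypothetical placeholders, not something from your setup:

```xml
<!-- Sketch of a CacheConfiguration backed by a SQL Server table
     via CacheJdbcPojoStoreFactory; all names are placeholders. -->
<bean class="org.apache.ignite.configuration.CacheConfiguration">
    <property name="name" value="PersonCache"/>
    <!-- Enable load-through / write-through against the RDBMS -->
    <property name="readThrough" value="true"/>
    <property name="writeThrough" value="true"/>
    <property name="cacheStoreFactory">
        <bean class="org.apache.ignite.cache.store.jdbc.CacheJdbcPojoStoreFactory">
            <!-- Name of a DataSource bean pointing at SQL Server -->
            <property name="dataSourceBean" value="sqlServerDataSource"/>
            <property name="types">
                <list>
                    <bean class="org.apache.ignite.cache.store.jdbc.JdbcType">
                        <property name="cacheName" value="PersonCache"/>
                        <property name="databaseSchema" value="dbo"/>
                        <property name="databaseTable" value="Person"/>
                        <property name="keyType" value="java.lang.Integer"/>
                        <property name="valueType" value="com.example.Person"/>
                        <!-- keyFields / valueFields mappings (one per column)
                             are also generated by the Web Console -->
                    </bean>
                </list>
            </property>
        </bean>
    </property>
</bean>
```

With a configuration like this, the store runs on the server nodes: `IgniteCache#loadCache` preloads data from the table, updates made through the cache are written back to SQL Server, and your Node.js thin client can then simply read the cache.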
I have developed a few SSIS packages on my local system and deployed them to a remote SQL Server. Now my local hard disk has crashed. Is there a way to get back the packages that were deployed to SQL Server?
Thanks in advance
If you have access to the server itself, you (or somebody with the appropriate permissions) can log in to the server running SQL Server Integration Services. From there, open SSMS, connect to Integration Services, and export any package that has been deployed to that server.
Do something like this:
1) Start an Integration Services project.
2) Right-click and choose Add Existing Package.
3) Select the SSIS Package Store.
4) Type the name of the remote server.
5) You can then import any package from the remote server to your local machine.
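If the packages were deployed to MSDB (the legacy package deployment model), the `dtutil` command-line utility that ships with SQL Server can also copy a deployed package back to a local .dtsx file. This is only a sketch; the package name, server name, and output path below are placeholders, and the command must run on a machine that can reach the remote instance:

```shell
dtutil /SQL "MyPackage" /SourceServer "RemoteSqlServer" /COPY FILE;"C:\Recovered\MyPackage.dtsx"
```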
If you are using the Project Deployment Model...
Open Visual Studio.
Create a new project, but do not click the Integration Services Project template shown in the default Business Intelligence section. Instead, expand Business Intelligence, click Integration Services, and use the Integration Services Import Project Wizard.
Point the wizard at your SSISDB (or an existing .ispac file) and it will import the project, its parameters, and the project-level connection managers.
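To the best of my knowledge, the project binary can also be pulled straight out of SSISDB with T-SQL via the `catalog.get_project` stored procedure; the returned varbinary stream is the .ispac content and can be saved to a file with a .ispac extension. The folder and project names below are placeholders:

```sql
-- Returns the project as a binary stream (the .ispac content)
EXEC SSISDB.catalog.get_project
    @folder_name  = N'MyFolder',
    @project_name = N'MyProject';
```

Running this requires appropriate permissions on the SSISDB catalog.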