Is there any documentation or tutorial on how to migrate Kettle embedded in an application (for executing transformations, not for creating them) from version 3.2 to the current 4.1? I could not find any useful resource on the web, so any hint is appreciated.
Are you executing the job/transformation from a repository? If not, that is, if you are using .ktr/.kjb files, then you don't need to worry about upgrading.
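If you do embed the engine, a minimal sketch of running a .ktr file with the 4.x Java API might look like the following; the file path and the error handling are placeholders, not part of the original question:

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunTransformation {
    public static void main(String[] args) throws Exception {
        // Initialize the Kettle environment (roughly replaces the
        // EnvUtil/StepLoader initialization used in 3.2)
        KettleEnvironment.init();

        // Load the transformation definition from a .ktr file (placeholder path)
        TransMeta transMeta = new TransMeta("/path/to/transformation.ktr");

        // Execute it and wait for completion
        Trans trans = new Trans(transMeta);
        trans.execute(null); // no command-line arguments
        trans.waitUntilFinished();

        if (trans.getErrors() > 0) {
            throw new RuntimeException("Transformation finished with errors");
        }
    }
}
```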
Regards,
Dino
I built a simple inventory management app. Now I would like to publish it as a single executable file (.exe) including the database and all dependencies, so it can be used offline on another machine. I tried deployment mode `Self-contained` with the file publish options `Produce single file`, `Enable ReadyToRun compilation`, and `Trim unused assemblies` in Visual Studio, but that neither includes the database nor produces a single file.
What should I do in this case? Can anyone help me?
Regards,
Nazmul
You should create a migration script with the EF Core command-line tooling and run it against your SQL Server, or use CI/CD to deploy it; a sketch of the commands follows the links below.
Pay attention to these links:
about Migration
about deployment
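As a sketch, assuming EF Core with the `dotnet ef` tooling installed (the migration name `InitialCreate` and the output file are placeholders):

```bash
# Add a migration capturing the current model (name is a placeholder)
dotnet ef migrations add InitialCreate

# Generate an idempotent SQL script you can run on the target SQL Server
dotnet ef migrations script --idempotent -o migrate.sql

# Or apply the migrations directly to the configured database
dotnet ef database update
```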
Well, I found a solution myself: using an SQLite database instead of MSSQL. Since my application is not a multi-user application, that is not an issue; if you are building a multi-user app, you had better not use SQLite. You can use it by installing the Microsoft.EntityFrameworkCore.Sqlite NuGet package.
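For reference, a minimal sketch of the switch (the database file name is a placeholder):

```bash
# Add the SQLite provider to the project
dotnet add package Microsoft.EntityFrameworkCore.Sqlite

# Then, in the DbContext configuration, point the provider at a local file, e.g.
#   options.UseSqlite("Data Source=inventory.db");
```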
If anyone finds a better solution, let me know.
I'm looking at a migration project to move a client from Datamanager to Datastage. I can see that IBM have helpfully added the Migration Assistant tool, but, less helpfully, I cannot find any details on how to actually use it.
I did use it some years ago, and I'm aware that you need to run it from a command-line interface; it works by taking an extract file and creating a Datastage job out of it, which is then reinstalled. However, I no longer have my notes from that process.
If there is a user guide out there for this tool, I'd love to see it.
Cheers
JK
Your starting point would be the IBM support page; from there, see "How to download the tool" and ensure you have the required version (10.2.1 + Fix Pack 13 + Interim Fix) installed. The user guide PDF is part of the install, in the sub-folder "datamanager/migration".
I could not see any reference to Pentaho in the Skybot documentation. Is there a way to schedule Pentaho transformations and jobs in Skybot? I have tried creating agents and referring to the file path, but nothing is working. Any pointers?
To execute or schedule anything in Pentaho, you need to have Pentaho installed on the machine. If you are on a Linux system, first install Pentaho DI, then use the Skybot scheduler: point it at Pentaho DI's kitchen.sh or pan.sh and at the files you need to schedule/execute. This link may help:
How to schedule Pentaho Kettle transformations?
Once that is done you can execute a transformation; Skybot only needs OS access and a Pentaho installation to execute/schedule a job. The same goes for the Windows scheduler or any other scheduling tool.
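For example, a Skybot agent job could invoke a command along these lines (the installation path and job file are placeholders, assuming a Linux install of PDI):

```bash
# Run a Kettle job via Kitchen; adjust both paths to your installation
/opt/pentaho/data-integration/kitchen.sh \
  -file=/home/etl/jobs/nightly_load.kjb \
  -level=Basic
```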
Hope it helps :)
This should be pretty simple: just use the CLI tools to start your job. Kitchen, I believe, runs jobs; Pan runs transformations.
Here's the documentation for Kitchen. It's very straightforward.
http://wiki.pentaho.com/display/EAI/Kitchen+User+Documentation
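A minimal sketch of both tools (the file paths are placeholders):

```bash
# Run a job with Kitchen
./kitchen.sh -file=/path/to/my_job.kjb

# Run a transformation with Pan
./pan.sh -file=/path/to/my_transformation.ktr
```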
I am using Pig version 0.12, but for creating UDFs I am using the JAR file of Pig version 0.9.
I simply downloaded the JAR file for Pig version 0.9 and added it to my Eclipse classpath.
All the UDFs that I created against the Pig 0.9 API work fine.
But I would like to know the impact of that.
Is there any problem that I will face in the future?
The issue you will face is API inconsistency as time goes by. Some of the core APIs are relatively stable. Heck, most are. But the longer you build against an old Pig API, the higher the chance you'll hit an issue running in the cluster.
Something else to think about is whether you are overriding your Pig version in the cluster. For example, say you have an uber-jar with the Pig scripts in it. If that JAR contains Pig 0.9, you'll actually use that version rather than 0.12. By not migrating, you might be pulling in the wrong version of Pig.
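A quick way to check for this (the JAR name is a placeholder) is to list the uber-jar's contents and look for a bundled copy of Pig:

```bash
# List the uber-jar's contents and look for bundled Pig classes
jar tf my-udfs-uber.jar | grep 'org/apache/pig' | head

# If Pig classes show up, rebuild with Pig marked as a provided/compile-only
# dependency so the cluster's Pig 0.12 is used at runtime instead.
```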
Can anyone give a detailed procedure on how to migrate projects from GForge version 4.5 to TeamForge version 5.2.0? The migration includes the source repository, bug tracking, wiki, and discussions. Is it possible to move all of them?
What's the best way to handle a situation like this?
Thank you
TeamForge uses Subversion, so assuming you have command-line access to the server, you can use the svnadmin dump and svnadmin load commands.
You will need to run these commands for each repository.
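A sketch of the per-repository round trip (the repository paths and the dump file name are placeholders):

```bash
# On the old server: serialize the repository history to a dump file
svnadmin dump /svn/old-server/project1 > project1.dump

# On the new server: create an empty repository and replay the history
svnadmin create /svn/teamforge/project1
svnadmin load /svn/teamforge/project1 < project1.dump
```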
Some of the pages and wiki content (I don't know exactly which) are also stored in Subversion repositories, so you may be able to migrate those using the same method.