Pentaho Data Integration Unable To Open in Repository

Pentaho Data Integration is giving me an error every time I try to run it on the server.
It says that it cannot find my transformation in the repository, but I know it's there. I triple checked.
Why do you think I am getting this error message? What could be the cause?

Related

How to run Pentaho console on Linux from localhost:port/pentaho?

I am trying to use Pentaho, which I downloaded from SourceForge (pentaho files). I run the schema-workbench shell script correctly and a window opens with the interface, but I still haven't been able to connect to the admin console on http://localhost:8080/pentaho.
Any ideas on why this doesn't seem to work for me?
Best regards
You have a start-pentaho.sh which launches the Pentaho server on port 8080 (after a long wait the first time).
That is, if you have downloaded the correct package, because Pentaho ships several packages: one is the server; another is the client tools, which contain the schema-workbench as well as PDI (Pentaho Data Integration) and PRD (Pentaho Report Designer), plus a few others.
You are running the wrong file. To open the Pentaho console, you need to download the PENTAHO SERVER package and run 'start-pentaho.sh'.
Pentaho by default will start the PUC (Pentaho User Console) on http://localhost:8080/pentaho once the server is up and running. To get the data integration (i.e. Spoon) interface, go to:
For Windows: Pentaho install directory >> design-tools >> data-integration >> spoon.bat
For Linux/Mac: Pentaho install directory >> design-tools >> data-integration >> spoon.sh
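Once the server is up, a quick way to confirm the console is actually reachable (a small Python check of my own; a browser or curl against the same URL works just as well):

    # Not part of Pentaho itself -- just a quick check that something is
    # answering on the default console URL once start-pentaho.sh has finished.
    from urllib.request import urlopen

    with urlopen("http://localhost:8080/pentaho", timeout=10) as resp:
        # Expect HTTP 200 (possibly after a redirect to the login page).
        print(resp.status, resp.geturl())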
I hope this helps.

Issue with SSIS executing task to convert Excel to CSV

We have a task where we need to automatically convert an Excel file to a CSV to prep it for loading into a SQL database. The developers built this process into an SSIS package. For the conversion, they initially tried to have a task in the SSIS package execute a VBScript to convert the file. When they were running this on their local machines, it worked correctly. When they ran the package manually through VS on the server, it ran correctly. When they ran the package manually via the Integration Catalog, it ran correctly. We did this both as our accounts and as the service account and got the same results. However, when we scheduled it as a job, it would hang on the part of the process that executed the VBScript. No errors, it would just hang until you killed the job.
The job was executing as the service account, which has full admin access on the server, explicit full access to the share where the files are stored and converted (which is on the same server), and full admin access to SQL. The job owner is set to sa, which uses the service account. And all the job does is execute the package from the Integration Catalog, which works if you run it independently of a job. When we compared the SSISDB execution report for the manual run in the Integration Catalog using the service account to the job run, they looked the same, except the job hung on the conversion task and the other did not.
After spending some time trying to figure this out, the developers tried a different solution. They changed the conversion script from VBScript to C#. They ran the package from their local machines and once again the package worked. This time, when they ran it manually on the server, it failed. When we ran it from the Integration Catalog it failed, and when we ran it from a job it failed.
The error we keep getting is "Create CSV file: Error: Exception has been thrown by the target of an invocation." After spending several hours looking into this error, nothing suggested seems to be working.
We also tried these same solutions on a newly built server to make sure we weren't dealing with an odd configuration setting that could have been changed (It is a Dev server) and it still failed there.
At this point, we are pretty lost as to what is happening. The first solution worked, but for some reason would never work as a job. The second solution works everywhere except when run on the server.
We are looking at some other solutions to try to get around this. The next thing may be trying to use PowerShell to convert the file, but we're not sure if that will bring us back to the same issue (see the rough sketch below for the kind of conversion we need). Any suggestions you have will be greatly appreciated.
Also, we are using SQL Server 2012 Developer Edition, VS 2012 Pro, and Windows Server 2012 R2.
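For illustration, here is roughly the Excel-to-CSV step we need, sketched in Python with the openpyxl library so it does not automate Excel at all; the file paths and sheet choice are placeholders, not our real setup:

    import csv
    from openpyxl import load_workbook  # third-party: pip install openpyxl

    # Placeholder paths; the real files live on a share on the same server.
    source_xlsx = r"\\server\share\incoming\data.xlsx"
    target_csv = r"\\server\share\incoming\data.csv"

    # read_only avoids loading the whole workbook into memory;
    # data_only returns cell values rather than formulas.
    wb = load_workbook(source_xlsx, read_only=True, data_only=True)
    ws = wb.active  # first/active sheet; adjust if the data is elsewhere

    with open(target_csv, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        for row in ws.iter_rows(values_only=True):
            writer.writerow(row)

    wb.close()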
This might be because of a bug Excel has when it is asked to run jobs (that use Excel) while no user is logged on to the machine. This can also affect the Excel interop library. The solution is to create the following 2 folders:
C:\Windows\SysWOW64\config\systemprofile\Desktop
C:\Windows\System32\config\systemprofile\Desktop
and then restart the machine. :)
(Edited to show that a restart is needed. Thanks to Dustin for checking this.)
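If you would rather script the fix than create the folders by hand, here is a minimal sketch (Python here, run elevated on the server; PowerShell or a manual mkdir works just as well):

    import os

    # The two "Desktop" folders Excel expects when it runs under the SYSTEM
    # profile with no interactive user logged on (paths from the answer above).
    desktop_dirs = [
        r"C:\Windows\SysWOW64\config\systemprofile\Desktop",
        r"C:\Windows\System32\config\systemprofile\Desktop",
    ]

    for path in desktop_dirs:
        os.makedirs(path, exist_ok=True)  # needs to run as administrator
        print("ensured:", path)

    # Remember to restart the machine afterwards, as noted above.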

Where can I see error details for Azure Data Lake Analytics U-SQL jobs in case of failure?

One way is to download the file and run the job locally to get error details, but sometimes it is hard to go that way. Is there any other place where I can see error details? The details shown in the portal or in the job error panel are not complete, and you can't get to the cause of the failure.
First, how do you look at the error messages?
If you are using the Azure Portal, there is a fix planned to show all the error messages.
If you use Visual Studio, then please make sure you install the latest version of the ADL Tools for VS. Also, if you have access to the job profile, you can follow the steps outlined here if you feel you cannot see it all: Debugging U-SQL Jobs (although with the latest version of the ADL Tools for VS you should not have to look for it anymore).

SQL Server Integration Services Package

I have just recently started using SSIS and it's been a great ride so far. I get an error message when attempting to add an existing package. My guess is that packages created on the original computer cannot be opened on another machine, even if you have the .dtsx file?
Below is a link to an image of the error I am getting.
http://imgur.com/B95pVg5
Any help is appreciated!

django-jenkins ERROR: runTest (django_jenkins.tasks.lettuce_tests.LettuceTestCase)

I've been working with lettuce for interface testing for the last year and lately I've been trying to use it with django-jenkins for continuous integration.
However, I am having problems executing only the lettuce test cases I have in the app directory of my Django project.
When I run python manage.py jenkins, the lettuce tests are executed and the lettuce.xml file with the execution details is created in the reports folder. However, I am getting two errors and I don't know why (see the errors at the end of my post).
I've been trying to google it to see if someone else is having the same problem, but I couldn't find anything. If anyone has any idea, I would appreciate it.
I've been following this tutorial.
You can check the error here
Here you can access my code on github.
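In case it helps, the relevant django-jenkins part of my settings.py looks roughly like this ('myapp' stands in for my real app; the lettuce task is the one named in the error above):

    # settings.py (excerpt) -- 'myapp' is a placeholder for the real project app.
    INSTALLED_APPS += ('django_jenkins',)

    # Limit django-jenkins to the project's own apps so only their tests run.
    PROJECT_APPS = ('myapp',)

    # Tasks executed by `python manage.py jenkins`; lettuce_tests is the one
    # raising the LettuceTestCase error shown above.
    JENKINS_TASKS = (
        'django_jenkins.tasks.lettuce_tests',
    )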