Problem reading an Excel file after applying an Office 365 "Confidentiality Label" - Pentaho

I have an ETL routine in Pentaho that I'm migrating to Apache Hop.
I've run into a situation where the Hop "Microsoft Excel Input" transform/plugin cannot read the data until I open the Excel file and confirm the "Add Confidentiality Label" prompt.
In Pentaho PDI this problem does not occur. Does anyone have any tips?
After adding a confidentiality label such as "Public", then saving and closing the file, the process works perfectly.
Note: this only happens with some files.

This sounds like a problem that will not have a clear and direct answer and will require some changes in the code.
The code for Apache Hop is managed on GitHub.
You can create an issue there and one of the developers will help you get this sorted out. When creating a ticket, please be as specific as you can and attach a sample file; that will improve the chances of getting a fix on short notice.
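If it helps to narrow down which workbooks to attach as a sample, one thing worth checking is whether the label has wrapped the file in an encrypted container, which POI-based readers cannot open without decryption. Below is a minimal sketch (Python, with the file paths supplied on the command line) that only inspects the file signature; legacy .xls files also use the OLE2 format, so treat an OLE2 result as a hint rather than proof of encryption.

    # Minimal sketch: distinguish a plain OOXML workbook (a ZIP archive) from an
    # OLE2 compound file, which is what a protection-applying label produces.
    import sys

    ZIP_MAGIC = b"PK\x03\x04"                           # ordinary .xlsx (OOXML/ZIP)
    OLE2_MAGIC = b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1"    # OLE2 container

    def describe(path):
        with open(path, "rb") as f:
            header = f.read(8)
        if header.startswith(ZIP_MAGIC):
            return "plain OOXML workbook"
        if header.startswith(OLE2_MAGIC):
            return "OLE2 container (possibly label-encrypted, or a legacy .xls)"
        return "unrecognized format"

    if __name__ == "__main__":
        for path in sys.argv[1:]:
            print(path, "->", describe(path))

If the problem files report an OLE2 container while the working ones report plain OOXML, that difference is useful information to include in the issue.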

Related

Unable to save both users' changes when coding during the same session

As I understand it, in Colaboratory two people can write code in the same document at the same time. This doesn't seem to be working for us: we are coding in separate sections, but are still struggling to figure out how to keep both of our changes. We get the notification below:
"The notebook has been changed outside of this session. Would you like to overwrite existing changes?"
or "Save Failed". Then I get two windows with my edits and his edits, but am unaware how to accept both overides.
Your guidance is much appreciated.
The information I was reading was out of date. Google Colaboratory removed the real-time editing functionality:
https://github.com/googlecolab/colabtools/issues/355

Is there a way to extract Access Modules without opening the file?

I ended up corrupting my database to the point where every time I attempt to open it, I get error 3022, "changes you requested to the table were not successful because they would create duplicate values in the index."
Recovering the file does not seem possible, and my previous backup is a month old. I have been able to extract everything except the modules, which are what I most need to recover. None of the standard approaches I have found work, because they require the ability to open the database (for example, trying to set it as a VBA reference still gives the same error).
Is there any way to get the modules or code out of the file without opening it?
Edit:
I was finally able to get access to the file. Using DBEngine.CompactDatabase, I was able to do a compact and repair. The issue boils down to the "MSysAccessStorage" table being corrupt; it says "Id is not an index in this table". I now have access to everything except the modules, which I can't open without MSysAccessStorage working.
I'm going to keep poking at it, but I'm not sure what options I have for fixing a system table. Any ideas would be helpful.
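For reference, the DBEngine.CompactDatabase step mentioned above can be scripted without opening the database in the Access UI. This is only a rough sketch, driven through COM from Python; it assumes the pywin32 package, an installed ACE/DAO engine (the "DAO.DBEngine.120" ProgID; older Jet installs expose "DAO.DBEngine.36"), and placeholder file paths.

    # Rough sketch: run compact & repair against a damaged Access file via DAO,
    # without opening it in the Access UI. Paths and ProgID are assumptions.
    import win32com.client

    SOURCE = r"C:\data\corrupted.accdb"    # hypothetical path to the damaged database
    REPAIRED = r"C:\data\repaired.accdb"   # destination file; must not already exist

    engine = win32com.client.Dispatch("DAO.DBEngine.120")
    engine.CompactDatabase(SOURCE, REPAIRED)
    print("Compact & repair written to", REPAIRED)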
Unfortunately, the Visual Basic for Applications project has been corrupted. The original database doesn't even report any VBProjects when I list a count. I'm going to call this one a lost cause. Thanks to everyone who tried to help.

BI Publisher - Fail to load and save data model

I started using BI Publisher about a week ago.
When working on a new data model, about one or two queries in, I get this error when I try to save:
Failed to load servlet/res?s=%252F~developer1%252Ftest%252FJustin%2520Tests%252FOSRP%2520Information.xdm&desc=&_sTkn=9ba70c01152efbcb413.
I can no longer save my data model.
I tried deleting my queries, logging in and out, and turning my machine off and on, but no luck.
I'm currently resorting to saving all of my queries locally in Notepad.
I can create a whole new data model and it will save fine, but then after two or three queries the same thing happens.
What's going on and why would anyone design such a confusing error message?
Any help would be greatly appreciated.
After restarting your server you won't get this issue. It sometimes happens due to a connection problem, so a restart should work. It resolved my problem.
None of the proposed solutions worked for me. I found out on my own that unnecessary brackets around a CASE expression in a SELECT statement will cause this error. Remove the unnecessary brackets and the error goes away.
Per Oracle MetaLink Doc ID 2173333.1: in BI Publisher releases 11.1.1.8.x and up, there is an option to Manage Cache in the Administration section of BIP. This option was also added to 11.1.1.7 in patch 140715 (11.1.1.7.140715).
Clearing the object cache will resolve the saving errors:
Click the Administration link
Manage BI Publisher
Manage Cache
Click 'Clear Object Cache'

MS Excel - VBA Automation Error

I have a suite of MS Excel scorecard calculators that I send to 200-odd clients. In all client environments except one, the calculators work 100%. However, one client is experiencing the following problem.
Periodically the spreadsheet gets a Microsoft Visual Basic Run-Time Error. The exact message is:
Microsoft Visual Basic
Run-time error '-2147417848(80010108)':
Automation error.
As I mentioned, this happens only at one customer site; everywhere else the spreadsheet runs 100%.
Can anyone shed some light on why this happens at one site out of 200-odd?
It is possible to provide some diagnostic steps.
Do a search of the user's system for any *.exd files. They may be in a hidden folder.
Delete any *.exd files you find. They are temporary files and will be rebuilt by Excel.
Try to run the file again. Sorry I can't provide more detailed help, but this is a start based on the information provided.
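As a starting point for that search, here is a small sketch (Python, standard library only) of the *.exd find-and-delete step; the search root and the dry-run default are assumptions, so review the list it prints before switching deletion on.

    # Sketch: find (and optionally delete) cached *.exd type-library files under
    # the user's profile. Excel rebuilds them the next time it needs them.
    from pathlib import Path

    def clean_exd(root, delete=False):
        for exd in root.rglob("*.exd"):
            print("deleting" if delete else "found", exd)
            if delete:
                try:
                    exd.unlink()
                except OSError as err:
                    print("  could not delete:", err)

    if __name__ == "__main__":
        # Typical locations are under %LOCALAPPDATA%\Temp (e.g. Excel8.0, VBE, Forms),
        # but scanning the whole profile also catches hidden folders.
        clean_exd(Path.home(), delete=False)   # set delete=True after reviewing the output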

SSIS Package Not Populating Any Results

I'm trying to load data from my database into an Excel file built from a standard template. The package is ready and it runs, throwing a couple of validation warnings stating that truncation may occur because my template has fields of a slightly smaller size than the DB columns I've matched them to.
However, no data is getting populated into my Excel sheet.
No errors are reported, and when I click Preview on my OLE DB source, it shows me rows of results. None of these are getting populated into my Excel sheet, though.
You should first make sure that you have data coming through the pipeline. Double-click the arrow connecting your source task to your destination task (I'm assuming you don't have any steps in between) to open the Data Flow Path Editor. Click Data Viewer, then Add, and click OK. That will allow you to see what is moving through the pipeline.
Something to consider with Excel is that it prefers Unicode data types to non-Unicode. Chances are you have a database collation that is non-Unicode, so you might have to convert the values in a Data Conversion task.
Also, you may need to force the package to execute in 32-bit runtime. The Visual Studio application develops in a 32-bit environment, so the drivers you have visibility to are 32-bit. If there is no 64-bit equivalent, it will break when you try to run the package. Right-click your project, click Properties, and under the Debug menu change the Run64BitRuntime setting to False.
You don't provide much information. Add a Data Viewer between your source and your Excel destination to see if data is passing through. To do it, just double-click the data flow path, select Data Viewer, and then add a grid.
Run your package. If you see data, provide more details so we can help you.
A couple of questions that may lead to an answer:
Have you checked that data is actually passed through the SSIS package at run time?
Have you double checked your mapping?
Try converting within the package so you don't have the truncation issue
If you add some more details about what you're running, I may be able to give a better answer.
EDIT: Considering what you wrote in your comment, I'd definitely try the third option. Let us know if this doesn't solve the problem.
Just as an assist for anyone else running into this - I had a similar issue and beat my head against the wall for a long time before I found out what was going on. My export WAS writing data to the file, but because I was using a template file as the destination, and that template file had previous data that had been deleted, the process was appending the data BELOW the previously used rows. So, I was writing out three lines of data, for example, but the data did not start until row 344!!!
The solution was to select the entire spreadsheet in my template file, and delete every bit of it so that I had a completely clean sheet to begin with. I then added my header lines to the clean sheet and saved it. Then I ran the data flow task and...ta-daa!!! Perfect export!
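If you want to check a template for this without opening Excel, a quick sketch along these lines (Python with the openpyxl package; the template path is a placeholder) will show whether a sheet still reports a large used range even though the cells look empty:

    # Sketch: report each sheet's stored dimensions. A max_row far beyond your
    # header block suggests stale "used" rows that the Excel destination will
    # append below.
    from openpyxl import load_workbook

    TEMPLATE = r"C:\templates\export_template.xlsx"   # hypothetical path

    wb = load_workbook(TEMPLATE)
    for ws in wb.worksheets:
        print(f"{ws.title}: dimensions={ws.dimensions}, max_row={ws.max_row}")

If the numbers look inflated, clearing the sheet and re-saving the template, as described above, resets them.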
Hopefully this will help some poor soul who runs into this same issue in the future!