Access params in Pentaho from scripted data source? - scripting

I am currently developing a parameterized report using Pentaho.
Pentaho defines parameters that can be supplied at generation time or injected from an external source via the Pentaho API.
Now for the question: I have a scripted data source in Groovy and would like to parameterize it a bit. What is the best way to access the parameters (defined in Pentaho) from a scripted data source?
With an SQL data source you can write ${ParamName} directly and the string is substituted; with a scripted source, however, this doesn't seem to work.
Any and all comments are more than welcome!
P.S. Sorry for this seemingly trivial question, but we all know how badly documented Pentaho is.

Do either of these pages help?
http://www.sherito.org/2011/11/pentaho-reportings-metadata-datasources.html
http://forums.pentaho.com/archive/index.php/t-96689.html
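For what it's worth, the linked posts suggest that Pentaho Reporting's scriptable data source exposes the report's current `dataRow` to the script, and that parameters can be read from it by name with `dataRow.get("ParamName")` — the scripted counterpart of `${ParamName}` in SQL. A minimal sketch of that pattern (in Java syntax, which is also valid Groovy); the `DataRow` interface below is a hypothetical stub standing in for Pentaho's, so the example runs standalone:

```java
import java.util.HashMap;
import java.util.Map;

public class ScriptedSourceSketch {
    // Hypothetical stub mimicking Pentaho's DataRow: in a real scriptable
    // data source the engine provides `dataRow` to the script already
    // populated with the report's parameter values.
    interface DataRow {
        Object get(String name);
    }

    static String buildQuery(DataRow dataRow) {
        // Read the report parameter by name, as ${Region} would in SQL.
        Object region = dataRow.get("Region");
        return "SELECT * FROM sales WHERE region = '" + region + "'";
    }

    public static void main(String[] args) {
        Map<String, Object> params = new HashMap<>();
        params.put("Region", "EMEA");
        DataRow row = params::get;  // stub backed by a plain map
        System.out.println(buildQuery(row));
    }
}
```

In the real script you would skip the stub entirely and call `dataRow.get(...)` on the binding the engine hands you.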

Related

Alter data source

Power BI can connect to various data sources and run SELECT queries.
Is it also possible to run other queries (INSERT INTO, UPDATE, ...)?
Right now I need it for a PostgreSQL database, but I could use it for others in the future.
No, you can't run INSERT/UPDATE queries directly from Power BI; that isn't what the tool is for. If you find you need to, there is probably a major flaw in your design, or you are not using the right tool for the job. There are, however, a few ways to work around this (again, I'm not saying that you SHOULD). Usually it is done with a custom-written Power App embedded in your report via the Power Apps visual: the app writes to the database and then refreshes the report (if needed).
You can start here, and I recommend this in-depth session: Writing back data to Power BI from your reports.
To be blunt, the answer is no. Power BI is an analysis platform for data. There may be some advanced way to do it, but manipulating data from a report, or from any BI tool, is neither logical nor a good idea. You can find answers on various blogs where the same question has been asked. For more details, check the links below:
help link 1
help link 2

Liquibase load data in a format other than CSV

With the loadData change that Liquibase provides, one can specify seed data in CSV format. Is there a way I can instead provide, say, a JSON or XML file with data that Liquibase would understand?
The use case: we are trying to load some sample data which is hierarchical, e.g. a Category - Subcategory relation, which would require supplying the parent id for all related categories. We would like to avoid including the ids in the seed data by using, say, JSON:
{
"MainCat1": ["SubCat11", "SubCat12"],
"MainCat2": ["SubCat21", "SubCat22"]
}
This is very likely unsupported (Google couldn't help me), but is there a way to write a plugin or extension that does this? A pointer to a guide (if any) would help.
NOTE: This is not about specifying the change log in that format.
This is not currently supported, and supporting it robustly would be pretty difficult. The main difficulty lies in the fact that Liquibase is designed to be database-platform agnostic, combined with the design goal of being able to generate the SQL required for an operation without actually performing the operation live.
Inserting data like you want without knowing the keys and just generating SQL that could be run later is going to be very difficult, perhaps even impossible. I would suggest approaching Nathan, who is the main developer for Liquibase, more directly. The best way to do that might be through the JIRA bug database for Liquibase.
If you want to have a crack at implementing it, you could start by looking at the code for the LoadDataChange class (source in Github), which is where the CSV support currently lives.
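Short of extending LoadDataChange, one pragmatic workaround is a preprocessing step that flattens the hierarchical JSON into the CSV that loadData already understands, generating the ids and parent ids as it goes. A rough sketch of that flattening, assuming a hypothetical category table with id, name, parent_id columns (JSON parsing is elided; the map below mirrors the example in the question):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class SeedDataFlattener {
    // Turn {"MainCat1": ["SubCat11", ...], ...} into CSV rows with
    // generated ids and parent ids, ready for Liquibase's loadData.
    static List<String> toCsvRows(Map<String, List<String>> categories) {
        List<String> rows = new ArrayList<>();
        rows.add("id,name,parent_id");  // assumed column layout
        int id = 1;
        for (Map.Entry<String, List<String>> e : categories.entrySet()) {
            int parentId = id++;
            rows.add(parentId + "," + e.getKey() + ",");  // top level: no parent
            for (String sub : e.getValue()) {
                rows.add(id++ + "," + sub + "," + parentId);
            }
        }
        return rows;
    }

    public static void main(String[] args) {
        Map<String, List<String>> cats = new LinkedHashMap<>();
        cats.put("MainCat1", List.of("SubCat11", "SubCat12"));
        cats.put("MainCat2", List.of("SubCat21", "SubCat22"));
        toCsvRows(cats).forEach(System.out::println);
    }
}
```

The catch named in the answer still applies: ids generated this way are only safe if nothing else inserts into the table, which is exactly why doing it inside Liquibase itself is hard.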

SSIS for testing bulk data

Can SSIS be used to test Excel files containing large volumes of data (around 50,000 rows) that need to be verified against the data available in a database? Both the format and the content of the data need to be validated.
I am new to SSIS, I have been trying to implement this for some time now, and I am not sure if I am investing my time in the right place.
Has anyone done a similar kind of implementation?
There is a ton of information on the web about this.
You can accomplish this using the SQL Server Import/Export Wizard or SSIS, depending on how complex or simple you want to get.
If you're trying to automate this, then SSIS is the better choice.
The primary use for SSIS is data warehousing, as the product offers a fast and flexible tool for ETL and data validation.
What does SSIS do? Here's a short answer.
How to perform Validation?
Another example here
Hope this helps.

HL7 Continuity of Care Document (CCD) Development using SQL Programming

I have been given a project to create HL7 Continuity of Care Documents (CCD) using data stored in a SQL Server 2008 database. I have intermediate to advanced knowledge in SQL programming but I have never used FOR XML statements.
Has anybody ever built a stored procedure that successfully creates CCDs strictly using SQL programming with FOR XML?
Any tips would be greatly appreciated. If anybody used anything else besides SQL, feel free to let me know, but my background is mainly in SQL, T-SQL, with some knowledge in Java and VB.
Thanks
The way we approached this in our .NET application was to first generate classes from the CCD or CDA XSD (which can be obtained from several locations, including the HL7 store) using Microsoft's xsd tool.
We then wrote (a lot of) code that creates and populates the CCD classes from data in the database.
To present the data to the user, we first serialized the record to a string with the XmlSerializer, then converted the generated XML to HTML using the ccd.xsl transform file. The ccd.xsl is available from several locations, including this one.
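The final XML-to-HTML step is plain XSLT, so it isn't tied to .NET; in Java, for instance, javax.xml.transform can apply the same ccd.xsl. A sketch of that step, using a toy inline stylesheet and document as stand-ins for the real ccd.xsl and a serialized CCD:

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class CcdToHtml {
    // Apply an XSLT stylesheet (in practice, ccd.xsl) to an XML document
    // and return the rendered HTML as a string.
    static String transform(String xml, String xslt) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(xslt)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // Toy stand-ins: a real pipeline feeds the serialized CCD and ccd.xsl.
        String ccd = "<patient><name>Jane Doe</name></patient>";
        String xsl =
            "<xsl:stylesheet version='1.0' "
          + "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
          + "<xsl:output method='html'/>"
          + "<xsl:template match='/patient'>"
          + "<h1><xsl:value-of select='name'/></h1>"
          + "</xsl:template>"
          + "</xsl:stylesheet>";
        System.out.println(transform(ccd, xsl));
    }
}
```

The generation side (FOR XML versus serialized classes) is the hard part; once you have valid CCD XML, rendering it is a one-liner like this regardless of language.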

Which tool to use to export SQL schema from ODBC database?

I have a database in a format that can be accessed via ODBC. I'm looking for a command-line tool to generate an SQL file with DROP/CREATE statements from it, preferably with all the information, including table/field comments and table relations. (Possibly also a tool to parse the file and import the schema, but I guess that would be relatively easier to find.) I need this to automate my workflow: to be able to design the database visually but store it in SVN in code form.
Which tool should I use?
If this helps, the database in question is MS Access, but I guess there's a higher chance of finding a generic ODBC tool...
Okay, I wrote a tool to export the Access schema and parse SQL files myself; it's available here:
https://bitbucket.org/himselfv/jet-tool
Feel free to use if anyone needs it.
Adding this because I wanted to search an ODBC schema and came across this post. This tool lets you dump the schema itself in CSV format:
http://sagedataobjects.blogspot.co.uk/2008/05/exploring-sage-data-schema.html
And then you can grep away.
This script may work for you with some modifications. Access (the application) is required though.
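For a generic, code-driven route, JDBC's DatabaseMetaData can walk tables and columns over whatever driver you have and emit the DDL yourself (note the old JDBC-ODBC bridge was removed in Java 8, so a native driver is needed). A sketch; the formatting helper is ours and deliberately naive (no comments, keys, or relations, minimal quoting):

```java
import java.util.List;

public class SchemaDumper {
    // Render DROP/CREATE statements for one table from (name, type) pairs.
    static String toDdl(String table, List<String[]> columns) {
        StringBuilder sb = new StringBuilder();
        sb.append("DROP TABLE \"").append(table).append("\";\n");
        sb.append("CREATE TABLE \"").append(table).append("\" (\n");
        for (int i = 0; i < columns.size(); i++) {
            String[] col = columns.get(i);
            sb.append("  \"").append(col[0]).append("\" ").append(col[1]);
            sb.append(i < columns.size() - 1 ? ",\n" : "\n");
        }
        sb.append(");\n");
        return sb.toString();
    }

    // With a live connection, the pairs come from JDBC metadata, e.g.:
    //   DatabaseMetaData md = connection.getMetaData();
    //   ResultSet cols = md.getColumns(null, null, tableName, "%");
    //   while (cols.next())
    //       columns.add(new String[] { cols.getString("COLUMN_NAME"),
    //                                  cols.getString("TYPE_NAME") });

    public static void main(String[] args) {
        System.out.print(toDdl("categories", List.of(
                new String[] { "id", "INTEGER" },
                new String[] { "name", "VARCHAR(255)" })));
    }
}
```

Type names come back in the source database's dialect, so for a round trip into another engine you would still need a mapping layer, which is exactly what makes the ready-made tools above attractive.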