Data Validation in Tableau - testing

I am looking for ways to validate the data that Tableau generates. I know we cannot access elements from Tableau Server itself, but has anyone found an approach to tackling data validation for the reports? I am very interested in learning what types of testing are possible around Tableau and its reports.
Thanks in Advance,
RV

We have a testing group that follows behind our workbook developers and tests the functionality and validity of calculations. One of their primary methods is to examine (sometimes downloading to Excel) the underlying data, which should include fields from the originating data source in addition to the results of calculated fields.

Related

Power BI maxing connections to DB :( Can we populate multiple tables with a single Sql.Database call?

I am assisting my team in troubleshooting an issue with a Power BI report we are developing. We have a rather complex data model in the source SQL database, so we have created 5-6 views to better manage the data. We have a requirement to use DirectQuery, as one key requirement for the report is that the most up-to-date data in the database is visible, rather than having a delay from loading/caching the data. We also have a single data source, just the one database.
When we run the report, we see a spike of 200-500 connections to the database from the specific user for the report data source, and those connections don't close. This is clearly an issue and unsustainable for any product. We have a ticket open with Microsoft premium support to address the connections not closing, but in the meantime, I'm wondering whether we're doing something wrong inside the report.
When I view the queries in the query editor, we basically have one query for each view, and each is simply:
let
    Source = Sql.Database(Server, Database),
    query_view_name = Source{[Schema = "...", Item = "..."]}[Data]
in
    query_view_name
(I don't have the raw code in front of me, but that's the gist of it.)
It seems to me, based on analytics in the database, that "Sql.Database" is opening a new connection every time a view is called. With 5-6 views, that's 5-6 connections at a minimum; then each time a filter is changed, more connections are opened, and it compounds from there until the database connection pool is maxed out.
Is there a way to populate all the tables using a single connection to the database? Why would Power BI be using so many connections? Can we populate multiple tables in the advanced query editor? Using DirectQuery, are there any suggestions for what we can look at/troubleshoot/change in the report?
Thanks!
Power BI establishes multiple connections to the database to load multiple tables in parallel. If you don't want this, you can turn it off from Options -> Current File -> Data Load -> Enable parallel loading of tables.
Keep in mind that turning this option off will most likely increase the model loading time.
You may want to take a look at the Maximum connections per data source option in Options -> Current File -> DirectQuery, and the whole Query reduction section beneath it. Turning on Slicer selection and Filter selection on that page is highly recommended for cases like yours, but you will need to train your users to click Apply to see the results.
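On the question of populating all the tables from a single Sql.Database call: you can define the connection once in its own query and reference it from each table query. A minimal sketch, where the query name "BaseSource" and the server, database, and view names are all hypothetical (note this mainly keeps the connection definition in one place; the number of simultaneous connections is still governed by the settings above):

// Base query, defined once as its own query (hypothetical name "BaseSource")
let
    Source = Sql.Database("server_name", "database_name")  // hypothetical names
in
    Source

// Each table query then references BaseSource instead of calling Sql.Database again
let
    Source = BaseSource,
    view_data = Source{[Schema = "dbo", Item = "view_name_1"]}[Data]  // hypothetical view
in
    view_data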
Ok.
We have a rather complex data model in the source SQL database, so we have created 5-6 views to better manage the data.
That's fine.
We have a requirement to use DirectQuery,
But now you're going to have a bad time. DirectQuery + complex views is a recipe for poor performance. Queries against your views will add joins, potentially across the whole model for filter context, as well as measure and calculated column expressions. And these queries change dynamically based on the user's interaction with the report, so it's very difficult to see and test all the possible queries.
Basic guidance is to use import mode against views, and only use DirectQuery against properly-indexed tables. To address data freshness, you can replace the views with tables you load and keep up-to-date from your application, or perhaps use an Indexed View, etc.

Better test reporting

I'm looking for some help designing a better summary report. Right now we publish and send everything (execution % by module, defects, etc.) in an Excel file, and I was hoping we could use that Excel data to generate a live dashboard accessible via a URL.
To add, the execution data comes from QTest and the defects from JIRA. At this point we are even OK with filling in the data in Excel manually and using that as a source for any reporting tool.
If a free tool is available, even better.
Any leads, help, or feedback is appreciated.
Thanks,
MD
Sounds like you need Microsoft's Power BI. We've done a lot of reporting from JIRA using this free tool (Desktop). If you need to share it with others "real time", you'll prefer the online experience for about $10/user/month. But if you're looking to stay "free", you can simply share the Power BI file with your stakeholders.
I recommend AGAINST using the built-in JIRA app. It seems to want to pull back all your issues. Instead, use a REST API call like this:
https://domain/rest/api/2/search?jql=filter=22605&fields=id,key,summary,description
If you get more issues back than your Issue Search is configured for, the pagination can be a little tricky. Also, multiple values in a custom field need special handling.
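One way to handle the pagination in Power Query is to read the total from the first response and then page through with startAt. A rough sketch, reusing the placeholder domain and filter id from the URL above (the field list and page size are illustrative, and your JIRA instance may cap maxResults differently):

let
    BaseUrl  = "https://domain/rest/api/2",   // placeholder domain from the URL above
    PageSize = 100,                           // illustrative; JIRA may cap maxResults lower
    GetPage  = (startAt as number) =>
        Json.Document(
            Web.Contents(
                BaseUrl,
                [
                    RelativePath = "search",
                    Query = [
                        jql        = "filter=22605",
                        fields     = "id,key,summary,description",
                        startAt    = Text.From(startAt),
                        maxResults = Text.From(PageSize)
                    ]
                ]
            )
        ),
    First      = GetPage(0),
    Total      = First[total],
    // Fetch the remaining pages and stitch the issue lists together
    RestStarts = List.Generate(() => PageSize, each _ < Total, each _ + PageSize),
    Pages      = List.Transform(RestStarts, each GetPage(_)[issues]),
    AllIssues  = List.Combine({First[issues]} & Pages)
in
    AllIssues

From there, Table.FromRecords can turn the combined issue list into a table, and you can expand the fields you need.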
Or if you're on premises and know your JIRA DB, direct SQL is an efficient way to go.
We use both mechanisms (REST and SQL). SQL lets us add logic, in a view over the data, that JIRA itself doesn't report on easily (parent-child-subchild relationships and roll-ups of effort, story points, etc.).
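If you go the direct SQL route, the navigation pattern in Power Query is the same; a minimal sketch, where the server, database, and roll-up view names are all hypothetical:

let
    // Hypothetical on-premises JIRA database
    Source = Sql.Database("jira-db-server", "jiradb"),
    // Hypothetical view that rolls up effort/story points across parent-child-subchild issues
    Rollup = Source{[Schema = "dbo", Item = "vw_issue_effort_rollup"]}[Data]
in
    Rollup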
The best part of the Power BI solution is that you should be able to integrate the data from JIRA and your test tool. (We pull from JIRA and our time tracking system.)

Power BI - Getting data from sources with no direct API

I am working on designing a Power BI dashboard and could use some insight on the data sources. My sources of financial data are Kyriba, Bloomberg, and ClearWater Analytics. However, no API exists for any of these in Power BI. What's the best way to pull data from these sources into Power BI? Manually generating an Excel file every time? I need help understanding the best way to automate this process. Thank you!
Unfortunately, or maybe fortunately, these sites are highly secured, so at the moment there is no connector to these companies' data. But I am quite sure there is some way to automate extraction to CSV or even Excel files. Once you have these files you can do everything you want in Power BI.
Maybe you can contact your IT team to check whether an FTP channel is possible to automate the import.
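Once the extracts land somewhere reachable (a network share, say), pulling them into Power BI is straightforward. A minimal sketch, assuming a hypothetical share path and file name and a comma-delimited, UTF-8 extract:

let
    // Hypothetical share where the scheduled extracts are dropped
    FolderPath = "\\fileserver\finance\extracts\",
    Source = Csv.Document(
        File.Contents(FolderPath & "kyriba_positions.csv"),  // illustrative file name
        [Delimiter = ",", Encoding = 65001]
    ),
    WithHeaders = Table.PromoteHeaders(Source, [PromoteAllScalars = true])
in
    WithHeaders

Folder.Files can pick up a whole directory of dated extracts if the files accumulate.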
Good luck!

How to mix SSAS with other sources in Power BI

Using Power BI, we cannot find any way to mix data coming from SQL Server Analysis Services with other data sources (Excel, to make it easy).
As soon as we select the SSAS data source, the New Source button is greyed out, with no way to use it.
Trying the inverse (Excel first) seems to work, but it means importing specific SSAS data (which in our case is several million rows), so in fact it is hardly usable, as we have to know in advance which columns we are going to use for every report ... not quite user friendly!
Is there any way to do it the "logical" way?
Thanks
Power BI does not allow mixing DirectQuery with other data sources.
If it fits your task, you could use your source in Import mode instead.
https://powerbi.microsoft.com/en-us/documentation/powerbi-desktop-use-directquery/

The Pentaho BI Platform Workflow Issue

I have been working with Pentaho for the last few days. I have been able to set up Pentaho Report Designer to generate a sample report by following their documentation. Then I followed this article http://www.robertomarchetto.com/www/how_to_use_pentaho_report_designer_tutorial and managed to export the report to the Pentaho BI Server.
What I don't understand is the Pentaho workflow. What process should I follow; that is, what's the purpose of exporting the report to the Pentaho BI Server? Why is there a Data Integration tool? Why is there a BI server when I can export the report from the Designer tool?
Requirement
All I want to do is retrieve the data from the MySQL DB, put it into a data mart, and then generate a report from the data mart. (According to what I have read, creating a data mart is the efficient way.)
How can I get it done?
Pentaho Data Integration can be used to automate this report generation.
In Report Designer you pass a parameter or set of parameters to generate a single report output.
With Data Integration you can generate the reports for different sets of parameters. For example, if reports are generated on a daily basis, you can automate them for the whole month, so that there is no need to generate reports daily and manually.
And using the Pentaho Business Intelligence Server, all of these operations can be scheduled.
To generate data/tables (fact tables/dimension tables) in the MySQL DB from different sources like files or other DBs - the Data Integration tool comes into the picture.
To create a schema on top of the fact tables - the Mondrian tool.
To handle users/roles on top of the created cubes - the Metadata Editor.
To create simple reports on top of small tables - Report Designer.
For sequential execution (in one go) of DI jobs/transformations, reports, and JavaScript - Design Studio.
thanks to user surya.thanuri # forums.pentaho.com
The Data Integration tool is mostly for ETL; it's a separate tool, and you can ignore it unless you are doing complex analysis of data from multiple dissimilar data sources. You don't need to 'export' reports to the Pentaho server; you can write them directly to a directory and then refresh the repository from inside the Pentaho web application. Exporting them is just one workflow technique.
You're going to find that there are about a dozen ways to do any one thing with Pentaho. For instance, I use CDA data sources with my reports rather than placing the SQL code inside my reports. Alternatively, you can link up to a Data Integration server to execute Data Integration scripts and view a result set.
Just to answer your data mart question: in general, a data mart should probably be supported by either the Data Integration tool (depending on your situation, I don't exactly recommend this) or database functions/replication streams (recommended).
Just to hazard a guess, it sounds like someone tossed you a project saying: "We need a BI system; here's the database where the data is stored; here are the reports we're already getting. X looked at Pentaho and liked it. You should use that."
The first thing you need to do is understand the shape of the data: volume, tables, interrelations. Figure out what the real questions they want answered are. Determine whether they need real-time reporting, etc. Just getting the data mart together, if you even need one, can take quite a while. I think you may have jumped the gun on Pentaho itself.
thanks to user flamierd # forums.pentaho.com