Tabular in-memory mode vs. Multidimensional MOLAP mode - SSAS

I have a question regarding the SSAS models: Tabular and Multidimensional cubes.
I've read that both models can work in a real-time mode (DirectQuery mode and ROLAP).
My question concerns the Tabular model in in-memory (cached) mode and the Multidimensional model in MOLAP mode. How recent is the data there? Can I define for myself how often the data gets refreshed, or how is this managed?
Thank you in advance!

First, with regard to real-time mode, ROLAP is indeed only as real-time as the data source it uses. If it is reading from a data warehouse that performs daily ETL, it is only as up to date as that warehouse. SSAS Tabular DirectQuery mode is (currently) only available with a SQL Server data source.
The main purpose of ROLAP or DirectQuery mode is, yes, to allow for real-time data (if that is a reporting requirement), but mainly to shift the processing load onto the data source server rather than the Analysis Services server.
Second, in regard to Tabular in-memory and Multidimensional MOLAP modes: yes, you define the refresh frequency yourself, typically via a scheduled SSIS package or an XMLA processing script.
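For illustration, here is a minimal sketch of one common scheduling approach; the job, server, and database names are made up, not from the question. It creates a SQL Server Agent job with an Analysis Services Command step that runs an XMLA ProcessFull against the database. For Tabular models at compatibility level 1200 or higher the command would be a TMSL refresh script instead, but the scheduling mechanism is the same.

-- Minimal sketch (hypothetical names): nightly ProcessFull of an SSAS database
-- scheduled through a SQL Server Agent job with an Analysis Services Command step.
USE msdb;
GO

EXEC dbo.sp_add_job
    @job_name = N'Process MyTabularModel';

EXEC dbo.sp_add_jobstep
    @job_name  = N'Process MyTabularModel',
    @step_name = N'ProcessFull',
    @subsystem = N'ANALYSISCOMMAND',          -- SQL Server Analysis Services Command subsystem
    @server    = N'MySsasServer\TABULAR',     -- hypothetical SSAS instance
    @command   = N'<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
                     <Object>
                       <DatabaseID>MyTabularModel</DatabaseID>
                     </Object>
                     <Type>ProcessFull</Type>
                   </Process>';

EXEC dbo.sp_add_jobschedule
    @job_name = N'Process MyTabularModel',
    @name = N'Nightly 02:00',
    @freq_type = 4,                           -- daily
    @freq_interval = 1,
    @active_start_time = 020000;              -- 02:00:00

EXEC dbo.sp_add_jobserver
    @job_name = N'Process MyTabularModel';    -- register the job on the local Agent
GO

An SSIS package with an Analysis Services Processing Task scheduled the same way achieves the same result; either way, the in-memory or MOLAP data is only as fresh as the last processing run.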

Related

OLAP analysis and Reporting

Could someone please tell me why some people do this: after creating a data warehouse, we create both reports (reporting) and OLAP analysis.
My question is: why do we do OLAP analysis and also create reports? What is the benefit of doing both? I think reporting is sufficient to help the client analyse the data, but still some clients ask for both.
I use Analysis Services models as the source for all reporting. In your case you may have transactional reporting (large amounts of row-level data) which doesn't lend itself to that technology. Analysis Services is better suited to data which is likely to be aggregated.
Tabular models are a great way to present data to users for them to interact with, as they can be designed in a way which makes them better for self-service data analytics.
I've also implemented the hybrid approach you mentioned. This can be useful if businesses have varying report requirements. For example, dashboarding could be done using Power BI connected to the Tabular model, whereas transactional reporting such as large emailed spreadsheets could be run from the SQL Server (perhaps using SSRS or Power BI paginated reports).

Row Level Security (RLS) for a SSAS Tabular Model

I am new to SSAS technologies for developing analytical models. I have to build several tabular models for a huge application in which security is quite relevant. What I would like to do is to re-use the row level security existing in the sources of the cube and apply it to the cube itself.
For example, if I build a tabular model from two tables of a schema, and these two tables have RLS enabled, I would like the cube to take this security into account, so that when I access reports and log in as a user, I will only see aggregated data according to the permissions I have.
Searching through the web I found ways of implementing RLS within the cube, but none about inheriting it from the sources. But again, I am new to the technology, so I preferred to ask here.
Thanks
The most obvious solution to your request is running SSAS Tabular in thin mode (called DirectQuery mode). As long as you set ImpersonateCurrentUser in the Existing Connections dialog in Visual Studio, then when a user queries the SSAS model, SSAS will in turn send one or more SQL queries to the database under the end user's credentials, and the RLS in the SQL database will come into play.
One caveat is that I would only recommend DirectQuery in SSAS 2016, not in prior versions. Another caveat is that performance will be slow compared to a cached (in-memory) model in SSAS, so if performance isn't acceptable, turn off DirectQuery and reimplement RLS inside SSAS. Also, DirectQuery currently does no caching of results, so the load against SQL Server will not be offloaded to SSAS at all. Finally, if you use DirectQuery and ImpersonateCurrentUser, you may have to set up Kerberos if your SQL Server isn't on the same server as SSAS, so that user credentials can double hop.
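To make the "reuse the RLS at the source" idea concrete, here is a minimal sketch of SQL Server row-level security (available from SQL Server 2016) on a source table; the schema, table, and column names are invented for illustration. With DirectQuery and ImpersonateCurrentUser, the queries SSAS issues on each user's behalf are filtered by this policy, so the model effectively inherits it.

-- Minimal sketch (hypothetical names): RLS on the source table, which DirectQuery
-- queries are subject to when they run under the end user's credentials.
CREATE SCHEMA Security;
GO

-- Inline table-valued predicate: a row is visible when its SalesRep matches the caller
CREATE FUNCTION Security.fn_SalesFilter (@SalesRep AS sysname)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS fn_result
    WHERE @SalesRep = USER_NAME();   -- or join to a user-to-permission mapping table
GO

CREATE SECURITY POLICY Security.SalesRlsPolicy
    ADD FILTER PREDICATE Security.fn_SalesFilter(SalesRep)
    ON dbo.FactSales
    WITH (STATE = ON);

If you later switch to an in-memory model, the equivalent restriction has to be re-expressed as a role filter inside the Tabular model instead.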

Does CQRS With OLTP and OLAP Databases Make Sense?

I have several OLTP databases with APIs talking to them. I also have ETL jobs pushing data to an OLAP database every few hours.
I've been tasked with building a custom dashboard showing high-level data from the OLAP database. I want to build several APIs pointing to the OLAP database. Should I:
1. Add to my existing APIs and call the OLAP database, using a CQRS-type pattern so reads come from OLAP while writes go to OLTP. My concern here is that there could be a mismatch between the data that is read and the data that is written; how mismatched it is depends on how often the ETL jobs run (hours in my case).
2. Add to my existing APIs and call the OLAP database, then ask the client to choose whether they want OLAP or OLTP data where the APIs overlap. My concern here is that the client should not need to know about the implementation detail of where the data is coming from.
3. Write new APIs that only point to the OLAP database. This is a lot of extra work.
Don't use #1: when management talk about analytical reports, they don't worry about data mismatches between ETL runs - obviously you will generate a CEO report after the day's ETL has finished.
Don't use #2: this way you'll load the transactional system with analytical overhead and dissolve the isolation between the purposes of the two systems (not good for operations and maintenance).
Use #3, as it's the best way to fetch processed results. Use modern tools like Excel, Power Query and Power BI, which let you create rich dashboards quickly, instead of going into the tables and writing APIs.

SSAS - MSBI - Solution - Suggestions

Is my understanding correct that we can build SSAS cubes sourcing from the transaction systems? I mean not the live systems, but a copy of the live data.
I'm trying to see if there is any scope to address a few reporting needs without building a traditional data warehouse and then cubes on top of it; instead, build cubes for monthly aggregated financial reporting needs sourcing from a backup copy of the transaction systems.
Alternatively, if you have any better way to proceed please suggest.
Regards,
KK
You can create a set of views on top of your transactional system tables and then build your SSAS cubes on top of those views. This would be less effort than creating a fully fledged data warehouse.
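As a rough illustration (the table and column names below are invented), such a view might flatten the transactional tables into a fact-shaped result for the cube to source from:

-- Rough sketch (invented names): a view presenting transactional tables
-- in a fact-like shape for the SSAS cube to build on.
CREATE VIEW dbo.vw_FactSales
AS
SELECT
    o.OrderDate,
    o.CustomerID,
    ol.ProductID,
    ol.Quantity,
    ol.Quantity * ol.UnitPrice AS SalesAmount
FROM dbo.Orders     AS o
JOIN dbo.OrderLines AS ol
    ON ol.OrderID = o.OrderID;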
I am a data warehouse developer (and therefore believe in cubes), but not every reporting solution warrants the cost of building a cube. If your short to medium term reporting requirements are fixed and you don't have users requiring data to be sliced differently each week, then a series of fixed reports may suffice.
You can create a series of SQL Server Reporting Services reports (or extract to Excel) either directly against your copied transactional data, or against a series of summarised tables that are created periodically. If you decide to utilise a series of pre-formatted reporting tables, try to create tables that cover multiple similar reports (rather than 1 monthly report table = 1 report) for ease of ongoing maintenance.
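For example, a single monthly summary table at a reasonably low grain (the names below are purely illustrative) can feed several similar financial reports rather than one table per report:

-- Illustrative sketch: one periodically rebuilt monthly summary table at a grain
-- low enough to serve several similar financial reports.
CREATE TABLE dbo.MonthlyFinanceSummary
(
    FinancialMonth date          NOT NULL,
    CostCentre     nvarchar(20)  NOT NULL,
    AccountCode    nvarchar(20)  NOT NULL,
    ActualAmount   decimal(18,2) NOT NULL,
    BudgetAmount   decimal(18,2) NOT NULL,
    CONSTRAINT PK_MonthlyFinanceSummary
        PRIMARY KEY (FinancialMonth, CostCentre, AccountCode)
);

Reports for different cost centres or account groupings then become different queries over the same table, which keeps ongoing maintenance down.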
There are many other important aspects to this that you may need to consider first, such as how busy the transaction system is, the size of the data, and concurrency and availability issues.
It is absolutely fine to have a copy of your live data and then build a report on top of it. Bear in mind that the data you see in the report will not be the latest; there will be a latency factor depending on the frequency of your data pull.

ROLAP not working - how to design the cube/DSV for ROLAP?

I'm trying to configure ROLAP storage mode for a partition in an existing SSAS cube. The cube is a little messy in that the measure group is defined by a named query (as opposed to a table) and the dimensions are defined in several different data source views (DSVs).
This is the error message I get when querying the cube with MDX:
Executing the query ...
Server: The operation has been cancelled.
Errors in the high-level relational engine. The 'dbo_dim_account' table that is required for a join cannot be reached based on the relationships in the data source view.
Execution complete
Note that MOLAP storage mode with proactive caching works fine. This problem occurs only when storage mode is ROLAP or HOLAP.
Also, I have tried to add the tables of all dimensions to the DSV of the cube in question but that doesn't seem to help.
Any ideas?
Not an expert here, but you could try importing the AS DB in Visual Studio.NET - "Import Analysis Services database" in the New Project... dialog.
Once in there, you can see the table schemas for the Data Source View (which is where the relational tables are defined that the cubes are extracted from). Next, look to make sure the "dbo_dim_account" table is there and that your fact table is related to it.
It may be that a dimension and fact have to be in the same DSV for the relation to work?
Also, maybe the SSAS flight recorder or the Application log would show more detail on the issue?