What parameters do I consider to choose SSAS or PowerPoint?
I am totally new at choosing a BI tool. Where do I start? Is setting up SSAS a doable job for a novice?
I think you mean SSAS (tabular modeling) vs PowerPivot. Cathy Dumas has a great post that lists some considerations to take into account when deciding which tool is right for the job...
http://blogs.msdn.com/b/cathyk/archive/2011/10/27/when-to-choose-tabular-models-over-powerpivot-models.aspx
I am looking at ways to bring an Excel spreadsheet, which uses a lot of Visual Basic (VBA) code, onto the Power Platform. Its purpose is to run forecasting and optimisation algorithms.
What is the best way to re-create this code?
I am thinking that we move all the data from the spreadsheet into Dataverse. It seems the only option for running the code is to put it into an Azure Function?
Is there another way to do this, through AI Builder or Power Automate? Could this be run through Power Fx, or is Power Fx limited to basic instructions?
The Power Platform seems limited in the ways we can run code... any help or advice is appreciated.
Todd
I don't think a 1:1 copy-paste is possible (because of that VBA code).
Dataverse is a relational database and supports most common database functionality. If it is simple data (and not too much of it) you could alternatively use SharePoint lists; this is often simpler and less expensive licence-wise.
Power Automate can also be used to update the data in Dataverse or the lists automatically.
So if you manage to get your data into one of these two, the "Power Platform way" would be to expose it using Power BI.
It can be an overwhelming undertaking if you are not familiar with the Power Platform at all, and I would advise you to see if it is possible to slice the work into smaller chunks and try it out one small bit at a time.
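If the forecasting and optimisation logic genuinely needs custom code, the Azure Function route you mention is the usual escape hatch: Power Automate calls an HTTP-triggered function, the function runs the calculation, and the flow writes the result back to Dataverse or the list. Just as a rough sketch of the shape of such a function (C#, in-process model; the function name, payload and the trivial "forecast" inside are placeholders, not your actual VBA logic):

```csharp
// Hypothetical HTTP-triggered Azure Function that a Power Automate flow could call.
// Assumes the in-process Azure Functions model (Microsoft.NET.Sdk.Functions).
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Newtonsoft.Json;

public static class ForecastFunction
{
    public class ForecastRequest
    {
        public double[] History { get; set; }   // e.g. monthly figures pulled from Dataverse
        public int Periods { get; set; }         // how many future periods to forecast
    }

    [FunctionName("Forecast")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        var body = await new System.IO.StreamReader(req.Body).ReadToEndAsync();
        var request = JsonConvert.DeserializeObject<ForecastRequest>(body);

        if (request?.History == null || request.History.Length == 0)
            return new BadRequestObjectResult("History is required.");

        // Placeholder "forecast": a simple moving average repeated for each period.
        // This is where the ported VBA logic would actually go.
        var average = request.History.TakeLast(3).Average();
        var forecast = Enumerable.Repeat(average, request.Periods).ToArray();

        return new OkObjectResult(new { forecast });
    }
}
```

From Power Automate you would call this with the HTTP action and parse the JSON response before writing it back to Dataverse or the list. Power Fx is really aimed at formula-style expressions, so keeping the heavy algorithm outside it like this is generally the cleaner split.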
Hi, I'm working on a large SSAS tabular model and it's so slow: every time I change even the smallest thing it goes away and thinks about it for ages.
The model is massive, and I'm pretty sure that's the problem, but I've inherited it like that, so at the moment I can't do anything about it.
Is there a way to stop SSAS loading all the data (or to load less data) while I'm developing the model in Visual Studio?
I encountered the same problem in my cube, apparently due to the hundreds of measures my model has. I tried the Process Clear method described by Vercelli but it did not help in my situation. To resolve this issue I did the following:
Open the project in VS
Go to Model in the Menu
Go to Calculation Options
Select Manual Calculation
This brought the time for something as simple as hiding a measure down from 7-10 minutes to 3-5 seconds.
If you process-clear your workspace DB, no data will appear in Visual Studio.
Navigate with SSMS to your workspace instance. The database will appear as your tabular model name followed by your username and a GUID. Right-click -> Process Database -> Process Clear.
Please proceed with caution if you are not sure which DB you are processing.
PS: If your fact tables are partitioned, you can process-clear those and then process only some of the partitions in order to have some data to test.
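If you find yourself doing this often, the same Process Clear can be scripted rather than clicked through in SSMS. A minimal C# sketch using the Tabular Object Model (Microsoft.AnalysisServices.Tabular); the server and database names below are placeholders, so copy the exact workspace DB name from SSMS before running anything like this:

```csharp
// Sketch: clear the data from a tabular workspace database using the
// Tabular Object Model (TOM). Server and database names are placeholders;
// double-check them in SSMS first so you don't clear the wrong DB.
using System;
using Microsoft.AnalysisServices.Tabular;

class ClearWorkspaceDb
{
    static void Main()
    {
        const string workspaceServer = "localhost\\TABULAR";        // workspace instance
        const string workspaceDbName = "MyTabularModel_username_GUID"; // copy exact name from SSMS

        var server = new Server();
        server.Connect($"Data Source={workspaceServer}");
        try
        {
            Database db = server.Databases.GetByName(workspaceDbName);

            // Equivalent of Right-click -> Process Database -> Process Clear:
            db.Model.RequestRefresh(RefreshType.ClearValues);
            db.Model.SaveChanges();   // executes the queued refresh on the server

            Console.WriteLine($"Cleared data from {db.Name}.");
        }
        finally
        {
            server.Disconnect();
        }
    }
}
```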
There's nothing you can do about it for the time being - I'm in a similar situation and have searched far and wide for a solution!
What sometimes helps is deleting the workspace files (the ones with the GUID at the end) from the SSAS\data directory. Doing this deletes all the data in the local version of the model, which brings the file size down a fair bit. I do it every now and then and find it usually helps - it still keeps the table structures, relationships, etc.
For big, mature models, Visual Studio is not the best dev environment. Try Tabular Editor from GitHub, created by Daniel Otykier. It's an editor that works in both online and offline modes and is super fast. What's more, it lets you change things in bulk using either the GUI or scripting via a bit of C# (nothing to be afraid of, though). There's a lot of documentation on GitHub and good examples that can very easily be customized. Also, Daniel has a video on YouTube that teaches how to use the editor to best advantage. Hope this helps.
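To give a feel for the scripting side, here is the sort of snippet you can paste into Tabular Editor's Advanced Scripting tab to make bulk changes that would take ages in Visual Studio. The format string and the "Key" naming convention are just made-up examples:

```csharp
// Tabular Editor advanced script (run inside Tabular Editor, not as a standalone program).
// Bulk changes across the whole model in one go:

// 1. Give every measure a consistent format string.
foreach (var m in Model.AllMeasures)
    m.FormatString = "#,0.00";

// 2. Hide every column whose name ends in "Key" (illustrative convention).
foreach (var c in Model.AllColumns)
    if (c.Name.EndsWith("Key"))
        c.IsHidden = true;
```

You then save the changes back to the .bim file (offline) or to the connected database, depending on how you opened the model.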
If you are building SSAS models with Visual Studio 2019 and having performance issues while maintaining the cube (for example, 3-8 minute delays between modifications), the way to fix this is to turn off automatic calculation for the model and set it to manual calculation.
To change the calculation method, from the Visual Studio 2019 Enterprise SSDT menu bar open "Model", select "Calculation Options" and choose "Manual Calculation".
This lets you toggle between "Automatic Calculation" and "Manual Calculation". If you select "Manual Calculation", you also get a menu option to "Calculate Now".
Microsoft introduced MDX for Analysis Services, and since then a few things have changed in the marketplace. Microsoft now has column-store Analysis Services Tabular and Power Pivot, which run on DAX, and database vendors have moved to in-memory engines (SAP HANA). I had long given up on MDX as unnecessary in the current DAX/tabular environment; however, the SAP HANA Excel plug-in now uses MDX to query HANA models, and I'm trying to assess whether it's worth learning MDX again.
Thanks
Using MDX is one of several options to query SAP HANA information models.
Standard SQL queries would do just as well.
MDX is mainly aimed at providing a common interface language to access data sources and return the data into multi-dimensional structures.
It also provides several language concepts not covered by SQL, e.g. hierarchy processing.
I've yet to see a user that would write his MDX statements for ad-hoc reporting by hand...
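To make the hierarchy point concrete, here is the kind of MDX that has no direct SQL equivalent, shown executed from C# via ADOMD.NET against an Analysis Services cube. The server, cube, hierarchy and measure names are the usual Adventure Works samples and purely illustrative; HANA's MDX interface accepts similar statements, though through its own client libraries rather than ADOMD.NET:

```csharp
// Sketch: an MDX query that navigates a hierarchy (year down to months),
// something plain SQL has no direct construct for. Names are sample assumptions.
using System;
using Microsoft.AnalysisServices.AdomdClient;

class MdxHierarchyDemo
{
    static void Main()
    {
        var mdx = @"
            SELECT
                { [Measures].[Internet Sales Amount] } ON COLUMNS,
                DESCENDANTS([Date].[Calendar].[Calendar Year].&[2013],
                            [Date].[Calendar].[Month]) ON ROWS
            FROM [Adventure Works]";

        using (var conn = new AdomdConnection("Data Source=localhost;Catalog=Adventure Works DW"))
        {
            conn.Open();
            using (var cmd = new AdomdCommand(mdx, conn))
            using (var reader = cmd.ExecuteReader())
            {
                // Flattened rowset: first column is the month caption, second the measure value.
                while (reader.Read())
                    Console.WriteLine($"{reader.GetValue(0)}: {reader.GetValue(1)}");
            }
        }
    }
}
```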
I work for a company that has a very mature and precise OLAP environment - MDX is 100% relevant.
We will start to look to move certain functionality into the Tabular/DAX world but I wouldn't imagine stopping MDX for a good while.
To me it is a very pretty declarative language - elegant and powerful - much more so than SQL or what I've so far seen of DAX.
If SQL is checkers (draughts), then MDX is chess!
From what I gather, MDX is typically used for OLAP multidimensional data stores. In SSAS 2012, it looks like DAX is used for tabular models, but MDX is supported as well. So why are there two query languages available for one system? Furthermore, which is the recommended one to use for a tabular model, and why? Is DAX faster than MDX in these scenarios?
I've yet to install or play with SSAS 2012, so I might be missing something.
DMX is a data mining query language, which you use to identify inherent patterns in the data. User Smith is correct regarding which languages can be used on which platform. DAX utilises the xVelocity in-memory analytics engine, which on average performs faster than the multidimensional cube technologies (ROLAP, MOLAP). MDX queries are not as fast as DAX queries on a tabular model.
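One way to see "two languages, one engine" concretely is that the same tabular database accepts both DAX and MDX over a single ADOMD.NET connection. A small C# sketch, where the server, catalog, table and measure names are all made-up placeholders:

```csharp
// Sketch: the same SSAS tabular database answering both a DAX and an MDX query
// through ADOMD.NET. All object names below are placeholders.
using System;
using Microsoft.AnalysisServices.AdomdClient;

class DaxAndMdxDemo
{
    static void Main()
    {
        const string connStr = "Data Source=localhost\\TABULAR;Catalog=SalesModel";

        // DAX: the native language of the tabular engine.
        const string dax =
            "EVALUATE SUMMARIZE('Date', 'Date'[Year], \"Sales\", [Total Sales])";

        // MDX: still accepted, because the tabular model is also exposed as a cube.
        const string mdx =
            "SELECT { [Measures].[Total Sales] } ON COLUMNS, " +
            "[Date].[Year].MEMBERS ON ROWS FROM [Model]";

        foreach (var query in new[] { dax, mdx })
        {
            using (var conn = new AdomdConnection(connStr))
            {
                conn.Open();
                using (var cmd = new AdomdCommand(query, conn))
                using (var reader = cmd.ExecuteReader())
                {
                    // Both results flatten to two columns here: year, then sales.
                    while (reader.Read())
                        Console.WriteLine($"{reader.GetValue(0)} -> {reader.GetValue(1)}");
                }
            }
        }
    }
}
```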
I don't have all the answers, but from my understanding this is how it goes.
MDX is for the developer.
DAX is for the superuser (Excel user / business user).
Microsoft thought that MDX was too complicated for a business user, so they conjured up DAX, which is very similar to Excel functions; they figured end users already familiar with Excel would catch on quickly.
I don't know which is necessarily faster; I think it is more about what you are comfortable using. Personally, I use both languages, but I really only tend to use DAX in PowerPivot and Excel (obviously), so I am more comfortable using MDX wherever I can.
HTH. Oh, and by the way, there is also DMX, which I have yet to use but believe can also be used with SSAS. And yes, I agree that two or three query languages for one system is a pretty confusing thought, but once you read up on them and use them a bit it makes a little more sense, because their situational usage is so different.
We are in a situation where we have to decide between MS SSAS and Pentaho. Is there a comparison list for the two with pros and cons? Any help is very much appreciated.
Use Pentaho. I have been using MS Analysis, Integration and Reporting Services for two years: a lot of bugs, very weak error reporting, monstrous products, and a lot of lag at design time. If you need a more detailed answer, ask about which aspects of these products you are interested in.