That is a question I would like to get your thoughts on, because I am having problems with a fairly small dataset in my Power BI data model. It consists of 7 million rows and is a very simple model that in most cases performs well. But we are now facing performance issues with a single measure that makes the model take 15+ seconds to update the numbers.

The measure is a distinct count on a column with member numbers, and for each month in the chart it needs to count the distinct number of members over the preceding 12 months. So the question asked is: "what is the distinct number of paying members in May and a year back?", "what is the distinct number of paying members in April and a year back?", and so on. I am aware that the measure needs to calculate a full year for each month, but with that small a number of rows I am puzzled. We tried the community and got no answer that would change that speed. So now I am actually looking into using an SSAS cube in the Azure cloud as a solution to this performance issue.

Hi, it is fine to go with Power BI Premium / Power BI Report Server if your data is highly confidential. As for building the data model, it can be done either in SSAS or in Power BI:

"Power BI Desktop is essentially a cut-down version of SSAS Tabular. So if you build a model in PBI Desktop, it's pretty easy to port to SSAS if needed (and vice versa)."

I'd say if your data volumes are small to medium and the model is not too complex, there are more advantages in using PBI over SSAS. However, when the data model becomes more complex, it starts to favour SSAS.

Also, "...port to SSAS from PBI Desktop" means moving the model built in PBI Desktop to SSAS Tabular. You might want to do this for the reasons mentioned:

"One obvious reason why you might port to SSAS from PBI Desktop is the 1GB file size limit. If your model is big (and I mean big) then moving to SSAS Tabular is easy. Another is that SSAS has much better support for incremental refreshing of data, which is a big hole in PBI Desktop at the moment. SSAS gives you better control over data refresh times as well (assuming live connect). Or you can try to optimise your model. In saying that, an efficient Power BI Desktop model can import many billions of rows and perform well - and it's cheaper than buying an SSAS seat (on premises or Azure)."
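For reference, the rolling 12-month distinct count described in the question is usually written with DATESINPERIOD rather than a filter over the fact table. A minimal sketch, assuming a date dimension named 'Date' (marked as a date table and related to the fact table) and a fact table Memberships with a MemberNumber column — all of these names are illustrative, not from the original post:

```dax
-- Distinct paying members over the preceding 12 months (names are assumptions)
Paying Members L12M :=
CALCULATE (
    DISTINCTCOUNT ( Memberships[MemberNumber] ),
    DATESINPERIOD (
        'Date'[Date],          -- contiguous date column in the date table
        MAX ( 'Date'[Date] ),  -- last visible date in the current month
        -12, MONTH             -- window: go back 12 months from there
    )
)
```

A common cause of multi-second behaviour with this kind of measure is expressing the window with FILTER ( ALL ( Memberships ), ... ) over the whole fact table, which forces row-by-row evaluation; filtering the date table as above generally performs much better, although DISTINCTCOUNT over a high-cardinality column is inherently more expensive than a plain SUM regardless of the pattern.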