In real world SSAS Tabular projects, you need to run many different testing scenarios to prove to your customer that the data in the Tabular model is correct. If you are running a Tabular Model on top of a proper data warehouse, your life is a bit easier than when you build your semantic model on top of an operational database. But it would still be a fairly time-consuming process to run many test cases on the Tabular Model, then run similar tests on the data warehouse and compare the results. So your test cases always have two sides: one side is your source database, which can be a data warehouse, and the other side is the Tabular Model. There are many ways to test the system. You can browse your Tabular Model in Excel, connect to your Data Warehouse in Excel, create pivot tables, and then compare the data coming from the Tabular Model with the data coming from the Data Warehouse. But for how many measures and dimensions can you do that test in Excel?
The other way is to run DAX queries on the Tabular Model side. If your source database is a SQL Server database, then you need to run T-SQL queries on the database side and match the results of both sides to prove the data in the Tabular Model is correct.
In this post I'd like to share with you a way to automate the DAX queries to be run on a Tabular model.
This is going to be a long post, so you may want to make or take a cup of coffee while enjoying your reading.
While I am not covering the other side, the source or data warehouse side, it is worth automating that part too, as you can save lots of time. I'm sure a similar process can be developed on the SQL Server side, but I leave that part for now. What I'm going to explain in this post is just one of many possible ways to generate and run DAX queries and store the results in SQL Server. Perhaps it's not perfect, but it's a good starting point. If you have a better idea, it would be great to share it with us in the comments section below this post.
- SQL Server Analysis Services Tabular 2016 and later (Compatibility Level 1200 and higher)
- An instance of SQL Server
- SQL Server Management Studio (SSMS)
What I'm going to explain is very simple. I want to generate and run DAX queries and capture the results. The first step is to get all measures and their related dimensions. Then I slice all the measures by all related dimensions and get the results. At the end I capture and store the results in a SQL Server temp table. Let's think about a simple scenario:
- you have just one measure, [Internet Sales], from the 'Internet Sales' table
- the measure is related to just one dimension, the "Date" dimension
- the "Date" dimension has only four columns: Year, Month, Year-Month and Date
- you want to slice [Internet Sales] by Year, Month, Year-Month and Date
So you need to write four DAX queries as below:
EVALUATE SUMMARIZE( 'Internet Sales' , 'Date'[Calendar Year] , "Internet Sales", [Internet Total Sales] )
EVALUATE SUMMARIZE( 'Internet Sales' , 'Date'[Month Name] , "Internet Sales", [Internet Total Sales] )
EVALUATE SUMMARIZE( 'Internet Sales' , 'Date'[Year-Month] , "Internet Sales", [Internet Total Sales] )
EVALUATE SUMMARIZE( 'Internet Sales' , 'Date'[Date] , "Internet Sales", [Internet Total Sales] )
It's easy, isn't it? But wait. What if you have 10 measures related to 4 dimensions, and each dimension has 10 columns? That sounds laborious, doesn't it? Well, in real world scenarios you won't slice all measures by all related dimensions, but you still have to do a lot. What we're going to do is generate and run the DAX queries and store the results in a table in SQL Server. How cool is that?
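To see why the number of queries explodes, here is a minimal sketch in Python (for illustration only; the post builds the queries in T-SQL later). The table, column and measure names below follow the Adventure Works example above:

```python
# Sketch: generate one DAX query per (measure, related-dimension column) pair.
# One measure sliced by four columns already yields four queries; 10 measures
# by 4 dimensions of 10 columns each would yield 400.
model = {
    "[Internet Total Sales]": {              # measure name
        "fact_table": "Internet Sales",      # table hosting the measure
        "dimensions": {"Date": ["Calendar Year", "Month Name", "Year-Month", "Date"]},
    },
}

def dax_queries(model):
    queries = []
    for measure, meta in model.items():
        for dim, columns in meta["dimensions"].items():
            for col in columns:
                queries.append(
                    f"EVALUATE SUMMARIZE('{meta['fact_table']}', "
                    f"'{dim}'[{col}], \"Internet Sales\", {measure})"
                )
    return queries

for q in dax_queries(model):
    print(q)
```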
OK, this is how it works:
- Creating a Linked Server for the SSAS Tabular instance from SQL Server
- Generating DAX queries using Tabular DMVs
- Running the queries through the Tabular model and getting/storing the results in a SQL Server temp table
Creating a Linked Server for SSAS Tabular (OLAP Services)
I'm not going into too much detail on this. You can find numerous resources on the web on how to create a Linked Server for an instance of SSAS in SQL Server. Here is how you can create a Linked Server for SSAS from the SSMS GUI:
- Open SSMS and connect to an instance of SQL Server
- Expand "Server Objects"
- Right click "Linked Servers"
- Click "New Linked Server…"
- In the "New Linked Server" window, under the "General" pane, enter a name for the linked server
- Make sure you select "Microsoft OLE DB Provider for Analysis Services"
- Enter the SSAS server in the "Location" section
- Enter the desired database name in the "Catalog" section
- Click the "Security" pane
- Click the "Add" button and select a "Local Login" from the dropdown list
- Tick "Impersonate"
- Click "Be made using the login's current security context", then click OK
Note: Your security setting may be different from the above.
Now that we have our linked server sorted, let's run some queries using the linked server and make sure it's working as expected. The query structure for an SSAS Linked Server is as below:
select * from openquery([LINKED_SERVER_NAME], 'DESTINATION QUERY LANGUAGE')
As a simple test, I run the following query, which passes the DAX query to the Tabular model to run and retrieves the results:
select * from openquery([TABULAR2017], 'EVALUATE ''Date''')
The above query brings all values from the 'Date' table of the Tabular model into SQL Server. The results can obviously be stored in any sort of normal SQL table.
Let's have a closer look at the above query:
We have to use OPENQUERY to pass the DAX query through the Linked Server, run it on the Tabular side, and get the results. OPENQUERY accepts the DAX query in string format as its second argument. As we put table names in single quotes in DAX, we have to add an additional single quote to the table name, as shown in the query above.
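The quote doubling is just standard T-SQL string escaping. A small sketch (Python, for illustration) of how a raw DAX query is wrapped into an OPENQUERY call:

```python
def wrap_in_openquery(linked_server: str, dax: str) -> str:
    """Wrap a DAX query for T-SQL OPENQUERY: every single quote inside the
    DAX text must be doubled, because the whole query travels through
    OPENQUERY as a T-SQL string literal."""
    escaped = dax.replace("'", "''")
    return f"select * from openquery([{linked_server}], '{escaped}')"

print(wrap_in_openquery("TABULAR2017", "EVALUATE 'Date'"))
```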
Before we proceed, let's see what DAX query construction we need and what query pattern we're after. This is what we get if we run all the DAX queries mentioned earlier in this article in a batch. Yes! We can run multiple DAX queries in a single run when using a Linked Server, which isn't a surprise. From the SSMS point of view we're just running a batch of SQL statements, aren't we?
select * from openquery([TABULAR2017] , 'EVALUATE SUMMARIZE(''Internet Sales'' , ''Date''[Calendar Year] , "Internet Sales", [Internet Total Sales])')
select * from openquery([TABULAR2017] , 'EVALUATE SUMMARIZE(''Internet Sales'' , ''Date''[Month Name] , "Internet Sales", [Internet Total Sales])')
select * from openquery([TABULAR2017] , 'EVALUATE SUMMARIZE(''Internet Sales'' , ''Date''[Year-Month] , "Internet Sales", [Internet Total Sales])')
select * from openquery([TABULAR2017] , 'EVALUATE SUMMARIZE(''Internet Sales'' , ''Date''[Date] , "Internet Sales", [Internet Total Sales])')
This is a generic version of the above queries:
select * from openquery([TABULAR2017], 'EVALUATE SUMMARIZE(''FACT_TABLE'', ''RELATED_DIMENSION''[COLUMN_NAME], "MEASURE_GIVEN_NAME", [MEASURE_NAME])')
As you can see, we have the following pattern repeated in all queries:
- A "SELECT" statement with "OPENQUERY", including the linked server name
- In the query argument we have "EVALUATE SUMMARIZE("
Then we have:
- two single quotes
- FACT_TABLE: the table that hosts the measure
- two single quotes and a comma
- another two single quotes
- RELATED_DIMENSION: a dimension table that is related to the measure
- again two single quotes
- an open bracket
- COLUMN_NAME: the column from the dimension that is being used to slice the measure
- a close bracket
- a double quote, yes! this one is a double quote
- MEASURE_GIVEN_NAME: the name that we gave to the measure, like an alias
- a double quote
- an open bracket
- MEASURE_NAME: the actual measure name
- a close bracket
- a close parenthesis
- a last single quote
- and finally another close parenthesis
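Put together, the pattern above can be sketched as a small template function (Python, for illustration; the doubled single quotes are the T-SQL string escaping discussed earlier):

```python
def build_test_query(linked_server, fact_table, dimension, column,
                     measure_alias, measure_name):
    # Doubled single quotes ('') survive T-SQL string parsing as literal
    # single quotes inside the DAX text passed to OPENQUERY.
    return (
        f"select * from openquery([{linked_server}], "
        f"'EVALUATE SUMMARIZE(''{fact_table}'', "
        f"''{dimension}''[{column}], "
        f"\"{measure_alias}\", [{measure_name}])')"
    )

print(build_test_query("TABULAR2017", "Internet Sales", "Date",
                       "Calendar Year", "Internet Sales",
                       "Internet Total Sales"))
```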
So far we have just run a DAX query from SQL Server through a Linked Server. In the next few lines we will run DMVs to get the metadata we need to generate the DAX queries and run them from SQL Server through the Linked Server. To generate the DAX query with the above pattern we need the following five DMVs:
- DISCOVER_CALC_DEPENDENCY
- TMSCHEMA_TABLES
- TMSCHEMA_MEASURES
- TMSCHEMA_COLUMNS
- TMSCHEMA_RELATIONSHIPS
Read more about Dynamic Management Views (DMVs) here.
While we don't need all columns from the DMVs, I select just the columns we need, and I also put some conditions in the where clause; I explain the reason for those conditions later on. For now, the queries we're after look like the DMV queries below:
select [Object] , [Expression] , [Referenced_Table] from $SYSTEM.DISCOVER_CALC_DEPENDENCY where [Object_Type] = 'measure'
select [Name] , [ID] from $SYSTEM.TMSCHEMA_TABLES where not [IsHidden]
select [TableID] , [Name] , [Expression] from $SYSTEM.TMSCHEMA_MEASURES where not [IsHidden] and [DataType] <> 2
select [TableID] , [ExplicitName] from $SYSTEM.TMSCHEMA_COLUMNS where not [IsHidden] and [Type] <> 3 and not [IsDefaultImage] and [ExplicitDataType] = 2 and [State] = 1
select [FromTableID] , [ToTableID] from $SYSTEM.TMSCHEMA_RELATIONSHIPS where [IsActive]
As you can see, I used some enumerations in the above queries, as below:
- In TMSCHEMA_MEASURES, "DataType" shows the data type of the measure. The possible values are:
Enumeration | Description
----------- | -----------
2 | String
6 | Int64
8 | Double
9 | DateTime
10 | Decimal
11 | Boolean
17 | Binary
19 | Unknown (the measure is in an Error state)
20 | Variant (a measure with a varying data type)
So adding "DataType <> 2" to the where clause when querying TMSCHEMA_MEASURES means that we are NOT interested in textual measures, like when you define a measure to show the user name using the USERNAME() function in DAX.
- In TMSCHEMA_COLUMNS, I used the "Type", "ExplicitDataType" and "State" enumerations. The possible values for these enumerations are:
Name | Enumeration | Description
---- | ----------- | -----------
Type | 1 | Data (comes from the data source)
 | 2 | Calculated (a calculated column)
 | 3 | RowNumber (an internal column that is NOT visible; it represents the row number)
 | 4 | CalculatedTableColumn (a calculated column in a calculated table)
ExplicitDataType | 1 | Automatic (when calculated columns or calculated table columns set the value to Automatic, the type is automatically inferred)
 | 2 | String
 | 6 | Int64
 | 8 | Double
 | 9 | DateTime
 | 10 | Decimal
 | 11 | Boolean
 | 17 | Binary
 | 19 | Unknown (the column is in an Error state)
State | 1 | Ready (the column is queryable and has up-to-date data)
 | 3 | NoData (the column is still queryable)
 | 4 | CalculationNeeded (the column is not queryable and needs to be refreshed)
 | 5 | SemanticError (the column is in an Error state because of an invalid expression)
 | 6 | EvaluationError (the column is in an Error state because of an error during expression evaluation)
 | 7 | DependencyError (the column is in an error state because some of its calculation dependencies are in an error state)
 | 8 | Incomplete (some parts of the column have no data, and the column needs to be refreshed to bring the data in)
 | 9 | SyntaxError (the column is in an error state because of a syntax error in its expression)
So adding "Type <> 3 and ExplicitDataType = 2 and State = 1" to the where clause when querying "TMSCHEMA_COLUMNS" means that we are only interested in columns that are NOT internal row numbers, whose data type is string, and that are queryable and ready to use.
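The same filter logic can be sketched outside T-SQL (Python, with hypothetical records mimicking the DMV rows):

```python
# Enumeration values from the TMSCHEMA_COLUMNS tables above.
TYPE_ROWNUMBER = 3      # Type: internal row-number column
DATATYPE_STRING = 2     # ExplicitDataType: String
STATE_READY = 1         # State: Ready (queryable, up-to-date data)

def sliceable(col: dict) -> bool:
    """Keep only visible string columns that are ready to query."""
    return (not col["IsHidden"]
            and col["Type"] != TYPE_ROWNUMBER
            and col["ExplicitDataType"] == DATATYPE_STRING
            and col["State"] == STATE_READY)

columns = [  # hypothetical DMV rows
    {"ExplicitName": "Calendar Year", "IsHidden": False, "Type": 1,
     "ExplicitDataType": 2, "State": 1},
    {"ExplicitName": "RowNumber", "IsHidden": True, "Type": 3,
     "ExplicitDataType": 6, "State": 1},
]
print([c["ExplicitName"] for c in columns if sliceable(c)])
```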
The next step is to put the above queries into OPENQUERY. The queries will eventually look like the queries below:
select [Object] MeasureName , [Expression] , [Referenced_Table] ReferencedTable from openquery([TABULAR2017] , 'select [Object] , [Expression] , [Referenced_Table] from $SYSTEM.DISCOVER_CALC_DEPENDENCY where [Object_Type] = ''measure''' )
select [TableID] , [Name] MeasureName , [Expression] from openquery([TABULAR2017] , 'select [TableID] , [Name] , [Expression] from $SYSTEM.TMSCHEMA_MEASURES where not [IsHidden] and [DataType] <> 2' )
select [FromTableID] , [ToTableID] from openquery([TABULAR2017], 'select [FromTableID] , [ToTableID] from $SYSTEM.TMSCHEMA_RELATIONSHIPS where [IsActive]' )
select [Name] TableName , [ID] from openquery([TABULAR2017], 'select [Name] , [ID] from $SYSTEM.TMSCHEMA_TABLES where not [IsHidden]' )
select [TableID] , [ExplicitName] RelatedColumn from openquery([TABULAR2017], 'select [TableID] , [ExplicitName] from $SYSTEM.TMSCHEMA_COLUMNS where not [IsHidden] and [Type] <> 3 and not [IsDefaultImage] and [ExplicitDataType] = 2 and [State] = 1' )
Now we want to join the above tables to get:
- Visible measures
- The base tables used in the measures (referenced tables)
- The dimensions related to the measures
- The columns of those related dimensions
Having the four elements above, we can dynamically generate the DAX query that we want by joining the above five queries. I used a CTE construction to join them:
;with MeasureReferences as
(
    select [Object] MeasureName, [Expression], [Referenced_Table] ReferencedTable
    from openquery([TABULAR2017], 'select [Object], [Expression], [Referenced_Table] from $SYSTEM.DISCOVER_CALC_DEPENDENCY where [Object_Type] = ''measure''')
)
, Measures as
(
    select [TableID], [Name] MeasureName, [Expression]
    from openquery([TABULAR2017], 'select [TableID], [Name], [Expression] from $SYSTEM.TMSCHEMA_MEASURES where not [IsHidden] and [DataType] <> 2')
    where charindex('SUM', ltrim(rtrim(cast([Expression] as varchar(max))))) = 1
)
, Relationships as
(
    select [FromTableID], [ToTableID]
    from openquery([TABULAR2017], 'select [FromTableID], [ToTableID] from $SYSTEM.TMSCHEMA_RELATIONSHIPS where [IsActive]')
)
, Tables as
(
    select [Name] TableName, [ID]
    from openquery([TABULAR2017], 'select [Name], [ID] from $SYSTEM.TMSCHEMA_TABLES where not [IsHidden]')
)
, Columns as
(
    select [TableID], [ExplicitName] RelatedColumn
    from openquery([TABULAR2017], 'select [TableID], [ExplicitName] from $SYSTEM.TMSCHEMA_COLUMNS where not [IsHidden] and [Type] <> 3 and not [IsDefaultImage] and [ExplicitDataType] = 2 and [State] = 1')
)
select cast(mr.ReferencedTable as varchar(max)) TableName
    , cast(m.MeasureName as varchar(max)) MeasureName
    , cast((select TableName from Tables where [ID] = r.[ToTableID]) as varchar(max)) RelatedDimension
    , cast(c.RelatedColumn as varchar(max)) RelatedColumn
from Measures m
join MeasureReferences mr on cast(mr.MeasureName as varchar(max)) = cast(m.MeasureName as varchar(max))
join Relationships r on (select ID from Tables where cast(mr.ReferencedTable as varchar(max)) = cast(TableName as varchar(max))) = r.[FromTableID]
join Columns c on c.[TableID] = r.[ToTableID]
Let's revisit the DAX query that we're going to generate using the results of the above query:
EVALUATE SUMMARIZE( 'Internet Sales' , 'Date'[Calendar Year] , "Internet Sales", [Internet Total Sales] )
If we run the above query, the result set's column names come from the dimension column and the measure alias. That looks fine, but when we generate DAX queries automatically, we will detect all dimensions related to all measures and generate the queries so that each measure is sliced by every single column of its related dimensions. In that case the column names will be different for each query we run.
So we would have to hard-code the column names, which isn't ideal. In addition, we're going to insert the data into a SQL Server table; with hard-coded column names we would end up with meaningless dimension values in the left column and measure values in the right column. Therefore, we have to change the above query a little bit so that it dynamically uses the column names as values for two extra columns. As a result, the query returns four columns: the first column (from the left) contains the column name, with its value next to it in the second column; the third column shows the measure name and the fourth column shows the measure values.
This looks much better. I ran the following DAX query to get that result:
EVALUATE
SELECTCOLUMNS (
    SUMMARIZE (
        'Internet Sales'
        , 'Date'[Calendar Year]
        , "Measure Name", "Internet Sales"
        , "Value", [Internet Total Sales]
    ),
    "Dimension Name", "'Date'[Calendar Year]"
    , "Dimension Value", 'Date'[Calendar Year]
    , "Measure Name", "Internet Total Sales"
    , "Measure Value", [Internet Total Sales]
)
The next step is to dynamically generate the latter DAX query using the results of the DMVs run through the Linked Server.
In the following query we define a local variable to hold the generated DAX queries, then we use the "PRINT" T-SQL statement to see the results.
Note: "PRINT" has a limitation on displaying large strings, so we will only see a portion of the results. Read more about "PRINT" here.
declare @SQL varchar(max) = null

;with MeasureReferences as
(
    select [Object] MeasureName, [Expression], [Referenced_Table] ReferencedTable
    from openquery([TABULAR2017], 'select [Object], [Expression], [Referenced_Table] from $SYSTEM.DISCOVER_CALC_DEPENDENCY where [Object_Type] = ''measure''')
)
, Measures as
(
    select [TableID], [Name] MeasureName, [Expression]
    from openquery([TABULAR2017], 'select [TableID], [Name], [Expression] from $SYSTEM.TMSCHEMA_MEASURES where not [IsHidden] and [DataType] <> 2')
    where charindex('SUM', ltrim(rtrim(cast([Expression] as varchar(max))))) = 1
)
, Relationships as
(
    select [FromTableID], [ToTableID]
    from openquery([TABULAR2017], 'select [FromTableID], [ToTableID] from $SYSTEM.TMSCHEMA_RELATIONSHIPS where [IsActive]')
)
, Tables as
(
    select [Name] TableName, [ID]
    from openquery([TABULAR2017], 'select [Name], [ID] from $SYSTEM.TMSCHEMA_TABLES where not [IsHidden]')
)
, Columns as
(
    select [TableID], [ExplicitName] RelatedColumn
    from openquery([TABULAR2017], 'select [TableID], [ExplicitName] from $SYSTEM.TMSCHEMA_COLUMNS where not [IsHidden] and [Type] <> 3 and not [IsDefaultImage] and [ExplicitDataType] = 2 and [State] = 1')
)
select @SQL = isnull(@SQL, '')
    + 'select * from openquery([TABULAR2017], ''EVALUATE SELECTCOLUMNS(SUMMARIZE(''''' + [TableName] + ''''', ''''' + RelatedDimension + '''''[' + RelatedColumn + '], "Measure Name", "' + MeasureName + '", "Value", [' + MeasureName + ']), "Dimension Name", "''''' + RelatedDimension + '''''[' + RelatedColumn + ']", "Dimension Value", ''''' + RelatedDimension + '''''[' + RelatedColumn + '], "Measure Name", "' + MeasureName + '", "Measure Value", [' + MeasureName + '])'') '
from
(
    select cast(mr.ReferencedTable as varchar(max)) TableName
        , cast(m.MeasureName as varchar(max)) MeasureName
        , cast((select TableName from Tables where [ID] = r.[ToTableID]) as varchar(max)) RelatedDimension
        , cast(c.RelatedColumn as varchar(max)) RelatedColumn
    from Measures m
    join MeasureReferences mr on cast(mr.MeasureName as varchar(max)) = cast(m.MeasureName as varchar(max))
    join Relationships r on (select ID from Tables where cast(mr.ReferencedTable as varchar(max)) = cast(TableName as varchar(max))) = r.[FromTableID]
    join Columns c on c.[TableID] = r.[ToTableID]
) as tbl

print @SQL
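The `select @SQL = isnull(@SQL, '') + …` trick is just string accumulation over the metadata rows. The equivalent logic sketched in Python (with hypothetical metadata rows; the real rows come from the DMV joins above):

```python
def batch_sql(linked_server, rows):
    """rows: (fact_table, measure_name, related_dimension, related_column)
    tuples. Returns one T-SQL batch with one OPENQUERY statement per row,
    mirroring the @SQL accumulation in the T-SQL version."""
    sql = ""
    for fact, measure, dim, col in rows:
        dax = (
            f"EVALUATE SELECTCOLUMNS(SUMMARIZE('{fact}', '{dim}'[{col}], "
            f'"Measure Name", "{measure}", "Value", [{measure}]), '
            f'"Dimension Name", "\'{dim}\'[{col}]", '
            f"\"Dimension Value\", '{dim}'[{col}], "
            f'"Measure Name", "{measure}", "Measure Value", [{measure}])'
        )
        escaped = dax.replace("'", "''")   # T-SQL string-literal escaping
        sql += f"select * from openquery([{linked_server}], '{escaped}') "
    return sql

rows = [("Internet Sales", "Internet Total Sales", "Date", "Calendar Year")]
print(batch_sql("TABULAR2017", rows))
```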
You can copy/paste and run each query that is generated to get the results.
I manually ran the following query, copied from the results:
select * from openquery([TABULAR2017], 'EVALUATE SELECTCOLUMNS(SUMMARIZE(''Internet Sales'', ''Currency''[Currency Code], "Measure Name", "Internet Total Sales", "Value", [Internet Total Sales]) , "Dimension Name", "''Currency''[Currency Code]", "Dimension Value", ''Currency''[Currency Code], "Measure Name", "Internet Total Sales", "Measure Value", [Internet Total Sales])')
The last step is to execute all generated queries and store the results in a SQL Server table.
This is an easy one. We just need to execute the dynamic SQL stored in the @SQL local variable, then store the results in a table we create in SQL Server. For the sake of this post I create a global temporary table. So the final query will look like this:
if object_id('tempdb..##Results') is not null
    drop table ##Results

--Create a global temp table
create table ##Results
(
    DimensionName varchar(max)
    , DimensionValue varchar(max)
    , MeasureName varchar(max)
    , MeasureValue bigint
)

declare @SQL varchar(max) = null

--Get measures, their related dimensions and dimension columns
;with MeasureReferences as
(
    select [Object] MeasureName, [Expression], [Referenced_Table] ReferencedTable
    from openquery([TABULAR2017], 'select [Object], [Expression], [Referenced_Table] from $SYSTEM.DISCOVER_CALC_DEPENDENCY where [Object_Type] = ''measure''')
)
, Measures as
(
    select [TableID], [Name] MeasureName, [Expression]
    from openquery([TABULAR2017], 'select [TableID], [Name], [Expression] from $SYSTEM.TMSCHEMA_MEASURES where not [IsHidden] and [DataType] <> 2')
    where charindex('SUM', ltrim(rtrim(cast([Expression] as varchar(max))))) = 1
)
, Relationships as
(
    select [FromTableID], [ToTableID]
    from openquery([TABULAR2017], 'select [FromTableID], [ToTableID] from $SYSTEM.TMSCHEMA_RELATIONSHIPS where [IsActive]')
)
, Tables as
(
    select [Name] TableName, [ID]
    from openquery([TABULAR2017], 'select [Name], [ID] from $SYSTEM.TMSCHEMA_TABLES where not [IsHidden]')
)
, Columns as
(
    select [TableID], [ExplicitName] RelatedColumn
    from openquery([TABULAR2017], 'select [TableID], [ExplicitName] from $SYSTEM.TMSCHEMA_COLUMNS where not [IsHidden] and [Type] <> 3 and not [IsDefaultImage] and [ExplicitDataType] = 2 and [State] = 1')
)
--Generate the DAX queries dynamically
select @SQL = isnull(@SQL, '')
    + 'select * from openquery([TABULAR2017], ''EVALUATE SELECTCOLUMNS(SUMMARIZE(''''' + [TableName] + ''''', ''''' + RelatedDimension + '''''[' + RelatedColumn + '], "Measure Name", "' + MeasureName + '", "Value", [' + MeasureName + ']), "Dimension Name", "''''' + RelatedDimension + '''''[' + RelatedColumn + ']", "Dimension Value", ''''' + RelatedDimension + '''''[' + RelatedColumn + '], "Measure Name", "' + MeasureName + '", "Measure Value", [' + MeasureName + '])'') '
from
(
    select cast(mr.ReferencedTable as varchar(max)) TableName
        , cast(m.MeasureName as varchar(max)) MeasureName
        , cast((select TableName from Tables where [ID] = r.[ToTableID]) as varchar(max)) RelatedDimension
        , cast(c.RelatedColumn as varchar(max)) RelatedColumn
    from Measures m
    join MeasureReferences mr on cast(mr.MeasureName as varchar(max)) = cast(m.MeasureName as varchar(max))
    join Relationships r on (select ID from Tables where cast(mr.ReferencedTable as varchar(max)) = cast(TableName as varchar(max))) = r.[FromTableID]
    join Columns c on c.[TableID] = r.[ToTableID]
) as tbl

--Execute the DAX queries
insert into ##Results
execute (@SQL)

select DimensionName, DimensionValue, MeasureName, FORMAT(MeasureValue, '#,#.#') MeasureValue
from ##Results
where MeasureValue <> 0 and MeasureValue is not null
The above query should work on any Tabular Model if you set up the linked server correctly. However, as you may have noticed, it will generate lots of queries: all possible combinations of slicing a measure by all columns from all related dimensions. The queries will run against an instance of SSAS Tabular one after another, so if you have lots of measures and dimensions, you will surely face performance issues. Unfortunately this is the case in real world projects. What you can do is pick some of the dimensions that are the most critical ones to the business and restrict the above query to generate just some of the possibilities. The other point is that in a lot of cases you really don't need to test all combinations of measures and all columns from related dimensions, so you can add some more conditions to the query to generate fewer queries as desired. For instance, in the above query, look at the "Measures" CTE. I put in a condition to get only the measures whose expressions start with "SUM". The reason is that I wanted only the measures that are basically summation-based. In real world projects you will have hundreds of measures, and running the above query without any conditions doesn't sound quite right.
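The SUM-only condition in the "Measures" CTE boils down to this simple check (Python sketch, with hypothetical measure expressions):

```python
def is_sum_based(expression: str) -> bool:
    """Rough Python equivalent of the T-SQL condition
    charindex('SUM', ltrim(rtrim(cast([Expression] as varchar(max))))) = 1:
    keep measures whose trimmed expression starts with SUM. Note that this
    also matches SUMX, as the T-SQL version does."""
    return expression.strip().upper().startswith("SUM")

print(is_sum_based("SUM('Internet Sales'[Sales Amount])"))  # summation-based
print(is_sum_based("COUNTROWS('Date')"))                    # filtered out
```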
Q: Is the query specific to the SSAS Tabular Model?
A: Yes, it is. But you can do something similar for SSAS Multidimensional.
Q: Is this method dependent on the Tabular server name and/or database name?
A: As long as you set up the Linked Server correctly, there shouldn't be any issues.
Q: Can we use this method for testing Power BI models?
A: Yes. You just need to open your Power BI Desktop (pbix) file and find its local port number. Then you can create a Linked Server to your Power BI file and you're good to go. Read more about finding the Power BI Desktop local port number here.
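As a hedged sketch (the folder layout is an assumption based on typical Power BI Desktop installs; verify on your machine), the local port can often be read from the msmdsrv.port.txt files under the Power BI Desktop workspace folder, usually %LOCALAPPDATA%/Microsoft/Power BI Desktop/AnalysisServicesWorkspaces:

```python
from pathlib import Path

def find_pbi_ports(workspace_root: str) -> list:
    """Scan a Power BI Desktop workspace folder (assumed layout) for
    msmdsrv.port.txt files and return the local SSAS port numbers found."""
    ports = []
    for port_file in Path(workspace_root).rglob("msmdsrv.port.txt"):
        # The file contains just the port number (often UTF-16 encoded).
        text = port_file.read_text(encoding="utf-16", errors="ignore")
        digits = "".join(ch for ch in text if ch.isdigit())
        if digits:
            ports.append(int(digits))
    return ports
```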
Q: This method only gets measures and their related dimensions' columns on the SSAS Tabular side. Eventually we have to compare the results with the underlying data source(s), like a data warehouse. How should we test against the source systems?
A: As mentioned earlier, we're only covering the SSAS Tabular side. You can do something similar on your data warehouse side and compare the results. One of the challenges would be finding the column mappings between your data warehouse and the SSAS Tabular model. There is a "SourceColumn" available in the "$SYSTEM.TMSCHEMA_COLUMNS" DMV to get the source column names; that would be a very good starting point. Then you can use dynamic SQL to generate the queries, run them against your data warehouse, and get the results into a SQL Server table. The rest would be easy: compare the two result sets.
Q: Is this method valid for Azure Analysis Services too?
A: Yes, it is.