Power BI On Demand Webinar

Once again it has been ages between posts, and this one is a shameless plug for the company I work for, Dimensional Strategies. We have created a great on-demand webinar for Power BI; here’s the link and a synopsis of the webcast.

Power BI Webcast

Power BI for Office 365 brings self-service analytics for all your data to the business.

Log on to DSI’s on-demand preview and learn how your organization can unleash interactive Power View dashboards to:

 

Discover and Combine – Gary in Finance can easily find, connect to, explore, and compile data from various sources.

Model and Analyze – Mary your Analyst can bring the power of in-memory analytics and flexible modeling into Excel with Power Pivot.

Share and Collaborate – Sharon and Marc can quickly use collaborative Power BI sites to share live reports and custom data sets with the rest of the team.

Get Answers and Insights – Graham in Marketing can ask questions and get instant answers with the new Q&A feature.

Visualize – See your data in new ways with bold, interactive visualizations in Excel using Power View and Power Map.

Access Anywhere – Stephane from Sales can stay connected from anywhere, on any device, with mobile access to his key live and interactive reports.

Day over Day Change Calculations DAX – Accounting for Weekends

Well, it’s been a while between posts; I thought I would share a little DAX formula I used to determine day-over-day changes. If you are asking why I didn’t use the built-in PreviousDay time intelligence function, I did, however Mondays posed a bit of a problem, the previous day being Sunday and all.

How to tackle this problem, hmm?

  1. Add a calculated column to your Date dimension; I called mine GoBack:
    • IF(WEEKDAY([altDateKey]) = 1, -2, -1)
      • if the weekday = 1 then go back 2, else go back 1
  2. Add a new measure, Previous Day Amount, to your fact table:
    • Previous Day Amount:=IF(HASONEVALUE('Date'[GoBack]), CALCULATE([Amount], DATEADD('Date'[altDateKey],
      VALUES('Date'[GoBack])
      , DAY)), BLANK())

      • If the table returned by the VALUES function has more than one row, return BLANK; otherwise calculate the amount filtered with the DATEADD function.
        • The key is the VALUES function, which returns the value of the [GoBack] calculated column for the selected date; this value is used as the second parameter of DATEADD (negative integers let you go back in time).
  3. Add new measures to express your day-over-day calculations, e.g.:
    • D-o-D Change (%):=IF(OR(ISBLANK([Previous Day Amount]), [Previous Day Amount] = 0), BLANK(), ([Amount] - [Previous Day Amount]) / [Previous Day Amount])
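As a sanity check outside DAX, the GoBack logic can be sketched in Python. This is a hypothetical helper, not part of the model, and it assumes a weekday numbering where Monday is day 1 (as with DAX WEEKDAY return type 2), so that a Monday compares to the preceding Saturday rather than the empty Sunday:

```python
from datetime import date, timedelta

def go_back(d: date) -> int:
    """Mirror of the GoBack calculated column: on Monday go back 2 days
    (skipping Sunday), otherwise go back 1."""
    # isoweekday(): Monday = 1 .. Sunday = 7
    return -2 if d.isoweekday() == 1 else -1

def previous_day(d: date) -> date:
    """The comparison date used by the Previous Day Amount measure
    (the DATEADD equivalent)."""
    return d + timedelta(days=go_back(d))
```

For example, for Monday 2024-01-08 this returns Saturday 2024-01-06; for Tuesday 2024-01-09 it returns Monday 2024-01-08.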

Hoping to Blog more in 2013

Issues with OLEDB Connection to Analysis Services SSIS

I was pulling my hair out trying to get the OLEDB SSAS driver to work in SSIS 2012; it worked perfectly in SQL 2008 R2. My MDX query would execute and preview correctly in the OLEDB source component, but when I executed the package in SSDT it would fail during pre-execute with a less-than-helpful error message. Thankfully I happened to find the post below by Sherry Li; it explains in detail the steps required to set up the OLEDB SSAS connection to work correctly.

http://bisherryli.wordpress.com/2012/08/14/ssis-113-querying-ssas-cube-with-mdx-from-ssis/

Thanks Sherry, very much appreciated

What can I say, I’m proud

I am truly proud to announce that the Percentiles Analysis Services stored procedure I blogged about a few months back (https://richmintzbi.wordpress.com/2012/06/07/percentiles-like-excel-to-the-4th-power-t-sql-sql-clr-mdx-assp/) has been added to the Analysis Services Stored Procedure Project on CodePlex: http://asstoredprocedures.codeplex.com/; http://asstoredprocedures.codeplex.com/wikipage?title=Percentiles&version=4. A huge thanks to Greg Galloway, Chris Webb and Darren Gosbell for taking time out of their busy lives to help me get the Percentile ASSP to a state where it could be shared with the greater Analysis Services community.

Percentiles like Excel to the 4th power (T-SQL, SQL CLR, MDX, ASSP)

With the release of SQL Server 2012, additional analytic functions were added to the T-SQL toolkit, including three percentile functions: PERCENTILE_CONT, PERCENTILE_DISC and PERCENT_RANK. Sadly, no new MDX percentile functions were added. So I thought I would share four ways to calculate percentiles just like Excel, using MDX, SSAS stored procedures, T-SQL and SQL CLR aggregates.

The example queries in this post use the AdventureWorksDW2012 database and the AdventureWorksDW2012 UDM Analysis Services database, with the primary tables being FactCallCenter and DimDate. The goal was to find the 90th percentile for the ‘average issue wait time’ by shift. To ensure that the data used for the SSAS and SQL examples was the same, I created a very simple cube called callcentre in the AdventureWorksDW2012 model (see the screenshot below); the cube contains one measure and two dimensions:

  • Measures
    • Average Time Per Issue
  • Dimensions
    • Date (existing date dimension)
    • Shifts (degenerate dimension from the Shift column in the FactCallCenter table)

My source for percentile knowledge was Wikipedia (come on, it’s on the internet, it must be true); both Excel percentile formulas, PERCENTILE.INC and PERCENTILE.EXC, are defined there under the ‘Alternative methods’ heading.

Basically, according to Wikipedia, calculating an Excel-like percentile comes down to two steps:

– Get the ordinal n of the value with one of the following formulas:

  • Exclusive

n = (P/100) × (N + 1)

  • Inclusive

n = (P/100) × (N − 1) + 1

– Calculate the percentile value by splitting the ordinal into its integer component k and decimal component d, such that n = k + d. Then v_P is calculated as:

v_P = v_1                        for n = 1
v_P = v_N                        for n = N
v_P = v_k + d(v_{k+1} − v_k)     for 1 < n < N
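These two steps can be sketched in a few lines of Python (a hypothetical helper, not part of the original scripts; P is given as a percentage):

```python
from math import floor

def excel_percentile(values, p, method="EXC"):
    """Excel-style percentile (PERCENTILE.EXC / PERCENTILE.INC).
    Step 1: find the 1-based ordinal n = k + d.
    Step 2: interpolate between the k-th and (k+1)-th sorted values."""
    v = sorted(values)
    N = len(v)
    n = (p / 100) * (N + 1) if method == "EXC" else (p / 100) * (N - 1) + 1
    if n <= 1:           # clamp to the edge cases v_1 and v_N
        return v[0]
    if n >= N:
        return v[-1]
    k, d = floor(n), n - floor(n)
    return v[k - 1] + d * (v[k] - v[k - 1])
```

For the values 1 through 10 at the 90th percentile this returns 9.9 (exclusive) and 9.1 (inclusive), matching Excel.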

Here Goes…

T-SQL

The code snippet below displays the query used to calculate a percentile value using the exclusive method:


/*steps
1. determine the index where percentile value is located
2. Calculate the percentile value
*/

declare @percentile numeric(19,2)
set @percentile = .90

;with percentile (rowIndex, rangePoint, [Shift], AverageTimePerIssue) as
(
select
 row_number() over (partition by [Shift] order by AverageTimePerIssue asc) as rowIndex ,
 ((count(1) over (partition by [Shift])+1) * @percentile) as rangePoint,
 a.[Shift], AverageTimePerIssue from FactCallCenter a
where AverageTimePerIssue is not null
)

select [Shift],
case when max(rangePoint) % 1 <> 0 THEN
 CAST(ROUND(MIN(AverageTimePerIssue) +
 (
 (MAX(AverageTimePerIssue) - MIN(AverageTimePerIssue))
 * (max(rangePoint) % 1)),2) as Numeric(19,2))
ELSE
 CAST(round(max(AverageTimePerIssue),2) as numeric(19,2)) END as percentileValue from
percentile
where rowIndex between Floor(rangePoint) and ceiling(rangePoint)
group by [shift],rangePoint

If you want to use the inclusive method, modify the ((count(1) over (partition by [Shift])+1) * @percentile) portion of the query to read (((count(1) over (partition by [Shift])-1) * @percentile)+1).

SQL CLR Aggregate

While a T-SQL script works, it requires a lot of repetitive typing for multiple calculations and is not very reusable. I mean, you could always bookmark this post or save the script as a stub for reuse, but I thought that a CLR aggregate would be a better solution. As you can see, selecting a percentile using an aggregate function requires much less typing and allows for reuse of the function without having to rewrite the implementation.

select [shift], [dbo].[percentiles]([AverageTimePerIssue], 90, 'EXC') from FactCallCenter
group by [shift]

The Aggregate Function takes three parameters:

  1. The column from which to evaluate the percentile
  2. The percentile to find, as an integer
  3. The calculation method {'EXC', 'INC'}
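Under the covers, a SQL CLR user-defined aggregate implements an Init/Accumulate/Merge/Terminate contract. As a rough sketch of that contract, here is a hypothetical Python analog for illustration only; the real implementation is the C# struct shown later:

```python
import math

class PercentilesAggregate:
    """Sketch of the SQL CLR aggregate contract: Init/Accumulate/Merge/Terminate."""

    def __init__(self):                       # Init(): reset state
        self.values, self.percentile, self.method = [], 0, ""

    def accumulate(self, value, percentile, method):
        if value is not None:                 # ignore NULLs, like the C# version
            self.values.append(value)
        self.percentile = min(percentile, 100)
        self.method = method

    def merge(self, other):                   # combine two partial aggregates
        self.values.extend(other.values)
        self.percentile = other.percentile

    def terminate(self):
        n = len(self.values)
        p = self.percentile / 100
        if self.method == "EXC":
            rp = p * (n + 1) - 1              # zero-based range point
        elif self.method == "INC":
            rp = p * (n - 1)                  # (p*(n-1)+1) - 1, zero based
        else:
            return None
        self.values.sort()
        if rp % 1 == 0:
            return self.values[int(rp)]
        lo, hi = math.floor(rp), math.ceil(rp)
        if hi < n:                            # interpolate between neighbours
            return self.values[lo] + (rp % 1) * (self.values[hi] - self.values[lo])
        return None
```

The engine calls accumulate once per row, may merge partial aggregates computed in parallel, and calls terminate once per group to produce the result.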

To create the CLR aggregate function, you can create a new SQL Server Database Project if you are using Visual Studio 2010 and have installed the new SQL CLR Database Project template (which allows you to deploy to SQL 2012), or a new Visual C# SQL CLR Database Project if you are using VS 2010 with SQL 2005 or 2008; these projects will deploy your function directly to SQL Server. If you do not want to create a database project, you can also simply create a new class library, then compile and register your assembly and function using SQL scripts (see the script below). If you do not want to recreate the function, you can download the compiled .dll here and execute the code below (assuming you saved the file to ‘c:\Temp’) in the context of the database where you want to register the assembly and create the function.


EXEC sp_configure 'show advanced options', '1';
go
reconfigure;
go
EXEC sp_configure 'clr enabled', '1';
go
reconfigure;
go
-- Turn advanced options back off
EXEC sp_configure 'show advanced options', '0';
go
reconfigure;
go

CREATE ASSEMBLY [sql.clr.percentiles] from 'c:\Temp\sql.clr.percentiles.dll'
WITH PERMISSION_SET = SAFE
GO

CREATE AGGREGATE [dbo].[percentiles]
(@value [float], @percentile [smallint], @calcMethod [nvarchar](4000))
RETURNS [float]
EXTERNAL NAME [sql.clr.percentiles].[Percentiles]
GO

Below you will find the code for the aggregate. The code basically performs the same operations as the SQL equivalent: it finds the ordinal based on the previously defined formulas and then returns the percentile value using the formula shown earlier.

using System;
using System.Collections;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;
using Microsoft.SqlServer.Server;

[Serializable]
[Microsoft.SqlServer.Server.SqlUserDefinedAggregate(Format.UserDefined, IsInvariantToDuplicates = true, IsInvariantToNulls = false, IsNullIfEmpty = true, IsInvariantToOrder = false, MaxByteSize = -1)]
public struct Percentiles : Microsoft.SqlServer.Server.IBinarySerialize
{
    public void Init()
    {
        valueList = new List<double>();
    }

    public void Accumulate(SqlDouble value, SqlInt16 percentile, SqlString calcMethod)
    {
        if (!value.IsNull)
        {
            valueList.Add(value.Value);
        }

        if (!percentile.IsNull)
        {
            PercentileValue = percentile.Value > 100 ? (short)100 : percentile.Value;
        }
        Method = calcMethod.ToString();
    }

    public void Merge(Percentiles Group)
    {
        foreach (double d in Group.valueList)
        {
            valueList.Add(d);
        }
        PercentileValue = Group.PercentileValue;
    }

    public SqlDouble Terminate()
    {
        double rangePoint = 0;
        switch (Method)
        {
            case "EXC":
                rangePoint = ((Convert.ToDouble(PercentileValue) / 100) * (valueList.Count + 1)) - 1; // subtract 1 as the list is zero based
                break;
            case "INC":
                rangePoint = (((Convert.ToDouble(PercentileValue) / 100) * (valueList.Count - 1)) + 1) - 1; // subtract 1 as the list is zero based
                break;
            default:
                return SqlDouble.Null;
        }
        valueList.Sort();

        // if rangePoint is a whole number
        if (rangePoint % 1 == 0)
        {
            return valueList[(int)rangePoint];
        }
        else if ((int)Math.Ceiling(rangePoint) < valueList.Count)
        {
            return valueList[(int)Math.Floor(rangePoint)] +
                (rangePoint % 1 * (valueList[(int)Math.Ceiling(rangePoint)] - valueList[(int)Math.Floor(rangePoint)]));
        }
        else
        {
            return SqlDouble.Null;
        }
    }

    public void Read(BinaryReader binaryReader)
    {
        if (valueList == null)
        {
            valueList = new List<double>();
        }

        Method = binaryReader.ReadString();
        PercentileValue = binaryReader.ReadInt16();

        long readerLength = binaryReader.BaseStream.Length;
        while (binaryReader.BaseStream.Position < readerLength)
        {
            valueList.Add(binaryReader.ReadDouble());
        }
    }

    public void Write(BinaryWriter binaryWriter)
    {
        binaryWriter.Write(Method);
        binaryWriter.Write(PercentileValue);
        foreach (double d in valueList)
        {
            binaryWriter.Write(d);
        }
    }

    private string Method { get; set; }
    private short PercentileValue { get; set; }
    private List<double> valueList { get; set; }
}

MDX

Calculating a percentile in MDX follows the same pattern: get the ordinal from your ordered set and calculate the value, as seen in the code below. In an effort to keep the MDX readable, allow for easier troubleshooting and get better performance, I broke the calculation into several calculated measures, with the final calculated measure ‘Percentiles’ using the measures defined earlier in the script. Thanks to Chris Webb and Greg Galloway for their MDX optimization suggestions.

The code below provides the MDX used to generate a percentile value.

//number of items in the set
WITH MEMBER [measures].[Count]
as COUNT(
NONEMPTY(({[Shifts].currentmember}*
{[Date].[Day of Month].[Day of Month].members})
 ,[Measures].[Average Time Per Issue]))

//percentile index

MEMBER [Measures].[RangePoint] AS
(([measures].[Count]+1) *.90) -1 /* subtract 1 from total as item array is 0 based*/

MEMBER [Measures].[RangePoint_Int]
 AS
 INT([Measures].[RangePoint])

member Measures.[floor] as
 (BOTTOMCOUNT(NONEMPTY(({[Shifts].currentmember}*
{[Date].[Day of Month].[Day of Month].members})
 ,[Measures].[Average Time Per Issue]), [measures].[Count], [Measures].[Average Time Per Issue]).
 item(([Measures].[RangePoint_int])),
 [Measures].[Average Time Per Issue])

member Measures.[Ceiling] as
(BOTTOMCOUNT(NONEMPTY(({[Shifts].currentmember}*
{[Date].[Day of Month].[Day of Month].members})
 ,[Measures].[Average Time Per Issue]), [measures].[Count], [Measures].[Average Time Per Issue]).
 item(([Measures].[RangePoint_int]+1)),
 [Measures].[Average Time Per Issue])

MEMBER [Measures].[Percentiles] AS
IIF([Measures].[RangePoint] - [Measures].[RangePoint_Int] = 0,
//rangepoint is a whole number, so the floor value is exactly the percentile
Measures.[floor],
 //rangepoint is not a whole number
 Measures.[floor]
+
([Measures].[RangePoint] - [Measures].[RangePoint_Int])
*
(
Measures.[ceiling]
-
measures.[floor])
)
select {[Measures].[Percentiles]} on 0
,
([Shifts].[Shift].[Shift].members) on 1
from callcentre

To switch to the inclusive method, you would modify the RangePoint calculated measure to read:


MEMBER [Measures].[RangePoint] AS
((([measures].[Count]-1) *.90)+1) -1

SSAS Stored Procedures (ASSP)

Like T-SQL, the MDX solution requires lots of typing, and reuse requires re-implementation; if you type anything like I do, extra typing means lots of room for error. Fortunately, SSAS provides a mechanism for creating reusable user-defined functions: Analysis Services Stored Procedures.

Using the predefined stored procedure requires a fraction of the code, with no need to re-implement the logic:


with member measures.[90th percentile] as
[ssas storedprocedures].ValueAtPercentile(
NONEMPTY(([Date].[Day of Month].[day of month].members, [Measures].[Average Time Per Issue])),
 [Measures].[Average Time Per Issue], .90, true, "EXC")

select measures.[90th percentile] on 0,
 [Shifts].[Shift].[Shift].members on 1
from callcentre

The function takes 5 Parameters:

  1. The set to evaluate
  2. The measure to get the percentile of
  3. The percentile to find as a decimal
  4. Whether to order the set {true, false}
  5. The calculation method {'EXC', 'INC'}

It should be noted that ASSPs used within calculated measures may not perform as well as straight MDX; however, for me, having both options available is really useful. I did some high-level testing with these scripts using the SQL 2012 Adventure Works cube and found the ASSP performed up to 14 times faster than the straight MDX scripts, bonus.

  • Using my laptop, fully loaded with 3 instances of SQL Server including 2012 with all SSAS flavours, the ASSP executed in two seconds for both the Inclusive and Exclusive modes, whereas the straight MDX took 28 seconds for Inclusive and 9 seconds for Exclusive.

My work on this stored procedure was heavily influenced by the Analysis Services Stored Procedure Project on CodePlex; it was a huge help in learning how to write ASSPs, and it provided an Order() function that I was able to reuse. I have submitted the function to the ASSP project authors in hopes that it may be included there. For those of you who do not want to create the ASSP, you can download the compiled .dll here and follow the steps outlined at the end of this section to add the assembly to Analysis Services.

If you want to create your own ASSP Library you can follow the steps below.

  • Create a new C# Class Library project in Visual Studio
  • Add the following references:
    • Microsoft.AnalysisServices (C:\Program Files (x86)\Microsoft SQL Server\110\SDK\Assemblies\Microsoft.AnalysisServices.DLL)
    • msmgdsrv (C:\Program Files (x86)\Microsoft Analysis Services\AS OLEDB\10\msmgdsrv.dll)
  • Set the target framework project property to .NET Framework 3.5
  • In project properties, sign the assembly
  • Rename Class1.cs to something meaningful (in my solution it is called percentiles.cs)
  • Add the following using statements

using System;
using Microsoft.AnalysisServices.AdomdServer;
using Microsoft.AnalysisServices;
using System.Collections.Generic;

  • Download the following file and add it to your solution (the setOrdering class from the ASSP CodePlex project)

  • Replace all the code under the using statements with the following

namespace ssas.storedprocedures
{
    public class PercentileFunctions
    {
        [SafeToPrepare(true)]
        public static double RangePoint(Set inputSet, double percentileValue, string calcMethod)
        {
            switch (calcMethod)
            {
                case "EXC":
                    return (percentileValue * (inputSet.Tuples.Count + 1)) - 1;
                case "INC":
                    return ((percentileValue * (inputSet.Tuples.Count - 1)) + 1) - 1;
                default:
                    return (percentileValue * (inputSet.Tuples.Count + 1)) - 1;
            }
        }

        [SafeToPrepare(true)]
        public static double ValueAtPercentile(Set inputSet, Expression sortExpression, double percentileValue, bool sortAscending, string calcMethod)
        {
            // get the position where the percentile falls
            double Rank = RangePoint(inputSet, percentileValue, calcMethod);

            // order the set ascending using the CodePlex SSAS Stored Procedure function,
            // otherwise assume the set is already ordered
            Set s = sortAscending ? setOrdering.Order(inputSet, sortExpression) : inputSet;

            // if the Rank is a whole number
            if ((Rank % 1) == 0)
            {
                return sortExpression.Calculate(s.Tuples[Convert.ToInt32(Rank)]).ToDouble();
            }
            // if Rank is a decimal, interpolate between the neighbouring tuples
            else
            {
                return sortExpression.Calculate(s.Tuples[(int)Math.Floor(Rank)]).ToDouble()
                    + (Rank % 1 *
                        (sortExpression.Calculate(s.Tuples[(int)Math.Ceiling(Rank)]).ToDouble()
                        - sortExpression.Calculate(s.Tuples[(int)Math.Floor(Rank)]).ToDouble()));
            }
        }
    }
}
  • Compile the code
  • Open SSMS and connect to Analysis Services
  • Right-click the Assemblies folder and select New Assembly
  • Browse to the compiled .dll
  • Check ‘Include debug information’ if you want to debug the function (how to debug link)
  • Use it in an MDX query

Happy querying.

Performance Point Services Excel Like Stacked Bar Chart

My favorite thing about Excel is that you can do just about anything in it, like create a stacked bar chart representing the growth of a measure over time, with each time period increment shown as a distinct segment in each bar, like the chart below.

My least favorite thing about Excel is being asked to replicate ‘simple’ Excel charts using a PerformancePoint analytical chart. Easy, right? Just create a Quarter-To-Date calculated measure in your cube (the Contoso cube is used in this example), throw it and your quarters into a chart, and you’re done. Well, the chart displays the data correctly, the measure grows over time, but the colors aren’t as nice and the chart does not stack by the value attributed to each quarter.

OK, so what happens if we move the QTD measure to the bottom axis and the quarters to the series? Again the data is correct, and this time the chart identifies the segment attributed to each quarter, but it no longer displays a single column for each quarter.

So what we need is the second chart but with some new calculated measures on the bottom axis.

Each of the measures on the bottom axis represents a time period we are interested in and only returns the value for that period and prior periods. Let’s have a look at the MDX that I used to accomplish the visualization in this chart.

/*Q1 Measure
Use iif to check if the currentmember of the fiscal quarter attribute = "Q1",
True – build a string to represent the quarter member, use the strtomember function to convert it to a member, complete the tuple with the measure
False – null*/

CREATE MEMBER CURRENTCUBE.[measures].[Q1]
as
iif(LEFT([fiscal quarter].currentmember.membervalue, 2) = "Q1",
(strtomember("[fiscal quarter].&["+
RIGHT([Fiscal Year].currentmember.name, 4)+
RIGHT([Fiscal Quarter].currentmember.name, 1) + "]"),
[Measures].[Inventory On Hand Quantity]), null),
FORMAT_STRING = "Standard", 
VISIBLE = 1; 

/*Q2 Measure 
Use iif to check if the number of the period represented by the currentmember of the fiscal quarter attribute <=2, 
True – Check to see if empty, set to null, else build string to represent the quarter member, use strtomember function to convert to member
complete tuple with measure
False - null*/

CREATE MEMBER CURRENTCUBE.[measures].[Q2] as
iif( cint(right(left([fiscal quarter].currentmember.membervalue,2),1)) <= 2,
iif( cint(right(left([fiscal quarter].currentmember.membervalue,2),1)) <= 2 and
isEMPTY((strtomember("[fiscal quarter].&["+
RIGHT([Fiscal Year].currentmember.name, 4)+
RIGHT([Fiscal Quarter].currentmember.name, 1) + "]"),
[Measures].[Inventory On Hand Quantity])),
null,
(strtomember("[fiscal quarter].&["+
RIGHT([Fiscal Year].currentmember.name, 4)+
RIGHT([Fiscal Quarter].currentmember.name, 1) + "]"),
[Measures].[Inventory On Hand Quantity])),null),/*)*/
FORMAT_STRING = "Standard", 
VISIBLE = 1; 

/*Q3 and Q4 are the same as Q2 just increment the <= number*/

CREATE MEMBER CURRENTCUBE.[measures].[Q3] as
iif( cint(right(left([fiscal quarter].currentmember.membervalue,2),1)) <= 3,
iif( cint(right(left([fiscal quarter].currentmember.membervalue,2),1)) <= 3 and
isEMPTY((strtomember("[fiscal quarter].&["+
RIGHT([Fiscal Year].currentmember.name, 4)+
RIGHT([Fiscal Quarter].currentmember.name, 1) + "]"),
[Measures].[Inventory On Hand Quantity])),
null,
(strtomember("[fiscal quarter].&["+
RIGHT([Fiscal Year].currentmember.name, 4)+
RIGHT([Fiscal Quarter].currentmember.name, 1) + "]"),
[Measures].[Inventory On Hand Quantity])),null),/*)*/
FORMAT_STRING = "Standard", 
VISIBLE = 1; 

CREATE MEMBER CURRENTCUBE.[measures].[Q4] as
iif( cint(right(left([fiscal quarter].currentmember.membervalue,2),1)) <= 4,
iif( cint(right(left([fiscal quarter].currentmember.membervalue,2),1)) <= 4 and
isEMPTY((strtomember("[fiscal quarter].&["+
RIGHT([Fiscal Year].currentmember.name, 4)+
RIGHT([Fiscal Quarter].currentmember.name, 1) + "]"),
[Measures].[Inventory On Hand Quantity])),
null,
(strtomember("[fiscal quarter].&["+
RIGHT([Fiscal Year].currentmember.name, 4)+
RIGHT([Fiscal Quarter].currentmember.name, 1) + "]"),
[Measures].[Inventory On Hand Quantity])),null),/*)*/
FORMAT_STRING = "Standard", 
VISIBLE = 1; 

/*Use scope statements to set the value for the all member of the quarter attribute when combined with the calculated members to null*/

SCOPE(([Fiscal Quarter].[All],{[Measures].[Q1]})); 
THIS =  null; 
END SCOPE; 

SCOPE(([Fiscal Quarter].[All],{[Measures].[Q2]})); 
THIS =  null; 
END SCOPE; 

SCOPE(([Fiscal Quarter].[All],{[Measures].[Q3]})); 
THIS =  null; 
END SCOPE; 

SCOPE(([Fiscal Quarter].[All],{[Measures].[Q4]})); 
THIS =  null; 
END SCOPE;