Day over Day Change Calculations DAX – Accounting for Weekends

Well, it’s been a while between posts; I thought I would share a little DAX formula I used to determine day-over-day changes. If you are asking why I didn’t use the built-in PREVIOUSDAY time intelligence function, I did, but Mondays posed a bit of a problem, the previous day being Sunday and all.

How to tackle this problem, hmm?

  1. Add a calculated column to your date dimension; I called mine GoBack:
    • IF(WEEKDAY([altDateKey]) = 1, -2, -1)
      • if the weekday equals 1, go back 2 days; otherwise go back 1
  2. Add a new measure to your fact table, Previous Day Amount:
    • Previous Day Amount:=IF(HASONEVALUE('Date'[GoBack]),
      CALCULATE([Amount], DATEADD('Date'[altDateKey], VALUES('Date'[GoBack]), DAY)),
      BLANK())

      • If the table returned by the VALUES function has more than one row, return blank; otherwise calculate the amount, filtered with the DATEADD function
        • The key is the VALUES function, which returns the value of the [GoBack] calculated column for the currently selected date; this value is used as the second parameter of DATEADD, where negative integers let you go back in time
  3. Add new measures to express your day-over-day calcs, i.e.:
    • D-o-D Change (%):=IF(OR(ISBLANK([Previous Day Amount]), [Previous Day Amount] = 0), BLANK(), ([Amount] - [Previous Day Amount]) / [Previous Day Amount])
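
For readers outside DAX, the same weekend-skipping idea can be sketched in Python (the function names are mine). Note the offsets here (-3/-1, with Python's `date.weekday()` numbering Monday as 0) won't match the GoBack column's -2/-1 literally, since WEEKDAY's numbering depends on its optional return-type argument; this only illustrates the skip-the-weekend idea:

```python
from datetime import date, timedelta

def previous_business_day(d: date) -> date:
    # Monday (weekday() == 0) steps back over the weekend to Friday;
    # every other day steps back one calendar day.
    offset = 3 if d.weekday() == 0 else 1
    return d - timedelta(days=offset)

def day_over_day_change(amount, previous_amount):
    # Mirrors the D-o-D Change (%) measure: return None ("blank")
    # when there is no previous amount or it is zero.
    if previous_amount in (None, 0):
        return None
    return (amount - previous_amount) / previous_amount
```

For example, Monday, 7 January 2013 steps back to Friday, 4 January, while Tuesday steps back a single day.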

Hoping to Blog more in 2013


Issues with OLEDB Connection to Analysis Services SSIS

I was pulling my hair out trying to get the OLE DB SSAS driver to work in SSIS 2012. It worked perfectly in SQL 2008 R2: my MDX query would execute and preview correctly in the OLE DB source component, but when I executed the package in SSDT it would fail during pre-execute with a less-than-helpful error message. Thankfully I happened to find the post below by Sherry Li; it explains in detail the steps required to set up the OLE DB SSAS connection to work correctly.

http://bisherryli.wordpress.com/2012/08/14/ssis-113-querying-ssas-cube-with-mdx-from-ssis/

Thanks Sherry, very much appreciated

What can I say, I’m proud

I am truly proud to announce that the Percentiles Analysis Services Stored Procedure I blogged about a few months back (https://richmintzbi.wordpress.com/2012/06/07/percentiles-like-excel-to-the-4th-power-t-sql-sql-clr-mdx-assp/) has been added to the Analysis Services Stored Procedure Project on CodePlex: http://asstoredprocedures.codeplex.com/; the Percentiles page is at http://asstoredprocedures.codeplex.com/wikipage?title=Percentiles&version=4. A huge thanks to Greg Galloway, Chris Webb and Darren Gosbell for taking time out of their busy lives to help me get the Percentile ASSP to a state where it could be shared with the greater Analysis Services community.

Power View Really Improved in Office 2013

I have recently had the opportunity to experiment with Power View embedded directly in Excel 2013. I can happily say that Power View has come a long way since the initial public release with SQL 2012 RTM, and I would like to share a couple of things that I think are pretty cool.

  • Power View now supports Map visualizations with Bing Maps
    • You tell Bing where the data should go by placing members with location values in the Locations area or by placing longitude and latitude members in the provided fields.


PowerPivot 2013 allows you to assign geographical data categories to your columns to help Bing geocode them

  • You can add multiple members to the axis in charts and graphs, giving users the ability to drill through to greater detail by double-clicking. The interface also provides the ability to navigate back to the previous level


  • Drill down is available in the Matrix table type, providing an interactive pivot-chart-like experience


Percentiles like Excel to the 4th power (T-SQL, SQL CLR, MDX, ASSP)

With the release of SQL 2012, additional analytic functions have been added to the SQL Server T-SQL toolkit, including three percentile functions: PERCENTILE_CONT, PERCENTILE_DISC and PERCENT_RANK. Sadly, no new MDX percentile functions were added. So I thought I would share four ways to calculate percentiles just like Excel, using MDX, SSAS stored procedures, T-SQL and a SQL CLR aggregate.

The example queries in this post use the AdventureWorksDW2012 database and the AdventureWorksDW2012 UDM Analysis Services database, with the primary tables being FactCallCenter and DimDate. The goal was to find the 90th percentile of the ‘average issue wait time’ by shift. To ensure that the data used for the SSAS and SQL examples was the same, I created a very simple cube called callcentre in the AdventureWorksDW2012 model (see screenshot below); the cube contains one measure and two dimensions:

  • Measures
    • Average Time Per Issue
  • Dimensions
    • Date (existing date dimension)
    • Shifts (degenerate dimension from the shifts column in the FactCallCenter table)

My source for percentile knowledge was Wikipedia (come on, it’s on the internet, it must be true); both Excel percentile formulas, percentile.inc and percentile.exc, are defined there under the Alternative methods heading.

Basically, according to Wikipedia, calculating an Excel-like percentile comes down to two steps:

– Get the ordinal of the value with one of the following formulas

  • Exclusive

n = \frac{P}{100}(N+1)

  • Inclusive

n = \frac{P}{100}(N-1)+1

– Calculate the percentile value by splitting the ordinal into its integer component k and decimal component d,
such that n = k + d. Then v_P is calculated as:

 v_P = \begin{cases}   v_1, & \mbox{for }n=1 \\   v_N, & \mbox{for }n=N \\   v_k+d(v_{k+1}-v_k), & \mbox{for }1 < n < N   \end{cases}
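
As a sanity check, a direct Python translation of these two formulas (my own helper, not from the post) behaves like Excel’s PERCENTILE.EXC and PERCENTILE.INC:

```python
import math

def excel_percentile(values, p, method="EXC"):
    """p is a fraction (0.90 for the 90th percentile).
    Implements the Wikipedia 'Alternative methods' formulas:
    EXC: n = p*(N+1); INC: n = p*(N-1)+1, then linear interpolation
    between the k-th and (k+1)-th ordered values, where n = k + d."""
    v = sorted(values)
    big_n = len(v)
    n = p * (big_n + 1) if method == "EXC" else p * (big_n - 1) + 1
    k, d = int(math.floor(n)), n - math.floor(n)
    if k < 1 or k > big_n:
        return None      # undefined (Excel's PERCENTILE.EXC raises #NUM!)
    if k == big_n:
        return v[-1]     # n == N: the largest value
    return v[k - 1] + d * (v[k] - v[k - 1])
```

For example, the 40th inclusive percentile of {15, 20, 35, 40, 50} is 29, matching Wikipedia’s worked example, and the 75th exclusive percentile of {1, 2, 3, 4} is 3.75.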

Here Goes…

T-SQL

The code snippet below shows the query used to calculate a percentile value using the exclusive method:


/*steps
1. determine the index where percentile value is located
2. Calculate the percentile value
*/

declare @percentile numeric(19,2)
set @percentile = .90

;with percentile (rowIndex, rangePoint, [Shift], AverageTimePerIssue) as
(
select
 row_number() over (partition by [Shift] order by AverageTimePerIssue asc) as rowIndex ,
 ((count(1) over (partition by [Shift])+1) * @percentile) as rangePoint,
 a.[Shift], AverageTimePerIssue from FactCallCenter a
where AverageTimePerIssue is not null
)

select [Shift],
case when max(rangePoint) % 1 <> 0 THEN
 CAST(ROUND(MIN(AverageTimePerIssue) +
 (
 (MAX(AverageTimePerIssue) - MIN(AverageTimePerIssue))
 * (max(rangePoint) % 1)),2) as Numeric(19,2))
ELSE
 CAST(round(max(AverageTimePerIssue),2) as numeric(19,2)) END as percentileValue from
percentile
where rowIndex between Floor(rangePoint) and ceiling(rangePoint)
group by [shift],rangePoint

If you want to use the inclusive method, modify the ((count(1) over (partition by [Shift])+1) * @percentile) portion of the query to read ((count(1) over (partition by [Shift])-1) * @percentile)+1

SQL CLR Aggregate

While the T-SQL script works, it requires a lot of repetitive typing for multiple calculations and is not very reusable. I mean, you could always bookmark this post or save the script as a stub for reuse, but I thought that a CLR aggregate would be a better solution. As you can see, selecting a percentile using an aggregate function requires much less typing and allows for reuse of the function without having to rewrite the implementation.

select [shift], [dbo].[percentiles]([AverageTimePerIssue], 90, 'EXC') from FactCallCenter
group by [shift]

The Aggregate Function takes three parameters:

  1. The column to evaluate the percentile value from
  2. The percentile to find as an integer
  3. The calculation Method {‘EXC’, ‘INC’}

To create the CLR aggregate function, you can create a new SQL Server Database Project if you are using Visual Studio 2010 and have installed the new SQL CLR Database Project template (which allows you to deploy to SQL 2012), or a new Visual C# SQL CLR Database Project if you are using VS 2010 with SQL 2005 or 2008; these projects will deploy your function directly to SQL Server. If you do not want to create a database project, you can also simply create a new class library, then compile and register your assembly and function using SQL scripts (see script below). If you do not want to recreate the function, you can download the compiled .dll here and execute the code below (assuming you saved the file to ‘c:\Temp’) in the context of the database where you want to register the assembly and create the function.


EXEC sp_configure 'show advanced options' , '1';
go
reconfigure;
go
EXEC sp_configure 'clr enabled' , '1';
go
reconfigure;
go
-- Turn advanced options back off
EXEC sp_configure 'show advanced options' , '0';
go
reconfigure;
go

CREATE ASSEMBLY [sql.clr.percentiles] from 'c:\Temp\sql.clr.percentiles.dll'
WITH PERMISSION_SET = SAFE
GO

CREATE AGGREGATE [dbo].[percentiles]
(@value [float], @percentile [smallint], @calcMethod [nvarchar](4000))
RETURNS[float]
EXTERNAL NAME [sql.clr.percentiles].[Percentiles]
GO

Below you will find the code for the aggregate. The code performs essentially the same operations as the SQL equivalent: it finds the ordinal based on the previously defined formulas and then returns the percentile value using the interpolation formula shown earlier.

using System;
using System.Collections;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;
using Microsoft.SqlServer.Server;
[Serializable]
[Microsoft.SqlServer.Server.SqlUserDefinedAggregate(Format.UserDefined, IsInvariantToDuplicates = true, IsInvariantToNulls = false, IsNullIfEmpty = true, IsInvariantToOrder = false, MaxByteSize = -1)]
public struct Percentiles : Microsoft.SqlServer.Server.IBinarySerialize
{
 public void Init()
 {
 valueList = new List<double>();
 }

public void Accumulate(SqlDouble value, SqlInt16 percentile, SqlString calcMethod)
 {
 if (!value.IsNull)
 {
 valueList.Add(value.Value);
 }

if (!percentile.IsNull)
 {
 PercentileValue = percentile.Value > 100 ? (short)100 : percentile.Value;
 }
 Method = calcMethod.ToString();
 }

public void Merge(Percentiles Group)
 {
 foreach (double d in Group.valueList)
 {
 valueList.Add(d);
 }
 PercentileValue = Group.PercentileValue;
 }

public SqlDouble Terminate()
 {
 double rangePoint = 0;
 switch (Method)
 {
 case "EXC":
 {
 rangePoint = (((Convert.ToDouble(PercentileValue) / 100) * (valueList.Count + 1))) - 1; // remove 1 as array is zero based
 break;
 }
 case "INC":
 {
 rangePoint = (((Convert.ToDouble(PercentileValue) / 100) * (valueList.Count - 1)) + 1) - 1; // remove 1 as array is zero based
 break;
 }
 default:
 {
 return SqlDouble.Null;
 }
 }
 valueList.Sort();

 //if rangePoint is a whole number
 if ( rangePoint % 1 == 0)
 {
 return valueList[(int)rangePoint];
 }
 else if ((int)Math.Ceiling(rangePoint) < valueList.Count)
 {
 return valueList[(int)Math.Floor(rangePoint)] +
 (rangePoint % 1 * (valueList[(int)Math.Ceiling(rangePoint)] - valueList[(int)Math.Floor(rangePoint)]));
 }
 else
 {
 return SqlDouble.Null;
 }
 }

public void Read(BinaryReader binaryReader)
 {
 if (valueList == null)
 {
 valueList = new List<double>();
 }

Method = binaryReader.ReadString();

 PercentileValue = binaryReader.ReadInt16();

long readerLength = binaryReader.BaseStream.Length;
 while (binaryReader.BaseStream.Position < readerLength)
 {
 valueList.Add(binaryReader.ReadDouble());
 }
 }

public void Write(BinaryWriter binaryWriter)
 {
 binaryWriter.Write(Method);
 binaryWriter.Write(PercentileValue);
 foreach (double d in valueList)
 {
 binaryWriter.Write(d);
 }
 }

private string Method { get; set; }
 private short PercentileValue{ get; set; }
 private List<double> valueList { get; set; }
}
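
The Init/Accumulate/Merge/Terminate lifecycle above can be mimicked outside SQL Server; this rough Python analogue (class and method names are mine) shows how partially accumulated groups merge before Terminate computes the final value:

```python
import math

class PercentileAggregate:
    """Toy analogue of the SQL CLR aggregate's lifecycle."""

    def __init__(self):                               # Init
        self.values, self.percentile, self.method = [], 0, ""

    def accumulate(self, value, percentile, method):  # Accumulate
        if value is not None:
            self.values.append(value)
        self.percentile = min(percentile, 100)
        self.method = method

    def merge(self, other):                           # Merge partial groups
        self.values.extend(other.values)
        self.percentile = other.percentile

    def terminate(self):                              # Terminate
        n = len(self.values)
        p = self.percentile / 100.0
        if self.method == "EXC":
            range_point = p * (n + 1) - 1             # -1: list is zero-based
        elif self.method == "INC":
            range_point = p * (n - 1)                 # (p*(n-1)+1) - 1
        else:
            return None
        self.values.sort()
        if range_point % 1 == 0:                      # whole number: exact item
            return self.values[int(range_point)]
        if math.ceil(range_point) < n:                # otherwise interpolate
            lo = self.values[int(math.floor(range_point))]
            hi = self.values[int(math.ceil(range_point))]
            return lo + range_point % 1 * (hi - lo)
        return None
```

Merging two halves of {1, 2, 3, 4} and terminating at the 75th exclusive percentile gives 3.75, the same answer as running the accumulate loop over the whole list.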

MDX

Calculating a percentile in MDX follows the same pattern: get the ordinal from your ordered set and calculate the value, as seen in the code below. In an effort to keep the MDX readable, allow for easier troubleshooting and get better performance, I broke the calculation into several calculated measures, with the final calculated measure ‘Percentiles’ using the measures defined earlier in the script. Thanks to Chris Webb and Greg Galloway for their MDX optimization suggestions.

The code below provides the MDX used to generate a percentile value.

//number of items in the set
WITH MEMBER [measures].[Count]
as COUNT(
NONEMPTY(({[Shifts].currentmember}*
{[Date].[Day of Month].[Day of Month].members})
 ,[Measures].[Average Time Per Issue]))

//percentile index

MEMBER [Measures].[RangePoint] AS
(([measures].[Count]+1) *.90) -1 /* subtract 1 from total as item array is 0 based*/

MEMBER [Measures].[RangePoint_Int]
 AS
 INT([Measures].[RangePoint])

member Measures.[floor] as
 (BOTTOMCOUNT(NONEMPTY(({[Shifts].currentmember}*
{[Date].[Day of Month].[Day of Month].members})
 ,[Measures].[Average Time Per Issue]), [measures].[Count], [Measures].[Average Time Per Issue]).
 item(([Measures].[RangePoint_int])),
 [Measures].[Average Time Per Issue])

member Measures.[Ceiling] as
(BOTTOMCOUNT(NONEMPTY(({[Shifts].currentmember}*
{[Date].[Day of Month].[Day of Month].members})
 ,[Measures].[Average Time Per Issue]), [measures].[Count], [Measures].[Average Time Per Issue]).
 item(([Measures].[RangePoint_int]+1)),
 [Measures].[Average Time Per Issue])

MEMBER [Measures].[Percentiles] AS
IIF([Measures].[RangePoint] - [Measures].[RangePoint_Int] = 0,
//rangepoint is a whole number; the floor measure already returns that exact item
Measures.[floor],
 //rangepoint is not a whole number
 Measures.[floor]
+
([Measures].[RangePoint] - [Measures].[RangePoint_Int])
*
(
Measures.[ceiling]
-
measures.[floor])
)
select {[Measures].[Percentiles]} on 0
,
([Shifts].[Shift].[Shift].members) on 1
from callcentre

To switch to the inclusive method you would modify the rangePoint calculated measure to read


MEMBER [Measures].[RangePoint] AS
((([measures].[Count]-1) *.90)+1) -1

SSAS Stored Procedures (ASSP)

Like T-SQL, the MDX solution requires lots of typing, and reuse requires re-implementation; if you type anything like I do, extra typing means lots of room for error. Fortunately, SSAS provides a mechanism for creating reusable user-defined functions: Analysis Services Stored Procedures.

Using the predefined stored procedure requires a fraction of the code, with no need to re-implement the logic:


with member measures.[90th percentile] as
[ssas storedprocedures].ValueAtPercentile(
NONEMPTY(([Date].[Day of Month].[day of month].members, [Measures].[Average Time Per Issue])),
 [Measures].[Average Time Per Issue], .90, true, "EXC")

select measures.[90th percentile] on 0,
 [Shifts].[Shift].[Shift].members on 1
from callcentre

The function takes 5 parameters:

  1. The set to evaluate
  2. The measure to get the percentile of
  3. The percentile to find, as a decimal
  4. Whether to order the set {true, false}
  5. The calculation method {‘EXC’, ‘INC’}

It should be noted that ASSPs used within calculated measures may not perform as well as straight MDX; however, for me, having both options available is really useful. I did some high-level testing with these scripts using the SQL 2012 Adventure Works cube and found the ASSP performed up to 14 times faster than the straight MDX scripts, bonus.

  • Using my laptop, fully loaded with 3 instances of SQL Server including 2012 with all SSAS flavours, the ASSP executed in two seconds for both the inclusive and exclusive modes, whereas the straight MDX took 28 seconds for inclusive and 9 seconds for exclusive

My work on this stored procedure was heavily influenced by the Analysis Services Stored Procedure Project on CodePlex; it was a huge help in learning how to write ASSPs, and it provided an Order() function that I was able to reuse. I have submitted the function to the ASSP project authors in hopes that it may be included there. For those of you that do not want to create the ASSP, you can download the compiled .dll here and follow the steps outlined at the end of this section to add the assembly to Analysis Services.

If you want to create your own ASSP Library you can follow the steps below.

  • create a new C# Class Library project in Visual Studio
  • add the following references
    • Microsoft.AnalysisServices (C:\Program Files (x86)\Microsoft SQL Server\110\SDK\Assemblies\Microsoft.AnalysisServices.DLL)
    • msmgdsrv (C:\Program Files (x86)\Microsoft Analysis Services\AS OLEDB\10\msmgdsrv.dll)
  • Set the target framework project property to .Net Framework 3.5
  • In project properties, Sign the assembly
  • Rename the Class1.cs to something meaningful (in my solution it is called percentiles.cs)
  • Add the following using statements

using Microsoft.AnalysisServices.AdomdServer;
using Microsoft.AnalysisServices;
using System.Collections.Generic;

  • Download the following file and add it to your solution (setOrdering class from the ASSP CodePlex project)

  • Replace all the code under the using statements with the following

namespace ssas.storedprocedures
{
 public class PercentileFunctions
 {
 [SafeToPrepare(true)]
 public static double RangePoint(Set inputSet, double percentileValue, string calcMethod)
 {
 switch (calcMethod)
 {
 case "EXC":
 {
 return ((percentileValue) * (inputSet.Tuples.Count + 1)) - 1;
 }
 case "INC":
 {
 return (((percentileValue) * (inputSet.Tuples.Count -1))+1) - 1;
 }
 default:
 {
 return ((percentileValue) * (inputSet.Tuples.Count + 1)) - 1;
 }
 }
 }

[SafeToPrepare(true)]
 public static double ValueAtPercentile(Set inputSet, Expression sortExpression, double percentileValue, bool sortAscending, string calcMethod)
 {
 //get position where percentile falls
 double Rank = RangePoint(inputSet, percentileValue, calcMethod);

 //order the set ascending using Codeplex SSAS Stored Procedure Function
 if (sortAscending)
 {
 Set s = setOrdering.Order(inputSet, sortExpression);

//if the Rank is a whole number
 if ((Rank % 1) == 0)
 {
 return sortExpression.Calculate(s.Tuples[Convert.ToInt32(Rank)]).ToDouble();
 }
 //if Rank is a decimal
 else
 {
 return
 (sortExpression.Calculate(s.Tuples[(int)Math.Floor(Rank)]).ToDouble())
 +
 (Rank % 1 *
 (sortExpression.Calculate(s.Tuples[(int)Math.Ceiling(Rank)]).ToDouble()
 - sortExpression.Calculate(s.Tuples[(int)Math.Floor(Rank)]).ToDouble())
 );
 /*(((sortExpression.Calculate(s.Tuples[Convert.ToInt32(Rank) + 1]).ToDouble()
 - sortExpression.Calculate(s.Tuples[Convert.ToInt32(Rank)]).ToDouble())) * (Rank - Convert.ToInt32(Rank)));*/
 }
 }
 else {
 //if the Rank is a whole number
 if ((Rank % 1) == 0)
 {
 return sortExpression.Calculate(inputSet.Tuples[Convert.ToInt32(Rank)]).ToDouble();
 }
 //if Rank is a decimal
 else
 {
 return
 (sortExpression.Calculate(inputSet.Tuples[Convert.ToInt32(Rank)]).ToDouble())
 +
 (((sortExpression.Calculate(inputSet.Tuples[Convert.ToInt32(Rank) + 1]).ToDouble()
 - sortExpression.Calculate(inputSet.Tuples[Convert.ToInt32(Rank)]).ToDouble())) * (Rank - Convert.ToInt32(Rank)));
 }
 }
 }
}
}
  • Compile the Code
  • Open SSMS, connect to Analysis Services
  • Right click on the Assemblies folder and select New Assembly
  • Browse to the compiled .dll
  • check the include debug information if you want debug the function (how to debug link)
  • use in a mdx query

Happy querying.

Generating Alert Emails with Dynamic Attachments using SSIS

A couple of months back, a colleague asked me if there was an easy way to send email alerts to defined users when a stored procedure returned a non-empty result set. Easy, right? Well, that’s what I thought; then the real fun started. It turns out that there were lots of stored procedures with different columns and datatypes going to different people. After some head scratching and googling, a light bulb went off, and a solution started to take shape.

– Use a SQL table to store the metadata: SP name, email address, file name

– Use SSIS to iterate through the metadata

– Use a script task to create Excel documents using Excel automation (this worked in BIDS or SSDT, but the package failed when executed from SQL Server, so with the help of this awesome post, Creating-basic-Excel-workbook-with-Open-XML, I was able to use the Open XML SDK to generate the Excel docs)

– Save the documents to a file share

– Use a script task to email the attachments to users.

What follows is how I did it.
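
Before the SSIS detail, here is a language-neutral sketch of the control flow in Python (the metadata shape and helper names are mine; the real package uses nested Foreach loops, the Open XML SDK for the workbooks and a script task for the SMTP send):

```python
from collections import defaultdict

def build_alert_batches(metadata, run_proc):
    """metadata: (stored_proc_name, email_address, file_name) rows,
    as held in t_StoredProceduresToCall.
    run_proc: callable that executes a stored procedure and returns its rows.
    Returns {email_address: [(file_name, rows), ...]}, keeping only the
    procedures that produced a non-empty result set."""
    batches = defaultdict(list)
    for proc_name, email, file_name in metadata:
        rows = run_proc(proc_name)
        if rows:  # only generate an attachment when there is data to report
            batches[email].append((file_name, rows))
    return dict(batches)
```

Each batch would then be written out as one workbook per file name and mailed to its address.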

A couple of prerequisites:

1. You will need the Open XML SDK, which can be found here

2. Since you are reading my blog, I assume you already have either BIDS 2008 R2 or SQL Server Data Tools (SQL 2012) installed; one of the two is required

3. The following database objects: a table to store metadata and stored procedures to call. Below you will find a script to create the SQL objects that I used to validate the solution

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE procedure [dbo].[sp_DummyProc1]
as
BEGIN
select top 10 users.UserID,users.UserName, users.UserType, users.AuthType from ReportServer.dbo.Users
END
GO
/****** Object:  StoredProcedure [dbo].[sp_DummyProc2]    Script Date: 31/05/2012 7:10:46 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
create procedure [dbo].[sp_DummyProc2]
as
BEGIN
select top 10 Subscriptions.SubscriptionID from ReportServer.dbo.Subscriptions
END
GO
/****** Object:  Table [dbo].[MyTest]    Script Date: 31/05/2012 7:10:46 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[MyTest](
	[myKey] [int] NOT NULL,
	[myValue] [int] NULL,
	[RV] [timestamp] NOT NULL,
PRIMARY KEY CLUSTERED
(
	[myKey] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

GO
/****** Object:  Table [dbo].[t_StoredProceduresToCall]    Script Date: 31/05/2012 7:10:46 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[t_StoredProceduresToCall](
	[procedureKey] [int] IDENTITY(1,1) NOT NULL,
	[storedProcName] [varchar](200) NULL,
	[emailAddress] [nvarchar](100) NULL,
	[fileName] [nvarchar](100) NULL,
 CONSTRAINT [PK_t_StoredProceduresToCall] PRIMARY KEY CLUSTERED
(
	[procedureKey] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

GO
SET ANSI_PADDING OFF
GO
INSERT [dbo].[MyTest] ([myKey], [myValue]) VALUES (1, 2)
GO
INSERT [dbo].[MyTest] ([myKey], [myValue]) VALUES (2, 2)
GO
SET IDENTITY_INSERT [dbo].[t_StoredProceduresToCall] ON

GO
INSERT [dbo].[t_StoredProceduresToCall] ([procedureKey], [storedProcName], [emailAddress], [fileName]) VALUES (1, N'sp_DummyProc1', N'someone@somewhere.com', N'excelFile')
GO
INSERT [dbo].[t_StoredProceduresToCall] ([procedureKey], [storedProcName], [emailAddress], [fileName]) VALUES (2, N'sp_DummyProc1', N'someone@somewhere.com', N'excelFile1')
GO
SET IDENTITY_INSERT [dbo].[t_StoredProceduresToCall] OFF
GO

Initial Plumbing

  • Create new Integration Services Project


  • Create the following package scoped variables:

Name               | Data type | Value                 | Expression
attachments        | String    |                       |
directory          | String    | C:\Folder\            |
emailAddress       | String    | someone@somewhere.com |
emailAddresses     | Object    | System.Object         |
filename           | String    |                       |
fileNames          | Object    | System.Object         |
getStoredProcQuery | String    | select StoredProcName,[fileName] from [dbo].[t_StoredProceduresToCall] where emailAddress = 'someone@somewhere.com' | "select StoredProcName,[fileName] from [dbo].[t_StoredProceduresToCall] where emailAddress = '" + @[User::emailAddress] + "'"
storedProcName     | String    |                       |
StoredProcResults  | Object    | System.Object         |
  • Create an ADO.NET connection manager pointing to the DB where the SQL objects exist; name it adonet

Create a query that returns a distinct list of email addresses; the results will be used in our outer loop

  • Drag a SQL Task onto the Control Flow surface, call it GetEmailAddresses
    • Edit the task
    • Set the ResultSet to Full Result Set
    • Set the connection type to ADO.NET
    • Select the adonet connection
    • Set the SQL Statement to
    • SELECT distinct emailaddress from t_StoredProceduresToCall


    • Click on the Result Set item in the navigation list on the left
    • Click Add and map Result Name 0 to the User::emailAddresses variable

Create the Outer Loop to send results to distinct email addresses

  • Drag a Foreach loop Container onto the Control Flow surface, call it Foreach Email
  • Connect it to the GetEmailAddresses SQL task
    • Open the editor and set the enumerator to Foreach ADO Enumerator
  • Select the User::emailAddresses variable for the ADO source object variable


  • Select Variable Mappings and map the User::emailAddress variable to index 0


Get the Stored Procedures to evaluate by user

  • Drag an Execute SQL Task into the Foreach Email container, call it Get Stored Procedures by User
    • Edit the task
  • Set ResultSet to Full Result Set
  • Set ConnectionType to ADO.NET
  • Set Connection to adonet
  • Set SQLSourceType to Variable
  • Set SourceVariable to User::getStoredProcQuery


  • Click on Result Set
  • Click Add and assign User::StoredProcResults to Result Name 0


Create the Inner Loop to generate the Excel Files by email address

  • Drag a Foreach loop Container into the Foreach Email container, call it Foreach SP by email
  • Connect it to the Get Stored Procedures by User SQL task
    • Open the editor and set the enumerator to Foreach ADO Enumerator
  • Select the User::StoredProcResults variable for the ADO source object variable


  • Select Variable Mappings
    • map the User::storedProcName variable to index 0
    • map the User::fileName variable to index 1


Generate Excel Files and Save to file System

  • Drag a Script Task into the Foreach SP by email container, call it Generate Excel Files
    • Edit the task
  • Set the ReadOnlyVariables to
    • User::directory
    • User::emailAddress
    • User::filename
    • User::storedProcName
  • Set the ReadWriteVariables to
    • User::attachments


  • Click Edit Script and add the following references to the project (I am assuming that you are somewhat familiar with SSIS script tasks and feel comfortable within the VSTA environment)
    • DocumentFormat.OpenXml
      • C:\Program Files (x86)\Open XML SDK\V2.0\lib\DocumentFormat.OpenXml.dll
    • System.Xml.Linq
      • C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.0\System.Xml.Linq.dll
    • WindowsBase
      • C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.0\WindowsBase.dll


  • Add the following using declarations to the code
    using DocumentFormat.OpenXml.Packaging;
    using DocumentFormat.OpenXml.Spreadsheet;
    using DocumentFormat.OpenXml;
    using System.Xml.Linq;
    using System.Linq;
    using System.Data.SqlClient;
    
  • Replace the Main() method with the following code; the code:
    • executes the stored procedure found in the User::storedProcName variable
    • if the SP returns any rows:
      • a unique directory is created
      • the CreateExcelWorkbook method is called, which builds a new Excel workbook dynamically based on the DataReader passed into the function, and saves the file
      • the file location is appended to the User::attachments variable
     public void Main()
            {
    
                using (SqlConnection conn = (SqlConnection)Dts.Connections["adonet"].AcquireConnection(null))
                {
                    SqlCommand comm = new SqlCommand();
                    comm.Connection = conn;
                    comm.CommandType = CommandType.StoredProcedure;
                    comm.CommandText = Dts.Variables["User::storedProcName"].Value.ToString();
    
                    if (conn.State == ConnectionState.Closed)
                        conn.Open();
                    SqlDataReader reader = comm.ExecuteReader();
                    try
                    {
                        if (reader.HasRows)
                        {
                            try
                            {
                                string szFileName = createDirectory() + "\\" + Dts.Variables["User::fileName"].Value.ToString() + ".xlsx";
                                CreateExcelWorkbook(szFileName, reader);
    
                                Dts.Variables["User::attachments"].Value = Dts.Variables["User::attachments"].Value.ToString() + "|" + szFileName;
                            }
                            catch
                            {
                                Dts.TaskResult = (int)ScriptResults.Failure;
                            }
                        }
                    }
                    catch
                    {
                        Dts.TaskResult = (int)ScriptResults.Failure;
                    }
                    finally
                    {
                        // close the reader only; setting Success here would
                        // overwrite a Failure result assigned in the catch blocks
                        reader.Close();
                    }

                    if (Dts.TaskResult != (int)ScriptResults.Failure)
                    {
                        Dts.TaskResult = (int)ScriptResults.Success;
                    }
                }
            }
    
  • Add the following Methods
    • These methods do all of the hard work. I made some mods to generate the rows and columns, but all in all the author of Creating-basic-Excel-workbook-with-Open-XML, Mika Wendelius, deserves all of the credit for providing a great tutorial and the code to back it up.
       private string createDirectory()
            {
                //Create Folder for email recipient
                string newPath = System.IO.Path.Combine(Dts.Variables["User::directory"].Value.ToString(), Dts.Variables["User::emailAddress"].Value.ToString());
                System.IO.Directory.CreateDirectory(newPath);
                // Create the subfolder
                newPath = System.IO.Path.Combine(newPath, System.Guid.NewGuid().ToString());
                System.IO.Directory.CreateDirectory(newPath);
    
                return newPath;
            }
    
     public static SpreadsheetDocument CreateWorkbook(string fileName)
            {
                SpreadsheetDocument spreadSheet = null;
                SharedStringTablePart sharedStringTablePart;
                WorkbookStylesPart workbookStylesPart;
    
                try
                {
                    // Create the Excel workbook
                    spreadSheet = SpreadsheetDocument.Create(fileName, SpreadsheetDocumentType.Workbook, false);
    
                    // Create the parts and the corresponding objects
                    // Workbook
                    spreadSheet.AddWorkbookPart();
                    spreadSheet.WorkbookPart.Workbook = new Workbook();
                    spreadSheet.WorkbookPart.Workbook.Save();
    
                    // Shared string table
                    sharedStringTablePart = spreadSheet.WorkbookPart.AddNewPart<SharedStringTablePart>();
                    sharedStringTablePart.SharedStringTable = new SharedStringTable();
                    sharedStringTablePart.SharedStringTable.Save();
    
                    // Sheets collection
                    spreadSheet.WorkbookPart.Workbook.Sheets = new Sheets();
                    spreadSheet.WorkbookPart.Workbook.Save();
    
                    // Stylesheet
                    workbookStylesPart = spreadSheet.WorkbookPart.AddNewPart<WorkbookStylesPart>();
                    workbookStylesPart.Stylesheet = new Stylesheet();
                    workbookStylesPart.Stylesheet.Save();
                }
                catch (System.Exception exception)
                {
                    MessageBox.Show(exception.Message);
                }
    
                return spreadSheet;
            }
    
            public static bool AddWorksheet(DocumentFormat.OpenXml.Packaging.SpreadsheetDocument spreadsheet, string name)
            {
                DocumentFormat.OpenXml.Spreadsheet.Sheets sheets = spreadsheet.WorkbookPart.Workbook.GetFirstChild<Sheets>();
                Sheet sheet;
                WorksheetPart worksheetPart;
    
                // Add the worksheetpart
                worksheetPart = spreadsheet.WorkbookPart.AddNewPart<WorksheetPart>();
                worksheetPart.Worksheet = new Worksheet(new SheetData());
                worksheetPart.Worksheet.Save();
    
                // Add the sheet and make relation to workbook
                sheet = new DocumentFormat.OpenXml.Spreadsheet.Sheet()
                {
                    Id = spreadsheet.WorkbookPart.GetIdOfPart(worksheetPart),
                    SheetId = (uint)(spreadsheet.WorkbookPart.Workbook.Sheets.Count() + 1),
                    Name = name
                };
                sheets.Append(sheet);
                spreadsheet.WorkbookPart.Workbook.Save();
    
                return true;
            }
            public void CreateExcelWorkbook(string workbookName, SqlDataReader reader)
            {
                DocumentFormat.OpenXml.Packaging.SpreadsheetDocument spreadsheet;
                DocumentFormat.OpenXml.Spreadsheet.Worksheet worksheet;
                spreadsheet = CreateWorkbook(workbookName);
                if (spreadsheet == null)
                {
                    return;
                }
                AddWorksheet(spreadsheet, "Error Records");
                worksheet = spreadsheet.WorkbookPart.WorksheetParts.First().Worksheet;
    
                int fieldCount = reader.FieldCount;
    
                uint iterator = 1;
                for (int i = 0; i < reader.FieldCount; i++)
                {
                    SetCellValue(spreadsheet, worksheet, iterator, 1, CellValues.String, reader.GetName(i).ToString());
                    iterator++;
                }
                iterator = 1;
                uint row = 2;
                while (reader.Read())
                {
                    for (int j = 0; j < reader.FieldCount; j++)
                    {
                        SetCellValue(spreadsheet, worksheet, iterator, row, CellValues.String, reader.GetValue(j).ToString());
                        iterator++;
                    }
                    row++;
                    iterator = 1;
                }
    
                worksheet.Save();
                spreadsheet.Close();
            }
            private static bool SetCellValue(DocumentFormat.OpenXml.Packaging.SpreadsheetDocument spreadsheet, DocumentFormat.OpenXml.Spreadsheet.Worksheet worksheet, uint columnIndex, uint rowIndex, DocumentFormat.OpenXml.Spreadsheet.CellValues valueType, string value)
            {
                DocumentFormat.OpenXml.Spreadsheet.SheetData sheetData = worksheet.GetFirstChild<SheetData>();
                DocumentFormat.OpenXml.Spreadsheet.Row row;
                DocumentFormat.OpenXml.Spreadsheet.Row previousRow = null;
                DocumentFormat.OpenXml.Spreadsheet.Cell cell;
                DocumentFormat.OpenXml.Spreadsheet.Cell previousCell = null;
                DocumentFormat.OpenXml.Spreadsheet.Columns columns;
                DocumentFormat.OpenXml.Spreadsheet.Column previousColumn = null;
                string cellAddress = ColumnNameFromIndex(columnIndex) + rowIndex;
                // Check if the row exists, create if necessary
                if (sheetData.Elements<Row>().Where(item => item.RowIndex == rowIndex).Count() != 0)
                {
                    row = sheetData.Elements<Row>().Where(item => item.RowIndex == rowIndex).First();
                }
                else
                {
                    row = new DocumentFormat.OpenXml.Spreadsheet.Row() { RowIndex = rowIndex };
    
                    for (uint counter = rowIndex - 1; counter > 0; counter--)
                    {
                        previousRow = sheetData.Elements<Row>().Where(item => item.RowIndex == counter).FirstOrDefault();
                        if (previousRow != null)
                        {
                            break;
                        }
                    }
                    sheetData.InsertAfter(row, previousRow);
                }
    
                // Check if the cell exists, create if necessary
                if (row.Elements<Cell>().Where(item => item.CellReference.Value == cellAddress).Count() > 0)
                {
                    cell = row.Elements<Cell>().Where(item => item.CellReference.Value == cellAddress).First();
                }
                else
                {
                    // Find the previous existing cell in the row
                    for (uint counter = columnIndex - 1; counter > 0; counter--)
                    {
                        previousCell = row.Elements<Cell>().Where(item => item.CellReference.Value == ColumnNameFromIndex(counter) + rowIndex).FirstOrDefault();
                        if (previousCell != null)
                        {
                            break;
                        }
                    }
                    cell = new DocumentFormat.OpenXml.Spreadsheet.Cell() { CellReference = cellAddress };
                    row.InsertAfter(cell, previousCell);
                }
    
                // Check if the column collection exists
                columns = worksheet.Elements<DocumentFormat.OpenXml.Spreadsheet.Columns>().FirstOrDefault();
                if (columns == null)
                {
                    columns = worksheet.InsertAt(new DocumentFormat.OpenXml.Spreadsheet.Columns(), 0);
                }
                // Check if the column exists
                if (columns.Elements<Column>().Where(item => item.Min == columnIndex).Count() == 0)
                {
                    // Find the previous existing column in the columns
                    for (uint counter = columnIndex - 1; counter > 0; counter--)
                    {
                        previousColumn = columns.Elements<Column>().Where(item => item.Min == counter).FirstOrDefault();
                        if (previousColumn != null)
                        {
                            break;
                        }
                    }
                    columns.InsertAfter(
                       new DocumentFormat.OpenXml.Spreadsheet.Column()
                       {
                           Min = columnIndex,
                           Max = columnIndex,
                           CustomWidth = true,
                           Width = 9
                       }, previousColumn);
                }
    
                // Add the value
                cell.CellValue = new DocumentFormat.OpenXml.Spreadsheet.CellValue(value);
                if (valueType != DocumentFormat.OpenXml.Spreadsheet.CellValues.Date)
                {
                    cell.DataType = new DocumentFormat.OpenXml.EnumValue<DocumentFormat.OpenXml.Spreadsheet.CellValues>(valueType);
                }
                worksheet.Save();
    
                return true;
            }
    
            public static string ColumnNameFromIndex(uint columnIndex)
            {
                uint remainder;
                string columnName = "";
    
                while (columnIndex > 0)
                {
                    remainder = (columnIndex - 1) % 26;
                    columnName = System.Convert.ToChar(65 + remainder).ToString() + columnName;
                    columnIndex = (uint)((columnIndex - remainder) / 26);
                }
    
                return columnName;
            }
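
ColumnNameFromIndex maps a 1-based column number to Excel's letter name using bijective base-26 (1 → A, 26 → Z, 27 → AA). The same arithmetic as a quick Python sketch, handy for sanity-checking the C# loop:

```python
def column_name_from_index(column_index: int) -> str:
    """Convert a 1-based column index to its Excel column name (bijective base-26)."""
    name = ""
    while column_index > 0:
        # Shift to 0-based before each division so 26 maps to Z rather than rolling over.
        column_index, remainder = divmod(column_index - 1, 26)
        name = chr(ord("A") + remainder) + name
    return name
```

For example, column 703 comes out as AAA, matching the C# version's `(columnIndex - remainder) / 26` step.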
    

Email the files

  • Drag a Script Task into the Foreach Email container, call it Email Files
  • Edit the task
  • Set the ReadOnlyVariables to
    • User::emailAddress
  • Set the ReadWriteVariables to
    • User::attachments
  • Click Edit Script and add the following references to the project
  • Add the following using declarations to the code
    using System.Net;
    using System.Net.Mail;
    
  • Replace the Main() Method with the following Code
    • the code reads the pipe-delimited list of file paths from the User::attachments variable and clears it for the next iteration
    • each file path in the list is added to the MailMessage as an attachment
    • the message is then sent to the address held in User::emailAddress
    • You will need to modify the following for this to actually send emails
      • Replace from@somewhere.com with a valid from address in the creation of the MailMessage object
      • Replace smtpserver with a valid SMTP server address in the creation of the SmtpClient object
      • Replace username and password with valid values in the creation of the NetworkCredential object
      public void Main()
            {
                try
                {
                    string attachments = Dts.Variables["attachments"].Value.ToString();
                    Dts.Variables["attachments"].Value = "";
                    MailMessage emailMessage = new MailMessage("from@somewhere.com", Dts.Variables["emailAddress"].Value.ToString());
                    attachments = attachments.TrimStart('|');
                    string[] attachmentArray = attachments.Split('|');
                    for (int i = 0; i < attachmentArray.Length; i++)
                    {
                        Attachment a = new Attachment(attachmentArray[i]);
                        emailMessage.Attachments.Add(a);
                    }
                    emailMessage.Subject = "You have errors to deal with see attached documents";
    
                    SmtpClient mySmtpClient = new SmtpClient("smtpserver");
                    mySmtpClient.Credentials = new System.Net.NetworkCredential("username","password");
    
                    mySmtpClient.Send(emailMessage);
    
                    Dts.TaskResult = (int)ScriptResults.Success;
                }
                catch
                {
                    Dts.TaskResult = (int)ScriptResults.Failure;
                }
            }
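
The attachment list is passed between tasks as one pipe-delimited string with a leading separator, which is why the code calls TrimStart('|') before Split('|'). A minimal Python sketch of that parsing step (the file paths are made up for illustration):

```python
def parse_attachments(raw: str) -> list[str]:
    """Split the pipe-delimited attachment list, dropping the leading separator."""
    raw = raw.lstrip("|")
    return raw.split("|") if raw else []

# Hypothetical paths, mirroring how the Create Files task accumulates the variable.
paths = parse_attachments("|C:/errors/run1/a.xlsx|C:/errors/run1/b.xlsx")
```

Returning an empty list for an empty string matters: after the variable is cleared, a naive split would yield one empty entry and the Attachment constructor would throw.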
    

    Executing the package should result in email(s) being generated with attachments based on the result set returned by the stored procedures.
    Hope this helps a couple of people out.

Dynamically generated Create Table Statement SQL Server

Update: the script now captures and includes column default values.

It has been a while since I last added to the googleverse, been taking more than I’ve been giving lately, time to give a little back.

In this short post you will find a query that builds a CREATE TABLE statement for a given valid table name. The script only handles a single primary key constraint and does not handle foreign key constraints, but even with its limitations I hope this helps someone out there.

declare @SqlStart nvarchar(max) = ''
declare @sql nvarchar(max) = ''
declare @sqlEnd nvarchar(max) = ''
declare @table varchar(max) = 'T_ESB_Submission_Template'

set @SqlStart = 'CREATE TABLE [' + @table + '] (' + Char(13) 
				
Select @sql = @sql + a.Column_Name + ' ' 
+ Data_Type 
+ CASE WHEN CHARACTER_MAXIMUM_LENGTH  IS NULL or DATA_TYPE = 'xml' THEN '' 
ELSE '(' + 
CASE WHEN CHARACTER_MAXIMUM_LENGTH>0 THEN 
CAST(CHARACTER_MAXIMUM_LENGTH as varchar(10))
ELSE 'max' END +')' END
+ CASE WHEN NUMERIC_PRECISION IS NULL OR DATA_TYPE in ('Int', 'tinyint', 'bigint', 'smallint') THEN '' 
ELSE '(' + CAST(NUMERIC_PRECISION as varchar(10)) +','+CAST(NUMERIC_SCALE as varchar(10)) +')' END 
+ CASE when exists (  select id from syscolumns where object_name(id)=@table 
		and name=a.column_name 
		and columnproperty(id,name,'IsIdentity') = 1  ) 
THEN 
' IDENTITY(' +  
cast(ident_seed(@table) as varchar) + ',' +  
cast(ident_incr(@table) as varchar) + ')' ELSE '' END 
+ CASE WHEN b.default_value is not null THEN ' DEFAULT ' + 
substring(b.default_value, 2, Len(b.default_value)-2) + ' ' ELSE '' END
+ CASE WHEN IS_NULLABLE = 'NO' THEN ' NOT NULL' ELSE ' NULL ' END + CHAR(13) + ','
from  Information_Schema.COLUMNS a
join 
(  SELECT so.name AS table_name, 
          sc.name AS column_name, 
          sm.text AS default_value
     FROM sys.sysobjects so
     JOIN sys.syscolumns sc ON sc.id = so.id
 left JOIN sys.syscomments SM ON sm.id = sc.cdefault
    WHERE so.xtype = 'U' 
      AND SO.name = @table) b on b.column_name = a.COLUMN_NAME and b.table_name = a.TABLE_NAME
where a.Table_Name = @table

if((select count(1) from INFORMATION_SCHEMA.TABLE_CONSTRAINTS where TABLE_NAME = @table) >0)
BEGIN
     select @sqlEnd =char(13) + 'CONSTRAINT [PK_' + @table + '_1] PRIMARY KEY NONCLUSTERED' +char(13)
	+'(' +char(13) + Column_Name + ' ASC ' + char(13)+')WITH (PAD_INDEX  = OFF, STATISTICS_NORECOMPUTE  = OFF,         IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS  = ON, ALLOW_PAGE_LOCKS  = ON) ON [PRIMARY]) ON [PRIMARY]'
from INFORMATION_SCHEMA.KEY_COLUMN_USAGE  where TABLE_NAME = @table					

set @Sql = @SqlStart + SUBSTRING(@sql, 0, len(@sql)-1) + @sqlEnd
END
else
BEGIN
     set @Sql = @SqlStart + SUBSTRING(@sql, 0, len(@sql)-1) + ')'
END

print @sql				
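
The script above accumulates one fragment per column into @sql and then trims the trailing comma with SUBSTRING. The same assemble-then-join idea in a small Python sketch (the column metadata here is invented for illustration; a join sidesteps the trailing-comma trim entirely):

```python
def build_create_table(table: str, columns: list[tuple[str, str, bool]]) -> str:
    """columns is a list of (name, data_type, is_nullable) tuples."""
    parts = [
        f"[{name}] {data_type} {'NULL' if nullable else 'NOT NULL'}"
        for name, data_type, nullable in columns
    ]
    # join() inserts commas only between fragments, so nothing needs trimming afterwards.
    return f"CREATE TABLE [{table}] (\n  " + ",\n  ".join(parts) + "\n)"
```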
			

SQL 2012 – Exciting times

Just installed SQL 2012 and it was the most pleasant installation experience I have had with SQL Server: no workarounds or config file modifications; I double-clicked setup.exe and clicked Next until finished. Make sure you uninstall any Denali/2012 pre-RTM instances (including PowerPivot) that you may have, and remove old Denali/2012 PowerPivot and SSRS service applications in SharePoint Central Admin.

Performance Point Services Excel Like Stacked Bar Chart

My favorite thing about Excel is that you can do just about anything in it, like create a stacked bar chart representing the growth of a measure over time, with each time period increment shown as a distinct segment in each bar, like the chart below.

My least favorite thing about Excel is being asked to replicate ‘simple’ Excel charts using a PerformancePoint Analytical Chart. Easy, right? Just create a Quarter-To-Date calculated measure in your cube (the Contoso cube is used in this example), throw it and your quarters into a chart, and you’re done. Well, the chart displays the data correctly and the measure grows over time, but the colors aren’t as nice and the chart does not stack by the value attributed to each quarter.

OK, so what happens if we move the QTD measure to the bottom axis and the quarters to the series? Again the data is correct, and this time the chart identifies the segment attributed to each quarter, but it no longer displays a single column for each quarter.

So what we need is the second chart but with some new calculated measures on the bottom axis.

Each of the measures on the bottom axis represents a time period we are interested in and only returns a value for that period and prior periods. Let’s have a look at the MDX that I used to accomplish the visualization in this chart.
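
The rule behind the four measures, in miniature: a quarter on the series contributes its value only to the bars for its own and later quarters, so bar n stacks quarters 1 through n. A simplified Python sketch of that logic (quarters numbered 1–4; names and values are illustrative, not part of the cube):

```python
def segment_value(bar_quarter: int, series_quarter: int, value: float):
    """Measure Q<n>: return the series quarter's value only when it is <= n."""
    return value if series_quarter <= bar_quarter else None

def stacked_bar(bar_quarter: int, quarter_values: dict[int, float]) -> dict:
    """All segments of one bar; later quarters come back as None (MDX null)."""
    return {q: segment_value(bar_quarter, q, v) for q, v in quarter_values.items()}
```

`stacked_bar(2, {1: 10, 2: 5, 3: 7, 4: 2})` keeps the Q1 and Q2 segments and nulls out Q3 and Q4, which is what the Q2 calculated member below does.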

/*Q1 Measure 
Use iif to check if the currentmember of the fiscal quarter attribute = “Q1”, 
True – build string to represent the quarter member, use strtomember function to convert to member, complete the tuple with the measure
False - null*/

CREATE MEMBER CURRENTCUBE.[measures].[Q1]
as
iif(LEFT([fiscal quarter].currentmember.membervalue, 2) = "Q1",
(strtomember("[fiscal quarter].&["+
RIGHT([Fiscal Year].currentmember.name, 4)+
RIGHT([Fiscal Quarter].currentmember.name, 1) + "]"),
[Measures].[Inventory On Hand Quantity]), null),
FORMAT_STRING = "Standard", 
VISIBLE = 1; 

/*Q2 Measure 
Use iif to check if the number of the period represented by the currentmember of the fiscal quarter attribute <=2, 
True – Check to see if empty, set to null, else build string to represent the quarter member, use strtomember function to convert to member
complete tuple with measure
False - null*/

CREATE MEMBER CURRENTCUBE.[measures].[Q2] as
iif( cint(right(left([fiscal quarter].currentmember.membervalue,2),1)) <= 2,
iif( cint(right(left([fiscal quarter].currentmember.membervalue,2),1)) <= 2 and
isEMPTY((strtomember("[fiscal quarter].&["+
RIGHT([Fiscal Year].currentmember.name, 4)+
RIGHT([Fiscal Quarter].currentmember.name, 1) + "]"),
[Measures].[Inventory On Hand Quantity])),
null,
(strtomember("[fiscal quarter].&["+
RIGHT([Fiscal Year].currentmember.name, 4)+
RIGHT([Fiscal Quarter].currentmember.name, 1) + "]"),
[Measures].[Inventory On Hand Quantity])),null),/*)*/
FORMAT_STRING = "Standard", 
VISIBLE = 1; 

/*Q3 and Q4 are the same as Q2 just increment the <= number*/

CREATE MEMBER CURRENTCUBE.[measures].[Q3] as
iif( cint(right(left([fiscal quarter].currentmember.membervalue,2),1)) <= 3,
iif( cint(right(left([fiscal quarter].currentmember.membervalue,2),1)) <= 3 and
isEMPTY((strtomember("[fiscal quarter].&["+
RIGHT([Fiscal Year].currentmember.name, 4)+
RIGHT([Fiscal Quarter].currentmember.name, 1) + "]"),
[Measures].[Inventory On Hand Quantity])),
null,
(strtomember("[fiscal quarter].&["+
RIGHT([Fiscal Year].currentmember.name, 4)+
RIGHT([Fiscal Quarter].currentmember.name, 1) + "]"),
[Measures].[Inventory On Hand Quantity])),null),/*)*/
FORMAT_STRING = "Standard", 
VISIBLE = 1; 

CREATE MEMBER CURRENTCUBE.[measures].[Q4] as
iif( cint(right(left([fiscal quarter].currentmember.membervalue,2),1)) <= 4,
iif( cint(right(left([fiscal quarter].currentmember.membervalue,2),1)) <= 4 and
isEMPTY((strtomember("[fiscal quarter].&["+
RIGHT([Fiscal Year].currentmember.name, 4)+
RIGHT([Fiscal Quarter].currentmember.name, 1) + "]"),
[Measures].[Inventory On Hand Quantity])),
null,
(strtomember("[fiscal quarter].&["+
RIGHT([Fiscal Year].currentmember.name, 4)+
RIGHT([Fiscal Quarter].currentmember.name, 1) + "]"),
[Measures].[Inventory On Hand Quantity])),null),/*)*/
FORMAT_STRING = "Standard", 
VISIBLE = 1; 

/*Use scope statements to set the value for the all member of the quarter attribute when combined with the calculated members to null*/

SCOPE(([Fiscal Quarter].[All],{[Measures].[Q1]})); 
THIS =  null; 
END SCOPE; 

SCOPE(([Fiscal Quarter].[All],{[Measures].[Q2]})); 
THIS =  null; 
END SCOPE; 

SCOPE(([Fiscal Quarter].[All],{[Measures].[Q3]})); 
THIS =  null; 
END SCOPE; 

SCOPE(([Fiscal Quarter].[All],{[Measures].[Q4]})); 
THIS =  null; 
END SCOPE;