Database Journal
Posted Sep 27, 2004

MDX in Analysis Services: Mastering Time: Moving Averages - Another Approach - Page 3

By William Pearson

Hands-On Procedure


Before creating our calculated member in MSAS (in our case, it will be for a measure, thus my reference to "calculated measure" throughout the article), we will again assemble the MDX involved using the MDX Sample Application that installs along with MSAS. While it has its limitations, the Sample Application is a good choice for straightforward MDX construction in the context of procedural articles; anyone with access to MSAS probably has access to this tool. The Sample Application will provide us with a platform from which to construct and execute the MDX we examine, and to view the results datasets we obtain.

We will assemble the MDX for a rolling average here, to get a focused view of how it works, before pronouncing the MDX adequate for use in constructing an alternate rolling averages calculated measure in Analysis Services. To do this, we will take the following steps:

1.  Start the MDX Sample Application.

2.  Clear the top area (the Query pane) of any queries or remnants that might appear.

3.  Ensure that FoodMart 2000 is selected as the database name in the DB box of the toolbar.

4.  Select the Warehouse cube in the Cube drop-down list box.

Having received our instructions from the information consumers, we will first create the core MDX query, just as we did in our previous session. To this core, we will add the rolling average calculated measure next. Once again, the core query might be viewed as a "frame" within which we are casting the rolling average. Using such a frame, we can quickly verify operation of our MDX, before we consider permanently installing the calculated measure into the Warehouse cube structure.

The business requirement involves quarters (at least in the initial specification), across which the average of Warehouse Sales will move. In addition, the moving average needs to be displayed specifically for U.S. Store States. As always, when we receive a specification to construct any such calculated member, we want to keep in mind flexibility and portability. (For example, in this case, we can certainly envision the usefulness of being able to present the rolling average for other dimensions at some other point in time).

Let's construct the core query next.

5.  Type the following query into the Query pane:

-- MXAS19-1, Preparation for Creation of Rolling Average Calculated Measure (Alternate 2)
SELECT
   CrossJoin({ [Time].[Quarter].Members },{[Measures].[Warehouse Sales]}) ON COLUMNS,
   {[Store].[All Stores].[USA].Children} ON ROWS
FROM
   [Warehouse]

6.  Execute the query by clicking the Run Query button in the toolbar.

The Results pane is populated by Analysis Services, and the dataset shown in Illustration 1 appears.

Illustration 1: Core Query Populates the Results Pane

We see the four quarters of 1997 and 1998 (the two years in the Warehouse cube), together with the Warehouse Sales measure, populating the columns, and the U.S. Store States (from the Store dimension) appearing on the row axis. This query is identical to the core query we constructed in Introduction to Moving Averages. It provides the "frame" we need to ascertain that the calculated measure we create next correctly accumulates the base measure, Warehouse Sales, over the window of quarters.

7.  Select File -> Save As, name the file MXAS19-1, and place it in a meaningful location.

8.  Leave the query open for the next section.

At this stage, while our ultimate objective is to return the same end results dataset, our query will diverge from its form in our earlier examination of rolling averages. We will use the Avg() function once again, within a WITH clause that defines a calculated member, to generate an identical moving average, but, instead of using Avg() in combination with the LastPeriods() function, as we did in our last article, we will use the .Lag() function, within a ranged time window, for definition of the rolling average calculated measure. The .Lag() function returns a "peer" member (a member existing along the same dimension of the member specified in the function) that exists a specified number of positions prior to the specified member. Positions are determined by the natural order of the dimension, and, as is the case of many MDX functions of this variety, the position numbering system is zero-based (meaning position zero is the position of the specified member itself). For details surrounding the .Lag() function, keep an eye out for a subsequent article on the function in my Database Journal MDX Essentials series.
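To make the zero-based position numbering concrete, here is a minimal Python sketch that models .Lag() over an ordered list of quarter members (the list and the lag function are illustrative stand-ins, not MSAS API):

```python
# Quarters of the Warehouse cube in natural (dimension) order.
quarters = ["1997 Q1", "1997 Q2", "1997 Q3", "1997 Q4",
            "1998 Q1", "1998 Q2", "1998 Q3", "1998 Q4"]

def lag(members, current, offset):
    """Model of MDX .Lag(): return the peer member `offset` positions
    before `current`; offset 0 is the specified member itself."""
    i = members.index(current) - offset
    if i < 0:
        return None  # no member exists that far back along the dimension
    return members[i]

print(lag(quarters, "1998 Q1", 3))  # -> 1997 Q2
print(lag(quarters, "1998 Q1", 0))  # -> 1998 Q1 (zero-based: the member itself)
```

The zero-based convention is what makes .Lag(3), rather than .Lag(4), the correct offset for a four-period window, as we will see shortly.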

We take the following steps to create the calculated member, and place it within the results dataset "frame" we have prepared.

9.  Within the query we have saved as MXAS19-1, replace the top comment line of the query with the following:

-- MXAS19-2 Rolling Average Calculated Measure (Alternate 2) Create and Retrieve

10.  Save the query as MXAS19-2, to prevent damaging MXAS19-1.

11.  Type the following into the Query pane, between the top comment line and the SELECT statement already in place:

WITH
MEMBER [Measures].[Rolling Avg 2]
AS
'Avg ( { [Time].CurrentMember.Lag(3) : [Time].CurrentMember },
    [Measures].[Warehouse Sales])'

12.  Append the calculated measure created in the WITH statement above, [Measures].[Rolling Avg 2], to the first line of the SELECT statement, changing the following line:

CrossJoin({ [Time].[Quarter].Members },{[Measures].[Warehouse Sales]}) ON COLUMNS,

to the following:

   CrossJoin({ [Time].[Quarter].Members },{[Measures].[Warehouse Sales], 
      [Measures].[Rolling Avg 2]}) ON COLUMNS,

The Query pane appears as shown in Illustration 2, with our modifications circled.

Illustration 2: The Query with Our Modifications

13.  Execute the query by clicking the Run Query button in the toolbar.

The Results pane is populated, and the dataset partially depicted in Illustration 3 appears (the image has been trimmed due to size).

Illustration 3: Result Dataset - The Rolling Average Appears (Partial View of Dataset)

As we can see, the calculated measure appears to be producing the desired results - results identical to those generated by the rolling average measure we constructed in our last session, as we intended. Just as with our first approach to the rolling average, we see that it properly divides the total of the Warehouse Sales values by the number of quarters actually summed, through Q4 of 1997. (There are no quarters in the Warehouse cube prior to Q1 of 1997, so the rolling average cannot "look back" to earlier quarters, and thus cannot divide by a full four quarters until it reaches Q4 of 1997.)

Taking the California (CA) stores as an example, once again, we can see that, in moving to Q1 of 1998, the total Warehouse Sales of Q2, Q3, and Q4 of 1997 (14,734.32, 19,768.15, and 15,076.37, respectively), taken together with the Warehouse Sales of Q1 of 1998 (9,670.46), divided by four (4), gives us the correct value for the average of four rolling quarters at Q1 of 1998 (14,812.33).
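The arithmetic above can be double-checked with a short Python sketch; the four quarterly values are the CA figures quoted in the text, and Decimal avoids binary floating-point noise in the check:

```python
from decimal import Decimal

# Warehouse Sales for the CA stores: 1997 Q2, Q3, Q4, and 1998 Q1
# (the four quarters in the rolling window at Q1 of 1998).
window = [Decimal("14734.32"), Decimal("19768.15"),
          Decimal("15076.37"), Decimal("9670.46")]

rolling_avg = sum(window) / 4
print(rolling_avg)  # -> 14812.325, reported in the dataset as 14,812.33
```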

We have established that the results dataset with the alternate approach appears to be identical. Let's examine the difference in the way we deliver those results. We use the Avg() function once again, but, unlike our first rolling average, where we specified the "last four quarters" using the LastPeriods() function, as shown below:

'Avg (LastPeriods (4, [Time].CurrentMember), [Measures].[Warehouse Sales])'

we have specified the "last four quarters" through the use of a specified time range, constructed with a combination of the .CurrentMember and .Lag() functions, together with the colon (:) range operator, as shown below:

'Avg ( { [Time].CurrentMember.Lag(3) : [Time].CurrentMember }, [Measures].[Warehouse Sales])'

Specifically, we are asking that Rolling Avg 2 be calculated as the average of Warehouse Sales within a rolling time window of "the current quarter and the immediately lagging three quarters" (for a window spanning a total of four quarters). .Lag() allows us to specify a "negative offset" from the current member, and serves us well in the creation of moving time windows such as this.
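The equivalence of the two windowing constructions - LastPeriods() versus a .Lag() range - can be illustrated with a small Python model of members in their natural order (the function names are illustrative, not MDX syntax):

```python
# Quarters of the Warehouse cube in natural (dimension) order.
quarters = ["1997 Q1", "1997 Q2", "1997 Q3", "1997 Q4",
            "1998 Q1", "1998 Q2", "1998 Q3", "1998 Q4"]

def last_periods(n, current):
    """Model of LastPeriods(n, m): m plus the n - 1 members before it."""
    i = quarters.index(current)
    # max(0, ...) models the shrinking window at the start of the dimension.
    return quarters[max(0, i - n + 1): i + 1]

def lag_range(current, offset):
    """Model of { m.Lag(offset) : m }: a ranged set via the colon operator."""
    i = quarters.index(current)
    return quarters[max(0, i - offset): i + 1]

# The two constructions select the same member set at every quarter.
for q in quarters:
    assert last_periods(4, q) == lag_range(q, 3)

print(lag_range("1998 Q1", 3))  # -> ['1997 Q2', '1997 Q3', '1997 Q4', '1998 Q1']
print(lag_range("1997 Q2", 3))  # -> ['1997 Q1', '1997 Q2'] (shrunken early window)
```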

14.  Re-save the query, and close the Sample Application, as desired.

We mentioned at this stage in our previous article that one advantage of the way our calculated measure retrieves the rolling average, based upon the "current" point in time, is that we can use it in a report within which we design a parameter / prompt (or other mechanism) to supply the "as of" date. This supports recurring reuse of the calculated measure by information consumers without any coding on their part, or even knowledge of how the rolling average works. We can give the calculated measure an intuitive name and add it to the report authors' toolkits as another component they can drag and drop into place anytime they need a moving average within a report.

At this point, we can go into Analysis Manager, as we did in our last article, and add the calculated measure permanently to the cube. This way, it can be accessed by any reporting application with connectivity to the cube (and, of course, the capability to "see" calculated members). For the step-by-step procedure, see the section titled Procedure: Building the Rolling Average Calculated Measure in Analysis Services in Mastering Time: Introduction to Moving Averages here at Database Journal. Performing these steps with the current MDX will again confirm accuracy of operation, from the data view within the cube. In addition, creation of the calculated member within the cube structure, where easy browses of the data can be performed with the measure in place, will demonstrate, just as it did with our previous approach, that the rolling average calculated measure operates in a "contextually sensitive" way, from the perspective of the time dimension levels at which it is used.


In this article, we continued our examination of "rolling averages," a popular capability in business intelligence and other analysis spheres. Having discussed in depth, in our previous article, the nature of these aggregations and the results they are intended to provide, we set out to satisfy an identical business requirement using an alternative approach to the MDX behind the moving averages calculated member. To this end, we began with a recap of the original hypothetical business requirement, in which a group of information consumers requested a particular moving average capability for analysis purposes.

We then used the Sample Application, which accompanies an installation of MSAS, to construct the MDX required to support a calculated member that fills the business requirement in a manner identical to our initial rolling average. We focused on the differences in this alternative approach, all of which lay inside the Avg() function, within the definition of the calculated member. Finally, we directed the reader to the last section of our previous article for the steps required to create the calculated member in Analysis Manager. Creation of the calculated member in Analysis Services provides developers and consumers alike an opportunity to examine the interaction of the calculated member with the data of the cube within which it resides. Ultimately, it provides a permanent means by which intended audiences might retrieve rolling averages, in reports, queries, and general browses of the cube.

» See All Articles by Columnist William E. Pearson, III
